3.1. Plate management
Plate management during HTS varies significantly with the assay and typically includes at least three key components: multiwell plate-transporting robots that move plates through the workflow, liquid-handling workstations that dispense appropriate liquids with precision, and bar-coding devices that label and track individual plates throughout the screening workflow. Commercially available instruments for each of these tasks vary widely in precision, ease of integration, and the degree of user interaction required to operate them. A caveat of incorporating multiple instruments into a screening workflow is that the failure of a single instrument can be catastrophic, as it can hold up the entire pipeline.
3.1.1. Plate-transporting robots
For HTS, robotic arms combined with multiwell plate stackers automate the precise, continuous loading of plates into our microscope. To enable round-the-clock imaging, we integrated a KiNEDx robotic arm (Peak Robotics, Colorado Springs, CO; model KX-300-435-TGP) with customized grippers that load and unload plates of transfected neurons from stackers onto the microscope stage, which is fitted with a customized plate holder. A robotic arm with a built-in absolute encoder is important: if an emergency stop is triggered during a screen, the robot retains its last position and can resume the screen from that point rather than restarting the run from the beginning.
3.1.2. Liquid-handling workstations
Liquid-handling workstations replace manual liquid pipetting. They save time, enable parallel sample preparation, and provide the precision required for HTS assays. Commercially available liquid-handling systems vary in the ease and extent of integration with other equipment. Some systems provide additional screen-related functionalities, such as library reformatting, cherry picking, or pin transfer. For all our HTS-related liquid-handling tasks, we use the MicroLab Starlet workstation (Hamilton, Reno, NV). We chose the MicroLab Starlet for its ease of integration with a robotic arm, the large number of plate locations on the deck, and its flexible scripting language. We use an eight-channel head with compressed O-ring expansion for loading and seating tips, and capacitance-based liquid-level detection for reliable pipetting. The MicroLab Starlet is also equipped with tilting capabilities to ensure complete aspiration of liquids. We developed protocols in the workstation for many screening-related tasks, from the mundane (e.g., coating and washing plates) to the arduous (e.g., lipid transfections of primary neurons).
3.1.3. Bar-coding devices
Bar coding enables the researcher to manage and track multiwell plates in HTS. Some devices offer broader functionality and flexibility: they are compatible with many bar-code symbologies and consist of a reader, printer, and applicator that can read or label each plate. The plate bar code and well number enable forward and backward tracking of data from individual plates, wells, and cells. Bar coding provides exceptional security in data tracking, minimizes errors, and allows real-time data exchange. Most commercial multiwell plates for HTS come preprinted with a bar code (standard or user defined). A bar-code reader is typically incorporated at one or more points in an HTS system, depending on the complexity of the workflow.
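The forward and backward tracking described above reduces, in practice, to giving every measurement a key derived from the plate bar code and well coordinate. A minimal Python sketch (all names and the identifier format are hypothetical, not a standard):

```python
# Sketch of bar-code-based sample tracking (identifier format hypothetical):
# combining the plate bar code with a well coordinate yields a unique key
# that lets any measurement be traced back to its source well.

def well_id(plate_barcode: str, well: str) -> str:
    """Build a unique well identifier from a plate bar code and well name."""
    return f"{plate_barcode}:{well.upper()}"

# Forward tracking: record a measurement against its well of origin.
measurements = {}
measurements[well_id("PLT000123", "b07")] = {"mean_intensity": 1420.5}

# Backward tracking: recover the plate and well from any stored identifier.
plate, well = next(iter(measurements)).split(":")
```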
3.2. Our HTS microscopy platform
Two key challenges slowed the automation of live-cell fluorescence microscopy-based assays. First, formidable technical issues arose in integrating the appropriate hardware and software controls to pass full control of image acquisition to a computer. Second, reducing the information-rich image datasets from automated runs into meaningful readouts remains an ongoing challenge.
In this section, we describe our solution to the first challenge of creating a software and hardware system to fully automate image acquisition. We based our system on two principles: flexibility in imaging fluorophores within an assay and the ability to adapt the system to assays with different timescales. For example, maximizing our selection of filter sets and filter cubes has been a strong component of maintaining flexibility, and integrating the microscope with a robotic arm and controlled environment chamber ensured that it can handle assays lasting hours or days.
Our system is based on a Nikon Eclipse TE2000E-PFS microscope (Nikon, Melville, NY) with an epifluorescence illuminator and a transmitted-light adapter. Motorized components control the focus and position of the objectives, the position of the filter cube turret, and the operation of the condenser turret. The microscope is equipped with the Perfect Focus System (Nikon, Melville, NY), which uses reflected light from a near-IR LED to maintain focus at a preset distance below the surface of a multiwell plate. To balance our need for a large field of view against high-resolution imaging, we use a Nikon 20X Plan Fluor ELWD objective with a 0.45 numerical aperture (NA) and a 6.9–8.2 mm working distance. This higher-NA objective collects more light, enabling detection of dim signals, but its depth of field is narrower, so portions of a single cell can be out of focus when they lie above or below the focal plane. Intense, spatially even illumination is provided by a Lambda LS illuminator (Sutter Instrument Company, Novato, CA), which houses a 10-position 25-mm excitation wheel and a 300-W CERMAX xenon arc lamp (PerkinElmer, Waltham, MA) with an output range of 340–700 nm; the light is carried to the microscope through a 3-mm liquid-light guide (Sutter Instruments). A separate 10-position 25-mm emission wheel (Sutter Instruments) sits in front of the CCD camera. Hard-coated DAPI/FITC/Texas Red, C/Y/R, and DAPI/FITC/TRITC/CY5 filter sets from Chroma provide precise matching of filter combinations with fluorophores across a wide range of the spectrum. A Lambda 10-3 controller (Sutter Instruments) rapidly switches the excitation and emission filter wheels when multiple FPs are assayed in the same image field and also controls a SmartShutter (Sutter Instruments). Images are collected with a CoolSnap HQ cooled CCD camera (Photometrics, Tucson, AZ) attached to the microscope through the basement port to maximize the optical efficiency of the light path.
Our camera has an LVDS interface that provides 12-bit images with 1392 × 1040 pixels. The CCD has a pixel size of 6.45 × 6.45 μm and is cooled to –30 °C to reduce thermal noise. Automated image acquisition during the HTS assay is controlled by custom scripts in Image-Pro Plus software (Media Cybernetics, Bethesda, MD). A multiwell plate is loaded from a plate stacker by a KiNEDx KX-300-435-TGP plate-transporting robot (Peak Robotics, Colorado Springs, CO) with a custom-made gripper onto a plate holder that fits into an MS-2000 XY automated closed-loop stage (Applied Scientific Instrumentation, Eugene, OR). The plate holder contains a computer-controlled actuator that positions the plate in the top left corner once it is loaded by the robot. The stage provides a 120 × 110 mm range of travel and, with a 25.40-mm lead-screw pitch, achieves an XY-axis resolution of 88 nm and a speed of 7 mm/s. A linear encoder provides additional resolution and accuracy in the stage movements.
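The overall acquisition sequence (load plate, move stage, switch filters, snap, unload) can be sketched as follows. Every class and method here is a hypothetical stand-in for the vendor APIs that our Image-Pro Plus scripts drive, not the actual interfaces:

```python
# Minimal sketch of the automated acquisition loop. All device classes are
# hypothetical stubs standing in for the robot arm, XY stage, filter
# wheels, and camera controlled by the acquisition scripts.

class Robot:
    def load_plate(self): pass     # stacker -> plate holder on the stage
    def unload_plate(self): pass

class Stage:
    def move_to(self, well): pass  # XY move to a well center

class FilterWheels:
    def select(self, channel): pass  # set the excitation/emission pair

class Camera:
    def snap(self):
        return object()  # placeholder for a 12-bit image array

def image_plate(wells, channels):
    """Load one plate, image every well in every channel, then unload."""
    robot, stage, wheels, camera = Robot(), Stage(), FilterWheels(), Camera()
    robot.load_plate()
    images = {}
    for well in wells:
        stage.move_to(well)          # autofocus maintains the z offset
        for channel in channels:
            wheels.select(channel)   # filter wheels switch between snaps
            images[(well, channel)] = camera.snap()
    robot.unload_plate()
    return images
```

One plate yields one image per well per channel; longitudinal assays repeat this loop at each time point.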
The microscope, robotic arm, and plate stackers are enclosed within a custom-built controlled environment chamber (Technical Instruments, Burlingame, CA) that maintains 37 °C and 5% CO2. The temperature is regulated by a model 300353 controller, and the CO2 by a model AC100 detector (World Precision Instruments, Sarasota, FL).
Once data acquisition is completed, the data are exported to a data storage and retrieval system. From then on, the image data moves through the analysis, statistics, mining, and reporting pipelines, which are described in the subsequent sections.
3.3. Data storage and retrieval systems
Storing high-resolution images from high-content screening (HCS) data sets poses significant challenges because the files are large (megapixel CCDs with 12–16 bit depth can produce megabytes of data per image), and thousands of images can be acquired from a single 96-well plate. Multichannel, longitudinal acquisitions therefore generate tens of gigabytes of raw image data per plate and terabytes for full screens. Transferring data from acquisition hardware to servers can be lengthy, and if errors occur, data can be lost. Before starting an HCS experiment, proper infrastructure should be in place to adequately manage the resulting data.
Two general approaches can be used to organize images. For relatively simple screens, a hierarchical folder structure can be saved on a server, with a root folder for each experiment and subfolders for further levels of organization. Our root folders are named with the date and an experiment descriptor and contain subfolders for each fluorescence channel. Each image file name carries descriptors so that individual images are unambiguously placed in the appropriate folder and analysis programs can parse the filenames and logically group the images. In longitudinal experiments, we embed the date, a unique plate identifier, time point, well, montage index, and fluorescence channel into the filename, separated by underscores. Ideally, a more detailed description of each image is contained in the image metadata, allowing other viewers to understand exactly what the image contains and how it was acquired. Although admirable efforts are being made to standardize image formats and metadata (Linkert et al., 2010), adoption of these standards by HCS imaging platforms will likely take time.
The second approach involves a relational database. Databases are optimal for storing data from larger screens and are the preferred method for labs with the resources to create and manage them. The up-front cost in time and money is greater, but databases offer advantages for querying information, grouping data across multiple experiments, and carrying out retrospective analyses for secondary endpoints.
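The underscore-delimited filename convention described for our longitudinal experiments lends itself to simple programmatic parsing. A minimal Python sketch (the exact field order shown is illustrative, not necessarily our convention):

```python
# Parse an underscore-delimited HCS image filename into named fields.
# The field order here is an assumed example convention.
FIELDS = ["date", "plate_id", "timepoint", "well", "montage", "channel"]

def parse_filename(name: str) -> dict:
    """Split an image filename into its descriptor fields."""
    stem = name.rsplit(".", 1)[0]  # drop the file extension
    parts = stem.split("_")
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

meta = parse_filename("20240115_PLT0042_T03_B07_M01_FITC.tif")
# Images can now be grouped logically by any field, e.g., meta["channel"].
```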
Multiple open source database management programs are available. The most popular are based on the MySQL or PostgreSQL specifications. OMERO (Open Microscope Environment Remote Objects) is a combined client–server platform based on PostgreSQL that can be used for image visualization, management, and analysis. It is an easy-to-implement data-management solution for labs that want the advantages of a database but do not want to develop one on their own. Using MySQL or PostgreSQL directly offers more flexibility and customization but requires significantly more technical expertise in database development. Almost all high-content imaging platforms have their own proprietary database software that interfaces directly with the imaging hardware. Labs that purchase a bundled HCS system can take advantage of these data-management solutions, but it is important to know exactly how flexible and extensible they are and whether they export data into portable file formats, such as delimited text files. These common file formats can be used to transfer data to new imaging systems upon technology upgrades and can be critical for sharing data among labs.
Finally, raw image files can be stored in a hierarchical folder structure, and analysis programs can extract information from these images and save the resulting data into a database. In this approach, the images are not directly contained in the database, but fields within the database can point to the locations of the images on the server. Proprietary data flow programs, such as Pipeline Pilot, facilitate easy reading and writing of images from folders and can interface with databases for storage of analysis data.
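This hybrid approach can be illustrated with a small sqlite3 sketch; the table and column names are hypothetical, and the key point is that the database stores only a pointer to each image file plus the features extracted from it:

```python
# Hybrid storage sketch: images stay on the file server; the database
# stores their paths alongside extracted features. Schema is hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")  # use a file path in practice
con.execute("""
    CREATE TABLE image_data (
        plate_id       TEXT,
        well           TEXT,
        channel        TEXT,
        image_path     TEXT,  -- pointer to the image file on the server
        mean_intensity REAL   -- example feature extracted by analysis
    )
""")
con.execute(
    "INSERT INTO image_data VALUES (?, ?, ?, ?, ?)",
    ("PLT0042", "B07", "FITC", "/server/exp01/FITC/img_B07.tif", 1420.5),
)

# The pixels are never stored in the database; analysis programs follow
# image_path to load the image only when it is needed.
path, = con.execute(
    "SELECT image_path FROM image_data WHERE well = 'B07'"
).fetchone()
```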
Whatever method of storage and retrieval is chosen, there should be a clear vision of how the data relate to each other and what the most intuitive overarching structure is for containing the data. In longitudinal experiments where we track individual neurons over time, we use the data structure represented in the schematic below. Once a database is created that is flexible enough to store the results from a large number of experiments, data mining programs can compare data across experiments, generate hypotheses, and even retrospectively test these hypotheses with existing data. Data sources can be queried through structured query language (SQL) scripts, and the resulting tables can be imported into analysis programs. R Statistics (www.r-project.org) can interface directly with databases through the RODBC package. Researchers can then use the full suite of R statistical capabilities to analyze the query results. Alternatively, powerful proprietary software packages, such as Spotfire (TIBCO, Palo Alto, CA) and Pipeline Pilot, provide a more accessible way to visualize data and generate reports.
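As a concrete illustration of such SQL-based retrieval, the sketch below builds a tiny in-memory table of per-cell intensity measurements and aggregates them across time points; the schema and identifiers are hypothetical:

```python
# Sketch of querying stored per-cell results for analysis. The cells
# table and its identifiers are hypothetical examples.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE cells (cell_id TEXT, timepoint INTEGER, intensity REAL)"
)
con.executemany(
    "INSERT INTO cells VALUES (?, ?, ?)",
    [("exp01-c001", 0, 100.0), ("exp01-c001", 24, 80.0),
     ("exp01-c002", 0, 95.0),  ("exp01-c002", 24, 97.0)],
)

# Mean intensity per tracked cell across all time points; the resulting
# table can be handed to R, Spotfire, or any other analysis package.
rows = con.execute(
    "SELECT cell_id, AVG(intensity) FROM cells "
    "GROUP BY cell_id ORDER BY cell_id"
).fetchall()
```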
Schematic showing the data structure for our data storage and retrieval system. The data are organized based on the experiment and are broken down until each neuron has a unique cell ID and its own morphological and intensity data.