Lensfree on-chip holographic microscopy is an emerging technique that offers imaging of biological specimens over a large field-of-view without using any lenses or bulky optical components. Lending itself to a compact, cost-effective and mechanically robust architecture, lensfree on-chip holographic microscopy can offer an alternative toolset addressing some of the emerging needs of microscopic analysis and diagnostics in low-resource settings, especially for telemedicine applications. In this review, we summarize the latest achievements in lensfree optical microscopy based on partially coherent on-chip holography, including portable telemedicine microscopy, cell-phone based microscopy and field-portable optical tomographic microscopy. We also discuss some of the future directions for telemedicine microscopy and its prospects to help combat various global health challenges.
Light microscopy has provided us with the key to observing structures that are orders of magnitude smaller than what the naked eye can see. As a result, optical microscopy continues to be an indispensable platform in life sciences and medicine, both as a research and a clinical tool. Owing to the increase in the spatial and temporal resolution of conventional microscopes, followed by the expansion of the available sources of optical contrast (such as scattering, absorption, refraction, phase modulation and various luminescence mechanisms), we can routinely obtain physical information on the morphology of transparent samples, subtle variations inside optically thick specimens and even structural details that are much smaller than the wavelength of the probing illumination. Nevertheless, these emerging microscopy platforms have remained partially restricted to advanced laboratories due to their relatively large sizes, high cost and complexity.
Optical microscopy has widespread clinical use for the diagnosis of infectious diseases such as tuberculosis, malaria and sickle cell disease, as well as for early detection of cancer. To increase the penetration of this valuable tool in low-resource settings, the need for field-deployable light-microscopy-based diagnostic equipment has continued to grow, and much research is being devoted toward this end. As an example, Fletcher et al. have recently introduced a mobile-phone-mounted light microscope capable of bright-field and fluorescence imaging. Achieving a spatial resolution of 1.2 μm across a field-of-view (FOV) of 0.025 mm² using an optical attachment >15 cm in length, this system has been shown to image P. falciparum-infected and sickle red blood cells as well as M. tuberculosis-infected sputum samples. Wachsmann-Hogiu et al. have demonstrated separate microscopy and spectroscopy platforms attached to cell phones. Their systems enable imaging with ~1.5 μm spatial resolution over an FOV of 0.025 mm², and spectroscopy with 5 nm spectral resolution for applications including blood-smear imaging and tissue spectroscopy. Richards-Kortum et al. have recently shown a miniaturized, low-cost lens-based bright-field and fluorescence microscope for the detection of infectious diseases such as tuberculosis. Their platform offers sub-micron spatial resolution using a 100× objective lens. Moreover, Mutter and Brown have recently developed a microscope running on digital cameras and cell phones to acquire photo-micrographs of urine samples toward the detection of renal disease.
Along the same lines, lensfree on-chip microscopy offers a promising platform, complementing the above-mentioned recent efforts in the development of alternative microscopy techniques, especially towards designing more compact and cost-effective microscopy systems with larger FOV. Among these on-chip imaging modalities, optofluidic microscopy (OFM) employs simple light sources to record projection images of objects flowing above a sensor array, and utilizes this flow to digitally achieve a spatial resolution beyond the pixel size of the sensor. In this sense, OFM constitutes an important example of increasing spatial resolution without relying on lens-based magnification. In parallel, our research group has been developing alternative lensfree on-chip holographic microscopes to simultaneously achieve high resolution and large FOV in a compact and mechanically robust architecture for potential use in telemedicine and point-of-care diagnostics applications. These platforms can also be conveniently interfaced with wireless devices such as webcams and cellphones to facilitate telemedicine applications. That is, the medically relevant data, e.g. microscopic images of a blood smear, can be transmitted to remote stations using existing wireless networks, and a diagnostic decision can be sent back to the user.
In the following sections of this manuscript, we will review our latest advancements in lensfree holographic on-chip imaging, especially towards use in field-deployable systems, and also discuss future directions to further advance this platform, which can ultimately play an important role in our collaborative quest to improve global health. Our lensfree on-chip holographic microscopy architecture is based on partially-coherent digital in-line holography, which offers microscopic imaging over a large FOV without relying on optical magnification. Therefore, these lensfree microscopes can simultaneously offer relatively high resolution (e.g., <1 μm) and a large FOV (e.g., ≥24 mm²). Moreover, owing to their architectural simplicity, which results from eliminating lenses and other bulky optical components, lensfree on-chip holographic microscopes can provide an important toolset for microscopic analysis and diagnosis in the developing world, where there is an urgent need for compact, cost-effective, mechanically robust, and yet sensitive and wide-field microscopes.
Fig. 1 illustrates a schematic diagram of our lensfree on-chip holography system. In this platform, which is based on digital in-line holography, the sample is placed directly on top of an optoelectronic sensor array (such as a CMOS chip), typically <4 mm above its active area. For illumination of the sample, a partially-coherent light source, such as a light-emitting diode (LED), is utilized. This partially-coherent light is spatially filtered through a photon-efficient pinhole with a diameter of d~0.1 mm, after which it propagates a vertical distance of z1~4–10 cm until it reaches the object plane (see Fig. 1). Upon this propagation over z1, the coherence diameter (Dc) of our illumination increases to roughly >0.2 mm, which enables coherent illumination of individual micro-objects over a large FOV, e.g., 24 mm². If the density of objects is not exceedingly large, and/or the objects are weak scatterers, as is the case for most biological specimens such as cells, a major part of the illuminating beam remains unscattered, as if it had not interacted with the objects, and constitutes the reference beam, R(x, y, z). A relatively smaller portion of the illuminating light is scattered by the sample, constituting the object wave, s(x, y, z). The degree of coherence (both temporal and spatial) at the sensor plane permits the reference and object waves to superpose in complex amplitude, giving rise to a measured interference pattern exhibiting destructive and constructive interference depending on the local optical path difference between the two wavefronts (see Fig. 2). Assuming that the reference beam can be defined as a uniform plane wave across the extent of the scattered field, the detected intensity at the sensor plane (z=z0 as defined in Fig. 1) can thus be written as:
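Written out according to the four terms discussed in the next paragraph (a uniform background term, a self-interference term, and two heterodyne terms), the detected intensity takes the standard in-line holography form:

I(x, y)|z=z0 = |R(z0)|² + |s(x, y, z0)|² + R*(z0)·s(x, y, z0) + R(z0)·s*(x, y, z0)    (1)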
In Eq. (1), the first term is ideally a uniform background signal, which does not contain useful information regarding the objects, and can be subtracted using either a measured or a numerically calculated background image. The second term is a self-interference term, i.e. the intensity of the diffracted wave at the sensor plane. For weakly scattering samples, the scattered object wave is much weaker in magnitude than the unscattered reference wave, i.e. |s(x, y, z)| ≪ |R(z0)|. Hence, this term has a relatively low intensity compared to the last two heterodyne terms, and can be neglected for weakly scattering samples such as micro-particles, cells or various micro-organisms. The last two terms are the dominant holographic terms of interest, which form microscopic images of the samples after digital reconstruction.
Being a function of the scattered field, s(x, y, z), these measured holograms can be considered fingerprints of different types of particles, which scatter the incoming light according to various factors including their morphology, refractive index distribution and absorption. Thus, while holographic patterns of the same type of micro-object look almost identical, different types of objects can give rise to significantly different holographic patterns, as seen in the insets of Fig. 2, which show holograms of different micro-objects. While the uniqueness of these holograms can render the interference patterns quite valuable for several diagnostic applications, they can further be digitally reconstructed to obtain microscopic images of the objects.
In order to reconstruct the scattered field at the object plane, which can be interpreted as a complex image, two different numerical approaches can be utilized. In the first one, the hologram is initially reconstructed by digitally multiplying the detected intensity by R(z0), and then digitally propagating the resulting field using the angular spectrum approach, which is a frequency-domain implementation of the convolution of an optical field (involving two fast Fourier transforms) with the impulse response of free-space propagation. Being complex conjugates of each other, the two holographic terms in Eq. (1) represent optical fields traveling in opposite directions. As a result, upon propagation of the reconstructed hologram toward the object plane (i.e., digital reconstruction), one of these terms converges to an image of the object, while the other further diverges, forming a weaker, defocused twin-image artifact concentric with the actual image. To obtain a more accurate complex image with higher contrast, this twin-image artifact can be iteratively erased by propagating the reconstructed fields back and forth between the two image planes +z2 and −z2 (as defined in Fig. 1) away from the sensor plane (where each of the holographic terms in Eq. (1) converges to an image) and digitally deleting one of the conjugate images using the size (i.e., support) of the objects as a constraint. We would like to also note that the support estimated from the initial reconstructed image (typically by intensity-based thresholding of the phase or amplitude images) is sufficient despite the existence of the twin-image artifact, and therefore a priori knowledge of the size of the objects is not necessary.
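As an illustration of the angular spectrum step described above, free-space propagation can be sketched in NumPy as follows (a minimal sketch under our own grid and naming assumptions, not the authors' implementation):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field by distance z using the angular
    spectrum method: FFT to spatial frequencies, multiply by the free-space
    transfer function, inverse FFT (two fast Fourier transforms in total)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)               # spatial frequencies along x (1/m)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2   # argument of the square root
    # Propagating components pick up the free-space phase; evanescent
    # components (arg < 0) are simply suppressed in this sketch.
    H = np.where(arg > 0, np.exp(2j * np.pi * z * np.sqrt(np.abs(arg))), 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Reconstruction then amounts to calling this routine with −z2 to travel from the sensor plane back to the object plane, while the same routine with +z2 implements the forward propagation used in the twin-image cleanup iterations.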
The second numerical reconstruction approach, which does not necessarily require the measured intensity to be a hologram, and generally works better for larger objects with higher scattering coefficients, is to use a phase retrieval algorithm. In this approach, the square root of the hologram intensity (i.e., the amplitude) constitutes the initial guess of the optical field at the sensor plane. This initial field is then refined (i.e., corrected) by iteratively updating the phase as the field is propagated back and forth between the sensor and object planes, while the size of the objects is used as a constraint in these iterations. Both of these approaches typically converge within 10–20 iterations, which take <1 sec using a graphics processing unit (GPU; e.g., NVIDIA GeForce GTX 285).
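The object-support phase retrieval loop described above can be sketched as follows (again a minimal NumPy sketch with hypothetical parameters, not the authors' code; the propagation helper is a compact angular spectrum routine):

```python
import numpy as np

def propagate(field, wavelength, dx, z):
    """Angular spectrum free-space propagation (two FFTs)."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=dx), np.fft.fftfreq(ny, d=dx))
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg > 0, np.exp(2j * np.pi * z * np.sqrt(np.abs(arg))), 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def retrieve_object(hologram, support, wavelength, dx, z2, n_iter=20):
    """Object-support phase retrieval as described in the text: start from
    the measured amplitude (square root of the hologram) with zero phase,
    then iteratively (i) back-propagate to the object plane, (ii) reset the
    field outside the object support to the unit background, (iii) propagate
    forward to the sensor plane, and (iv) restore the measured amplitude."""
    field = np.sqrt(hologram).astype(complex)
    for _ in range(n_iter):
        obj = propagate(field, wavelength, dx, -z2)            # sensor -> object
        obj[~support] = 1.0                                    # support constraint
        field = propagate(obj, wavelength, dx, z2)             # object -> sensor
        field = np.sqrt(hologram) * np.exp(1j * np.angle(field))
    return propagate(field, wavelength, dx, -z2)               # final object field
```

Each pass is just two FFTs plus element-wise operations, which is what makes the quoted sub-second GPU reconstruction times plausible.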
Using these two iterative approaches, complex images (both phase and amplitude) of the objects placed on the sensor chip can be obtained digitally, without the need for any lenses or bulky optical components. Since the light source can be a simple LED coupled to a large pinhole that does not require any sensitive alignment or light-coupling optics, the entire lensfree on-chip holography platform lends itself to a compact, cost-effective and mechanically robust architecture, which can be especially useful for building microscopes for field use in low-resource settings. Moreover, since the object wave is not magnified through an optical system such as an objective lens, the entire active area of the sensor chip serves as our imaging FOV, enabling microscopy over a large imaging area of e.g., ~24 mm², as shown in Fig. 2. This wide-field imaging capability is especially useful for blood analysis and water quality management. Along these lines, we have recently introduced several different field-portable optical microscopes providing a set of telemedicine tools, which will be reviewed starting with the next section.
Our earlier prototypes for lensfree telemedicine microscopy employ a single LED butt-coupled to a large pinhole (~0.1 mm diameter) and an optoelectronic sensor array to record lensfree in-line holograms of objects on a chip, as illustrated in Figs. 1 and 3. As a result of this architectural simplicity, these microscopes are rather lightweight (~46 grams), compact (4.2 cm × 4.2 cm × 5.8 cm), mechanically robust and cost-effective. Achieving a lateral spatial resolution of ~1.5 μm over a FOV of ~24 mm², this microscopy platform can potentially provide a very useful toolset for various healthcare applications in resource-limited settings.
To characterize the imaging performance of the platform shown in Fig. 3, we performed experiments with various types of objects such as blood cells, platelets and micro-particles. As shown in Fig. 4, the lensfree images obtained with this telemedicine microscope agree quite well with conventional microscope images of the same samples, and provide sub-cellular structural details that are sufficient to conduct e.g., three-part WBC differential counts. It is important to emphasize that these images are cropped from a much larger FOV, and the imaging quality does not vary across the entire imaging area, hence enabling rapid wide-field imaging with a single-shot holographic image.
As a proof of concept for automated blood analysis, we also demonstrated cytometry on a chip using this lensfree imaging modality. Fig. 5a demonstrates automated counting of red blood cells (RBCs) for different cell densities. The counting error is <5% for cell densities up to ~0.4 million cells/μL, which corresponds to 10× diluted whole blood. Despite the sample dilution, which is necessary to achieve good holographic imaging quality, >150 thousand cells are imaged at once in a single full-FOV holographic image. Further, we performed automated counting of white blood cells (WBCs) with <4% error compared to a commercial Coulter counter, as shown in Fig. 5b. We also measured the volume histogram of RBCs using the reconstructed phase images (see Fig. 5c), which provide both area and thickness information (assuming a constant and known refractive index for RBCs) to enable volume calculations. As shown in Fig. 5d, hemoglobin density measurements can also be performed using the same lensfree on-chip imaging platform. To achieve this, holographic imaging is not required, as measuring the photon transmission of a cuvette containing whole blood is sufficient to calculate the absorption of the sample, which is proportional to the hemoglobin density. As demonstrated in Ref. , the sub-cellular resolution of our single-LED telemedicine microscope also permits differentiating different types of WBCs, which is an important step forward towards three-part differential imaging of these cells. In these experiments, blood smears prepared using standard techniques were used without a pre-dilution step of whole blood (see the Discussion and Future Directions Section for more discussion on lensfree imaging of optically dense samples such as high-density regions of blood smears and tissue samples).
We should also emphasize that this lensfree microscopy platform can be further enhanced by differential interference contrast (DIC) imaging techniques. Toward this end, we showed that by adding cost-effective thin nonlinear crystals and plastic polarizers, which add neither significant cost nor complexity to the system, our lensfree imagers can also perform DIC microscopy, which can be especially useful for inspecting weakly scattering transparent objects with increased contrast.
Another important need in the developing parts of the world is proper management of the quality of drinking and recreational water sources. Therefore, field-portable and sensitive equipment is needed that can rapidly and easily detect pathogens in such water sources. Along these lines, the same lensfree telemedicine microscopy platform shown in Fig. 3 can also provide an effective toolset for water quality management. To this end, we investigated the use of our lensless microscopes to image and detect pathogenic protozoan parasites such as Giardia lamblia (G. lamblia) and Cryptosporidium parvum (C. parvum) at low concentrations. Our lensfree microscopes can image different types of parasites with high contrast, as validated against conventional microscope images of the same specimens (see left panel of Fig. 6).
In addition to imaging, these microscopes can also be used to accurately measure the concentration levels of these parasites in water samples. In this respect, the wide-field imaging capability of our microscopes plays a key role, as it permits accurate enumeration of parasites despite their low concentration in water samples. To demonstrate this, we imaged 4 batches of water samples with different G. lamblia cyst concentrations, i.e. 1510/mL, 755/mL, 378/mL and 189/mL. For each concentration, we performed 13 measurements (52 measurements in total) and discarded the results with the highest and lowest concentration values, leaving us with 11 measurements for each batch. Based on these experiments, we calculated our mean error for the above concentrations to be ~7.4%, 7.3%, 3.5% and 37.2%, respectively (see Fig. 6). These results validate the performance of our lensfree microscopes in quantifying the G. lamblia cyst concentration of a solution down to a level of ~380 parasites/mL with a mean error of <10%. We should emphasize that, when combined with pre-concentration steps such as centrifugation and filtration, it should be feasible to further improve our detection limit to <5 parasites/mL.
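The trimmed-mean procedure above (13 runs per batch, drop the two extremes, average the error of the remaining 11) can be written out as a short sketch; the function name and example counts below are ours, not the measured data:

```python
def mean_percent_error(measured_concentrations, true_concentration):
    """Discard the highest and lowest of the repeated measurements
    (13 runs -> 11 kept, as in the text), then return the mean absolute
    percent error of the retained values."""
    kept = sorted(measured_concentrations)[1:-1]
    errors = [abs(m - true_concentration) / true_concentration * 100 for m in kept]
    return sum(errors) / len(errors)
```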
The penetration of wireless communication links throughout the world continues to exhibit rapid growth. Today, 80% of the world population lives in regions that are covered by GSM networks. Further, ~90% of the entire world population is expected to own a cellphone by 2015. This enormous growth in wireless phone communication continuously lowers the manufacturing cost of cellphones despite the increase in their functionality. Offering advanced imaging, sensing and communication interfaces that operate in almost every region of the world, cellphones have the potential to transform healthcare through telemedicine, especially in the developing parts of the world where there is a lack of advanced laboratories and trained personnel. In this respect, cellphones can potentially serve as ubiquitous sensor platforms for e.g., remote diagnosis.
In order to provide a cost-effective and compact microscope running on a cellphone, we have recently demonstrated a lensless holographic cellphone microscope, which is shown in Fig. 7. This microscope is based on the same partially-coherent digital in-line holography concept introduced in the previous section. By utilizing the existing CMOS sensor built into the cellphone itself, our add-on unit to the body of the cellphone simply consists of a battery-operated LED (center wavelength at 587 nm) that is butt-coupled to a pinhole (~100 μm diameter), a hollow light tube and a sample-loading tray to mount the objects on top of the digital sensor of the cellphone camera (5 MP, ~24 mm² active area), whose lens is physically removed.
Digital in-line holograms of the objects, placed on the sensor at <2 mm distance to its active area, are recorded by the color (i.e., RGB) sensor installed on the cellphone. Owing to the quasi-monochromatic illumination of the LED, the color sensors commonly employed in cellphones are technically not ideal choices (from a holographic imaging standpoint), since the color filters tiled in a Bayer pattern in front of the pixels result in a non-uniform pixel response that distorts the holograms. In order to minimize this distortion, we utilized an additional digital correction step in our holographic reconstruction algorithm. To achieve that, we first record raw images (uncompressed Bayer pattern images) with the cellphone sensor as opposed to compressed color images, since the conventional full-color images provided by color sensors are demosaiced, a digital process that can wash away the high-frequency content of our holograms. Once a raw holographic image is obtained, we use our own iterative demosaicing algorithm, summarized in Fig. 8, which creates a grayscale holographic image with the least distortion to the holographic patterns, especially protecting the high-frequency fringes, and which can then be digitally reconstructed following the procedures described in Section II and Ref. .
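The iterative demosaicing algorithm of Fig. 8 is not reproduced here; as a much simpler stand-in, the following sketch only equalizes the mean response of the four Bayer sub-lattices so that a quasi-monochromatic hologram is no longer modulated by the color-filter pattern:

```python
import numpy as np

def bayer_to_grayscale(raw):
    """Naive Bayer correction for quasi-monochromatic holograms: scale each
    of the four sub-lattices (R, G1, G2, B pixel positions) to a common mean
    so the 2x2 filter pattern no longer imprints on the fringes, and return
    the full-resolution frame as grayscale."""
    out = raw.astype(float).copy()
    target = out.mean()
    for dy in (0, 1):
        for dx in (0, 1):
            sub = out[dy::2, dx::2]          # one Bayer sub-lattice (a view)
            sub *= target / sub.mean()       # equalize its mean in place
    return out
```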
To demonstrate the imaging performance of our cellphone microscope, we conducted experiments with various types of micro-objects including spherical micro-particles, RBCs, WBCs, platelets and water-borne parasites (G. lamblia cysts). As shown in Fig. 9, the lensfree images obtained with our cellphone microscope correlate well with images obtained using a conventional bench-top optical microscope (10× objective lens with a numerical aperture (NA) of 0.25). In particular, we should point the reader to the sub-cellular details in the WBC images (e.g., granulocytes and monocytes) and the high contrast of the weakly scattering G. lamblia cyst in these lensfree images.
An important advantage of using a cellphone as a telemedicine microscope is that the acquired holographic data can be transmitted to a remote station, e.g. a GPU-based computer installed in a clinic or hospital, for rapid digital processing. As demonstrated in Reference , the raw lensfree holograms captured with our cellphone microscope can be significantly compressed for faster wireless transmission, such that an image corresponding to an FOV of ~5 mm² can be transmitted using 375 kBytes of data at ~3 bits/pixel in e.g., portable network graphics (PNG) format.
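The quoted figure is easy to sanity-check: the sensor pairs a ~24 mm² active area with 5 MP, i.e. ~4.8 μm² per pixel, so a 5 mm² crop at ~3 bits/pixel lands near the stated 375 kBytes:

```python
# Back-of-the-envelope size of a compressed hologram crop.
pixel_area_um2 = 24e6 / 5e6            # 24 mm^2 active area / 5 MP
n_pixels = 5.0 * 1e6 / pixel_area_um2  # pixels in a 5 mm^2 field-of-view
kbytes = n_pixels * 3 / 8 / 1000       # at ~3 bits/pixel -> roughly 390 kBytes
```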
The lensfree on-chip holographic microscopes presented thus far utilize a single LED to capture a single holographic image of the objects. Since the objects are placed much closer to the sensor chip than to the light source, these raw holograms are recorded with a fringe magnification of F = (z1 + z2)/z1 ≈ 1, where z1 and z2 denote the pinhole-to-sample and sample-to-sensor distances, respectively. While this choice of hologram recording geometry is the key to achieving mechanical robustness and relaxed spatial and temporal coherence requirements, and more importantly a large imaging FOV, it also partially limits the spatial resolution depending on the pixel size of the sensor chip. That is, due to unit fringe magnification, the resolution achievable with a given sensor array starts to be dominated by its pixel size, i.e. its sampling period. Although the detection numerical aperture (NA) of our single-LED microscopes can ideally reach 1.0 due to their short z2 distance, the maximum oscillation frequency of the holographic fringes that can be properly sampled by the detector allows only a modest NA of ~0.1–0.2, and the maximum resolution achieved by these systems has typically been ~1.5 μm over a FOV of ~24 mm².
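The ~0.1–0.2 NA figure follows directly from Nyquist sampling at unit fringe magnification; a sketch, assuming the pixel pitch implied by a 5 MP, ~24 mm² sensor and a ~600 nm LED:

```python
# Nyquist-limited effective NA at unit fringe magnification: the finest
# fringe the sensor can sample has spatial frequency 1/(2*dx), so the
# largest usable diffraction angle corresponds to NA_eff ~ lambda/(2*dx).
wavelength_um = 0.6                      # amber LED, as used in these systems
pixel_um = (24e6 / 5e6) ** 0.5           # ~2.2 um pitch from 24 mm^2 / 5 MP
na_eff = wavelength_um / (2 * pixel_um)  # ~0.14, within the quoted 0.1-0.2 range
```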
In order to mitigate this problem and digitally improve the spatial resolution down to the sub-micron level, we recently demonstrated the use of pixel super-resolution (SR) techniques in lensfree microscopy. The use of SR in lensfree holography with unit magnification leads to a unique microscopy approach in which the FOV is not compromised to increase the spatial resolution. Furthermore, the simplicity of implementing SR in our lensfree microscopes permits the development of compact, cost-effective and field-portable telemedicine microscopes with sub-micron resolution over large imaging areas.
Toward this end, the lensfree field-portable super-resolution microscope that we have recently developed is shown in Fig. 10. This platform achieves <1 μm spatial resolution while still offering an FOV of ~24 mm², in a compact, cost-effective and robust embodiment. In this device, weighing only ~95 grams, 23 LEDs are individually butt-coupled to 23 optical fibers (core diameter ~0.1 mm) such that each fiber illuminates the sample vertically, but at a slightly different location, at a height of z1~5–6 cm above the sensor chip. As these LEDs are sequentially turned on by a low-cost micro-controller, 23 holograms are recorded, each slightly shifted with respect to the others over the detector active area. These multiple lower-resolution (LR) lensless holograms can be considered slightly shifted versions of a single high-resolution hologram that has been under-sampled by the sensor array, giving rise to slight variations in the pixel values corresponding to under-sampled frequencies due to aliasing. This spatial aliasing can be digitally resolved using all the recorded LR frames to synthesize a pixel super-resolved (SR) hologram. To achieve that, the shifts of the LR holograms with respect to each other need to be determined. This can be done very accurately using gradient-based iterative shift estimation methods, which eliminate the need to know a priori the actual shifts of the holograms (or, similarly, the actual positions of the fiber tips). Once the hologram shifts are digitally estimated, a single higher-resolution SR hologram with a smaller effective pixel size, which is consistent with all the measured LR frames (when shifted and down-sampled), can be iteratively calculated. In this iterative optimization problem, a cost function is defined by the square of the absolute error between the target SR hologram and the measured LR holograms.
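A stripped-down version of this least-squares pixel super-resolution can be sketched as follows; for simplicity the sketch assumes the shifts (which are normally estimated from the data) are known and fall on integer positions of the SR grid, unlike the sub-pixel shifts handled in the actual system:

```python
import numpy as np

def synthesize_sr(lr_frames, shifts, M, n_iter=200, step=0.5):
    """Minimal pixel super-resolution sketch: recover a hologram sampled on a
    grid M times finer than the sensor from several low-resolution frames,
    each shifted by a known amount (here integer, in SR pixels). The SR image
    is found by gradient descent on the sum of squared errors between each
    measured LR frame and the correspondingly shifted-and-downsampled guess."""
    n = lr_frames[0].shape[0] * M
    x = np.zeros((n, n))
    for _ in range(n_iter):
        grad = np.zeros_like(x)
        for y, (sy, sx) in zip(lr_frames, shifts):
            xs = np.roll(x, (-sy, -sx), axis=(0, 1))   # apply the frame shift
            r = xs[::M, ::M] - y                        # residual at LR samples
            g = np.zeros_like(x)
            g[::M, ::M] = r
            grad += np.roll(g, (sy, sx), axis=(0, 1))   # adjoint: unshift
        x -= step * grad
    return x
```

With shifts covering all sub-pixel offsets, gradient descent on this least-squares cost recovers the finer grid; the real implementations additionally estimate the shifts from the holograms themselves.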
Once a SR hologram is synthesized by iteratively minimizing this cost function, it can be digitally reconstructed using the procedures described in Section II. To demonstrate the efficacy of this approach, Fig. 11 shows a measured LR hologram (middle) and a calculated SR hologram (right), where the SR hologram contains additional high-resolution fringes, directly translating into an increased NA of the hologram. The blue dots in Fig. 11 show the sub-pixel shifts (after removing integer multiples of the pixel size) of the measured LR holograms with respect to each other.
In order to characterize the imaging performance of our portable SR microscope, we performed experiments with a micro-pattern etched into glass using focused ion beam (FIB) milling. As shown in Fig. 12, the reconstructed lensfree image using the SR hologram exhibits drastically increased resolution compared to the image obtained using a single LR lensfree hologram. Further, we measured the full-width-at-half-maximum (FWHM) of the spatial derivative of the line profile across the letter “L” in the lensfree SR phase image to be ~0.8 μm, suggesting sub-micron spatial resolution (see Fig. 12). Additionally, the fact that the letters “U” and “C”, which are separated by ~1 μm, are clearly resolved further validates this estimate. Since the imaging performance does not vary across the imaging FOV, our portable super-resolution microscope offers a >100-fold larger FOV compared to a microscope objective with comparable resolution.
This field-portable super-resolution microscope can be especially useful for diagnostic imaging applications in low-resource settings. As an initial step toward this goal, we imaged standard thin blood smears of Giemsa-stained human red blood cells infected with malaria parasites (Plasmodium falciparum) using our lensfree microscope shown in Fig. 10. As the results in Fig. 13 demonstrate, the parasites, seen as isolated stained regions inside the cells, are clearly resolved in both our phase and amplitude images, as validated by conventional bench-top bright-field microscope images (0.65 NA, 40×). These results are quite encouraging for future studies, in which we aim to test our field-portable microscopes for automated diagnosis of malaria in disease-endemic locations.
Recently, there has been increased interest in three-dimensional (3D) imaging modalities, which permit obtaining high-resolution volumetric information about the structure of specimens under observation. These 3D microscopy platforms, including but not limited to optical projection tomography, optical diffraction tomography and light-sheet microscopy, can perform bright-field and fluorescence imaging of samples ranging from micrometer to centimeter scale in size. Even though holographic microscopy can in principle enable 3D imaging by digitally reconstructing the holograms at different depths, its low axial resolution does not permit truly tomographic imaging. Interestingly, the depth-of-focus (DOF), which governs the achievable axial resolution of in-line holography, is most often a function of the object size rather than of the holographic system. That is, the DOF is comparable to the far-field distance of a particle, which is proportional to a²/λ, where a is the particle diameter and λ is the wavelength of illumination. Therefore, reconstructing the hologram at different planes along the optic axis, which is essentially equivalent to numerically propagating the optical field toward different planes through the object, does not provide 3D structural details with sufficiently high resolution, especially for relatively large objects (e.g., >10–20 μm).
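The a²/λ scaling already explains the numbers involved; e.g., assuming λ = 0.5 μm:

```python
# Depth-of-focus of in-line holography scales as the far-field distance of
# the object itself: DOF ~ a**2 / wavelength.
wavelength_um = 0.5
dof_um = {a: a**2 / wavelength_um for a in (2, 10, 20)}  # diameters in um
# -> {2: 8.0, 10: 200.0, 20: 800.0}: a 2 um particle defocuses over ~8 um,
# but a 20 um object over ~800 um, so plain refocusing cannot section it.
```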
Similarly, in our wide-field lensfree on-chip holography scheme, the axial resolution can be more than an order of magnitude lower than the lateral resolution. To better illustrate this, we digitally reconstructed a lower-resolution (LR) and a pixel super-resolved (SR) hologram of a 2 μm micro-particle at different depths along the optic axis, as shown in the y–z and x–z cross-sections in Figs. 14(a2–a3, b2–b3). Despite the significant improvement in lateral resolution with SR, the FWHM of the axial line profile (along z) is measured as ~90 μm and ~45 μm with a single LR hologram and a SR hologram, respectively. As a result, our portable telemedicine microscopes, where only a vertical-illumination hologram (either LR or SR) is used, cannot offer competitive 3D slicing of the objects placed over the sensor chip.
In order to significantly increase the axial resolution of our lensfree on-chip microscopy modality, we have recently demonstrated a lensfree optical tomography technique that achieves a 3D spatial resolution of <1 μm × <1 μm × <3 μm (in x, y and z, respectively) over a large imaging volume of 15 mm³. The key to achieving tomographic imaging in lensfree on-chip holography is to synthesize lensfree SR holograms of the samples with varying directions of illumination, so as to obtain projection images of the objects from multiple viewing directions. We perform pixel SR at each angle to enhance the resolution of all the projection images. Once these high-resolution 2D projection images are reconstructed, 3D images of the objects can be numerically computed with significantly improved axial resolution, using e.g., a filtered back-projection algorithm of the kind commonly employed in X-ray and electron computed tomography.
Following its bench-top validation, we demonstrated a cost-effective and field-portable implementation of lensfree optical tomography, optimized especially for use in resource-limited settings. Weighing ~110 grams, this field-portable, cost-effective and high-throughput lensfree tomographic microscope, shown in Fig. 15, can achieve ~1 μm lateral resolution and <7 μm axial resolution over a large imaging volume of ~20 mm3.
To implement this portable tomographic microscope, we used 24 LEDs that are individually butt-coupled to an array of fiber-optic waveguides (with a core diameter of ~0.1 mm) tiled along an arc, as illustrated in Fig. 15, covering an angular range of ±50°. A cost-effective micro-controller is employed to sequentially and automatically turn on these LEDs and illuminate the sample from different directions. To increase the temporal coherence of the illumination, we used interference-based color filters centered at ~640 nm with ~10 nm bandwidth, mounted on a piecewise arc that matches the arc-shaped geometry of the fiber-optic array (see Fig. 15). These filters increase the temporal coherence length of the illumination to ~30 μm, which permits digitally generating SR holograms with a numerical aperture (NA) of ~0.3–0.4 up to an object height of ~1 mm above the sensor-chip surface.
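The quoted ~30 μm coherence length is consistent with the textbook order-of-magnitude estimate L_c ≈ λ²/Δλ (the exact prefactor depends on the spectral lineshape, so this is a sanity check rather than the authors' exact calculation):

```python
# Order-of-magnitude temporal coherence length: L_c ~ lambda^2 / delta_lambda
center_wavelength = 640e-9  # filter center wavelength (m)
bandwidth = 10e-9           # filter bandwidth (m)
L_c = center_wavelength**2 / bandwidth
# -> ~41 um, the same order as the ~30 um quoted in the text; the gap
#    reflects the lineshape-dependent prefactor of the estimate.
```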
In order to perform pixel SR for enhancing our resolution at each illumination angle, the fiber-optic waveguide ends are mechanically displaced by small amounts (<500 μm) through electromagnetic actuation. In this scheme, the fibers are connected to a common movable bridge (the arc-shaped plastic piece shown in Fig. 15) with low-cost neodymium magnets attached on both ends. Compact circular electro-coils mounted inside the plastic housing are used to electromagnetically actuate the magnets, resulting in a simultaneous shift of all the fibers along both the x and y directions. It is important to emphasize that the exact displacements of these fiber-ends need not be repeatable or accurately controlled, as the hologram shifts are digitally estimated with no prior knowledge required. With the above-described setup, 10–15 projection holograms are recorded at each illumination angle to digitally synthesize one SR hologram for that angle (see Figs. 16b1–b3). These lensfree SR holograms are digitally reconstructed to obtain projection images of the samples (see e.g., Figs. 16c1–c3), which can then be merged using a filtered back-projection algorithm to compute tomograms of the objects located on the sensor-chip.
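The claim that the fiber displacements need not be controlled rests on estimating the hologram shifts from the data themselves. One common way to do this is phase correlation, sketched below at integer-pixel precision (illustrative only; the cited works employ their own sub-pixel shift estimators):

```python
import numpy as np

def estimate_shift(ref, shifted):
    """Estimate the (row, col) shift of `shifted` relative to `ref`
    via phase correlation. Integer-pixel, illustrative sketch only."""
    cross = np.fft.fft2(shifted) * np.conj(np.fft.fft2(ref))
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices in [0, N) to signed shifts
    return tuple(int(p) if p <= n // 2 else int(p) - n
                 for p, n in zip(peak, corr.shape))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                       # stand-in for an LR hologram
shifted = np.roll(ref, (3, -5), axis=(0, 1))     # unknown mechanical shift
est = estimate_shift(ref, shifted)               # recovers (3, -5)
```

Because the shifts are recovered blindly from the recorded frames, the actuation hardware only needs to move the fibers, not to move them precisely.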
To demonstrate the sectional imaging performance of our field-portable lensfree tomographic microscope (Fig. 15), we imaged micro-beads of 5 μm diameter (refractive index ~1.68) immobilized within optical gel (refractive index ~1.52) in a ~50 μm thick chamber placed onto our sensor-chip. Tomograms through the chamber, shown in Figs. 17(b1–b5), show the beads at their corresponding depths, as cross-validated by conventional bench-top microscope images shown in Figs. 17(a1–a5). The inset in Fig. 17, enclosed with the dashed rectangle, shows optical sectioning of two axially overlapping micro-beads, indicated by the dashed circles in (a1) and (b5), further validating the sectional imaging performance of our tomographic microscope. We have shown that this platform can achieve an axial resolution of <7 μm, i.e., a >13× improvement over what is achievable with a single lensfree hologram.
In order to further demonstrate the performance of our field-portable lensfree tomographic microscope for potential telemedicine applications, we also imaged an egg of Hymenolepis nana (H. nana), an infectious parasitic flatworm; the egg has an approximately spherical structure of ~40 μm diameter. Due to the low axial resolution of lensfree in-line holography, optical sectioning of this egg is not possible by simply reconstructing its recorded hologram at different depths. However, as demonstrated in Fig. 18, separate depth sections of this parasite egg can be created using our handheld tomographic microscope (shown in Fig. 15), exhibiting distinct spatial details/features at each depth layer.
It is important to note that this imaging performance can be achieved over a large FOV of ~20 mm2 and a depth-of-field of ~1 mm, thereby enabling our tomographic microscope to probe a large volume of ~20 mm3 with a decent 3D spatial resolution. Moreover, at the expense of slightly reduced resolution, the DOF can be extended to ~4 mm, enabling sectioning of a volume of ~80 mm3 .
The main reason that the axial resolution of this field-portable unit is limited to ~7 μm is that we captured projection holograms along a single axis over a limited angular range of ±50°. Although our imaging geometry permits detection of holograms at much larger angles, e.g., 70–80°, we restricted the angular range to ±50° due to the poor response of optoelectronic sensors at larger incidence angles, which distorts the raw acquired holograms. Consequently, the missing angles in back-projection lead to a loss of information, lowering the axial resolution. Using recently emerging optoelectronic sensors designed to further increase the acceptance angle of the pixels, we could significantly increase the angular range of accurately measured projection holograms, which would translate into increased axial resolution in our hand-held designs.
Lensfree on-chip holography is an emerging microscopy modality that offers unique benefits complementing existing imaging platforms, especially for global health applications. The simplicity of the design, together with its tolerance to misalignments, renders these imaging systems quite cost-effective and mechanically robust. Consequently, these microscopy systems can be integrated into mobile devices such as cell-phones, converting these ubiquitous devices into telemedicine tools that can work even in resource-limited settings. Moreover, recording the holograms with unit magnification leads to exceedingly large imaging FOVs, e.g., 24 mm2; the accompanying loss of spatial resolution can be mitigated by digitally processing these holograms with pixel super-resolution algorithms. As a result, high resolution is achieved without compromising the imaging area.
Lensfree on-chip holography also lends itself to a tomographic microscope by employing cost-effective LEDs at multiple viewing angles to synthesize SR holograms of the sample with varying directions of illumination. This multi-angle illumination scheme, which can be conveniently and cost-effectively implemented in a field-portable architecture as illustrated in Fig. 15, enables 3D imaging of a large sample volume on a chip using a lightweight lensfree tomographic microscope.
Our in-line holographic on-chip recording scheme with unit fringe magnification also relaxes the stringent spatial and temporal coherence requirements at the source end that exist in conventional in-line holographic microscopes. That is, due to the quasi-planar reference wave and the short sample-to-sensor distance, the optical path difference between the reference and object waves does not typically exceed the coherence length that a source with 1–15 nm bandwidth can provide, eliminating the need for coherent sources such as lasers. Therefore, by choosing an optimum degree of spatial and temporal coherence, several noise terms, such as speckle, cross-interference within an object and among different objects, as well as multiple-reflection-based interference terms, can be minimized.
An additional advantage of the reduced spatial coherence requirements at the source end is that an unusually large (hence photon-efficient) pinhole with a diameter of ~0.05–0.1 mm can be used to filter the spatially incoherent illumination of the LED, removing the need for coupling optics (e.g., an objective lens mounted on a mechanical stage) between the light source and the pinhole. In partially coherent digital in-line holography, as we have discussed previously, the demagnified image of the pinhole at the detector plane is effectively convolved with the recorded optical field, partially blurring the recorded hologram. This does not pose a limitation in our imaging system, as the demagnification factor is typically ~100–200 owing to the large z1/z2 ratio; hence the pinhole is scaled down to <1 μm at the detector plane.
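The pinhole-demagnification argument can be checked with the numbers given in the text (taking a 0.1 mm pinhole and a z1/z2 ratio of 200 as representative values from the quoted ranges):

```python
# Effective pinhole size at the detector plane after demagnification by z1/z2
pinhole_diameter = 0.1e-3  # 0.1 mm pinhole (m), upper end of quoted range
demagnification = 200      # representative z1/z2 ratio from the quoted range
effective_size = pinhole_diameter / demagnification  # -> 0.5 um, i.e. < 1 um
```

Even the largest quoted pinhole shrinks to sub-micron scale at the sensor, so the blur it introduces is below the resolution of interest.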
An important limitation of lensfree on-chip transmission holography, common to all in-line holography schemes in which a separate reference beam is not generated by using, e.g., a beam splitter, is that it requires the samples to have relatively low optical density. That is, if the sample is optically dense (scattering the majority of the impinging photons), the reference beam is attenuated to a level where the detected intensity becomes dominated by the non-holographic self-interference terms (the second term in Eq. (1) in Section II). As a result, our lensfree telemedicine microscopes, in their current transmission geometry, cannot image dense samples such as histopathology slides or thick whole-blood samples. For instance, for blood cell counting applications where the sample was in aqueous form (i.e., not a smear), we diluted the blood samples (e.g., 10 times) to increase our counting accuracy. To address this important need, reflection-based lensfree on-chip microscopes are being developed in our group; they show very promising initial results toward imaging of tissue slides and will be reported elsewhere.
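For reference, since Eq. (1) of Section II is not reproduced in this excerpt, the detected in-line hologram intensity has the standard form below (our notation: r is the unscattered reference wave, o the scattered object wave); the self-interference term |o|² is the one that overwhelms the holographic cross terms for optically dense samples:

```latex
I(x,y) = |r + o|^{2}
       = \underbrace{|r|^{2}}_{\text{background}}
       + \underbrace{|o|^{2}}_{\text{self-interference}}
       + \underbrace{r^{*}o + r\,o^{*}}_{\text{holographic cross terms}}
```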
The requirement of relatively low optical density has further implications for our portable lensfree tomographic microscopes. In addition to the common requirements of in-line holographic imaging, lensfree tomographic microscopes additionally require the objects to satisfy the so-called projection approximation. Accordingly, the digitally reconstructed images should represent a line integral of a property of interest of the object (scattering, absorption, phase, etc.) along rectilinear paths. This requires the objects to be relatively weak scatterers that are not thicker than the depth-of-field of the reconstructed images.
In order to mitigate some of these issues related to lensfree optical tomography, we are working toward improved implementations of our platforms, both in terms of the reconstruction algorithms used and the optical design of these microscopes. Toward this end, we are developing a second-generation field-deployable lensfree tomographic microscope that employs dual-axis illumination, a schematic of which is illustrated in Fig. 19. By illuminating the sample along two orthogonal directions, the information missed due to limited-angle tomography can be significantly reduced. Owing to the cost-effective and alignment-free illumination in our designs, adding a secondary axis should not significantly increase the cost or complexity of this field-portable tomographic device.
In addition to this improved optical design (i.e., dual-axis illumination; Fig. 19), we are also working on implementing a diffraction tomographic reconstruction scheme to compute the tomograms. This approach, which takes diffraction within the object into account, can enhance the resolution and accuracy of our reconstructions, since it does not strictly require rectilinear propagation of photons through the object. Therefore, using such a diffraction tomography approach, we aim to partially relax the stringent requirements on the size and scattering properties of the objects located in our 3D imaging volume on a chip.
Finally, we would also like to emphasize that our group's lensfree microscopy work has recently been extended to fluorescence and dark-field imaging. Along these lines, we have demonstrated lensless computational fluorescence imaging that can achieve a spatial resolution of ~4 μm over an ultra-wide FOV of 60 mm2, which can also be merged with our lensfree on-chip holography scheme to offer dual-mode imaging. This platform can be particularly useful for high-throughput imaging applications such as cytometry and rare-cell detection (e.g., detection of circulating tumor cells and leukocytes). Moreover, we have also demonstrated fluorescence and dark-field microscopy on a cell-phone. In this compact and cost-effective implementation (see Fig. 20), the samples (fluorescent and non-fluorescent) were interrogated with excitation light propagating orthogonally to the detection path. Consequently, the required dark-field background in our cell-phone microscope was achieved using inexpensive plastic absorption filters, as opposed to costly thin-film-based interference filters. This cell-phone microscopy modality, shown in Fig. 20, can achieve ~10 μm spatial resolution over an ultra-wide FOV of ~81 mm2, rapidly probing large sample volumes. These recent developments provide a valuable opportunity to build multi-modal, yet cost-effective and field-portable, cell-phone based microscopy platforms that combine bright-field and phase (holographic) microscopy with fluorescence and dark-field imaging capabilities.
In conclusion, we have reviewed our latest developments in lensfree on-chip holographic microscopy, with an emphasis on field-deployable devices for use in low-resource settings. Offering high-throughput microscopic imaging within a compact, cost-effective, lightweight and robust architecture, our telemedicine microscopes can help improve global health by facilitating critical tasks such as microscopic analysis, cytometry and related diagnosis especially in the developing parts of the world, where resources are limited.
A. Ozcan gratefully acknowledges the support of NSF CAREER Award, the ONR Young Investigator Award 2009 and the NIH Director’s New Innovator Award DP2OD006427 from the Office of The Director, NIH. The authors also acknowledge the support of the Gates Foundation, Vodafone Americas Foundation, and NSF BISH program (under Awards # 0754880 and 0930501).
Serhan O. Isikman (S’02) received the B.S. (2006) and M.S. degrees (2008) in electrical engineering from Koc University, Istanbul, Turkey. He is currently a PhD candidate in electrical engineering at the University of California, Los Angeles (UCLA). His research in optics, holographic microscopy, electromagnetic actuator design and micro-opto-electro-mechanical systems has led to 2 pending US patents and more than 30 scientific journal and conference papers.
He received fellowships from Koc University, The Scientific and Technological Research Council of Turkey (TUBITAK) and UCLA Electrical Engineering Department for his B.S., M.S. and PhD studies, respectively. During 2008–2009, he worked as an engineering consultant for Microvision Inc., WA, USA. He is a student member of IEEE, SPIE and OSA.
Waheb Bishara was born in Israel in 1981. He received his B.Sc. degree in electrical engineering and B.A. degree in physics from the Technion – Israel Institute of Technology in 2003, and his Ph.D. degree in physics from the California Institute of Technology in 2009 for the study of electrical charge signatures of fractional quantum Hall materials.
Since 2009, Dr. Bishara has been a post-doctoral research fellow at the University of California, Los Angeles (UCLA) electrical engineering department, investigating lens-less holographic imaging devices, with an emphasis on high-resolution sub-pixel imaging for point of care devices and global health applications. He was awarded the UCLA Chancellor’s Award for Postdoctoral Research for his contributions to the field of lens-less imaging.
Onur Mudanyali (S’03) received his B.Sc. and M.Sc. degrees in electronics and communication engineering from Istanbul Technical University (ITU), Turkey, in 2006 and 2008, respectively. He is now a Ph.D. candidate in the electrical engineering department at the University of California, Los Angeles (UCLA), USA, where he has been working on lensfree microscopy and effective telemedicine solutions to global health problems.
Since 2006, he has been assisting with various undergraduate-level classes and is now serving as a teaching associate in the UCLA electrical engineering department. He is the recipient of an undergraduate merit scholarship (ITU), a national graduate (M.Sc.) scholarship (The Scientific and Technological Research Council of Turkey - TUBITAK), a prestigious international Ph.D. fellowship (The Council of Higher Education of the Republic of Turkey - YOK) and the 2011 UCLA Annual Tech Forum poster award. He is the author/co-author of more than 25 peer-reviewed research articles in major scientific journals and conferences.
Ikbal Sencan (S’05) received her B.S. degree in electronics engineering from Fatih University, Istanbul, Turkey, as valedictorian, and her M.S. degree in electrical engineering from the University of California, Los Angeles (UCLA), in 2007 and 2010, respectively. She is currently pursuing her Ph.D. degree in electrical engineering at UCLA in the field of high-resolution lensfree on-chip microscopy. Her research on lensfree imaging has been published in 39 scientific journal and conference papers. She is a member of IEEE, OSA and SPIE.
Ting-Wei Su (S’08) received his B.S. degree in electrical engineering and M.S. degree in electro-optical engineering from National Taiwan University, in 2000 and 2002, respectively. After his service as an instructor in the Army’s Electronics and Communication School, he was a senior engineer at AU Optronics from 2004 to 2007, working on LCD optical performance. He is currently a Ph.D. candidate in the department of electrical engineering, UCLA, developing optical cell-monitoring platforms for high-throughput cytometry and point-of-care diagnostics applications.
Ting-Wei Su is the co-author of more than 60 peer-reviewed papers in scientific journals and conferences. He is the co-inventor of 3 issued and several pending US patents, covering silicon light sources, LCD driving schemes and LCD pixel designs. His current research interests include novel optical modalities for bio-medical imaging and sensing, lens-free imaging, digital holography, and optics-based cytometry.
Derek K. Tseng received the B.S. degree (2004) in electrical engineering from the University of California, San Diego, and the M.S. degree (2010) in electrical engineering from the University of California, Los Angeles (UCLA). His research in optical gas sensing and lensfree microscopy has been published in more than 17 scientific journal and conference papers.
Between 2005 and 2008, he worked at InnoSense LLC as a research engineer. Since 2008, he has been working as a junior development engineer building lens-free microscope prototypes.
Oguzhan Yaglidere is currently pursuing his B.S. degree in chemical engineering at the University of California, Los Angeles (UCLA). He has been working as an undergraduate research assistant for two years at UCLA Biophotonics Laboratory.
He has co-authored 8 journal and conference papers on lensfree portable microscopy for global health applications. His research interests include portable microscopy for low-resource settings, and renewable energy sources.
Uzair Sikora (S’09) is currently pursuing his B.S. degree in electrical engineering at the University of California, Los Angeles.
His work in lensfree imaging at UCLA has contributed to 2 journal papers. In 2008, he worked as an intern at Northrop Grumman Corp. Quality Engineering, CA, USA. He is a student member of IEEE and the Eta Kappa Nu Honor Society.
Aydogan Ozcan (SM’10) received his Ph.D. degree from the Stanford University Electrical Engineering Department in 2005. After a short post-doctoral fellowship at Stanford University, he was appointed a Research Faculty Member at Harvard Medical School, Wellman Center for Photomedicine, in 2006. Dr. Ozcan joined UCLA in the summer of 2007, where he is currently an Associate Professor leading the Bio-Photonics Laboratory in the Electrical Engineering Department.
Dr. Ozcan holds 17 issued patents and another 12 pending patent applications for his inventions in nanoscopy, wide-field imaging, lensless imaging, nonlinear optics, fiber optics, and optical coherence tomography. Dr. Ozcan is also the author of one book and the co-author of more than 200 peer-reviewed research articles in major scientific journals and conferences. In addition, Dr. Ozcan is the founder and a member of the Board of Directors of Microskia Inc., and is a member of the program committees of the SPIE Photonics West Conference, the SPIE International Symposium on Defense, Security and Sensing, as well as the IEEE Photonics Society Annual Meeting. He also serves as a panelist and a reviewer for the National Science Foundation, the NIH and the Harvard-MIT Innovative Technology for Medicine Program. Prof. Ozcan also served as the General co-Chair of the 2010 IEEE Winter Topical Meeting on Advanced Imaging in BioPhotonics.
Prof. Ozcan has received several major awards including the 2011 SPIE Early Career Achievement Award, the 2010 NSF CAREER Award, the 2009 NIH Director’s New Innovator Award, the 2009 Office of Naval Research (ONR) Young Investigator Award, the 2009 IEEE Photonics Society (LEOS) Young Investigator Award and MIT’s Technology Review TR35 Award for his seminal contributions to near-field and on-chip imaging, and telemedicine-based diagnostics.
Prof. Ozcan is also the recipient of the 2010 National Geographic Emerging Explorer Award, the 2010 Bill & Melinda Gates Foundation Grand Challenges Award, the 2010 Popular Mechanics Breakthrough Award, the 2010 Netexplorateur Award given by the Netexplorateur Observatory & Forum in France, the 2010 PopTech Science and Public Leaders Fellowship, the 2010 USC’s Body Computing Slam Prize, and the 2009 Wireless Innovation Award organized by the Vodafone Americas Foundation as well as the 2008 Okawa Foundation Award, given by the Okawa Foundation in Japan.
Prof. Ozcan was also selected as one of the top 10 innovators by the U.S. Department of State, USAID, NASA, and NIKE as part of the LAUNCH: Health Forum organized in October 2010.
Dr. Ozcan is a Senior Member of IEEE, and a member of LEOS, EMBS, OSA, SPIE and BMES.
Serhan O. Isikman, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Waheb Bishara, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Onur Mudanyali, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Ikbal Sencan, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Ting-Wei Su, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Derek Tseng, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Oguzhan Yaglidere, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Uzair Sikora, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA.
Aydogan Ozcan, Electrical Engineering Department at the University of California, Los Angeles, CA 90095, USA (http://innovate.ee.ucla.edu/). California NanoSystems Institute (CNSI), at the University of California, Los Angeles, CA 90095, USA.