We report the experimental implementation of optical diffraction tomography for quantitative 3D mapping of the refractive index in live biological cells. Using a heterodyne Mach-Zehnder interferometer, we record complex field images of light transmitted through a sample at varying directions of illumination. To quantitatively reconstruct the 3D map of the complex refractive index in live cells, we apply optical diffraction tomography based on the Rytov approximation. In this way, the effect of diffraction is taken into account in the reconstruction process, and diffraction-free, high resolution 3D images are obtained throughout the entire sample volume. The quantitative refractive index map can potentially serve as an intrinsic assay, providing molecular concentrations without the addition of exogenous agents as well as a means of studying the light scattering properties of single cells.
Refractive index serves as an important intrinsic contrast agent for visualizing nearly transparent living biological cells. Examples are phase contrast microscopy and differential interference contrast microscopy, which have been widely used in cell biology studies. In essence, both techniques make use of optical interferometry to enhance contrast. Interferometry converts phase changes of the transmitted light, induced mainly by the heterogeneous refractive index distribution within the cell, into intensity variations. However, these techniques do not provide quantitative maps of the phase change.
More advanced phase microscopy techniques have been developed to record quantitative phase images of specimen-induced phase changes [3–7]. These techniques can provide either the average refractive index of cells or the cell thickness [8–11], but not detailed 3D structure.
These advanced phase microscopy techniques have paved the way for the development of 3D imaging techniques, and several successful experimental implementations can map the 3D structure of a specimen. The common idea is to record multiple images at various angles of illumination with respect to the specimen, and then to reconstruct the 3D structure from this set of angular images. There are two ways to change the relative angle of illumination: rotate the sample with the illumination beam fixed, or rotate the illumination beam with the sample fixed. Rotating the sample makes it possible to cover the entire angular range, and thus to obtain the same axial resolution as the transverse resolution. However, it is difficult to fix the axis of rotation, and rotation inevitably perturbs the sample. In addition, data acquisition speed is limited by the mechanical rotation of the sample. Therefore, sample rotation is typically restricted to solid non-biological objects such as optical fibers [12, 13], and special sample preparation is required for imaging biological cells. On the other hand, the rotating-beam approach does not perturb the sample during data acquisition, and is thus suitable for imaging live cells in their native state [15, 16]. Data acquisition can be fast enough to study the dynamics of live cells, and only small modifications are needed to fit the instrument into a conventional high-NA optical microscope. A drawback of this method is the lack of complete angular coverage due to the finite numerical aperture of the imaging system, as is usual in conventional optical microscopy; the axial resolution is therefore poorer than the transverse resolution. Various algorithms have been developed to recover the missing-angle information using prior knowledge of the specimen [17, 18].
The reconstruction algorithm is an important factor in determining the spatial resolution and the quantification of the complex refractive index, and the way the experimentally measured complex field is interpreted determines the algorithm to be used. If the phase of the transmitted field is interpreted as a line integral of the refractive index along the propagation direction, then the filtered back-projection algorithm based on the inverse Radon transform can be applied. For weakly scattering biological cells, this is often a good approximation [14, 16] for points close to the plane of focus. However, since the effect of diffraction is ignored, there is a loss of resolution for samples that are large compared to the depth of focus of the imaging system.
A more general approach, which takes diffraction into account, is diffraction tomography, first proposed by Wolf in 1969. The Born approximation is typically adopted to linearize the relation between the complex refractive index of the object and the E-field. Several experimental studies have implemented diffraction tomography in the optical regime [13, 15, 21], but one [13] imaged only non-biological samples and the others [15, 21] did not provide calibrated values for the refractive index.
In previous work, we developed tomographic phase microscopy (TPM) for quantitative 3D mapping of the refractive index in live cells in their native state. TPM can collect angular images ranging from −60 to 60 degrees in 10 seconds. The rotating-beam geometry was adopted to avoid perturbing specimens during data acquisition, and filtered back-projection along with an iterative constraint algorithm was used for 3D reconstruction. Recently, we also developed an algorithm which corrects for the effect of diffraction by numerically propagating the focus of the images. However, this method is highly computationally intensive, as it requires multiple runs of the inverse Radon transform.
In this paper, we report the first experimental implementation of optical diffraction tomography to image live biological cells and provide quantitative 3D refractive index maps. We used tomographic phase microscopy, developed earlier by our group, to record complex transmitted fields at various angles of incidence. By employing optical diffraction tomography based on the Rytov approximation, we take diffraction into account and thus produce high resolution 3D refractive index maps for the entire cell without need for propagation in the reconstruction algorithm. We demonstrate that the Rytov approximation is valid for live cell imaging, while reconstruction based on the Born approximation leads to severe distortions. An iterative constraint algorithm is applied to minimize the effects of incomplete angular coverage.
The quantitative refractive index maps thus obtained can be used to quantify molecular concentrations without adding fluorescence agents. They also provide a means of studying the light scattering of single cells, which may lead to the development of in-vivo light scattering instruments for disease diagnosis.
It is relatively straightforward to implement a deconvolution algorithm for creating a 3D fluorescence image from a stack of full-field fluorescence images taken with a scanning objective lens. Each fluorescent particle acts as a point source, and there is negligible interference among the particles; the point spread function is thus defined only by the imaging system. It is more complicated to implement 3D deconvolution for absorption and refractive index. Unlike fluorescence imaging, both amplitude and phase images of the transmitted field must be recorded, since the absorption coefficient and refractive index affect both the amplitude and the phase of the field. Moreover, interference among scatterers complicates the deconvolution process. To fully describe the effect of interference, the wave equation must be solved; however, this would require extensive computation time, and it is even more difficult to extract the structure of objects from the transmitted E-field images.
Approximations such as the Born and Rytov approximations have been employed in the past to make this problem relatively easy to solve. In essence, by assuming that the scattered field is weak compared to the incident field, the relationship between the three-dimensional scattering potential and the two-dimensional measured field can be simplified. Using the Born approximation, Wolf derived a formulation which enables reconstruction of a 3D object from 2D measured E-fields. For each illumination angle, the Fourier transform of the 2D measured E-field is mapped onto a spherical surface, called the Ewald sphere, in the frequency domain of the 3D scattering potential. In this section, we briefly introduce Wolf’s original theory and Devaney's modification adopting the first Rytov approximation. Based on this section, we will explain the experimental implementation of these theories in Section 4.
Under the scalar field assumption, the propagation of the light field $U(\mathbf{r})$ through the medium can be described by the wave equation:

$$\nabla^2 U(\mathbf{r}) + k_0^2\, n^2(\mathbf{r})\, U(\mathbf{r}) = 0. \qquad (1)$$
Here, $k_0 = 2\pi/\lambda_0$ is the wave number in free space, with $\lambda_0$ the wavelength in free space, and $n(\mathbf{r})$ is the complex refractive index. If the field is decomposed into the incident field $U^{(I)}(\mathbf{r})$ and the scattered field $U^{(S)}(\mathbf{r})$,

$$U(\mathbf{r}) = U^{(I)}(\mathbf{r}) + U^{(S)}(\mathbf{r}), \qquad (2)$$
then the wave equation becomes

$$\left(\nabla^2 + n_m^2 k_0^2\right) U^{(S)}(\mathbf{r}) = F(\mathbf{r})\, U(\mathbf{r}), \qquad (3)$$
with $F(\mathbf{r}) \equiv -(2\pi n_m/\lambda_0)^2\left((n(\mathbf{r})/n_m)^2 - 1\right)$, where $n_m$ is the refractive index of the medium. $F(\mathbf{r})$ is known as the object function. Based on Green’s theorem, the formal solution to Eq. (3) can be written as

$$U^{(S)}(\mathbf{r}) = -\int G(\mathbf{r}-\mathbf{r}')\, F(\mathbf{r}')\, U(\mathbf{r}')\, d^3 r', \qquad (4)$$
with $G(r) = e^{i n_m k_0 r}/(4\pi r)$ the Green’s function. Since the integrand contains the unknown field $U(\mathbf{r})$, we employ an approximation to obtain a closed-form solution for $U^{(S)}(\mathbf{r})$. The first Born approximation is the simplest choice when the scattered field is much weaker than the incident field ($|U^{(S)}| \ll |U^{(I)}|$), in which case the scattered field is given by

$$U^{(S)}(\mathbf{r}) \approx -\int G(\mathbf{r}-\mathbf{r}')\, F(\mathbf{r}')\, U^{(I)}(\mathbf{r}')\, d^3 r'. \qquad (5)$$
This approximation provides a linear relation between the object function $F(\mathbf{r})$ and the scattered field $U^{(S)}(\mathbf{r})$. By taking the Fourier transform of both sides of Eq. (5), we obtain the following relation, known as the Fourier diffraction theorem:

$$\hat{F}(K_x, K_y, K_z) = \frac{k_z}{i\pi}\, \hat{U}^{(S)}(k_x, k_y;\, z^+ = 0). \qquad (6)$$
Here, $\hat{F}$ and $\hat{U}^{(S)}$ are the 3D and 2D Fourier transforms of $F$ and $U^{(S)}$, respectively; $k_x$ and $k_y$ are the spatial frequencies corresponding to the spatial coordinates $x$ and $y$ in the transverse image plane; and $z^+ = 0$ is the axial coordinate of the detector plane, which is the plane of objective focus in the experiment. The spatial frequencies in the object frame, $(K_x, K_y, K_z)$, are defined as the spatial frequency vector $(k_x, k_y, k_z)$ relative to the spatial frequency vector of the incident beam, $(k_{x0}, k_{y0}, k_{z0})$, where $k_z$ is determined by the relation $k_z = \sqrt{(n_m k_0)^2 - k_x^2 - k_y^2}$. For each illumination angle, the incident wave vector changes, and so does $(K_x, K_y, K_z)$. As a result, we can map different regions of the 3D frequency spectrum of the object function $F(\mathbf{r})$ with the various 2D angular complex E-field images. After completing the mapping, we take the inverse Fourier transform of $\hat{F}$ to obtain the 3D distribution of the complex refractive index.
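The frequency-domain mapping described above can be sketched in numpy. The following is an illustrative implementation with hypothetical names and array conventions (nearest-neighbor gridding, constant prefactors simplified), not the code used in the experiment:

```python
import numpy as np

def map_to_ewald(F3d, counts, Uhat_s, kx, ky, k_in, n_m, k0, dk):
    """Map the 2D spectrum of one angular scattered-field image onto the
    Ewald sphere in the 3D frequency space of the object function.
    F3d, counts : 3D accumulation arrays (complex spectrum and hit counts)
    Uhat_s      : 2D Fourier transform of the scattered field at this angle
    kx, ky      : 2D grids of transverse spatial frequencies
    k_in        : incident wave vector (kx0, ky0, kz0)
    dk          : frequency-grid spacing of the 3D array
    All names are hypothetical; constant prefactors are simplified."""
    km = n_m * k0                          # wave number in the medium
    kz2 = km**2 - kx**2 - ky**2
    prop = kz2 > 0                         # keep propagating components only
    kz = np.sqrt(kz2[prop])
    # object-frame frequencies: scattered minus incident wave vector
    Kx = kx[prop] - k_in[0]
    Ky = ky[prop] - k_in[1]
    Kz = kz - k_in[2]
    # weight from the Fourier diffraction theorem (prefactor sketched)
    vals = (kz / (1j * np.pi)) * Uhat_s[prop]
    # nearest-neighbor gridding into the centered 3D spectrum
    n = F3d.shape[0]
    ix = np.rint(Kx / dk).astype(int) + n // 2
    iy = np.rint(Ky / dk).astype(int) + n // 2
    iz = np.rint(Kz / dk).astype(int) + n // 2
    ok = (ix >= 0) & (ix < n) & (iy >= 0) & (iy < n) & (iz >= 0) & (iz < n)
    np.add.at(F3d, (iz[ok], iy[ok], ix[ok]), vals[ok])
    np.add.at(counts, (iz[ok], iy[ok], ix[ok]), 1)
```

After all angular images are accumulated, averaging `F3d` by `counts` where nonzero and taking the inverse Fourier transform yields the object function.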
Numerical simulations have demonstrated that the Born approximation is valid only when the total phase delay of the E-field induced by the specimen is less than π/2. The thickness of a single biological cell is typically about 10 µm, with an index difference with respect to the medium of about 0.03. The phase delay induced by a typical cell is therefore approximately π at a source wavelength of λ = 633 nm, so one would not expect the Born approximation to be valid for imaging biological cells.
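The π estimate follows directly from the optical path difference:

```latex
\Delta\phi \;=\; \frac{2\pi}{\lambda_0}\,\Delta n\, d
\;=\; \frac{2\pi}{0.633\ \mu\text{m}} \times 0.03 \times 10\ \mu\text{m}
\;\approx\; 2.98\ \text{rad} \;\approx\; \pi ,
```

which is well above the π/2 bound for the Born approximation.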
We note that the Rytov approximation is more appropriate than the Born approximation for imaging biological cells. It is sensitive not to the size of the sample or the total phase delay, but to the gradient of the refractive index. Specifically, the Rytov approximation is valid when the following condition is satisfied:

$$n_\delta \gg \frac{\left(\nabla \phi^{(S)}\right)^2}{2 k_0^2}, \qquad (7)$$
where $\phi^{(S)}$ is the phase of the scattered field and $n_\delta$ is the index variation in the sample over the length scale of a wavelength. This condition asserts that the validity of the Rytov approximation is independent of the specimen size and limited only by the gradient of the scattered phase $\phi^{(S)}$. For a weakly scattering sample such as a biological cell, the phase change $\phi^{(S)}$ is, to a first approximation, linearly proportional to $n_\delta$, so the condition holds when $n_\delta \ll 1$. According to our previous work, the index variation $n_\delta$ is in the range of 0.03–0.04 for biological cells. As shown in Section 4, we obtain a high quality image of a live cell using the Rytov approximation, while the Born approximation leads to significant distortions in the reconstructed image.
As suggested by Devaney, implementing the Rytov approximation within the Fourier diffraction theorem requires a slightly different approach. Following Devaney’s method, we introduce the complex phase $\phi(\mathbf{r})$, defined by $U(\mathbf{r}) = e^{\phi(\mathbf{r})}$, and substitute this into the wave equation (Eq. (1)). After applying the approximation of Eq. (7), we again obtain the Fourier diffraction theorem (Eq. (6)), but with $U^{(S)}$ replaced by $U^{(S)}_{\mathrm{Rytov}}$, defined as

$$U^{(S)}_{\mathrm{Rytov}}(\mathbf{r}) = U^{(I)}(\mathbf{r}) \ln\!\left(\frac{U(\mathbf{r})}{U^{(I)}(\mathbf{r})}\right). \qquad (8)$$
The rest of the reconstruction then is the same as described in the text following Eq. (6).
We used the tomographic phase microscopy instrument (Fig. 1) to record complex E-field images at various angles of illumination, with the sample stationary on the stage of the microscope. Although we used only phase information in reconstructing 3D objects in the previous study, the heterodyne Mach-Zehnder interferometer used in the instrument is capable of recording amplitude images as well as phase images. A He-Ne laser beam of wavelength 633 nm was divided into sample and reference beams. The propagation direction of the sample beam was controlled by a galvanometer mirror, and the image of the transmitted beam was delivered to the camera by an objective and a tube lens. Two acousto-optic modulators were used to shift the frequency of the reference beam by 1.25 kHz, and the frame rate of a CMOS camera (Photron 1024PCI) was adjusted to take images at 200 µs intervals. For each angle of illumination, we recorded four successive interferogram images in 800 µs and used phase-shifting interferometry to produce a pair of quantitative phase and amplitude images. To maximize the range of illumination angles, a high-NA condenser (Nikon, 1.4 NA) and objective lens (Olympus UPLSAPO, 1.4 NA) were used. The sample beam was rotated by the galvanometer mirror to cover −70 to 70 degrees in 0.23 degree steps; recording a complete set of angular complex E-field images takes about 10 seconds.
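The four-frame phase-shifting step can be illustrated with a generic 4-step reconstruction (reference-phase steps of 0, π/2, π, 3π/2). This is a sketch rather than the instrument's actual processing code, and `ref_amp` (the reference-beam amplitude) is a hypothetical parameter:

```python
import numpy as np

def field_from_interferograms(I1, I2, I3, I4, ref_amp=1.0):
    """Recover the complex sample field from four interferograms taken with
    reference-phase steps of 0, pi/2, pi, 3*pi/2 (standard 4-step
    phase-shifting interferometry), where
    I_k = |R|^2 + |S|^2 + 2|R||S| cos(phi - delta_k)."""
    num_sin = np.asarray(I2, float) - np.asarray(I4, float)   # 2B sin(phi)
    num_cos = np.asarray(I1, float) - np.asarray(I3, float)   # 2B cos(phi)
    phase = np.arctan2(num_sin, num_cos)
    B = 0.5 * np.hypot(num_sin, num_cos)      # B = 2 |R||S|
    amp = B / (2.0 * ref_amp)                 # sample amplitude |S|
    return amp * np.exp(1j * phase)
```

Each pixel of the four frames yields one complex value, giving the quantitative amplitude and phase images simultaneously.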
Using the set of phase and amplitude images taken at various angles of illumination, we applied the diffraction tomography algorithm described in Section 2. Given a quantitative phase image $\phi(x, y; \theta)$ and an amplitude image $A(x, y; \theta)$ taken at each illumination angle $\theta$, we can reconstruct the total E-field, $U(x, y; \theta) = A(x, y; \theta) e^{i\phi(x, y; \theta)}$, at the image plane. The measured field image is composed of the phase change induced by the sample and the phase ramp introduced by the tilted illumination. A corresponding set of images $U_{bg}(x, y; \theta) = A_{bg}(x, y; \theta)\, e^{i k_{x0} x + i k_{y0} y}$, taken with no sample present, provides the background field, which can be considered the incident field. Figure 2(a) shows the phase image $\phi(x, y; \theta = 0)$ of a 6 µm polystyrene bead (Polysciences, Inc.) taken at zero incidence angle. The refractive index of the beads according to manufacturer specifications is 1.585 at 633 nm wavelength, and this value was used to validate the experimental measurements. Figure 2(b) shows the typical amplitude image of $\hat{U}(k_x, k_y; \theta)$ on a logarithmic scale.
To apply the Fourier diffraction theorem (Eq. (6)), we convert $(k_x, k_y)$ on the right hand side into $(K_x, K_y)$ as follows:

$$(K_x, K_y) = (k_x - k_{x0},\; k_y - k_{y0}). \qquad (9)$$
We first calculate $\hat{U}^{(S)}(k_x, k_y; \theta)$ (Eq. (2)) or $\hat{U}^{(S)}_{\mathrm{Rytov}}(k_x, k_y; \theta)$ (Eq. (8)) from the measured complex fields. We then shift them by $(-k_{x0}, -k_{y0})$ in spatial frequency space, following the right hand side of Eq. (9). In mapping the experimental data, we divide them by the incident field $U_{bg}(x, y; \theta)$, which is equivalent to shifting them in Fourier space. In other words, the scattered field used in the Fourier diffraction theorem in the experiment is as follows. In the case of the first Born approximation:

$$U^{(S)}_{\mathrm{Born}}(x, y; \theta) = \frac{U(x, y; \theta) - U_{bg}(x, y; \theta)}{U_{bg}(x, y; \theta)}. \qquad (10)$$
In the case of the first Rytov approximation:

$$U^{(S)}_{\mathrm{Rytov}}(x, y; \theta) = \ln\!\left(\frac{U(x, y; \theta)}{U_{bg}(x, y; \theta)}\right). \qquad (11)$$
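The Born and Rytov scattered-field inputs above differ only in how the normalized field ratio $U/U_{bg}$ is handled, which a short sketch makes explicit. The function name is hypothetical, and the `unwrap` calls are a simple row/column phase unwrap; noisy experimental phase maps generally need a more robust 2D unwrapping algorithm:

```python
import numpy as np

def scattered_field(U, U_bg, approx="rytov"):
    """Normalized scattered field fed to the Fourier diffraction theorem,
    after division by the background (incident) field, i.e. with the
    illumination phase ramp already removed.
    Born : (U - U_bg) / U_bg
    Rytov: ln(U / U_bg), with the phase unwrapped"""
    ratio = U / U_bg
    if approx == "born":
        return ratio - 1.0
    # Rytov: principal log-amplitude plus row/column phase unwrapping
    phase = np.unwrap(np.unwrap(np.angle(ratio), axis=0), axis=1)
    return np.log(np.abs(ratio)) + 1j * phase
```

For a very weakly scattering pixel the two expressions agree to first order; they diverge as the sample-induced phase grows, which is exactly where the Rytov form retains its validity.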
Figures 2(c)–(d) show the results of this mapping on the $(K_x, K_y, K_z = 0)$ and $(K_x, K_y = 0, K_z)$ planes, respectively. The data along the blue line in Fig. 2(b) is mapped onto the blue half-circle in the $(K_x, K_z)$ space of Fig. 2(d). Different angular images are mapped onto different regions, eventually covering a significant portion of the $(K_x, K_y, K_z)$ space of the object function $F(\mathbf{r})$. In the frequency spectra of Figs. 2(c)–(d), ring patterns are clearly visible after mapping the various angular images, as expected for the spherical shape of the sample. By taking the inverse Fourier transform of the entire 3D frequency spectrum, we obtain the 3D distributions of the refractive index and the absorption coefficient of the object.
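This final inversion step, from the mapped spectrum back to the index map, follows directly from the definition of the object function. A sketch with hypothetical conventions (the spectrum is assumed to be fftshift-centered):

```python
import numpy as np

def index_from_object_function(F3d_hat, n_m, lambda0):
    """Invert the mapped 3D spectrum of the object function and convert it
    to a complex refractive-index map, using the definition
    F(r) = -(2*pi*n_m/lambda0)^2 * ((n(r)/n_m)^2 - 1)."""
    km = 2 * np.pi * n_m / lambda0        # wave number in the medium
    F = np.fft.ifftn(np.fft.ifftshift(F3d_hat))
    # solve the definition for n(r); the complex sqrt keeps absorption
    return n_m * np.sqrt(1.0 - F / km**2)
```

The real part of the result is the refractive index map; the imaginary part carries the absorption.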
With our current configuration, we could obtain images with illumination angles of up to ±70 degrees, as measured via the spatial frequency of the fringe pattern at the image plane. As a result, we could not fill the entire frequency space, as can be seen in Figs. 2(c)–(d); in other words, the inverse problem is underdetermined. In our first round of reconstruction, we set the missing angle space to zero. The reconstructed object function then exhibits a negative bias around the sample, and the refractive index is smaller than the actual value (Fig. 3(a)).
To minimize the artifacts introduced by the missing angles, we applied an iterative constraint algorithm [17, 18] based on the prior knowledge that, for live cells, the refractive index throughout the field of view, either inside or outside of the cell, is equal to or higher than that of the medium. We first take the inverse Fourier transform of the originally mapped data, with zeros in the missing angle space (Fig. 3(d)). The reconstructed image contains pixels whose index values are smaller than the index of the medium (Fig. 3(a)). We force these to the index of the medium and take the Fourier transform. The values in the region of Fourier space that was zero-filled are no longer zero, so we obtain an approximate solution for the missing angles (Fig. 3(e)). At the same time, however, the data in the measured region is modified; since the experimentally measured data is accurate, we replace the modified data with the measured data. We iterate this procedure until the reconstructed object function converges (Figs. 3(c)–(f)). As a result, the negative bias is removed and the reconstructed image becomes more accurate. For the polystyrene bead, we estimate the accuracy of the measured refractive index to be close to 0.001 after application of the iterative constraint algorithm. Comparing the Fourier maps before (Fig. 3(d)) and after (Fig. 3(f)) iteration, ring patterns appear in the missing angle regions (Fig. 3(f)), indicating that the iterative constraint algorithm generates reasonably accurate solutions for the missing angle regions.
To test the performance of the 3D reconstruction methods, we first obtained two sets of angular E-field images of 6 µm polystyrene beads (Polysciences) immersed in oil (Cargille, n = 1.56) at two different foci: one in the middle of the bead and the other 4 µm above the center. When we applied the filtered back-projection algorithm based on the projection approximation, the slice image at the middle of the bead was uniform when the objective focus was set to the middle of the bead (Fig. 4(a)). However, when the focus was above the center, the slice image through the middle of the bead exhibited ring patterns (Fig. 4(b)) due to diffraction of the propagating beam.
We then applied diffraction tomography based on the Rytov approximation. The resulting slice images of the tomograms are shown in Figs. 4(c) and (d), with the objective focus at the middle of the bead and 4 µm above the center, respectively. Both images show clear boundaries of the bead with uniform index distributions, indicating that diffraction tomography properly accounts for the effects of diffraction. Note that the index of the bead relative to that of the oil was set to 0.03, very close to the relative index of a cell with respect to its culture medium; hence, we expect the Rytov approximation to be applicable to the imaging of single cells. Based on the analysis of the slope at the edge of the bead, the spatial resolution of the reconstructed image was estimated to be 0.35 µm in the transverse direction and 0.7 µm in the axial direction.
Next, we compared the performance of the Rytov and Born approximations. Figures 5(a) and (c) show the slice images of a 6 µm bead reconstructed with the Born and Rytov approximations, respectively. The index in the middle of the bead is lower for the Born approximation, while it is relatively uniform for the Rytov approximation. For a 10 µm bead, the distortion inside the bead is even more pronounced for the Born approximation (Fig. 5(b)), whereas the refractive index in the slice image from the Rytov approximation is still uniform. This demonstrates that the validity of the Born approximation is highly dependent on the size of the object; it is not suitable for the 10 µm polystyrene bead in oil, whose phase delay is close to 3 radians. This suggests that the Born approximation cannot be used for imaging a biological cell, which typically induces a phase delay similar to that of a 10 µm bead. The Rytov approximation, on the other hand, is less affected by the object size and remains valid for an index difference of 0.03, and is therefore appropriate for imaging biological cells.
We imaged live HT29 cells, a human colon adenocarcinoma cell line. Cells were prepared in an imaging chamber specially designed for live-cell imaging, composed of two coverslips separated by a 125 µm thick plastic spacer. Cells were incubated at 37 °C for 12 hours before the measurements so that they became fully attached to the coverslip surface. For a fixed objective focus, we took a set of angular complex E-field images and applied both the filtered back-projection algorithm and diffraction tomography based on the Rytov approximation.
Figures 6(e)–(g) are x-y slices of the tomogram processed by the projection algorithm. Figures 6(i)–(k) are slice images of a tomogram reconstructed by diffraction tomography based on the Rytov approximation, corresponding to Figs. 6(e)–(g), respectively. Figures 6(f) and (j) are the slices at the objective focus plane, Figs. 6(e) and (i) are slice images 1.7 µm above the focus, and Figs. 6(g) and (k) 2.9 µm below the focus. The bright field images (Figs. 6(a)–(c)) were taken for comparison by moving the objective lens to the same heights as Figs. 6(i)–(k), respectively. The structures in both refractive index tomograms are clearly well matched to the bright field images. However, comparing the details of the tomogram slices 2.9 µm below the focus reveals the difference between the tomograms. Figures 6(d), (h) and (l) are magnified views of the white rectangles in Figs. 6(c), (g) and (k), respectively. Compared with the bright field image, the tomogram processed with the filtered back-projection algorithm shows ring patterns for the particles in the cytoplasm, and the two holes in the nucleolus are blurred. In contrast, in the tomogram processed by diffraction tomography, the two holes in the nucleolus are clearly resolved and the particles in the cytoplasm are well in focus (Fig. 6(l)). This demonstrates that the Rytov approximation is valid for taking the effect of diffraction into account in reconstructing 3D refractive index maps of live biological cells. As a result, we could clearly image the details of the 3D structure of a single live cell throughout its entire volume, as well as quantify the refractive index of subcellular organelles. The observation that the index of the nucleus, other than the nucleolus, is no higher than the average index of the cytoplasm is consistent with our previous report.
To our knowledge, this is the first successful implementation of optical diffraction tomography providing quantitative 3D refractive index maps of live biological cells. The first-order Rytov approximation enabled accurate imaging of biological cells, whereas the first-order Born approximation caused distortion in the reconstructed images. The iterative constraint algorithm helped to reduce the effect of the missing angles, but the prior knowledge employed, a non-negativity constraint, is rather weak. With a better constraint, such as a support constraint using the cell boundary, the accuracy of the reconstruction can be further improved, especially in the axial direction. Theoretically, the spatial resolution can be up to twice better than the diffraction limit due to the Ewald sphere mapping, but in the experiment we obtained a spatial resolution only close to the diffraction limit. There are several possible reasons for this discrepancy, such as the weak contrast of the specimen and the limited sensitivity of the instrument at high scattering angles; further study is under way to achieve the theoretically predicted spatial resolution. For biological and biomedical applications, quantitative index maps can provide molecular concentrations without the need for fluorescence agents. Further, since the refractive index is an intrinsic quantity, the dynamics of molecules can be studied without artifacts such as photobleaching. Refractive index maps can also help clarify how individual organelles in single live cells contribute to light scattering, and thus aid the design of in-vivo light scattering instruments for disease diagnosis.
This work was funded by the National Center for Research Resources of the National Institutes of Health (P41-RR02594-18), the National Science Foundation (DBI-0754339) and Hamamatsu Corporation. Yongjin Sung was supported in part by a fellowship from the Kwanjeong Educational Foundation.
OCIS codes: (120.3180) Interferometry; (180.0180) Microscopy; (170.3880) Medical and biological imaging.