
Conf Proc IEEE Eng Med Biol Soc. Author manuscript; available in PMC 2011 February 24.


PMCID: PMC3044606

NIHMSID: NIHMS271748


Abstract

Multiphoton calcium fluorescence imaging has gained prominence as a valuable tool for the study of brain cells, but the corresponding analytical regimes remain rather naive. In this paper, we develop a statistical framework that facilitates principled quantitative analysis of multiphoton images. The proposed methods discriminate the stimulus-evoked response of a neuron from the background firing and image artifacts. We develop a harmonic regression model with colored noise, and estimate the model parameters with computationally efficient algorithms. We apply this model to in vivo characterization of cells from the ferret visual cortex. The results demonstrate substantially improved tuning curve fitting and image contrast.

I. INTRODUCTION

Multiphoton fluorescence imaging (MFI) has established itself over the last decade as a valuable tool for real-time *in vivo* imaging of biological systems [1]. It is the only technique that allows the activity of a large population of neurons to be recorded simultaneously with subcellular resolution. A multiphoton microscope excites fluorophores in a biological sample using pulsed lasers, which leads to the emission of a fluorescence signal. A focused laser beam is scanned in a raster pattern over a 2-D or 3-D region, producing an image typically spanning hundreds of cells. Highly informative and quantitative analyses of a range of biological systems can thus be obtained from MFI data.

Among other applications, MFI has been used for the characterization of brain structure and function [2]. Its ability to scan a large cell population enables us to map neuronal and astrocytic network architectures [3], [4]. Its high spatial resolution can be used to study subcellular structures on the scale of dendritic spines [5], while its high temporal resolution allows the analysis of calcium waves and other cell and network dynamics [6]. With some post-processing, we can also obtain neuronal firing rate estimates comparable to those from electrophysiological recordings [7], [8].

The methods currently used to analyze the time series data generated by these calcium imaging experiments, however, remain rudimentary. Given the complexity of MFI data, sophisticated signal processing and statistical modeling techniques are required to extract the maximum information from it. A number of artifacts that significantly distort the data must also be modeled so that the true cell response can be recovered.

In this paper, we develop a novel statistical signal processing framework for MFI data analysis, consisting of a harmonic signal model with colored noise, similar to [9], and numerically efficient algorithms to estimate its parameters. We demonstrate the proposed methods on MFI data obtained from the ferret visual cortex, as described below.

II. EXPERIMENTAL DESIGN

All experimental procedures were approved by the MIT Committee on Animal Care and adhered to the NIH guidelines for the Care and Use of Laboratory Animals.

Multiphoton imaging of the fluorescent calcium indicator Oregon Green BAPTA (OGB) was performed in the visual cortex of anesthetized ferrets, *in vivo*. Neurons were bulk-loaded with OGB by intracortical injection of the AM-ester conjugated form of OGB using standard techniques [3], [4], [10]. Imaging was performed with a custom-made multiphoton laser scanning microscope consisting of a modified Olympus Fluoview confocal scan head and a titanium:sapphire laser providing approximately 100 fs pulses at 80 MHz, pumped by a 10 W solid-state source [11]. Fluorescence was detected using photomultiplier tubes in whole-field detection mode. A 20×, 0.95 NA lens was used. Image acquisition was accomplished using Fluoview software. Time series traces of images (XYT) with a field of view of approximately 250 × 250 *μ*m were collected at 1 Hz. The images were taken from cortical layer 2/3, which was readily distinguished from layer 1 on the basis of the relative density of astrocytes and neurons.

Visual stimuli were delivered via a 17″ LCD display placed 0.15 m from the eyes of the animal. The stimuli were generated in Matlab using the Psychophysics Toolbox [12]. The stimulation protocol consisted of square-wave gratings with 100% contrast that drifted at 3 Hz and rotated by 10° every second (i.e., every data frame). Thus the stimulus rotates through 360° in 36 s, and the time series of a neuron's response to this stimulus approximates a full orientation tuning curve. The stimulus was repeated three times to enhance the statistical reliability of the observations. Prior to recording these responses, 10 image frames were acquired in the absence of any visual stimulus to establish the baseline response level.

Image files collected by MFI were imported into Matlab and analyzed with custom routines. The cell bodies were identified by inspection and outlined manually. To avoid spillover from the surrounding neuropil, which presumably contains indistinguishable processes of both astrocytes and neurons, conservative boundaries were defined. The relative fluorescence at frame k, ΔF_k = (F_k − F_0)/F_0, was then computed, where F_0 is the baseline fluorescence estimated from the pre-stimulus frames.
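The ΔF/F computation above can be sketched as follows. The original analysis used custom Matlab routines; this is a minimal NumPy illustration, assuming the baseline F_0 is the mean of the first 10 (pre-stimulus) frames as described in the experimental design. The function name `delta_f` is ours, not from the paper.

```python
import numpy as np

def delta_f(traces, n_baseline=10):
    """Relative fluorescence dF_k = (F_k - F0) / F0, with the baseline F0
    taken as the mean over the pre-stimulus frames (10 in this experiment).
    `traces` may be a single time series or an array with time on the
    last axis (e.g., pixels x frames)."""
    traces = np.asarray(traces, dtype=float)
    f0 = traces[..., :n_baseline].mean(axis=-1, keepdims=True)
    return (traces - f0) / f0
```

Because the baseline is computed per pixel (per leading index), the same call handles a single cell trace or a whole image stack.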

III. MODELING AND ESTIMATION

Conventional approaches to modeling MFI data consist of averaging the measured fluorescence levels at an image pixel over multiple trials and smoothing across 2-D space and time. We adopt a generalized approach based on a Fourier series expansion to model the tuning curves. Our goal is to reliably separate the stimulus-dependent neuronal response from background activity, noise, and other artifacts in the MFI time series. It is therefore desirable to decompose the response data into a deterministic stimulus-evoked component and a stochastic stimulus-free component, i.e.,

y_k = s_k + v_k,   k = 1, . . . , K,   (1)

where y_k is the observed fluorescence at frame k, s_k is the stimulus-evoked response, and v_k is the noise.

Given the experimental conditions, we denote the phase of the periodic orientation stimulus at frame k by ϕ_k = 2πk/P, where P = 36 frames is the stimulus period. The stimulus-evoked response is then modeled by the harmonic regression

s_k = μ + Σ_{h=1}^{H} [ a_h cos(hϕ_k) + b_h sin(hϕ_k) ],   (2)

where H denotes the number of harmonics included in the model, μ is the intercept, and a_h and b_h are the coefficients of the h-th harmonic.

The stochastic component, v_k, is a mixture of several noise processes. An autoregressive process of order p, AR(p), given by

v_k = Σ_{m=1}^{p} c_m v_{k−m} + ε_k,   (3)

can well approximate such a colored noise process. Here, c_m is the m-th AR coefficient and ε_k is zero-mean white Gaussian noise with variance σ_ε².

We can express this formulation in matrix notation as

y = Xθ + v,   (4)

where y = [y_1, . . . , y_K]^T is the observation vector, X is the K × (2H + 1) design matrix whose k-th row is [1, cos ϕ_k, sin ϕ_k, . . . , cos Hϕ_k, sin Hϕ_k], θ = [μ, a_1, b_1, . . . , a_H, b_H]^T is the regression coefficient vector, and v = [v_1, . . . , v_K]^T.

The above formulation can be used to obtain best-fit estimates of the model parameters. We use ordinary least squares (OLS) estimation to obtain the estimates of the regression coefficients, given by [13]

θ̂ = (X^T X)^{−1} X^T y.   (5)
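The design matrix of (4) and the OLS fit of (5) can be sketched in a few lines. This is an illustrative NumPy version, not the paper's Matlab code; the function names `harmonic_design` and `ols_fit` are ours, and we use `lstsq` rather than forming (X^T X)^{−1} explicitly for numerical stability.

```python
import numpy as np

def harmonic_design(K, period, H):
    """K x (2H+1) design matrix X for the harmonic regression (2):
    a column of ones for the intercept mu, then cos/sin pairs at
    harmonics 1..H of the stimulus phase phi_k = 2*pi*k/period."""
    k = np.arange(1, K + 1)
    phi = 2 * np.pi * k / period
    cols = [np.ones(K)]
    for h in range(1, H + 1):
        cols += [np.cos(h * phi), np.sin(h * phi)]
    return np.column_stack(cols)

def ols_fit(y, X):
    """OLS estimate (5), theta_hat = (X'X)^{-1} X'y, computed via a
    least-squares solver; returns the coefficients and the fitted
    stimulus-evoked response s_hat = X theta_hat."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta, X @ theta
```

For example, with K = 108 frames (three 36-frame stimulus cycles) and a response y_k = 1 + 2 cos ϕ_k, the fit recovers μ = 1 and a_1 = 2 exactly, since the harmonic columns are orthogonal over complete cycles.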

The confidence intervals at a given confidence level are easily obtained for these estimates. For the i-th coefficient, θ_i, approximate 100(1 − α)% confidence intervals are

θ̂_i ± t_γ d_{θ_i},   (6)

where t_γ is the γ = 1 − α/2 quantile of the t distribution with K − 2H − 1 degrees of freedom, and

d_{θ_i} = ( σ̂² [(X^T X)^{−1}]_{ii} )^{1/2}   (7)

is the standard error of θ̂_i, with σ̂² the estimated variance of the OLS residuals.

Based on these intervals, we can design the t-test for the significance of θ_i by defining the null and alternative hypotheses

H_0 : θ_i = 0,   H_1 : θ_i ≠ 0,   (8)

and rejecting H_0 at level α if |θ̂_i / d_{θ_i}| > t_γ. Thus θ_i makes a significant marginal contribution to the model if the null hypothesis is rejected.

Using these OLS coefficient estimates, we obtain the estimated stimulus-evoked response as ŝ = Xθ̂. Its approximate 100(1 − α)% confidence intervals are given by

ŝ_k ± t_γ d_{ŝ_k},   (9)

where

d_{ŝ_k} = ( σ̂² x_k^T (X^T X)^{−1} x_k )^{1/2}   (10)

and x_k^T is the k-th row of X.

The residuals from the above OLS estimation procedure, v̂_k = y_k − ŝ_k, yield the AR(p) process

v̂_k = Σ_{m=1}^{p} c_m v̂_{k−m} + ε_k.   (11)

The AR coefficient estimates, ψ̂, of ψ = [c_1, . . . , c_p]^T can be obtained by applying any of a number of well-known techniques, and the variance estimate, σ̂_ε², of the residual white noise, ε_k, can be obtained similarly. The Burg algorithm, which provides least squares estimates of ψ by applying the Levinson-Durbin recursion, offers an efficient procedure. The approximate confidence intervals for the AR coefficients are

ψ̂_i ± t_γ d_{ψ_i},   (12)

where

d_{ψ_i} = ( σ̂_ε² [(V^T V)^{−1}]_{ii} )^{1/2}   (13)

is the standard error of the i-th AR coefficient estimate, with V the (K − p) × p matrix of lagged residuals whose rows are [v̂_{k−1}, . . . , v̂_{k−p}], and

σ̂_ε² = (1/(K − p)) Σ_{k=p+1}^{K} ε̂_k²   (14)

is the estimated white noise variance.
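A compact Burg implementation, for illustration, can be written as below. This is a generic textbook version of the algorithm (the paper does not give code); the sign convention is chosen to match (11), so the returned coefficients satisfy v̂_k ≈ Σ_m ψ_m v̂_{k−m}.

```python
import numpy as np

def burg(x, p):
    """Burg estimates of the AR(p) coefficients psi and the white-noise
    variance, via the Levinson-Durbin recursion on forward/backward
    prediction errors. Returns (psi_hat, sigma_eps_sq)."""
    x = np.asarray(x, dtype=float)
    f = x.copy()              # forward prediction errors
    b = x.copy()              # backward prediction errors
    a = np.zeros(p + 1)       # prediction-error filter A(z), a[0] = 1
    a[0] = 1.0
    e = np.mean(x ** 2)       # zero-lag prediction error power
    for m in range(1, p + 1):
        fm, bm = f[1:], b[:-1]
        # reflection coefficient minimizing fwd+bwd error power
        k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        # Levinson-Durbin update of the filter coefficients
        a[:m + 1] = a[:m + 1] + k * a[:m + 1][::-1]
        f, b = fm + k * bm, bm + k * fm
        e *= (1.0 - k * k)
    # A(z) = 1 - psi_1 z^{-1} - ... - psi_p z^{-p}, hence the sign flip
    return -a[1:], e
```

On a long simulated AR(1) series with coefficient 0.5 and unit-variance innovations, the returned estimates are close to 0.5 and 1, respectively.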

The corresponding t-test for the significance of the i-th AR coefficient, ψ_i, rejects the null hypothesis H_0 : ψ_i = 0 at level α if

|ψ̂_i / d_{ψ_i}| > t_γ.   (15)

The whiteness of the AR residuals, ε̂_k, can be tested using the Ljung-Box portmanteau test, for which the test statistic is

Q = K(K + 2) Σ_{τ=1}^{L} r_τ²(ε̂) / (K − τ),   (16)

where

r_τ(ε̂) = Σ_{k=τ+1}^{K} ε̂_k ε̂_{k−τ} / Σ_{k=1}^{K} ε̂_k²   (17)

is the sample autocorrelation of the residuals at lag τ, and L is the number of lags tested. The null hypothesis for the whiteness test is that the ε̂_k are uncorrelated, under which Q is approximately χ² distributed with L − p degrees of freedom.
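The Q statistic of (16)-(17) is straightforward to compute directly. The sketch below (our own helper, `ljung_box`) returns Q and the sample autocorrelations; comparing Q against the appropriate χ² quantile is left to the caller.

```python
import numpy as np

def ljung_box(resid, lags=30):
    """Ljung-Box Q statistic (16): Q = K(K+2) * sum_{tau=1}^{L} r_tau^2 / (K-tau),
    with r_tau the sample autocorrelation (17) of the (demeaned) residuals.
    Returns (Q, r), where r holds r_1..r_L."""
    e = np.asarray(resid, dtype=float)
    e = e - e.mean()
    K = e.size
    denom = np.dot(e, e)
    r = np.array([np.dot(e[tau:], e[:-tau]) / denom
                  for tau in range(1, lags + 1)])
    Q = K * (K + 2) * np.sum(r ** 2 / (K - np.arange(1, lags + 1)))
    return Q, r
```

As a sanity check, a white-noise series yields a small Q (of the order of L), while a strongly autocorrelated series such as a random walk yields a much larger Q.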

Thus, the estimates of the 2H + p + 2 model coefficients, {θ̂, ψ̂, σ̂_ε²}, are obtained. The stimulus-evoked response estimate, ŝ_k, can now be used to reconstruct denoised images.

IV. RESULTS

In this section, we apply the techniques developed in the previous section to the MFI data. The above model is fit to the relative fluorescence time series, y_k = ΔF_k, at each pixel.

As a representative example, Fig. 1(a) illustrates that the model provides a good fit, ŷ_k, to the measured relative fluorescence, ΔF_k.

Fig. 1. Modeling the calcium fluorescence time series at an image pixel, relative to the baseline fluorescence obtained from measurements in the absence of the stimulus. (a) The recorded fluorescence, ΔF_k, over three full-cycle repetitions of the stimulus.

The whiteness analysis of the AR residuals, ε̂_k, is important to ensure that they are i.i.d. and normal, which would indicate that the model is appropriate. Thus, in Fig. 2, we compare the quantiles of the residuals to the normal distribution, and find only minor deviations at the tails. The sample autocorrelation function, r_τ(ε̂), substantially resembles that of a white noise process within the 95% bounds given by ±1.96/√K. In addition, the residuals pass the Ljung-Box test.

Fig. 2. Analysis of the statistical characteristics of the residual noise, ε̂_k, to determine its whiteness and normality. The autocorrelation function of the residuals is shown along with the 95% confidence intervals for up to 30 lags. The inset shows the comparison of the residual quantiles with the normal distribution.

It is also now straightforward to reconstruct cellular or population images from ŝ_k for each pixel. In Fig. 3, we compare the ΔF image obtained from the raw time series with the image reconstructed from the estimated stimulus-evoked response, which exhibits substantially improved contrast.

V. CONCLUSION

We have presented a framework to model the time series data obtained from high-resolution multiphoton imaging of live brain cells. The statistical model and algorithms proposed in this paper for the analysis of calcium imaging data are simple and efficient, yet powerful and flexible. By separating the signal and noise components in the time series at each pixel, our approach facilitates substantially improved characterization of neuronal response properties and denoising of cellular and population images. The framework can be easily extended to a broad class of imaging regimes, with applications in biomedical systems and other areas of engineering.

This work was supported by NIH Grants DP1 OD003646 and EY07023.

Wasim Q. Malik, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139. He is also with the Massachusetts General Hospital, Harvard Medical School, Boston, MA 02114. Email: wqm@mit.edu.

James Schummers, Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139. Email: schummej@mit.edu.

Mriganka Sur, Picower Institute for Learning and Memory, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139. Email: msur@mit.edu.

Emery N. Brown, Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139. He is also with the Massachusetts General Hospital, Harvard Medical School, Boston, MA 02114. Email: enbrown1@mit.edu.

References

1. Denk W, Svoboda K. Photon upmanship: why multiphoton imaging is more than a gimmick. Neuron. 1997;18(3):351–357.

2. Kerr JND, Denk W. Imaging in vivo: watching the brain in action. Nat. Rev. Neurosci. 2008;9(3):195–205.

3. Schummers J, Yu H, Sur M. Tuned responses of astrocytes and their influence on hemodynamic signals in the visual cortex. Science. 2008;320(5883):1638–1643.

4. Ohki K, Chung S, Ch'ng YH, Kara P, Reid RC. Functional imaging with cellular resolution reveals precise micro-architecture in visual cortex. Nature. 2005;433(7026):597–603.

5. Majewska A, Sur M. Motility of dendritic spines in visual cortex in vivo: changes during the critical period and effects of visual deprivation. Proc. Natl. Acad. Sci. USA. 2003;100(26):16024–16029.

6. Göbel W, Kampa BM, Helmchen F. Imaging cellular network dynamics in three dimensions using fast 3D laser scanning. Nat. Methods. 2007;4(1):73–79.

7. Yaksi E, Friedrich RW. Reconstruction of firing rate changes across neuronal populations by temporally deconvolved Ca²⁺ imaging. Nat. Methods. 2006;3(5):377–383.

8. Greenberg DS, Houweling AR, Kerr JND. Population imaging of ongoing neuronal activity in the visual cortex of awake rats. Nat. Neurosci. 2008;11(7):749–751.

9. Brown EN, Choe Y, Luithardt H, Czeisler CA. A statistical model of the human core-temperature circadian rhythm. Am. J. Physiol. Endocrinol. Metab. 2000;279(3):E669–E683.

10. Stosiek C, Garaschuk O, Holthoff K, Konnerth A. In vivo two-photon calcium imaging of neuronal networks. Proc. Natl. Acad. Sci. USA. 2003;100(12):7319–7324.

11. Majewska A, Yiu G, Yuste R. A custom-made two-photon microscope and deconvolution system. Pflügers Arch. Eur. J. Physiol. 2000;441(2-3):398–408.

12. Brainard DH. The Psychophysics Toolbox. Spatial Vision. 1997;10(4):433–436.

13. Montgomery DC, Jennings CL, Kulahci M. Introduction to Time Series Analysis and Forecasting. Wiley; 2008.
