
IEEE Trans Signal Process. Author manuscript; available in PMC 2010 July 14.

Published in final edited form as:

IEEE Trans Signal Process. 2010 April 1; 58(4): 2409–2414.

doi: 10.1109/TSP.2010.2040411 | PMCID: PMC2903884

NIHMSID: NIHMS184519

Wei Xiong, Student Member, IEEE,^{†} Tülay Adalı, Fellow, IEEE,^{†} Yi-Ou Li, Student Member, IEEE,^{†} Hualiang Li,^{†} and Vince D. Calhoun, Senior Member, IEEE^{‡}


We derive the entropy rate formula for a complex Gaussian random process by using a widely linear model. The resulting expression is general and applicable to both circular and noncircular Gaussian processes, since any second-order stationary process can be modeled as the output of a widely linear system driven by a circular white noise. Furthermore, we demonstrate application of the derived formula to an order selection problem. We extend a scheme for independent and identically distributed (i.i.d.) sampling to the complex domain to improve the estimation performance of information-theoretic criteria when samples are correlated. We show the effectiveness of the approach for order selection for simulated and actual functional magnetic resonance imaging (fMRI) data that are inherently complex valued.

Entropy rate is a measure of the average information carried by each sample in a random sequence, and is a useful measure with applications in spectral estimation, coding, and biomedical signal processing among others [1], [2]. For example, it has been used to quantify heart variability for anomaly detection [3] and to determine independent samples in functional magnetic resonance data [4].

The entropy rate formula for a real-valued stationary Gaussian random process was developed by Kolmogorov [2], and a practical derivation using a linear model is given by Papoulis in [1]. Since complex-valued signals typically arise in many signal processing problems, such as communications and medical imaging, it is desirable to extend the formula to the complex domain. One such extension is possible by assuming a Gaussian process with uncorrelated real and imaginary parts [5]. However, this is a limiting assumption, as it ignores possible correlation between the real and imaginary parts. As has been noted recently, many applications, such as communications and medical imaging, involve noncircular signals, *i.e.*, those with correlation between the real and imaginary parts of the signal [6]. Thus, the usefulness of a formula that ignores such correlation is, in general, limited.

In this paper, we first derive the general formula for the entropy rate of a general complex-valued Gaussian process by using a widely linear model [7]. Since any second-order stationary process can be modeled as the output of a widely linear system driven by a circular white noise, the resulting expression is general and takes the possible correlation in the real and imaginary parts into account. The expression reduces to the entropy rate of a real-valued Gaussian random process for signals with independent real and imaginary parts.

Next, we demonstrate the application of the complex entropy rate to an order selection problem and extend an independent sampling scheme developed for the real domain [4] to the complex domain. The scheme improves the performance of order selection with information-theoretic criteria when the samples are correlated. Entropy rate is used as the measure of independence such that the samples used in the order selection formulas are approximately independent.

Such an order selection scheme is useful in a number of scenarios, one of which is the analysis of functional magnetic resonance imaging (fMRI) data. The fMRI data are acquired as complex-valued spatio-temporal data, and order selection is a typical step prior to the use of analysis techniques such as independent component analysis (ICA) and principal component analysis (PCA). FMRI voxels typically exhibit spatial correlation, hence such a sampling scheme significantly improves the accuracy of order selection for the analysis of fMRI data in its native complex-valued form.

In the next section, we introduce the basic statistical relationships for complex random processes and derive the entropy rate formula for a complex Gaussian random process. In Section III, we present the application of entropy rate to i.i.d. sampling for order selection in the complex domain. We present experimental results of the order selection scheme using i.i.d. sampling, on simulated and actual fMRI data, in Section IV, and provide a discussion of the results in the final section.

A complex random variable *Z* is written as *Z* = *Z _{r}* + *jZ _{i}*, where *Z _{r}* and *Z _{i}* are real-valued random variables. The differential entropy of a complex random vector **Z** is defined as the joint entropy of its real and imaginary parts,

$$H(\mathbf{Z})\triangleq H({\mathbf{Z}}_{r},{\mathbf{Z}}_{i})=-E\{logp({\mathbf{Z}}_{r},{\mathbf{Z}}_{i})\}.$$

A complex random process *Z _{k}* is written as a sequence of complex random variables indexed by *k*. For a second-order stationary (SOS) process, the *covariance function R*(*m*) = *E*{*Z _{k+m}Z _{k}*^{*}} and the *pseudo covariance function R̃*(*m*) = *E*{*Z _{k+m}Z _{k}*} depend only on the lag *m*.

The Fourier transform of the covariance function yields the *power spectrum* (or *spectral density*) *function S*(*ω*). Similarly, we define the Fourier transform of the pseudo covariance function as the *pseudo power spectrum function S̃*(*ω*). The necessary conditions for a pair of functions *S*(*ω*) and *S̃*(*ω*) to be the power spectrum function and pseudo power spectrum function of a complex SOS random process are given by [9], [10]:

$$S(\omega )\ge 0,\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\stackrel{\sim}{S}(\omega )=\stackrel{\sim}{S}(-\omega ),\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\text{and}\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\phantom{\rule{0.38889em}{0ex}}\mid \stackrel{\sim}{S}(\omega ){\mid}^{2}\le S(\omega )S(-\omega ).$$

(1)

A stationary random process is said to be *white* if its covariance function satisfies *R*(*m*) = *R*(0)*δ*(*m*), where *δ*(·) is the Kronecker delta function, while there is no restriction on the pseudo covariance function. A *doubly white* [9] process has restrictions on both the covariance function and the pseudo covariance function, *i.e.*, both *R*(*m*) = *R*(0)*δ*(*m*) and *R̃*(*m*) = *R̃*(0)*δ*(*m*) must be satisfied. The value of *R̃*(0) is in general complex valued; when the real and imaginary parts are uncorrelated, however, it assumes a real value. Furthermore, when the pseudo covariance function vanishes, *i.e.*, *R̃*(*m*) = 0, the process is a *circular white* process.
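These definitions can be checked numerically. The following sketch is ours, not from the paper; the sample size and the variances of the noncircular example are arbitrary. It estimates the covariance and pseudo covariance of a circular white sequence and of a doubly white but noncircular one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Circular white: i.i.d. real/imag parts with equal variance 1/2, so R(0) = 1
# and the pseudo covariance vanishes at every lag.
circ = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Doubly white but noncircular: independent real/imag parts with unequal
# variances, so R~(0) = var_r - var_i is real and nonzero.
noncirc = 0.9 * rng.standard_normal(n) + 1j * 0.3 * rng.standard_normal(n)

def cov_and_pseudocov(z, m):
    # Sample estimates of R(m) = E{z(k+m) z*(k)} and R~(m) = E{z(k+m) z(k)}.
    zm, z0 = z[m:], z[: len(z) - m]
    return np.mean(zm * np.conj(z0)), np.mean(zm * z0)

R0, Rt0 = cov_and_pseudocov(circ, 0)       # R0 ~ 1,    Rt0 ~ 0
R0n, Rt0n = cov_and_pseudocov(noncirc, 0)  # R0n ~ 0.90, Rt0n ~ 0.72
R1n, Rt1n = cov_and_pseudocov(noncirc, 1)  # both ~ 0, since doubly white
```

Note that *R̃*(0) comes out real here because the real and imaginary parts are uncorrelated, in agreement with the remark above.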

Entropy rate is a measure of average information in a random sequence, which can be written for a complex random process *Z _{k}* as

$${h}_{c}(Z)=\underset{n\to \infty}{lim}\frac{H({Z}_{1},{Z}_{2},\dots ,{Z}_{n})}{n}$$

(2)

when the limit exists. As in the real case,
$H({Z}_{1},{Z}_{2},\dots ,{Z}_{n})\le {\sum}_{k=1}^{n}H({Z}_{k})$, with equality if and only if the random variables *Z _{k}* are independent. Therefore the entropy rate can be used to measure the sample dependence and it reaches the upper bound when all samples of the process are independent.
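The subadditivity bound can be illustrated with the real-composite representation used later in (13). In this sketch (our illustration; the covariance matrix is randomly generated), the joint entropy of *n* correlated complex Gaussian samples is compared with the sum of the per-sample entropies obtained from the 2 × 2 marginal covariance blocks:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5  # number of complex samples

# Random positive-definite covariance K of the interleaved real vector
# [z_r(1), z_i(1), ..., z_r(n), z_i(n)].
A = rng.standard_normal((2 * n, 2 * n))
K = A @ A.T + 0.1 * np.eye(2 * n)

def gauss_entropy(K):
    # Differential entropy of a real Gaussian vector, as in (13):
    # H = (1/2) log[(2 pi e)^d det(K)].
    d = K.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(K)[1])

H_joint = gauss_entropy(K)
# Per-sample entropies H(Z_k) from the 2x2 marginal covariance blocks.
H_sum = sum(gauss_entropy(K[2 * k:2 * k + 2, 2 * k:2 * k + 2])
            for k in range(n))

# H(Z_1, ..., Z_n) <= sum_k H(Z_k); equality only for independent samples.
assert H_joint <= H_sum
```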

The widely linear filter was introduced in [7]; it improves minimum mean-square estimation performance over the commonly used strictly linear filter when the usual circularity assumptions do not hold. Moreover, any SOS complex signal can be modeled as the output of a widely linear system driven by a circular white noise, which cannot be achieved by a strictly linear system [9]. Given the input and output vectors **x**, **y** ∈ ℂ* ^{N}*, a widely linear system is given by

$$\mathbf{y}=\mathbf{Fx}+\mathbf{G}{\mathbf{x}}^{\ast}$$

(3)

where **F** and **G** are complex-valued impulse responses in matrix form. The system function of a widely linear system is the pair of functions [*F*(*ω*), *G*(*ω*)]. The input-output relationship for the two power spectrum functions of a widely linear system can be written as

$${S}_{y}(\omega )=\mid F(\omega ){\mid}^{2}{S}_{x}(\omega )+\mid G(\omega ){\mid}^{2}{S}_{x}^{\ast}(-\omega )+F(\omega ){G}^{\ast}(\omega ){\stackrel{\sim}{S}}_{x}(\omega )+{F}^{\ast}(\omega )G(\omega ){\stackrel{\sim}{S}}_{x}^{\ast}(-\omega )$$

(4)

$${\stackrel{\sim}{S}}_{y}(\omega )=F(\omega )G(-\omega ){S}_{x}(\omega )+F(-\omega )G(\omega ){S}_{x}^{\ast}(-\omega )+F(\omega )F(-\omega ){\stackrel{\sim}{S}}_{x}(\omega )+G(\omega )G(-\omega ){\stackrel{\sim}{S}}_{x}^{\ast}(-\omega ).$$

(5)

To relate the joint entropies of the input and output of a widely linear system, we first write the input–output relationship in ℝ^{2*N*} using (3) as **ȳ** = **Ax̄**, where **x̄** = [**x*** _{r}^{T}*, **x*** _{i}^{T}*]* ^{T}*, **ȳ** is defined similarly, and

$$\mathbf{A}=\left[\begin{array}{ll}{\mathbf{F}}_{r}+{\mathbf{G}}_{r}\hfill & {\mathbf{G}}_{i}-{\mathbf{F}}_{i}\hfill \\ {\mathbf{F}}_{i}+{\mathbf{G}}_{i}\hfill & {\mathbf{F}}_{r}-{\mathbf{G}}_{r}\hfill \end{array}\right]$$

(6)

where the subscripts *r* and *i* refer to the real and imaginary parts respectively. The output entropy is written as

$$H(\mathbf{y})=H(\mathbf{x})+log\mid det(\mathbf{A})\mid .$$

(7)

since the output pdf can be expressed using the Jacobian of the transformation as *p*(**y**) = *p*(**x**)/|det(**A**)|.
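A quick numerical check of (3), (6), and the Jacobian relation behind (7), with randomly generated **F** and **G** (our sketch, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4

# Random widely linear system y = F x + G x*, as in eq. (3).
F = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
G = 0.3 * (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)
y = F @ x + G @ np.conj(x)

# Equivalent real map in R^{2N}: [y_r; y_i] = A [x_r; x_i], with A from (6).
A = np.block([[F.real + G.real, G.imag - F.imag],
              [F.imag + G.imag, F.real - G.real]])
xbar = np.concatenate([x.real, x.imag])
ybar = A @ xbar

# The real map reproduces the widely linear output exactly; |det(A)| is the
# Jacobian factor in p(y) = p(x) / |det(A)|, which yields eq. (7).
assert np.allclose(ybar, np.concatenate([y.real, y.imag]))
```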

In this section, we study the entropy rate of a complex SOS Gaussian random process through a widely linear model using an approach similar to the one given in [1].

*Proposition 1:* The entropy rate of the output *y*(*n*) of a widely linear system [*F*(*ω*)*, G*(*ω*)], where *F*(*ω*) and *G*(*ω*) are minimum phase, is given by

$${h}_{c}(Y)={h}_{c}(X)+\frac{1}{4\pi}{\int}_{-\pi}^{\pi}log\{[\mid F({e}^{j\omega}){\mid}^{2}-\mid G({e}^{j\omega}){\mid}^{2}][\mid F({e}^{-j\omega}){\mid}^{2}-\mid G({e}^{-j\omega}){\mid}^{2}]\}d\omega $$

where *h _{c}*(*X*) and *h _{c}*(*Y*) denote the entropy rates of the input *x*(*n*) and the output *y*(*n*), respectively.

*Proof:* We assume that the input *x*(*n*) is applied at *n* = 0; the resulting system output is

$$y(n)=\sum _{k=0}^{n}x(n-k)f(k)+\sum _{k=0}^{n}{x}^{\ast}(n-k)g(k).$$

The widely linear transform for the input **x** = [*x*(0), *x*(1), …, *x*(*n*)]* ^{T}* and the output **y** = [*y*(0), *y*(1), …, *y*(*n*)]* ^{T}* is then given by (3), with

$$\mathbf{F}=\left[\begin{array}{cccc}f(0)& 0& \cdots & 0\\ f(1)& f(0)& \cdots & 0\\ \vdots & \vdots & \ddots & \vdots \\ f(n)& f(n-1)& \cdots & f(0)\end{array}\right]$$

and **G** is written similarly.

The entropy relation for input **x** and output **y** is given by (7) and the determinant of **A** given in (6) is

$$det(\mathbf{A})={[\mid f(0){\mid}^{2}-\mid g(0){\mid}^{2}]}^{n+1}.$$

(8)
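The determinant identity can be verified numerically: after interleaving the real and imaginary components sample by sample, **A** is block lower triangular with identical 2 × 2 diagonal blocks, so det(**A**) picks up one factor |*f*(0)|² − |*g*(0)|² per sample. A sketch with hypothetical impulse responses:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 5  # number of samples: x = [x(0), ..., x(n)]^T with n = N - 1

def lt_toeplitz(h):
    # Lower-triangular Toeplitz convolution matrix from impulse response h.
    M = np.zeros((len(h), len(h)), dtype=complex)
    for k, hk in enumerate(h):
        M += np.diag(np.full(len(h) - k, hk), -k)
    return M

f = rng.standard_normal(N) + 1j * rng.standard_normal(N)
g = 0.4 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
F, G = lt_toeplitz(f), lt_toeplitz(g)

# Real-composite matrix A from eq. (6).
A = np.block([[F.real + G.real, G.imag - F.imag],
              [F.imag + G.imag, F.real - G.real]])

# det(A) = [|f(0)|^2 - |g(0)|^2]^N: one factor per sample.
expected = (abs(f[0]) ** 2 - abs(g[0]) ** 2) ** N
assert np.isclose(np.linalg.det(A), expected)
```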

From (2), (7) and (8), the relationship of entropy rate between the input and the output is given by

$${h}_{c}(Y)={h}_{c}(X)+log[\mid f(0){\mid}^{2}-\mid g(0){\mid}^{2}].$$

(9)

Therefore, to complete the proof, we need to express (9) in terms of the integral of the system functions *F*(*ω*) and *G*(*ω*). Using *z* = *e ^{jω}*, we obtain

$$j{\int}_{-\pi}^{\pi}log\{[\mid F({e}^{j\omega}){\mid}^{2}-\mid G({e}^{j\omega}){\mid}^{2}][\mid F({e}^{-j\omega}){\mid}^{2}-\mid G({e}^{-j\omega}){\mid}^{2}]\}d\omega =2\oint \frac{1}{z}log[\mid F(z){\mid}^{2}-\mid G(z){\mid}^{2}]dz$$

(10)

where the path of the contour integral is the unit circle. Since *F*(*z*) and *G*(*z*) are minimum-phase systems, the integrand is analytic outside the unit circle, so the contour of integration can be made arbitrarily large; as *z* → ∞, *F*(*z*) → *f*(0) and *G*(*z*) → *g*(0). It follows that

$$\oint \frac{1}{z}log[\mid F(z){\mid}^{2}-\mid G(z){\mid}^{2}]dz=j2\pi log[\mid f(0){\mid}^{2}-\mid g(0){\mid}^{2}].$$

(11)

Hence, (9), (10) and (11) lead to the conclusion.

The same result can be obtained using the entropy rate formula for a vector of real-valued random processes given in [11], which is subject to the lower semicontinuity condition defined in [11].

If *z*(*n*) is a complex SOS Gaussian random process with power spectrum function *S*(*ω*) and pseudo power spectrum function *S̃*(*ω*), its entropy rate *h _{c}* is given by

$${h}_{c}=log(\pi e)+\frac{1}{4\pi}{\int}_{-\pi}^{\pi}log\left[S(\omega )S(-\omega )-\mid \stackrel{\sim}{S}(\omega ){\mid}^{2}\right]d\omega .$$

(12)

Since the real and imaginary parts of *z*(*n*) are real-valued stationary Gaussian random processes, the joint entropy of *z*(*n*) is given by

$$H(\mathbf{z})=\frac{1}{2}log[{(2\pi e)}^{2n}det(\mathbf{K})]$$

(13)

where **K** is the covariance matrix of the real composite random vector **z̄** = [*z _{r}*(1), …, *z _{r}*(*n*), *z _{i}*(1), …, *z _{i}*(*n*)]* ^{T}*.

We can construct a complex Gaussian random sequence *y*(*n*) as the output of a widely linear system [*F*(*ω*)*, G*(*ω*)]. Let the input *x*(*n*) be a circular white Gaussian random sequence with **K** = (1/2)**I**, where **I** is the 2*n* × 2*n* identity matrix. From (2) and (13), the entropy rate of input *x*(*n*) can be written as

$$h(X)=log(\pi e).$$

(14)

The power spectrum function and pseudo power spectrum function of *x*(*n*) are given as *S _{x}*(*ω*) = 1 and *S̃ _{x}*(*ω*) = 0. If *F*(*ω*) and *G*(*ω*) satisfy

$$\mid F(\omega ){\mid}^{2}+\mid G(\omega ){\mid}^{2}=S(\omega )$$

(15)

$$F(\omega )G(-\omega )=F(-\omega )G(\omega )=\frac{1}{2}\stackrel{\sim}{S}(\omega ),$$

(16)

the output sequence *y*(*n*) has the same power spectrum function and pseudo power spectrum function as *z*(*n*), which implies that the entropy rate of *z*(*n*) is equal to that of *y*(*n*). In [9], it is shown that a widely linear system [*F*(*ω*), *G*(*ω*)] satisfying the equalities (15) and (16) exists whenever *S*(*ω*) and *S̃*(*ω*) fulfill the necessary conditions (1). Furthermore, using (15) and (16), we obtain

$$[\mid F(\omega ){\mid}^{2}-\mid G(\omega ){\mid}^{2}][\mid F(-\omega ){\mid}^{2}-\mid G(-\omega ){\mid}^{2}]=S(\omega )S(-\omega )-\mid \stackrel{\sim}{S}(\omega ){\mid}^{2}.$$

(17)

Therefore, the conclusion follows from (14), (17), and *Proposition 1*.

For a second-order circular process, we have *S̃*(*ω*) = 0, thus yielding the entropy rate of a second-order circular Gaussian random process as

$${h}_{\text{circ}}=log(\pi e)+\frac{1}{4\pi}{\int}_{-\pi}^{\pi}log[S(\omega )S(-\omega )]d\omega .$$

(18)

For the general case given in (12), |*S̃*(*ω*)|^{2} ≥ 0. Hence, for second-order circular and noncircular Gaussian random sequences with the same covariance function *R*(*m*), we have

$${h}_{\text{noncirc}}\le {h}_{\text{circ}},$$

which can be also verified using the result for complex entropy [8], [12] and the definition of entropy rate given in (2).

Next, we show that for a doubly white Gaussian random process with real-valued *R̃*(0), the complex entropy rate given in (12) is equal to the sum of the entropy rates of two real-valued Gaussian random processes. For such a process, the real and imaginary parts are uncorrelated and white. Since the real and imaginary parts are decoupled in this case, we can apply the entropy rate formula for real-valued Gaussian processes to each part, and the entropy rate of the complex Gaussian process is

$${h}_{c}={h}_{r}+{h}_{i}$$

(19)

where the subscripts *r* and *i* refer to the functions of the real and imaginary parts.

Since the real and imaginary parts are white, their power spectrum functions *S _{r}*(*ω*) and *S _{i}*(*ω*) are constant. With *S*(*ω*) = *S _{r}*(*ω*) + *S _{i}*(*ω*) and *S̃*(*ω*) = *S _{r}*(*ω*) − *S _{i}*(*ω*), we obtain

$$S(\omega )S(-\omega )-\mid \stackrel{\sim}{S}(\omega ){\mid}^{2}=4{S}_{r}(\omega ){S}_{i}(\omega ).$$

(20)

By incorporating (20) into (12), we obtain (19), where

$${h}_{r}=\frac{1}{2}log(2\pi e)+\frac{1}{4\pi}{\int}_{-\pi}^{\pi}log{S}_{r}(\omega )d\omega $$

and *h _{i}* is written similarly [2]. Hence, the entropy rates calculated from the formulas in the real and the complex domains are the same for a doubly white Gaussian process with real-valued *R̃*(0).
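This equivalence is easy to check numerically for a normalized doubly white process with *R*(0) = 1 and *R̃*(0) = *c* real (the value of *c* below is arbitrary):

```python
import numpy as np

# Normalized doubly white Gaussian process: R(0) = 1, R~(0) = c real.
c = 0.6
var_r, var_i = (1 + c) / 2, (1 - c) / 2  # uncorrelated real/imag variances

# Complex-domain entropy rate, eq. (12) with constant spectra
# S(w) = R(0) = 1 and S~(w) = R~(0) = c:
h_c = np.log(np.pi * np.e) + 0.5 * np.log(1 - c ** 2)

# Real-domain rates of the white real and imaginary parts (S_r = var_r etc.):
h_r = 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.log(var_r)
h_i = 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.log(var_i)

assert np.isclose(h_c, h_r + h_i)  # eq. (19)
```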

For a circular white Gaussian random process, we have *R̃*(*m*) = 0, and thus the entropy rate can be simply calculated using the entropy rate formula in the real domain, leading to the result given in (18).

Information-theoretic criteria are commonly used for order selection in many signal processing problems. Commonly used examples include Akaike’s information criterion (AIC) [13], the Kullback–Leibler information criterion (KIC) [14], and the minimum description length (MDL) criterion [15], also known as the Bayesian information criterion [16]. The AIC, KIC, and MDL formulas assume similar forms such that the last term in

$$\mathcal{E}(k)=-2\mathcal{L}(\mathbf{x}\mid {\mathbf{\Theta}}_{k})+\mathcal{P}({\mathbf{\Theta}}_{k})$$

(21)

is given by 2*m _{k}* for AIC, 3*m _{k}* for KIC, and *m _{k}* log *N* for MDL, where *m _{k}* denotes the number of free parameters in **Θ*** _{k}* and *N* is the sample size. The selected order is the value of *k* that minimizes (21).

In [17], Wax and Kailath provide a practical form of the maximum log-likelihood for complex-valued data, using multivariate Gaussian model, as

$$\mathcal{L}(\mathbf{x}\mid {\mathbf{\Theta}}_{k})=Nlog{\left(\frac{{\prod}_{i=k+1}^{T}{\lambda}_{i}^{{\scriptstyle \frac{1}{T-k}}}}{{\scriptstyle \frac{1}{T-k}}{\sum}_{i=k+1}^{T}{\lambda}_{i}}\right)}^{T-k}$$

(22)

where *T* is the original dimension of the multivariate data, *k* is the candidate order, *N* is the sample size, and the *λ _{i}*’s are the eigenvalues of the sample covariance matrix of the multivariate observations. The number of free parameters for complex-valued data is given by *k*(2*T* − *k*) + 1.
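As an illustration, the criteria in (21)–(22) can be evaluated directly from the eigenvalue spectrum. This is a minimal sketch under the stated Gaussian model, not the authors' implementation; the function name and the test eigenvalues are ours:

```python
import numpy as np

def select_order(lams, N, criterion="mdl"):
    # Order selection via eqs. (21)-(22): lams are the eigenvalues of the
    # sample covariance matrix (descending order) of T-dimensional
    # complex-valued data with N i.i.d. samples.
    T = len(lams)
    scores = []
    for k in range(T - 1):
        tail = lams[k:]
        # Log-likelihood (22): N(T - k) * log(geometric mean / arithmetic mean)
        L = N * (T - k) * (np.mean(np.log(tail)) - np.log(np.mean(tail)))
        m = k * (2 * T - k) + 1  # free parameters, complex-valued case
        penalty = {"aic": 2 * m, "kic": 3 * m, "mdl": m * np.log(N)}[criterion]
        scores.append(-2 * L + penalty)
    return int(np.argmin(scores))

# Idealized eigenvalue spectrum: three sources above a unit noise floor.
lams = np.array([10.0, 5.0, 3.0] + [1.0] * 7)
k_hat = select_order(lams, N=1000)  # -> 3
```

With exactly flat noise eigenvalues the likelihood term vanishes at the true order, so all three criteria recover *k* = 3 here; with estimated eigenvalues the criteria differ through their penalties.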

Order selection formulations are based on the assumption of i.i.d. samples. When dependent samples are used, the actual number of i.i.d. samples is less than *N*, and the likelihood term given by (22) improperly dominates the information-theoretic criteria, resulting in over-estimation of the order. In the experiments section, we show this over-estimation on simulated fMRI data.

Commonly, there is dependence among the samples, which violates the i.i.d. assumption of information-theoretic criteria. An example is fMRI data, where samples are not i.i.d. due to the point spread function of the scanner as well as the use of spatial smoothing as a preprocessing step. However, the dependence among the data is typically localized, *i.e.*, lies within a few adjacent samples. Hence, an i.i.d. sampling scheme can be used to identify an effectively i.i.d. sample set in the complex domain, as shown in [4] for real-valued data.

We model the data as a complex-valued finite-order moving average (MA) process, *i.e.*, an SOS Gaussian random process that is the output of a linear system driven by an i.i.d. complex Gaussian input. The second-order statistics of the finite-order MA sequence *z*(*n*) have finite support, *i.e.*, *R*(*m*) = *E*{*z*(*n* + *m*)*z*^{*}(*n*)} = 0 and *R̃*(*m*) = *E*{*z*(*n* + *m*)*z*(*n*)} = 0 for |*m*| ≥ *L*, where *L* is a small positive integer. The subsampled sequence *z _{s}*(*n*) = *z*(*nd*), obtained by retaining every *d*-th sample, is therefore effectively i.i.d. once the subsampling depth *d* reaches *L*.

Entropy rate can be used to measure the sample dependence, and it reaches its upper bound when the samples are i.i.d. From (12), the theoretical upper bound of the entropy rate for the normalized complex Gaussian random sequence *z*(*n*), *i.e.*, *R*(0) = 1 and *R̃*(0) = *c*, where *c* ∈ ℂ and |*c*| ≤ 1, is given by

$${h}_{c}(Z)\le log(\pi e)+\frac{1}{2}log(1-\mid c{\mid}^{2}).$$
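This bound can be cross-checked against the real-domain entropy formula (13) applied to a single sample: for *R*(0) = 1 and *R̃*(0) = *c*, the 2 × 2 real covariance has determinant (1 − |*c*|²)/4, so the per-sample entropy matches the bound exactly (the value of *c* below is arbitrary):

```python
import numpy as np

# Normalized noncircular Gaussian samples: R(0) = 1, R~(0) = c (complex).
c = 0.5 + 0.3j

# Second-order statistics of the real composite vector (z_r, z_i):
var_r, var_i = (1 + c.real) / 2, (1 - c.real) / 2
rho = c.imag / 2  # cross-covariance E{z_r z_i}
K = np.array([[var_r, rho], [rho, var_i]])

# Per-sample joint entropy from the real-domain formula (13) ...
H = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(K))
# ... equals the i.i.d. entropy-rate bound log(pi e) + (1/2) log(1 - |c|^2):
bound = np.log(np.pi * np.e) + 0.5 * np.log(1 - abs(c) ** 2)

assert np.isclose(H, bound)
```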

The grid of locations on which the data samples are considered to be effectively independent is determined when the entropy rate reaches its upper bound. Therefore, an effective i.i.d. sample set is obtained on this grid of spatial locations, at which the dependence among the samples is small enough to be ignored. Since the subsampling procedure decreases the number of samples available for estimation, an eigenspectrum adjustment scheme [18] is used to mitigate the finite sample effect.
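The depth selection idea can be sketched for a one-dimensional analogue: a circular Gaussian sequence smoothed by a short MA kernel, whose subsampled covariance is *R*(*md*). The kernel below is hypothetical, and the entropy rate is evaluated via (12) on a frequency grid:

```python
import numpy as np

def entropy_rate_circular(R, n_freq=4096):
    # Eq. (12) for a circular process whose covariance R(m) is real and even
    # (so S~(w) = 0): h = log(pi e) + (1 / 2pi) * integral of log S(w).
    w = np.linspace(-np.pi, np.pi, n_freq, endpoint=False)
    S = R[0] + 2 * sum(R[m] * np.cos(m * w) for m in range(1, len(R)))
    return np.log(np.pi * np.e) + np.mean(np.log(S))

# Hypothetical smoothing kernel producing localized sample dependence.
h = np.array([0.25, 0.5, 1.0, 0.5, 0.25])
R = np.correlate(h, h, "full")[len(h) - 1:]  # MA covariance, zero for |m| >= 5
R = R / R[0]                                  # normalize so that R(0) = 1

# Covariance of the subsampled sequence z_s(n) = z(nd) is R(md), so a
# subsampling depth d >= 5 removes all sample dependence.
rates = [entropy_rate_circular(R[::d]) for d in range(1, 6)]

upper = np.log(np.pi * np.e)
assert all(r <= upper + 1e-9 for r in rates)  # rate never exceeds the bound
assert all(rates[i] < rates[i + 1] for i in range(len(rates) - 1))
assert np.isclose(rates[-1], upper)           # depth 5: bound attained
```

As in the scheme above, the smallest depth at which the entropy rate reaches its upper bound marks the grid of effectively independent samples.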

We generate eight complex-valued spatial maps to simulate the fMRI sources and corresponding time courses, the magnitudes of which are similar to the ones used in [19]. In an fMRI experiment, the phase difference induced by the task activation is typically less than *π*/9 [20], [21]. Therefore, we keep the phase of each pixel uniformly distributed in the range [−*π*/18, *π*/18]. The phase of each complex-valued time point is generated proportional to its magnitude, but is again restricted to a small range [22], which in our case is [−*π*/18, *π*/18]. The spatial sources are rearranged into one-dimensional vectors and mixed by the corresponding time courses as the spatial ICA model for fMRI data [23]:

$$\mathbf{X}=\sum _{k=1}^{M}{\mathbf{a}}_{k}{\mathbf{s}}_{k}^{T}+\mathbf{V}.$$

Here, **s*** _{k}*, *k* = 1, …, *M*, are the spatial sources, **a*** _{k}* are the corresponding time courses, *M* = 8 is the number of sources, and **V** is the additive noise term.

Complex-valued Gaussian noise is added to the data set with a specified contrast to noise ratio (CNR), calculated as the ratio of the standard deviation of the mixed data set without noise to the standard deviation of the Gaussian noise. The mixture data are spatially smoothed, separately for the real and imaginary parts, by a Gaussian kernel with a full-width at half maximum (FWHM) of 2 pixels.

The mixtures of eight sources, with noise levels of CNR = −3, 0, 3 and 6 dB, are created and the complex-valued order selection with i.i.d. sampling scheme is applied to these mixtures. To study the effect of sample dependence on the estimated order, we also apply order selection without subsampling to simulated data with CNR = 3 dB. The criteria used in the experiment are AIC, KIC and MDL. Fig. 1 shows the results of 10 Monte Carlo simulations where a different noise realization is used for each run. The standard deviation is stacked on the mean value in each bar plot.

Fig. 1. Order selection: (a) with and without i.i.d. sampling on simulated data (CNR = 3 dB); (b) on simulated data with different CNR values (−3, 0, 3, and 6 dB), using effectively i.i.d. samples.

Without subsampling, the order is significantly over-estimated, as observed in Fig. 1(a), since the samples become correlated after spatial smoothing. As shown in Fig. 1(b), the criteria based on the effectively i.i.d. samples yield accurate estimates (8 sources) when the CNR is higher than 0 dB. The CNR of actual fMRI data is typically in the range [0, 3] dB, and the complex-valued order selection scheme is thus effective in this CNR range.

FMRI data from ten subjects performing a finger-tapping task are used in this study. Details of the data are given in [24]. The real and imaginary images are spatially smoothed using a 10 × 10 × 10 mm full-width half-maximum Gaussian kernel. To study the effect of sample dependence on order selection, the data without spatial smoothing is used in the experiments as the “unsmoothed” fMRI data set in contrast to the “smoothed” fMRI data which are preprocessed using the Gaussian kernel.

To show the effect of i.i.d. sampling, we plot the entropy rate of the subsampled data at different subsampling depths for the 10 unsmoothed and smoothed fMRI data sets in Fig. 2. It can be observed that as the subsampling depth increases, the entropy rate increases and converges to its upper bound. The subsampling depth is thus chosen as the smallest depth at which the samples are effectively independent. Since the smoothed data have greater sample dependence than the unsmoothed data, a greater subsampling depth is required to achieve convergence for the smoothed data.

Fig. 2. The entropy rate of the subsampled data at different subsampling depths: (a) unsmoothed fMRI data sets; (b) smoothed fMRI data sets.

The results of order selection using i.i.d. sampling on actual fMRI data are shown in [5]. It is shown that the subsampling scheme effectively addresses the over-estimation problem, and that the range of values indicated by KIC and MDL provides a good regime for the order numbers to be used in complex ICA of fMRI data.

In this paper, we derive the entropy rate for complex SOS Gaussian random processes using a widely linear model. The resulting expression is general, in that it is applicable to both circular and noncircular Gaussian processes, and can be employed in various applications. For example, the traditional spectral estimation problem in the real domain focuses on estimating the spectral density function, and one approach is through maximization of the entropy rate [1]. When extending this approach to the complex domain, as we have demonstrated, both the spectral density function and the *pseudo* spectral density function need to be estimated when the process is noncircular. In other application domains that generate complex-valued data, such as biomedical image analysis [23] and communications, the given entropy rate can be employed as an effective measure.

In summary, we introduce an application of complex entropy rate in biomedical signal processing. By modeling the complex data as a finite-order MA process, we develop an i.i.d. sampling scheme in the complex domain to improve the performance of order selection with information-theoretic criteria when the samples are correlated. Entropy rate is used to measure the sample dependence in this order selection scheme. The order selection scheme with i.i.d. sampling is implemented for analysis of complex fMRI data, which is useful for data-driven analysis approaches such as PCA and ICA. The method can be further utilized in other signal processing fields, where a lower dimensional informative subspace needs to be identified.

This work is supported by the NSF grants NSF-IIS 0612076 and NSF-CCF 0635129.

1. Papoulis A. Maximum entropy and spectral estimation: a review. IEEE Trans Acoust, Speech, Signal Process. 1982;29:1176–1186.

2. Cover TM, Thomas JA. Elements of Information Theory. Wiley; 1991.

3. Porta A, Guzzetti S, Montano N, Furlan R, Pagani M, Malliani A, Cerutti S. Entropy, entropy rate and pattern classification as tools to typify complexity in short heart period variability series. IEEE Trans Biomed Eng. 2001;48:1282–1291. [PubMed]

4. Li YO, Adalı T, Calhoun VD. Estimating the number of independent components for fMRI data. Hum Brain Mapp. 2007;28:1251–1266. [PubMed]

5. Xiong W, Li Y-O, Li H, Adalı T, Calhoun VD. On ICA of complex-valued fMRI: advantages and order selection. Proc. ICASSP; Las Vegas, NV. 2008. pp. 529–532.

6. Adalı T, Li H, Novey M, Cardoso JF. Complex ICA using nonlinear functions. IEEE Trans Signal Process. 2008;56:4536–4544.

7. Picinbono B, Chevalier P. Widely linear systems for estimation. IEEE Trans Signal Process. 1995;43:2030–2033.

8. Neeser FD, Massey JL. Proper complex random processes with applications to information theory. IEEE Trans Inf Theory. 1993;39:1293–1302.

9. Picinbono B, Bondon P. Second-order statistics of complex signals. IEEE Trans Signal Process. 1997;45:411–420.

10. Schreier PJ, Scharf LL. Second-order analysis of improper complex random vectors and processes. IEEE Trans Signal Process. 2003;51:714–725.

11. Pham DT. Mutual information approach to blind separation of stationary sources. IEEE Trans Inf Theory. 2002;48:1935–1946.

12. Schreier PJ. Bounds on the degree of impropriety of complex random vectors. IEEE Signal Process Lett. 2008;15:190–193.

13. Akaike H. A new look at statistical model identification. IEEE Trans Autom Control. 1974;19:716–723.

14. Cavanaugh JE. A large-sample model selection criterion based on Kullback’s symmetric divergence. Stat Probab Lett. 1999;44:333–344.

15. Rissanen J. Modeling by the shortest data description. Automatica. 1978;14:465–471.

16. Schwarz G. Estimating the dimension of a model. Ann Stat. 1978;6:461–464.

17. Wax M, Kailath T. Detection of signals by information theoretic criteria. IEEE Trans Acoust, Speech, Signal Process. 1985;33:387–392.

18. Beckmann CF, Smith SM. Probabilistic independent component analysis for functional magnetic resonance imaging. IEEE Trans Med Imag. 2004;23:137–152. [PubMed]

19. Correa N, Li Y-O, Adalı T, Calhoun VD. Comparison of blind source separation algorithms for fMRI using a new Matlab toolbox: GIFT. Proc. ICASSP; Philadelphia, PA. 2005. pp. 401–404.

20. Hoogenrad FG, Reichenbach JR, Haacke EM, Lai S, Kuppusamy K, Sprenger M. In vivo measurement of changes in venous blood-oxygenation with high resolution functional MRI at 0.95 Tesla by measuring changes in susceptibility and velocity. Magn Reson Med. 1998;39:97–107. [PubMed]

21. Calhoun VD, Adalı T, van Zijl PCM, Pekar JJ. Independent component analysis of fMRI data in the complex domain. Magn Reson Med. 2002;48:180–192. [PubMed]

22. Rowe DB. Modeling both the magnitude and phase of complex-valued fMRI data. Neuroimage. 2005;25:1310–1324. [PubMed]

23. Adalı T, Calhoun VD. Complex ICA of brain imaging data. IEEE Signal Processing Mag. 2007;24:136–139.

24. Feng Z, Caprihan A, Blagoev K, Zhao F, Calhoun VD. Modeling of phase changes in BOLD fMRI. Proc. ISMRM; Toronto, Canada. 2008.
