


Sensors (Basel). 2012; 12(5): 6712–6726.

Published online 2012 May 22. doi: 10.3390/s120506712

PMCID: PMC3386765

Science and Technology on Inertial Laboratory, Key Laboratory of Fundamental Science for National Defense-Novel Inertial Instrument & Navigation System Technology, Beijing 100191, China; E-Mails: yuzhanglina@yahoo.com.cn (W.Z.); lguo@buaa.edu.cn (L.G.)

Received 2012 April 2; Revised 2012 April 23; Accepted 2012 April 27.

Copyright © 2012 by the authors; licensee MDPI, Basel, Switzerland.

This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).


The precision of star point location is significant for star map identification and aircraft attitude acquisition with star sensors. Under dynamic conditions, star images are not only corrupted by various noises, but also blurred by the angular motion of the star sensor. Accordingly, a novel method is proposed in this article that combines a denoising method based on an adaptive wavelet threshold with a restoration method for large angular rates. The adaptive threshold is adopted to denoise the star image when the angular rate is within the dynamic range. The mathematical model of motion blur is then deduced to restore star maps blurred by large angular rates. Simulation results validate the effectiveness of the proposed method, which is suitable for blurred star image processing and practical for attitude determination of satellites under dynamic conditions.

As high accuracy and high reliability devices, star sensors play an important role in attitude determination for satellites in the celestial navigation system (CNS). The main steps of attitude determination are star point location, star identification and attitude tracking [1]. Based on the captured star images, stars can be located and identified. Pattern recognition of the star images in the field of view (FOV) is decisive for successful attitude determination [2]: only available and recognizable star images ensure that the star sensor outputs an accurate satellite attitude [3], so it is critical to improve the accuracy of star point location.

Over the past years, several algorithms have been developed to extract star centroids from initial star images. Reference [4] presents a new sub-pixel interpolation technique to process image centroids. Reference [5] gives a method for enhancement of the centroiding algorithm for star tracker measure refinement. An analytical and experimental study of autonomous star sensing, including the star centroiding process, is presented in [6]. However, these studies generally apply under static or balanced conditions. For many agile maneuvering satellites, star sensors work under dynamic conditions because the star sensor rotates along with the satellite. Various noises in the star field caused by dynamic factors may therefore degrade imaging quality. Moreover, owing to the large angular rate, the star point moves on the focal plane during the exposure time, which causes two changes to the star point: its position shifts on the focal plane, and the limited starlight energy disperses into more pixels. As a result, the SNR (Signal-to-Noise Ratio) of blurred star images decreases and the measured star centroid becomes inaccurate.

Dynamic conditions stress the need for a very accurate and robust processing method for blurred star images. At present, many investigations concentrate on the exploration and analysis of star location under dynamic conditions [3]. For example, reference [7] presents a simulation analysis of the dynamic working performance of star trackers. Reference [8] analyzes the star image centroid accuracy of an APS star sensor under rotation. Blind iterative restoration of images with spatially-varying blur is the research topic of reference [9]. However, most of these methods lose their locating capability when the angular rate exceeds 2°/s. The method in [3] estimates attitude effectively at high angular rates, but it has the contrary effect when the angular rate is low.

The main theme of this paper is to overcome the difficulties arising from the dynamic imaging blur of star sensors, including denoising and restoration of blurred star images guided by an estimate of the angular rate. On the one hand, if the angular rate is within the dynamic range of the star sensor, a proposed adaptive wavelet threshold is used for denoising according to the characteristics of the blurred star image, which can give an accurate centroid within sub-pixel precision. On the other hand, if the angular rate is larger than the dynamic range, a restoration algorithm based on the angular rate is used to process the "tailed" star image. As will be seen later in this paper, the adaptive method outperforms other denoising methods in terms of Peak Signal-to-Noise Ratio (PSNR) and visual quality, and large variations of the angular rate have little effect on star centroid determination based on the restoration method.

This paper is divided into six sections. Following this Introduction, the imaging theory of star sensors is outlined in Section 2, as well as the characteristic of blurred star images under dynamic conditions. Then the method of denoising based on adaptive wavelet threshold is described in detail in Section 3. The restoration method is developed in Section 4 by analyzing the Point Spread Function (PSF) of motion blurred star images, where a Wiener filter with an optimal window is used to overcome the edge error. In Section 5, simulation results are shown to demonstrate the proposed methods. At the end, conclusions are drawn in Section 6.

In developing a set of equations to be mechanized by a celestial navigation system and star sensor or in studying the behavior of a given system, it is necessary to introduce several sets of orthogonal coordinates:

- **Inertial coordinate system** (*X _{L}-Y_{L}-Z_{L}*) has its origin at the center of the Earth and is non-rotating with respect to the fixed stars. Its *x*-axis lies in the equatorial plane, the *z*-axis is normal to that plane, and the *y*-axis completes the right-handed system.
- **Star sensor coordinate system** (*X _{s}-Y_{s}-Z_{s}*) has its origin at the center of mass of the star sensor. Its *x*-axis points along the longitudinal axis of the star sensor, the *z*-axis is perpendicular to the longitudinal plane of symmetry and lies along the boresight of the star sensor, and the *y*-axis completes the right-handed system.
- **Focal plane coordinate system** (*X _{p}-Y_{p}*) has its origin at the center of the focal plane. Its *x*-axis points along the longitudinal axis of the focal plane, and the *y*-axis is perpendicular to it.

Figure 1 illustrates the general large FOV star sensor for attitude determination. After capturing stars in the real sky and imaging by the star sensor, the attitude information is completed by an autonomous procedure (including image pretreatment, star centroiding, star map matching, attitude determination, *etc.*). The schematic of imaging is also shown in Figure 1, where *f* is the focal length of lens and *α* is the angle of FOV.

According to the coordinates of star points in the focal plane coordinate system *X _{p-}Y_{p}*, it is easy to obtain the coordinates matrix

$$A={G}^{-1}S$$

(1)

Under static conditions, the distribution of a star point is generally represented as a two-dimensional Gaussian over a 3 × 3 or 5 × 5 dispersion circle obtained by defocusing technology [10], so that the accuracy of the star centroid can be kept within a sub-pixel level. However, under dynamic conditions, the original star image is perturbed and blurred by various additive noises, which mainly include photo response non-uniformity noise, photon shot noise, dark current noise, readout noise, *etc.* [11].
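As a concrete illustration of the static star-point model above, the following sketch renders a star point as a two-dimensional Gaussian over a 5 × 5 dispersion circle (the grid size, width *σ*, and peak amplitude here are illustrative assumptions, not parameters from the paper):

```python
import numpy as np

def gaussian_star_spot(size=5, sigma=1.0, amplitude=255.0):
    """Defocused star point modeled as a 2-D Gaussian over a
    size x size dispersion circle (3 x 3 or 5 x 5 in the text)."""
    c = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size]
    return amplitude * np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))

spot = gaussian_star_spot(5, 1.0)
# The energy is concentrated at the center pixel and falls off symmetrically.
```

Because the spot is symmetric about its center, an intensity-weighted centroid of this model recovers the true star position to sub-pixel accuracy, which is the property the defocusing technology exploits.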

At present, the dynamic range of a large FOV star sensor is about 3–5°/s [8]. Suppose the angular rate is *w*. If *w* ranges by 1–1.5°/s under dynamic condition, the rotation of the star sensor has little effect on the star images. However, a star sensor may lose tracking using the technique under static conditions due to the various noises caused by the dynamic conditions. On the other hand, if *w* is larger than the dynamic range, the star point constantly shifts in the focal plane and appears to trail badly during exposure time, which may affect the star centroid accuracy and even result in the failure of attitude determination.

Based on the foregoing discussions, denoising and deblurring are two crucial parts for the pretreatment of blurred star images. Performance parameters of the star sensor used in this paper are shown in Table 1. The CMOS image sensor chip is the STAR 1000 from the Cypress Semiconductor Co. [12].

Supposing the size of the clear image *f(i, j)* is *N* × *N*, a common model of the corrupted image *g(i, j)* is mathematically defined as:

$$g(i,j)=f(i,j)+n(i,j)$$

(2)

where 0 ≤ *i, j* ≤ *N*-1, and *n(i, j)* is additive random noise and independent of *f(i, j)*. The goal is to remove *n(i, j)* and estimate *f(i, j)* which minimizes the Mean Squared Error (MSE) [13].
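The degradation model of Equation (2) and the MSE criterion can be illustrated directly; the image size, star patch, and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64
f = np.zeros((N, N))        # clear image f(i, j)
f[30:35, 30:35] = 200.0     # a bright star patch (illustrative values)

sigma = 10.0                # assumed noise standard deviation
n = rng.normal(0.0, sigma, (N, N))
g = f + n                   # corrupted image, Equation (2)

def mse(a, b):
    """Mean Squared Error, the quantity the estimate should minimize."""
    return np.mean((a - b) ** 2)

# For zero-mean Gaussian noise, mse(g, f) is close to sigma**2.
```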

In general, the important information of *f(i, j)* is mostly distributed as a smooth signal at low frequency, while *n(i, j)* is distributed at high frequency. Based on this, a two-dimensional (2-D) discrete wavelet transform (DWT) can be applied to transform *g(i, j)* into the wavelet domain. Wavelet coefficients denoting different scales and orientations can then be obtained with the Mallat algorithm [14].

Figure 2 shows the subbands of the orthogonal DWT at three levels. *LL _{3}* is the approximation subband (or resolution residual), which contains the low-frequency portion of the image; the remaining subbands contain the detail coefficients at different scales and orientations.

Thresholding is simple because it operates on one wavelet coefficient at a time. The method of using an adaptive threshold to implement denoising described by Lakhwinder Kaur *et al.* [16] appears more suitable, in which threshold choice is:

$${T}_{N}=\frac{\beta {\sigma}^{2}}{{\sigma}_{y}}$$

(3)

where *σ _{y}* is the standard deviation of each subband, and

$$\beta =\sqrt{log\frac{{L}_{k}}{J}}$$

(4)

where *J* is the largest scale, and *L _{k}* is the length of the subband at scale *k*. The noise variance *σ*^{2} is estimated from the diagonal detail subband *HH _{1}* of the first level:

$${\sigma}^{2}=\mathit{\text{Median}}(|H{H}_{1}|)/0.6745$$

(5)
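Equation (5) can be sketched with a one-level Haar transform standing in for the first decomposition level (the paper uses sym8 wavelets, so the Haar filters here are a simplifying assumption; note that the median ratio directly estimates the standard deviation):

```python
import numpy as np

def haar_hh1(img):
    """Diagonal detail subband HH1 of a one-level 2-D Haar transform:
    each 2 x 2 block (a b / c d) maps to (a - b - c + d) / 2, which keeps
    i.i.d. Gaussian noise at its original standard deviation."""
    a = img[0::2, 0::2]
    b = img[0::2, 1::2]
    c = img[1::2, 0::2]
    d = img[1::2, 1::2]
    return (a - b - c + d) / 2.0

def estimate_noise_sigma(img):
    """Robust median estimator from HH1; this returns the standard
    deviation (Equation (5) writes the same ratio as the variance)."""
    return np.median(np.abs(haar_hh1(img))) / 0.6745

rng = np.random.default_rng(1)
noisy = rng.normal(0.0, 10.0, (256, 256))  # pure noise with known sigma = 10
```

The median makes the estimate robust: large star-signal coefficients in *HH _{1}* are outliers and barely affect it.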

Studies in [17] indicate that the estimate based on *HH _{1}* in Equation (5) has a small square error and is therefore reliable. The adaptive threshold is then computed window by window:

- Apply an *M* × *M* local window to compute *σ*^{2}_{lJ}, the variance of the coefficients in window *l* at scale *J*. *M* is determined by the square root of the number of pixels occupied by the star point, and is generally not more than seven.
- Compute the noise variance *σ*^{2} according to Equation (5).
- Obtain the threshold by:

$$T{h}_{l}=\beta \frac{{\sigma}^{2}}{{\sigma}_{lJ}^{2}}$$

(6)

where *Th _{l}* is the threshold in window *l* of scale *J*.
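The windowed threshold of Equation (6) might be computed as follows; the subband data, *β*, and window size are illustrative, and a small constant (not specified in the paper) guards against flat windows:

```python
import numpy as np

def window_thresholds(subband, sigma2, beta, M=4):
    """Adaptive threshold Th_l = beta * sigma^2 / sigma_lJ^2 (Equation (6))
    over non-overlapping M x M windows of one wavelet subband; beta and
    sigma^2 come from Equations (4) and (5)."""
    H, W = subband.shape
    th = np.zeros((H // M, W // M))
    eps = 1e-12  # guard against flat (zero-variance) windows
    for i in range(H // M):
        for j in range(W // M):
            win = subband[i * M:(i + 1) * M, j * M:(j + 1) * M]
            th[i, j] = beta * sigma2 / (np.var(win) + eps)
    return th

rng = np.random.default_rng(2)
sub = rng.normal(0.0, 5.0, (16, 16))          # stand-in subband
th = window_thresholds(sub, sigma2=25.0, beta=1.0, M=4)
# One threshold per 4 x 4 window -> a 4 x 4 grid of thresholds.
```

Windows with high local variance (likely star signal) receive a low threshold and are preserved, while flat background windows are thresholded aggressively.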

Based on the foregoing analysis, the proposed method for star image denoising is summarized as follows:

- Execute decomposition of the initial blurred star image using a wavelet transform at level *K*.
- Compute the noise variance *σ*^{2} according to Equation (5).
- Compute the scale parameter *β* of level *K* using Equation (4).
- Use a 4 × 4 square window in *LL _{K}* to obtain *Th _{l}* by Equation (6).
- Process the coefficients at scale *K* using the following threshold function:

$$\widehat{x}=\{\begin{array}{cc}sgn(x)(|x|-T{h}_{l}),& \left|x\right|\ge T{h}_{l}\\ 0,& \left|x\right|<T{h}_{l}\end{array}$$

(7)

which keeps (after shrinkage) a coefficient whose magnitude reaches the threshold and sets it to zero otherwise.
- Invert the multiscale decomposition to reconstruct the denoised star image.
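The whole procedure can be sketched at a single decomposition level. This is a simplified stand-in for the paper's *K*-level sym8 decomposition with per-window thresholds: Haar filters and a global 3*σ* threshold are assumptions made for brevity:

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar decomposition into LL, LH, HL, HH subbands."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a + b - c - d) / 2.0
    hl = (a - b + c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    H, W = ll.shape
    img = np.zeros((2 * H, 2 * W))
    img[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    img[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    img[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    img[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return img

def soft_threshold(x, th):
    """Threshold function of Equation (7): shrink toward zero by th."""
    return np.sign(x) * np.maximum(np.abs(x) - th, 0.0)

def denoise(img):
    """Estimate sigma from HH (Equation (5)), soft-threshold the detail
    subbands, and reconstruct. A global 3-sigma threshold is an assumed
    stand-in for the per-window thresholds of Equation (6)."""
    ll, lh, hl, hh = haar2(img)
    sigma = np.median(np.abs(hh)) / 0.6745
    th = 3.0 * sigma
    return ihaar2(ll, soft_threshold(lh, th),
                  soft_threshold(hl, th), soft_threshold(hh, th))

rng = np.random.default_rng(3)
clean = np.zeros((32, 32))
clean[14:19, 14:19] = 200.0                  # synthetic star patch
noisy = clean + rng.normal(0.0, 10.0, (32, 32))
denoised = denoise(noisy)
```

Leaving the approximation subband untouched preserves the smooth star energy, while thresholding the three detail subbands removes most of the high-frequency noise.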

The current angular rate *w* of the satellite can be obtained from the attitude update. As mentioned in Section 2, if *w* is larger than the dynamic range, the attitude cannot be correctly computed because of the "trailed" image. This section focuses on the restoration of star images motion-blurred by a large *w*.

Due to the motion of the star sensor during the exposure time, what a star sensor captures is a motion-blurred image *g(x, y)*. Suppose a clear image *f(x, y)* moves on the focal plane; its displacement components along directions *x* and *y* can be termed *x(t)* and *y(t)* respectively, where *t* is the movement time during the exposure time *T*. Then the expression for *g(x, y)* is given by Equation (8), and the expression for the PSF in the frequency domain by Equation (9), similar to [18]:

$$g(x,y)={\int}_{0}^{T}f(x-x(t),y-y(t))dt$$

(8)

$$H(u,v)={\int}_{0}^{T}{e}^{-j2\pi [ux(t)+vy(t)]}dt$$

(9)

If the satellite rotates clockwise about the boresight *Z _{p}* with an angular rate *w _{z}*, a star point *P* moves along a circular track on the focal plane during the exposure time.

In the focal plane coordinate system *X _{p}-Y_{p}*, the length *l* of the motion track at time *t* is:

$$l=r{w}_{z}t$$

(10)

where
$r=\sqrt{{{x}_{p}}^{2}+{{y}_{p}}^{2}}$, and (*x _{p}*, *y _{p}*) are the coordinates of the star point *P* in the focal plane coordinate system.

Suppose *P* moves along track *l* at angle *γ* to the horizontal axis *X _{p}* with velocity *v = l/T*. Substituting into Equation (9), the PSF in the frequency domain becomes:

$$H(u,v)=\frac{Tsin\left(\pi l(ucos\gamma +vsin\gamma )\right)}{\pi l(ucos\gamma +vsin\gamma )}{e}^{-j\pi l(ucos\gamma +vsin\gamma )}$$

(11)

Then, the corresponding expression of the PSF in the spatial domain is:

$$h(x,y)=\{\begin{array}{ll}1/l\hfill & 0\le x\le lcos\gamma ,y=xtan\gamma \hfill \\ 0\hfill & \mathit{\text{else}}\hfill \end{array}$$

(12)
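A discrete version of the PSF in Equation (12), with the track length taken from Equation (10), might look like the sketch below; the pixel radius, angular rate, and exposure time are illustrative assumptions:

```python
import numpy as np

def motion_psf(l, gamma, size):
    """Discrete PSF of Equation (12): uniform weight along a straight
    track of length l pixels at angle gamma to the X_p axis."""
    h = np.zeros((size, size))
    n = max(int(round(l)), 1)
    x0 = y0 = size // 2
    for k in range(n):
        x = x0 + int(round(k * np.cos(gamma)))
        y = y0 + int(round(k * np.sin(gamma)))
        if 0 <= x < size and 0 <= y < size:
            h[y, x] += 1.0
    return h / h.sum()  # normalize so the kernel conserves energy

# Track length from Equation (10), l = r * w_z * t, for a star point at
# r = 300 pixels from the boresight, w_z = 10 deg/s, exposure t = 0.1 s:
l = 300 * np.deg2rad(10.0) * 0.1   # about 5.2 pixels
h = motion_psf(l, gamma=0.0, size=15)
```

Because *l* grows with the radius *r*, star points far from the boresight smear over more pixels than those near it, which is why their gray values drop the most.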

To accomplish restoration of the original image, the traditional method is to employ Wiener Filtering in the frequency domain [15]. The Wiener filter is intended to be an optimal filter in the sense that it delivers the best estimate of the object in a least-squares sense for additive Gaussian noise. However, the noise *n(x, y)* is typically unknown in practice and the classical Wiener filter is problematic [19]. Therefore, we use the modified Wiener filter which is given by:

$$\widehat{F}(u,v)=\frac{{H}^{*}(u,v)}{{\left|H(u,v)\right|}^{2}+a}G(u,v)$$

(13)

where *H**(*u, v*) denotes the complex conjugate of *H*(*u*, *v*) and *a* can be considered as an adjustable empirical parameter chosen to balance sharpness against noise.
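A minimal sketch of the modified Wiener filter of Equation (13), assuming circular convolution and an illustrative 4-pixel horizontal blur (the kernel, image, and value of *a* are assumptions, not values from the paper):

```python
import numpy as np

def wiener_deconvolve(g, h, a=0.01):
    """Modified Wiener filter of Equation (13):
    F_hat = conj(H) * G / (|H|^2 + a), with the empirical constant a
    standing in for the unknown noise-to-signal power ratio."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(h, s=g.shape)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + a)
    return np.real(np.fft.ifft2(F_hat))

# Blur an impulse "star" with a 4-pixel horizontal kernel, then restore.
img = np.zeros((32, 32))
img[16, 16] = 100.0
h = np.zeros((32, 32))
h[0, 0:4] = 0.25                              # normalized blur kernel
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(h)))
restored = wiener_deconvolve(blurred, h, a=1e-4)
# Most of the impulse energy is concentrated back at (16, 16).
```

The constant *a* prevents division by near-zero values of |*H*| at the spectral zeros of the motion kernel; a larger *a* suppresses noise amplification at the cost of sharpness.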

In order to overcome the edge error, a major factor affecting the quality of Wiener filter restoration, the optimal window method is used for the star image [20]. The steps of restoration based on Wiener filtering are then detailed as follows:

- Introduce *h(x, y)* according to the analysis in Section 4.1.
- Apply the optimal window *w*(*x, y*) as a weight factor to *g*(*x, y*), then execute the Discrete Fourier Transform (DFT) of *g*(*x, y*) and *h*(*x, y*).
- Use the Wiener filter for deconvolution in the frequency domain, and obtain the estimate *F̂*(*u, v*) by Equation (13).
- Compute the Inverse DFT (IDFT) of *F̂*(*u, v*) to generate the restored image *f̂*(*x, y*):

$$f(x,y)={\int}_{-\infty}^{+\infty}{\int}_{-\infty}^{+\infty}F(u,v){e}^{j2\pi (ux+vy)}dudv$$

(14)
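The four steps above can be sketched as follows, with a separable Hann window standing in for the optimal window of [20] (the window choice, blur kernel, and value of *a* are assumptions):

```python
import numpy as np

def restore_with_window(g, h, a=0.01):
    """Steps 1-4: window the blurred image, take DFTs, apply the modified
    Wiener filter of Equation (13), and return the IDFT."""
    wy = np.hanning(g.shape[0])         # stand-in for the optimal window
    wx = np.hanning(g.shape[1])
    w = np.outer(wy, wx)
    G = np.fft.fft2(g * w)              # DFT of the windowed blurred image
    H = np.fft.fft2(h, s=g.shape)       # DFT of the PSF h(x, y)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + a)   # Equation (13)
    return np.real(np.fft.ifft2(F_hat)) # IDFT gives the restored estimate

img = np.zeros((32, 32))
img[16, 16] = 100.0                     # impulse "star" near the center
h = np.zeros((32, 32))
h[0, 0:4] = 0.25                        # 4-pixel horizontal blur kernel
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(h)))
restored = restore_with_window(blurred, h, a=1e-4)
```

Tapering the image toward zero at its borders reduces the wrap-around discontinuity that the DFT's implicit periodicity would otherwise turn into ringing, at the cost of slightly attenuating stars near the edges.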

In order to verify the proposed method when a star sensor works under dynamic conditions, simulations and experiments are implemented to accomplish denoising and restoration of star images blurred at different *w*. Comparisons of PSNR and star centroids are also analyzed in this section to evaluate the effectiveness of the algorithms.

Based on the performance of the star sensor shown in Table 1, the SkyMap star map simulation software is used to generate the original star image, as shown in Figure 4. The boresight direction is set to (150°, 15°), and the 14,581 stars brighter than magnitude 6.95 are selected from the Tycho-2 star catalog.

The experiments are conducted on several blurred star images at different noise levels *σ* = 70, 80, 90 and different angular rates *w* within the dynamic range. For the wavelet transform, four levels of decomposition are used, and the wavelet employed is sym8 (from the MATLAB wavelet toolbox).

To assess the performance of the denoising method proposed in this paper, it is compared with several common denoising techniques: BayesShrink [13], SureShrink [21] and lowpass filtering. The fixed threshold *Th* is first used to segment the background from the star objects. Based on *Th*, the different denoising methods are employed to estimate the original clear star image. Figure 5 shows the noisy image and the resulting images at *σ* = 90 and *w* = 0.6°/s; the image processed by the proposed method outperforms the others in terms of visual quality. The PSNRs of the various methods are then compared in Table 2, where the data are averaged over four runs and *AdaptThr* denotes the proposed adaptive thresholding method.

Comparison of noisy image and resulting images. (**a**) Noisy image at *σ* = 90, *w* = 0.6°/s, (**b**) Resulting image with fixed threshold, (**c**) Resulting image with BayesShrink, (**d**) Resulting image with SureShrink, (**e**) Resulting image with Lowpass **...**

Results in Table 2 show that the lower *w* is, the more *AdaptThr* outperforms the other methods, especially when *σ* is large. As *w* increases, however, *AdaptThr* degrades to roughly the same denoising performance as the other methods. In fact, under dynamic conditions with high *w*, the star image is not only perturbed by various noises but also blurred by the motion of the star sensor.

This also means that under dynamic conditions with high *w*, the proposed denoising method alone cannot obtain the star centroid accurately; the motion-blurred image also needs to be restored. To further verify the proposed denoising algorithm, a real star image is adopted in this section. Figure 6 shows the original star image obtained by a star sensor and its gray distribution, from which we can see that the background level in the star image is high. Moreover, there is a large 'singularity spot' which is bigger and brighter than the other star points. After discarding the singularity spot, a clean star image is obtained, as shown in Figure 7. The dim star object is extracted well from the background noise, which confirms that the proposed method can adapt to complex dynamic conditions.

Table 2 shows that if *w* is larger than the dynamic range, the star image needs to be restored by deblurring rather than denoised directly. Figure 8(a) is a real star image selected from the original images obtained by the CMOS star sensor in this paper. Assuming *w* = 10°*/s*, the blurred star image is generated according to the degradation model, as shown in Figure 8(b).

Gray distributions of the same star point in the two images of Figure 8 are shown in Figure 9, which shows that due to the motion blur the star point smears out severely and its gray value decreases.

Gray distribution of the same star point. (**a**) Gray distribution of original star point; (**b**) Gray distribution of blurred star point.

In this section, we implement star centroiding [2] to assess the performance of the proposed deblurring method. The comparison results are shown in Table 3, where the angular rate is 10°/s.
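The centroiding step can be sketched as an intensity-weighted center of mass, a common sub-pixel scheme in the spirit of [2]; the threshold and the synthetic Gaussian spot below are illustrative assumptions:

```python
import numpy as np

def star_centroid(img, thresh=0.0):
    """Intensity-weighted centroid (center of mass) of a star spot;
    pixels at or below `thresh` are excluded from the sums."""
    w = np.where(img > thresh, img, 0.0)
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = w.sum()
    return (x * w).sum() / total, (y * w).sum() / total

# A symmetric 5 x 5 Gaussian spot centered on pixel (2, 2).
yy, xx = np.mgrid[0:5, 0:5]
spot = 255.0 * np.exp(-((xx - 2.0) ** 2 + (yy - 2.0) ** 2) / 2.0)
cx, cy = star_centroid(spot)
```

For a symmetric spot the weighted mean lands exactly on the true center; motion blur breaks this symmetry, which is why the centroid drifts unless the image is restored first.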

Table 3 shows that without deblurring, the extraction errors *δx* and *δy* are mostly larger than one pixel for each star centroid. This is because the SNR of the star image decreases as the star points smear significantly. Moreover, six star points fail to be extracted because of the low gray values of dim blurred star points (as shown in Figure 9(b)), which may affect star recognition and attitude determination. After restoration, however, the star centroid can be obtained accurately: the extraction errors *δx* and *δy* are within the sub-pixel range, and the dim star points with low gray values are extracted. As can be seen in Figure 10, without restoration the extraction error *δx* + *δy* is larger than three pixels. With the restoration method, all lost star points can be extracted and the extraction error *δx* + *δy* is restricted to within one pixel. This is because the proposed deblurring method keeps the accuracy of *δx* and *δy* within sub-pixel levels even when *w* is larger than the dynamic range, and the variation of the angular rate *w* has little effect on the star centroid.

This article investigates how to process blurred star images according to the different angular rates of star sensors under dynamic conditions. A new denoising method based on an adaptive wavelet threshold is proposed, together with a restoration method for large angular rates outside the dynamic range. Experiments on different types of star images have been conducted with the proposed algorithm. When the angular rate is within the dynamic range, the PSNRs of the denoised images show that the proposed denoising method performs better than the conventional methods under the same conditions, in terms of both PSNR and visual quality. Star centroiding of blurred star images has been analyzed to assess the effectiveness of the restoration: it keeps the extraction error within sub-pixel levels, and the variation of the angular rate has little effect on the accuracy of the star centroid, which shows that the proposed method is both effective and feasible. The experiments also compare star extraction before and after restoration at different angular rates: without restoration, the larger the angular rate, the more star points fail to be extracted, while the processing method reported in this paper keeps star sensors stable within a certain range and meets the requirements of attitude determination, which demands uninterrupted output data and attitude accuracy at the arcsecond level.

The authors thank Jiancheng Fang for his helpful technical support. Support for this research was provided by the National Natural Science Foundation of China (Grants No. 61004140, 60736025, 60825305, 61104198 and 60904093) and the 973 Project (2009CB724000).

1. Zhang S.D., Zhang Z.J., Sun H.H., Wang Y.J. High accuracy star image locating and imaging calibration for star sensor technology. Proc. SPIE. 2010;7544 doi: http://dx.doi.org/10.1117/12.885374.

2. Liebe C.C. Accuracy performance of star trackers—A tutorial. IEEE Trans. Aero. Elec. Sys. 2002;38:587–599.

3. Wu X.J., Wang X.L. Multiple blur of star image and the restoration under dynamic conditions. Acta Astronaut. 2011;68:1903–1913.

4. Brendan M.Q., Valery T., Henok M., Richard H. Determining star-image location: A new sub-pixel interpolation technique to process image centroids. Comput. Phys. Commun. 2007;177:700–706.

5. Rufino G., Accardo D. Enhancement of the centroiding algorithm for star tracker measure refinement. Acta Astronaut. 2003;53:135–147.

6. Gwanghyeok J. Autonomous Star Sensing, Pattern Identification and Attitude Determination for Spacecraft: An Analytical and Experimental Study. Texas A & M University; Corpus Christi, TX, USA: 2001.

7. Shen J., Zhang G.J., Wei X.G. Simulation analysis of dynamic working performance for star trackers. J. Opt. Soc. Am. 2010;27:2638–2647. [PubMed]

8. Li X., Zhao H. Analysis of star image centroid accuracy of an APS star sensor in rotation. Aerospace Control Appl. 2009;35:11–16.

9. Bardsley J., Jefferies S., Nagy J., Plemmons R. Blind iterative restoration of images with spatially-varying blur. Optics Express. 2006;14:1767–1782. [PubMed]

10. Hancock B.R., Stirbl R.C., Cunningham T.J., Pain B., Wrigley C.J., Ringold P.G. CMOS active pixel sensor specific performance effects on star tracker/image position accuracy. Proc. SPIE. 2001;4284:43–53.

11. Pasetti A., Habine S., Creasey R. Dynamical Binning for High Angular Rate Star Tracking. Proceedings of the 4th International Conference on Spacecraft Guidance, Navigation and Control Systems; Noordwijk, The Netherlands. 18–21 October 1999; pp. 255–266.

12. Star1000 1M Pixel Radiation Hard CMOS Image Sensor. Cypress Semiconductor Corporation; San Jose, CA, USA: 2007.

13. Chang S.G., Yu B., Vetterli M. Adaptive wavelet thresholding for image denoising and compression. IEEE Trans. Image Process. 2000;9:1532–1546. [PubMed]

14. Mallat S. A theory for multiresolution signal decomposition: The wavelet representation. IEEE Trans. Patt. Anal. Machine Intell. 1989;11:674–693.

15. Gonzalez R.C., Woods R.E. Digital Image Processing. 2nd ed. Prentice Hall; Englewood Cliffs, NJ, USA: 2002.

16. Kaur L., Gupta S., Chauhan R.C. Image Denoising Using Wavelet Thresholding. Proceeding of the Third Indian Conference on Computer Vision, Graphics & Image Processing; Ahmadabad, India. 16–18 December 2002; pp. 1522–1531.

17. Mihcak K.M., Kozintsev L., Ramchandran K. Low-complexity image denoising based on statistical modeling of wavelet coefficients. IEEE Sign. Process. Lett. 1999;6:300–303.

18. Quan W., Zhang W. Restoration of Motion-Blurred Star Image Based on Wiener Filter. Proceeding of The Fourth IEEE International Conference on Intelligent Computation Technology and Automation; Shenzhen, China. 28–29 March 2011; pp. 691–694.

19. Costello T.P., Mikhael W.B. Efficient restoration of space-variant blurs from physical optics by sectioning with modified Wiener filtering. Digital Signal Process. 2003;13:1–22.

20. Tan K.C., Lim H., Tan B.T.G. Windowing techniques for image restoration. Graph. Mod. Image Process. 1991;53:491–500.

21. Donoho D.L., Johnstone I.M. Adapting to unknown smoothness via wavelet shrinkage. J. Am. Stat. Assoc. 1995;90:1200–1224.
