

Tsinghua Sci Technol. Author manuscript; available in PMC 2010 July 7.

Published in final edited form as:

Tsinghua Sci Technol. 2010 February; 15(1): 68–73.

doi: 10.1016/S1007-0214(10)70011-0

PMCID: PMC2898485

NIHMSID: NIHMS181660

Department of Radiology, University of Chicago, 5841 S. Maryland Avenue, Chicago, IL 60637, USA


The back-projection-filtration (BPF) algorithm has been applied to image reconstruction for cone-beam configurations with general source trajectories. The BPF algorithm can reconstruct 3-D region-of-interest (ROI) images from data containing truncations. However, like many other existing algorithms for cone-beam configurations, the BPF algorithm involves a back-projection with a spatially varying weighting factor, which can result in non-uniform noise levels in reconstructed images and increased computation time. In this work, we propose a BPF algorithm that eliminates the spatially varying weighting factor by using a rebinned geometry for a general scanning trajectory. The proposed BPF algorithm has improved noise properties, while retaining the advantages of the original BPF algorithm such as the minimum data requirement.

Significant advances have been made in the development of theoretically exact algorithms for image reconstruction from cone-beam projections. Theoretically exact algorithms that were developed initially for a helical trajectory[1–3] have been extended to reconstruct images from cone-beam projections acquired with general source trajectories[4–8]. The back-projection-filtration (BPF) algorithm is one of these recently developed algorithms[9–11]. It can be applied to a wide class of general scanning trajectories and is capable of reconstructing an image within a region-of-interest (ROI) from projection data containing truncations. This property of the BPF algorithm enables imaging applications of practical significance. For example, in non-conventional computed tomography (CT) applications, it is not uncommon that the available detector covers only a portion of the field of view (FOV) that would otherwise be needed to cover the entire support of the imaged object. Moreover, data acquisition in many practical applications is achieved through rotation of the object around the physical center of rotation, which is often chosen as the center of mass of the object and may be at a distance from the FOV center, as shown in Fig. 1. Although this imaging approach leads to a relatively complex source trajectory and to data truncation, one can apply the BPF algorithm to reconstructing an ROI image from truncated data collected with this kind of imaging configuration with general trajectories[12].

Illustration of the 3-D imaging configuration. The object rotates around the rotation axis *O*. The source can move freely vertically and horizontally while the detector can only move horizontally. The region enclosed by the dashed curve indicates the FOV …

Like many existing algorithms for divergent-beam configurations, the original BPF algorithm also involves the computation of a spatially varying weighting factor in its back-projection step, which can result in non-uniform noise levels in reconstructed images and increased computation time[13–16]. Therefore, it is desirable to eliminate the spatially varying weighting factor in the BPF algorithm for improving its noise properties and computational efficiency. In this work, based upon the original BPF algorithm, we propose a BPF algorithm in terms of a rebinned geometry for a general scanning trajectory designed for ROI imaging, in which no spatially varying weighting factor is involved in the back-projection step. The use of the rebinned geometry not only eliminates the spatially varying weighting factor, thus improving the noise properties of BPF reconstructions, but also retains the properties of the original BPF algorithm such as the minimum data requirement and ROI-image reconstruction from truncated data.

We display in Fig. 1 a 3-dimensional (3-D) scanning configuration under consideration, in which the object rotates around the rotation axis, the source can move freely vertically and horizontally, and the detector can only move horizontally. The line connecting the source and the detector center is always perpendicular to the detector plane. A sketch of this scanning geometry within the 2-dimensional (2-D) slice at *z* = 0 is shown in Fig. 2. An ellipsoid support enclosed by the thin curve represents the imaged object, in which an ROI is indicated as the shaded region, and the FOV is enclosed by a thick curve. The geometric center of the ellipsoid is placed at the physical center of rotation of the imaging system, which is also chosen as the origin of the fixed coordinate system. We assume that the distance of the FOV center to the physical center of rotation is *R*_{0}, that the width of the detector is 2*D*, and that the distance *H* between the physical center of rotation and the detector remains constant. Let *R* denote the radius of the FOV. Clearly, *R* < *D* for the cone-beam projection under consideration. Projections at different views are collected through the rotation of the object about the physical rotation center and the motion of the source and detector in space, which is constrained by two conditions: (1) the size of the FOV (enclosed by the thick curve) formed by the source and detector remains unchanged and always covers the ROI, and (2) the source point, the FOV center, and the detector mid-point always remain on the same line, which, without loss of generality, is assumed to be perpendicular to the detector plane. An additional constraint on the source is that it must remain outside the rotating object.

When the imaged object rotates, the source and the detector are moved accordingly so that they form an FOV that always covers the ROI. Consequently, the location of the FOV center also changes. Considering the motion constraints on the source and detector described above, one can determine the trajectories of the source, the FOV center, and the mid-point of the detector in the *x*–*y* plane. Let λ denote the rotation angle of the object, which is defined as the angle between the long axis of the ellipsoid object and the *x* axis. Therefore, the trajectories of the FOV center (*x*_{F}, *y*_{F}, *z*_{F}) and detector center (*x*_{D}, *y*_{D}, *z*_{D}) in the fixed coordinate system can be expressed as

$$({x}_{\mathrm{F}},{y}_{\mathrm{F}},{z}_{\mathrm{F}})=({R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda ,0)$$

(1)

$$({x}_{\mathrm{D}},{y}_{\mathrm{D}},{z}_{\mathrm{D}})=({R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,-H,0)$$

(2)

Also, one can express the source trajectory **r**_{s}(λ) in the fixed coordinate system as

$${\mathit{r}}_{\mathrm{s}}(\lambda )=({x}_{\mathrm{s}},{y}_{\mathrm{s}},{z}_{\mathrm{s}})=({R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,{y}_{\mathrm{s}0},0)$$

(3)

where *y*_{s0} = *D*tan(ϕ_{1} + ϕ_{2}) − *H*, and

$${\varphi}_{1}=\text{arctan}\frac{H+{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda}{D}$$

(4)

$${\varphi}_{2}=\text{arctan}\frac{R}{\sqrt{{D}^{2}+{(H+{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda )}^{2}-{R}^{2}}}$$

(5)

As shown in Fig. 2, ϕ_{1} is the angle between the *x* axis and the line connecting the FOV center and the detector edge, and ϕ_{2} indicates the angle between the line connecting the FOV center and the detector edge and the line connecting the source and the detector edge.
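As a concrete illustration, Eqs. (1)-(5) can be evaluated numerically. The following is a minimal NumPy sketch; the function names are ours, not from the paper. A useful consistency check is that, for *R* = 0, the angle ϕ_{2} vanishes and the source collapses onto the FOV center.

```python
import numpy as np

def fov_center(lam, R0):
    """FOV-center trajectory, Eq. (1)."""
    return np.array([R0 * np.cos(lam), R0 * np.sin(lam), 0.0])

def detector_center(lam, R0, H):
    """Detector mid-point trajectory, Eq. (2)."""
    return np.array([R0 * np.cos(lam), -H, 0.0])

def source_position(lam, R0, H, D, R):
    """Source trajectory without the object-avoidance constraint, Eqs. (3)-(5)."""
    phi1 = np.arctan((H + R0 * np.sin(lam)) / D)                      # Eq. (4)
    phi2 = np.arctan(R / np.sqrt(D**2 + (H + R0 * np.sin(lam))**2 - R**2))  # Eq. (5)
    ys0 = D * np.tan(phi1 + phi2) - H
    return np.array([R0 * np.cos(lam), ys0, 0.0])                     # Eq. (3)
```

For *R* > 0 the source sits farther from the detector than the FOV center (*y*_{s0} > *R*_{0} sin λ), so that the fan subtended by the detector covers the whole FOV.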

Considering the physical constraint that the source cannot be within the object, we assume that the object support has a cylindrical shape, which has the same middle transverse slice as the ellipsoid object. Assuming that the long axis of the ellipsoid object is aligned with the *x* axis when λ = 0, we can rewrite the source trajectory in Eq. (3) as

$${\mathit{r}}_{\mathrm{s}}(\lambda )=({x}_{\mathrm{s}},{y}_{\mathrm{s}},{z}_{\mathrm{s}})=({R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda +\text{max}({y}_{\mathrm{s}0}-{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda ,t),0)$$

(6)

where

$$t=\frac{-{b}^{2}{R}_{0}\phantom{\rule{thinmathspace}{0ex}}\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda +T(a,b,{R}_{0};\lambda )}{{a}^{2}\phantom{\rule{thinmathspace}{0ex}}{\text{cos}}^{2}\phantom{\rule{thinmathspace}{0ex}}\lambda +{b}^{2}\phantom{\rule{thinmathspace}{0ex}}{\text{sin}}^{2}\phantom{\rule{thinmathspace}{0ex}}\lambda}$$

(7)

and

$$T(a,b,{R}_{0};\lambda )=\sqrt{{a}^{4}{b}^{2}\phantom{\rule{thinmathspace}{0ex}}{\text{cos}}^{2}\phantom{\rule{thinmathspace}{0ex}}\lambda -{a}^{2}{b}^{2}{R}_{0}^{2}\phantom{\rule{thinmathspace}{0ex}}{\text{cos}}^{2}\phantom{\rule{thinmathspace}{0ex}}\lambda +{a}^{2}{b}^{4}\phantom{\rule{thinmathspace}{0ex}}{\text{sin}}^{2}\phantom{\rule{thinmathspace}{0ex}}\lambda}$$

(8)
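Equations (7) and (8) give the minimum offset *t* that keeps the source outside the elliptical cross-section of the object support. A small sketch, with helper names of our choosing, is given below; for *R*_{0} = 0, *t* reduces to the *y*-extent of the rotated ellipse (semi-axes *a*, *b*), which is *b* at λ = 0 and *a* at λ = π/2.

```python
import numpy as np

def boundary_offset(lam, a, b, R0):
    """Offset t of Eqs. (7)-(8): y-distance from R0*sin(lam) to the object boundary."""
    c, s = np.cos(lam), np.sin(lam)
    T = np.sqrt(a**4 * b**2 * c**2 - a**2 * b**2 * R0**2 * c**2 + a**2 * b**4 * s**2)  # Eq. (8)
    return (-b**2 * R0 * s + T) / (a**2 * c**2 + b**2 * s**2)                          # Eq. (7)

def constrained_source_y(lam, ys0, R0, a, b):
    """y-coordinate of the constrained source trajectory, Eq. (6)."""
    return R0 * np.sin(lam) + max(ys0 - R0 * np.sin(lam),
                                  boundary_offset(lam, a, b, R0))
```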

In general, reconstruction algorithms are developed for imaging configurations in which the object is fixed. Therefore, we need to determine the source trajectory in a coordinate system in which the object is fixed so that these algorithms can be applied directly. In the case under study, we refer to the coordinate system fixed on the object as an object-fixed coordinate system (*x*_{0}, *y*_{0}, *z*_{0}), which can be related to the original coordinate system (*x*, *y*, *z*) described above as

$$({x}_{0},{y}_{0},{z}_{0})=(x,y,z)\phantom{\rule{thinmathspace}{0ex}}\left[\begin{array}{ccc}\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda & -\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda & 0\\ \text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda & \text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda & 0\\ 0& 0& 1\end{array}\right]$$

(9)

Using *x*_{s}, *y*_{s}, and *z*_{s} to replace *x*, *y*, and *z* in the right-hand-side of Eq. (9), one can obtain the expression of the source trajectory **r**_{0}(λ) = (*x*_{0},*y*_{0},*z*_{0}) in the object-fixed coordinate system, which is displayed in Fig. 3.
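The change of frame in Eq. (9) is a row-vector-times-matrix product, i.e., a rotation by −λ that undoes the object rotation. A minimal sketch (function name is ours):

```python
import numpy as np

def to_object_frame(p, lam):
    """Map a lab-frame point (x, y, z) into the object-fixed frame via Eq. (9)."""
    M = np.array([[np.cos(lam), -np.sin(lam), 0.0],
                  [np.sin(lam),  np.cos(lam), 0.0],
                  [0.0,          0.0,         1.0]])
    return np.asarray(p, dtype=float) @ M  # row vector times matrix, as written in Eq. (9)
```

As a sanity check, a point that started at (1, 0, 0) on the object and has been carried by an object rotation of λ = π/2 to the lab position (0, 1, 0) maps back to (1, 0, 0) in the object frame.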

In order to describe the original BPF algorithm, we introduce a rotation-coordinate system {*u*, *v*, *w*}, whose origin is fixed on the source point for the convenience of the reconstruction. Its three unit vectors in the fixed-coordinate system are

$$\begin{array}{c}\hfill {\widehat{\mathit{e}}}_{u}={(\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,-\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda ,0)}^{\mathrm{T}},\hfill \\ \hfill {\widehat{\mathit{e}}}_{v}={(0,0,1)}^{\mathrm{T}},\hfill \\ \hfill {\widehat{\mathit{e}}}_{w}={(\text{sin}\phantom{\rule{thinmathspace}{0ex}}\lambda ,\text{cos}\phantom{\rule{thinmathspace}{0ex}}\lambda ,0)}^{\mathrm{T}}\hfill \end{array}$$

(10)

Consider a flat-panel detector with its normal direction along **ê**_{w} and at a distance *S* from the source point. A 2-D coordinate system {*u*, *v*} is assumed to be fixed on the detector plane, with the *u* and *v* axes along the unit vectors **ê**_{u} and **ê**_{v}. Any point on the detector can thus be specified by the two parameters *u* and *v*. At source position λ, the cone-beam projection of the object function *f*(**r**) at a point (*u*_{d}, *v*_{d}) on the detector can be written as

$$P({u}_{\mathrm{d}},{v}_{\mathrm{d}},\lambda )={\displaystyle {\int}_{0}^{\infty}f({\mathit{r}}_{0}(\lambda )+t\widehat{\beta})\mathrm{d}t}$$

(11)

where

$$\widehat{\beta}=\frac{1}{\sqrt{{u}_{\mathrm{d}}^{2}+{v}_{\mathrm{d}}^{2}+{S}^{2}}}[{u}_{\mathrm{d}}{\widehat{\mathit{e}}}_{u}(\lambda )+{v}_{\mathrm{d}}{\widehat{\mathit{e}}}_{v}(\lambda )-S{\widehat{\mathit{e}}}_{w}(\lambda )]$$

(12)

is a unit vector indicating the direction of the ray that starts from the source point **r**_{0}(λ) and passes through the point (*u*_{d}, *v*_{d}) on the detector.
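Equations (10) and (12) are straightforward to implement. The sketch below (our function names) builds the rotation-frame unit vectors and the ray direction; by construction, β̂ has unit length, and for *u*_{d} = *v*_{d} = 0 it points along −**ê**_{w}, i.e., from the source toward the detector center.

```python
import numpy as np

def rotation_frame(lam):
    """Unit vectors of the rotation-coordinate system, Eq. (10)."""
    e_u = np.array([np.cos(lam), -np.sin(lam), 0.0])
    e_v = np.array([0.0, 0.0, 1.0])
    e_w = np.array([np.sin(lam),  np.cos(lam), 0.0])
    return e_u, e_v, e_w

def ray_direction(ud, vd, S, lam):
    """Unit vector beta-hat of Eq. (12), from the source through (ud, vd)."""
    e_u, e_v, e_w = rotation_frame(lam)
    beta = ud * e_u + vd * e_v - S * e_w
    return beta / np.sqrt(ud**2 + vd**2 + S**2)
```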

The original BPF algorithm can reconstruct ROI images from truncated data acquired with a general trajectory by reconstructing images on chords. For a given continuous trajectory, a chord is a line segment connecting two points **r**_{0}(λ_{a}) and **r**_{0}(λ_{b}) on the trajectory. The form of the original BPF algorithm used here is given in Ref. [8], and the image on the chord specified by λ_{a} and λ_{b} is given by

$${f}_{\mathrm{c}}({x}_{\mathrm{c}},{\lambda}_{\mathrm{a}},{\lambda}_{\mathrm{b}})=\frac{1}{2{\mathrm{\pi}}^{2}}\frac{1}{b({x}_{\mathrm{c}})}\left[{\displaystyle {\int}_{{x}_{\mathrm{c}1}}^{{x}_{\mathrm{c}2}}\frac{b({x}_{\mathrm{c}}^{\prime})\phantom{\rule{thinmathspace}{0ex}}\mathrm{d}{x}_{\mathrm{c}}^{\prime}}{{x}_{\mathrm{c}}-{x}_{\mathrm{c}}^{\prime}}}\phantom{\rule{thinmathspace}{0ex}}{g}_{\mathrm{c}}({x}_{\mathrm{c}}^{\prime},{\lambda}_{\mathrm{a}},{\lambda}_{\mathrm{b}})+2\mathrm{\pi}{P}_{0}\right],$$

where
$b({x}_{\mathrm{c}})=\sqrt{({x}_{\mathrm{c}2}-{x}_{\mathrm{c}})({x}_{\mathrm{c}}-{x}_{\mathrm{c}1})}$ and *P*_{0} denotes the projection data along the chord. *g*_{c} (*x*_{c}, λ_{a}, λ_{b}) is defined by the following equation,

$${g}_{\mathrm{c}}({x}_{\mathrm{c}},{\lambda}_{\mathrm{a},}{\lambda}_{\mathrm{b}})={\displaystyle {\int}_{{\lambda}_{\mathrm{a}}}^{{\lambda}_{\mathrm{b}}}\frac{\mathrm{d}\lambda}{|\mathit{r}({x}_{\mathrm{c}})-{\mathit{r}}_{0}(\lambda )|}\frac{\mathrm{d}}{\mathrm{d}\lambda}P({u}_{\mathrm{d}},{v}_{\mathrm{d}},\lambda ){|}_{\widehat{\beta}}}$$

(13)

By use of the original BPF algorithm, one can reconstruct ROI images from the truncated data acquired with the source trajectory described in Section 1. It can be observed that the weighting factor $\frac{1}{|\mathit{r}({x}_{\mathrm{c}})-{\mathit{r}}_{0}(\lambda )|}$ in the back-projection step (i.e., Eq. (13)) is spatially varying, which can result in increased noise levels and computational load. We seek to eliminate this factor by using the rebinned geometry, as discussed below.
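The filtration step of the chord reconstruction (the bracketed finite-Hilbert integral above) can be prototyped numerically. The sketch below is only illustrative: it uses a crude principal-value sum that simply skips the singular sample, and the function name and sampling scheme are our assumptions, not the paper's implementation. A deterministic check: for *g*_{c} ≡ 0 the formula reduces to *f*_{c} = *P*_{0}/(π *b*(*x*_{c})).

```python
import numpy as np

def bpf_chord_filtration(g, x, x1, x2, P0):
    """Evaluate the chord image f_c from backprojection samples g on chord
    points x strictly inside (x1, x2), via a crude principal-value sum."""
    b = np.sqrt((x2 - x) * (x - x1))      # weight b(x_c)
    dx = x[1] - x[0]                      # uniform sample spacing assumed
    f = np.empty_like(x)
    for i, xi in enumerate(x):
        d = xi - x
        d[i] = np.inf                     # skip the singular sample (crude PV handling)
        f[i] = np.sum(b * g / d) * dx     # finite Hilbert integral
    return (f + 2.0 * np.pi * P0) / (2.0 * np.pi**2 * b)
```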

In the rebinned BPF algorithm, the acquired projection data are first rebinned into the fan-parallel-beam geometry[17–20]. The rebinned data can be expressed as *Q*′(*u*_{d}, *v*_{d}, ϕ) satisfying

$$Q\prime ({u}_{\mathrm{d}},{v}_{\mathrm{d}},\phi )=P({u}_{\mathrm{d}},{v}_{\mathrm{d}},\lambda )$$

(14)

given that

$$\phi =\lambda -\text{arctan}\frac{{u}_{\mathrm{d}}}{S}$$

(15)

By using this relationship, the weighting factor $\frac{1}{|\mathit{r}({x}_{\mathrm{c}})-{\mathit{r}}_{0}(\lambda )|}$ in Eq. (13) can be eliminated.
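In practice the rebinning of Eqs. (14)-(15) amounts, per detector column, to an interpolation in λ at λ = ϕ + arctan(*u*_{d}/*S*). A minimal sketch under our own assumptions (uniform λ grid, linear interpolation, data indexed as `P[lam_index, u_index]`):

```python
import numpy as np

def rebin_to_fan_parallel(P, lams, uds, S):
    """Rebin cone-beam data into the fan-parallel geometry, Eqs. (14)-(15):
    Q'(ud, phi) = P(ud, lam) with phi = lam - arctan(ud / S)."""
    phis = lams                                 # reuse the lambda grid as the phi grid
    Q = np.empty((len(phis), len(uds)))
    for j, ud in enumerate(uds):
        lam_needed = phis + np.arctan(ud / S)   # invert Eq. (15) for each phi
        Q[:, j] = np.interp(lam_needed, lams, P[:, j])  # linear interpolation in lam
    return Q
```

A quick consistency check: data that are constant in λ are unchanged by the rebinning.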

We have performed computer-simulation studies to investigate and evaluate the rebinned BPF algorithm for achieving an ROI reconstruction with the trajectory described in Section 2. The numerical phantom is modified from the standard Shepp-Logan phantom: the center of the phantom is first shifted to the center of the FOV, and its size is then scaled to fit the FOV. A long, narrow ellipsoid object is placed at the rotation center, connecting the Shepp-Logan phantom to the rotation axis. To cover the ROI, the FOV has a radius *R* = 30 cm and is at a distance *R*_{0} = 50 cm from the physical center of rotation, and the detector, with half-width *D* = 64 cm, is placed at a distance *H* = 100 cm from the *x* axis. The detector has 512 × 512 elements. Using these parameters in the imaging configuration, we computed cone-beam projection data from the rotating object at 720 views from $-\frac{\mathrm{\pi}}{5}$ to $\frac{6\mathrm{\pi}}{5}$; noisy data were then generated by adding Gaussian noise with a standard deviation of about 1.5% of the maximum projection value. From these noiseless and noisy data sets, we reconstructed the ROI images. In Fig. 4, we display the images reconstructed from the noisy data by use of the rebinned and original BPF algorithms.
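The noise model used above is easily reproduced; the sketch below (our function name, fixed seed for reproducibility) adds Gaussian noise whose standard deviation is a fixed fraction of the maximum projection value.

```python
import numpy as np

def add_projection_noise(P, rel_sigma=0.015, seed=0):
    """Add Gaussian noise with standard deviation rel_sigma * max(P),
    matching the 1.5%-of-maximum noise level used in the simulation study."""
    rng = np.random.default_rng(seed)
    sigma = rel_sigma * P.max()
    return P + rng.normal(0.0, sigma, size=P.shape)
```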

The noisy images obtained with the rebinned BPF algorithm (top row) and the original BPF algorithm (bottom row) within the 2-D slices at *x*=0 mm, *y*=0 mm, and *z*=0 mm. The display window is [0.98, 1.2].

We have also performed a study to investigate the noise properties of the rebinned BPF algorithm. In this study, a uniform cylinder phantom was used. From the uniform phantom, we generated 1000 sets of data containing stationary Gaussian noise. Without loss of generality, we focus on reconstructing images on a set of chords specified by ${\lambda}_{1}=\frac{\mathrm{\pi}}{4}$ and ${\lambda}_{2}=[\mathrm{\pi},\frac{5\mathrm{\pi}}{4}]$. Using the 1000 images on these chords reconstructed from the noisy data sets, we calculated the empirical image variances on the chords. For comparison, the empirical image variances were also obtained from images reconstructed by use of the original BPF algorithm. The image variances along the chord specified by ${\lambda}_{1}=\frac{\mathrm{\pi}}{4}$ and ${\lambda}_{2}=\frac{6\mathrm{\pi}}{5}$ are displayed in Fig. 5. It can be observed that the rebinned BPF algorithm generally yields images with lower and more spatially uniform variances than does the original BPF algorithm. This uniform noise property in images obtained by use of the rebinned BPF algorithm is a direct result of the elimination of the spatially varying weighting factor from the back-projection in the rebinned BPF algorithm.
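The empirical variance estimate used in this study is the standard per-point sample variance over the noisy realizations; a one-line sketch (our function name):

```python
import numpy as np

def empirical_chord_variance(recons):
    """Per-point sample variance over noisy reconstructions.
    recons: array of shape (n_realizations, n_chord_points)."""
    return np.var(np.asarray(recons, dtype=float), axis=0, ddof=1)
```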

In this work, we have described a rebinned BPF algorithm, which involves no spatially varying weighting factor in its back-projection step, for ROI-image reconstruction from truncated data acquired with a general trajectory. We have performed computer-simulation studies to validate and evaluate the rebinned BPF algorithm. The quantitative results demonstrate that exact ROI-image reconstruction can be obtained with the rebinned BPF algorithm. Most importantly, the rebinned BPF algorithm can improve the noise properties in terms of image variances. It may find practical applications in numerous non-conventional CT scans involving general trajectories.

^{*}Supported in part by the National Institutes of Health (Nos. EB000225 and CA120540). J. Bian was supported by the DoD Predoctoral Training Grant (No. BC083239), and E. Y. Sidky was supported in part by the Career Development Award from NIH SPORE (No. CA125183-03).

1. Katsevich A. Analysis of an exact inversion algorithm for spiral cone-beam CT. Phys. Med. Biol. 2002;47:2583–2597. [PubMed]

2. Zou Y, Pan X. Exact image reconstruction on PI-line from minimum data in helical cone-beam CT. Phys. Med. Biol. 2004;49:941–959. [PubMed]

3. Zou Y, Pan X. Image reconstruction on PI-lines by use of filtered backprojection in helical cone-beam CT. Phys. Med. Biol. 2004;49:2717–2731. [PubMed]

4. Zhuang T, Leng S, Nett BE, et al. Fan-beam and cone-beam image reconstruction via filtering the backprojection image of differentiated projection data. Phys. Med. Biol. 2004;49:5489–5503. [PubMed]

5. Pack JD, Noo F, Clackdoyle R. Cone-beam reconstruction using the backprojection of locally filtered projections. IEEE Trans. Med. Imag. 2005;24:2317–2336. [PubMed]

6. Pack JD, Noo F. Cone-beam reconstruction using 1D filtering along the projection of m-lines. Inv. Prob. 2005;21:1105–1120.

7. Yu H, Ye Y, Zhao S, et al. A backprojection-filtration algorithm for nonstandard spiral cone-beam CT with an n-PI-window. Phys. Med. Biol. 2005;50:2099–2111. [PubMed]

8. Zou Y, Pan X, Sidky EY. Theory and algorithms for image reconstruction on chords and within region of interests. J. Opt. Soc. Am. A: Opt. Image Sci. Vis. 2005;22:2372–2384. [PubMed]

9. Noo F, Clackdoyle R, Pack J. A two-step Hilbert transform method for 2D image reconstruction. Phys. Med. Biol. 2004;49:3903–3923. [PubMed]

10. Pan X, Zou Y, Xia D. Peripheral and central ROI-image reconstruction from and data-redundancy exploitation in truncated fan-beam data. Med. Phys. 2005;32:673–684. [PubMed]

11. Defrise M, Noo F, Clackdoyle R, et al. Truncated Hilbert transform and image reconstruction from limited tomographic data. Inv. Prob. 2006;22:1037–1053.

12. Bian J, Zhang H, Zhang P, et al. A cone-beam approach of ROI imaging with a detector smaller than the imaged object; Proceedings of the 2007 International Meeting on Fully 3D Image Reconstruction in Radiology and Nuclear Medicine; Lindau, Germany: 2007. pp. 386–389.

13. Besson G. CT image reconstruction from fan-parallel data. Med. Phys. 1999;26:415–426.

14. Pan X. Optimal noise control in and fast reconstruction of fan-beam computed tomography image. Med. Phys. 1999;26:689–697. [PubMed]

15. Xia D, Yu L, Sidky EY, et al. Noise properties of chord-image reconstruction. IEEE Trans. Med. Imaging. 2007;26:1328–1344. [PubMed]

16. Dennerlein F, Noo F, Hornegger J, et al. Fan-beam filtered backprojection reconstruction without backprojection weight. Phys. Med. Biol. 2007;52:3227–3240. [PubMed]

17. Grass M, Köhler T, Proksa R. 3D cone-beam CT reconstruction for circular trajectories. Phys. Med. Biol. 2000;45:329–347. [PubMed]

18. Turbell H. Cone-beam reconstruction using filtered backprojection [Dissertation] Linköping, Sweden: Linköping University; 2001.

19. Heuscher D, Brown K, Noo F. Redundant data and exact helical cone-beam reconstruction. Phys. Med. Biol. 2004;49:2219–2238. [PubMed]

20. Yu L, Xia D, Zou Y, et al. A rebinned backprojection-filtration algorithm for image reconstruction in helical conebeam CT. Phys. Med. Biol. 2007;52:5497–5508. [PubMed]
