Gastroenterology. Author manuscript; available in PMC Sep 26, 2006.
Published in final edited form as:
PMCID: PMC1576342
Computed Tomographic Virtual Colonoscopy Computer-Aided Polyp Detection in a Screening Population
Ronald M. Summers, MD, PhD,1 Jianhua Yao, PhD,1 Perry J. Pickhardt, MD,2,3,5 Marek Franaszek, PhD,1 Ingmar Bitter, PhD,1 Daniel Brickman, BS,1 Vamsi Krishna, BA,1 and J. Richard Choi, ScD, MD2,4
1Diagnostic Radiology Department, Warren Grant Magnuson Clinical Center, National Institutes of Health, Bethesda, MD 20892;
2Uniformed Services University of the Health Sciences, Bethesda, MD 20814;
3National Naval Medical Center, Bethesda, MD 20892;
4Walter Reed Army Medical Center, Washington, DC 20307;
Corresponding Author and Reprint Requests: Ronald M. Summers, M.D., Ph.D., Diagnostic Radiology Department, National Institutes of Health, Bldg. 10, Room 1C660, 10 Center Dr MSC 1182, Bethesda, MD 20892-1182, Phone: (301) 402-5486, FAX: (301) 451-5721
5Present address Department of Radiology, University of Wisconsin Medical School, Madison, WI 53792
Background & Aims
The sensitivity of CT virtual colonoscopy (CT colonography) for detecting polyps varies widely in recently reported large clinical trials. Our objective was to determine whether a computer program is as sensitive as optical colonoscopy for the detection of adenomatous colonic polyps on CT virtual colonoscopy.
Methods
The data set was a cohort of 1,186 screening patients at three medical centers. All patients underwent same day virtual and optical colonoscopy. Our enhanced gold standard combined segmental unblinded optical colonoscopy and retrospective identification of precise polyp locations. The data were randomized into separate training (n=394) and test (n=792) sets for analysis by a computer-aided polyp detection (CAD) program.
Results
For the test set, CAD’s per polyp and per patient sensitivities were both 89.3% (25/28, 95% CI [71.8%, 97.7%]) for detecting retrospectively identifiable adenomatous polyps at least 1 cm in size. The false-positive rate was 2.1 [2.0, 2.2] false polyps per patient. Both carcinomas were detected by CAD at a false-positive rate of 0.7 per patient; only one of the two was detected by optical colonoscopy prior to segmental unblinding. At both the 8 mm and 10 mm adenoma size thresholds, the per patient sensitivities of CAD were not significantly different from those of optical colonoscopy prior to segmental unblinding.
Conclusions
The per patient sensitivity of CT virtual colonoscopy computer-aided polyp detection in an asymptomatic screening population is comparable to that of optical colonoscopy for adenomas 8 mm or larger and is generalizable to new CT virtual colonoscopy data.
Keywords: CT, colon, 3D reconstruction, colon cancer, image processing, computer-aided detection
Colorectal cancer is the second leading cause of cancer death in Americans 1. It is known that with proper screening, colorectal cancer can be prevented. Unfortunately, many patients do not undergo screening due to the perceived inconvenience and discomfort of existing screening tests. Virtual colonoscopy (also known as CT colonography), a CT scan based imaging method, has been under study for the past 10 years and shows promise as a method of colorectal cancer screening that may be acceptable to many patients 2, 3.
Recent large clinical trials have suggested that virtual colonoscopy may have high sensitivity and specificity for polyp detection 4, 5. Other studies have raised questions about its reproducibility and accuracy in actual clinical practice 6–9. If virtual colonoscopy is to be widely disseminated for colorectal cancer screening, methods that improve consistency and accuracy would be highly desirable.
Computer-aided polyp detection has been proposed by a number of investigators to improve the consistency and sensitivity of virtual colonoscopy interpretation and to reduce interpretation burden 10. Preliminary studies of prototype computer-aided detection (CAD) systems on small patient datasets have reported per polyp sensitivities from 64% to 100% and false-positive rates from 1 to 11 false-positives per patient for detecting polyps 1 cm or larger 11–17. However, there is currently insufficient evidence as to whether CAD is accurate in a screening population and whether the reported results generalize to independent data.
The purpose of this study is to provide this evidence by assessing CAD performance on a large consecutive prospectively-enrolled asymptomatic screening patient population. To ascertain the generalizability of CAD’s performance, we randomized the patients’ data into separate training and test sets, and evaluated the performance of CAD on each dataset.
Patient Population
The patient population consisted of 1,253 asymptomatic adults between 40 and 79 years of age at three medical centers (“Institutions 1–3”), of whom 1,233 underwent complete same day virtual and optical colonoscopy 4. Twenty of the 1,253 patients were excluded because of incomplete optical colonoscopy, inadequate preparation or failure of the CT colonographic system. The study was approved by the institutional review boards (IRBs) at all three centers. Written informed consent was obtained from all patients. This study was part of the original IRB-approved project and consent form that led to publication of Ref. 4, and the patient population is the same.
Bowel Prep
Patients underwent a 24-hour colonic preparation that consisted of oral administration of 90 ml sodium phosphate, 10 mg bisacodyl, 500 ml of barium (2.1% by weight) and 120 ml of diatrizoate meglumine and diatrizoate sodium given in divided doses 18.
CT Scanning
A small flexible rectal catheter was inserted and pneumocolon achieved by patient-controlled insufflation of room air. Each patient was scanned in the supine and prone positions during a single breathhold using a four-channel or eight-channel CT scanner (General Electric LightSpeed or LightSpeed Ultra). CT scanning parameters included 1.25 – 2.5 mm section collimation, 15 mm per second table speed, 1 mm reconstruction interval, 100 mAs and 120 kVp.
Optical Colonoscopy
Optical colonoscopy (OC) was performed by one of 17 experienced colonoscopists. Our technique for segmental unblinding of virtual colonoscopy results at OC has been previously described 4 and reduces OC false-negatives by as much as 12% for large adenomas (≥ 10 mm) 19. The colonoscopists used a calibrated guidewire to measure polyp size, and recorded whether the polyp was located on a haustral fold and a subjective assessment of polyp shape (sessile, pedunculated or flat).
CT Colonography Database
CT images from the virtual colonoscopy studies from each of the 3 institutions were loaded onto a computer server. The CT images from 47 patients could not be located or restored and were excluded from further analysis; this left 1,186 patients with complete data.
Recording the ground truth
To assess the performance of the CAD software, we developed an enhanced ground truth (calibration data) based upon manual determination of the three-dimensional borders of polyps. Each polyp 6 millimeters or larger found at optical colonoscopy was located on the prone and supine virtual colonoscopy examinations using three-dimensional endoluminal reconstructions with “fly-through” capability and multiplanar reformatted images (Viatronix V3D colon, research version, Viatronix).
For each polyp and for each position (supine and prone), a marker was placed manually in the center of each polyp using computer software. Then the borders of the polyp on each slice that contained the polyp were manually traced. The markers (approx. 500) and borders (approx. 3650) were stored in data files. The markings and tracings were done by a trained research assistant (D.B.) supervised by a radiologist (R.M.S.).
Radiologist False Positives
To assess the potential clinical significance of CAD false positives, we created a database of radiologist false positives to enable comparison of the two sets for any commonality. This database allowed us to determine whether radiologists and CAD made the same false positives. A trained research assistant (V.K.), supervised by a radiologist (R.M.S.) identified the false positive polyps reported on the same cases by the radiologists in Ref. 4. Each false positive that was identifiable in retrospect was marked and manually traced as above.
CAD System
The CAD system has been described in detail elsewhere 12, 17. It consisted of automated identification of the colonic lumen and wall 20, electronic subtraction of opacified colonic fluid 21, calculation of colonic surface features, segmentation of candidate polyps to locate their entire three-dimensional boundaries 22 and classification to distinguish true and false positive polyp detections 23, 24.
The output of the CAD system was a series of locations of polyp candidates in the CT images. The location data could be converted to a graphical overlay on three-dimensional virtual colonoscopy images.
Matching the Ground Truth and Computer Detections
The CAD software compared its detections with the ground truth tracings in a blinded fashion. If any part of a detection matched any part of a manual tracing of a polyp, the detection was considered a true-positive; otherwise the detection was considered a false-positive. Similarly, if any part of a detection matched any part of a manual tracing of a radiologist false-positive, the detection was considered a matching false-positive.
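The any-voxel-overlap matching rule described above can be sketched as follows. This is a hypothetical illustration, not the authors' software; the function and variable names are invented for the example, and polyps are represented as sets of integer voxel coordinates.

```python
# Sketch of the matching rule: a CAD detection is a true positive if any
# of its voxels intersects any voxel of a manually traced polyp border.
# All names here are illustrative, not taken from the authors' system.

def classify_detection(detection_voxels, traced_polyps):
    """Return the index of the matched traced polyp, or None for a false positive."""
    det = set(detection_voxels)  # set of (x, y, z) integer voxel coordinates
    for idx, polyp_voxels in enumerate(traced_polyps):
        if det & set(polyp_voxels):  # any shared voxel counts as a match
            return idx
    return None

# Tiny worked example: the detection shares voxel (5, 5, 5) with the
# second traced polyp, so it matches polyp index 1.
polyps = [[(0, 0, 0), (0, 1, 0)], [(5, 5, 5), (5, 6, 5)]]
detection = [(4, 5, 5), (5, 5, 5)]
print(classify_detection(detection, polyps))  # -> 1
```

The same function, applied to the traced radiologist false-positives instead of the polyp tracings, would identify "matching false-positives" as described in the text.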
Training Method and Testing
As for other types of radiology CAD such as detecting lung nodules on CT scans or breast cancer on mammography, the CAD system for detecting polyps must be trained on proven cases. The training “teaches” the computer program how to discriminate between true polyps and non-polyps. After training, the entire CAD system, including the classifier, should be applied to new “test” cases to provide a fairer assessment of future performance.
To implement this, the data set was divided into separate training and test sets. We chose to train on 1/3 and test on the remaining 2/3 of the data. This partitioning of the data enables better statistical power during testing and quicker processing during technical development when the training set is used. The division into training and test data sets was done using a random number generator that assigned patients from all three centers to either the training or test sets (Microsoft Access). Characteristics of the patients in the training and test sets are shown in Table 1.
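The randomized 1/3 : 2/3 partition could be sketched as below. This is an illustrative reimplementation, not the authors' Microsoft Access procedure; the seed and function name are invented, and a simple rounded fraction is used (the study's actual split was 394 training / 792 test patients).

```python
# Hypothetical sketch of randomly assigning patients from all centers
# to a ~1/3 training / ~2/3 test split, with a fixed seed so the
# assignment is reproducible.
import random

def split_patients(patient_ids, train_fraction=1 / 3, seed=42):
    """Shuffle patient IDs and split them into (training, test) lists."""
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)
    n_train = round(len(ids) * train_fraction)
    return ids[:n_train], ids[n_train:]

train, test = split_patients(range(1186))
print(len(train), len(test))  # -> 395 791
```

Sequestering the test set (as the next paragraph describes) then amounts to never reading the `test` list during algorithm development or classifier training.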
Table 1
Patient Population in the Database
Testing cases were sequestered and not used during development or training 25. Once acceptable training was accomplished, testing was run to produce the results shown herein. We trained and tested both with and without merging of overlapping detections; because merging gave superior performance during training, we present only results for merged detections. Details of the training and classifier design have been previously published 23, 24, 26.
The training was done using detections from the training set cases from all three institutions. Training was done for adenomas at 10, 8, and 6 mm size thresholds. Adenomas smaller than these size thresholds and all non-adenomatous polyps were placed in the false-positive set during training. The outputs of the training were three different classifiers, one for each size threshold, that were individually applied to the CT colonography test data.
The CAD software executed on both the Linux and Windows operating systems. The majority of the cases (> 99%) were run on a Linux super-cluster (a network of inexpensive computers linked together) to more efficiently analyze the large number of CTC exams 27. As many as 64 exams could be analyzed simultaneously on the super-cluster. CAD successfully analyzed all but four training (two supine and two prone) and three test exams (two supine and one prone). The processing time per patient was 20.2±8.0 minutes (n=1179), approximately half of which time was spent reading the images across the network.
Data Analysis
We used free-response receiver operating characteristic (FROC) analysis, the standard method for evaluating CAD performance 28. FROC analysis produces curves that graphically show the sensitivity of CAD for detecting polyps versus false-positive rate (number of false positives per patient) for different settings of a tunable parameter in the classifier. As is typical in CAD, one can tune the CAD system to yield higher sensitivity at the expense of a greater number of false-positives. FROC curves are presented for different adenoma size categories and for training and testing. Because we are focusing on the more clinically significant adenomatous polyps, true-positive detections on proven non-adenomatous polyps were ignored and not included in the false-positive rates for the FROC analysis. Because the number of non-adenomatous polyps (Table 2) was small relative to the number of patients, the effect of this procedure on false positive rates is negligible.
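The FROC construction described above can be sketched as follows. This is a minimal illustration under assumed data, not the study's analysis code: each CAD candidate is given a classifier score and a flag for whether it matched a true polyp, and sweeping a score threshold (the "tunable parameter") traces out sensitivity versus false positives per patient. All names and numbers are invented.

```python
# Minimal FROC sketch: for each threshold on the classifier score,
# compute per-polyp sensitivity and false positives per patient.

def froc_points(candidates, n_polyps, n_patients, thresholds):
    """candidates: list of (score, is_true_polyp) tuples.
    Returns a list of (false positives per patient, sensitivity) points."""
    points = []
    for t in thresholds:
        kept = [c for c in candidates if c[0] >= t]
        tp = sum(1 for _, is_tp in kept if is_tp)       # true-positive detections
        fp = sum(1 for _, is_tp in kept if not is_tp)   # false-positive detections
        points.append((fp / n_patients, tp / n_polyps))
    return points

# Toy data: 5 candidates, 2 true polyps, 2 patients.
cands = [(0.9, True), (0.8, False), (0.7, True), (0.4, False), (0.3, False)]
print(froc_points(cands, n_polyps=2, n_patients=2, thresholds=[0.5, 0.2]))
# -> [(0.5, 1.0), (1.5, 1.0)]
```

Lowering the threshold from 0.5 to 0.2 keeps the same sensitivity here but triples the false positives, illustrating the trade-off the text describes.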
Table 2
Polyps Identified
While FROC curves show the spectrum of CAD sensitivities across a range of false-positive rates, for clinical use a CAD system is typically set at a specific operating point on the FROC curve with fixed sensitivity and false-positive rate. For each of the three size thresholds, we selected an operating point on the FROC curve. We report the sensitivities and false-positive rates at these operating points in the Tables. The operating points were chosen in relatively flat parts of the FROC curves where there were diminishing gains in sensitivity as the false-positive rates were increased. The operating points were chosen somewhat arbitrarily but represent reasonable trade-offs between sensitivity and false-positive rates.
Assessments of false-positives
A random subset of 64 false-positives was selected from those found after application of the classifier trained on adenomas 10 mm or larger to determine their cause. Images of these false-positives were loaded into a software application developed by author J.Y. that creates a mosaic of images that can be reviewed rapidly to determine the cause of the false-positives.
Sub-group analyses
To better characterize CAD performance, we computed CAD’s sensitivity three ways: for all polyps, for those surrounded by luminal air and for those submerged in opacified fluid. A polyp was considered to be submerged if by visual assessment 50% or more of its surface was covered by fluid. Polyps were not considered submerged if they were merely coated with a thin layer of opacified fluid. We also stratified detection performance by polyp shape (sessile, pedunculated or flat), location in the colon and whether the polyps were on folds.
Statistical Analysis
Sensitivity was computed two ways: using all polyps found at segmentally unblinded optical colonoscopy and using only those polyps visible upon retrospective review of the CT colonography images. The former is useful for comparing the overall sensitivity of CAD to that of optical colonoscopy prior to segmental unblinding and literature reports of radiologist interpretation. The latter is useful for distinguishing CAD’s performance from shortcomings of the CTC technique itself. For example, some polyps, particularly those 6 or 7 mm in size, could not be found on the supine and/or prone views. Consequently, it is not possible to train on them or to confirm whether CAD detected them.
We report exact 95% confidence intervals for sensitivities and false positive rates (SAS Software Version 9.1), use the Fisher exact test to compare proportions, and consider statistical significance to be p<0.05. Bootstrapping was used to compute standard deviations over a range of operating points for the FROC analysis. The bootstrapping was done by determining FROC curves for each of 100 random samples of 792 test patients with replacement (duplicates allowed) and then estimating the standard deviation at fixed values of the sensitivity and false positive rate on the FROC curves.
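The patient-level bootstrap described above can be sketched as follows. This is an illustrative outline, not the study's SAS analysis: resample the test patients with replacement, recompute per-patient sensitivity on each replicate, and take the standard deviation across replicates. The toy data and all names are invented; the real analysis additionally fixed operating points on the FROC curve for each replicate.

```python
# Hedged sketch of bootstrapping the standard deviation of per-patient
# sensitivity by resampling patients with replacement.
import random
import statistics

def bootstrap_sensitivity_sd(patient_results, n_boot=100, seed=0):
    """patient_results: list of (has_polyp, detected) booleans, one per patient.
    Returns the standard deviation of sensitivity across bootstrap replicates."""
    rng = random.Random(seed)
    sens = []
    for _ in range(n_boot):
        # Sample patients (not polyps) with replacement, duplicates allowed.
        sample = [rng.choice(patient_results) for _ in patient_results]
        positives = [p for p in sample if p[0]]
        if positives:  # skip replicates with no polyp-bearing patients
            sens.append(sum(1 for p in positives if p[1]) / len(positives))
    return statistics.stdev(sens)

# Toy cohort loosely shaped like the test set: 28 polyp-bearing patients,
# 25 of them detected, among 792 total patients.
toy = [(True, True)] * 25 + [(True, False)] * 3 + [(False, False)] * 764
print(round(bootstrap_sensitivity_sd(toy), 3))
```

The resulting standard deviation for this toy cohort is on the order of the binomial standard error for 25/28, consistent with the 4%–6% range reported later in the Results.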
Results
The patients were distributed into the training and test sets as shown in Table 1, with similar age and gender distributions in each set, reflecting the 2:1 test-to-training split. The polyp distributions are shown in Table 2.
The FROC curves are shown in Figure 1 for the three different classifiers trained to detect adenomatous polyps ≥ 10, 8, and 6 mm. These curves indicate that at a constant false-positive rate, sensitivity was higher for larger polyps. Sensitivity was also higher on the training set compared to the test set, although the differences were small (< 5%) for the 8 mm and 10 mm size thresholds. The three operating points are indicated by their associated error bars.
Figure 1
Free-response receiver operating characteristic (FROC) curves. FROC curves for the training (open markers) and test (filled markers) sets are shown for adenomatous polyps 10 mm or larger (circles), 8 mm or larger (squares), and 6 mm or larger (triangles).
The per polyp and per patient sensitivities at the operating point at each size threshold are shown in Table 3. At a false-positive rate of 2.1 per patient for polyps 10 mm or larger, the per polyp and per patient sensitivities were both 89.3%. Both carcinomas were found at a false-positive rate of 0.7 per patient. The sensitivities were lower for the two smaller size thresholds. Example virtual colonoscopy images of 1.4, 0.8 and 0.6 centimeter polyps detected by CAD are shown in Figures 2–4.
Table 3
Performance Characteristics of Virtual Colonoscopy CAD for the Detection of Adenomas Based on Retrospective Review
Figure 2
(A) Optical and (B, C) three-dimensional virtual colonoscopy images of a 1.4 cm polyp in the transverse colon of a 64-year-old female in the test set. The blue coloring in (C) indicates the part of the polyp detected by CAD. A portion of the colon centerline is also shown.
Figure 4
(A) Optical and (B, C) three-dimensional virtual colonoscopy images of a 0.6 cm polyp in the transverse colon of a 65-year-old male in the test set. The blue coloring in (C) indicates the part of the polyp detected by CAD. A portion of the colon centerline is also shown.
The sensitivities of first-look optical colonoscopy (prior to segmental unblinding) and virtual colonoscopy CAD, using a baseline of all adenomas found by segmentally-unblinded optical colonoscopy, are compared in Table 4. The per patient sensitivities of CAD were not significantly different from those of first-look optical colonoscopy at the 8-mm and 10-mm size thresholds; the per polyp sensitivities were not significantly different at the 10-mm size threshold. Optical colonoscopy initially missed one of the two carcinomas prior to segmental unblinding; CAD detected both cancers.
Table 4
Performance Characteristics of Virtual Colonoscopy CAD and First-Look Optical Colonoscopy for the Detection of Adenomas Based on All Adenomas
Standard deviations of sensitivity ranged from 4% to 6%, and of false positive rate from 0.1 to 0.3 per patient, at the operating points (Figure 1). The bootstrap analysis revealed that the standard deviations in sensitivity increased at lower false positive rates to a maximum of 10%. The standard deviations in false positive rate increased at higher false positive rates, to a maximum of 0.8 per patient.
Sensitivity was higher for adenomatous polyps in the air-filled part of the colonic lumen compared to the fluid-filled part (Table 5). The sensitivity differences were statistically significant for five of six pair-wise comparisons. In general, polyps were more frequently located in the air-filled part of the colonic lumen.
Table 5
CAD Sensitivity for Adenomas Surrounded by Air or Fluid
Sensitivity of polyp detection as a function of shape, location and relationship to a haustral fold is shown in Table 6. Larger polyps were most frequently pedunculated, smaller polyps were most frequently sessile. For the 6 mm and larger polyps, CAD sensitivity was lower for sessile polyps compared to pedunculated polyps and for polyps on a fold compared to polyps not on folds. There were no significant sensitivity differences for left-sided compared to right-sided polyps. None of five flat polyps were detected by CAD.
Table 6
CAD Sensitivity According to Adenoma Shape+, Relationship to a Haustral Fold or Location in the Colon
Of CAD false-negatives, 67% (2/3), 90% (9/10) and 89% (41/46) were for adenomatous polyps on or touching a fold and 67% (2/3), 80% (8/10) and 24% (11/46) were on or near (within a few voxels of) the air-fluid boundary at the 10, 8 and 6 mm size thresholds, respectively.
Analysis of 64 random CAD false-positives from the classifier trained at the 1 cm threshold (operating at a false-positive rate of 2.1 per patient) showed that the majority were caused by the ileocecal valve (52/64, 81%). The remainder were due to haustral or rectal folds, residual stool or fluid, or other causes.
The radiologists identified 165 false-positive polyps of all sizes in the test set, of which 126 could be found on at least one view (supine or prone). Of 1,692 CAD false-positive detections in the test set (FP rate 2.1 per patient), only 15 CAD false-positives (0.9%) matched radiologist false-positives.
Discussion
CT virtual colonoscopy has progressed rapidly since its inception in 1994 29. Several large clinical trials have been published 4, 6, 8, 30. Some of these trials have reported excellent sensitivity but others have demonstrated relatively poor sensitivity. The causes of poor sensitivities have been variously attributed to out-of-date CT scanner technology, absence of bowel opacification, inadequate interpretation software, improper interpretation approach (2-D rather than 3-D) or lack of training of the interpreters 7, 31–34. While there is consensus that virtual colonoscopy is appropriate for such indications as incomplete colonoscopy, there is ongoing debate about its role in the asymptomatic average risk (screening) patient.
The process of interpreting virtual colonoscopy exams is one area that has received considerable scrutiny in recent years. For example, there is debate over whether images should be read using a primary 2-D versus primary 3-D approach, whether different interpretation software yields different results and whether training or occupation affect interpretation skill 6, 7, 9, 35–39. It is clear that different observers interpret virtual colonoscopy images with different levels of skill. For example, Fletcher et al. found that 17 of 30 false negative polyps one centimeter or larger were missed because of perceptual error 40. By detecting disease on radiologic images with high sensitivity and low false-positive rate, CAD can potentially improve overall physician interpretative performance, diminish the frequency of perceptual errors, and allow more poorly performing interpreters to attain performance levels comparable to experts 41, 42.
A number of CAD systems for polyp detection have been described 12, 14, 43–52. In a typical implementation, computer-aided polyp detection analyzes the surface of the colon to identify polyp-like shapes that protrude into the colonic lumen. Factors such as colonic wall thickness, surface curvature and contrast enhancement have been proposed as useful features that can be quantitated and can distinguish polyps from normal colonic mucosa 11–14, 17, 44, 47, 53. While these works are encouraging, in general they have used small highly selected patient populations, unclear patient selection criteria, or more readily detectable conspicuous polyps to develop and assess the CAD system. In addition, with few exceptions 54, 55, data have come from a single institution with testing done on the same data used for training.
While CAD development for polyp detection has proceeded along many fronts, one common and critical element is validation of performance on a database of proven cases. There are many important issues about developing the database and validating performance if the CAD system is to be generalizable to new patient data. It is accepted by many experts that the key elements of the database are that it be an unbiased collection of proven cases of sufficient number to adequately reflect the diversity of polyp sizes, shapes and locations in the patient population. It is also critical to determine the generalizability of the CAD system by assessing its performance on a fresh set of data (a test set) different from that upon which it was developed (the training set). Our database and validation methods were chosen to fulfill these important criteria. In this paper, we used data from 1,253 consecutive screening cases from three medical institutions, less the approximately 5% that were excluded, and divided them into separate training and testing samples. The CT colonography data were validated with an enhanced gold standard, segmentally-unblinded optical colonoscopy. To our knowledge, this is the largest virtual colonoscopy database of its kind.
When we analyzed all polyps visible in retrospect on CT colonography, both the per polyp and per patient sensitivities were 89.3% at a false-positive rate of 2.1 per patient for polyps 10 mm or larger. At the 8 mm size threshold, the per polyp and per patient sensitivities were 80.8% and 87.2%, respectively, at a false-positive rate of 6.7 false polyps per patient. These results indicate that CAD reliably finds retrospectively-visible adenomatous polyps 8 mm or larger on CT colonography images.
When compared to sensitivities of first-look optical colonoscopy and to radiologist interpretation in the largest CT colonography trials, CAD’s per adenoma sensitivity (86.2%) was equivalent or better at the 10 mm size threshold. For example, CAD’s sensitivity was not significantly different compared to radiologists’ as reported by Pickhardt et al. (47/51, 92.2% [95% CI: 81.1 – 97.8]), but was significantly greater than that reported by Cotton et al. (28/54, 52.0% [38.7 – 65.3]), Rockey et al. (35/55, 64% [49 – 77]) and Johnson et al. (double read, 26/41, 63.4% [46.9 – 77.9]) 4, 6–8. Note that Cotton et al. did not break down per polyp sensitivity by polyp histology, so that all colorectal lesions (including hyperplastic polyps) are included. Rockey et al. report combined sensitivities for detecting adenomas and cancers.
Similarly, when compared to sensitivities of first-look optical colonoscopy (85.7% and 89.6%) and to radiologist interpretation in the largest CT colonography trials, CAD’s per patient sensitivities (89.3% and 85.4%) were equivalent or better at the 10 mm and 8 mm size thresholds, respectively, and are therefore likely to be in the clinically acceptable range. For example, at the 10 mm size threshold CAD’s sensitivity was not significantly different compared to radiologists’ as reported by Pickhardt et al. (45/48, 93.8% [82.8 – 98.7]), but was significantly greater than that reported by Cotton et al. (23/42, 55.0% [39.9 – 70.0]), Rockey et al. (37/63, 58.7% [45 – 71]) and Johnson et al. (double read, 30/47, 63.8% [48.5 – 77.3]) 4, 6–8. Note that Cotton, Rockey and Johnson did not break down per patient sensitivity by polyp histology, so that all colorectal lesions (including hyperplastic polyps) are included. At the 8 mm size threshold our per patient sensitivities were not significantly different compared to that reported by Pickhardt et al. (77/82, 93.9% [86.3 – 98.0]). These comparisons do not take into account any changes in specificity that might occur as a consequence of CAD false positives.
We found that CAD developed on training data was generalizable to a separate test set. For example, the sensitivity and false positive rate of CAD were essentially identical for the training and test sets at the 10 mm size threshold. For smaller size thresholds, there was a decrease in sensitivities between the training and test sets that ranged from about 5% to 10% on average at the 8 mm and 6 mm size thresholds, respectively (Figure 1). Standard deviations at the operating points were low for sensitivity (4% to 6%) and negligible for false positive rate (0.1 to 0.3). These standard deviations, which provide an estimate of the expected change in sensitivities and false positive rates on new datasets, are likely to be in the clinically acceptable range.
For guiding practical use by clinicians and future technical improvements by researchers, it is important to ascertain particular situations in which CAD is less effective. The sensitivity of our CAD system was lower for polyps under fluid, for small sessile and flat polyps and for small polyps on folds. Many false-negatives were at the air-fluid boundary, a location difficult for CAD to analyze. Factors such as the CT attenuation and amount of opacified colonic fluid may also affect CAD performance. The bowel prep used in this study produced a relatively large volume of residual colonic fluid 56. Subsequent modifications of the bowel preparation have since reduced the amount of retained colonic fluid, which would likely improve CAD performance.
The significance of the false-positive rate is harder to assess. Physician acceptance of 2.1 or 6.7 false positive rates, at the 10 mm and 8 mm thresholds respectively, depends on a number of issues: the efficiency (speed) with which physicians can review CAD “hits” and how difficult it is to decide if a CAD hit is true or false. The former is determined by the quality of the user interface for the interpretation software and was not specifically investigated by us. The latter was studied by us at a false-positive rate of 2.1. We found that most false-positives were readily identified to be normal structures like the ileocecal valve or colonic folds. In addition, few (0.9%) of the CAD false-positives coincided with radiologist false-positives. This suggests that most CAD false-positives would be rejected by the radiologist as being unlikely to represent true polyps. There is preliminary evidence that CAD false-positives do not significantly impair radiologists’ specificity even when almost 30 false-positives are shown per patient 52.
Because of the large number of CT colonography datasets in this study, we used a Linux super-cluster to perform the CAD analyses more efficiently. In clinical practice, the CAD system described herein would be run on a readily available desktop personal computer running either the Linux or Microsoft Windows operating systems. We estimate the typical processing time to be under 10 minutes per patient using such a system.
This study has several limitations. First, we could have incorrectly matched polyps found at optical and virtual colonoscopy. This error could either increase or decrease the measured sensitivity of CAD. Second, there were a number of polyps found at optical colonoscopy that we could not find retrospectively at virtual colonoscopy. Although it is possible that CAD “false-positives” were actually true-positive detections of such polyps, we suspect this occurred infrequently. To avoid bias, we did not attempt to reclassify such polyps.
We do not report performance on hyperplastic polyps. For polyps in the test set 6 mm or larger, 31.9% (65/204) were hyperplastic polyps. While hyperplastic polyps may appear indistinguishable from adenomas on CT colonography, they have no malignant potential and consequently it is less important to detect them.
CT colonography CAD is an active area of research pursued by a number of investigators both in the academic and commercial sectors. Future improvements in CAD algorithms will likely lead to even better performance. CAD systems for CT colonography are likely to become commercially available within the next few years, pending approval by the appropriate regulatory agencies.
The economics of CT colonography CAD is an important and open issue. Unlike the situation for mammography CAD, colonography CAD is not yet reimbursable. CAD could decrease expensive radiologist interpretation time and missed cancer diagnoses, leading to cost savings. However, the work-up of radiologist false positives induced by CAD could increase costs. Each of these issues will need to be assessed.
In conclusion, we found that the sensitivity and false-positive rate of computer-aided polyp detection in an asymptomatic screening population were in the range likely to be clinically acceptable and were generalizable to fresh CT virtual colonoscopy data.
Figure 3
(A) Optical and (B, C) three-dimensional virtual colonoscopy images of a 0.8-cm polyp in the sigmoid colon of a 60-year-old man in the test set. The blue coloring in (C) indicates the part of the polyp detected by CAD. A portion of the colon centerline is shown.
We thank William R. Schindler, DO, Naval Medical Center San Diego, San Diego, CA, for providing CT colonography and supporting data; Andrew Dwyer, MD, for critical review of the manuscript; Shawn Albert and Tina R. Scott for database support; Nicholas Petrick, PhD, for helpful discussions; Maruf Haider, MD, and Meghan Miller for additional image analysis; and Sharon Robertson for manuscript preparation. Viatronix supplied the V3D Colon software free of charge. This study utilized the high-performance computational capabilities of the Biowulf Linux cluster at the National Institutes of Health, Bethesda, MD. This research was supported by the Intramural Research Program of the National Institutes of Health, Warren G. Magnuson Clinical Center.
1. Jemal A, Tiwari RC, Murray T, Ghafoor A, Samuels A, Ward E, Feuer EJ, Thun MJ. Cancer statistics, 2004. CA Cancer J Clin. 2004;54:8–29. [PubMed]
2. Gluecker TM, Johnson CD, Harmsen WS, Offord KP, Harris AM, Wilson LA, Ahlquist DA. Colorectal cancer screening with CT colonography, colonoscopy, and double-contrast barium enema examination: prospective assessment of patient perceptions and preferences. Radiology. 2003;227:378–84. [PubMed]
3. van Gelder RE, Birnie E, Florie J, Schutter MP, Bartelsman JF, Snel P, Lameris JS, Bonsel GJ, Stoker J. CT colonography and colonoscopy: assessment of patient preference in a 5-week follow-up study. Radiology. 2004;233:328–37. [PubMed]
4. Pickhardt PJ, Choi JR, Hwang I, Butler JA, Puckett ML, Hildebrandt HA, Wong RK, Nugent PA, Mysliwiec PA, Schindler WR. Computed tomographic virtual colonoscopy to screen for colorectal neoplasia in asymptomatic adults. N Engl J Med. 2003;349:2191–200. [PubMed]
5. Pineau BC, Paskett ED, Chen GJ, Espeland MA, Phillips K, Han JP, Mikulaninec C, Vining DJ. Virtual colonoscopy using oral contrast compared with colonoscopy for the detection of patients with colorectal polyps. Gastroenterology. 2003;125:304–310. [PubMed]
6. Cotton PB, Durkalski VL, Benoit PC, Palesch YY, Mauldin PD, Hoffman B, Vining DJ, Small WC, Affronti J, Rex D, Kopecky KK, Ackerman S, Burdick JS, Brewington C, Turner MA, Zfass A, Wright AR, Iyer RB, Lynch P, Sivak MV, Butler H. Computed tomographic colonography (virtual colonoscopy): a multicenter comparison with standard colonoscopy for detection of colorectal neoplasia. JAMA. 2004;291:1713–1719.
7. Rockey DC, Paulson E, Niedzwiecki D, Davis W, Bosworth HB, Sanders L, Yee J, Henderson J, Hatten P, Burdick S, Sanyal A, Rubin DT, Sterling M, Akerkar G, Bhutani MS, Binmoeller K, Garvie J, Bini EJ, McQuaid K, Foster WL, Thompson WM, Dachman A, Halvorsen R. Analysis of air contrast barium enema, computed tomographic colonography, and colonoscopy: prospective comparison. Lancet. 2005;365:305–11. [PubMed]
8. Johnson CD, Harmsen WS, Wilson LA, Maccarty RL, Welch TJ, Ilstrup DM, Ahlquist DA. Prospective blinded evaluation of computed tomographic colonography for screen detection of colorectal polyps. Gastroenterology. 2003;125:311–9. [PubMed]
9. Johnson CD, Toledano AY, Herman BA, Dachman AH, McFarland EG, Barish MA, Brink JA, Ernst RD, Fletcher JG, Halvorsen RA, Jr, Hara AK, Hopper KD, Koehler RE, Lu DS, Macari M, Maccarty RL, Miller FH, Morrin M, Paulson EK, Yee J, Zalis M. Computerized tomographic colonography: performance evaluation in a retrospective multicenter setting. Gastroenterology. 2003;125:688–95. [PubMed]
10. Summers RM, Yoshida H. Future Directions: Computer-Aided Diagnosis. In: Dachman AH, ed. Atlas of Virtual Colonoscopy. New York: Springer, 2003:55–62.
11. Yoshida H, Nappi J, MacEneaney P, Rubin DT, Dachman AH. Computer-aided diagnosis scheme for detection of polyps at CT colonography. Radiographics. 2002;22:963–79. [PubMed]
12. Summers RM, Johnson CD, Pusanik LM, Malley JD, Youssef AM, Reed JE. Automated polyp detection at CT colonography: feasibility assessment in a human population. Radiology. 2001;219:51–59. [PubMed]
13. Kiss G, Van Cleynenbreugel J, Thomeer M, Suetens P, Marchal G. Computer-aided diagnosis in virtual colonography via combination of surface normal and sphere fitting methods. Eur Radiol. 2002;12:77–81. [PubMed]
14. Paik DS, Beaulieu CF, Rubin GD, Acar B, Jeffrey RB, Yee J, Dey J, Napel S. Surface normal overlap: a computer-aided detection algorithm, with application to colonic polyps and lung nodules in helical CT. IEEE Trans Med Imaging. 2004;23:661–675. [PubMed]
15. Cathier P, Periaswamy S, Jerebko A, Dundar M, Liang J, Fung G, Stoeckel J, Venkata T, Amara R, Krishnan A, Rao B, Gupta A, Vega E, Laks S, Megibow A, Macari M, Bogoni L. CAD for polyp detection: an invaluable tool to meet the increasing need for colon cancer screening. In: CARS 2004 - Computer Assisted Radiology and Surgery. Proceedings of the 18th International Congress and Exhibition, Chicago, 23–26 June 2004:978–982.
16. Kiraly AP, Laks S, Macari M, Geiger B, Bogoni L, Nova CL. A fast method for colon polyp detection in high-resolution CT data screening. In: CARS 2004 - Computer Assisted Radiology and Surgery. Proceedings of the 18th International Congress and Exhibition, Chicago, 23–26 June 2004:983–988.
17. Summers RM, Jerebko AK, Franaszek M, Malley JD, Johnson CD. Colonic polyps: complementary role of computer-aided detection in CT colonography. Radiology. 2002;225:391–9. [PubMed]
18. Pickhardt PJ, Choi JH. Electronic cleansing and stool tagging in CT colonography: advantages and pitfalls with primary three-dimensional evaluation. AJR Am J Roentgenol. 2003;181:799–805. [PubMed]
19. Pickhardt PJ, Nugent PA, Mysliwiec PA, Choi JR, Schindler WR. Location of adenomas missed by optical colonoscopy. Ann Intern Med. 2004;141:352–9. [PubMed]
20. Iordanescu G, Pickhardt PJ, Choi JR, Summers RM. Automated seed placement for colon segmentation in computed tomography colonography. Acad Radiol. 2005;12:182–90. [PubMed]
21. Summers RM, Franaszek M, Miller MT, Pickhardt PJ, Choi JR, Schindler WR. Computer-Aided Detection of Polyps on Oral Contrast-Enhanced CT Colonography. Am J Roentgenol. 2005;184:105–108. [PubMed]
22. Yao J, Summers RM. Three-dimensional colonic polyp segmentation using dynamic deformable surfaces, In SPIE Medical Imaging, San Diego, SPIE, 2004:280–289.
23. Jerebko AK, Malley JD, Franaszek M, Summers RM. Computer-aided polyp detection in CT colonography using an ensemble of support vector machines, In CARS 2003. Computer Assisted Radiology and Surgery. Proceedings of the 17th International Congress and Exhibition, London, Elsevier, 2003:1019 – 1024.
24. Malley JD, Jerebko AK, Summers RM. Committee of support vector machines for detection of colonic polyps from CT scans, In SPIE Medical Imaging, San Diego, SPIE, 2003:570–578.
25. Gur D, Wagner RF, Chan HP. On the repeated use of databases for testing incremental improvement of computer-aided detection schemes. Acad Radiol. 2004;11:103–5. [PubMed]
26. Yao JH, Campbell S, Hara AK, Summers RM. Progressive feature vector selection scheme for computer-aided colonic polyp detection, In RSNA Scientific Assembly and Annual Meeting Program, Chicago, RSNA, 2004:633.
27. Bitter I, Brown JE, Brickman D, Summers RM. Large scale validation of a computer aided polyp detection algorithm for CT colonography using cluster computing, In SPIE Medical Imaging, San Diego, SPIE, 2004:290–294.
28. Chakraborty DP. Chapter 16. The FROC, AFROC and DROC variants of the ROC analysis. In: Beutel J, Kundel HL, Van Metter RL, eds. Handbook of medical imaging. Bellingham, Wash.: SPIE Press, 2000:771–796.
29. Vining DJ, Shifrin RY, Grishaw EK, Liu K, Gelfand DW. Virtual colonoscopy. Radiology. 1994;193(P):446.
30. Yee J, Akerkar GA, Hung RK, Steinauer-Gebauer AM, Wall SD, McQuaid KR. Colorectal neoplasia: performance characteristics of CT colonography for detection in 300 patients. Radiology. 2001;219:685–92. [PubMed]
31. Ferrucci J, Barish M, Choi R, Dachman A, Fenlon H, Glick S, Laghi A, Macari M, Morrin M, Paulson E, Pickhardt PJ, Soto J, Yee J, Zalis M. Virtual colonoscopy. JAMA. 2004;292:431–2; author reply 433.
32. Halligan S, Taylor S, Burling D. Virtual colonoscopy. JAMA. 2004;292:432.
33. Pickhardt PJ. Virtual colonoscopy. JAMA. 2004;292:431; author reply 433.
34. Summers RM, Bitter I, Petrick N. Virtual colonoscopy. JAMA. 2004;292:432–433.
35. Macari M, Milano A, Lavelle M, Berman P, Megibow AJ. Comparison of time-efficient CT colonography with two- and three- dimensional colonic evaluation for detecting colorectal polyps. AJR Am J Roentgenol. 2000;174:1543–9. [PubMed]
36. McFarland EG, Brink JA, Pilgram TK, Heiken JP, Balfe DM, Hirselj DA, Weinstock L, Littenberg B. Spiral CT colonography: reader agreement and diagnostic performance with two- and three-dimensional image-display techniques. Radiology. 2001;218:375–83. [PubMed]
37. Gluecker T, Meuwly JY, Pescatore P, Schnyder P, Delarive J, Jornod P, Meuli R, Dorta G. Effect of investigator experience in CT colonography. Eur Radiol. 2002;12:1405–9. [PubMed]
38. Taylor SA, Halligan S, Burling D, Morley S, Bassett P, Atkin W, Bartram CI. CT colonography: effect of experience and training on reader performance. Eur Radiol. 2004;14:1025–33. [PubMed]
39. Pickhardt PJ. Three-Dimensional Endoluminal CT Colonography (Virtual Colonoscopy): Comparison of Three Commercially Available Systems. AJR Am J Roentgenol. 2003;181:1599–1606. [PubMed]
40. Fletcher JG, Johnson CD, Welch TJ, MacCarty RL, Ahlquist DA, Reed JE, Harmsen WS, Wilson LA. Optimization of CT colonography technique: prospective trial in 180 patients. Radiology. 2000;216:704–11. [PubMed]
41. Jiang Y, Nishikawa RM, Schmidt RA, Toledano AY, Doi K. Potential of computer-aided diagnosis to reduce variability in radiologists’ interpretations of mammograms depicting microcalcifications. Radiology. 2001;220:787–94. [PubMed]
42. Freer TW, Ulissey MJ. Screening mammography with computer-aided detection: prospective study of 12,860 patients in a community breast center. Radiology. 2001;220:781–6. [PubMed]
43. Paik DS, Beaulieu CF, Jeffrey RB, Yee J, Steinauer-Gebauer AM, Napel S. Computer aided detection of polyps in CT colonography: Method and free-response ROC evaluation of performance. Radiology. 2000;217:704.
44. Gokturk SB, Tomasi C, Acar B, Beaulieu CF, Paik DS, Jeffrey RB, Yee J, Napel S. A statistical 3-D pattern processing method for computer-aided detection of polyps in CT colonography. IEEE Trans Med Imaging. 2001;20:1251–1260. [PubMed]
45. Acar B, Beaulieu CF, Gokturk SB, Tomasi C, Paik DS, Jeffrey RB, Yee J, Napel S. Edge displacement field-based classification for improved detection of polyps in CT colonography. IEEE Trans Med Imaging. 2002;21:1461–1467. [PubMed]
46. Kiss G, Van Cleynenbreugel J, Thomeer M, Suetens P, Marchal G. Computer aided diagnosis for virtual colonography, In MICCAI, Utrecht, Springer-Verlag, 2001:621–628.
47. Yoshida H, Nappi J. Three-dimensional computer-aided diagnosis scheme for detection of colonic polyps. IEEE Trans Med Imaging. 2001;20:1261–1274. [PubMed]
48. Yoshida H, Nappi J, MacEneaney P, Rubin DT, Dachman AH. Computer-aided diagnosis scheme for the detection of polyps in CT colonography. Radiographics. 2002;22:963–979. [PubMed]
49. Nappi JJ, Frimmel H, Dachman AH, Yoshida H. Computerized detection of colorectal masses in CT colonography based on fuzzy merging and wall-thickening analysis. Med Phys. 2004;31:860–872. [PubMed]
50. Summers RM, Beaulieu CF, Pusanik LM, Malley JD, Jeffrey RB, Jr, Glazer DI, Napel S. Automated polyp detector for CT colonography: feasibility study. Radiology. 2000;216:284–90. [PubMed]
51. Summers RM, Jerebko A, Franaszek M, Malley JD, Johnson CD. Complementary role of computer-aided detection of colonic polyps with CT colonography. Radiology. 2002;225:391–399. [PubMed]
52. Mani A, Napel S, Paik DS, Jeffrey RB, Yee J, Olcott EW, Prokesch R, Davila M, Schraedley-Desmond P, Beaulieu CF. Computed tomography colonography: feasibility of computer-aided polyp detection in a "first reader" paradigm. J Comput Assist Tomogr. 2004;28:318–326. [PubMed]
53. Summers RM, Dempsey J, Campbell SR, Yao J, Franaszek M, Brickman D, Dwyer A, Hara A. CT colonography with intravenous contrast enhancement: computer-aided polyp and mass detection. AJR Am J Roentgenol. 2004;182:75.
54. Yoshida H, Nappi JJ, Frimmel H, Miller FH, Dalal KA, Dachman AH. Computer-aided detection of polyps in CT colonography: Performance evaluation based on combination of independent databases, In RSNA Scientific Assembly, Chicago, Nov, RSNA, 2003:672.
55. Bogoni L, Jerebko A, Dundar M, Lee J, Baker M, Macari M. A multisite study to evaluate performance of CAD in polyp detection, In RSNA Scientific Assembly, Chicago, RSNA, 2004:577.
56. Franaszek M, Summers RM, Pickhardt PJ, Choi JR, Schindler W. Assessment of Obscured Colonic Surface in CT Colonography, In RSNA Scientific Assembly and Annual Meeting Program, Chicago, RSNA, 2004:618.