SC Strategies for Collection. A variety of strategies was used until the backlog of screens already obtained by the SCs had been delivered to the CTIL. Four SCs collected all T0 exams, then all T1 exams, then all T2 exams. Six SCs collected all three screens for participant #1, then those for participant #2, and so on. Two SCs chose a combination. After the backlog had been discharged, exams were more typically accumulated at the SCs as their participants were screened; exams were then delivered in quantities and at frequencies governed by shipping expense and effort.
Method of Delivery. Nearly all CT exams were delivered on XHDs (94.0%) or DVDs (5.9%); very few were transmitted by virtual private network over the Internet (0.1%). Most SCs chose to submit the bulk of their exams on XHDs, while one SC chose to submit its exams solely on DVDs. When an XHD arrived from an SC containing very few exams, those exams were typically copied to another XHD; the original drive was then formatted and either made available as a future swap or returned to the SC for additional exams.
SC CTIL-related Workforce. Most SCs budgeted the work over a 3-year period, though one chose 1 year and another chose 2. Among the multiple-year budgets, six SCs allocated the same number of FTEs each year, while five SCs projected diminishing numbers of FTEs. The average (±stdev) number of SC FTEs per year per 1,000 exams per year was 0.41 (±0.38), median 0.27, range 0.14–1.41.
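To make the normalization behind this statistic concrete, the sketch below computes FTEs per 1,000 exams per year from hypothetical per-SC staffing and throughput figures. The per-SC values are invented for illustration; only the summary statistics above come from the study.

```python
from statistics import mean, median, stdev

# Hypothetical per-SC staffing and throughput; the actual per-SC
# figures were not published, only the summary statistics.
sc_data = [
    # (FTEs per year, exams per year)
    (0.5, 3500),
    (1.0, 2600),
    (0.8, 1900),
    (0.3, 2200),
]

# Normalize each SC to FTEs per 1,000 exams per year.
rates = [fte / (exams / 1000) for fte, exams in sc_data]

print(f"mean   {mean(rates):.2f}")
print(f"stdev  {stdev(rates):.2f}")
print(f"median {median(rates):.2f}")
```

Normalizing to a per-1,000-exam rate is what allows SCs of very different sizes to be compared on one scale, as in the range 0.14–1.41 reported above.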
Delivery Patterns. Figure 2 shows delivery patterns for all SCs, by month, from January 2005. Points in the lower half are typically DVD or Internet deliveries, while those in the upper half are more likely external-hard-drive deliveries. Some SCs chose to deliver large numbers of exams infrequently, while others chose smaller numbers more frequently.
Fig 2 Delivery patterns by screening center (A–L) and month. Gross view of the variability in shipment frequency and numbers of exams per shipment from screening centers. Observations above the 100 mark were typically shipments on external hard drives.
Cumulative Progress. Initial deliveries of de-identified exams began in January 2005. Time spent hiring and training image librarians, as well as fine-tuning workflow operations, delayed actual archiving until May 2005. The presence of image-embedded PHI in some exams required modification of the de-identification software to detect such exams prior to their delivery to the CTIL. By fall 2005, it was apparent that an additional workforce of part-time image viewers would be needed. From December 2005 through March 2007, six part-time image viewers were hired for varying hours per week and durations in months. Across the two image librarians and six part-time image viewers, the average (±stdev) number of exams viewed was 5,964 (±2,859), range 2,383–11,864. The part-time image viewers together viewed 36,096 exams, working variable numbers of hours (average 535; stdev 264; range 193–773) and viewing varying numbers of exams (average 6,016; stdev 3,376; range 2,381–12,015), translating to FTE per 1,000 exams of 0.05 (stdev 0.02; range 0.02–0.08).
Figure 3 shows cumulative numbers of exams by stage: received, prelim-QA’d, visual-QA’d, and archived. The number received includes duplicate exams and resubmissions (replacing problematic priors) tendered by SCs. The gap between exams received and exams archived was traced to several causes: database-management tools of greater complexity than originally anticipated, a longer average time to fully process each exam than originally anticipated (requiring the supplemental image viewers), and unforeseen exam-specific problems that arose after the initial collection design. For example, the discovery of image-embedded PHI during visual-QA led to the installation of more robust de-identification software at both the SCs and the CTIL prelim-QA checkpoints; the software was then run on all exams awaiting processing at the CTIL. By the end of December 2006 (the original target completion date), only 37,798 exams (78% of the 48,547 expected) had been delivered. The remaining exams were delivered in 2007, and archiving was completed in February 2008. A good part of the final months’ effort was spent resolving outstanding issues with problematic exams and verifying with SCs the specific exams that were unavailable (lost, corrupt, compressed). These resolutions were delayed, at times, because SCs had not budgeted for this reconciliation period, and personnel were not always readily available.
Fig 3 Cumulative exams by processing stage. Total received (49,750) included duplicate and problematic exams not further processed; total archived (48,547) included those input that passed prelim-QA and visual-QA. Most activity was completed by early 2007.
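The cumulative-by-stage bookkeeping described above can be sketched as follows. The stage names are taken from the text; the `ExamTracker` class and its methods are a hypothetical illustration, not the CTIL's actual management database.

```python
from collections import Counter

# Stages in the order an exam moved through the pipeline
# (names from the text; the implementation is illustrative).
STAGES = ["received", "prelim_qa", "visual_qa", "archived"]

class ExamTracker:
    def __init__(self):
        self.stage = {}        # exam_id -> index into STAGES
        self.rejected = set()  # duplicates / problematic exams

    def receive(self, exam_id):
        self.stage[exam_id] = 0

    def advance(self, exam_id):
        self.stage[exam_id] += 1

    def reject(self, exam_id):
        # Duplicate or unfixable exam: counted as received
        # but not further processed.
        self.rejected.add(exam_id)

    def counts(self):
        # Cumulative count per stage: an archived exam has also
        # been received, prelim-QA'd, and visual-QA'd.
        c = Counter()
        for exam_id, idx in self.stage.items():
            for s in STAGES[: idx + 1]:
                c[s] += 1
        return c

t = ExamTracker()
for eid in ("A1", "A2", "B1"):
    t.receive(eid)
t.advance("A1"); t.advance("A1"); t.advance("A1")  # A1 fully archived
t.advance("A2")                                    # A2 passed prelim-QA
t.reject("B1")                                     # B1 a duplicate
print(t.counts())
```

Counting cumulatively in this way is what makes the "received" curve an upper envelope of the "archived" curve, so the gap between the two directly visualizes the processing backlog.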
Number of Archived CT Exams. Of the maximum number of possible CT exams (51,927, or three exams from each of 17,309 participants), performed screens numbered 48,723 (94%). Screens not performed were due to participant withdrawal (voluntary, death, or as required by NLST protocol); the details of participant withdrawals remain unknown to CTIL personnel. Of the performed screens, only 176 (0.36%) were unavailable (lost, corrupt, compressed) from the SCs during the CTIL collection period, leaving 48,547 (99.64% of the 48,723 screens performed) actually delivered and archived.
Number of Exams and Images per Exam, by SC. SCs enrolled varying numbers of participants. The figure shows the distribution of CT exams by SC: both the potential maximum number (three screens from every participant) and the actual number received and archived. The number of image slices per exam varied for many reasons, among them participant size, the protocol applied (reconstructed slice thickness and interval), and the number of separately reconstructed series per exam. Some SCs saved only the required single protocol-specified image series, while other SCs reconstructed and saved multiple series. The figure also shows the average number of slices per exam by SC and the variation within each SC. For each SC, the lighter gray bar is the actual number of exams archived (scaled to the left ordinate); the darker gray cap added to it yields the maximum number of exams had all participants received three screens (i.e., no drop-outs). The circle at the center of each two-stdev error bar (scaled to the right ordinate) marks the average number of slices per exam from that SC. Overall, the average number of slices per exam was 257.
Exams and average slices/exam, by screening center.
PHI Detections. The potential for transmission of PHI was anticipated, but the locations in which it actually appeared were unexpected. Despite successful de-identification of the DICOM elements, we encountered exam dates in patient-protocol text-only image series, demographics in scout images, and radiology reports in secondary-capture image series. These discoveries required immediate notification of the originating SCs, Westat, and NCI, and the simultaneous suspension of further collections from all SCs. Before collection resumed, the de-identification software was patched to detect and remove these kinds of image series. The upgraded software was delivered to all SCs, the SCs ran the software and provided evidence of successful implementation, and collection was resumed. At the CTIL, all unprocessed exams, whether or not they had yet arrived, were subjected to the same checks addressed by the software patches delivered to the SCs. Though applied twice (at the SCs and at the CTIL), the checks made by these patches required only a few seconds per exam.
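A minimal sketch of this kind of screening is shown below. It flags series whose headers match the failure modes described above (secondary captures, scout/report/protocol series, date-like text in string elements). The dict-of-headers representation, tag names, and keyword lists are simplifying assumptions, not the CTIL's actual software; real DICOM would be parsed with a DICOM library, and burned-in pixel PHI still required visual review.

```python
import re

# Heuristic flags for image series likely to carry embedded PHI,
# based on the failure modes described in the text. All names and
# patterns here are illustrative assumptions.
DATE_RE = re.compile(r"\b(19|20)\d{2}[-/]?\d{2}[-/]?\d{2}\b")
SUSPECT_MODALITIES = {"SC"}  # secondary capture
SUSPECT_DESCRIPTIONS = ("SCOUT", "REPORT", "PROTOCOL", "DOSE")

def flag_series(headers):
    """Return the reasons this series should be quarantined for review."""
    reasons = []
    if headers.get("Modality") in SUSPECT_MODALITIES:
        reasons.append("secondary-capture series")
    desc = headers.get("SeriesDescription", "").upper()
    if any(word in desc for word in SUSPECT_DESCRIPTIONS):
        reasons.append(f"suspect description: {desc!r}")
    for tag, value in headers.items():
        if isinstance(value, str) and DATE_RE.search(value):
            reasons.append(f"date-like text in {tag}")
    return reasons

series = {"Modality": "SC", "SeriesDescription": "Radiology Report",
          "ImageComments": "Exam of 2004-06-17"}
print(flag_series(series))
```

Because checks like these operate only on header text, they cost seconds per exam, which is consistent with the low overhead reported above for the doubly applied patches.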
Problems. The vast majority of exams were delivered and archived without incident. However, multiple unexpected problems were encountered. Some could be solved in such a way as to prevent recurrence, while others could only be solved semi-automatically; for example, an image series that contained multiple reconstructions needed to be divided into separate series, each containing images from a single reconstruction. Other issues required the engagement of SC personnel; for example, a series lacking full lung coverage was accepted “as is” only after the SC confirmed it had delivered all of the images in its possession. As collection began, exams with problems detected by the automatic and visual QA processes were “parked” for resolution while non-problematic exams were advanced through the system. Parked exams were processed in the background, when other activities were slack and/or CTIL management was available for resolution analysis. Librarians documented such problems on paper and filed them by SC and PID. Eventually, the CTIL management database was modified to record such problems, which facilitated their resolution. The majority of problems encountered are listed in Table .
Major Problems Encountered in CTIL Exams
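The multiple-reconstruction fix mentioned above can be sketched as a grouping pass over the images of a mixed series. Grouping by (slice thickness, convolution kernel) is an assumption for illustration; the CTIL's actual splitting criteria were not specified, and the flat-dict image representation stands in for real DICOM headers.

```python
from collections import defaultdict

# Semi-automatic fix: a series mixing images from multiple
# reconstructions is split into one series per reconstruction.
# The grouping key used here is an illustrative assumption.
def split_series(images):
    groups = defaultdict(list)
    for img in images:
        key = (img["SliceThickness"], img["ConvolutionKernel"])
        groups[key].append(img)
    return list(groups.values())

mixed = [
    {"SliceThickness": 2.5, "ConvolutionKernel": "STANDARD", "z": 0.0},
    {"SliceThickness": 1.0, "ConvolutionKernel": "LUNG", "z": 0.0},
    {"SliceThickness": 2.5, "ConvolutionKernel": "STANDARD", "z": 2.5},
    {"SliceThickness": 1.0, "ConvolutionKernel": "LUNG", "z": 1.0},
]
for series in split_series(mixed):
    print(len(series), series[0]["ConvolutionKernel"])
```

A fix of this shape is only semi-automatic because a librarian must still confirm that each resulting group really is one complete reconstruction before the split series are advanced through QA.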
CTIL Workforce. The principal investigator (5% FTE), laboratory director (5%), and project manager (5%) provided overall direction. A general-manager data administrator (50%) and database manager (40%) supervised two QA image librarians (each 75%) and a variable workforce of six part-time image viewers, working 8–32 h/week, no more than three at a time, for various numbers of months from December 2005 through March 2007. A systems administrator (15%) and network administrator (10%) provided installations (hardware and software), upgrades, and troubleshooting. NCI (<5%) and Westat (<5%) representatives monitored progress and coordinated with SCs to ensure timely exam delivery.
Hardware/Software Failures. Two failed laptop hard drives were promptly repaired or replaced under a warranty agreement, as was one laptop’s DVD writer. Two of 60 XHDs failed. In one case, all but 50 of 1,504 exams were rescued with special salvage software (Undelete 4.0; Executive Software International, Burbank, CA), and the SC was asked to re-send the unrecoverable remainder. In the other case, all exams had already been copied from the XHD and queued for processing, so re-sending was unnecessary; only later, when trying to verify a problem with one of the exams, was it discovered that this XHD was unreadable and in need of reformatting. In the EMC Centera mirrored archive, two nodes failed; because of the built-in redundancy, no data were lost. In late 2006, we were plagued with bottlenecks in the Merge Healthcare interface to the archive, but these were remedied with additional disk storage.