State of the art radiotherapy treatment delivery has changed dramatically over the last decade, moving from manual individual field setup and treatment to automated computer-controlled delivery of complex treatments including intensity modulated radiation therapy (IMRT) and other similarly complex delivery strategies. However, the quality assurance (QA) methods typically used to make sure treatment is performed precisely and correctly have not evolved in a similarly dramatic way. This paper reviews the old manual treatment process and use of record and verify systems, and describes differences with modern computer-controlled treatment delivery (CCTD). The process and technology used for CCTD are analyzed in terms of potential (and actual) problems as well as relevant published guidance on QA. The potential for improved QA for computer-controlled delivery is discussed.
Until recently, most radiotherapy treatments were performed with “manual” delivery methods used since the 1950s. The treatment plan was documented in the paper treatment chart which the therapist used each day as the reference for patient positioning and setting machine parameters (individually). The machine console was used as an electro-mechanical device whether or not computers were involved (i.e., all parameters were manually set). One important addition was the implementation of computer-based “record and verify” (R/V) systems to avoid errors in parameters used for each treatment field.
In the past 10+ years, many new radiotherapy devices and techniques have been introduced. Multileaf collimators (MLCs) and automated field shaping are now standard. Most electro-mechanical control systems have been replaced with integrated computer-control systems. Conformal radiotherapy with IMRT(1) is widely used, and Image-Guided Radiotherapy (IGRT) is a major development effort. Routine use of these new technologies has forced the delivery process to migrate from the old manual process into a new computer-controlled treatment delivery (CCTD) process. This paper describes differences between manual and CCTD processes and discusses QA implications of these differences.
For many decades, treatment has been “manual”. Treatment planning (with simulator, computerized planning, or by manual means) determined the field arrangement, the plan was written into the patient chart, and the prescription written and signed. Patients were positioned for treatment using skin marks, light field cross hairs and lasers. The treatment machine was set for each field individually by the therapist, based on treatment chart information, and blocks and wedges were manually added. After exiting the room, therapists input monitor units (MUs) and time into the console, and the field was irradiated. This scenario was repeated for each field. Port films were typically obtained weekly.
This process has many opportunities for transcription errors. When a transcription error occurs during planning or preparation, a systematic error in delivery results, as incorrect information will then be used for each treatment. Since individuals manually set each parameter for each field, there are also many opportunities to mis-read or mis-set parameters. Many standard QA measures were developed to avoid these common transcription errors, including many recommended by AAPM TG40(2).
R/V systems were developed to reduce the frequency of these common random treatment errors(3–6). Treatment error rates with R/V were reported to range from 0.5% to 3% per session(3,5,7). The R/V system must know patient, plan, and field parameters, so this information must be entered and called up correctly each treatment. When parameters are incorrectly input into the R/V system, systematic treatment errors will occur. R/V use contributes to some errors(8): a study of 43,000 treated volumes showed 15.6% of the errors were directly attributed to incorrect R/V information(9). One can address this thorny issue by making R/V an integrated part of the planning/delivery system. However, when the R/V system controls parameters used for treatment (as most do now), it is no longer “R/V”, rather, it is part of the computer-control system.
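The essential R/V check is a tolerance comparison of the set machine parameters against planned values before each field is irradiated. A minimal sketch of that logic follows (a hypothetical illustration; the parameter names, values, and tolerances are assumptions, not any vendor's schema):

```python
# Hypothetical record-and-verify (R/V) check: before a field is
# irradiated, compare the parameters set at the console against the
# planned values, each within its own tolerance.

PLANNED = {"gantry_deg": 180.0, "collimator_deg": 90.0, "mu": 150.0}
TOLERANCE = {"gantry_deg": 0.5, "collimator_deg": 0.5, "mu": 1.0}

def verify_field(set_params):
    """Return a list of out-of-tolerance parameters (empty list = OK to treat)."""
    violations = []
    for name, planned in PLANNED.items():
        actual = set_params.get(name)
        if actual is None or abs(actual - planned) > TOLERANCE[name]:
            violations.append((name, planned, actual))
    return violations

# A mis-set gantry angle is flagged before irradiation.
print(verify_field({"gantry_deg": 90.0, "collimator_deg": 90.0, "mu": 150.0}))
```

Note that such a check catches random mis-settings at the console, but not a wrong value stored in the planned data itself, which is exactly the systematic-error problem discussed above.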
Delivery technologies now performed with computer control have been envisioned for decades (for example, see work by Takahashi(10) and the Joint Center(11)). However, it was the 1980s before the first commercial CCTD system, the Scanditronix MM50 Racetrack Microtron(12), became available. This system incorporated a fully computerized control system, MLC(13), and photon and electron beams (to 50 MeV) flattened with computer-controlled scanning(14). This system was used for most early work on the use of fully computer-controlled machines(15–22). Most modern treatment machines now incorporate computer control, automated MLC-based field shaping, IMRT, plan download, automated delivery, and integrated imaging. Gantry-mounted accelerators, tomotherapy, robot-mounted accelerators, and proton therapy all share the need for a CCTD process.
QA needs for modern CCTD include QA of individual hardware and software systems, consideration of the planning/delivery process and techniques to be used, and, most importantly, QA of information flow through the entire process (Fig. 1). Assuring quality depends crucially on system design, the clinical process, careful testing and documentation, and on integration of appropriate QA into the process. Appropriate QA depends on understanding possible failure modes and incorporating safety checks that address the common ones.
The need for comprehensive CCTD system QA was demonstrated clearly 20 years ago by a series of fatal accidents involving the Therac 25 accelerator (Atomic Energy of Canada, Ltd.). These accidents resulted from procedural problems as well as hardware and software design flaws. Leveson(23) describes the failures, illustrates how software engineering addresses reliability of sophisticated software-controlled systems, and discusses non-software-related QA for these systems. The report concludes that “Most accidents involving complex technology are caused by a combination of organizational, managerial, technical, and sometimes sociological or political factors. Preventing accidents requires paying attention to all the root causes …”(23).
The main guidance on QA for computer-controlled radiotherapy comes from AAPM TG35, which was formed in response to the Therac 25 accidents. The TG35 report(24) defines a classification scheme to help determine QA priorities, describes procedures for handling potential safety hazards (inappropriate responses to problem indications are a significant component of most accidents), and recommends careful, continuing therapist training. TG35 also recommends vigorous control system testing (user interface, safety interlocks, computer-control functions), clear delineation of clinical, service and testing modes, and documentation of valid parameter ranges for system settings. The knowledge, documentation, and understanding of control system architecture, design, and implementation available to a typical user is quite limited, making effective testing difficult. TG35 therefore recommends that vendors provide, with any software installation or update, the reasons for changes, bug fix descriptions, modification details, operational changes, site-dependent and user-accessible data or software which may be affected, testing procedures, revised specifications, support documentation, operations manuals, and beta test results. Unfortunately, little of this is typically available; vendor adherence to the TG35 recommendations would enable significantly more effective testing and use.
Testing results for some commercial control systems have been published(15,26–28), and Jacky(25) has described the specification, design, implementation and safety analysis of a radiotherapy control system. Testing scripts, control system simulators and other testing-related capabilities(19) were designed into our CCRS system(18–21).
CCTD information flow (Fig. 1) highlights important QA issues involving data transfer. In the manual process, plan information was transcribed into the paper chart used for daily treatment setup. CCTD data are transferred automatically from imaging system(s) to the planning system and then to the delivery system electronic database (e-chart), so there are virtually no random transcription errors. However, there may be more potential for systematic errors, so QA efforts must be directed toward problems in planning and transfer, since these lead to systematic treatment errors. The QA required to avoid these systematic problems is quite different from that for random transcription errors (which are the target of typical MU calculation and weekly chart checks).
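One simple guard against systematic transfer errors is an end-to-end integrity check: verify that the plan record received by the delivery database matches what the planning system exported. A minimal sketch, assuming a plan represented as a nested dictionary (the structure and field names are hypothetical, not any actual planning-system format):

```python
# Hypothetical transfer-integrity check: compute a canonical digest of
# the plan at export and again at import; any altered, dropped, or
# added parameter changes the digest and exposes the transfer error.
import hashlib
import json

def plan_digest(plan: dict) -> str:
    """Canonical SHA-256 digest of a plan record."""
    canonical = json.dumps(plan, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

exported = {"patient": "ANON", "fields": [{"mu": 150.0, "gantry_deg": 180.0}]}
imported = {"patient": "ANON", "fields": [{"mu": 150.0, "gantry_deg": 180.0}]}

# Digests match only if the delivered record is identical to the export.
assert plan_digest(exported) == plan_digest(imported)
```

A digest comparison only confirms faithful transfer; it cannot detect a plan that was wrong before export, which is why review of the plan itself remains essential.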
Comparison of delivery errors for manual versus CCTD processes has demonstrated a significant decrease in random errors with CCTD(29). A total of 34,000 consecutive treatment sessions (over 15 months) were studied, evaluating all treatment errors (52 in total) documented by therapists or QA reviews. Manual treatments on two machines (therapists used paper charts and individually set up fields, without R/V) were compared to CCTD (using CCRS(18–21)) on two other machines. Errors were divided into segment errors (affecting an individual field or segment) and patient/plan errors. The CCTD segment error rate was 0.006%/segment, 40 times lower than the manual error rate, even though the CCTD treatments were significantly more complex. For the completely computer-controlled machine, the only segment errors were an incorrect parameter override (one field, first day) and missing bolus (one field). Analysis of CCTD patient/plan errors showed 75% involved wrong patient setup, mainly incorrect table height (determined only by retrospective analysis). These results demonstrate why IGRT with automated repositioning is important for high-precision treatment.
Another CCTD QA issue involves automatic data transfers through all steps in the process, since the plan may flow through the process and be used daily without anyone actually looking at or thinking about the plan. All “reasonableness” checks that usually happen as people manipulate a plan can be lost. This was a major consideration in our CCRS system design, leading to inclusion of a graphical simulator for plan review with a graphical machine model(20) for both plan preparation and delivery(18). All CCTD systems should incorporate graphical and image-based review capabilities to provide easy review of plans as they are moved, modified and treated, to assure that human review QA is not lost to automation. Each department’s QA program should be recast to respond to new CCTD methodology: e.g., weekly “chart rounds” reviews should be redefined to include full staff participation, and use of images and electronic treatment information to verify accuracy of automated and image-guided treatment deliveries.
What kind of weekly physics checks are necessary for CCTD? Standard practice dictates weekly review of the chart and treatment parameters to check documentation and look for unexpected changes, even though random recording errors and unplanned changes should not occur with CCTD. Manual checks are ineffective for IMRT and similarly complex treatments, which involve thousands of parameters that are clearly impossible to review carefully by hand. Sophisticated QA reports can compare each parameter from each session to its planned value and flag unexpected or out-of-tolerance results for review. Fig. 2 illustrates a CCRS(18–21) report which automatically compared all parameters each day, noting normal completion and highlighting exceptions. All planned changes for the treatment course were known to the system, so this weekly report performed a much more complete check of all parameters than is possible by hand, while also providing automatic documentation of all exceptions and changes to plan or delivery and automated email notification of significant discrepancies. This report left more time for physicists to investigate unexpected situations or evaluate plans and deliveries. More sophisticated QA review software would be helpful for all vendors' CCTD systems.
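The parameter-by-parameter comparison such a report performs can be sketched as follows (a hypothetical illustration; the parameter names, tolerance, and data layout are assumptions, not the CCRS report format):

```python
# Hypothetical weekly QA report: every recorded session is compared,
# parameter by parameter, to the planned values; normal completions are
# silent and only exceptions are flagged for physicist review.

def weekly_report(planned, sessions, tol=0.5):
    """Return (day, parameter, planned, delivered) for each discrepancy."""
    exceptions = []
    for day, delivered in sessions.items():
        for name, value in planned.items():
            actual = delivered.get(name, float("inf"))  # missing = flagged
            if abs(actual - value) > tol:
                exceptions.append((day, name, value, delivered.get(name)))
    return exceptions

planned = {"gantry_deg": 180.0, "mu": 150.0}
sessions = {
    "Mon": {"gantry_deg": 180.0, "mu": 150.0},  # normal completion
    "Tue": {"gantry_deg": 180.0, "mu": 155.0},  # MU discrepancy
}
print(weekly_report(planned, sessions))
```

A production version would also consult the list of planned changes for the treatment course, as the CCRS report did, so that intended modifications are documented rather than flagged as errors.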
Daily QA required for CCTD systems depends on the machine and technologies used. Modern CCTD systems control every aspect of the machine’s performance, so daily testing should be more extensive than the standard morning check, a potential increase in work and time required each day that must be addressed. Rethinking morning QA to concentrate on CCTD will minimize effort spent on less relevant standard tests. Computer-control can sequence the machine through a series of QA tests, interfacing QA devices, film, and/or portal imaging to further automate testing. Thompson has described such an automated QA program(30), including output checks, and analysis of control and mechanical accuracy. Incorporation of imaging systems can further automate some checks (31).
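An automated morning run of this kind can be sketched as a simple test sequencer (a hypothetical illustration; the check names, stub measurement functions, and tolerances are assumptions, not Thompson's actual program):

```python
# Hypothetical morning QA sequencer: step the machine through a list of
# named checks and collect pass/fail results for each.

def run_morning_qa(checks):
    """Run each (name, check_function) pair; return {name: passed}."""
    results = {}
    for name, func in checks:
        try:
            results[name] = bool(func())
        except Exception:
            results[name] = False  # a check that errors out counts as failed
    return results

# Stub readings standing in for real QA-device / imaging measurements.
def measure_output():
    return 1.005   # relative output (1.000 = baseline)

def measure_leaf_error_mm():
    return 0.3     # maximum MLC leaf position error, mm

checks = [
    ("output_constancy", lambda: abs(measure_output() - 1.0) < 0.02),
    ("mlc_positioning",  lambda: measure_leaf_error_mm() < 1.0),
]
print(run_morning_qa(checks))
```

In practice the check functions would drive the machine through each test configuration and read back measurements from interfaced QA devices or the portal imager, as described above.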
Much modern radiotherapy is performed with computer-controlled treatment delivery systems which are electronically linked to the treatment planning system. Given the high degree of connectivity and functionality, the entire planning/delivery process must be carefully crafted with QA designed to support that process. Though there has been some research on the CCTD process, many aspects of the old manual methods have simply been ported to the new systems. QA procedures have changed even more slowly, and many QA steps appropriate for the old process have been taken over to CCTD, often without much consideration of appropriateness.
New and more systematic approaches to QA for CCTD are necessary. Random transcription errors which invariably happen as humans transfer information manually are no longer the most important issue, as transfers are automated. More important are the much less common but potentially more severe systematic errors which can occur, especially in interfaces between systems. Automation also removes much human scrutiny from the process, and ways to re-implement human scrutiny must be included. More sophisticated tools for QA analysis, tuned to the real needs (and capabilities) of CCTD systems, must be provided by vendors so improved radiotherapy quality can accompany the rapidly increasing complexity of modern treatment techniques.
Conflict of Interest Statement: The author is an investigator in a research agreement between the University of Michigan and Varian Oncology Systems.