Br J Radiol. 2012 November; 85(1019): e1067–e1073.
PMCID: PMC3500805

A survey of the practice and management of radiotherapy linear accelerator quality control in the UK

A Palmer, MSc, MIPEM,1,2 J Kearton, MSc, MIPEM,1 and O Hayman, MSc, MIPEM1

Abstract

Objectives

The objective of this study was to determine current radiotherapy linear accelerator quality control (QC) practice in the UK, as a comparative benchmark and indicator of development needs, and to raise awareness of QC as a key performance indicator.

Methods

All UK radiotherapy centres were invited to complete an online questionnaire regarding their local QC processes, and submit their QC schedules. The range of QC tests, frequency of measurements and acceptable tolerances in use across the UK were analysed, and consensus and range statistics determined.

Results

72% of the UK's 62 radiotherapy centres completed the questionnaire and 40% provided their QC schedules. A total of 60 separate QC tests were identified from the returned schedules. There was a large variation between centres in the total time devoted to QC: the interquartile range was 13 to 26 h per linear accelerator per month. There has been a move from weekly to monthly testing of output calibration in the last decade, with reliance on daily constancy-testing equipment. 33% of centres thought their schedules were in need of an update, and only 30% used risk-assessment approaches to determine local QC schedule content. Fewer than 30% of centres regularly complete all planned QC tests each month, although 96% achieve over 80% of tests.

Conclusions

A comprehensive “snapshot” of linear accelerator QC testing practice in the UK has been collated, which demonstrates reasonable agreement between centres in their stated QC test frequencies. However, intelligent design of QC schedules and management is necessary to ensure efficiency and appropriateness.

A quality assurance system in radiotherapy is necessary to ensure that treatment delivery is as intended. This will include a multitude of quality control (QC) tests, implemented to evaluate actual operating performance in comparison with goal values and to enable rectification of any differences outside prescribed tolerances. Historically, the approach to improving safety in radiotherapy has been to test anything that can be tested; this has provided reassurance, but has consumed a great deal of time and resources, and has not prevented all errors. With the increasing complexity of radiotherapy equipment and techniques, these traditional methods of determining the content of a QC schedule can lead to an unmanageable quantity of tests, and the approach is not sustainable into the future. Benchmarking of QC practices between centres is a valuable initial step to provide evidence for the rationalisation and optimisation of QC tests. This may need to be followed by a radical redesign of approaches to QC, using, for example, failure modes and effects analysis (FMEA) [1], cost/benefit/risk analysis [2-4], optimal focusing of resources on patient safety [5] and formal application of the Department of Health “quality, innovation, productivity and prevention” (QIPP) concept [6].

Guidelines for linear accelerator QC schedules exist, including Institute of Physics and Engineering in Medicine (IPEM) Report 81 [7], the American Association of Physicists in Medicine (AAPM) Task Group 142 report (2009) [8], Fan and Price's “Conventional linear accelerators” [9] and the forthcoming EU guidance RP162 [10], alongside QC guidance for specific equipment (e.g. reference 11), but direct application of all of these can lead to an excessive quantity of work and duplication. However, arbitrarily reducing QC testing without full risk assessments and supporting documentation may be difficult to defend. Further, QC guidance from the Institute of Physics and Engineering in Medicine (the physics professional body in the UK), IPEM Report 81 [7], has been withdrawn from print and is due for review. IPEM Report 81 is based on, and includes, the results of a survey of UK QC practice undertaken in 1991 [12]. That survey was based on a questionnaire asking how frequently specified tests were performed. There was a large spread in check frequencies, ranging from “daily” to “commissioning only”, but the authors reported that in most cases consensus was achieved, and they chose to report median frequencies.

Numerous publications on QC tests for specific linear accelerator functionality have appeared since IPEM Report 81, and a subsequent survey of QC practice was undertaken by the START clinical trial team and published in 2001 [13]. Nevertheless, a contemporary repeat survey of UK QC practice is required to reflect the changes of the last decade, including changed functionality and reliability of equipment and new clinical applications. Most notably, benchmarking of QC practices for intensity-modulated radiotherapy (IMRT) is specifically required, especially as centres consider whether IMRT patient-specific QC is supplementary to, or a replacement for, aspects of conventional linear accelerator QC.

The publication of a comprehensive assessment of current UK linear accelerator QC practice has several potential benefits for individual radiotherapy departments. Centres may be reassured that their QC systems are in line with national average practice. Alternatively, centres may identify discrepancies from national average practice and, following investigation, this may lead either to a reduction of tests or frequencies (and hence efficiency savings) or to the resolution of deficiencies and potential improvements in safety and quality.

The IPEM Interdepartmental Audit Group E (also known as the “South East Central” regional audit group) commissioned a survey of the QC being performed on conventional C-arm gantry linear accelerators to provide a reference benchmark of current practice. Initially intended as a regional audit, this was expanded to include all centres in the UK. A supplementary questionnaire survey of linear accelerator QC services was also proposed to gauge QC management practices and performance issues across the UK, including whether QC is a key performance indicator, the percentage completion of tests, the total time spent performing QC, self-assessment of potential efficiency savings and the level of involvement of stakeholders. The results of the benchmarking comparison of QC schedules and of the survey of QC attitudes and services are presented in this work. It should be noted that this study is concerned with planned QC tests only, and not machine servicing, preventative maintenance or QC tests following repair, which add additional workload overhead.

Methods and materials

All 62 radiotherapy centres in the UK were contacted by e-mail to request their contribution to the study. Centres were asked to complete an online questionnaire and to e-mail their schedules for routine linear accelerator system QC and patient-specific IMRT QC for analysis. An initial invitation and two follow-up e-mails were sent to elicit the participation of radiotherapy centres.

The online survey was produced via www.evaluate-it.co.uk and was open for completion through September and October 2011. Eight questions were included in the survey, with between three and nine multiple-choice response options for each and an opportunity for free-text comments. The topics included in the survey are given in Table 1. Where appropriate, statistical analysis of the responses was undertaken.

Table 1
Questionnaire topics

Data from the QC schedules were analysed by first collating a comprehensive list of all tests performed across the UK, and either matching these to the QC procedures in IPEM Report 81 [7] or identifying new test descriptors. This was performed for both linear accelerator system QC and patient-specific IMRT QC. The test frequency and tolerance values used by each centre were then compiled for each of the tests, subdivided by equipment manufacturer. Statistical analysis of the data was undertaken to evaluate the range of individual practice.
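
To illustrate this collation step, the following minimal Python sketch compiles, for each test descriptor, the proportion of schedules containing the test, the most common stated frequency and summary tolerance statistics. It is a sketch only, using hypothetical centre and test data; it is not the analysis code used in this study.

    from collections import Counter, defaultdict
    from statistics import mode

    # Each centre's schedule reduced to (test descriptor, frequency, tolerance %).
    # All names and values below are hypothetical examples.
    schedules = {
        "centre_A": [("Output calibration", "monthly", 1.0),
                     ("Output constancy", "daily", 3.0)],
        "centre_B": [("Output calibration", "weekly", 2.0),
                     ("Output constancy", "daily", 2.0)],
        "centre_C": [("Output calibration", "monthly", 1.0)],
    }

    # Collate stated frequencies and tolerances per test descriptor.
    freqs, tols = defaultdict(list), defaultdict(list)
    for centre, tests in schedules.items():
        for name, freq, tol in tests:
            freqs[name].append(freq)
            tols[name].append(tol)

    n_centres = len(schedules)
    for name in sorted(freqs):
        common_freq, n_common = Counter(freqs[name]).most_common(1)[0]
        print(f"{name}: in {100 * len(freqs[name]) / n_centres:.0f}% of schedules; "
              f"most common frequency '{common_freq}' "
              f"({100 * n_common / len(freqs[name]):.0f}% agreement); "
              f"tolerance mode {mode(tols[name])}%, "
              f"range {min(tols[name])}-{max(tols[name])}%")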

Results

Of the 62 radiotherapy centres in the UK that were contacted, 72% completed the online questionnaire and 40% submitted their QC schedules for analysis; the latter comprised 62% with Varian Medical Systems (Palo Alto, CA), 24% Elekta (Stockholm, Sweden), 5% Siemens Medical Solutions (Forchheim, Germany) and 4% both Varian and Elekta linear accelerators.

Online questionnaire survey

When asked how recently their QC schedules had been reviewed, 54% of centres stated a review had taken place within the last year, independent of any modifications owing to new equipment or specific need, and 26% stated a review had taken place within the last two years. 33% confirmed awareness that their QC schedules were in need of an update and review, and 35% suspected that efficiency savings and productivity improvements could be made in the QC work that was undertaken.

When asked how decisions are made about what QC is performed, 54% used machine reliability and historic data, 41% simply added additional QC tests for each new equipment/functionality, 37% used previous near misses, errors or incidents, 30% used risk assessments, 20% used “plan–do–study–act” systems [14], 11% used “lean quality system”, 7% used FMEA or other “industry quality system”, and 3% used QIPP analysis.

Figure 1 shows a “word cloud” visual representation of the guidance and publications that centres identified in the survey as most important in determining their linear accelerator QC requirements. The font size used in the figure is proportional to the number of centres that indicated the document was a primary reference in determining what QC they perform.

Figure 1
“Word cloud” diagram of primary influences determining UK radiotherapy centres' QC schedules for linear accelerators (font size proportional to number of citations) [3,8,15-21].
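
A figure of this style can be reproduced with the third-party Python wordcloud package, as in the minimal sketch below, in which font size scales with the supplied counts. The citation counts shown are hypothetical placeholders, not the survey results, and this is not the authors' plotting method.

    # Minimal sketch of a Figure 1-style word cloud (pip install wordcloud).
    from wordcloud import WordCloud

    # Hypothetical number of centres citing each guidance document.
    citation_counts = {
        "IPEM Report 81": 45,
        "AAPM TG-142": 20,
        "IPEM Report 94": 12,
        "IPEM Report 96": 8,
        "ESTRO Booklet 2": 5,
    }

    # Font size is scaled in proportion to the frequency supplied for each term.
    wc = WordCloud(width=800, height=400, background_color="white")
    wc.generate_from_frequencies(citation_counts)
    wc.to_file("qc_guidance_word_cloud.png")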

96% of centres achieve on average at least 80% of their planned monthly linear accelerator QC. However, fewer than 30% of centres routinely complete all tests specified in their quality system documentation. 3% of centres do not routinely record compliance data and 18% have removed some equipment functionality from clinical use to reduce overall QC demands (e.g. stopped electron treatments on one linear accelerator).

Table 2 indicates the various combinations of normal day (09:00 to 17:30), early morning, evening or weekend sessions used to perform routine QC across the UK, and the proportion of centres adopting the combinations. The most common schemes are normal day only, and normal day with supplementary evening sessions, both at 30% of centres. 61% of centres use regular evening and 18% use regular weekend sessions for QC, with only 5% of centres not having any access to linear accelerators for QC during the normal working day.

Table 2
Proportion of centres routinely using indicated combinations of normal day, evening, weekend and early morning sessions to undertake linear accelerator QC

Table 3 provides data on the time spent performing linear accelerator QC at centres in the UK. The mean total machine time for routine system QC, excluding patient-specific IMRT QC, is 15.0 h per linear accelerator per month. An additional mean 4.5 h per linear accelerator per month is required for offline QC analysis, away from the linear accelerator. The mean total time (linear accelerator and analysis time) for patient-specific IMRT QC was 1.5 h per patient.

Table 3
Time (hours) spent undertaking linear accelerator QC testing

The final questions related to the reporting of QC results and the involvement of stakeholders. 45% of centres reported percentage completion of QC tests as a performance indicator within radiotherapy physics, and 23% reported these results to other groups or managers. 9% of centres thought staff groups outside radiotherapy physics were actively interested in the percentage completion of QC tests or the results obtained. 55% of centres thought radiographers were fully informed of QC test acceptable values, while 27% thought clinicians were fully aware of tolerances for key QC tests, such as standard output measurements. 80% of respondents to the survey were happy that all QC tests undertaken had a positive impact on patient outcome.

Quality control schedules

There was wide variation in the submitted QC schedule document formats, including spreadsheets, text documents and tick-sheets. Approximately half of the submitted documents included tolerance or action values for each QC test. No statistically significant differences were found in quoted QC test frequencies or tolerance values between different linear accelerator manufacturers, based on comparison of mean values; hence data from all are combined in the results below. All results relate to conventional C-arm gantry linear accelerator platforms (not tomotherapy).

Common QC test descriptors were extracted from the returned QC schedules, and compared with the linear accelerator QC recommendations in IPEM Report 81 [7]. Table 4 presents the matched QC tests and Table 5 the QC tests that were not included in the IPEM report. The percentage of QC schedules that contained each test and the percentage of those centres testing at various frequencies (per patient, daily, weekly, monthly, annual and intermediate) are presented in the tables. It is important to note the tables represent QC listed within physics department QC schedules, and supplementary tests may be undertaken by radiographers or engineering departments. Also, a less than 100% take-up of simple tests may be the result of combination with more complex assessments (e.g. “weekly output calibration” may instead be undertaken with “IMRT point dose measurements”). Therefore, the actual coverage of tests within a radiotherapy department may be higher than implied by Table 4, which should be interpreted as a lower limit estimate.

Table 4
Stated frequency of QC checks and percentage occurrence within submitted physics department QC schedules, for checks identified in IPEM Report 81 [7], with comparison with previously recommended frequency
Table 5
Stated frequency of QC checks and percentage occurrence within submitted physics department QC schedules, for checks not included in IPEM Report 81 [7]

Across the 60 tests in the two tables, only 13% have unanimous agreement on testing frequency. The average agreement among centres on the most common frequency for each test is 61%. For 62% of tests, the most common frequency agrees with that proposed in IPEM Report 81 [7] (Table 4). Agreement is highest where the IPEM report recommended daily or annual checks and these are still performed at those frequencies in most centres, such as daily output constancy measurements and annual three-dimensional scanning water tank measurements. Notable differences between current common practice and the recommendations of the IPEM report are: output “calibration” measurements, performed monthly by 60% of centres but previously recommended weekly; output for non-standard fields, performed monthly by 58% of centres against a previous quarterly recommendation; and beam flatness and symmetry at zero gantry angle, performed monthly by 81% but previously recommended two-weekly.
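
The consensus statistics quoted above can be computed directly from per-test lists of stated frequencies. The short sketch below shows the calculation of the unanimity percentage and the mean agreement with the modal frequency, using hypothetical example data rather than the survey returns.

    from collections import Counter

    # Hypothetical stated frequencies for each test, one entry per centre.
    stated = {
        "Output constancy": ["daily"] * 10,                      # unanimous
        "Output calibration": ["monthly"] * 6 + ["weekly"] * 4,  # 60% agreement
        "Picket fence": ["monthly"] * 3 + ["weekly", "annual"],
    }

    agreements = []
    unanimous = 0
    for test, freqs in stated.items():
        _, n_modal = Counter(freqs).most_common(1)[0]
        agreement = n_modal / len(freqs)
        agreements.append(agreement)
        if agreement == 1.0:
            unanimous += 1

    print(f"{100 * unanimous / len(stated):.0f}% of tests unanimous; "
          f"mean agreement with modal frequency "
          f"{100 * sum(agreements) / len(agreements):.0f}%")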

There are an additional 23 common QC tests identified in Table 5 that were not included in IPEM Report 81 [7]. These cover IMRT, multileaf collimator (MLC) and imaging technology, which have gained widespread implementation since the publication of the IPEM report. The most commonly reported additional test is the MLC “picket fence” test, used routinely at 52% of centres. The majority of additional tests are at either monthly or multimonth frequency, although there is less agreement in testing requirements than for the more established tests reported in Table 4.

Table 6 presents a summary of the stated tolerance or action values used for QC testing at the surveyed centres, for a sample of tests. The modal value and the minimum and maximum of the range of tolerances used across the UK are provided. The mode was a good representation of the value used in the majority of centres, and frequently also coincided with the maximum of the range.

Table 6
Modal value and range of stated tolerance or action value for selected QC checks, where sufficient responses were provided for significance (n>5)

Discussion

The 72% response rate to the online questionnaire gives the results high significance as a representation of UK practice. In most cases the respondent was either the head of radiotherapy physics or the clinical scientist responsible for QC, ensuring the validity of the responses.

A surprising number of radiotherapy centres reported that they were aware their QC schedules were in need of an update and suspected that efficiency or productivity could be improved. It is possible this is a result of reliance on the aged IPEM Report 81 [7], which was cited by 100% of respondents as a primary source of guidance in determining QC schedules. Contemporary references were used as additional key guidance by only 30% of centres. Workload and resource pressures in radiotherapy physics departments mean that tasks must be prioritised, and it is possible that the need to keep pace with advanced radiotherapy techniques is to the detriment of the review of basic QC testing methodologies. There is certainly scope for a more modern approach to determining QC schedules, with very few centres adopting intelligent design for QC testing and only 30% using risk assessments. Indeed, a risk-based approach is considered preferable to the adoption of a standardised list of required QC, since it takes into account local equipment factors and proven reliability, local physics knowledge and expertise, the availability and complexity of QC testing equipment, and local radiotherapy treatment techniques. This is also the reason why complete standardisation of QC schedules and testing between radiotherapy centres is not possible. The data presented in this work are nonetheless useful as an input to risk assessment, providing a benchmark comparison of QC tests against which local circumstances can be considered.

There was significant variation in the time spent performing linear accelerator QC at centres across the UK. A limitation of the study is that “physics departments” were asked for their QC schedules, and this may have led to the non-reporting of QC tests performed by radiographers or engineers in some departments, particularly daily or weekly “quick check” tests. For this reason, the upper quartile and maximum values of reported testing time may be more reliable than the minimum. However, even accounting for this potential varied interpretation of the data, the range in time spent performing QC is considerable. There must be significant differences in the efficiency of processes, the design of QC schedules or the quantity of QC testing performed at centres across the UK. The quantity of QC testing performed may be related to ease of access to equipment, established staffing levels or other local demands.

It is surprising that fewer than half of UK radiotherapy centres report percentage completion of QC tests within their own departments, especially as only one-third of centres routinely achieve all planned QC tests each month. Including QC as a key performance indicator may improve compliance. It was also perceived that other professional groups within radiotherapy had little interest in QC and only moderate understanding of the QC tolerance values used. Raising the profile of QC may be beneficial, for example by ensuring that clinicians are aware of the accuracy of the treatment being delivered, such as absolute dosimetry output accuracy.

Collation of QC schedules from 40% of UK radiotherapy centres is deemed sufficient to be representative of practice across the UK, since respondents were distributed between both large and small centres (which may impact on QC testing economies of scale), and were not subject to bias in terms of geographical location. A number of centres could not contribute their QC schedules as they stated a desire to “update” or “review” the documents prior to submission, and this was not possible within the timescale of the study. There is a bias towards Varian linear accelerator centres, which could reduce the value of the results for alternative manufacturers' equipment, particularly where equipment-design-specific elements are considered. However, from the data received, there were no apparent differences in tolerance values or testing frequency between the different linear accelerators. For fundamental QC tests, such as output calibration and beam quality measurements, equipment manufacturer is not expected to be a deterministic factor, and hence the majority of results are applicable to all linear accelerators. Local interpretation of the presented results is essential prior to any implementation or amendment to QC testing.

The survey presented in this work differs from that conducted for IPEM Report 81 [7] in that we analysed all QC tests undertaken at each centre rather than only requesting the frequency at which specific predefined test descriptors were conducted; this is expected to lead to an increased number of reported tests. There is a high level of agreement between the QC testing frequencies reported here and those of IPEM Report 81 [7], and also with the survey conducted by Venables et al in 2001 for the START trial [13], indicating little change in basic QC practice over the last decade. A notable difference is routine output (calibration) measurement, which is now commonly performed monthly (in combination with a daily check device), compared with weekly in both the survey by Venables et al [13] and IPEM Report 81 [7]. The 2009 report on QC testing from the USA, AAPM Task Group 142 [8], also recommends monthly physics output measurement and daily constancy checks. The AAPM report [8] covers modern treatment equipment and techniques missing from IPEM Report 81 [7], and is hence a good source of further reference.

The relationship between linear accelerator system QC and patient-specific IMRT QC is relevant in considering linear accelerator QC schedules. While the adoption of patient-specific testing may reduce the need for certain general system checks, it is preferable to improve the basic QC testing of linear accelerators to reduce the demands of individual plan delivery checks. The majority of radiotherapy centres in the UK aim to increase the percentage of treatments that utilise IMRT techniques. A prerequisite for significant increase will be the reduction of patient-specific QC testing. The adoption of dynamic MLC checking into routine system QC, the adoption of robust independent software checks on treatment plans, the use of linear accelerator-generated MLC performance data, and the use of electronic portal imaging devices and in vivo dosimetry may all reduce the total QC time required for linear accelerators, at the same time as improving adoption of modern treatment techniques. QC schedules must be continually reviewed with the introduction of these additional verification methods.

Conclusion

A benchmark data set of linear accelerator QC testing frequency and tolerance values has been presented that is representative of practice across the UK, and shows a high level of consistency between centres. There is relatively good agreement between current practice and recommendations from IPEM Report 81 [7], published over a decade ago; however, this may be more indicative of a lack of local review and modernisation of QC practices rather than a contemporary endorsement of the report. The document is still the most frequently cited as a reference source for QC guidance, but does not include the most modern radiotherapy technology and techniques.

A modernisation of approaches to QC is required to ensure continued safety and the highest quality of radiotherapy into the future, as technology and procedures continue to increase in complexity alongside increasing workforce pressures. QC testing schedules must be designed intelligently, including risk-based assessments of need, rather than perpetually increasing the number of QC tests to be performed. It is essential that time and resources are sufficient to investigate the unusual tests, rather than simply to carry out the usual tests. The inclusion of QC testing as a key performance indicator within radiotherapy physics is advocated.

The contents of this report must not be interpreted as professional advice as to the requirements of a linear accelerator QC schedule and are presented as a benchmark only for comparison. Local decisions on QC testing must be made based on full risk assessment, further analysis and local factors.

Acknowledgments

The authors wish to thank and acknowledge the contribution of all centres that provided information to this study. We also thank Stuart Williams and Colin Lee for their assistance in data analysis, and gratefully acknowledge Matthew Hayman, of Innovation with Substance, for implementation of the online questionnaire.

References

1. Thomadsen B. Critique of traditional quality assurance paradigm. Int J Radiat Oncol Biol Phys 2008;71:S166–9. [PubMed]
2. Kapanen M, Bly R, Sipila P, Jarvinen H, Tenhunen M. How can a cost/benefit ratio be optimized for an output measurement program of external photon radiotherapy beams? Phys Med Biol 2011;56:2119–30. [PubMed]
3. McKenzie A, Briggs G, Buchanan R, Harvey L, Iles A, Kirby M, et al. Balancing costs and benefits of checking in radiotherapy. Report no. 92. York, UK: IPEM; 2006.
4. Blache L, Robbins P, Brown S, Jones P, Liu T, LeFever J. Risk management and its application to medical device management. Report no. 95. York, UK: IPEM; 2008.
5. Ishikura S. Quality assurance of radiotherapy in cancer treatment: toward improvement of patient safety and quality of care. Jpn J Clin Oncol 2008;38:723–9. [PubMed]
6. Department of Health. Quality, innovation, productivity and prevention (QIPP). London, UK: HMSO; 2011. Available from: www.dh.gov.uk/en/Healthcare/Qualityandproductivity/QIPP/index.htm.
7. Mayles WPM, Lake R, McKenzie A, Macaulay EM, Morgan HM, Jordan TJ, et al. Physics aspects of quality control in radiotherapy. Report no. 81. York, UK: IPEM; 1999.
8. Klein EE, Hanley J, Yin F-F, Simon W, Dresser S, Serago C, et al. Quality assurance of medical accelerators. AAPM Task Group 142 report. Med Phys 2009;36:4197–212. [PubMed]
9. Fan J, Price RA. Conventional linear accelerators. In: Pawlicki T, Dunscombe PB, Mundt AJ, Scalliet P, editors. Quality and safety in radiotherapy. Abingdon, UK: Taylor & Francis; 2011.
10. EU Guidance RP162. Radiation criteria for acceptability of medical radiological equipment used in diagnostic radiology, nuclear medicine and radiotherapy. European Commission Contract No. ENER/10/NUCL/SI2.581655. (Draft document not available for citation)
11. Williamson JF, Dunscombe PB, Sharpe MB, Thomadsen BR, Purdy JA, Deye JA. Quality assurance needs for modern image-based radiotherapy: recommendations from 2007 interorganizational symposium on “quality assurance of radiation therapy: challenges of advanced technology”. Int J Radiat Oncol Biol Phys 2008;71:S2–12. [PubMed]
12. IPEM Radiotherapy Physics Topic Group. Survey of quality control practice in UK hospitals. Scope 1992;1:49–61. In: Mayles WPM, Lake R, McKenzie A, Macaulay EM, Morgan HM, Jordan TJ, Powley SK, editors. Physics aspects of quality control in radiotherapy. Report no. 81. York, UK: IPEM; 1999.
13. Venables K, Winefield E, Deighton A, Aird E, Hoskin P. A survey of radiotherapy quality control practice in the United Kingdom for the START trial. Radiother Oncol 2001;60:311–18. [PubMed]
14. NHS Institute for Innovation and Improvement. Quality and service improvement tools: plan, do, study, act (PDSA). Coventry, UK: NHS Institute; 2008. Available from: www.institute.nhs.uk/quality_and_service_improvement_tools/quality_and_service_improvement_tools/plan_do_study_act.html.
15. Aletti P, Bey P. Recommendations for a quality assurance programme in external beam radiotherapy. ESTRO booklet no. 2. Brussels, Belgium: ESTRO; 1995. Available from: www.estro-education.org/publications/Documents/Booklet_n2.pdf.
16. Kutcher GJ, Coia L, Gillin M, Hanson WF, Leibel S, Morton RJ, et al. Comprehensive QA for radiation oncology. AAPM Task Group 40 report. Med Phys 1994;21. Available from: http://www.aapm.org/pubs/reports/rpt_46.pdf. [PubMed]
17. Hiles P, Mackenzie A, Scally A, Wall B. Recommended standards for the routine performance testing of diagnostic x-ray imaging systems. Report no. 91. York, UK: IPEM; 2005.
18. Kirby M, Carpenter D, Lawrence G, Poynter A, Studdart P. Guidance for commissioning and QA of a networked radiotherapy department. Report no. 93. York, UK: IPEM; 2006.
19. Kirby M, Ryde S, Hall C. Acceptance testing and commissioning of linear accelerators. Report no. 94. York, UK: IPEM; 2007.
20. James H, Beavis A, Budgell G, Clark C, Convery D, Mott J, et al. Guidance for the clinical implementation of intensity modulated radiation therapy. Report no. 96. York, UK: IPEM; 2008.
21. Aspradakis MM, editor. Small field MV photon dosimetry. Report no. 103. York, UK: IPEM; 2010.
