We conducted this study to characterize variation in adherence to RAND-proposed EBRT quality measures for the treatment of localized prostate cancer using the linked SEER-Medicare database. Analyzing administrative claims data from 11,674 elderly men diagnosed between 2000 and 2002, we observed high adherence to each of the three EBRT technical measures. However, this high overall adherence masked variation by geography, socioeconomic status, and radiotherapy facility teaching affiliation.
Our findings are consistent with and extend previous studies describing cancer care quality. Results of the National Initiative for Cancer Care Quality documented overall high adherence to evidence-based quality measures among patients with colorectal (86% adherence) and breast cancer (78% adherence), but noted variability in adherence across metropolitan areas.7
Other studies of breast cancer care quality have found that geography and socioeconomic status are associated with adherence to quality measures and with variation in treatment.22, 23
While previous work has noted the importance of geographic and socioeconomic factors in initial therapy selection for men with prostate cancer,24, 25 our work highlights these factors as potentially contributing to variation in the quality of radiotherapy treatment delivery. Lastly, our work extends findings from patterns of care surveys that have documented increasing use of conformal radiotherapy for the treatment of localized prostate cancer.26
The SEER-Medicare data lack sufficient detail to determine the reasons underlying the observed variation in adherence. The principal advantage of SEER-Medicare data is that they provide, in broad brush strokes, an overview of radiation oncology treatment delivery to older Americans. Follow-up studies using more detailed data sources may drill down to determine the reasons for the observed variation. The variation we observed has several plausible explanations. Patients may receive care in settings that differ in their capacity to deliver high-quality care.27
This may relate to the geographic distribution of facilities with advanced technologies and patients’ ability to access them.28
Because all patients in our cohort had Medicare insurance, financial access barriers are a less likely source of variation. Nonetheless, financial hurdles above and beyond insurance (such as co-payments, transportation costs, or time spent away from work) may present challenges for some patients.
We observed widely varying adherence to the follow-up measure. The RAND report specifies that at least 2 follow-up visits be completed by the treating physician in the first post-treatment year.10, 11
However, while only one-third of patients consistently followed up with their radiation oncologists, 80% followed up with either their radiation oncologists or urologists, suggesting that urologists play an active role in caring for patients who receive radiotherapy. We also found that two-thirds of patients had at least one follow-up visit with their radiation oncologists in the year following therapy. Perhaps, following radiotherapy, radiation oncologists triage patients back to referring urologists. It is also possible that patients prefer to maintain care relationships with both their referring urologists and treating radiation oncologists. In addition, urologists and treating radiation oncologists may take a cooperative approach to follow-up visits that incorporates both specialties.
Multidisciplinary care is important and likely beneficial to men after radiotherapy for prostate cancer. Yet, patients may also gain from maintaining follow-up with their radiation oncologists, who have particular training in evaluating and managing the potential long-term toxicities of radiotherapy. Whether care quality or outcomes differ among patients followed by referring urologists compared to treating radiation oncologists is uncertain and cannot be ascertained from our data. It is clear that observed practice patterns deviate from the RAND metric specifying two follow-up visits by treating radiation oncologists. As a result, greater awareness of follow-up in the radiation oncology community may be required.29
Furthermore, in view of the prevalence of “shared care” models between urologists and radiation oncologists, the quality metric itself merits clarification.
This work augments the emerging literature examining the RAND-proposed quality measures for localized prostate cancer.12, 30
One study assessed RAND measures for 168 men at a single academic institution and found generally high adherence.12
However, the investigators were unable to measure adherence to the technical measures of radiotherapy despite extensive review of electronic data and medical charts. Often, radiotherapy documentation is stored within outpatient radiotherapy departments or stand-alone facilities, separate from inpatient hospital documentation and not consistently included in patient charts. We found that important, specific technical measures of radiotherapy quality can be assessed for large numbers of patients using administrative claims data. Nonetheless, challenges in accessing radiotherapy documentation could be alleviated by developing standardized treatment summaries that include appropriate quality metrics and become part of patients’ permanent medical records.
Recently, Miller et al assessed adherence to RAND radiotherapy quality metrics among a sample of 1,385 men diagnosed with localized prostate cancer between 2000 and 2001 who received EBRT.31
The great majority (93%) of the cohort was 60 years or older. Using explicit chart review to assess quality measure adherence, the investigators found similarly high adherence to measures of computed tomography planning (88% adherence), high-energy photons (82%), and board certification (94%), compared to 85%, 75% and 85%, respectively, observed in the current claims-based analysis. However, they found lower adherence to immobilization (66%) and higher adherence to follow-up visits (66%) than we observed (97% and 34% in the current study, respectively). While the similarities lend external validity to our findings, the differences raise questions about the extent to which quality measure performance is accurately reflected in medical chart documentation versus Medicare claims.32
Differences in claims reimbursement among quality measures may be one reason for the discordance between the findings of Miller et al and the current study. Claims for conformal radiotherapy and beam energy represent the largest portion of EBRT reimbursement.19
Chart documentation and Medicare claims are likely to be similar for these measures, as omitting chart documentation or claims for these highly reimbursed processes would be akin to not noting or billing for surgeries. There may be less financial incentive to report claims for follow-up visits, leading to differences between what is noted in patient charts and what is reported in claims. Nonetheless, the high proportion of patients (91%) who had at least 1 visit with a radiation oncologist or urologist suggests that unbilled visits are an unlikely explanation for the lower follow-up adherence observed in our study. Furthermore, we observed higher adherence to the immobilization measure in claims data than in chart data, despite its lower reimbursement, suggesting that immobilization may have been provided but not documented in the chart. The definition of what constitutes follow-up (and with whom), as well as the documentation of immobilization, may thus differ between claims data and chart data. These differences highlight the need for collaborative, complementary methods of care quality assessment, combining the broad-based view of administrative databases with the nuanced, detail-rich information offered by direct medical chart abstraction.33
The use of administrative claims data for quality assessment has both advantages and limitations. While claims data can be efficiently analyzed to monitor population-based adherence to quality measures, such data are not collected for research purposes, can lack important, clinically relevant information, and may under-report treatment or processes of care that may be critical to quality ascertainment.34-37
Our research did not examine a comprehensive set of quality measures for EBRT. For example, we were not able to assess radiotherapy dose, a RAND-proposed quality measure, because it is not captured in SEER data or Medicare claims. Furthermore, while we were able to assess claims for conformal radiotherapy, we could not measure the quality of conformal planning itself (e.g., protection of the rectal mucosa, another RAND-proposed measure). Another limitation is that measures of socioeconomic status were captured at the census-tract rather than the individual level. While previous studies have validated the use of proxy census indicators for individual socioeconomic status in health services research, such estimates may not provide precise measures of individual socioeconomic status.38
Lastly, our conclusions may reflect variation in coding practices rather than variation in care quality. Although measure adherence differed among SEER registries, this may reflect systematically less detailed claims reporting in certain geographic areas rather than true differences in care quality.35
In a population-based cohort of men with localized prostate cancer, we found that high adherence to care quality measures proposed by a RAND expert panel masked substantial variation in care quality by geography, socioeconomic status, and radiotherapy facility teaching affiliation. Future research should examine reasons for variation in these measures and whether variation is associated with important clinical outcomes.