In our systematic review of state-level hospital quality public reporting programs, focused specifically on clinical outcomes for adult patients, we found that just half of states engaged in public reporting of hospital quality. These programs varied in content: only a third provided data on hospital processes of care, whereas nearly three-quarters provided data on hospital outcomes, including hospital-acquired infection, readmission, and mortality rates. The programs also varied in clinical focus, with many reporting on care for cardiac surgery patients and for adults hospitalized for AMI, heart failure, and pneumonia, whereas others variably reported on care for less common conditions and procedures, including gastrointestinal hemorrhage, carotid endarterectomy, and craniotomy. Finally, identifying information about and from state programs was challenging, making it unclear how useful the information currently is to patients and communities.
As public reporting of quality and performance has become increasingly common, and as national public reporting efforts by CMS are expected to expand, it is notable that the vast majority of state public reporting programs provided hospital quality information that was complementary to, rather than redundant with, the information currently publicly reported by CMS. Although states providing data on hospital processes of care all focused on the same clinical conditions currently reported by CMS, specifically care processes during hospitalizations for AMI, heart failure, and pneumonia, their reporting was not limited to Medicare fee-for-service beneficiaries aged 65 years or older and included younger adults as well as older adults insured through private plans and Medicare-affiliated health maintenance organizations. Similarly, whereas states providing data on hospital outcomes of care for the most part reported on AMI, heart failure, and pneumonia mortality, as CMS does currently, states predominantly reported in-patient mortality estimated using the AHRQ IQI methodology rather than 30-day mortality, in addition to reporting on a variety of other causes of hospitalization. In this instance, however, CMS’s adherence to a standardized period for outcome assessment (i.e., 30-day mortality and readmission rates) is preferable to reporting on in-patient mortality, which has been shown to bias performance estimates in favor of hospitals with shorter lengths of stay (20).
Given the number and breadth of the state public reporting programs we identified, policy makers should consider three initiatives that may further improve and facilitate the availability of hospital quality information. First, state reporting efforts could improve accessibility by using a single, easily navigable Internet site that includes information from each state’s public reporting program. Ideally, this site would integrate, or at least include, information from the CMS public reporting program. Because information from state reporting programs was difficult to find, a single site for all state public reporting programs would facilitate access to this information. However, such a site would face bureaucratic challenges in negotiating responsibilities for site coordination, production and development, and payment. Second, given these likely challenges, state public reporting program administrators should increase efforts to meet, either in person or remotely, to share the successes and failures of their programs. There were many similarities across state efforts, particularly in clinical focus, suggesting that states have much to learn from one another’s experiences. Finally, the state public reporting programs require rigorous, systematic evaluation to ensure that the information being made available is valid, comprehensive, comprehensible, relevant, reasonable, and functional, and that it is being used by patients, physicians, or hospital administrators to inform healthcare decisions (5). Furthermore, the impact of state public reporting programs on clinical outcomes should also be assessed. The measurement and reporting of quality information is a public good that promotes transparency and accountability, but programs are not without cost and should be evaluated to ensure they are of sufficient value to the community.
Our study has several limitations. First, we conducted our survey during the summer of 2009, but state public reporting initiatives are changing rapidly. For instance, we are aware of several states that implemented changes to their programs during 2010, including the launch of expanded programs in Maryland and Ohio, the inclusion of hospital-acquired infection rates in Oregon, and the reporting of the new surgical care improvement score, focused on antibiotic and venous thromboembolism prophylaxis before and after surgery, in states such as New Jersey. Second, we focused on programs designed by state governments or affiliated agencies to publicly report hospital quality. We identified one other program not affiliated with a state government, in Wisconsin, that measured and reported similar information on hospital quality. It is possible that there were additional non-government state programs that we did not identify, as well as programs sponsored by insurance plans or other organizations that we did not capture in our review. In addition, despite using a systematic approach to identify state programs, given the challenges we encountered in finding information on state public reporting programs, the possibility remains that there were state public reporting programs we did not find. Third, we focused on programs measuring clinical aspects of hospital quality for adult patients. Several other states also reported non-clinical aspects of hospital quality, such as costs, volume, and length of stay. In addition, our review captures neither programs measuring clinical aspects of hospital quality for pediatric patients nor programs measuring clinical aspects of ambulatory care quality.
In conclusion, our systematic review of state public reporting programs focused on hospital clinical outcomes found that identifying information about state programs was challenging. Nevertheless, although just half of states engaged in public reporting of hospital quality, state public reporting programs provided hospital quality information that was complementary to, rather than redundant with, the information currently publicly reported by CMS. There were, however, clear state-to-state differences in investment in public reporting and no standardized approach to data collection, analysis, and presentation. Future research should focus on what public reporting efforts have achieved.