BMJ. 2007 November 10; 335(7627): 971–973.
PMCID: PMC2072029
Measuring Quality through Performance

Making performance indicators work: experiences of US Veterans Health Administration

Eve A Kerr, associate director1 and Barbara Fleming, chief officer, quality and performance2

Eve Kerr and Barbara Fleming explain how measuring performance helped transform a failing healthcare system

Many healthcare organisations are having to confront the challenge of how to provide high quality care within a fixed (or sometimes shrinking) budget.1 The Veterans Health Administration, which provides care for over 5 million veterans within the largest integrated healthcare system in the United States, faced this problem in the early 1990s, when it was struggling to overcome a reputation for providing inferior and inefficient health care. In 1995 it began a programme to simultaneously improve the organisation and quality of its care, with performance monitoring having a key role.2 3 Within 10 years, it was lauded as providing the best care in the US.4

The turnaround shows the value of monitoring performance and providing appropriate incentives to improve care. We explain how the organisation brought about the changes and look at some of the remaining challenges.

Foundation for change

The administration made several organisational changes as a foundation for the quality improvements.3 5 Firstly, it reorganised care into regional networks (veterans integrated service networks), which were provided with fixed resources and held accountable for managing all care within their facilities. Secondly, it shifted care to ambulatory settings, opening new outpatient clinics and closing many inpatient beds. Thirdly, the capacity of the administration's automated information system was improved to allow providers to access and enter all patient information within a unified electronic medical record, thus enhancing coordination of care.6

A cornerstone of the efforts to transform care was the systematic use of data driven measures to monitor performance across several domains, including technical quality of care, access, functional status, and patient satisfaction.3 Many of the measures paralleled those developed by other US quality assessment organisations, but the administration also included measures to assess care of particular relevance to veterans.

Initially, assessment focused primarily on process measures concerning outpatient management of chronic conditions (control of diabetes, use of inhalers for obstructive lung disease, diet and exercise counselling for hypertension and obesity, drug management and cholesterol testing after myocardial infarction) and preventive care (immunisations; screening for breast, colon, cervical, and prostate cancer; and counselling on alcohol and tobacco use). Currently, the administration assesses over 50 measures covering acute and chronic conditions as well as palliative and preventive care (box). An external contractor collects data quarterly by auditing the electronic medical records for a sample of veterans from all the administration's facilities. It also surveys a sample of patients at each facility about their healthcare experiences, satisfaction, and health status.
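The quarterly audit lends itself to a simple statistical sketch. The example below is illustrative only, not the administration's actual methodology: it estimates a facility's pass rate on one measure from an audited sample of charts, with a normal-approximation 95% confidence interval. The sample figures are invented.

```python
# Illustrative sketch (not the administration's actual methodology):
# estimating a facility's quarterly performance rate from an audited
# sample of medical records, with a 95% confidence interval.
import math

def audited_rate(passes, sample_size, z=1.96):
    """Point estimate and 95% CI for the pass rate in the sampled charts."""
    p = passes / sample_size
    se = math.sqrt(p * (1 - p) / sample_size)   # standard error of a proportion
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical quarter: 84 of 100 audited charts met the measure
rate, lo, hi = audited_rate(passes=84, sample_size=100)
print(f"{rate:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # 0.84 (95% CI 0.77 to 0.91)
```

Sampling a few hundred charts per facility per quarter keeps audit costs manageable while giving estimates precise enough to compare facilities and track trends.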

Veterans Health Administration areas of performance measurement, 1997-2006*

Chronic and acute care

  • Diabetes
  • Acute myocardial infarction
  • Obstructive lung disease
  • Obesity
  • Hypertension
  • Pain assessment
  • Major depression
  • Smoking cessation
  • Community acquired pneumonia
  • Acute coronary syndrome
  • Substance use disorders
  • Heart failure

Preventive care

  • Influenza vaccination
  • Pneumococcal vaccination
  • Prostate cancer education/screening
  • Mammography
  • Cervical cancer screening
  • Colorectal cancer screening
  • Hyperlipidaemia screening
  • Alcohol screening
  • Tobacco screening

*Some areas were not covered in all years, and measures within areas also varied by year

In addition to monitoring quality, the administration instituted mechanisms to make it more likely that performance monitoring would drive quality improvement. Each regional director was held accountable through a performance contract, which included incentives equivalent to roughly 10% of the director's salary, for meeting specified quality standards. The director, in turn, held managers and clinicians accountable for the performance standards, and the performance results of each regional network and facility were widely available within the administration. Consequently, regional networks began to compete with each other on performance, and facilities within each network did the same.

Although implementation of quality improvement initiatives was ultimately in the hands of individual networks and facilities, there were also centrally led efforts. The administration drew on researchers from the Department of Veterans Affairs' health services research and development service and from nine disease specific quality enhancement research initiatives to systematically identify quality gaps and to develop and assess interventions to close them.7 8

Response to measurement

What is the evidence that quality improved in the areas monitored? Figure 1 shows changes in three representative measures monitored from 1997 to 2006. The rate of β blocker administration after myocardial infarction rose from 83% in 1997 to 93% in 2006. Similarly, annual testing for glycaemic control rose from 85% in 1997 to 96% in 2007. Jha and colleagues showed that quality improved significantly from 1997 to 2000 on other measures. For example, influenza vaccination rates rose from 61% in 1997 to 78% in 2000, pneumococcal vaccination rose from 60% to 81%, and aspirin administration after myocardial infarction rose from 92% to 98%.9 They also showed that the absolute level of quality of care for veterans was higher than for patients covered by Medicare.9 Similarly, we showed that quality of diabetes care (which had been included in performance monitoring since 1997) was higher in the administration in 2000-2001 than in geographically matched commercial managed care plans for almost every aspect studied, including timely eye screening, testing of glucose and lipid concentrations, and glucose and lipid control.10

Fig 1 Changes in performance of Veterans Health Administration facilities on three quality measures, 1997-2007

The improvements in care occurred mainly in conditions that were being monitored, as shown by a study comparing the quality of care for patients in the administration with that of a sample of people from 12 major US communities between 1997 and 2000.11 This study used a global quality assessment tool comprising over 300 measures across 26 diseases, including 26 measures targeted by the administration's performance monitoring. Although overall quality of care was higher for veterans than in the community, the advantage was greatest for the measures that the administration was using to monitor quality (such as retinal screening for people with diabetes) and spilled over beyond the targeted measures to the conditions covered by performance monitoring (such as diabetes). However, for conditions not part of the performance monitoring system, veterans had no advantage (fig 2).11

Fig 2 Comparison of quality of care for veterans and national sample on performance measures monitored by Veterans Health Administration, measures related to monitored conditions, and measures unrelated to monitored conditions

Although this was a retrospective observational study, these results, taken together with the evidence of improvement over time, suggest that performance improves on the measures being monitored, that improvement may extend beyond individual measures to the conditions being monitored, but that areas outside the monitoring system are less likely to improve. Additionally, national surveys have shown that patient satisfaction in both inpatient and outpatient settings is higher for veterans than in other surveyed settings.12

Management challenges

One of the most important decisions in facilitating change was to invest heavily in auditing electronic medical records. This enabled the administration to collect detailed clinical data that are unavailable in administrative databases and helped to ensure that measures were clinically meaningful and evidence based. Instead of limiting measures to those that can be constructed with administrative utilisation data,13 the administration has used clinical data to construct measures that incorporate exceptions (such as contraindications) and measure processes strongly linked to outcomes.14
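The logic of such a clinically detailed measure can be sketched in code. The sketch below is purely illustrative (the field names and the contraindication example are hypothetical, not the administration's actual specification): a patient enters the denominator only if eligible and without a documented exception, mirroring how clinical data allow contraindicated patients to be excluded where administrative data cannot.

```python
# Illustrative sketch: scoring a process measure (e.g. beta blocker after
# myocardial infarction) with clinical exception handling. Field names and
# the contraindication example are hypothetical, not the VHA's actual spec.

def score_measure(records):
    """Return the pass rate among eligible patients without exceptions."""
    numerator = denominator = 0
    for r in records:
        if not r.get("had_mi"):           # eligibility: prior myocardial infarction
            continue
        if r.get("contraindication"):     # clinical exception: exclude from denominator
            continue
        denominator += 1
        if r.get("on_beta_blocker"):      # process of care delivered
            numerator += 1
    return numerator / denominator if denominator else None

sample = [
    {"had_mi": True,  "on_beta_blocker": True},
    {"had_mi": True,  "on_beta_blocker": False, "contraindication": "asthma"},
    {"had_mi": True,  "on_beta_blocker": False},
    {"had_mi": False},
]
print(score_measure(sample))  # 1 of 2 eligible, non-excepted patients -> 0.5
```

A measure built only from administrative utilisation data could not apply the exception step, so the contraindicated patient would count as a failure and the facility would be penalised for clinically appropriate care.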

Although quality has improved in targeted clinical areas, the administration faces new challenges in improving care for all conditions. It is expanding measures for acute and hospital care, as well as for conditions faced by young veterans returning from conflict. Such expansion risks measurement overload, the point at which measurement ceases to improve performance. To minimise this risk, the administration removes measures on which performance is consistently high from the regional directors' contracts, although it continues to monitor them so that they can be returned to the contracts if performance drops.

The administration is also looking at other ways to stimulate quality improvement through performance monitoring. It is currently implementing pay for performance initiatives to reward providers for higher performance, but further research is needed to identify whether rewarding individuals or teams is more effective, the types of quality measures that best motivate true quality improvement,14 15 and the levels of incentives necessary to further stimulate change in provider behaviour.

The administration's experience has shown the valuable role that well constructed, clinically detailed performance measures can play in improving quality of care, even without large monetary incentives for individual doctors. Nevertheless, monitoring can produce unintended consequences such as patient deselection,16 17 18 overtreatment of patients unlikely to benefit from an intervention,14 19 20 21 22 and neglect of areas not covered in performance monitoring.11 Like other large healthcare organisations seeking to improve the quality of their care, the administration now needs to find ways to measure and ensure quality across the continuum of care and guard against unintended consequences of measurement.

Summary points

  • Care provided by the US Veterans Health Administration has greatly improved over the past 10 years
  • Key to the transformation was use of clinically based measures to monitor performance in targeted areas
  • Competition between regions and financial incentives to regional directors helped drive change
  • Challenges remain to improve care in unmeasured areas

Notes

This is the second article in a series looking at use of performance indicators in the UK and elsewhere.

This series is edited by Azeem Majeed, professor of primary care, Imperial College London (a.majeed@imperial.ac.uk) and Helen Lester, professor of primary care, University of Manchester (helen.lester@manchester.ac.uk).

Contributors and sources: EAK is associate professor of internal medicine, University of Michigan Medical School. She has studied and reported widely on quality of care in the US and on methods to improve quality assessment and performance in the Veterans Health Administration. This article arose from the first author's research experience and review of the literature, and from the second author's experience leading quality improvement efforts in the Veterans Health Administration.

Competing interests: EAK has received research funding from the US Department of Veterans Affairs. The opinions presented here do not necessarily represent those of the Department of Veterans Affairs or the University of Michigan.

Funding: EAK's time for preparing this article was supported, in part, by the VA Quality Enhancement Research Initiative for Diabetes Mellitus (DIB #98-001) and by the Michigan Diabetes Research and Training Center Grant P60DK-20572 from the NIDDK of the National Institutes of Health.

Provenance and peer review: Commissioned; externally peer reviewed.

References

1. Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press, 2001.
2. Kizer KW, Demakis JG, Feussner JR. Reinventing VA health care: systematizing quality improvement and quality innovation. Med Care 2000;38:I7-16.
3. Perlin JB, Kolodner RM, Roswell RH. The Veterans Health Administration: quality, value, accountability, and information as transforming strategies for patient-centered care. Am J Manag Care 2004;10:828-36.
4. Longman P. The best care anywhere. Washington Monthly 2005;37(1-2):38.
5. Oliver A. The Veterans Health Administration: an American success story? Milbank Q 2007;85:5-35.
6. Kupersmith J, Francis J, Kerr E, Krein SL, Pogach L, Kolodner RM, et al. Advancing evidence-based care in diabetes through health information technology: lessons from the Veterans Health Administration. Health Aff 2007;26:w156-68.
7. Demakis J, McQueen L, Kizer K, Feussner J. Quality enhancement research initiative (QUERI): a collaboration between research and clinical practice. Med Care 2000;38(VA QUERI suppl):I17-25.
8. Lomas J. Health services research. BMJ 2003;327:1301-2.
9. Jha AK, Perlin JB, Kizer KW, Dudley RA. Effect of the transformation of the Veterans Affairs health care system on the quality of care. N Engl J Med 2003;348:2218-27.
10. Kerr EA, Gerzoff RB, Krein SL, Selby JV, Piette JD, Curb JD, et al. Diabetes care quality in the Veterans Affairs health care system and commercial managed care: the TRIAD study. Ann Intern Med 2004;141:272-81.
11. Asch SM, McGlynn EA, Hogan MM, Hayward RA, Shekelle P, Rubenstein L, et al. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Ann Intern Med 2004;141:938-45.
12. US Department of Veterans Affairs. Veterans' health care outscores private sector—again. Press release, 18 Jan 2006. www1.va.gov/opa/pressrel/pressrelease.cfm?id=1069
13. MacLean CH, Louie R, Shekelle PG, Roth CP, Saliba D, Higashi T, et al. Comparison of administrative data and medical records to measure the quality of medical care provided to vulnerable older patients. Med Care 2006;44:141-8.
14. Kerr EA, Krein SL, Vijan S, Hofer TP, Hayward RA. Avoiding pitfalls in chronic disease quality measurement: a case for the next generation of technical quality measures. Am J Manag Care 2001;7:1033-43.
15. Kerr EA, Smith DM, Hogan MH, Hofer TP, Krein SL, Hayward RA. Building a better quality measure: are some patients with poor intermediate outcomes really getting good quality care? Med Care 2003;41:1173-82.
16. Casalino LP, Elster A. Will pay-for-performance and quality reporting affect health care disparities? Health Aff 2007;26:w405-14.
17. Hofer TP, Hayward RA, Greenfield S, Wagner EH, Kaplan SH, Manning WG. The unreliability of individual physician "report cards" for assessing the costs and quality of care of a chronic disease. JAMA 1999;281:2098-105.
18. Werner RM, Asch DA, Polsky D. Racial profiling: the unintended consequences of coronary artery bypass graft report cards. Circulation 2005;111:1257-63.
19. Pogach L, Engelgau M, Aron D. Measuring progress toward achieving hemoglobin A1c goals in diabetes care: pass/fail or partial credit. JAMA 2007;297:520-3.
20. Boyd CM, Darer J, Boult C, Fried LP, Boult L, Wu AW. Clinical practice guidelines and quality of care for older patients with multiple comorbid diseases: implications for pay for performance. JAMA 2005;294:716-24.
21. Hayward RA, Kent DM, Vijan S, Hofer TP. Reporting clinical trial results to inform providers, payers, and consumers. Health Aff 2005;24:1571-8.
22. Hayward RA. Performance measurement in search of a path. N Engl J Med 2007;356:951-3.
