Med Care. Author manuscript; available in PMC Jul 25, 2013.
Published in final edited form as:
PMCID: PMC3723387
NIHMSID: NIHMS494087
Process of Care Performance Measures and Long-Term Outcomes in Patients Hospitalized With Heart Failure
Mark E. Patterson, PhD, MPH,*§ Adrian F. Hernandez, MD, MHS,* Bradley G. Hammill, MS,* Gregg C. Fonarow, MD,|| Eric D. Peterson, MD, MPH, Kevin A. Schulman, MD,* and Lesley H. Curtis, PhD*
*Center for Clinical and Genetic Economics, Duke University School of Medicine, Durham, North Carolina
Duke Clinical Research Institute, Duke University School of Medicine, Durham, North Carolina
Department of Medicine, Duke University School of Medicine, Durham, North Carolina
§RTI International, Research Triangle Park, North Carolina
||Ahmanson-UCLA Cardiomyopathy Center, Department of Medicine, UCLA Medical Center, Los Angeles, California.
Complete Author Information
Mark E. Patterson, PhD, MPH, Department of Health Care Quality, RTI International, 3040 Cornwallis Rd, PO Box 12194, Research Triangle Park, NC 27709; telephone: 919-316-3984; fax: 919-990-8454; mpatterson@rti.org.
Adrian F. Hernandez, MD, MHS, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-7515; adrian.hernandez@duke.edu.
Bradley G. Hammill, MS, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-8101; fax: 919-668-7124; brad.hammill@duke.edu.
Gregg C. Fonarow, MD, Ahmanson-UCLA Cardiomyopathy Center, UCLA Medical Center, 10833 LeConte Ave, Room BH-307 CHS, Los Angeles, CA 90095-1679; telephone: 310-206-9112; fax: 310-206-9111; gfonarow@mednet.ucla.edu.
Eric D. Peterson, MD, MPH, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-8830; fax: 919-668-7061; peter016@mc.duke.edu.
Kevin A. Schulman, MD, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-8101; fax: 919-668-7124; kevin.schulman@duke.edu.
Lesley H. Curtis, PhD, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-8101; fax: 919-668-7124; lesley.curtis@duke.edu.
Address correspondence to: Lesley H. Curtis, PhD, Center for Clinical and Genetic Economics, Duke Clinical Research Institute, PO Box 17969, Durham, NC 27715; telephone: 919-668-8101; fax: 919-668-7124; lesley.curtis@duke.edu.
Background
Recent efforts to improve care for patients hospitalized with heart failure have focused on process-based performance measures. Data supporting the link between current process measures and patient outcomes are sparse.
Objective
To examine the relationship between adherence to hospital-level process measures and long-term patient-level mortality and readmission.
Research Design
Analysis of data from a national clinical registry linked to outcome data from the Centers for Medicare & Medicaid Services (CMS).
Subjects
22750 Medicare fee-for-service beneficiaries enrolled in the Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients with Heart Failure (OPTIMIZE-HF) between March 2003 and December 2004.
Measures
Mortality at 1 year; cardiovascular readmission at 1 year; and adherence to hospital-level process measures, including discharge instructions, assessment of left ventricular function, prescription of angiotensin-converting enzyme inhibitor or angiotensin receptor blocker at discharge, prescription of beta-blockers at discharge, and smoking cessation counseling for eligible patients.
Results
Hospital conformity rates ranged from 52% to 86% across the 5 process measures. Unadjusted overall 1-year mortality and cardiovascular readmission rates were 33% and 40%, respectively. In covariate-adjusted analyses, the CMS composite score was not associated with 1-year mortality (hazard ratio, 1.00; 95% confidence interval, 0.98-1.03; P = .91) or readmission (hazard ratio, 1.01; 95% confidence interval, 0.99-1.04; P = .37). None of the current CMS process measures was independently associated with mortality; however, prescription of beta-blockers at discharge, a measure not endorsed by CMS, was independently associated with lower mortality (hazard ratio, 0.94; 95% confidence interval, 0.90-0.98; P = .004).
Conclusion
Hospital process performance for heart failure as judged by current CMS measures is not associated with patient outcomes within 1 year of discharge, calling into question whether existing CMS metrics can accurately discriminate hospital quality of care for heart failure.
Keywords: heart failure, mortality, outcome and process assessment (health care), patient readmission
Substantial variation exists in the provision of evidence-based, guideline-recommended care to patients hospitalized for heart failure in the United States.1 Recent efforts to improve the quality of care for these patients have focused on process-based performance measures. The Centers for Medicare & Medicaid Services (CMS) and the Joint Commission have designated 4 such process measures, and the American Heart Association (AHA) and the American College of Cardiology (ACC) have designated 5 discharge measures for heart failure (the 4 CMS measures plus anticoagulation for atrial fibrillation).2 Medicare and other payers use such measures in pay-for-performance programs and report the measures publicly on the Hospital Compare Web site to help patients select high-quality providers. Central to these programs is the implicit assumption that conformance with process measures improves patient outcomes. However, data supporting the process–outcome link are sparse.
Previous studies have examined associations between hospital-level performance and hospital-level outcomes3-5 and associations between patient-level adherence to process measures and patient-level outcomes.5 Hospital-level analyses have found no association between hospital-level adherence and 30-day mortality.3 Patient-level analyses suggest that adherence to certain process measures is strongly associated with 60-day to 90-day postdischarge outcomes and that adherence to other process measures is not.5 These types of analyses do not address an important question from the patient's perspective: Are hospital-level performance measures important indicators of long-term patient outcomes? That is, is receiving care at a hospital with better conformity to recommended processes of care associated with better long-term outcomes for patients with heart failure?
Using data from the Organized Program to Initiate Treatment in Hospitalized Patients with Heart Failure (OPTIMIZE-HF) registry linked to Medicare claims data, we examined the relationship between adherence to hospital-level process measures and patient-level mortality and readmission in the first year after discharge.
Data Sources
Patients in this study were from the OPTIMIZE-HF registry, which has been described in detail previously.5-7 The registry was established to collect data regarding processes of care for patients hospitalized with heart failure. The 259 participating US hospitals enrolled 48612 patients from March 1, 2003, through December 31, 2004, and used a case ascertainment approach similar to that used by the Joint Commission.8 Patients were eligible for the registry if (a) they presented with symptoms of heart failure during a hospitalization for which heart failure was the primary discharge diagnosis or (b) the primary reason for admission was an episode of worsening heart failure. The International Classification of Diseases, Ninth Revision, Clinical Modification codes used as enrollment criteria for OPTIMIZE-HF and case finding were identical to those used by CMS. Patients from all geographic regions of the United States were included and a variety of institutions participated, from community hospitals to large tertiary centers. Each center's institutional review board or a central institutional review board approved the study protocol. Hospital staff used a Web-based case report form to record patient-level information, including demographic characteristics, comorbid conditions, vital signs, and drug therapy. Automatic electronic data checks prevented out-of-range entries and duplications. In addition, an audit of the database based on predetermined criteria verified data against source documents for a 5% random sample of the first 10000 patients.
For this study, we merged patient data from the OPTIMIZE-HF registry with Medicare Part A inpatient claims, matching by date of birth, sex, hospital, and admission and discharge dates.9 Of 36165 hospitalizations of patients aged 65 years or older, 29301 (80.8%) were matched to Medicare claims, representing 25901 distinct patients. We excluded 1218 patients who died before discharge, 1143 patients who were ineligible for any of the 4 process measures, and 790 patients in 88 hospitals with fewer than 25 eligible patients, a convention used in previous studies to improve the stability of process measure estimates.3 The final data set contained data on 22750 patients from 150 hospitals. In addition to the overall cohort, we created 4 separate cohorts of patients who were eligible for each of the 4 process measures of interest. These cohorts included only data from hospitals with at least 25 eligible patients for a given process measure.
Process Measures
We analyzed a total of 5 process measures. These included the 4 process measures endorsed by CMS, the Joint Commission, and the ACC/AHA: (a) discharge instructions that address diet, exercise, medications, and relevant follow-up care for patients discharged to home; (b) assessment of left ventricular function; (c) prescription of an angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) at discharge to eligible patients with left ventricular systolic dysfunction without contraindications; and (d) smoking cessation counseling for patients who had smoked within 1 year of admission. In addition, we analyzed prescription of beta-blockers at discharge to eligible patients with left ventricular systolic dysfunction without contraindications. Although not endorsed by CMS, this process measure has been shown to be associated with improvement in short-term outcomes.5
We constructed the performance measures by using the numerator and denominator definitions in the Joint Commission ORYX specifications; that is, we assessed use among eligible patients without documented contraindications, intolerance, or other physician documentation.8 Patients who died, were transferred to another acute care hospital, were discharged to hospice or a federal hospital, or left against medical advice were considered ineligible to receive any of the 5 processes of care.8 We summarized each process measure at the hospital level by dividing the number of patients for whom the process measure was documented by the number of patients eligible for the measure. In patient-level analyses, we applied hospital-level adherence rates uniformly to all patients within a given hospital; thus, the hospital-level rates can be considered continuous measures of hospital quality.
For each hospital, we constructed 2 overall scores. First, we constructed a composite score by dividing the total number of documented CMS-endorsed processes of care by the total number of opportunities to provide those processes of care, a score similar to that currently used in Medicare's Hospital Compare as a basis for pay-for-performance programs for the 4 CMS measures.8,11 For example, a patient who received 2 of 4 processes of care for which she was eligible would contribute 2 to the numerator of the composite score and 4 to the denominator. The composite score indicates how often patients in a given hospital received the processes of care for which they were eligible. Second, we constructed a “defect-free” score to indicate the proportion of patients in the hospital who received all of the CMS-endorsed processes of care for which they were eligible. In this case, the patient from the previous example would contribute 0 to the numerator and 1 to the denominator, because she did not receive all of the processes of care.12,13
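The arithmetic behind the two hospital-level summary scores can be illustrated with a small sketch. The data below are hypothetical, and Python is used purely for illustration (the study's analyses were performed in SAS):

```python
# Hypothetical per-patient records for one hospital: how many CMS-endorsed
# measures each patient was eligible for, and how many were documented.
patients = [
    {"eligible": 4, "received": 2},  # contributes 2/4 to composite, 0/1 to defect-free
    {"eligible": 3, "received": 3},  # contributes 3/3 to composite, 1/1 to defect-free
    {"eligible": 2, "received": 1},
]

# Composite score: documented processes / opportunities, pooled over patients.
composite = sum(p["received"] for p in patients) / sum(p["eligible"] for p in patients)

# Defect-free score: share of patients who received ALL measures they were eligible for.
defect_free = sum(p["received"] == p["eligible"] for p in patients) / len(patients)

print(round(composite, 3), round(defect_free, 3))  # 6/9 -> 0.667 and 1/3 -> 0.333
```

Note that a hospital can score well on the composite (most opportunities met) while scoring poorly on the defect-free measure (few patients received every indicated process).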
Outcomes
The main outcome measure was mortality within 1 year after hospital discharge. We also analyzed cardiovascular readmission within 1 year after discharge. We obtained dates of death from CMS data through December 31, 2006. We defined cardiovascular readmission as the first subsequent inpatient admission for a cardiovascular reason as identified in Medicare Part A claims and defined by diagnosis related group (DRG) codes 104-112, 115-118, 121-125, 127-145, 476, 514-518, 525-527, 535-536, and 547-558, excluding transfers or subsequent admissions for rehabilitation.
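The DRG-based definition of a cardiovascular readmission amounts to a range check. As a sketch (the function name is illustrative, not part of the study's code):

```python
# DRG code ranges from the text that define a cardiovascular readmission.
CV_DRG_RANGES = [
    (104, 112), (115, 118), (121, 125), (127, 145),
    (476, 476), (514, 518), (525, 527), (535, 536), (547, 558),
]

def is_cardiovascular_drg(drg: int) -> bool:
    """Return True if the DRG code falls in any cardiovascular range."""
    return any(lo <= drg <= hi for lo, hi in CV_DRG_RANGES)

print(is_cardiovascular_drg(127), is_cardiovascular_drg(500))  # True False
```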
Covariates
Baseline patient-level covariates from the OPTIMIZE-HF registry included age, race, history of acute myocardial infarction, diabetes mellitus, prior cerebrovascular disease, peripheral vascular disease, depression, hyperlipidemia, chronic obstructive pulmonary disease, and atrial arrhythmia; and mean serum creatinine, hemoglobin, systolic and diastolic blood pressure, and weight at admission. Between 1% and 6% of the patients had missing values for creatinine, hemoglobin, systolic and diastolic blood pressure, and weight. We imputed the mean values of the overall cohort for these missing values. From the CMS data, we calculated the total number of heart failure hospitalizations for each hospital and heart failure hospitalizations as a percentage of total hospital discharges and included these as hospital-level covariates.
Statistical Analysis
We calculated frequencies and means for baseline demographic characteristics, comorbidities, and clinical characteristics for the full sample of 22750 patients, and hospital-level volume and performance measures for the 150 hospitals. We present Kaplan-Meier estimates of unadjusted mortality, and we calculated unadjusted cardiovascular readmission rates using the cumulative incidence function.14 In the primary analysis, we examined the relationship between hospital-level adherence and patient-level outcomes. Specifically, we used Cox proportional hazards models to estimate the unadjusted and adjusted effects of each hospital-level process measure on mortality and cardiovascular readmission. The multivariable models included the patient-level and hospital-level covariates described above. To account for the clustering of patients within hospitals, we calculated robust standard errors.15 We performed 2 sensitivity analyses. First, we relaxed the requirement for eligible patients per hospital from 25 to 10. Second, to assess the need for random effects, we modeled the mortality end point using a generalized linear model with a logit link and binomial variance function and specified site-level random intercepts.
To address the question of whether higher-performing hospitals have lower 1-year risk-adjusted mortality rates compared with lower-performing hospitals, we estimated the relationship between hospital-level process measures and hospital-level risk-adjusted outcomes using a bootstrap approach. For each patient, we first calculated predicted probabilities of mortality and cardiovascular readmission based on regression models that included the baseline patient-level covariates listed above. We then drew 1000 samples (with replacement) of 22750 patients from the data used in the main analysis. For each sample, we calculated the hospital-level conformity rates and hospital-level risk-adjusted outcome rates. Conformity rates were calculated as previously described. Risk-adjusted outcome rates were calculated by dividing the observed outcome rate by the expected outcome rate and multiplying this quantity by the observed 1-year outcome rate in the overall sample. In each sample, we regressed these hospital-level risk-adjusted mortality and readmission rates on each of the hospital-level process measures. For each outcome, we report the mean of the parameter estimates for each process measure. To assess statistical significance, we provide the 95% bootstrap percentile interval. We used SAS software version 9.1 (SAS Institute Inc, Cary, North Carolina) for all analyses.
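The risk-adjustment step (an observed/expected ratio rescaled by the overall sample rate) and a single bootstrap replicate can be sketched as follows. All data here are hypothetical, and the implementation is a minimal illustration of the described calculation, not the study's SAS code:

```python
import random

# Hypothetical patient-level data for one hospital: observed 1-year events (0/1)
# and model-predicted probabilities of the event for each patient.
observed = [1, 0, 0, 1, 0, 1, 0, 0]
expected = [0.4, 0.2, 0.3, 0.5, 0.1, 0.6, 0.2, 0.3]
overall_rate = 0.33  # observed 1-year rate in the full sample (illustrative)

def risk_adjusted_rate(obs, exp, overall):
    """Observed/expected ratio scaled by the overall sample rate."""
    return (sum(obs) / len(obs)) / (sum(exp) / len(exp)) * overall

# One bootstrap replicate: resample patients with replacement, then
# recompute the risk-adjusted rate on the resampled data. The full
# procedure repeats this 1000 times and regresses the hospital-level
# rates on each process measure within every replicate.
random.seed(0)
idx = [random.randrange(len(observed)) for _ in range(len(observed))]
boot = risk_adjusted_rate([observed[i] for i in idx],
                          [expected[i] for i in idx], overall_rate)

print(round(risk_adjusted_rate(observed, expected, overall_rate), 3))  # 0.381
```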
Results
The mean age of the overall cohort was 79 years, 44% were men, and 83% were white. Approximately one quarter of the patients had a history of acute myocardial infarction or non–insulin-dependent diabetes mellitus, and almost one third had a history of hyperlipidemia or chronic obstructive pulmonary disease (Table 1). Unadjusted overall 1-year mortality and cardiovascular readmission rates were 33% and 40%, respectively.
Table 1
Baseline Patient Characteristics (N = 22750)*
The median number of patients with heart failure treated annually at each hospital was 227 (interquartile range, 136-381). Mean hospital-level adherence rates for individual process measures varied considerably. On average, hospitals assessed left ventricular function in 86% of eligible patients but provided discharge instructions to only 52% of eligible patients. The mean hospital-level composite score, which indicates the proportion of CMS-endorsed care processes that were correctly provided, was 72%. The defect-free measure, which indicates the proportion of patients receiving all of the CMS-endorsed processes of care for which they were eligible, was 51% (Table 2). When hospital-level rates were applied uniformly to all patients within a given hospital, the resulting patient-level distributions of adherence rates were similar (data not shown).
Table 2
Hospital-Level Process Measure Adherence
Hospital-level adherence to the CMS-endorsed process measures, including discharge instructions, assessment of left ventricular function, prescription of an ACE inhibitor or ARB, and smoking cessation counseling, was not associated with lower patient-level mortality at 1 year in the adjusted analyses. Estimated effect sizes for these process measures were small: for each 10% incremental increase in hospital-level adherence, no process measure was associated with more than a 4% reduction in the odds of mortality. In contrast, hospital-level prescription of beta-blockers at discharge was significantly associated with lower patient-level mortality; a 10% incremental increase in hospital-level adherence was associated with 6% lower odds of mortality. Neither the CMS composite measure nor the defect-free measure was significantly related to patient-level mortality (Table 3). As in the mortality analyses, most of the process and composite measures were not associated with 1-year cardiovascular readmission after adjustment, with the exception of assessment of left ventricular function: a 10% increase in hospital-level adherence to assessment of left ventricular function was associated with a 4% increase in the odds of cardiovascular readmission at 1 year.
Table 3
Relationship Between Process Measures and 1-Year Outcomes
In the first sensitivity analysis, we relaxed the requirement for eligible patients per hospital from 25 to 10. Using this criterion, the sample increased to 188 hospitals and 23318 patients (smoking cessation at 76 hospitals; ACE inhibitor or ARB at 140 hospitals). Although most findings were unchanged, conformity to the ACE inhibitor/ARB measure showed a trend toward lower adjusted mortality (hazard ratio, 0.96; 95% confidence interval, 0.92-1.01; P = .08). Associations between all of the process measures and cardiovascular readmission were unchanged. In a separate sensitivity analysis, we assessed the need for random effects by fitting a hierarchical model for the mortality end point. The results corresponded almost exactly with those from the proportional hazards model with robust standard errors (data not shown).
Table 4 shows the results of the bootstrap analyses. None of the hospital-level individual process measure adherence rates or composite scores was found to be significantly associated with hospital-level risk-adjusted outcomes. Effect sizes were again found to be small.
Table 4
Relationship Between Hospital-Level Process Measures and Hospital-Level Risk-Adjusted Outcomes at 1 Year
Discussion
In this analysis of 22750 Medicare beneficiaries hospitalized with heart failure at 150 US hospitals, we found substantial variation in hospital adherence to the 4 CMS process measures. Yet, with the exception of the positive association between hospital-level conformity to the assessment of left ventricular function and cardiovascular readmission, there were no associations between the CMS hospital performance measures or the composite measures and patient-level mortality or cardiovascular readmission rates at 1 year. However, we did find a significant association between hospital-level adherence to prescription of beta-blockers at discharge and lower mortality at 1 year. To explore these associations with risk-adjusted hospital-level outcomes, we conducted bootstrap analyses and found the results to be generally consistent with the primary analysis.
These findings are generally consistent with a previous analysis examining patient-level predictors and outcomes of 5791 patients from the 91 hospitals that participated in OPTIMIZE-HF. In that study, only conformity with a measure for prescription of a beta-blocker for left ventricular systolic dysfunction was significantly associated with a lower risk of 60-day to 90-day mortality after propensity adjustment and risk adjustment.5 The findings are also consistent with a study using an administrative data source to examine associations between hospital-level processes of care and hospital-level outcomes in 3657 acute care hospitals, which found that assessment of left ventricular function and prescription of an ACE inhibitor at discharge were not significantly associated with improved survival at 1 year.3 In that study, the absolute difference in risk-adjusted mortality between hospitals performing at the 25th percentile and those performing at the 75th percentile was 0.002 (P = .05) for assessment of left ventricular function, –0.003 (P = .04) for ACE inhibitor use, and 0.002 (P = .08) for 1-year mortality. In contrast, a study of 2958 patients drawn from a 20-hospital health care system in a single community reported an association between CMS process measures at discharge and 1-year survival, though multiple known confounders were not included in the multivariable models and nurse case managers continued to be involved in the care of patients after discharge.16
The present analysis expands upon findings from previous studies in two key ways. First, this study links Medicare administrative data to a detailed clinical source to allow for both longitudinal outcome assessment and rigorous risk adjustment. Thus, we were able to determine whether CMS process measures for heart failure had measurable effects up to 1 year after discharge in a broad cohort of patients from all regions of the United States. In addition, the analysis examines how overall hospital adherence levels are related to patient-level mortality and cardiovascular readmission, thereby addressing the question of whether patients who are treated at hospitals that score higher on process measures have better outcomes. This analytic approach addresses whether receiving care at a hospital with better conformity to recommended processes of care is associated with improvements in long-term outcomes for patients with heart failure. Previous research from CRUSADE (Can Rapid Risk Stratification of Unstable Angina Patients Suppress Adverse Outcomes with Early Implementation of the ACC/AHA Guidelines) has also addressed the associations between hospital-level predictors and patient level outcomes, but for patients hospitalized with acute coronary syndromes.17-19 Although hospital profit status17 and the presence of an inpatient cardiology service18 were not significantly associated with inpatient outcomes, hospital participation in clinical trials19 was significantly related to patient-level mortality.
There are several potential explanations for the lack of associations in this study. First, the processes of care selected for the performance measures may truly not be associated with outcomes. Evidence of associations between discharge instructions, assessment of left ventricular function, and smoking cessation counseling are based on expert opinion rather than randomized clinical trials. Furthermore, outcomes after hospital discharge likely reflect a combination of many domains of care and may be dominated by postdischarge care processes, frequency of follow-up, and the underlying disease process. For example, being discharged with an ACE inhibitor or ARB does not ensure that a patient will remain on therapy or that an effective dose has been prescribed, nor does it ensure that the clinical effects will be observable within 1 year. However, the significant relationship observed between beta-blockers at discharge and mortality at 1 year demonstrates that associations can be detected when they exist. Second, hospital documentation of process measures may not reflect actual care. For example, patient education may be documented in the medical record even if it was completed at discharge in a rushed or superficial manner. Conversely, physicians or nurses may instruct a patient about medications, diet, symptoms of worsening heart failure, and daily weight monitoring but may not record this in the patient's medical record. Third, the self-reported nature of the process measure forms carries the risk that hospitals purposely underreport eligible patients to inflate the process measure adherence score, a violation that was suspected but not confirmed in a study of process measure adherence in family practices in the United Kingdom.20 Finally, studies examining effects of system-level exposures on individual-level outcomes may be limited by the inability to control for unobserved system-level characteristics, which could result in null associations.
Other findings in this study warrant comment. First, we found a small but significant association between assessment of left ventricular function and greater risk of cardiovascular readmission. The reason for this finding is unclear; we suspect it may reflect residual confounding, in which patients who were sicker in ways we did not measure may have been more likely both to undergo assessment of left ventricular function and to be rehospitalized, as compared with healthier patients. Second, the demographic characteristics of the sample are comparable to those in another study estimating trends in mortality among hospitalized Medicare beneficiaries with heart failure,21 providing some evidence that the results of the current study are generalizable to Medicare fee-for-service beneficiaries. Third, the high mortality and cardiovascular readmission rates in this patient population indicate that this is a high-risk population that would likely benefit from improved conformity with measures that have a strong process–outcome link.
Our study has some limitations. First, the process–outcome association may be confounded by socioeconomic factors or other unmeasured confounders related to both health status and hospital adherence level. Second, to the extent that Medicare beneficiaries enrolled in OPTIMIZE-HF are not representative of all Medicare beneficiaries with heart failure, the results may not be generalizable. Evidence suggests, however, that Medicare beneficiaries enrolled in OPTIMIZE-HF are similar to Medicare fee-for-service beneficiaries hospitalized with heart failure in terms of baseline characteristics, survival, and all-cause readmission.22 Third, the generalizability of the results may be further limited if participating hospitals differ from nonparticipating hospitals in ways not reflected in patient demographic characteristics, core measures, or postdischarge outcomes. Fourth, patient eligibility for a performance measure was based on documentation in the medical record, which may not always be accurate. For example, some patients may have had undocumented contraindications or intolerances, leading to an overestimation of the number of patients eligible for the performance measure. Finally, the cross-sectional nature of the data did not allow us to assess changes in performance measure conformity and clinical outcomes over time.
Performance measures are used for public reporting of the quality of cardiovascular care at the hospital level, affecting financial payments to medical centers and individual physicians. Thus, it is essential that measures be prioritized to include those known to be closely associated with patient outcomes. Given the lack of associations between the individual and composite measures and postdischarge clinical outcomes, the use of the CMS heart failure performance measures in their current form in pay-for-performance programs may not be the most efficacious way to assess and reward quality. Although clearly stated methods have been used to develop and implement heart failure performance measures, these measures are not fulfilling their stated purpose. Consequently, additional measures with stronger process–outcome links after hospital discharge should be considered. If a hospital's documentation process does not accurately capture the most important elements of care provided, it may be unreasonable to expect that incentives tied to these process measures would improve outcomes.
To our knowledge, this analysis is the first to examine how overall hospital conformity to the 4 current CMS heart failure-specific process measures is associated with individual-level, long-term outcomes in a broad cohort of patients from all regions of the United States. To build upon these results, future research is needed to refine how performance measures are created and selected. Consideration should be given to prospective validation and testing of measures, rather than selection of measures by expert panels. Before pay-for-performance is implemented broadly across all systems, the limitations of current performance measures and the differences in measure reliability across disease types, provider settings, and patient populations need to be better recognized. In addition, a minimally important difference should be defined before policy makers implement new process measures, especially given the small effect sizes observed here and elsewhere4; effects this small may not justify broad policy changes whose costs outweigh benefits that are not clinically significant. It is essential that new process of care measures for heart failure be developed and implemented so that the quality of care can be measured more accurately and the outcomes of this high-risk patient population can be improved.
Acknowledgments
Supported by grant U18HS10548 from the Agency for Healthcare Research and Quality; and by GlaxoSmithKline. Dr Hernandez is a recipient of an American Heart Association Pharmaceutical Roundtable grant (0675060N). Drs Curtis and Schulman were supported in part by grant R01AG026038 from the National Institute on Aging. Dr Fonarow is supported by the Ahmanson Foundation and the Corday Family Foundation.
1. Fonarow GC, Yancy CW, Heywood JT. Adherence to heart failure quality-of-care indicators in US hospitals: analysis of the ADHERE Registry. Arch Intern Med. 2005;165:1469–1477. [PubMed]
2. Bonow RO, Bennett S, Casey DE, Jr, et al. ACC/AHA clinical performance measures for adults with chronic heart failure: a report of the American College of Cardiology/American Heart Association Task Force on Performance Measures (Writing Committee to Develop Heart Failure Clinical Performance Measures) endorsed by the Heart Failure Society of America. J Am Coll Cardiol. 2005;46:1144–1178. [PubMed]
3. Werner RM, Bradlow ET. Relationship between Medicare's Hospital Compare performance measures and mortality rates. JAMA. 2006;296:2694–702. [PubMed]
4. Glickman SW, Ou FS, DeLong ER, et al. Pay for performance, quality of care, and outcomes in acute myocardial infarction. JAMA. 2007;297:2373–2380. [PubMed]
5. Fonarow GC, Abraham WT, Albert NM, et al. Association between performance measures and clinical outcomes for patients hospitalized with heart failure. JAMA. 2007;297:61–70. [PubMed]
6. Fonarow GC, Abraham WT, Albert NM, et al. Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients with Heart Failure (OPTIMIZE-HF): rationale and design. Am Heart J. 2004;148:43–51. [PubMed]
7. Fonarow GC, Abraham WT, Albert NM, et al. Carvedilol use at discharge in patients hospitalized for heart failure is associated with improved survival: an analysis from Organized Program to Initiate Lifesaving Treatment in Hospitalized Patients with Heart Failure (OPTIMIZE-HF). Am Heart J. 2007;153:82, e1–e11. [PubMed]
8. Joint Commission on Accreditation of Healthcare Organizations. Specification Manual for National Implementation of Hospital Core Measures. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations; 2004.
9. Hammill BG, Hernandez AF, Peterson ED, Fonarow GC, Schulman KA, Curtis LH. Linking inpatient clinical registry data to Medicare claims data using indirect identifiers. Am Heart J. In press. [PMC free article] [PubMed]
10. Fonarow GC, Abraham WT, Albert NM, et al. Prospective evaluation of beta-blocker use at the time of hospital discharge as a heart failure performance measure: results from OPTIMIZE-HF. J Card Fail. 2007;13:722–731. [PubMed]
11. Hospital Compare. http://www.hospitalcompare.hhs.gov. Accessed April 15, 2008.
12. Nolan T, Berwick DM. All-or-none measurement raises the bar on performance. JAMA. 2006;295:1168–1170. [PubMed]
13. Hiratzka LF, Eagle KA, Liang L, Fonarow GC, LaBresh KA, Peterson ED. Get With the Guidelines Steering Committee. Atherosclerosis secondary prevention performance measures after coronary bypass graft surgery compared with percutaneous catheter intervention and nonintervention patients in the Get With the Guidelines database. Circulation. 2007;116(11 Suppl):I207–I212. [PubMed]
14. Kalbfleisch JD, Prentice RL. The Statistical Analysis of Failure Time Data. Wiley; New York: 1980.
15. Lin DY, Wei LJ. The robust inference for the Cox proportional hazards model. J Am Stat Assoc. 1989;84:1074–1078.
16. Kfoury AG, French TK, Horne BD, et al. Incremental survival benefit with adherence to standardized. J Card Fail. 2008;14:95–102. [PubMed]
17. Shah BR, Glickman SW, Liang L, et al. The impact of for-profit health status on the care and outcomes of patients with non-ST segment elevation myocardial infarction: results from the CRUSADE Initiative. J Am Coll Cardiol. 2007;50:1462–1468. [PubMed]
18. Roe MT, Chen AY, Mehta RH, et al. Influence of inpatient service specialty on care processes and outcomes for patients with non ST-segment elevation acute coronary syndromes. Circulation. 2008;117:1153–1161. [PubMed]
19. Majumdar SR, Roe MT, Peterson ED, Chen AY, Gibler WB, Armstrong PW. Better outcomes for patients treated at hospitals that participate in clinical trials. Arch Intern Med. 2008;168:657–662. [PubMed]
20. Doran T, Fullwood C, Gravelle H, et al. Pay-for-performance programs in family practices in the United Kingdom. N Engl J Med. 2006;355:375–384. [PubMed]
21. Curtis LH, Greiner MA, Hammill BG, et al. Early and long-term outcomes of heart failure in elderly persons, 2001-2005. Arch Intern Med. 2008;168:2481–2488. [PMC free article] [PubMed]
22. Curtis LH, Greiner MA, Hammill BG, et al. Representativeness of a national heart failure quality-of-care registry: comparison of Medicare patients in the OPTIMIZE-HF registry with non-OPTIMIZE-HF Medicare patients. Circulation: Cardiovascular Quality and Outcomes. In press. [PMC free article] [PubMed]