Drug-eluting coronary stents (DES) rapidly dominated the marketplace in the United States after approval by the Food and Drug Administration in April 2003, but utilization rates were initially lower among African-American patients. We assess whether racial differences persisted as DES diffused into practice.
Medicare claims data were used to identify coronary stenting procedures among elderly patients with acute coronary syndromes (ACS). Regression models of the choice of DES versus bare metal stent controlled for demographics, ACS type, comorbidities, and hospital characteristics. Diffusion was assessed in the short run (2003–2004) and long run (2007), with the effect of African-American race estimated separately in each period to allow for a time-varying effect.
The sample included 381,887 Medicare beneficiaries treated with stent insertion; approximately five percent were African-American. Initially (May 2003–February 2004), African-American race was associated with lower DES use compared to other races (44.3% vs. 46.5%, p<0.01). Once DES usage was high in all patients (March–December 2004), differences were not significant (79.8% vs. 80.3%, p=0.45). Subsequent concerns regarding DES safety caused broad reductions in DES use, with African-Americans having lower rates of use than other racial groups in 2007 (63.1% versus 65.2%, p<0.01).
Racial disparities in DES use initially disappeared during a period of rapid diffusion and high usage rates; the reappearance of disparities in use by 2007 may reflect DES use tailored to unmeasured aspects of case mix and socioeconomic status. Further work is needed to determine whether these underlying racial differences reflect physician decisions regarding the appropriateness of treatment.
Drug-eluting stents; Health Care Disparities; Biomedical Technology
To use coronary revascularization choice to illustrate the application of a method simulating a treatment's effect on subsequent resource use.
Medicare inpatient and outpatient claims from 2002–2008 for patients receiving multi-vessel revascularization for symptomatic coronary disease in 2003–2004.
This retrospective cohort study of 102,877 beneficiaries assessed survival, days in institutional settings, and Medicare payments for up to six years following receipt of percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG).
A three-part estimator designed to provide robust estimates of a treatment's effect in the setting of mortality and censored follow-up was used. The estimator decomposes the treatment effect into effects attributable to survival differences versus treatment-related intensity of resource use.
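The survival-versus-intensity decomposition can be illustrated with a minimal sketch. All numbers below are hypothetical, and the arithmetic shows only the general idea, not the authors' exact three-part estimator: a difference in cumulative payments splits exactly into a component from extra periods alive and a component from different spending per period alive.

```python
# Sketch of a survival-vs-intensity decomposition of a cumulative payment
# difference between treatments A and B. Hypothetical numbers; not the
# authors' exact three-part estimator.
# S_a[t], S_b[t]: probability of being alive in period t
# m_a[t], m_b[t]: expected payments per period among survivors

S_a = [1.00, 0.95, 0.90, 0.86]   # survival, treatment A (e.g., CABG)
S_b = [1.00, 0.93, 0.87, 0.82]   # survival, treatment B (e.g., PCI)
m_a = [9000, 1200, 1100, 1000]   # payments per period among survivors, A
m_b = [6000, 1500, 1400, 1300]   # payments per period among survivors, B

# Total adjusted difference in cumulative payments
total = sum(sa * ma - sb * mb for sa, sb, ma, mb in zip(S_a, S_b, m_a, m_b))

# Survival component: extra time alive under A, valued at B's intensity
survival_part = sum((sa - sb) * mb for sa, sb, mb in zip(S_a, S_b, m_b))

# Intensity component: spending difference per period, weighted by A's survival
intensity_part = sum(sa * (ma - mb) for sa, ma, mb in zip(S_a, m_a, m_b))

# The two components sum exactly to the total difference
assert abs(total - (survival_part + intensity_part)) < 1e-9
```

With these hypothetical inputs the intensity component dominates, mirroring the finding below that most of the payment difference reflects intensity rather than survival effects.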
After adjustment, CABG recipients on average survived 23 days longer, spent 11 additional days in institutional settings, and had cumulative Medicare payments that were $12,834 higher than PCI recipients. The majority of the differences in institutional days and payments were due to intensity rather than survival effects.
In this example, the survival benefit from CABG was modest and the resource implications were substantial, although further adjustments for treatment selection are needed.
Censoring; comparative effectiveness research; coronary artery bypass grafting; percutaneous coronary intervention
To determine diagnostic testing patterns after percutaneous coronary intervention (PCI).
Little is known about patterns of diagnostic testing after PCI in the U.S. or the relationship of these patterns with clinical outcomes.
We linked Centers for Medicare & Medicaid Services inpatient and outpatient claims to the National Cardiovascular Data Registry® CathPCI Registry® data from 2005–2007. Hospital quartiles of the cumulative incidence of diagnostic testing use within 12 and 24 months post-PCI were compared for patient characteristics, repeat revascularization, acute myocardial infarction (AMI), and death.
A total of 247,052 patients underwent PCI at 656 institutions. Patient and site characteristics were similar across testing use quartiles. There was a 9% and 20% higher adjusted risk of repeat revascularization in Quartile 3 and Quartile 4 (highest testing rate), respectively, when compared to Quartile 1 (lowest testing rate) (p=0.020 and <0.0001, respectively). The adjusted risk for death or AMI did not differ among quartiles.
While patient characteristics were largely independent of rates of post-PCI testing, higher testing rates were not associated with lower risks of myocardial infarction or death, but repeat revascularization was significantly more frequent at high-testing sites. Additional studies should examine whether increased testing is a marker for improved quality of post-PCI care or simply increased healthcare utilization.
stress testing; diagnostic catheterization; site-level patterns; patient outcomes
Limited data are available on the use of coronary computed tomography angiography (CCTA) in patients who have received percutaneous coronary intervention (PCI). To evaluate patterns of cardiac testing including CCTA after PCI, we created a retrospective observational data set linking the National Cardiovascular Data Registry® CathPCI Registry® baseline data with longitudinal inpatient and outpatient Medicare claims data for patients who received coronary stenting between November 1, 2005 and December 31, 2007. Among 192,009 PCI patients (median age 74 years), the first test after coronary stenting was CCTA for 553 (0.3%), stress testing for 89,900 (46.8%), and coronary angiography for 22,308 (11.6%); 79,248 (41.3%) had no further testing. Patients referred to CCTA first had generally similar or lower baseline risk than those referred for stress testing or catheterization first. Compared to patients with stress testing first after PCI, patients who underwent CCTA first had higher unadjusted rates of subsequent noninvasive testing (10% vs. 3%), catheterization (26% vs. 15%), and revascularization (13% vs. 8%) within 90 days of initial post-PCI testing (p<0.0001 for all). In conclusion, despite similar or lesser risk profiles, patients initially evaluated with CCTA after PCI had more downstream testing and revascularization than patients initially evaluated with stress testing. It is unclear whether these differences derive from patient selection, the performance of CCTA relative to other testing strategies, or the association of early CCTA adoption with distinct patterns of care.
coronary computed tomography angiography; percutaneous coronary intervention; patterns of care
Patterns of non-invasive stress test (ST) and invasive coronary angiography (CA) utilization after percutaneous coronary intervention (PCI) are not well described in older populations.
Methods and Results
We linked National Cardiovascular Data Registry® CathPCI Registry® data with longitudinal Medicare claims data for 250,350 patients undergoing PCI from 2005 to 2007 and described subsequent testing and outcomes. Between 60 days post-PCI and end of follow-up (median 24 months), 49% (n=122,894) received stress testing first, 10% (n=25,512) underwent invasive CA first, and 41% (n=101,944) had no testing (NT). A number of clinical risk factors at time of index PCI were associated with decreased likelihood of downstream testing (ST or CA, p<0.05 for all), including older age (HR 0.784 per 10 year increase), male sex (HR 0.946), heart failure (HR 0.925), diabetes (HR 0.954), smoking (HR 0.804), and renal failure (HR 0.880). Fifteen percent of patients with ST first proceeded to subsequent CA within 90 days of testing (n=18,472/101,884); of these, 48% (n=8831) underwent revascularization within 90 days, compared to 53% (n=13,316) of CA first patients (p<0.0001).
In this descriptive analysis, stress testing and invasive CA were common in older patients after PCI. Paradoxically, patients with higher-risk features at baseline were less likely to undergo post-PCI testing. The revascularization yield was low among patients referred for ST after PCI, with only 9% undergoing revascularization within 90 days.
non-invasive stress test; coronary angiography; percutaneous coronary intervention; clinical outcomes
Exercise testing with echocardiography or myocardial perfusion imaging is widely used to risk‐stratify patients with suspected coronary artery disease. However, reports of diagnostic performance rarely adjust for referral bias, and this practice may adversely influence patient care. Therefore, we evaluated the potential impact of referral bias on diagnostic effectiveness and clinical decision‐making.
Methods and Results
Searching PubMed and EMBASE (1990–2012), 2 investigators independently evaluated eligibility and abstracted data on study characteristics and referral patterns. Diagnostic performance reported in 4 previously published meta‐analyses of exercise echocardiography and myocardial perfusion imaging was adjusted using pooled referral rates and Bayesian methods. Twenty‐one studies reported referral patterns in 49,006 patients (mean age 60.7 years, 39.6% women, and 0.8% prior history of myocardial infarction). Catheterization referral rates after normal and abnormal exercise tests were 4.0% (95% CI, 2.9% to 5.0%) and 42.5% (36.2% to 48.9%), respectively, with an odds ratio for referral after an abnormal test of 14.6 (10.7 to 19.9). After adjustment for referral, exercise echocardiography sensitivity fell from 84% (80% to 89%) to 34% (27% to 41%), and specificity rose from 77% (69% to 86%) to 99% (99% to 100%). Similarly, exercise myocardial perfusion imaging sensitivity fell from 85% (81% to 88%) to 38% (31% to 44%), and specificity rose from 69% (61% to 78%) to 99% (99% to 100%). Summary receiver operating characteristic curve analysis demonstrated only modest changes in overall discriminatory power, but adjusting for referral increased positive‐predictive value and reduced negative‐predictive value.
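The referral adjustment works by inversely weighting verified test results by the catheterization referral rate in each test-result stratum, in the spirit of a Begg–Greenes-type correction. A minimal sketch with hypothetical verified counts (only the referral rates echo those reported above):

```python
# Sketch of a verification-bias (referral-bias) correction: scale verified
# 2x2 counts up by the inverse of the referral rate in each test-result
# stratum. All counts below are hypothetical.

def adjust_for_referral(tp, fp, fn, tn, r_pos, r_neg):
    """tp/fp: disease present/absent among verified abnormal tests;
    fn/tn: among verified normal tests; r_pos/r_neg: catheterization
    referral rates after abnormal/normal tests."""
    TP, FP = tp / r_pos, fp / r_pos   # reconstruct the full tested cohort
    FN, TN = fn / r_neg, tn / r_neg
    sens = TP / (TP + FN)
    spec = TN / (TN + FP)
    return sens, spec

# Hypothetical verified counts; referral rates as pooled in the text
tp, fp, fn, tn = 80, 20, 4, 96
naive_sens = tp / (tp + fn)   # inflated: diseased patients cluster in the
naive_spec = tn / (tn + fp)   # heavily referred abnormal-test stratum
adj_sens, adj_spec = adjust_for_referral(tp, fp, fn, tn,
                                         r_pos=0.425, r_neg=0.04)
```

As in the pooled results, adjustment lowers sensitivity and raises specificity, because unverified normal tests (mostly disease-free) are restored to the denominator.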
Exercise echocardiography and myocardial perfusion imaging are considerably less sensitive and more specific for coronary artery disease after adjustment for referral. Given these findings, future work should assess the comparative ability of these and other tests to rule‐in versus rule‐out coronary artery disease.
coronary artery disease; diagnostic performance; echocardiography; exercise testing; myocardial perfusion imaging
The optimal use of stress testing after coronary revascularization remains unclear, and over-utilization of stress testing may increase rates of repeat revascularization. We analyzed the relationship, at both the patient and regional level, between the use of stress testing and repeat revascularization for a cohort of Medicare beneficiaries receiving revascularization within 30 days of an admission for symptomatic coronary artery disease (CAD). The sample consisted of 219,748 Medicare beneficiaries older than 65 years who received percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG) following hospital admission for symptomatic CAD in 2003–2004. Medicare claims data through 2008 identified the use of stress testing and repeat revascularization. Associations between the cumulative incidence of stress testing and repeat revascularization were analyzed using linear regressions. Within six years of initial revascularization, the cumulative incidence of events was 0.61 for stress testing and 0.23 for repeat revascularization. Most (53.1%) repeat revascularizations were preceded by a stress test. Only 10.3% of repeat revascularization procedures were preceded by myocardial infarction. Four-year cumulative incidence of repeat revascularization and stress testing varied across the Hospital Referral Regions (HRRs) represented in the sample, and the positive correlation between the rates by HRR accounted for only a small portion of the total HRR variation in revascularization rates. Stress testing is commonly performed among Medicare patients after initial revascularization, and the majority of repeat procedures are performed for stable CAD. Variation in stress testing patterns explains only a modest fraction of regional variation in repeat revascularization rates.
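Cumulative incidence in this setting must treat death as a competing risk rather than censoring it. A minimal Aalen–Johansen-style estimator can be sketched as follows; the data are a toy example, not the study's claims-derived event times:

```python
# Minimal sketch of a cumulative incidence estimator with competing risks
# (Aalen-Johansen form): cause 1 = stress test, cause 2 = death,
# cause 0 = censored. Toy data only.

def cumulative_incidence(data):
    """data: list of (time, cause) pairs. Returns cumulative incidence
    for causes 1 and 2 at the last observed event time."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0                    # overall event-free probability S(t-)
    cif = {1: 0.0, 2: 0.0}
    i = 0
    while i < len(data):
        t = data[i][0]
        d = {1: 0, 2: 0}
        n_removed = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            if data[i][1] in d:
                d[data[i][1]] += 1
            n_removed += 1
            i += 1
        for k in (1, 2):          # cause-specific hazard increments
            cif[k] += surv * d[k] / n_at_risk
        surv *= 1 - (d[1] + d[2]) / n_at_risk
        n_at_risk -= n_removed
    return cif

# Four patients: tested at t=1, died at t=2, censored at t=3, tested at t=4
toy = [(1, 1), (2, 2), (3, 0), (4, 1)]
result = cumulative_incidence(toy)
```

Hand-checking the toy data: the stress-test incidence accumulates 1/4 at t=1 and 0.5 at t=4 (one event among one at risk, weighted by the remaining event-free probability), giving 0.75, while death contributes 0.25 at t=2.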
Stress testing; Coronary revascularization; Medicare claims data
We evaluated temporal trends and geographic variation in choice of stress testing modality post-PCI, as well as associations between modality and procedure use after testing.
Stress testing is frequently performed post-PCI, but the choices among available modalities (electrocardiogram [ECG]-only, nuclear, or echocardiography; pharmacologic or exercise stress) and the consequences of such choices are not well characterized.
CathPCI Registry® data were linked with identifiable Medicare claims to capture stress testing use between 60 and 365 days post-PCI and procedures within 90 days after testing. Testing rates and modality used were modeled based on patient, procedure, and PCI facility factors, calendar quarter, and Census Divisions using Poisson and logistic regression. Post-test procedure use was assessed using Gray’s test.
In 284,971 patients, the overall stress testing rate after PCI was 53.1 per 100 person-years. Testing rates declined from 59.3 in Quarter 1 (2006) to 47.1 in Quarter 4 (2008), but the relative use of modalities changed little. Among exercise testing recipients, adjusted proportions receiving ECG-only testing varied from 6.8% to 22.8% across Census Divisions, and among exercise testing recipients having an imaging test, the proportion receiving echocardiography (versus nuclear) varied from 9.4% to 34.1%. Post-test procedure use varied among modalities; exercise ECG-only testing was associated with more subsequent stress testing (13.7% vs. 2.9%; p<0.001) but less catheterization (7.4% vs. 14.1%; p<0.001) than imaging-based tests.
Modest reductions in stress testing after PCI occurring between 2006 and 2008 cannot be ascribed to trends in use of any single modality. Additional research should assess whether this trend represents better patient selection for testing or administrative policies (e.g., restricted access for patients with legitimate testing needs). Geographic variation in utilization of stress modalities and differences in downstream procedure use among modalities suggest a need to identify optimal use of the different test modalities in individual patients.
imaging; echocardiography, stress; myocardial perfusion imaging
Instrumental variable (IV) methods can correct for unmeasured confounding when using administrative (claims) data for cardiovascular outcomes research, but difficulties identifying valid IVs have limited their use. We evaluated the safety and efficacy of drug-eluting coronary stents (DES) compared to bare metal stents (BMS) for Medicare beneficiaries with acute coronary syndromes (ACS) using the rapid uptake of DES in clinical practice as an instrument. We compared results from IV to those from propensity score matching (PSM) and multivariable regression models.
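The IV logic can be illustrated with a simulation sketch (not the study's actual models): when an unmeasured confounder drives both treatment choice and outcome, a naive outcome comparison is biased, while a Wald-type IV estimate built on a valid instrument, such as calendar-time uptake of a new device, recovers the true effect. All parameters below are invented for illustration.

```python
# Illustrative simulation of the IV idea (hypothetical parameters):
# a naive comparison is biased by an unmeasured confounder u, while a
# Wald/IV estimate using instrument z recovers the true effect beta.
import random

random.seed(7)
n = 20000
beta = 1.0                                  # true treatment effect
z_list, x_list, y_list = [], [], []
for _ in range(n):
    z = 1 if random.random() < 0.5 else 0   # instrument (e.g., late period)
    u = random.gauss(0, 1)                  # unmeasured confounder
    x = 1 if 0.8 * z + u + random.gauss(0, 0.5) > 0.5 else 0  # treatment
    y = beta * x + 1.5 * u + random.gauss(0, 0.5)             # outcome
    z_list.append(z); x_list.append(x); y_list.append(y)

def mean(v):
    return sum(v) / len(v)

# Naive estimate: outcome difference by treatment received (confounded by u)
naive = (mean([y for x, y in zip(x_list, y_list) if x == 1])
         - mean([y for x, y in zip(x_list, y_list) if x == 0]))

# Wald/IV estimate: reduced-form difference over first-stage difference
dy = (mean([y for z, y in zip(z_list, y_list) if z == 1])
      - mean([y for z, y in zip(z_list, y_list) if z == 0]))
dx = (mean([x for z, x in zip(z_list, x_list) if z == 1])
      - mean([x for z, x in zip(z_list, x_list) if z == 0]))
iv = dy / dx
```

The instrument shifts treatment without directly affecting the outcome, so scaling the outcome difference across instrument levels by the treatment difference isolates the causal effect; this is the same logic as using the rapid uptake of DES as an instrument.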
Retrospective cohort study involving 62,309 fee-for-service beneficiaries aged 66 and older treated with coronary stenting between May 2003 and February 2004. Outcomes were measured for 46 months after revascularization using claims data.
DES recipients were younger, had a lower prevalence of myocardial infarction, and had fewer comorbidities compared to BMS recipients. Use of DES was associated with lower rates of mortality by PSM (HR 0.80, CI: [0.77, 0.83]) but not by IV (HR 0.99, CI: [0.87, 1.11]). IV models estimated a larger reduction in repeat revascularization (HR 0.76, CI: [0.63, 0.89]) than did PSM (HR 0.90, CI: [0.87, 0.93]).
Based on IV analysis, the increased utilization of DES relative to BMS among Medicare beneficiaries with ACS is associated with reduced rates of repeat revascularization and no difference in mortality. IV approaches provide a useful complement to conventional approaches to cardiovascular outcomes research with administrative data.
The significance of heterogeneous vancomycin-intermediate Staphylococcus aureus (hVISA) is unknown. Using a multinational collection of isolates from methicillin-resistant S. aureus (MRSA) infective endocarditis (IE), we characterized IE patients with and without hVISA, and genotyped the infecting strains.
MRSA bloodstream isolates from 65 patients with definite IE from 8 countries underwent PCR for 31 virulence genes, pulsed-field gel electrophoresis, and multilocus sequence typing. hVISA was defined using population analysis profiling (PAP).
Nineteen (29.2%) of 65 MRSA IE isolates exhibited hVISA by PAP. Isolates from Oceania and Europe were more likely to exhibit hVISA than isolates from the United States (77.8% vs. 35.0% vs. 13.9%; P < .001). The prevalence of hVISA was higher among isolates with a vancomycin minimum inhibitory concentration of 2 mg/L (P = .026). hVISA-infected patients were more likely to have persistent bacteremia (68.4% vs. 37.0%; P = .029) and heart failure (47.4% vs. 19.6%; P = .033). Mortality of hVISA- and non-hVISA-infected patients did not differ (42.1% vs. 34.8%, P = .586). hVISA and non-hVISA isolates were genotypically similar.
In these analyses, hVISA occurred in over one-quarter of MRSA IE isolates, was associated with certain IE complications, and varied in frequency by geographic region.
hVISA; Methicillin-resistant Staphylococcus aureus; endocarditis; genotype
Clopidogrel’s effectiveness in preventing thrombotic events after acute coronary syndrome (ACS) is likely to be significantly reduced in patients with a decreased ability to metabolize clopidogrel into its active form. A genetic mutation responsible for this reduced effectiveness is detectable by genotyping. Ticagrelor is not dependent on gene-based metabolic activation and demonstrated greater clinical efficacy than clopidogrel in a recent secondary prevention trial. In 2011, clopidogrel will lose its patent protection and likely will be substantially less expensive than ticagrelor.
To determine the cost-effectiveness of ticagrelor compared with a genotype-driven selection of antiplatelet agents.
A hybrid decision tree/Markov model was used to estimate the 5-year medical costs (in 2009 US$) and outcomes for a cohort of ACS patients enrolled in Medicare receiving either genotype-driven or ticagrelor-only treatment. Outcomes included life years and quality-adjusted life years (QALYs) gained. Data comparing the clinical performance of ticagrelor and clopidogrel were derived from the Platelet Inhibition and Patient Outcomes trial.
The incremental cost-effectiveness ratio (ICER) for universal ticagrelor was $10,059 per QALY compared to genotype-driven treatment, and was most sensitive to the price of ticagrelor and the hazard ratio for death for ticagrelor compared with clopidogrel. The ICER remained below $50,000 per QALY until a monthly ticagrelor price of $693 or a 0.93 hazard ratio for death for ticagrelor relative to clopidogrel. In probabilistic analyses, universal ticagrelor was below $50,000 per QALY in 97.7% of simulations.
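The ICER reported above is, at its core, simple arithmetic: incremental cost divided by incremental QALYs, compared against a willingness-to-pay threshold. A sketch with hypothetical inputs, not the model's actual outputs:

```python
# ICER = (cost_new - cost_ref) / (QALY_new - QALY_ref).
# Inputs below are hypothetical, not the study's model outputs.

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0:
        raise ValueError("new strategy must add QALYs for a meaningful ICER")
    return d_cost / d_qaly

# Hypothetical 5-year discounted totals per patient
result = icer(cost_new=21000, qaly_new=3.45, cost_ref=20000, qaly_ref=3.35)
# About $1,000 / 0.10 QALY, i.e. roughly $10,000 per QALY gained,
# well below a conventional $50,000/QALY threshold
```

Sensitivity analyses then vary inputs such as the drug price or the mortality hazard ratio until the ICER crosses the threshold, which is how the $693 price and 0.93 hazard ratio break-even points above are obtained.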
Prescribing ticagrelor universally increases quality-adjusted life years for ACS patients at a cost below a typically accepted threshold.
acute coronary syndrome; clopidogrel; ticagrelor; cost-benefit analysis; secondary prevention
When patients choose percutaneous coronary intervention (PCI) over coronary artery bypass grafting (CABG), they accept an increased long-term risk of repeat revascularization in exchange for short-term morbidity benefits. This paper quantifies the risk-benefit trade-off faced by patients with multiple vessel coronary artery disease.
Methods and Results
Data from the Arterial Revascularization Therapies Study are used to generate risk-benefit acceptability curves for PCI versus CABG. Risks are measured by the long-term likelihood of repeat revascularization, while benefits are measured by short-term reductions in pain or improvements in health-related quality of life (HRQL). PCI patients faced a risk of 0.81 additional revascularization events over three years in exchange for being pain-free at one month. A patient would need to be willing to tolerate a risk of 1.06 additional revascularization events at three years, in exchange for being pain-free at one month, to be 95% confident that choosing PCI over CABG is risk-effective for that patient.
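A confidence bound of this kind can be obtained by bootstrapping the incremental risk-benefit ratio. The sketch below uses simulated patient-level data, not ARTS data, purely to illustrate the mechanics:

```python
# Sketch of the acceptability-curve idea: bootstrap the ratio of incremental
# risk (extra repeat revascularizations with PCI) to incremental benefit
# (extra probability of being pain-free at one month). Simulated data only.
import random

random.seed(42)

# Hypothetical patient-level outcomes: (repeat revascularizations over
# 3 years, pain-free at 1 month)
pci  = [(random.choices([0, 1, 2], weights=[60, 30, 10])[0],
         1 if random.random() < 0.70 else 0) for _ in range(600)]
cabg = [(random.choices([0, 1, 2], weights=[90, 8, 2])[0],
         1 if random.random() < 0.55 else 0) for _ in range(600)]

def irbr(pci_s, cabg_s):
    """Incremental risk-benefit ratio: extra revascularization events per
    unit gain in the probability of being pain-free."""
    d_risk = (sum(r for r, _ in pci_s) / len(pci_s)
              - sum(r for r, _ in cabg_s) / len(cabg_s))
    d_benefit = (sum(p for _, p in pci_s) / len(pci_s)
                 - sum(p for _, p in cabg_s) / len(cabg_s))
    return d_risk / d_benefit

point = irbr(pci, cabg)
boots = sorted(
    irbr(random.choices(pci, k=len(pci)), random.choices(cabg, k=len(cabg)))
    for _ in range(500)
)
# 95th percentile: the risk a patient must tolerate to be 95% confident
# that PCI is risk-effective, analogous to the 1.06 figure in the text
upper95 = boots[int(0.95 * len(boots))]
```

Sweeping the tolerated-risk threshold over the bootstrap distribution traces out the full acceptability curve.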
The risk-benefit framework outlined in this study provides information to enable physicians to help their patients weigh directly each procedure’s risks and benefits. While trade-offs are typically measured in quality-adjusted life years, using pain reduction to reflect benefits may provide a more tangible framework for patients.
Percutaneous stents; coronary artery bypass graft
Comparative effectiveness of interventional treatment strategies for the very elderly with acute coronary syndrome remains poorly defined due to study exclusions. Interventions include percutaneous coronary intervention (PCI), usually with stents, or coronary artery bypass grafting (CABG). The elderly are frequently directed to PCI because of provider perceptions that PCI is at therapeutic equipoise with CABG and that CABG incurs increased risk. We evaluated long-term outcomes of CABG versus PCI in a cohort of very elderly Medicare beneficiaries presenting with acute coronary syndrome.
Using Medicare claims data, we analyzed outcomes of multivessel PCI or CABG treatment for a cohort of 10,141 beneficiaries age 85 and older diagnosed with acute coronary syndrome in 2003 and 2004. The cohort was followed for survival and composite outcomes (death, repeat revascularization, stroke, acute myocardial infarction) for three years. Logistic regressions controlled for patient demographics and comorbidities with propensity score adjustment for procedure selection.
Percutaneous coronary intervention showed early benefits of lower morbidity and mortality, but CABG outcomes improved relative to PCI outcomes by three years (p < 0.01). At 36 months post-initial revascularization, 66.0% of CABG recipients survived (versus 62.7% of PCI recipients, p < 0.05) and 46.1% of CABG recipients were free from the composite outcome (versus 38.7% of PCI recipients, p < 0.01).
In very elderly patients with ACS and multivessel CAD, CABG appears to offer an advantage over PCI in survival and freedom from the composite endpoint at three years. Optimizing the benefit of CABG in very elderly patients requires absence of significant congestive heart failure, lung disease, and peripheral vascular disease.
We investigated associations between the genotypic and phenotypic features of Staphylococcus aureus bloodstream isolates and the clinical characteristics of bacteremic patients enrolled in a phase III trial of S. aureus bacteremia and endocarditis. Isolates underwent pulsed-field gel electrophoresis, PCR for 33 putative virulence genes, and screening for heteroresistant glycopeptide intermediate S. aureus (hGISA). A total of 230 isolates (141 methicillin-susceptible S. aureus and 89 methicillin-resistant S. aureus [MRSA]) were analyzed. North American and European S. aureus isolates differed in their genotypic characteristics. Overall, 26% of the MRSA bloodstream isolates were USA 300 strains. Patients with USA 300 MRSA bacteremia were more likely to be injection drug users (61% versus 15%; P < 0.001), to have right-sided endocarditis (39% versus 9%; P = 0.002), and to be cured of right-sided endocarditis (100% versus 33%; P = 0.01) than patients with non-USA 300 MRSA bacteremia. Patients with persistent bacteremia were less likely to be infected with Panton-Valentine leukocidin gene (pvl)-constitutive MRSA (19% versus 56%; P = 0.005). Although 7 of 89 MRSA isolates (8%) exhibited the hGISA phenotype, no association with persistent bacteremia, daptomycin resistance, or bacterial genotype was observed. This study suggests that the virulence gene profiles of S. aureus bloodstream isolates from North America and Europe differ significantly. In this study of bloodstream isolates collected as part of a multinational randomized clinical trial, USA 300 and pvl-constitutive MRSA strains were associated with better clinical outcomes.
The impact of bacterial genetic characteristics on the outcome of patients with Staphylococcus aureus infections is uncertain. This investigation evaluated potential associations between bacterial genotype and clinical outcome using isolates collected as part of an international phase 2 clinical trial (FAST II) evaluating telavancin for the treatment of complicated skin and skin structure infections (cSSSI). Ninety S. aureus isolates from microbiologically evaluable patients with cSSSI enrolled in the FAST II trial from 11 sites in the United States (56 isolates, or 62%) and 7 sites in South Africa (34 isolates, or 38%) were examined for staphylococcal cassette chromosome mec, agr, and the presence of 31 virulence genes and subjected to pulsed-field gel electrophoresis (PFGE). South African methicillin-susceptible S. aureus (MSSA) isolates were more likely to carry certain virulence genes, including sdrD (P = 0.01), sea (P < 0.01), and pvl (P = 0.01). All 44 (49%) methicillin-resistant S. aureus (MRSA) isolates were from the United States; 37 (84%) were strain USA 300 by PFGE. In the United States, MRSA isolates were more likely than MSSA isolates to carry genes for sdrC (P = 0.03), map/eap (P = 0.05), fnbB (P = 0.11), tst (P = 0.02), sea (P = 0.04), sed (P = 0.04), seg (P = 0.11), sej (P = 0.11), agr (P = 0.09), V8 (P = 0.06), sdrD, sdrE, eta, etb, and see (P < 0.01 for all). MRSA isolates were more often clonal than MSSA isolates by PFGE. Isolates from patients who were cured were significantly more likely to contain the pvl gene than isolates from patients who failed or had indeterminate outcomes (79/84 [94%] versus 3/6 [50%]; P = 0.01). S. aureus strains from different geographic regions have different distributions of virulence genes.