The potential to save money within a short time frame provides a more compelling “business case” for quality improvement (QI) than merely demonstrating cost-effectiveness. Our objective was to demonstrate the potential for cost savings from improved control in patients anticoagulated for atrial fibrillation (AF).
Methods and Results
Our population consisted of 67,077 Veterans Health Administration (VA) patients anticoagulated for AF between October 1, 2006 and September 30, 2008. We simulated the number of adverse events, and their associated costs and utilities, both before and after various degrees of improvement in percent time in therapeutic range (TTR). The simulation had a two-year time horizon, and costs were calculated from the perspective of the payer. In the base-case analysis, improving TTR by 5% prevented 1,114 adverse events, including 662 deaths; it gained 863 quality-adjusted life-years (QALYs) and saved $15.9 million compared to the status quo, not accounting for the cost of the QI program. Improving TTR by 10% prevented 2,087 events, gained 1,606 QALYs, and saved $29.7 million. In sensitivity analyses, costs were most sensitive to the estimated risk of stroke and the expected stroke reduction from improved TTR. Utilities were most sensitive to the estimated risk of death and the expected mortality benefit from improved TTR.
A QI program to improve anticoagulation control would likely be cost-saving for the payer, even if it were only modestly effective in improving control, and even without considering the value of improved health. This study demonstrates how to make a business case for a QI initiative.
anticoagulants; atrial fibrillation; patient simulation; quality improvement
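The logic of the cost-saving simulation above can be sketched as a simple expected-value calculation. This is a minimal illustration only, not the study's actual model: the event rate, the event-rate reduction per 5% TTR gain, and the cost per event below are invented placeholders, and the real analysis simulated multiple event types with utilities.

```python
# Illustrative sketch (NOT the study's model or inputs): expected adverse
# events and payer costs before/after a hypothetical TTR improvement,
# assuming event risk falls linearly with TTR. All rates/costs are invented.

def expected_events_and_savings(n_patients, baseline_rate_2yr,
                                risk_reduction_per_5pct_ttr,
                                ttr_gain_pct, cost_per_event):
    """Expected events over a 2-year horizon, and payer savings from fewer events."""
    # Relative risk after the TTR gain (e.g., 5% gain -> 10% fewer events here)
    rr = 1.0 - risk_reduction_per_5pct_ttr * (ttr_gain_pct / 5.0)
    events_before = n_patients * baseline_rate_2yr
    events_after = events_before * rr
    savings = (events_before - events_after) * cost_per_event
    return events_before, events_after, savings

before, after, saved = expected_events_and_savings(
    n_patients=67_077,                 # cohort size from the abstract
    baseline_rate_2yr=0.02,            # hypothetical 2-year event risk
    risk_reduction_per_5pct_ttr=0.10,  # hypothetical effect of better control
    ttr_gain_pct=5.0,
    cost_per_event=40_000,             # hypothetical payer cost per event
)
```

Because savings scale with both baseline risk and the assumed effect of improved TTR, this structure also shows why the study's results were most sensitive to the estimated stroke risk and the expected stroke reduction.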
Lifestyle and socioeconomic status have been implicated in the prevalence of hypertension; thus, we evaluated factors associated with hypertension in a cohort of blacks and whites with similar socioeconomic status characteristics.
Methods and Results
We evaluated the prevalence and factors associated with self-reported hypertension (SR-HTN) and ascertained hypertension (A-HTN) among 69,211 participants in the Southern Community Cohort Study. Multivariable logistic regression models were used to estimate the odds ratios (ORs) and 95% confidence intervals (CIs) for factors associated with hypertension. The prevalence of SR-HTN was 57% overall. Body mass index was associated with SR-HTN in all race-sex groups, with the OR rising to 4.03 (95% CI, 3.74–4.33) for morbidly obese participants (body mass index, >40 kg/m²). Blacks were more likely to have SR-HTN than whites (OR, 1.84; 95% CI, 1.75–1.93), and the association with black race was more pronounced among women (OR, 2.08; 95% CI, 1.95–2.21) than men (OR, 1.47; 95% CI, 1.36–1.60). Similar findings were noted in the analysis of A-HTN. Among those with SR-HTN and A-HTN who reported use of an antihypertensive agent, 94% were on at least one of the major classes of antihypertensive agents, but only 44% were on ≥2 classes and only 29% were on a diuretic. The odds of both uncontrolled hypertension (SR-HTN and A-HTN) and unreported hypertension (no SR-HTN and A-HTN) were twice as high among blacks as whites (OR, 2.13; 95% CI, 1.68–2.69; and OR, 1.99; 95% CI, 1.59–2.48, respectively).
Despite socioeconomic status similarities, we observed suboptimal use of antihypertensives in this cohort and racial differences in the prevalence of uncontrolled and unreported hypertension, which merit further investigation.
African Continental Ancestry Group; European Continental Ancestry Group; hypertension; prevalence
Simultaneous contribution of hundreds of electrocardiographic biomarkers to prediction of long-term mortality in post-menopausal women with clinically normal resting electrocardiograms (ECGs) is unknown.
Methods and Results
We analyzed ECGs and all-cause mortality in 33,144 women enrolled in Women’s Health Initiative trials, who were without baseline cardiovascular disease or cancer, and had normal ECGs by Minnesota and Novacode criteria. Four hundred seventy-seven ECG biomarkers, encompassing global and individual ECG findings, were measured using computer algorithms. During a median follow-up of 8.1 years (range for survivors 0.5–11.2 years), 1,229 women died. For analyses, the cohort was randomly split into derivation (n=22,096, deaths=819) and validation (n=11,048, deaths=410) subsets. ECG biomarkers, demographic, and clinical characteristics were simultaneously analyzed using both traditional Cox regression and Random Survival Forest (RSF), a novel algorithmic machine-learning approach. Regression modeling failed to converge. RSF variable selection yielded 20 variables that were independently predictive of long-term mortality, 14 of which were ECG biomarkers related to autonomic tone, atrial conduction, and ventricular depolarization and repolarization.
We identified 14 ECG biomarkers from amongst hundreds that were associated with long-term prognosis using a novel random forest variable selection methodology. These were related to autonomic tone, atrial conduction, ventricular depolarization, and ventricular repolarization. Quantitative ECG biomarkers have prognostic importance, and may be markers of subclinical disease in apparently healthy post-menopausal women.
Electrocardiography; epidemiology; women; prognosis
Despite ongoing efforts to improve the quality of pediatric resuscitation, it remains unknown whether survival in children with in-hospital cardiac arrest has improved.
Methods and Results
Between 2000 and 2009, we identified children (<18 years) with an in-hospital cardiac arrest at hospitals with ≥ 3 years of participation and ≥ 5 cases annually within the national Get With The Guidelines-Resuscitation registry. Multivariable logistic regression was used to examine temporal trends in survival to discharge. We also explored whether trends in survival were due to improvement in acute resuscitation or post-resuscitation care and examined trends in neurological disability among survivors. Among 1031 children at 12 hospitals, the initial cardiac arrest rhythm was asystole and pulseless electrical activity in 874 children (84.8%) and ventricular fibrillation and pulseless ventricular tachycardia in 157 children (15.2%), with an increase in cardiac arrests due to asystole and pulseless electrical activity over time (P for trend <0.001). Risk-adjusted rates of survival to discharge increased from 14.3% in 2000 to 43.4% in 2009 (adjusted rate ratio per 1-year 1.08; 95% CI [1.01,1.16]; P for trend 0.02). Improvement in survival was largely driven by an improvement in acute resuscitation survival (risk adjusted rates: 42.9% in 2000, 81.2% in 2009; adjusted rate ratio per 1-year: 1.04; 95% CI [1.01,1.08]; P for trend 0.006). Moreover, survival trends were not accompanied by higher rates of neurological disability among survivors over time (unadjusted P for trend 0.32), suggesting an overall increase in the number of survivors without neurological disability over time.
Rates of survival to hospital discharge in children with in-hospital cardiac arrests have improved over the past decade without higher rates of neurological disability among survivors.
cardiopulmonary resuscitation; pediatrics; survival
Mixed methods studies, in which qualitative and quantitative methods are combined in a single program of inquiry, can be valuable in biomedical and health services research, where the complementary strengths of each approach can yield greater insight into complex phenomena than either approach alone. Although interest in mixed methods is growing among science funders and investigators, written guidance on how to conduct and assess rigorous mixed methods studies is not readily accessible to the general readership of peer-reviewed biomedical and health services journals. Furthermore, existing guidelines for publishing mixed methods studies are not well known or applied by researchers and journal editors. Accordingly, this paper is intended to serve as a concise, practical resource for readers interested in core principles and practices of mixed methods research. We briefly describe mixed methods approaches and present illustrations from published biomedical and health services literature, including in cardiovascular care, summarize standards for the design and reporting of these studies, and highlight four central considerations for investigators interested in using these methods.
Mixed Methods; Health Services Research
Individuals living in primary care health professional shortage areas (PC-HPSAs) often have difficulty obtaining medical care; however, no previous studies have examined the association of PC-HPSA residence with the prevalence of cardiovascular disease (CVD) risk factors.
Methods and Results
To examine this question, the authors used data from the Multi-Ethnic Study of Atherosclerosis (MESA) baseline exam (2000–2002). Outcomes included the prevalence of diabetes, hypertension, hyperlipidemia, smoking, and obesity as well as the awareness and control of diabetes, hypertension, and hyperlipidemia. Multivariable Poisson models were used to examine the independent association of PC-HPSA residence with each outcome. Models were sequentially adjusted for demographics, acculturation, socioeconomic status, access to health care, and neighborhood socioeconomic status. Similar to the national average, 16.7% of MESA participants lived in a PC-HPSA. In unadjusted analyses, prevalence rates of diabetes (14.8% vs 11.0%), hypertension (48.2% vs 43.1%), obesity (35.7% vs 31.1%), and smoking (15.5% vs 12.1%) were significantly higher among residents of PC-HPSAs. There were no significant differences in the awareness or control of diabetes, hypertension, or hyperlipidemia. After adjustment, residence in a PC-HPSA was not independently associated with CVD risk factor prevalence, awareness, or control.
This study suggests that the increased prevalence of CVD risk factors in PC-HPSAs is explained by the demographic and socioeconomic characteristics of their residents. Future interventions aimed at increasing the number of primary care physicians may not improve cardiovascular risk without first addressing other factors underlying healthcare disparities.
epidemiology; prevention; risk factors
Vitamin D status has been linked to the risk of cardiovascular disease (CVD). However, the optimal 25-hydroxyvitamin D (25(OH)-vitamin D) levels for potential cardiovascular health benefits remain unclear.
Methods and Results
We searched MEDLINE and EMBASE from 1966 through February 2012 for prospective studies that assessed the association of 25(OH)-vitamin D concentrations with CVD risk. A total of 24 articles met our inclusion criteria, from which 19 independent studies with 6,123 CVD cases in 65,994 participants were included for a meta-analysis. Comparing the lowest to the highest 25(OH)-vitamin D categories, the pooled relative risk (RR) was 1.52 (95% CI: 1.30-1.77) for total CVD, 1.42 (95% CI: 1.19-1.71) for CVD mortality, 1.38 (95% CI: 1.21-1.57) for coronary heart disease, and 1.64 (95% CI: 1.27-2.10) for stroke. These associations remained strong and significant when analyses were limited to studies that excluded participants with baseline CVD and better controlled for season and confounding. We used a fractional polynomial spline regression analysis to assess the linearity of the dose-response association between continuous 25(OH)-vitamin D and CVD risk. The CVD risk increased monotonically across decreasing 25(OH)-vitamin D below approximately 60 nmol/L, with a RR of 1.03 (95% CI: 1.00-1.06) per 25 nmol/L decrement in 25(OH)-vitamin D.
This meta-analysis demonstrated a generally linear, inverse association between circulating 25(OH)-vitamin D in the range of 20-60 nmol/L and risk of CVD. Further research is needed to clarify the association of 25(OH)-vitamin D higher than 60 nmol/L with CVD risk and assess causality of the observed associations.
25-hydroxyvitamin D; cardiovascular disease; meta-analysis; prospective study
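The pooled relative risks reported above rest on standard inverse-variance pooling of study-level estimates on the log scale. The following is a minimal fixed-effect sketch of that first step, with invented study RRs and confidence intervals; the published meta-analysis additionally handled heterogeneity and dose-response modeling, which this does not attempt.

```python
import math

# Minimal fixed-effect inverse-variance pooling of relative risks on the
# log scale. Input studies below are invented placeholders, NOT the 19
# studies summarized in the abstract.

def pool_rr(studies):
    """studies: list of (rr, ci_low, ci_high); returns (pooled_rr, lo, hi)."""
    z = 1.959964  # 97.5th percentile of the standard normal
    num = den = 0.0
    for rr, lo, hi in studies:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE recovered from CI width
        w = 1.0 / se ** 2                             # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))

pooled, lo, hi = pool_rr([(1.6, 1.2, 2.1), (1.4, 1.1, 1.8), (1.7, 1.2, 2.4)])
```

Pooling on the log scale keeps the CI symmetric around the log estimate, which is why the back-transformed interval around the pooled RR is asymmetric, as in the abstract's reported values.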
Implantable cardioverter defibrillators (ICDs) are increasingly used for primary prevention following randomized controlled trials (RCTs) demonstrating that they reduce the risk of death in patients with left ventricular systolic dysfunction (LVSD). The extent to which the clinical characteristics and long-term outcomes of unselected, community-based patients with LVSD undergoing primary prevention ICD implantation in a real-world setting compare with those enrolled in the RCTs is not well characterized. The Longitudinal Study of ICDs is being conducted to address these questions.
Methods and Results
The study cohort includes consecutive patients undergoing primary prevention ICD placement between 1/1/2006 and 12/31/2009 in seven health plans. Baseline clinical characteristics were acquired from the NCDR ICD Registry. Longitudinal data collection is underway and will include hospitalization, mortality, and resource utilization from the Virtual Data Warehouse. Data regarding ICD therapies will be obtained through chart abstraction and adjudicated by a panel of experts in device therapy. Compared with the populations of primary prevention ICD therapy RCTs, the cohort (n=2,621) is on average significantly older (by 2.5–6.5 years), more often female, more often from racial and ethnic minority groups, and has a significantly higher burden of coexisting conditions. The cohort is similar, however, to a national population undergoing primary prevention ICD placement.
Patients undergoing primary prevention ICD implantation in the Longitudinal Study of ICDs differ from those enrolled in the RCTs that established the efficacy of ICDs. Understanding a broad range of health outcomes, including ICD therapies, in this cohort will provide patients, clinicians, and policy-makers with contemporary data to inform decision-making.
arrhythmia; electrophysiology; epidemiology
Intra-aortic balloon pumps (IABPs) are frequently used to provide hemodynamic support during high risk percutaneous coronary intervention (PCI), but clinical evidence to support their use is mixed. We examined hospital variation in IABP use among high risk PCI patients and determined the association of IABP use with mortality in this population.
Methods and Results
We analyzed data submitted to the CathPCI Registry® between January 2005 and December 2007. High risk PCI was defined as having at least one of the following features: unprotected left main artery as the target vessel, cardiogenic shock, severely depressed left ventricular function, or ST segment elevation myocardial infarction. Hospitals were categorized into quartiles by their proportional use of IABP. We examined differences in in-hospital mortality across hospital quartiles using a hierarchical logistic regression model to adjust for differences in patient and hospital characteristics across hospital quartiles of IABP use. IABPs were used in 18,990 (10.5%) of 181,599 high risk PCIs. Proportional use of IABP varied significantly across hospital quartiles: Q1: 0.0%–6.5%; Q2: 6.6%–9.2%; Q3: 9.3%–14.1%; and Q4: 14.2%–40.0%. In multivariable analysis, after adjustment for differences in patient and hospital characteristics, in-hospital mortality was comparable across quartiles of hospital IABP usage (Q1: reference; Q2: odds ratio (OR) 1.11, 95% CI 0.99–1.24; Q3: OR 1.03, 95% CI 0.92–1.15; Q4: OR 1.06, 95% CI 0.94–1.18).
IABP use varied significantly across hospitals for high risk PCI. However, this variation in IABP use was not associated with differences in in-hospital mortality.
Angioplasty; Atherosclerosis; Heart assist device
Little is known regarding the adoption of direct thrombin inhibitors in clinical practice. We examined trends in oral anticoagulation for the prevention of thromboembolism in the United States.
Methods and Results
We used the IMS Health National Disease and Therapeutic Index, a nationally representative audit of office-based providers, to quantify patterns of oral anticoagulant use among all subjects and stratified by clinical indication. We quantified oral anticoagulant expenditures using the IMS Health National Prescription Audit. Between 2007 and 2011, warfarin treatment visits declined from approximately 2.1 million [M] quarterly visits to approximately 1.6M visits. Dabigatran use increased from 0.062M quarterly visits (2010Q4) to 0.363M visits (2011Q4), reflecting its increasing share of oral anticoagulant visits from 3.1% to 18.9%. In contrast to warfarin, the majority of dabigatran visits have been for atrial fibrillation, though this proportion decreased from 92% (2010Q4) to 63% (2011Q4), with concomitant increases in dabigatran’s off-label use. Among atrial fibrillation visits, warfarin use decreased from 55.8% of visits (2010Q4) to 44.4% (2011Q4), while dabigatran use increased from 4.0% to 16.9%. Of atrial fibrillation visits, the fraction not treated with any oral anticoagulants has remained unchanged at approximately 40%. Expenditures related to dabigatran increased rapidly from $16M in 2010Q4 to $166M in 2011Q4, exceeding expenditures on warfarin ($144M) in 2011Q4.
Dabigatran has been rapidly adopted into ambulatory practice in the United States, primarily for treatment of atrial fibrillation, but increasingly for off-label indications. We did not find evidence that it has increased overall atrial fibrillation treatment rates.
anticoagulants; coumarins; other anticoagulants
Cardiovascular disease continues to cause significant morbidity, mortality, and impaired quality of life, with unrealized health gains from the underuse of available evidence. The Transitions, Risks, and Actions in Coronary Events Center for Outcomes Research and Education (TRACE-CORE) aims to advance the science of acute coronary syndromes (ACS) by examining the determinants and outcomes of the quality of the transition from the hospital to the community and by quantifying the impact of potentially modifiable characteristics associated with decreased quality of life, rehospitalization, and mortality.
Methods and Results
TRACE-CORE is composed of a longitudinal multi-racial cohort of patients hospitalized with ACS, two research projects, and development of a nucleus of early stage investigators. We are currently enrolling 2,500 adults hospitalized for ACS at 6 hospitals in the northeastern and southeastern United States. We will follow these patients for 24 months after hospitalization through medical record abstraction and six patient interviews focusing on quality of life, cardiac events, rehospitalizations, mortality, and medical, behavioral, and psychosocial characteristics. The Transitions Project studies determinants of and disparities in outcomes of the quality of patients’ transition from the hospital to the community. Focusing on potentially modifiable factors, the Action Scores Project will develop and validate action scores to predict recurrent cardiac events, death, and quality of life, describe longitudinal variation in these scores, and develop a dashboard for patient and provider action based on these scores.
In TRACE-CORE, sound methodologic principles of observational studies converge with outcomes and effectiveness research approaches. We expect that our data, research infrastructure, and research projects will inform the development of novel secondary prevention approaches and underpin the careers of CVD outcomes researchers.
angina; infarction; risk factors; follow-up studies
Many individuals with diabetes, hypertension and hyperlipidemia have difficulty achieving control of all three conditions. We assessed the incidence and duration of simultaneous control of hyperglycemia, blood pressure, and low-density lipoprotein (LDL) cholesterol in patients from two health care systems in Colorado.
Methods and Results
We conducted a retrospective cohort study of adults at Denver Health (DH) and Kaiser Permanente Colorado (KP) with diabetes, hypertension, and hyperlipidemia from 2000 through 2008. Over a median of 4.0 and 4.4 years, 16% and 30% of individuals at DH and KP achieved the primary outcome (simultaneous control with a glycosylated hemoglobin (HbA1c) < 7.0%, blood pressure < 130/80 mmHg, and LDL cholesterol < 100 mg/dL), respectively. With less strict goals (HbA1c < 8.0%, BP < 140/90 mmHg, and LDL cholesterol < 130 mg/dL), 44% and 70% of individuals at DH and KP achieved simultaneous control. Socio-demographic characteristics (increasing age, white ethnicity) and the presence of cardiovascular disease or other comorbidities were significantly but not strongly predictive of achieving simultaneous control in multivariable models. Simultaneous control was less likely as severity of the underlying conditions increased, and more likely as medication adherence increased.
Simultaneous control of diabetes, hypertension, and hyperlipidemia was uncommon and generally transient. Less stringent goals had a relatively large effect on the proportion achieving simultaneous control. Individuals who simultaneously achieve multiple treatment goals may provide insight into self-care strategies for individuals with comorbid health conditions.
Diabetes mellitus; hypertension; hypercholesterolemia; epidemiology
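The "simultaneous control" outcome above is a conjunction of three treatment goals, which can be stated directly in code. This sketch encodes only the thresholds given in the abstract; the function name and the assumption that blood pressure control requires both systolic and diastolic components to be below goal are mine.

```python
# Sketch of the abstract's "simultaneous control" definition. Thresholds are
# those stated in the abstract; treating BP control as requiring BOTH the
# systolic and diastolic components below goal is an assumption.

def simultaneous_control(hba1c_pct, sbp_mmhg, dbp_mmhg, ldl_mg_dl, strict=True):
    """True if all three conditions meet goal; strict=False uses the relaxed goals."""
    if strict:
        # HbA1c < 7.0%, BP < 130/80 mmHg, LDL < 100 mg/dL
        return hba1c_pct < 7.0 and sbp_mmhg < 130 and dbp_mmhg < 80 and ldl_mg_dl < 100
    # HbA1c < 8.0%, BP < 140/90 mmHg, LDL < 130 mg/dL
    return hba1c_pct < 8.0 and sbp_mmhg < 140 and dbp_mmhg < 90 and ldl_mg_dl < 130
```

Because the outcome is an AND of three conditions, modest per-condition control rates compound into the low simultaneous-control rates the abstract reports, and relaxing each threshold raises the joint rate sharply.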
Recent studies have suggested poor quality and diminished quantity of sleep may be independently linked to vascular events, though prospective and multiethnic studies are limited. This study aimed to explore the relationship between daytime sleepiness and the risk of ischemic stroke and vascular events in an elderly, multi-ethnic prospective cohort.
Methods and Results
As part of the Northern Manhattan Study, the Epworth Sleepiness Scale (ESS) was collected during the 2004 annual follow-up. Daytime sleepiness was trichotomized using previously reported cut points of “no dozing,” “some dozing,” and “significant dozing”. Subjects were followed annually for a mean of 5.1 years. Cox proportional hazards models were used to calculate hazard ratios (HR) and 95% confidence intervals (95% CI) for stroke, myocardial infarction (MI), and death outcomes. We obtained the ESS on 2088 community residents. The mean age was 73.5 ± 9.3 years; 64% were women; 17% white, 20% black, 60% Hispanic, and 3% other. Over 44% of the cohort reported no daytime dozing, 47% reported “some dozing,” and 9% “significant daytime dozing.” Compared to those reporting no daytime dozing, individuals reporting significant dozing had an increased risk of ischemic stroke [HR=2.74 (95% CI 1.38-5.43)], all stroke [3.00 (1.57-5.73)], the combination of ischemic stroke, MI, and vascular death [2.38 (1.50-3.78)], and all vascular events [2.48 (1.57-3.91)], after adjusting for medical comorbidities.
Daytime sleepiness is an independent risk factor for stroke and other vascular events. These findings suggest the importance of screening for sleep problems at the primary care level.
Ischemic Stroke; Sleep; Epidemiology; Vascular Disease; race/ethnicity
Dabigatran, an oral thrombin inhibitor, and rivaroxaban and apixaban, oral factor Xa inhibitors, have been found safe and effective in reducing stroke risk in patients with atrial fibrillation. We sought to compare the efficacy and safety of the 3 new agents based on data from their published warfarin-controlled randomized trials, using the method of adjusted indirect comparisons.
Methods and Results
We included findings from 44,535 patients enrolled in three trials of the efficacy of dabigatran (RE-LY), apixaban (ARISTOTLE), and rivaroxaban (ROCKET-AF), each compared with warfarin. The primary efficacy endpoint was stroke or systemic embolism; the safety endpoint we studied was major hemorrhage. To address a lack of comparability between trial populations caused by the restriction of ROCKET-AF to high risk patients, we conducted a subgroup analysis in patients with a CHADS2 score ≥3. We found no statistically significant efficacy differences among the three drugs, although apixaban and dabigatran were numerically superior to rivaroxaban. Apixaban produced significantly fewer major hemorrhages than dabigatran and rivaroxaban.
An indirect comparison of new anticoagulants based on existing trial data indicates that, in patients with a CHADS2 score ≥3, dabigatran 150 mg, apixaban 5 mg, and rivaroxaban 20 mg resulted in statistically similar rates of stroke and systemic embolism, but apixaban had a lower risk of major hemorrhage compared to dabigatran and rivaroxaban. Until head-to-head trials or large-scale observational studies that reflect routine use of these agents are available, such adjusted indirect comparisons based on trial data are one tool to guide initial therapeutic choices.
Indirect comparison; anticoagulation; dabigatran; rivaroxaban; apixaban; warfarin; randomized controlled trial
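The adjusted indirect comparison used above follows the standard Bucher approach: two drugs compared against a common comparator (warfarin) are contrasted on the log scale, with their variances added. The sketch below uses placeholder hazard ratios and CIs, not the actual RE-LY, ARISTOTLE, or ROCKET-AF results.

```python
import math

# Bucher adjusted indirect comparison: given HR(A vs warfarin) and
# HR(B vs warfarin), the indirect HR(A vs B) is their ratio on the log
# scale, with standard errors combined in quadrature. Inputs below are
# placeholders, NOT the published trial estimates.

def indirect_comparison(hr_a_w, ci_a, hr_b_w, ci_b):
    """Return (HR of A vs B, (ci_low, ci_high)) from two warfarin-controlled HRs."""
    z = 1.959964
    log_a, log_b = math.log(hr_a_w), math.log(hr_b_w)
    se_a = (math.log(ci_a[1]) - math.log(ci_a[0])) / (2 * z)  # SE from CI width
    se_b = (math.log(ci_b[1]) - math.log(ci_b[0])) / (2 * z)
    log_ab = log_a - log_b                     # common comparator cancels out
    se_ab = math.sqrt(se_a ** 2 + se_b ** 2)   # variances add for the difference
    return (math.exp(log_ab),
            (math.exp(log_ab - z * se_ab), math.exp(log_ab + z * se_ab)))

hr, (lo, hi) = indirect_comparison(0.80, (0.66, 0.96), 0.88, (0.75, 1.03))
```

Because the variances add, the indirect CI is wider than either trial's own CI, which is one reason indirect comparisons are treated as hypothesis-generating rather than definitive.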
The impact of polyvascular disease (peripheral arterial disease [PAD] and/or cerebrovascular disease [CVD]) on long-term cardiovascular outcomes among older patients with acute myocardial infarction (MI) has not been well studied.
Methods and Results
Non–ST-elevation MI (NSTEMI) patients aged ≥65 years from the CRUSADE registry who survived to hospital discharge were linked to longitudinal data from the Centers for Medicare and Medicaid Services (n=34,205). All patients were presumed to have coronary artery disease (CAD) and were classified into 4 groups: 10.7% had prior CVD (CAD+CVD group); 11.5% had prior PAD (CAD+PAD); 3.1% had prior PAD and CVD (CAD+PAD+CVD); and 74.7% had no polyvascular disease (CAD alone). Cox proportional hazard modeling was used to examine the hazard of long-term mortality and the composite of death, readmission for MI, or readmission for stroke (median follow-up 35 months, IQR 17–49) among the 4 groups.
Compared with the CAD-alone group, patients with polyvascular disease had a greater comorbidity burden, were less likely to undergo revascularization, and less often received recommended discharge interventions. Three-year mortality rates increased with a greater number of arterial beds involved: 33% for CAD alone, 49% for CAD+PAD, 52% for CAD+CVD, and 59% for CAD+PAD+CVD. Relative to the CAD-alone group, patients with all 3 arterial beds involved had the highest risk of long-term mortality (adjusted HR [95% CI]: 1.49 [1.38–1.61], with a lower risk for those with CAD+CVD, 1.38 [1.31–1.44], and those with CAD+PAD, 1.29 [1.23–1.35]). Similarly, the adjusted risk of long-term composite ischemic events was highest among the CAD+PAD+CVD group.
Older NSTEMI patients with polyvascular disease have substantially higher long-term risk, such that the 3-year mortality rate is >50%. Future studies targeting greater adherence to secondary prevention strategies and novel therapies are needed to help reduce long-term cardiovascular events in this vulnerable population.
Performance measures that emphasize only a treat-to-target approach may motivate overtreatment with high dose statins, potentially leading to adverse events and unnecessary costs. We developed a clinical action performance measure for lipid management in patients with diabetes that is designed to encourage appropriate treatment with moderate dose statins while minimizing overtreatment.
Methods and Results
We examined data from July 2010 to June 2011 for 964,818 active VA primary care patients ≥18 years with diabetes. We defined 3 conditions as successfully meeting the clinical action measure for patients 50-75 years old: 1) LDL < 100 mg/dL; 2) on a moderate dose statin, regardless of LDL level or measurement; or 3) if LDL ≥100 mg/dL, received appropriate clinical action (starting, switching, or intensifying statin therapy). We examined possible overtreatment for patients 18 and older by examining the proportion of patients without ischemic heart disease who were on a high dose statin. We then examined variability in measure attainment across 881 facilities using two-level hierarchical multivariable logistic models. Of 668,209 patients with diabetes aged 50-75 years, 84.6% passed the clinical action measure: 67.2% with LDL <100 mg/dL; 13.0% with LDL ≥100 mg/dL and on either a moderate dose statin (7.5%) or with appropriate clinical action (5.5%); and 4.4% with no index LDL on at least a moderate dose statin. Of the entire cohort aged ≥18 years, 13.7% were potentially overtreated. Facilities with higher rates of meeting the current threshold measure (LDL <100 mg/dL) had higher rates of potential overtreatment (p<0.001).
Use of a performance measure that credits appropriate clinical action indicates that almost 85% of diabetic Veterans aged 50-75 are receiving appropriate dyslipidemia management. However, many patients are potentially overtreated with high dose statins.
Cholesterol; Performance Measures; Quality of Care; Diabetes Mellitus; Lipids
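The three pass conditions of the clinical action measure form a simple decision rule, sketched below. The field names and the treatment of a patient with no index LDL and no statin as failing are my assumptions; the abstract specifies only the three passing conditions.

```python
# Sketch of the clinical action measure's pass logic as described in the
# abstract. Parameter names are assumptions; a patient with no index LDL
# and no moderate dose statin is assumed to fail.

def meets_clinical_action_measure(ldl_mg_dl, on_moderate_dose_statin,
                                  received_clinical_action):
    """ldl_mg_dl may be None when no index LDL was measured."""
    if on_moderate_dose_statin:
        return True                      # condition 2: passes regardless of LDL
    if ldl_mg_dl is None:
        return False                     # no LDL, no statin: assumed fail
    if ldl_mg_dl < 100:
        return True                      # condition 1: LDL at goal
    return received_clinical_action      # condition 3: LDL >=100 needs action
```

The point of the rule is visible in the branches: unlike a pure treat-to-target measure, a patient on a moderate dose statin passes without any LDL threshold, so facilities are not pushed toward high dose statins merely to reach an LDL number.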
Radial artery access for coronary angiography and interventions has been promoted for reducing hemostasis time and vascular complications compared to femoral access, yet it can take longer to perform and is not always successful, leading to concerns about its cost. We report a cost-benefit analysis of radial catheterization, based on results from a systematic review of published randomized controlled trials (RCTs).
Methods and results
The systematic review added five additional RCTs to a prior review, for a total of 14 studies. Meta-analyses, following Cochrane procedures, suggested that radial catheterization significantly increased catheterization failure (OR 4.92, 95% CI 2.69–8.98), but reduced major complications (OR 0.32, CI 0.24–0.42), major bleeding (OR 0.39, CI 0.27–0.57), and hematoma (OR 0.36, CI 0.27–0.48) compared to femoral catheterization. It added approximately 1.4 minutes to procedure time (CI −0.22 to 2.97) and reduced hemostasis time by about 13 minutes (CI −23.90 to −2.30). There were no differences in procedure success rates or major adverse cardiovascular events.
A stochastic simulation model of per-case costs took into account procedure and hemostasis time, costs of repeating the catheterization at the alternate site if the first catheterization failed, and the inpatient hospital costs associated with complications from the procedure. Using base-case estimates based on our meta-analysis results, we found the radial approach cost $275 (95% CI: −$374 to −$183) less per patient from the hospital perspective. Radial catheterization was favored over femoral catheterization under all conditions tested.
Radial catheterization was favored over femoral catheterization in our cost-benefit analysis.
Coronary angiography; heart catheterization; radial artery; femoral artery; cost-benefit analysis; meta-analysis
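The stochastic per-case cost model described above can be illustrated with a toy Monte Carlo simulation: each simulated case draws a possible access-site failure (requiring crossover to the alternate site) and a possible complication, and accrues the corresponding costs. All probabilities and dollar amounts below are invented; the study's model additionally costed procedure and hemostasis time, which this sketch omits.

```python
import random

# Toy Monte Carlo per-case cost model in the spirit of the abstract's
# stochastic simulation. Probabilities and costs are invented placeholders;
# time-related costs from the actual model are omitted.

def simulate_mean_cost(p_failure, p_complication, base_cost,
                       crossover_cost, complication_cost,
                       n=100_000, seed=42):
    rng = random.Random(seed)   # fixed seed for reproducibility
    total = 0.0
    for _ in range(n):
        cost = base_cost
        if rng.random() < p_failure:       # access failure -> repeat at other site
            cost += crossover_cost
        if rng.random() < p_complication:  # bleeding/hematoma inpatient costs
            cost += complication_cost
        total += cost
    return total / n

# Hypothetical scenario: radial fails more often but complicates less.
radial = simulate_mean_cost(0.05, 0.01, 2000, 800, 10_000)
femoral = simulate_mean_cost(0.01, 0.03, 2000, 800, 10_000)
```

Under these invented inputs the rarer but more expensive complications dominate, so the higher-failure, lower-complication arm still comes out cheaper, mirroring the direction of the study's finding for radial access.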
Transfer delays for primary percutaneous coronary intervention (PPCI) may increase mortality in patients with ST-segment elevation myocardial infarction (STEMI). We examined the association between door 1 to door 2 (D1D2) time, a measure capturing the entire transfer process, and outcomes in patients undergoing inter-hospital transfer for primary PCI.
Methods and Results
We evaluated the relationship between D1D2 time and the 90-day incidence of death, shock, and heart failure in the subset of 2075 (36.1%) of 5745 patients who underwent inter-hospital transfer for PPCI in the APEX-AMI trial. There was no significant difference in the 90-day incidence of death, shock, and heart failure between the transferred and the non-transferred groups (10.3% vs 10.2%, p=0.89). The median difference in symptom-to-balloon time between the two groups was 45 minutes (229 vs 184, p<0.001). The hazard of the primary outcome per 30-minute delay was higher for patients with a D1D2 time ≤150 minutes (HR, 1.19; 95% confidence interval [CI], 1.06 to 1.33; p=0.004) but not for D1D2 times >150 minutes (HR, 0.99; 95% CI, 0.96 to 1.02; p=0.496). The association between longer D1D2 time and worsening outcome was no longer statistically significant after multivariable adjustment.
Longer transfer times were associated with higher rates of death, shock, and heart failure among patients undergoing inter-hospital transfer for PPCI, although this difference did not persist after adjusting for baseline characteristics.
Clinical Trial Registration Information
URL: www.clinicaltrials.gov. Unique Identifier: NCT00091637
STEMI; Primary PCI; Transfer
While the Framingham Risk Score provides a reasonable estimation of risk in certain subgroups, the majority of myocardial infarctions (MIs) occur in individuals classified as low or moderate risk. Coronary artery calcium (CAC) testing provides an individualized measure of atherosclerotic burden that integrates an individual’s cumulative lifetime risk factor exposure in a way that cannot be obtained from serum markers.
Methods and Results
We briefly summarized the existing evidence for the use of CAC scanning in primary prevention and performed a meta-analysis of the existing randomized controlled data investigating the impact of CAC screening on lifestyle modification, risk factors, and downstream testing. We identified four trials published between 2003 and 2011 with a total of 2,490 participants, >75% of whom came from the Early Identification of Subclinical Atherosclerosis by Noninvasive Imaging Research (EISNER) trial. Three of the trials reported a non-significant increase in smoking cessation in the scan versus no-scan group, with a pooled estimate of 1.15 (95% CI 0.77–1.71). A significant reduction in systolic blood pressure (SBP) and LDL cholesterol was noted in the EISNER trial, but the pooled estimates were 0.23 mmHg (95% CI −2.25 to 2.71) and 0.23 mg/dL (95% CI −5.96 to 6.42), respectively. Only the EISNER trial reported medication usage according to CAC score: a higher CAC score was associated with an increased prescription of lipid lowering medications (p<0.001), and a CAC score of 0 was associated with fewer prescriptions for lipid lowering medications (p=0.02).
Our meta-analysis highlights the paucity of randomized evidence linking CAC scanning to improved intermediate and hard outcomes in primary prevention. Future trials are urgently needed to determine the impact of CAC screening on lifestyle modification, risk factor modification, and downstream testing.
coronary artery disease; risk factors; primary prevention; imaging
In 2010, recognizing the value of outcomes research to understand and bridge translational gaps, establish evidence in the clinical practice and delivery of medicine, and generate new hypotheses about ongoing questions of treatment and care, the National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health (NIH) established the Centers for Cardiovascular Outcomes Research (CCOR) program.
Methods and Results
The NHLBI funded three centers and a research coordinating unit. Each center has an individual project focus, including: (1) characterizing care transition and predicting clinical events and quality of life for patients discharged after an acute coronary syndrome; (2) identifying center and regional factors associated with better patient outcomes across several cardiovascular conditions and procedures; and (3) examining the impact of health care reform in Massachusetts on overall and disparate care and outcomes for several cardiovascular conditions and venous thromboembolism. Cross-program collaborations seek to advance the field methodologically and to develop early stage investigators committed to careers in outcomes research.
The CCOR program represents a significant expansion of the NHLBI's investment in cardiovascular outcomes research. The vision of this program is to leverage scientific rigor and cross-program collaboration to advance the science of health care delivery and outcomes beyond what any individual unit could achieve alone.
outcomes research; translation of knowledge; cross-collaboration