Recent proposals suggest that risk-stratified analyses of clinical trials be routinely performed to better enable tailoring of treatment decisions to individuals. Trial data can be stratified using externally developed risk models (e.g. Framingham risk score), but such models are not always available. We sought to determine whether internally developed risk models, developed directly on trial data, introduce bias compared to external models.
Methods and Results
We simulated a large patient population with known risk factors and outcomes. Clinical trials were then simulated by repeatedly drawing from the patient population assuming a specified relative treatment effect in the experimental arm, which either did or did not vary according to a subject's baseline risk. For each simulated trial, two internal risk models were developed: one on the control population only (internal controls only, ICO) and one on the whole trial population blinded to treatment (internal whole trial, IWT). Bias was estimated for the internal models by comparing treatment effect predictions to predictions from the external model.
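The ICO/IWT distinction can be sketched in a small simulation. All parameters below are hypothetical, and a decile event-rate lookup stands in for the fitted risk models described here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# One baseline risk factor, 1:1 randomization, constant 30% relative
# risk reduction in the experimental arm (all parameters hypothetical)
x = rng.normal(size=n)
treat = rng.integers(0, 2, size=n)
control_risk = 1 / (1 + np.exp(-(-2.0 + x)))
p_event = np.where(treat == 1, 0.7 * control_risk, control_risk)
y = rng.random(n) < p_event

def fit_risk(x_fit, y_fit):
    """Crude internal 'risk model': event rate within deciles of the
    risk factor (a stand-in for a fitted logistic model)."""
    edges = np.quantile(x_fit, np.linspace(0, 1, 11))
    idx = np.clip(np.searchsorted(edges, x_fit, side="right") - 1, 0, 9)
    rates = np.array([y_fit[idx == d].mean() for d in range(10)])
    return edges, rates

def predict(model, x_new):
    edges, rates = model
    idx = np.clip(np.searchsorted(edges, x_new, side="right") - 1, 0, 9)
    return rates[idx]

ico = fit_risk(x[treat == 0], y[treat == 0])  # internal, controls only
iwt = fit_risk(x, y)                          # internal, whole trial, blinded

def rr_by_risk_half(model):
    """Observed relative risk in the low- and high-risk halves of the
    trial when stratified by the model's predictions."""
    risk = predict(model, x)
    hi = risk > np.median(risk)
    return [
        y[g & (treat == 1)].mean() / y[g & (treat == 0)].mean()
        for g in (~hi, hi)
    ]

rr_ico = rr_by_risk_half(ico)
rr_iwt = rr_by_risk_half(iwt)
```

With a constant relative effect, both stratified estimates should hover near the true relative risk of 0.7 in each risk half; comparing the two model types' stratified estimates against an external model is the bias assessment the study describes.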
Under all treatment assumptions, internal models introduced only modest bias compared to external models. The magnitudes of these biases were slightly smaller for IWT models than for ICO models. IWT models were also slightly less sensitive to bias introduced by overfitting and less prone to falsely identifying variability in treatment effect across the risk spectrum than ICO models.
Appropriately developed internal models produce relatively unbiased estimates of treatment effect across the spectrum of risk. When estimating treatment effect, internally developed risk models using both treatment arms should, in general, be preferred to models developed on the control population.
clinical trials; modeling
Early detection and treatment of cardiovascular disease (CVD) risk factors produces significant clinical benefits, but no consensus exists on optimal screening algorithms. This study aimed to evaluate the comparative effectiveness and cost-effectiveness of staged laboratory-based and non-laboratory-based total cardiovascular disease risk assessment.
Methods and Results
We used receiver operating characteristic (ROC) curve and cost-effectiveness modeling methods to compare strategies with and without laboratory components, and using single-stage and multistage algorithms, including approaches based on Framingham risk scores (laboratory-based assessments for all individuals). Analyses were conducted using data from 5,998 adults in the Third National Health and Nutrition Examination Survey without history of CVD, using 10-year CVD death as the main outcome. A micro-simulation model projected lifetime costs, quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios (ICERs) for 60 Framingham-based, non-laboratory-based, and staged screening approaches. Across strategies the area under the ROC curve (AUC) was 0.774–0.780 in men and 0.812–0.834 in women. There were no statistically significant differences in AUC between multistage and Framingham-based approaches. In cost-effectiveness analyses, multistage strategies had ICERs of $52,000/QALY and $83,000/QALY for men and women, respectively. Single-stage/Framingham-based strategies were dominated (higher cost and lower QALYs) or had unattractive ICERs (>$300,000/QALY) compared to single-stage/non-laboratory-based and multistage approaches.
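The dominance and ICER logic used to rank screening strategies can be sketched as follows. All costs and QALYs below are hypothetical, and extended dominance is omitted for brevity:

```python
def icer_frontier(strategies):
    """Order strategies by cost, drop strictly dominated ones (at least
    as costly but no more effective than a cheaper option), and compute
    each remaining strategy's ICER against the next-cheaper one."""
    ordered = sorted(strategies.items(), key=lambda kv: kv[1][0])
    frontier = []
    for name, (cost, qalys) in ordered:
        if any(q >= qalys for _, (_, q) in frontier):
            continue  # dominated: a cheaper strategy is at least as effective
        frontier.append((name, (cost, qalys)))
    icers = {frontier[0][0]: None}  # cheapest strategy has no comparator
    for (_, (c0, q0)), (name, (c1, q1)) in zip(frontier, frontier[1:]):
        icers[name] = (c1 - c0) / (q1 - q0)
    return icers

# Hypothetical per-person lifetime costs (USD) and QALYs
strategies = {
    "no_screening": (0, 10.00),
    "non_lab_single_stage": (500, 10.05),
    "multistage": (1500, 10.07),
    "framingham_all": (2600, 10.06),  # costlier yet less effective
}
icers = icer_frontier(strategies)
```

In this toy example the Framingham-for-all strategy is dominated (higher cost, fewer QALYs than the multistage option), mirroring the pattern reported above.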
Non-laboratory-based CVD risk assessment can be useful in primary CVD prevention, as a substitute for laboratory-based assessments or as the initial component of a multistage approach. Cost-effective multistage screening strategies could avoid 25–75% of the laboratory testing used in CVD risk screening, with predictive power comparable to Framingham risk scores.
screening; statin therapy; economics; primary prevention
Rapid treatment of acute coronary syndromes (ACS) is important; causes of delay in emergency medical services (EMS) care of ACS are poorly understood.
Methods and Results
We analyzed data from the IMMEDIATE randomized controlled trial of EMS treatment of people with symptoms suggesting ACS, using hierarchical multiple regression of elapsed times. Out-of-hospital electrocardiograms were performed on 54,230 adults calling 9-1-1; 871 had presumed ACS, 303 of whom had ST-segment elevation myocardial infarction and underwent percutaneous coronary intervention. Compared to their counterparts, women, participants with diabetes, and participants without prior cardiovascular disease waited longer to call 9-1-1 (by 28 minutes, p<0.01; 10 minutes, p=0.03; and 6 minutes, p=0.02, respectively). Time from EMS arrival to electrocardiogram was longer for women (1.5 minutes, p<0.01), older individuals (1.3 minutes, p<0.01), and those without a primary complaint of chest pain (3.5 minutes, p<0.01). On-scene times were longer for women (2 minutes, p<0.01) and older individuals (2 minutes, p<0.01). Older individuals and participants presenting on weekends and at night had longer door-to-balloon times (by 10, 14, and 11 minutes, respectively; p<0.01). Women and older individuals had longer total times (medical contact to balloon inflation: 16 minutes, p=0.01, and 9 minutes, p<0.01, respectively; symptom onset to balloon inflation: 31.5 minutes for women, p=0.02).
We found delays throughout ACS care, resulting in substantial differences in total times for women and older individuals. These delays may impact outcomes; a comprehensive approach to reduce delay is needed.
acute coronary syndrome; emergency medical services; women
Women have an unexplained worse outcome after myocardial infarction (MI) compared with men in many studies. Depressive symptoms predict adverse post-MI outcomes and are more prevalent among women than men. We examined whether depressive symptoms contribute to women’s worse outcomes after MI.
Methods and Results
In a prospective multicenter study (PREMIER), 2411 MI patients (807 women) were enrolled. Depressive symptoms were assessed with the Patient Health Questionnaire. Outcomes included 1-year rehospitalization, presence of angina using the Seattle Angina Questionnaire, and 2-year mortality. Multivariable analyses were used to evaluate the association between sex and these outcomes, adjusting for clinical characteristics. The depressive symptoms score was added to the models to evaluate whether it attenuated the association between sex and outcomes. Depressive symptoms were more prevalent in women compared with men (29% versus 18.8%, P<0.001). After adjusting for demographic factors, comorbidities, and MI severity, women had a mildly higher risk of rehospitalization (hazard ratio, 1.20; 95% CI, 1.04 to 1.40), angina (odds ratio, 1.32; 95% CI, 1.00 to 1.75), and mortality (hazard ratio, 1.27; 95% CI, 0.98 to 1.64). After adding depressive symptoms to the multivariable models, the associations declined further toward the null, particularly for rehospitalization (hazard ratio, 1.14; 95% CI, 0.98 to 1.34) and angina (odds ratio, 1.22; 95% CI, 0.91 to 1.63), whereas there was little change in the estimate for mortality (hazard ratio, 1.24; 95% CI, 0.95 to 1.62). Depressive symptoms were significantly associated with each of the study outcomes, with a similar magnitude of effect in both women and men.
A higher prevalence of depressive symptoms in women modestly contributes to their higher rates of rehospitalization and angina compared with men but not mortality after MI. Our results support the recent recommendations of improving recognition of depressive symptoms after MI.
sex; depression; myocardial infarction; women
Raising the cholesterol content of HDL particles is targeted as a cardiovascular disease prevention strategy. However, HDL particles are heterogeneous in composition and structure, which may relate to differences in antiatherogenic potential. We prospectively evaluated the association of HDL subclasses, defined by a recently proposed nomenclature, with incident coronary heart disease (CHD).
Methods and Results
Baseline HDL particle concentrations were measured by nuclear magnetic resonance spectroscopy and categorized into five subclasses (very large, large, medium, small, and very small) among 26,332 initially healthy women. During a median follow-up of 17 years, 969 cases of incident CHD (myocardial infarction, revascularization, and CHD death) were ascertained. In Cox models that adjusted for age, race/ethnicity, blood pressure, smoking, postmenopausal status, and hormone therapy, associations with incident CHD were inverse (p-trend<0.0001) for concentrations of very large (hazard ratio [HR] for top versus bottom quartile 0.49, 95% confidence interval [CI] 0.41–0.60), large (0.54, 0.45–0.64), and medium (0.69, 0.58–0.83) HDL subclasses. Conversely, HRs (95% CIs) for small and very small HDL were 1.22 (1.01–1.46, p-trend=0.08) and 1.67 (1.39–2.02, p-trend<0.0001), respectively. However, after additionally adjusting for metabolic and lipoprotein variables, associations for the spectrum of large, medium, and small HDL subclasses were inverse (p-trend<0.05 for large and small, and 0.07 for medium), while subclasses at either end of the spectrum were not associated with CHD (p-trend=0.97 for very large and 0.21 for very small HDL).
In this prospective study, associations with incident CHD differed by HDL particle subclass, which may be relevant for developing HDL-modulating therapies.
coronary disease; lipids; lipoproteins; epidemiology
The Trial to Assess Chelation Therapy (TACT) showed clinical benefit of an ethylene diamine tetraacetic acid (EDTA-based) infusion regimen in patients 50 years or older with prior myocardial infarction (MI). Diabetes prior to enrollment was a pre-specified subgroup.
Methods and Results
Patients received 40 infusions of EDTA chelation or placebo; 633 (37%) had diabetes (322 EDTA, 311 placebo). EDTA reduced the primary endpoint (death, reinfarction, stroke, coronary revascularization, or hospitalization for angina) [25% vs 38%, hazard ratio (HR) 0.59, 95% confidence interval (CI) (0.44, 0.79), p<0.001] over 5 years. The result remained significant after Bonferroni adjustment for multiple subgroups [99.4% CI (0.39, 0.88), adjusted p=0.002]. All-cause mortality was reduced by EDTA chelation [10% vs 16%, HR 0.57, 95% CI (0.36, 0.88), p=0.011], as was the secondary endpoint (cardiovascular death, reinfarction, or stroke) [11% vs 17%, HR 0.60, 95% CI (0.39, 0.91), p=0.017]. After adjusting for multiple subgroups, however, those results were no longer significant. The number needed to treat to prevent one primary endpoint event was 6.5 over 5 years [95% CI (4.4, 12.7)]. There was no reduction in events in non-diabetics (n=1075, p=0.877), resulting in a treatment-by-diabetes interaction (p=0.004).
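As a worked check, the number needed to treat is the reciprocal of the absolute risk reduction. Applied to the crude 5-year event proportions quoted above it gives a slightly different value from the reported 6.5, which is based on Kaplan-Meier estimates rather than raw proportions:

```python
# NNT = 1 / absolute risk reduction, using the crude 5-year proportions
# quoted above (25% treated vs 38% placebo); the trial's reported NNT of
# 6.5 comes from Kaplan-Meier estimates, so the two differ slightly.
control_rate, treated_rate = 0.38, 0.25
arr = control_rate - treated_rate   # absolute risk reduction = 0.13
nnt = 1 / arr                       # about 7.7 from the crude proportions
```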
Post-MI diabetic patients age 50 or older demonstrated a marked reduction in cardiovascular events with EDTA chelation. These findings support efforts to replicate the results and to define the mechanisms of benefit. They do not, however, constitute sufficient evidence to indicate the routine use of chelation therapy for all post-MI diabetic patients.
myocardial infarction; diabetes mellitus; secondary prevention
Although survival after in-hospital cardiac arrest is likely to vary among hospitals caring for children, validated methods to risk-standardize pediatric survival rates across sites do not currently exist.
Methods and Results
Within the American Heart Association’s Get With the Guidelines-Resuscitation registry for in-hospital cardiac arrest, we identified 1,551 cardiac arrests in children (<18 years) from 2006 to 2010. Using multivariable hierarchical logistic regression, we developed and validated a model to predict survival to hospital discharge and calculated risk-standardized rates of cardiac arrest survival for hospitals with a minimum of 10 pediatric cardiac arrest cases. A total of 13 patient-level predictors were identified: age, sex, cardiac arrest rhythm, location of arrest, mechanical ventilation, acute non-stroke neurologic event, major trauma, hypotension, metabolic or electrolyte abnormalities, renal insufficiency, sepsis, illness category, and need for intravenous vasoactive agents prior to the arrest. The model had good discrimination (C-statistic of 0.71), confirmed by bootstrap validation (validation C-statistic of 0.69). Among 30 hospitals with at least 10 cardiac arrests, unadjusted hospital survival rates varied considerably (median, 37%; inter-quartile range [IQR]: 24%–42%; range: 0%–61%). After risk-standardization, the range of hospital survival rates narrowed (median, 37%; IQR: 33%–38%; range: 29%–48%), but variation in survival persisted.
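The risk-standardization step can be illustrated with the common observed/expected simplification, in which each hospital's observed survivor count is divided by the count predicted for its case mix and scaled by the overall rate. All patient-level numbers below are hypothetical:

```python
# Risk-standardized survival via the observed/expected ratio, a common
# simplification of the hierarchical-model approach: each hospital's
# observed survivor count is compared with the count predicted by the
# patient-level model for its case mix. All numbers are hypothetical.
overall_rate = 0.37  # registry-wide survival

hospitals = {
    # name: (model-predicted survival probability per patient,
    #        observed survival indicator per patient)
    "sicker_casemix":    ([0.20, 0.25, 0.30, 0.25], [1, 0, 0, 0]),
    "healthier_casemix": ([0.50, 0.50, 0.50, 0.50], [1, 1, 0, 0]),
}

def risk_standardized_rate(pred, obs, overall):
    """overall * (observed survivors / expected survivors)."""
    return overall * sum(obs) / sum(pred)

unadjusted = {h: sum(obs) / len(obs) for h, (_, obs) in hospitals.items()}
standardized = {
    h: risk_standardized_rate(pred, obs, overall_rate)
    for h, (pred, obs) in hospitals.items()
}
# Unadjusted rates differ (0.25 vs 0.50), but both hospitals perform
# exactly as predicted for their case mix, so their risk-standardized
# rates coincide at the overall rate of 0.37.
```

This is why the range of hospital rates narrows after standardization: differences explained by case mix are removed, and only residual performance differences remain.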
Using a national registry, we developed and validated a model to predict survival after in-hospital cardiac arrest in children. After risk-standardization, significant variation in survival rates across hospitals remained. Leveraging these models, future studies can identify best practices at high-performing hospitals to improve survival outcomes for pediatric cardiac arrest.
pediatrics; heart arrest; cardiopulmonary resuscitation; risk model
Endovascular aortic aneurysm repair (EVAR) is often offered to patients with abdominal aortic aneurysms (AAAs) considered preoperatively to be unfit for open AAA repair (oAAA). This study describes the short- and long-term outcomes of patients undergoing EVAR with AAAs <6.5 cm who are considered unfit for oAAA.
Methods and Results
We analyzed elective EVARs for AAAs <6.5 cm diameter in the Vascular Study Group of New England (2003–2011). Patients were designated as fit or unfit for oAAA by the treating surgeon. End points included in-hospital major adverse events and long-term mortality. We identified patient characteristics associated with being unfit for open repair and predictors of survival using multivariable analyses. Of 1653 EVARs, 309 (18.7%) patients were deemed unfit for oAAA. These patients were more likely to have advanced age, cardiac disease, chronic obstructive pulmonary disease, and larger aneurysms at the time of repair (54 versus 56 mm, P=0.001). Patients unfit for oAAA had higher rates of cardiac (7.8% versus 3.1%, P<0.01) and pulmonary (3.6% versus 1.6%, P<0.01) complications and worse survival at 5 years (61% versus 80%; log-rank P<0.01) compared with those deemed fit for oAAA. Finally, patients designated as unfit for oAAA had worse survival even after adjusting for patient characteristics and aneurysm size (hazard ratio, 1.6; 95% confidence interval, 1.2–2.2; P<0.01).
In patients with AAAs <6.5 cm, designation by the operating surgeon as unfit for oAAA provides insight into both short- and long-term efficacy of EVAR. Patients unable to tolerate oAAA may not benefit from EVAR unless their risk of AAA rupture is very high.
aneurysm; complications; mortality
Transcatheter aortic valve replacement (TAVR) has emerged as a less invasive option for valve replacement for patients with severe aortic stenosis. Although it has been recommended that TAVR not be offered to patients who will not improve functionally or derive meaningful survival benefit from the procedure, no guidance exists on how best to identify such patients. The first step in this process is to define a poor outcome that can then be used as a foundation for subsequent case identification. We sought to evaluate potential definitions of a poor outcome after TAVR that combine both mortality and quality of life (QoL) components.
Methods and Results
Using data from 463 patients who underwent TAVR as part of the Placement of AoRTic TraNscathetER Valve (PARTNER) Trial, we evaluated 6-month mortality and QoL outcomes using the Kansas City Cardiomyopathy Questionnaire (KCCQ) to explore potential definitions of a poor outcome. We then compared the strengths and weaknesses of each potential definition by examining the relationship between baseline and 6-month KCCQ scores for each patient. Based on these analyses, we argue that the most appropriate definition of a poor outcome after TAVR is either (1) death, (2) KCCQ overall summary score <45, or (3) KCCQ decrease of ≥10 points, which best reflects a failure to achieve the therapeutic goals of TAVR.
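The proposed composite can be written as a small classifier, a direct restatement of the three criteria above:

```python
def poor_outcome(alive, kccq_baseline, kccq_6mo):
    """Composite poor outcome after TAVR per the definition proposed
    above: death, a 6-month KCCQ overall summary score below 45, or a
    decline of 10 or more points from baseline."""
    if not alive:
        return True
    return kccq_6mo < 45 or (kccq_baseline - kccq_6mo) >= 10
```

For example, a patient alive at 6 months with a KCCQ of 50 after a baseline of 70 has declined 20 points and is classified as a poor outcome despite an acceptable absolute score; the decline criterion captures failures of the therapy that an absolute threshold alone would miss.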
Using empiric data on a large number of patients enrolled in the PARTNER trial, we propose a definition for poor outcome after TAVR that combines both mortality and QoL measures into a single composite endpoint. Use of this endpoint (or other similar endpoints) in future studies can facilitate development of predictive models that may be useful to identify patients who are poor candidates for TAVR and to provide such patients and their families with appropriate expectations of functional recovery after TAVR.
aortic valve stenosis; quality of life; transcatheter aortic valve; valve
Computed tomographic (CT) scans are central diagnostic tests for ischemic stroke. Their inefficient use is a negative quality measure tracked by the Centers for Medicare and Medicaid Services.
Methods and Results
We performed a retrospective analysis of Medicare fee-for-service claims data for adults admitted for ischemic stroke from 2008 to 2009, with 1-year follow-up. The outcome measures were risk-adjusted rates of high-intensity CT use (≥4 head CT scans) and risk- and price-adjusted Medicare expenditures in the year after admission. The average number of head CT scans in the year after admission, for the 327,521 study patients, was 1.94, whereas 11.9% had ≥4. Risk-adjusted rates of high-intensity CT use ranged from 4.6% (Napa, CA) to 20.0% (East Long Island, NY). These rates were 2.6% higher for blacks than for whites (95% confidence interval, 2.1%–3.1%), with considerable regional variation. Higher fragmentation of care (number of different doctors seen) was associated with high-intensity CT use. Patients living in the top quintile regions of fragmentation experienced a 5.9% higher rate of high-intensity CT use, with the lowest quintile as reference; the corresponding odds ratio was 1.77 (95% confidence interval, 1.71–1.83). Similarly, 1-year risk- and price-adjusted expenditures exhibited considerable regional variation, ranging from $31,175 (Salem, MA) to $61,895 (McAllen, TX). Regional rates of high-intensity CT scans were positively associated with 1-year expenditures (r=0.56; P<0.01).
Rates of high-intensity CT use for patients with ischemic stroke reflect wide variation in practice patterns across regions and races. Medicare expenditures parallel these disparities. Fragmentation of care is associated with high-intensity CT use.
Medicare; multidetector computed tomography; stroke
The prevalence of hypertension among collegiate football athletes is not well described.
Methods and Results
A retrospective cohort of all male athletes who participated in varsity athletics at a National Collegiate Athletic Association Division I university between 1999 and 2012 was examined through chart review. Mandatory annual preparticipation physical examinations included blood pressure, body mass index, medication use, and supplement use. Prevalence of hypertension was compared between football and non-football athletes. A mixed-effects linear regression model examined change in blood pressure over time. A total of 636 collegiate athletes, including 323 football players, were identified. In the initial year of athletic participation, 19.2% of football athletes had hypertension and 61.9% had prehypertension. The prevalence of hypertension was higher among football athletes than non-football athletes in their initial (19.2% vs. 7.0%, P<0.001) and final (19.2% vs. 10.2%, P=0.001) years of athletic participation. In adjusted analyses, the odds of hypertension were higher among football athletes in the initial year (AOR 2.28, 95% CI 1.21 to 4.30) but not the final year (AOR 1.25, 95% CI 0.69 to 2.28). Over the course of their collegiate career, football athletes had an annual decrease in systolic blood pressure (−0.82 mmHg, P=0.002), while non-football athletes did not (0.18 mmHg, P=0.58).
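The classification underlying these prevalence estimates can be sketched as follows, assuming JNC7-style cut-points (≥140/90 mmHg for hypertension, ≥120/80 mmHg for prehypertension); the study's exact criteria may differ:

```python
def bp_category(systolic, diastolic):
    """JNC7-style blood pressure classification (cut-points assumed
    here; the study's exact criteria may differ)."""
    if systolic >= 140 or diastolic >= 90:
        return "hypertension"
    if systolic >= 120 or diastolic >= 80:
        return "prehypertension"
    return "normal"
```

Note that either component alone can push a reading into a higher category, which is why prehypertension prevalence in young, muscular cohorts is often driven by systolic values.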
Hypertension and prehypertension were common among collegiate football athletes, and football athletes were more likely to have hypertension than male non-football athletes. This presents a potential cardiovascular risk in a young population of athletes. Strategies for increasing awareness, prevention and treatment are needed.
blood pressure; epidemiology; hypertension; physical exercise; athletes
Cardiovascular diseases are rising as a cause of death and disability in China. To improve outcomes for patients with these conditions, the Chinese government, academic researchers, clinicians, and more than 200 hospitals have created the China Patient-centered Evaluative Assessment of Cardiac Events (China PEACE), a national network for research and performance improvement. The first study from China PEACE, the Retrospective Study of Acute Myocardial Infarction (China PEACE-Retrospective AMI Study), is designed to promote improvements in the quality of care for acute myocardial infarction (AMI) by generating knowledge about the characteristics, treatments, and outcomes of patients hospitalized with AMI across a representative sample of Chinese hospitals over the last decade.
Methods and Results
The China PEACE-Retrospective AMI Study will examine more than 18,000 patient records from 162 hospitals identified using a 2-stage cluster sampling design within economic-geographic regions. Records were chosen from 2001, 2006, and 2011 to identify temporal trends. Data quality will be monitored by a central coordinating center, with particular attention to case ascertainment, data abstraction, and data management. Analyses will examine patient characteristics, diagnostic testing patterns, in-hospital treatments, in-hospital outcomes, and variation in results by time and site of care. In addition to publications, data will be shared with participating hospitals and the Chinese government to develop strategies to promote quality improvement.
The China PEACE-Retrospective AMI Study is the first to leverage the China PEACE platform to better understand AMI across representative sites of care and over the last decade in China. The China PEACE collaboration between government, academicians, clinicians and hospitals is poised to translate research about trends and patterns of AMI practices and outcomes into improved care for patients.
myocardial infarction; epidemiology; morbidity; mortality
Patients with heart failure (HF) are typically designated as having reduced or preserved ejection fraction (HFREF, HFPEF) because of the importance of left ventricular ejection fraction (LVEF) on therapeutic decisions and prognosis. Such designations are not necessarily static, yet few data exist to describe the natural history of LVEF over time.
Methods and Results
We identified 2413 patients from Kaiser Permanente Colorado with a primary discharge diagnosis of HF between January 1, 2001, and December 31, 2008, who had ≥2 LVEF measurements separated by ≥30 days. We used multi-state Markov modeling to examine transitions among HFREF, HFPEF, and death. We observed a total of 8183 transitions. Women were more likely than men to transition from HFREF to HFPEF (hazard ratio, 1.85; 95% confidence interval, 1.38–2.47). Patients who were adherent to β-blockers were more likely to transition from HFREF to HFPEF (hazard ratio, 1.53; 95% confidence interval, 1.10–2.13) compared with patients who were nonadherent to β-blockers, whereas angiotensin-converting enzyme or angiotensin II receptor blocker adherence was not associated with LVEF transitions. Patients who had a previous myocardial infarction were more likely to transition from HFPEF to HFREF (hazard ratio, 1.75; 95% confidence interval, 1.26–2.42).
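A discrete-time sketch of the three-state transition structure (HFREF, HFPEF, death) illustrates the Markov machinery; the per-interval probabilities below are hypothetical, and the study itself used continuous-time multi-state modeling:

```python
import numpy as np

# Discrete-time sketch of the three-state model (HFREF, HFPEF, death);
# the per-interval transition probabilities below are hypothetical
P = np.array([
    [0.80, 0.12, 0.08],  # from HFREF: stay, move to HFPEF, die
    [0.10, 0.82, 0.08],  # from HFPEF: move to HFREF, stay, die
    [0.00, 0.00, 1.00],  # death is absorbing
])
start = np.array([1.0, 0.0, 0.0])            # cohort begins in HFREF
dist_5 = start @ np.linalg.matrix_power(P, 5)  # state distribution after 5 intervals
```

In the real analysis, covariates such as sex, β-blocker adherence, and prior myocardial infarction modify the transition intensities, which is how the hazard ratios reported above are obtained.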
In this cohort of patients with HF, LVEF is a dynamic factor related to sex, coexisting conditions, and drug therapy. These findings have implications for left ventricular systolic function ascertainment in patients with HF and support evidence-based therapy use, especially β-blockers.
heart failure; outcomes assessment; prognosis; ventricular ejection fraction
The Walking Impairment Questionnaire (WIQ) is a subjective measure of patient-reported walking performance developed for peripheral arterial disease. The purpose of this study is to examine whether this simple tool can improve the predictive capacity of established risk models and whether the WIQ can be used in patients without peripheral arterial disease.
Methods and Results
At baseline, we assessed the walking distance, stair-climbing, and walking speed WIQ category scores among individuals who were undergoing coronary angiography. During a median follow-up of 5.0 years, 172 deaths occurred among 1417 study participants. Adjusted Cox proportional hazards models showed that all 3 WIQ categories independently predicted future all-cause and cardiovascular mortality, including among individuals without peripheral arterial disease (P<0.001). Compared with the cardiovascular risk factors model, we observed significantly increased risk discrimination, with a C-index of 0.741 (change in C-index, 0.040; 95% confidence interval, 0.011–0.068) and 0.832 (change in C-index, 0.080; 95% confidence interval, 0.034–0.126) for all-cause and cardiovascular mortality, respectively. Examination of risk reclassification using the net reclassification improvement index showed a 48.4% (P<0.001) improvement for all-cause mortality and a 77.4% (P<0.001) improvement for cardiovascular mortality compared with the cardiovascular risk factors model.
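The change in C-index reported above quantifies added discrimination. For a binary outcome, the C-statistic reduces to the area under the ROC curve, which can be computed by pairwise concordance:

```python
def c_statistic(scores, outcomes):
    """Concordance for a binary outcome (equivalent to ROC AUC): the
    fraction of case/non-case pairs in which the case received the
    higher risk score, with ties counted as half."""
    cases = [s for s, y in zip(scores, outcomes) if y]
    noncases = [s for s, y in zip(scores, outcomes) if not y]
    wins = sum(
        1.0 if c > n else 0.5 if c == n else 0.0
        for c in cases for n in noncases
    )
    return wins / (len(cases) * len(noncases))
```

A value of 0.5 means the model ranks cases no better than chance; the 0.741 and 0.832 figures above indicate good to strong discrimination. (For censored survival data, Harrell's C restricts the comparison to usable pairs, but the pairwise logic is the same.)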
All 3 WIQ categories independently predicted future all-cause and cardiovascular mortality. Importantly, we found that this subjective measure of walking ability could be extended to patients without peripheral arterial disease. The addition of the WIQ scores to established cardiovascular risk models significantly improved risk discrimination and reclassification, suggesting broad clinical use for this simple, inexpensive test.
high-risk populations; mortality; peripheral artery disease; risk factors
The cost-effectiveness of the optimal use of hospital-based acute myocardial infarction (AMI) treatments and their potential impact on coronary heart disease (CHD) mortality in China is not well known.
Methods and Results
The effectiveness and costs of optimal use of hospital-based AMI treatments were estimated by the CHD Policy Model-China, a Markov-style computer simulation model. Changes in simulated AMI, CHD mortality, quality-adjusted life years, and total healthcare costs were the outcomes. The incremental cost-effectiveness ratio was used to assess projected cost-effectiveness. Optimal use of 4 oral drugs (aspirin, β-blockers, statins, and angiotensin-converting enzyme inhibitors) in all eligible patients with AMI, or of unfractionated heparin in non–ST-segment–elevation myocardial infarction, was a highly cost-effective strategy (incremental cost-effectiveness ratios approximately US $3,100 or less). Optimal use of reperfusion therapies in eligible patients with ST-segment–elevation myocardial infarction was moderately cost-effective (incremental cost-effectiveness ratio ≤$10,700). Optimal use of clopidogrel for all eligible patients with AMI, or of primary percutaneous coronary intervention among high-risk patients with non–ST-segment–elevation myocardial infarction in tertiary hospitals alone, was less cost-effective. Use of all the selected hospital-based AMI treatment strategies together would be cost-effective and reduce the total CHD mortality rate in China by ≈9.6%.
Optimal use of most standard hospital-based AMI treatment strategies, especially combined strategies, would be cost effective in China. However, because so many AMI deaths occur outside of the hospital in China, the overall impact on preventing CHD deaths was projected to be modest.
cost-benefit analysis; myocardial infarction; quality-adjusted life years; therapy
Cadmium has been associated with peripheral arterial disease in cross-sectional studies but prospective evidence is lacking. Our goal was to evaluate the association of urine cadmium concentrations with incident peripheral arterial disease in a large population-based cohort.
Methods and Results
A prospective cohort study was performed among 2,864 adult American Indians 45-74 years old from Arizona, Oklahoma, and North and South Dakota who participated in the Strong Heart Study in 1989-1991 and were followed through two follow-up examination visits in 1993-1995 and 1997-1999. Participants were free of peripheral arterial disease, defined as an ankle brachial index <0.9 or >1.4, at baseline and had complete baseline information on urine cadmium, potential confounders, and ankle brachial index determinations in the follow-up examinations. Urine cadmium was measured using inductively coupled plasma mass spectrometry (ICPMS) and corrected for urinary dilution by normalization to urine creatinine. Multivariable-adjusted hazard ratios (HR) were computed using Cox proportional hazards models for interval-censored data. A total of 470 cases of incident peripheral arterial disease were identified. After adjustment for cardiovascular disease risk factors including smoking status and pack-years, the hazard ratio comparing the 80th to the 20th percentile of urine cadmium concentrations was 1.41 (95% CI, 1.05, 1.81). The hazard ratio comparing the highest to the lowest tertile was 1.96 (95% CI, 1.32, 2.81). The associations persisted after excluding cases defined solely by an ankle brachial index >1.4, as well as in subgroups defined by sex and smoking status.
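A hazard ratio contrasting the 80th and 20th percentiles of exposure follows directly from the Cox coefficient: HR = exp(β(c80 − c20)). A worked example with hypothetical numbers:

```python
import math

# HR comparing the 80th to the 20th percentile of exposure is
# exp(beta * (c80 - c20)), where beta is the Cox coefficient per unit
# of the (log-transformed) exposure. All numbers below are hypothetical.
beta = 0.25                  # per doubling of urine cadmium (hypothetical)
p20, p80 = 0.4, 1.6          # hypothetical percentile values, ug/g creatinine
hr_80_vs_20 = math.exp(beta * (math.log2(p80) - math.log2(p20)))
```

Here the 80th percentile is two doublings above the 20th, so the contrast compounds the per-doubling coefficient twice; this is why percentile contrasts summarize skewed biomarker exposures more interpretably than per-unit hazard ratios.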
Urine cadmium, a biomarker of long-term cadmium exposure, was independently associated with incident peripheral arterial disease, providing further support for cadmium as a cardiovascular disease risk factor.
Cadmium; peripheral arterial disease; prospective cohort study; Strong Heart Study
While older patients frequently undergo percutaneous coronary interventions (PCI), frailty, comorbidity, and quality of life (QOL) are seldom part of risk prediction approaches. We assessed their incremental prognostic value over and above the risk factors in the Mayo Clinic risk score (MCRS).
Methods and Results
Patients ≥65 years who underwent PCI were assessed for frailty (Fried criteria), comorbidity (Charlson index), and QOL (SF-36). Of the 628 patients discharged [median follow-up, 35.0 months (IQR, 22.7-42.9)], 78 died and 72 had an MI. Three-year mortality was 28% for frail patients versus 6% for non-frail patients; the corresponding 3-year rates of death or MI were 41% and 17%. After adjustment, frailty [hazard ratio (HR), 4.19; 95% confidence interval (CI), 1.85-9.51], the physical component score of the SF-36 (HR, 1.59; 95% CI, 1.24-2.02), and comorbidity (HR, 1.10; 95% CI, 1.05-1.16) were associated with mortality. Frailty was also associated with mortality/MI (HR, 2.61; 95% CI, 1.52-4.50). Models with the conventional MCRS had C-statistics of 0.628 for mortality and 0.573 for mortality/MI. Adding frailty, QOL, or comorbidity individually raised the C-statistic to 0.675, 0.694, and 0.671 for mortality, and to 0.607, 0.587, and 0.576 for mortality/MI, respectively. Including frailty, comorbidity, and the SF-36 together conferred a discernible improvement in predicting death and death/MI (integrated discrimination improvement, 0.027 and 0.016; net reclassification improvement, 43% and 18%, respectively).
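The two-category net reclassification improvement reported above can be computed as the net proportion of events reclassified upward across a risk threshold plus the net proportion of non-events reclassified downward. The 10% threshold and all risks below are hypothetical:

```python
def nri(old_risk, new_risk, outcomes, threshold=0.1):
    """Two-category net reclassification improvement: net fraction of
    events moved above the threshold plus net fraction of non-events
    moved below it (threshold hypothetical)."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, y in zip(old_risk, new_risk, outcomes):
        up = old < threshold <= new      # reclassified upward
        down = new < threshold <= old    # reclassified downward
        if y:
            n_e += 1
            up_e += up
            down_e += down
        else:
            n_ne += 1
            up_ne += up
            down_ne += down
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Hypothetical predicted risks before and after adding frailty/QOL/comorbidity
old_risk = [0.05, 0.05, 0.15, 0.05, 0.15, 0.15]
new_risk = [0.15, 0.05, 0.15, 0.05, 0.05, 0.15]
outcomes = [1, 1, 1, 0, 0, 0]
nri_value = nri(old_risk, new_risk, outcomes)
```

In this toy data, one of three events moves up and one of three non-events moves down, giving an NRI of 1/3 + 1/3 = 2/3; correct reclassification in either direction contributes positively.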
Following PCI, frailty, comorbidity, and poor QOL are prevalent and are associated with adverse long-term outcomes. Their inclusion improves the discriminatory ability of the MCRS, which is derived from routine cardiovascular risk factors.
aging; angioplasty; coronary disease
Residents who live in neighborhoods that are primarily African-American, Latino, or poor are more likely to have an out-of-hospital cardiac arrest (OHCA), less likely to receive cardiopulmonary resuscitation (CPR), and less likely to survive. No prior studies have been conducted to understand the contributing factors that may decrease the likelihood of residents learning and performing CPR in these neighborhoods. The goal of this study was to identify barriers and facilitators to learning and performing CPR in three low-income, “high-risk” predominantly African American, neighborhoods in Columbus, Ohio.
Methods and Results
Community-Based Participatory Research (CBPR) approaches were used to develop and conduct six focus groups in conjunction with community partners in three target high-risk neighborhoods in Columbus, Ohio, in January-February 2011. Snowball and purposeful sampling, performed by community liaisons, were used to recruit participants. Three reviewers analyzed the data in an iterative process to identify recurrent and unifying themes. Three major barriers to learning CPR were identified: financial, informational, and motivational factors. Four major barriers to performing CPR were identified: fear of legal consequences, emotional issues, lack of knowledge, and situational concerns. Participants suggested that family/self-preservation, emotional, and economic factors may serve as potential facilitators in increasing the provision of bystander CPR.
The financial cost of CPR training, lack of information, and the fear of risking one's own life must be addressed when designing a community-based CPR educational program. Using data from the community can facilitate improved design and implementation of CPR programs.
heart arrest; CPR; sudden death
Coronary computed tomography angiography (cCTA) allows for rapid non-invasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency room with acute chest pain will lead to increased downstream testing and costs compared to alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computed Tomography (ROMICAT I) study.
Methods and Results
We compared the cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I trial with the projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results, as patients and caregivers were blinded to them. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs.
We determined that, in comparison to UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23% (p < 0.001). However, as the prevalence of obstructive CAD increases, index hospitalization cost increases, such that when the prevalence of ≥50% stenosis exceeds 28-33%, the use of cCTA becomes more costly than UC.
cCTA may be a cost saving tool in acute chest pain populations that have a prevalence of potentially obstructive CAD lower than 30%. However, increased cost would be anticipated in populations with higher prevalence of disease.
coronary CT angiography; chest pain; acute coronary syndrome; economics
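The break-even reasoning in the cost comparison above can be sketched as a small linear model: the expected cost of a cCTA-first strategy rises with the prevalence of obstructive CAD (positive scans trigger downstream testing), while usual care is modeled as a flat per-patient cost. All dollar figures and the linear form are invented for illustration; they are not the ROMICAT I estimates.

```python
def ccta_expected_cost(prev, cost_ccta=400.0, cost_downstream=3000.0,
                       cost_discharge=200.0):
    """Per-patient expected cost of a cCTA-first strategy: every patient
    is scanned; a fraction `prev` has obstructive CAD and incurs
    downstream testing, while the remainder are discharged after the scan.
    All cost parameters are hypothetical."""
    return cost_ccta + prev * cost_downstream + (1.0 - prev) * cost_discharge

def break_even_prevalence(cost_usual_care, cost_ccta=400.0,
                          cost_downstream=3000.0, cost_discharge=200.0):
    """Prevalence at which the cCTA strategy's expected cost equals a
    flat per-patient usual-care cost (solving the linear equation for p)."""
    return ((cost_usual_care - cost_ccta - cost_discharge)
            / (cost_downstream - cost_discharge))
```

Below the break-even prevalence the cCTA-first strategy is the cheaper one; above it, usual care is, which mirrors the 28-33% threshold behavior reported above.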
Cognitive impairment (CI), highly prevalent in patients with heart failure (HF), increases risk for hospitalization, and mortality. However, the course of cognitive change in HF is not well characterized. The purpose of this systematic review was to examine the available evidence regarding longitudinal changes in cognitive function in patients with HF.
Methods and Results
A literature search of several electronic databases was performed. Studies published from January 1, 1980 to September 30, 2012 that used validated measures to diagnose HF and assessed cognitive function two or more times in adults with HF were eligible for inclusion. Change in cognitive function was examined in the context of the HF treatments applied (e.g., medication initiation, left ventricular assist device implantation), length of follow-up, and comparison group. Fifteen studies met eligibility criteria. Significant decline in cognitive function was noted among patients with HF followed for >1 year. Improvements in cognition were observed among patients with HF undergoing interventions to improve cardiac function (e.g., heart transplant) and among patients examined over short periods (<1 year). Studies comparing HF patients to their own baseline tended to report improvements, while studies using a comparison group without HF tended to report declines or stability in cognition over time among patients with HF.
Patients with HF are at increased risk for cognitive decline but this risk appears to be modifiable with cardiac treatment. Further research is needed to identify the mechanisms that cause cognitive change in HF.
heart failure; cognition; epidemiology
In COURAGE, some stable ischemic heart disease (SIHD) patients randomized to optimal medical therapy (OMT) crossed over to early revascularization. The predictors and outcomes of patients who crossed over from OMT to revascularization are unknown.
Methods and Results
We compared characteristics of OMT patients who did and did not undergo revascularization within 12 months and created a Cox regression model to identify predictors of early revascularization. Patients' health status was measured with the Seattle Angina Questionnaire (SAQ). To quantify the potential consequences of initiating OMT without percutaneous coronary intervention (PCI), we compared the outcomes of crossover patients with a matched cohort randomized to immediate PCI. Among 1148 patients randomized to OMT, 185 (16.1%) underwent early revascularization. Patient characteristics independently associated with early revascularization were worse baseline SAQ scores and healthcare system. Among 156 OMT patients undergoing early revascularization matched to 156 patients randomized to PCI, rates of mortality (HR, 0.51; 95% CI, 0.13-2.1) and nonfatal MI (HR, 1.9; 95% CI, 0.75-4.6) were similar, as were 1-year SAQ scores. OMT patients, however, experienced worse health status over the initial year of treatment and more unstable angina admissions (HR, 2.8; 95% CI, 1.1-7.5).
Among COURAGE patients assigned to OMT alone, patients' angina, dissatisfaction with their current treatment, and, to a lesser extent, their health system were associated with early revascularization. Since early crossover was not associated with an increase in irreversible ischemic events or impaired 12-month health status, these findings support an initial trial of OMT in SIHD, with close follow-up of the most symptomatic patients.
Angina; Chronic Coronary Artery Disease; Percutaneous Coronary Interventions; Clinical Trials; Quality of Life
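As a rough illustration of how matched event-rate comparisons like those above are quantified: under a constant-hazard (exponential) assumption, a crude hazard ratio reduces to a ratio of incidence rates, with an approximate Wald interval on the log scale. The numbers and function names below are invented for illustration; the study itself used Cox regression on matched cohorts.

```python
import math

def incidence_rate(events, person_years):
    """Events per unit of person-time."""
    return events / person_years

def crude_hazard_ratio(events_a, py_a, events_b, py_b):
    """Incidence-rate ratio; equals the hazard ratio when the hazard
    is constant over follow-up in both groups."""
    return incidence_rate(events_a, py_a) / incidence_rate(events_b, py_b)

def rate_ratio_ci(events_a, py_a, events_b, py_b, z=1.96):
    """Approximate 95% Wald CI for the rate ratio, using
    SE(log RR) = sqrt(1/events_a + 1/events_b)."""
    rr = crude_hazard_ratio(events_a, py_a, events_b, py_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return (rr * math.exp(-z * se), rr * math.exp(z * se))
```

For example, 10 events over 500 person-years versus 5 events over 500 person-years gives a crude hazard ratio of 2.0; the interval is symmetric around it on the log scale.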
Although readmission following hospitalization for heart failure (HF) has received increasing attention, little is known about its root causes. Prior investigations have relied on administrative databases, chart review, and single-question surveys.
Methods and Results
We performed semi-structured, 30- to 60-minute interviews with 28 patients readmitted within 6 months of an index HF admission. Established qualitative approaches were used to analyze and interpret the data. Interview findings were the primary focus of the study, but patient information and provider comments from chart data were also consulted. Patient median age was 61 years, 29% were non-white, 50% were married, 32% had preserved ejection fraction, and median time from discharge to readmission was 31 days. Reasons for readmission were multi-factorial and not easily categorized into mutually exclusive reasons. Five themes emerged as cited reasons for hospital readmission: distressing symptoms, unavoidable progression of illness, influence of psychosocial factors, good but imperfect self-care adherence, and health system failures.
Our study provides the first systematic qualitative assessment of patient perspectives regarding HF readmission. Contrary to prior literature and distinct from what we found documented in the medical record, patient experiences were highly heterogeneous, not easily categorized as preventable versus not preventable, and not easily attributed to a single cause. These findings suggest that future interventions designed to reduce HF readmissions should be multi-faceted, systemic in nature, and integrate patient input.
heart failure; patient centered care; patient readmission; qualitative research; systems of care