To determine the hospitalization rates and outcomes of endocarditis among older adults.
Endocarditis is the most serious cardiovascular infection and is especially common among older adults. Little is known about recent trends for endocarditis hospitalizations and outcomes.
Using Medicare inpatient Standard Analytic Files, we identified all Fee-For-Service beneficiaries aged ≥65 years with a principal or secondary diagnosis of endocarditis from 1999-2010. We used Medicare Denominator Files to report hospitalizations per 100,000 person-years. Rates of 30-day and 1-year mortality were calculated using Vital Status Files. We used mixed-effects models to calculate adjusted rates of hospitalization and mortality and to compare the results before and after 2007, when the American Heart Association revised recommendations for endocarditis prophylaxis.
Overall, 262,658 beneficiaries were hospitalized with endocarditis. The adjusted hospitalization rate increased from 1999 to 2005, reaching 83.5 per 100,000 person-years in 2005, and declined during 2006-2007. After 2007, the decline continued, reaching 70.6 per 100,000 person-years in 2010. Adjusted 30-day and 1-year mortality rates ranged from 14.2% to 16.5% and from 32.6% to 36.2%, respectively. There were no consistent changes in adjusted rates of 30-day and 1-year mortality after 2007. Trends in rates of hospitalization and outcomes were consistent across demographic subgroups. Adjusted rates of hospitalization and mortality declined consistently in the subgroup with a principal diagnosis of endocarditis.
Our study highlights the high burden of endocarditis among older adults. We did not observe an increase in adjusted rates of hospitalization or mortality associated with endocarditis after publication of the 2007 guidelines.
Endocarditis; prophylaxis; guidelines; hospitalizations; mortality
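The hospitalization rates above are simple person-time rates. A minimal sketch of the calculation, using the 2010 figure from the abstract and a hypothetical person-year denominator (the abstract does not report denominators):

```python
def rate_per_100k(events: int, person_years: float) -> float:
    """Hospitalization rate per 100,000 person-years."""
    return events / person_years * 100_000

# Hypothetical denominator: 25 million Fee-For-Service person-years in 2010.
# An adjusted rate of 70.6 per 100,000 person-years would then correspond to:
events = round(70.6 / 100_000 * 25_000_000)  # about 17,650 hospitalizations
print(rate_per_100k(events, 25_000_000))  # 70.6
```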
To investigate the modulatory effect of the Coxsackie and adenovirus receptor (CAR) on ventricular conduction and arrhythmia vulnerability in the setting of myocardial ischemia.
A heritable component of the risk for ventricular fibrillation (VF) during myocardial infarction (MI) is well established. A recent genome-wide association study (GWAS) of VF during acute MI identified a locus on chromosome 21q21 (rs2824292) in the vicinity of the CXADR gene. CXADR encodes CAR, a cell adhesion molecule predominantly located at the intercalated discs of cardiomyocytes.
The correlation between CAR transcript levels and rs2824292 genotype was investigated in human left ventricular samples. Electrophysiological studies and molecular analyses were performed in CAR haploinsufficient (CAR+/−) mice.
In human left ventricular samples, the risk allele at the chr21q21 GWAS locus was associated with lower CXADR mRNA levels, suggesting that decreased cardiac levels of CAR predispose to ischemia-induced VF. Hearts from CAR+/− mice displayed ventricular conduction slowing, in addition to an earlier onset of ventricular arrhythmias during the early phase of acute myocardial ischemia following left anterior descending (LAD) coronary artery ligation. Connexin43 expression and distribution were unaffected, but CAR+/− hearts displayed increased arrhythmia susceptibility upon pharmacological electrical uncoupling. Patch-clamp analysis of isolated CAR+/− myocytes showed reduced sodium current magnitude specifically at the intercalated disc. Moreover, CAR co-precipitated with NaV1.5 in vitro, suggesting that CAR affects sodium channel function through a physical interaction with NaV1.5.
We identify CAR as a novel modifier of ventricular conduction and arrhythmia vulnerability in the setting of myocardial ischemia. Genetic determinants of arrhythmia susceptibility (such as CAR) may constitute future targets for risk stratification of potentially lethal ventricular arrhythmias in patients with coronary artery disease.
arrhythmia; ventricular fibrillation; ischemia; single nucleotide polymorphism genetics; ion channels
To assess the prognostic utility of lipoprotein (a) [Lp(a)] in individuals with coronary artery disease (CAD).
Data regarding an association between Lp(a) and cardiovascular (CV) risk in secondary prevention populations are sparse.
Plasma Lp(a) was measured in 6762 subjects with CAD from three studies; data were then combined with eight previously published studies for a total of 18,979 subjects.
Across the three studies, increasing levels of Lp(a) were not associated with the risk of CV events when modeled as a continuous variable (OR 1.03 per log-transformed SD, 95% CI 0.96-1.11) or by quintile (OR Q5:Q1 1.05, 95% CI 0.83-1.34). When data were combined with previously published studies of Lp(a) in secondary prevention, subjects with Lp(a) levels in the highest quantile were at increased risk of CV events (OR 1.40, 95% CI 1.15-1.71), but with significant between-study heterogeneity (P=0.001). When stratified on the basis of LDL cholesterol, the association between Lp(a) and CV events was significant in studies in which average LDL cholesterol was ≥130 mg/dl (OR 1.46, 95% CI 1.23-1.73, P<0.001), whereas this relationship was not significant for studies with an average LDL cholesterol <130 mg/dl (OR 1.20, 95% CI 0.90-1.60, P=0.21).
Lp(a) is significantly associated with the risk of CV events in patients with established CAD; however, there exists marked heterogeneity across trials. In particular, the prognostic value of Lp(a) in patients with low LDL cholesterol levels remains unclear.
lipoprotein (a); biomarkers; secondary prevention; risk stratification
This study assesses practice variation of secondary prevention medication prescription among coronary artery disease (CAD) patients treated in outpatient practices participating in the NCDR® PINNACLE Registry®.
Among patients with CAD, secondary prevention with a combination of beta-blockers, angiotensin converting enzyme inhibitors/angiotensin receptor blockers, and statins reduces cardiac mortality and myocardial infarction (MI). Accordingly, every CAD patient should receive the combination of these medications for which they are eligible. However, little is known about current prescription patterns of these medications and the variation in use among outpatient cardiology clinics.
Using data from NCDR® PINNACLE Registry®, a national outpatient cardiology practice registry, we assessed medication prescription patterns among eligible CAD patients between July 2008 and December 2010. Overall rates of prescription and variation by practice were calculated, adjusting for patient characteristics.
Among 156,145 CAD patients in 58 practices, 103,830 (66.5%) were prescribed the optimal combination of medications for which they were eligible. The median rate of optimal combined prescription by practice was 73.5% and ranged from 28.8% to 100%. After adjustment for patient factors, the practice median rate ratio for prescription was 1.25 (95% CI 1.20-1.32), indicating a median 25% difference in the rate of prescription between two randomly selected practices treating identical CAD patients.
Among a national registry of CAD patients treated in outpatient cardiology practices, over one-third of patients failed to receive their optimal combination of secondary prevention medications. Significant variation was observed across practices, even after adjusting for patient characteristics, suggesting that quality improvement efforts may be needed to support more uniform practice.
CAD; Outpatient Practice; Secondary Prevention
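The practice median rate ratio cited above is conventionally derived from the between-practice variance of the random intercept in the mixed model (the median odds/rate ratio formula of Merlo and colleagues). This is the standard formula, not a detail reported in the abstract; a minimal sketch, treating the between-practice variance as a hypothetical input:

```python
import math
from statistics import NormalDist

def median_rate_ratio(between_practice_var: float) -> float:
    """Median rate ratio for two randomly drawn practices, given the
    variance of the practice-level random intercept."""
    z75 = NormalDist().inv_cdf(0.75)  # ~0.6745
    return math.exp(math.sqrt(2 * between_practice_var) * z75)

def variance_from_mrr(mrr: float) -> float:
    """Invert the formula: between-practice variance implied by a reported MRR."""
    z75 = NormalDist().inv_cdf(0.75)
    return (math.log(mrr) / z75) ** 2 / 2

# The reported MRR of 1.25 implies a between-practice variance of roughly 0.055.
var = variance_from_mrr(1.25)
print(round(var, 3), round(median_rate_ratio(var), 2))  # 0.055 1.25
```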
The purpose of this study was to examine the incidence of nuisance bleeding after AMI and its impact on QOL.
Prolonged dual antiplatelet therapy (DAPT) is recommended after acute myocardial infarction (AMI) to reduce ischemic events, but it is associated with increased rates of major and minor bleeding. The incidence of even lesser degrees of post-discharge “nuisance” bleeding with DAPT and its impact on quality of life (QOL) are unknown.
Data from the 24-center TRIUMPH (Translational Research Investigating Underlying Disparities in Acute Myocardial Infarction Patients’ Health Status) study of 3,560 patients, who were interviewed at 1, 6, and 12 months after AMI, were used to investigate the incidence of nuisance bleeding (defined as Bleeding Academic Research Consortium type 1). We examined baseline characteristics associated with nuisance bleeding and its association with QOL, as measured by the EuroQol 5 Dimension visual analog scale, and with subsequent re-hospitalization.
Nuisance (Bleeding Academic Research Consortium type 1) bleeding occurred in 1,335 patients (37.5%) over the 12 months after AMI. After adjusting for baseline bleeding and mortality risk, ongoing DAPT was the strongest predictor of nuisance bleeding (rate ratio [RR]: 1.44, 95% confidence interval [CI]: 1.17 to 1.76 at 1 month; RR: 1.89, 95% CI: 1.35 to 2.65 at 6 months; and RR: 1.39, 95% CI: 1.08 to 1.79 at 12 months; p < 0.01 for all comparisons). Nuisance bleeding at 1 month was independently associated with a decrement in QOL at 1 month (2.81 points on the EuroQol 5 Dimension visual analog scale; 95% CI: 1.09 to 5.64) and with a nonsignificant trend toward higher re-hospitalization (hazard ratio: 1.20; 95% CI: 0.95 to 1.52).
Nuisance bleeding is common in the year after AMI, associated with ongoing use of DAPT, and independently associated with worse QOL. Improved selection of patients for prolonged DAPT may help minimize the incidence and adverse consequences of nuisance bleeding.
Brain lesions on diffusion-weighted imaging (DWI) are frequently found after carotid artery stenting (CAS), but their clinical relevance remains unclear.
This study sought to investigate whether periprocedural ischemic DWI lesions after CAS or carotid endarterectomy (CEA) are associated with an increased risk of recurrent cerebrovascular events.
In the magnetic resonance imaging (MRI) substudy of ICSS (International Carotid Stenting Study), 231 patients with symptomatic carotid stenosis were randomized to undergo CAS (n = 124) or CEA (n = 107). MRIs were performed 1 to 7 days before and 1 to 3 days after treatment. The primary outcome event was stroke or transient ischemic attack in any territory occurring between the post-treatment MRI and the end of follow-up. Time to occurrence of the primary outcome event was compared between patients with (DWI+) and without (DWI–) new DWI lesions on the post-treatment scan in the CAS and CEA groups separately.
Median time of follow-up was 4.1 years (interquartile range: 3.0 to 5.2). In the CAS group, recurrent stroke or transient ischemic attack occurred more often among DWI+ patients (12 of 62) than among DWI– patients (6 of 62), with a cumulative 5-year incidence of 22.8% (standard error [SE]: 7.1%) and 8.8% (SE: 3.8%), respectively (unadjusted hazard ratio: 2.85; 95% confidence interval: 1.05 to 7.72; p = 0.04). In DWI+ and DWI– patients, 8 and 2 events, respectively, occurred within 6 months after treatment. In the CEA group, there was no difference in recurrent cerebrovascular events between DWI+ and DWI– patients.
Ischemic brain lesions discovered on DWI after CAS seem to be a marker of increased risk for recurrent cerebrovascular events. Patients with periprocedural DWI lesions might benefit from more aggressive and prolonged antiplatelet therapy after CAS. (A Randomised Comparison of the Risks, Benefits and Cost Effectiveness of Primary Carotid Stenting With Carotid Endarterectomy: International Carotid Stenting Study; ISRCTN25337470)
carotid stenosis; DWI lesions; endarterectomy; long-term outcome; stenting; ARWMC, age-related white matter changes; CAS, carotid artery stenting; CEA, carotid endarterectomy; DWI, diffusion-weighted imaging; FLAIR, fluid-attenuated inversion recovery; MRI, magnetic resonance imaging; TIA, transient ischemic attack
The purpose of the study was to assess the effects of maternal HIV-1 (human immunodeficiency virus) infection and vertically transmitted HIV-1 infection on the prevalence of congenital cardiovascular malformations in children.
In the United States, an estimated 7,000 children are born to HIV-infected women annually. Previous limited reports have suggested an increase in the prevalence of congenital cardiovascular malformations in vertically transmitted HIV-infected children.
In a prospective longitudinal multicenter study, diagnostic echocardiograms were performed at 4–6-month intervals on two cohorts of children exposed to maternal HIV-1 infection: 1) a Neonatal Cohort of 90 HIV-infected, 449 HIV-uninfected and 19 HIV-indeterminate children; and 2) an Older HIV-Infected Cohort of 201 children with vertically transmitted HIV-1 infection recruited after 28 days of age.
In the Neonatal Cohort, 36 lesions were seen in 36 patients, yielding an overall congenital cardiovascular malformation prevalence of 6.5% (36/558), with an 8.9% (8/90) prevalence in HIV-infected children and a 5.6% (25/449) prevalence in HIV-uninfected children. Two children (2/558, 0.4%) had cyanotic lesions. In the Older HIV-Infected Cohort, the congenital cardiovascular malformation prevalence was 7.5% (15/201). The distribution of lesions did not differ significantly between the groups.
There was no statistically significant difference in congenital cardiovascular malformation prevalence in HIV-infected versus HIV-uninfected children born to HIV-infected women. With the use of early screening echocardiography, rates of congenital cardiovascular malformations in both the HIV-infected and HIV-uninfected children were five- to ten-fold higher than rates reported in population-based epidemiologic studies but not higher than in normal populations similarly screened. Potentially important subclinical congenital cardiovascular malformations were detected.
Preclinical diastolic dysfunction (PDD) has been broadly defined as left ventricular diastolic dysfunction with normal systolic function in subjects without a diagnosis of congestive heart failure (HF). PDD remains poorly understood yet has definite clinical significance. Although few original studies have focused on PDD, it has been shown that PDD is prevalent and that there is a clear progression from PDD to symptomatic heart failure, with symptoms including dyspnea, edema, and fatigue. In diabetic patients and patients with coronary artery disease or hypertension, patients with PDD have a significantly higher risk of progression to heart failure and death than patients without PDD. Given these findings and the growing heart failure epidemic, an understanding of PDD is essential to decreasing patients’ morbidity and mortality. This review focuses on what is known about preclinical diastolic dysfunction, including definitions, staging, epidemiology, pathophysiology, and the natural history of the disease. In addition, given the paucity of trials focused on PDD treatment, studies targeting risk factors associated with the development of PDD and therapeutic trials for heart failure with preserved ejection fraction are reviewed.
Heart failure with preserved ejection fraction; Diastolic dysfunction; Echocardiography; Heart failure treatment; Heart failure epidemiology
This Phase II trial assessed the safety of flurpiridaz F 18 and compared its diagnostic performance for PET myocardial perfusion imaging (MPI) with that of Tc-99m SPECT MPI with respect to image quality, interpretative certainty, defect magnitude, and detection of coronary artery disease (CAD; ≥50% stenosis) on invasive coronary angiography (ICA).
In preclinical and phase I studies, flurpiridaz F 18 has shown characteristics of an essentially ideal MPI tracer.
A total of 143 patients from 21 centers underwent rest-stress PET and Tc-99m SPECT MPI. Eighty-six patients underwent ICA, and 39 had a low likelihood of CAD. Images were scored by 3 independent, blinded readers.
A higher percentage of images was rated excellent/good on PET vs. SPECT for stress (99.2% vs. 88.5%, p<0.01) and rest (96.9% vs. 66.4%, p<0.01) images. Diagnostic certainty of interpretation (percentage of cases with a definitely abnormal/normal interpretation) was higher for PET vs. SPECT (90.8% vs. 70.9%, p<0.01). In the 86 patients who underwent ICA, the sensitivity of PET was higher than that of SPECT (78.8% vs. 61.5%, p=0.02). Specificity was not significantly different (PET: 76.5% vs. SPECT: 73.5%). The area under the receiver-operating characteristic curve was 0.82±0.05 for PET and 0.70±0.06 for SPECT (p=0.04). The normalcy rate was 89.7% with PET and 97.4% with SPECT (p=NS). In patients with CAD on ICA, the magnitude of reversible defects was greater with PET than with SPECT (p=0.008). Extensive safety assessment revealed that flurpiridaz F 18 was safe in this cohort.
In this Phase 2 trial, PET MPI using flurpiridaz F 18 was safe and superior to SPECT MPI for image quality, interpretative certainty, and overall CAD diagnosis.
flurpiridaz F 18; myocardial perfusion; SPECT
Increases in serum creatinine (ΔSCr) from baseline signify acute kidney injury (AKI) but offer little granular information regarding its characteristics. The 10th Consensus Conference of the Acute Dialysis Quality Initiative (ADQI) suggested that combining AKI biomarkers would provide better precision for AKI course prognostication.
This study investigated the value of combining a functional damage biomarker (plasma cystatin C [pCysC]) with a tubular damage biomarker (urine neutrophil gelatinase-associated lipocalin [uNGAL]), forming a composite biomarker for prediction of discrete characteristics of AKI.
Data from 345 children after cardiopulmonary bypass (CPB) were analyzed. Severe AKI was defined as Kidney Disease: Improving Global Outcomes (KDIGO) stages 2 to 3 (>100% ΔSCr) within 7 days of CPB. Persistent AKI lasted >2 days. SCr in reversible AKI returned to baseline ≤48 h after CPB. The composite of uNGAL (>200 ng/mg urine Cr = positive [+]) and pCysC (>0.8 mg/l = positive [+]), uNGAL+/pCysC+, measured 2 h after CPB initiation, was compared with ΔSCr increases of ≤50% for correlation with AKI characteristics by using predictive probabilities, likelihood ratios (LR), and area under the receiver-operating characteristic curve (AUC-ROC) values.
Severe AKI occurred in 18% of patients. The composite uNGAL+/pCysC+ demonstrated a greater likelihood than ΔSCr for severe AKI (+LR: 34.2 [13.0:94.0] vs. 3.8 [1.9:7.2]) and persistent AKI (+LR: 15.6 [8.8:27.5] vs. 4.5 [2.3:8.8]). In AKI patients, the uNGAL−/pCysC+ composite was superior to ΔSCr for prediction of transient AKI. Biomarker composites carried greater probability for specific outcomes than ΔSCr strata.
Composites of functional and tubular damage biomarkers are superior to ΔSCr for predicting discrete characteristics of AKI.
Acute Dialysis Quality Initiative; acute kidney injury phenotypes; biomarker combinations; cardiac surgery; functional acute kidney injury; pediatric acute kidney injury
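A positive likelihood ratio translates pretest into post-test probability through odds. A worked sketch using two numbers from the abstract (the 18% prevalence of severe AKI and the composite's +LR of 34.2); the odds conversion is standard Bayesian diagnostics, not a calculation reported in the study:

```python
def posttest_probability(pretest_prob: float, positive_lr: float) -> float:
    """Convert a pretest probability to a post-test probability via odds."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * positive_lr
    return posttest_odds / (1 + posttest_odds)

# 18% prevalence of severe AKI; +LR 34.2 for a positive uNGAL+/pCysC+ composite:
p = posttest_probability(0.18, 34.2)
print(f"{p:.0%}")  # a positive composite raises the probability to ~88%
```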
We aimed to investigate the characteristics and outcomes of patients with heart failure with preserved ejection fraction (HFpEF) and angina pectoris (AP).
AP is a predictor of adverse events in patients with heart failure with reduced EF. The implications of AP in HFpEF are unknown.
We analyzed HFpEF patients (EF ≥50%) who underwent coronary angiography at Duke University Medical Center from 2000 to 2010, with and without AP in the previous 6 weeks. Time to first event was examined using Kaplan-Meier methods for the primary endpoint of death/myocardial infarction (MI)/revascularization/stroke (i.e., MACE) and the secondary endpoints of death/MI/revascularization, death/MI/stroke, death/MI, death, and cardiovascular death/cardiovascular hospitalization.
In the Duke Databank, 3,517 patients met the inclusion criteria, and 1,402 (40%) had AP. Patients with AP were older and had more comorbidities and prior revascularization than those without AP. AP patients more often received beta-blockers, ACE inhibitors, nitrates, and statins (all P<0.05). In unadjusted analysis, AP patients had increased MACE and death/MI/revascularization (both P<0.001), lower rates of death and death/MI (both P<0.05), and similar rates of death/MI/stroke and cardiovascular death/cardiovascular hospitalization (both P>0.1). After multivariable adjustment, those with AP remained at increased risk for MACE (hazard ratio [HR] 1.30; 95% confidence interval [CI], 1.17–1.45) and death/MI/revascularization (HR 1.29; 95% CI, 1.15–1.43) but were at similar risk for the other endpoints (P>0.06).
AP is common in HFpEF patients with a history of coronary artery disease despite medical therapy and is independently associated with increased MACE, driven by revascularization, with similar risk of death, MI, and hospitalization.
heart failure with preserved ejection fraction; angina pectoris; outcomes
The aim of this study was to determine the role of Suppressor of Cytokine Signaling 1 (SOCS1) in graft arteriosclerosis (GA).
GA, the major cause of late cardiac allograft failure, is initiated by immune-mediated endothelial activation resulting in vascular inflammation and consequent neointima formation. SOCS1, a negative regulator of cytokine signaling, is highly expressed in endothelial cells (ECs) and may prevent endothelial inflammatory responses and phenotypic activation.
Clinical specimens of coronary arteries with GA, atherosclerosis, or without disease were collected for histological analysis. SOCS1 knockout or vascular endothelial SOCS1 transgenic mice (VESOCS1) were used in an aorta transplant model of GA. Mouse aortic ECs were isolated for in vitro assays.
A dramatic but specific reduction of endothelial SOCS1 was observed in human GA and atherosclerosis specimens, suggesting the importance of SOCS1 in maintaining normal endothelial function. SOCS1 deletion in mice resulted in basal EC dysfunction. After transplantation, SOCS1-deficient aortic grafts showed augmented leukocyte recruitment and neointima formation, whereas endothelial overexpression of SOCS1 diminished arterial rejection. Induction of endothelial adhesion molecules in the early stages of GA was suppressed by the VESOCS1 transgene, an effect confirmed in cultured aortic ECs. Moreover, VESOCS1 maintained better vascular function during GA progression. Mechanistically, endothelial SOCS1, by modulating both basal and cytokine-induced expression of the adhesion molecules PECAM-1, ICAM-1, and VCAM-1, restrained leukocyte adhesion and trans-endothelial migration during inflammatory cell infiltration.
SOCS1 prevents GA progression by preserving endothelial function and attenuating cytokine-induced adhesion molecule expression in vascular endothelium.
graft arteriosclerosis; SOCS1; endothelial activation; endothelial adhesion molecule
To assess the pattern of the adoption of internal mammary artery (IMA) grafting in the United States, test its association with clinical outcomes, and assess whether its effectiveness differs in key clinical subgroups.
The effect of IMA grafting on major clinical outcomes has never been tested in a large randomized trial, yet it is now a quality standard for coronary artery bypass graft (CABG) surgery.
We identified Medicare beneficiaries aged ≥66 years who underwent isolated multivessel CABG between 1988 and 2008, and documented patterns of IMA use over time. We used a multivariable propensity score to match patients with and without an IMA, and compared rates of death, myocardial infarction (MI), and repeat revascularization. We tested for variations in IMA effectiveness using treatment by covariate interaction tests.
IMA use in CABG rose slowly, from 31% in 1988 to 91% in 2008, with persistent wide geographic variation. Among 60,896 propensity score-matched patients over a median 6.8-year follow-up, IMA use was associated with lower all-cause mortality (adjusted hazard ratio 0.77, p<0.001), lower death or MI (adjusted hazard ratio 0.77, p<0.001), and fewer repeat revascularizations over five years (8% vs. 9%, p<0.001). The association between IMA use and lower mortality was significantly weaker (p≤0.008) for older patients, women, and patients with diabetes or peripheral arterial disease.
IMA grafting was adopted slowly and still shows substantial geographic variation. IMA use is associated with lower rates of death, MI, and repeat coronary revascularization.
Internal Mammary-Coronary Artery Anastomosis; Outcomes Research; Comparative Effectiveness Research
The objective of this study was to determine whether premature ventricular contractions (PVCs) arising from the aortic sinuses of Valsalva (SOV) and great cardiac vein (GCV) have coupling interval (CI) characteristics that differentiate them from other ectopic foci.
PVCs occur at a relatively fixed CI from the preceding normal QRS complex in most patients. However, we observed patients with PVCs originating in unusual areas (SOV and GCV) in whom the PVC CI was highly variable. We hypothesized that PVCs from these areas occur seemingly at random because of the lack of electrotonic effects from the surrounding myocardium.
Seventy-three consecutive patients referred for PVC ablation were assessed. Twelve consecutive PVC CIs were recorded. The ΔCI (maximum – minimum CI) was measured.
We studied 73 patients (age 50 ± 16 years, 47% male). The PVC origin was right ventricular (RV) in 29 (40%), left ventricular (LV) in 17 (23%), SOV in 21 (29%), and GCV in 6 (8%). There was a significant difference between the mean ΔCI of RV/LV PVCs compared with SOV/GCV PVCs (33 ± 15 ms vs. 116 ± 52 ms, p < 0.0001). A ΔCI of >60 ms demonstrated a sensitivity of 89%, specificity of 100%, positive predictive value of 100%, and negative predictive value of 94%. Cardiac events were more common in the SOV/GCV group versus the RV/LV group (7 of 27 [26%] vs. 2 of 46 [4%], p < 0.02).
ΔCI is more pronounced in PVCs originating from the SOV or GCV. A ΔCI of >60 ms helps discriminate the origin of PVCs before diagnostic electrophysiological study and may be associated with an increased frequency of cardiac events.
aortic sinus of Valsalva; coupling interval; great cardiac vein; premature ventricular contraction
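The operating characteristics of the ΔCI >60 ms cutoff can be reproduced from the implied 2×2 table. The counts below are inferred from the reported group sizes and percentages (27 SOV/GCV and 46 RV/LV patients), not stated explicitly in the abstract:

```python
# Inferred 2x2 table for the ΔCI >60 ms cutoff predicting SOV/GCV origin:
tp, fn = 24, 3   # SOV/GCV patients above / below the cutoff (inferred)
fp, tn = 0, 46   # RV/LV patients above / below the cutoff (inferred)

sensitivity = tp / (tp + fn)  # 24/27 ≈ 0.889
specificity = tn / (tn + fp)  # 46/46 = 1.0
ppv = tp / (tp + fp)          # 24/24 = 1.0
npv = tn / (tn + fn)          # 46/49 ≈ 0.939

print(f"sens {sensitivity:.0%}, spec {specificity:.0%}, "
      f"PPV {ppv:.0%}, NPV {npv:.0%}")
# sens 89%, spec 100%, PPV 100%, NPV 94% -- matching the reported values
```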
The goal of this study was to develop a low-energy, implantable device–based multistage electrotherapy (MSE) to terminate atrial fibrillation (AF).
Previous attempts to perform cardioversion of AF by using an implantable device were limited by the pain caused by use of a high-energy single biphasic shock (BPS).
Transvenous leads were implanted into the right atrium (RA), coronary sinus, and left pulmonary artery of 14 dogs. Self-sustaining AF was induced by 6 ± 2 weeks of high-rate RA pacing. Atrial defibrillation thresholds of standard versus experimental electrotherapies were measured in vivo and studied by using optical imaging in vitro.
The mean AF cycle length (CL) in vivo was 112 ± 21 ms (534 beats/min). The impedances of the RA–left pulmonary artery and RA–coronary sinus shock vectors were similar (121 ± 11 Ω vs. 126 ± 9 Ω; p = 0.27). BPS required 1.48 ± 0.91 J (165 ± 34 V) to terminate AF. In contrast, MSE terminated AF with significantly less energy (0.16 ± 0.16 J; p < 0.001) and significantly lower peak voltage (31.1 ± 19.3 V; p < 0.001). In vitro optical imaging studies found that AF was maintained by localized foci originating from pulmonary vein–left atrium interfaces. MSE Stage 1 shocks temporarily disrupted localized foci; MSE Stage 2 entrainment shocks continued to silence the localized foci driving AF; and MSE Stage 3 pacing stimuli enabled consistent RA–left atrium activation until sinus rhythm was restored.
Low-energy MSE significantly reduced the atrial defibrillation thresholds compared with BPS in a canine model of AF. MSE may enable painless, device-based AF therapy.
atrial fibrillation; cardioversion; defibrillation; low energy; multistage electrotherapy
The newly released 2013 ACC/AHA Guidelines for Assessing Cardiovascular Risk make progress compared with previous cardiovascular risk assessment algorithms. For example, the new focus on total atherosclerotic cardiovascular disease (ASCVD) now includes stroke in addition to hard coronary events, and there are now separate equations to facilitate estimation of risk in non-Hispanic white and black individuals and separate equations for women. Physicians may now estimate lifetime risk in addition to 10-year risk. Despite this progress, the new risk equations do not appear to provide significantly better discrimination than older models. Because exactly the same risk factors are incorporated, using the new risk estimators may lead to inaccurate assessment of atherosclerotic cardiovascular risk in special groups, such as younger individuals with unique ASCVD risk factors. In general, there appears to be an overestimation of risk when the equations are applied to modern populations with greater use of preventive therapy, although the magnitude of overestimation remains unclear. Because absolute risk estimates are directly used for treatment decisions in the new cholesterol guidelines, these issues could result in overuse of pharmacologic management. The guidelines could provide clearer direction on which individuals would benefit from additional testing, such as coronary calcium scoring, for more personalized preventive therapy. We applaud the advances of these new guidelines, and we aim to critically appraise the applicability of the risk assessment tools so that future iterations of the estimators can more accurately assess risk in individual patients.
coronary artery calcium; guidelines; preventive cardiology; risk assessment
To assess whether changes of major atrial fibrillation (AF) risk factors and/or intercurrent cardiovascular events could explain the relationship between type 2 diabetes mellitus (T2D) and incident AF.
Prior studies have found an increased risk of incident AF among individuals with T2D, but few, if any, took changes in AF risk factors over time into account.
A total of 34,720 female health professionals who participated in the Women's Health Study and were free of cardiovascular disease and AF at baseline were followed for a median of 16.4 years. Cox proportional hazards models were constructed to assess the relationship between T2D and incident AF, using either baseline information or time-varying covariates for both T2D and potential confounders.
At baseline, 937 (2.7%) women had T2D. Compared with women without T2D, women with T2D had an age-adjusted hazard ratio (HR) for new-onset AF of 1.95 (95% confidence interval [CI], 1.49-2.56, p<0.0001). In multivariable analyses adjusting for baseline confounders, this HR was substantially attenuated, but baseline T2D remained a significant predictor of incident AF (HR 1.37, 95% CI, 1.03-1.83, p=0.03). In time-updated models that adjusted for changes in AF risk factors and intercurrent cardiovascular events, the HR for T2D was attenuated further and became non-significant (HR 1.14; 95% CI, 0.93-1.40, p=0.20).
While this study confirms a significant relationship between baseline T2D and incident AF, our data suggest that the increased risk associated with T2D is mainly mediated by changes in other AF risk factors.
Atrial fibrillation; type 2 diabetes; blood pressure; obesity; women; cardiovascular disease; prospective cohort study