Seizure activity in EEG recordings can persist for hours, with seizure dynamics changing rapidly over time and space. To characterise the spatiotemporal evolution of seizure activity, large data sets often need to be analysed. Dynamic causal modelling (DCM) can be used to estimate the synaptic drivers of cortical dynamics during a seizure; however, the requisite (Bayesian) inversion procedure is computationally expensive. In this note, we describe a straightforward procedure, within the DCM framework, that provides efficient inversion of seizure activity measured with non-invasive and invasive physiological recordings; namely, EEG/ECoG. We describe the theoretical background behind a Bayesian belief updating scheme for DCM. The scheme is tested on simulated and empirical seizure activity (recorded both invasively and non-invasively) and compared with standard Bayesian inversion. We show that the Bayesian belief updating scheme provides estimates of time-varying synaptic parameters similar to those of standard schemes, indicating no significant qualitative change in accuracy; the difference in variance explained was small (less than 5%). The updating method was substantially more efficient, taking approximately 5–10 min compared to approximately 1–2 h. Moreover, the setup of the model under the updating scheme allows for a clear specification of how neuronal variables fluctuate over separable timescales. This method now allows us to investigate the effect of fast (neuronal) activity on slow fluctuations in (synaptic) parameters, paving the way to understanding how seizure activity is generated.
•We describe a DCM procedure that provides efficient inversion of seizure activity.
•Similar accuracy but substantially more efficient compared to standard DCM methods.
•Physiological fluctuations over different timescales can be specified.
•This scheme should contribute to understanding seizure activity using DCM.
Dynamic causal modelling (DCM); Bayesian belief updating; Epilepsy; Seizure activity; EEG/ECoG
In patients with severe aortic stenosis, transcatheter aortic valve replacement (TAVR) improves survival compared with nonsurgical therapy but with higher in-hospital and lifetime costs. Complications associated with TAVR may decrease with greater experience and improved devices, thereby reducing the overall cost of the procedure. Therefore, we sought to estimate the impact of peri-procedural complications on in-hospital costs and length of stay of TAVR.
Methods and Results
Using detailed cost data from 406 TAVR patients enrolled in the PARTNER I trial, we developed multivariable models to estimate the incremental cost and length of stay associated with specific peri-procedural complications. Attributable costs and length of stay for each complication were calculated by multiplying the independent cost of each event by its frequency in the treatment group. Mean cost for the initial hospitalization was $79,619 ± 40,570 ($50,891 excluding the valve); 49% of patients had ≥1 complication. Seven complications were independently associated with increased hospital costs, with major bleeding, arrhythmia and death accounting for the largest attributable cost per patient. Renal failure and the need for repeat TAVR, although less frequent, were also associated with substantial incremental and attributable costs. Overall, complications accounted for $12,475/patient in initial hospital costs and 2.4 days of hospitalization.
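The attributable-cost calculation described above is simple enough to sketch directly. In this illustrative Python snippet, the event names, incremental costs, and frequencies are hypothetical placeholders, not the PARTNER I estimates:

```python
# Sketch of the attributable-cost calculation described above:
# attributable cost per patient = independent (incremental) cost of an
# event multiplied by its frequency in the treatment group.
# Costs (USD) and frequencies below are hypothetical placeholders.
complications = {
    "major_bleeding": {"incremental_cost": 20_000.0, "frequency": 0.15},
    "arrhythmia":     {"incremental_cost": 8_000.0,  "frequency": 0.20},
    "renal_failure":  {"incremental_cost": 30_000.0, "frequency": 0.03},
}

def attributable_cost_per_patient(incremental_cost, frequency):
    """Cost burden of one complication averaged over the whole cohort."""
    return incremental_cost * frequency

# Summing over all complications gives the per-patient cost attributable
# to peri-procedural complications overall.
total_attributable = sum(
    attributable_cost_per_patient(c["incremental_cost"], c["frequency"])
    for c in complications.values()
)
```

Summing the per-event attributable costs in this way is what yields a single per-patient figure (the $12,475/patient reported above) from a set of event-level regression estimates.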
In the PARTNER trial, peri-procedural complications were frequent, costly, and accounted for approximately 25% of non-implant related hospital costs. Avoidance of complications should improve the cost-effectiveness of TAVR for inoperable and high-risk patients, but reductions in the cost of uncomplicated TAVR will also be necessary for optimal efficiency.
transcatheter aortic valve replacement; costs; complications; valve
The American College of Cardiology Foundation (ACCF), in partnership with key specialty and subspecialty societies, conducted a review of common clinical scenarios where noninvasive vascular testing (ultrasound and physiological testing) is frequently considered. The indications (clinical scenarios) were derived from common applications or anticipated uses, as well as from current clinical practice guidelines and results of studies examining the implementation of the original appropriate use criteria (AUC). The 159 indications in this document were developed by a diverse writing group and scored by a separate independent technical panel on a scale of 1 to 9, to designate appropriate use (median 7 to 9), uncertain use (median 4 to 6), and inappropriate use (median 1 to 3).
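The median-based scoring rule described above can be expressed as a small classifier. This is an illustrative sketch only; the function name and input format are not from the ACCF document:

```python
from statistics import median

def rate_indication(panel_scores):
    """Map a technical panel's 1-9 scores for one indication to an AUC
    category using the median cut-points described above:
    7-9 appropriate, 4-6 uncertain, 1-3 inappropriate.
    Illustrative helper, not part of the ACCF methodology documents."""
    m = median(panel_scores)
    if m >= 7:
        return "appropriate"
    if m >= 4:
        return "uncertain"
    return "inappropriate"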
A total of 255 indications (with the inclusion of surveillance timeframes) were rated. One hundred and seventeen indications were rated as appropriate, 84 were rated as uncertain, and 54 were rated as inappropriate. The AUC for peripheral vascular disease have the potential to impact physician decision making, healthcare delivery, and reimbursement policy. Furthermore, recognition of uncertain clinical scenarios facilitates identification of areas that would benefit from future research.
ACCF Appropriate Use Criteria; abdominal aortic aneurysm; ankle-brachial index; arterial physiological testing; carotid artery disease; duplex ultrasound; mesenteric artery disease; peripheral artery disease; peripheral vascular disease; noninvasive testing; noninvasive vascular laboratory; renal artery stenosis
Many patients have symptoms suggestive of coronary artery disease (CAD) and are often evaluated with the use of diagnostic testing, although there are limited data from randomized trials to guide care.
We randomly assigned 10,003 symptomatic patients to a strategy of initial anatomical testing with the use of coronary computed tomographic angiography (CTA) or to functional testing (exercise electrocardiography, nuclear stress testing, or stress echocardiography). The composite primary end point was death, myocardial infarction, hospitalization for unstable angina, or major procedural complication. Secondary end points included invasive cardiac catheterization that did not show obstructive CAD and radiation exposure.
The mean age of the patients was 60.8±8.3 years, 52.7% were women, and 87.7% had chest pain or dyspnea on exertion. The mean pretest likelihood of obstructive CAD was 53.3±21.4%. Over a median follow-up period of 25 months, a primary end-point event occurred in 164 of 4996 patients in the CTA group (3.3%) and in 151 of 5007 (3.0%) in the functional-testing group (adjusted hazard ratio, 1.04; 95% confidence interval, 0.83 to 1.29; P = 0.75). CTA was associated with fewer catheterizations showing no obstructive CAD than was functional testing (3.4% vs. 4.3%, P = 0.02), although more patients in the CTA group underwent catheterization within 90 days after randomization (12.2% vs. 8.1%). The median cumulative radiation exposure per patient was lower in the CTA group than in the functional-testing group (10.0 mSv vs. 11.3 mSv), but 32.6% of the patients in the functional-testing group had no exposure, so the overall exposure was higher in the CTA group (mean, 12.0 mSv vs. 10.1 mSv; P<0.001).
In symptomatic patients with suspected CAD who required noninvasive testing, a strategy of initial CTA, as compared with functional testing, did not improve clinical outcomes over a median follow-up of 2 years. (Funded by the National Heart, Lung, and Blood Institute; PROMISE ClinicalTrials.gov number, NCT01174550.)
Little is known about the incidence of prosthesis-patient mismatch (PPM) and its impact on outcomes after transcatheter aortic valve replacement (TAVR).
The objectives of this study were: 1) to compare the incidence of PPM in the transcatheter and surgical aortic valve replacement (SAVR) randomized (RCT) arms of the PARTNER-I trial Cohort A; and 2) to assess the impact of PPM on regression of left ventricular (LV) hypertrophy and mortality in these 2 arms and in the TAVR nonrandomized continued access (NRCA) Registry cohort.
The PARTNER trial Cohort A randomized patients 1:1 to TAVR or bioprosthetic SAVR. Postoperative PPM was defined as absent if indexed effective orifice area >0.85, moderate ≥0.65 but ≤0.85, or severe <0.65 cm2/m2. LV mass regression and mortality were analyzed using the SAVR-RCT (n = 270), TAVR-RCT (n = 304) and TAVR-NRCA (n = 1637) cohorts.
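The PPM grading rule stated above maps directly onto a threshold function. A minimal sketch (the helper name is illustrative, not from the trial's analysis code):

```python
def classify_ppm(indexed_eoa_cm2_m2):
    """Grade prosthesis-patient mismatch from indexed effective orifice
    area (cm2/m2) using the cut-points stated above:
    absent if >0.85, moderate if 0.65-0.85, severe if <0.65.
    Illustrative helper for the definition only."""
    if indexed_eoa_cm2_m2 > 0.85:
        return "absent"
    if indexed_eoa_cm2_m2 >= 0.65:
        return "moderate"
    return "severe"
```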
Incidence of PPM was 60.0% (severe: 28.1%) in SAVR-RCT versus 46.4% (severe: 19.7%) in TAVR-RCT (p < 0.001) and 43.8% (severe: 13.6%) in TAVR-NRCA. In patients with aortic annulus diameter < 20 mm, severe PPM developed in 33.7% undergoing SAVR compared to 19.0% undergoing TAVR (p = 0.002). PPM was an independent predictor of less LV mass regression at 1 year in SAVR-RCT (p = 0.017) and TAVR-NRCA (p = 0.012) but not in TAVR-RCT (p = 0.35). Severe PPM was an independent predictor of 2-year mortality in SAVR-RCT (hazard ratio [HR]: 1.78; p = 0.041) but not in TAVR-RCT (HR: 0.58; p = 0.11). In the TAVR-NRCA cohort, severe PPM was not a predictor of 1-year mortality in the whole cohort (HR: 1.05; p = 0.60) but did independently predict mortality in the subset of patients with no post-procedural aortic regurgitation (HR: 1.88; p = 0.02).
In patients with severe aortic stenosis and high surgical risk, PPM is more frequent and more often severe following SAVR than TAVR. Patients with PPM after SAVR have worse survival and less LV mass regression than those without PPM. Severe PPM also has a significant impact on survival after TAVR in the subset of patients with no post-procedural aortic regurgitation. TAVR may be preferable to SAVR in patients with a small aortic annulus who are susceptible to PPM to avoid its adverse impact on LV mass regression and survival.
Doppler echocardiography; aortic regurgitation; left ventricular mass regression; mortality
The 2013 ACC/AHA cholesterol guidelines are being applied to HIV-infected patients but have not been validated in this at-risk population, known to have a high prevalence of subclinical high-risk morphology (HRM) coronary atherosclerotic plaque.
To compare recommendations for statins among HIV-infected subjects with/without HRM coronary plaque according to 2013 ACC/AHA versus 2004 ATP III guidelines.
Data from 108 HIV-infected subjects without known CVD or lipid-lowering treatment who underwent contrast-enhanced computed-tomography angiography were analyzed. Recommendations for statin therapy according to 2013 versus 2004 guidelines were assessed among those with/without HRM coronary plaque.
Among all subjects, 10-year ASCVD risk score was 3.3% (1.6, 6.6), yet 36% of subjects had HRM coronary plaque. Among those with HRM coronary plaque, statins would be recommended for 26% by 2013 guidelines versus 10% by 2004 guidelines (p=0.04). Conversely, among those without HRM coronary plaque, statins would be recommended for 19% by 2013 guidelines versus 7% by 2004 guidelines (p=0.005). In multivariate modeling, 10-year ASCVD risk score related to HRM coronary plaque burden (p=0.02), but so did other factors not incorporated into the 2013 guidelines.
2013 ACC/AHA cholesterol guidelines recommend statin therapy for a higher percentage of subjects with and without HRM coronary plaque relative to 2004 guidelines. However, even by 2013 guidelines, statin therapy would not be recommended for the majority (74%) of HIV-infected subjects with subclinical HRM coronary plaque. Outcome studies are needed to determine the utility of new statin recommendations and the contribution of HRM coronary plaque to CVD events among HIV-infected subjects.
HIV; cholesterol guidelines; atherosclerosis; computed tomography angiography
We characterised the pathophysiology of seizure onset in terms of slow fluctuations in synaptic efficacy using EEG in patients with anti-N-methyl-d-aspartate receptor (NMDA-R) encephalitis. EEG recordings were obtained from two female patients with anti-NMDA-R encephalitis with recurrent partial seizures (ages 19 and 31). Focal electrographic seizure activity was localised using an empirical Bayes beamformer. The spectral density of reconstructed source activity was then characterised with dynamic causal modelling (DCM). Eight models were compared for each patient, to evaluate the relative contribution of changes in intrinsic (excitatory and inhibitory) connectivity and endogenous afferent input. Bayesian model comparison established a role for changes in both excitatory and inhibitory connectivity during seizure activity (in addition to changes in the exogenous input). Seizures in both patients were associated with a sequence of changes in inhibitory and excitatory connectivity; a transient increase in inhibitory connectivity followed by a transient increase in excitatory connectivity and a final peak of excitatory–inhibitory balance at seizure offset. These systematic fluctuations in excitatory and inhibitory gain may be characteristic of (anti NMDA-R encephalitis) seizures. We present these results as a case study and replication to motivate analyses of larger patient cohorts, to see whether our findings generalise and further characterise the mechanisms of seizure activity in anti-NMDA-R encephalitis.
•We characterised seizures in patients with anti-NMDA-R encephalitis using EEG.
•Dynamic causal modelling was used to estimate causes of seizure activity.
•Characteristic variation of excitatory–inhibitory balance during seizure activity.
•This variation was seen for seizures within and between patients.
Anti-NMDA-R encephalitis; EEG; Dynamic causal modelling (DCM); Seizures
In symptomatic patients with suspected coronary artery disease (CAD), computed tomographic angiography (CTA) improves patient selection for invasive coronary angiography (ICA) compared with functional testing. The impact of measuring fractional flow reserve by CTA (FFRCT) is unknown.
Methods and results
At 11 sites, 584 patients with new-onset chest pain were prospectively assigned to receive either usual testing (n = 287) or CTA/FFRCT (n = 297). Test interpretation and care decisions were made by the clinical care team. The primary endpoint was the percentage of those with planned ICA in whom no significant obstructive CAD (no stenosis ≥50% by core laboratory quantitative analysis or invasive FFR < 0.80) was found at ICA within 90 days. Secondary endpoints, including death, myocardial infarction, and unplanned revascularization, were independently and blindly adjudicated. Subjects averaged 61 ± 11 years of age, 40% were female, and the mean pre-test probability of obstructive CAD was 49 ± 17%. Among those with intended ICA (FFRCT-guided = 193; usual care = 187), no obstructive CAD was found at ICA in 24 (12%) in the CTA/FFRCT arm and 137 (73%) in the usual care arm (risk difference 61%, 95% confidence interval 53–69, P < 0.0001), with similar mean cumulative radiation exposure (9.9 vs. 9.4 mSv, P = 0.20). Invasive coronary angiography was cancelled in 61% after receiving CTA/FFRCT results. Among those with intended non-invasive testing, the rates of finding no obstructive CAD at ICA were 13% (CTA/FFRCT) and 6% (usual care; P = 0.95). Clinical event rates within 90 days were low in both the usual care and CTA/FFRCT arms.
Computed tomographic angiography/fractional flow reserve by CTA was a feasible and safe alternative to ICA and was associated with a significantly lower rate of invasive angiography showing no obstructive CAD.
Angina; Coronary computed tomographic angiography; Fractional flow reserve; Non-invasive testing
Conventional resting left ventricular ejection fraction (LVEF) assessments have limitations for detecting doxorubicin (DOX)-related cardiac dysfunction. Novel resting echocardiographic parameters, including 3-dimensional echocardiography (3DE) and global longitudinal strain (GLS), have potential for early identification of chemotherapy-related myocardial injury. Exercise “stress” is an established method to uncover impairments in cardiac function but has received limited attention in the adult oncology setting. We evaluated the utility of an integrated approach using 3DE, GLS, and exercise stress echocardiography for detecting subclinical cardiac dysfunction in early breast cancer patients treated with DOX-containing chemotherapy. Fifty-seven asymptomatic women with early breast cancer (mean 26 ± 22 months post-chemotherapy) and 20 sex-matched controls were studied. Resting left ventricular (LV) function was assessed by LVEF using 2-dimensional echocardiography (2DE) and 3DE and by GLS using 2-dimensional speckle-tracking echocardiography (2D-STE). After resting assessments, subjects completed cardiopulmonary exercise testing with stress 2DE. Resting LVEF was lower in patients than controls by 3DE (55 ± 4 vs. 59 ± 5%; p = 0.005) but not 2DE (56 ± 4 vs. 58 ± 3%; p = 0.169). Ten of 51 (20%) patients had GLS greater than or equal to −17%, below the calculated lower limit of normal (control mean − 2 SD); this patient subgroup had a mean 20% impairment in GLS (−16.1 ± 0.9 vs. −20.1 ± 1.5%; p < 0.001), despite similar LVEF by 2DE and 3DE compared to controls (p > 0.05). Cardiopulmonary function (VO2peak) was 20% lower in patients than controls (p < 0.001). Exercise stress 2DE assessments of stroke volume (61 ± 11 vs. 69 ± 15 ml; p = 0.018) and cardiac index (mean increase 2.3 ± 0.9 vs. 3.1 ± 0.8 L·min−1·m−2; p = 0.003) were lower in patients than controls. Post-exercise increase in cardiac index predicted VO2peak (r = 0.429, p = 0.001).
Resting 3DE, GLS, and exercise stress 2DE detect subclinical cardiac dysfunction not apparent with resting 2DE in post-DOX breast cancer patients.
Adjuvant therapy; Breast cancer; Cardiotoxicity; Echocardiography; Stress testing
Suspected coronary artery disease (CAD) is one of the most common, potentially life threatening diagnostic problems clinicians encounter. However, no large outcome-based randomized trials have been performed to guide the selection of diagnostic strategies for these patients.
The PROMISE study is a prospective, randomized trial comparing the effectiveness of two initial diagnostic strategies in patients with symptoms suspicious for CAD. Patients are randomized to either: 1) functional testing (exercise electrocardiogram, stress nuclear imaging, or stress echocardiogram); or 2) anatomic testing with ≥64-slice multidetector coronary computed tomographic angiography. Tests are interpreted locally in real time by subspecialty-certified physicians and all subsequent care decisions are made by the clinical care team. Sites are provided results of central core lab quality and completeness assessment. All subjects are followed for ≥1 year. The primary end-point is the time to occurrence of the composite of death, myocardial infarction, major procedural complications (stroke, major bleeding, anaphylaxis and renal failure) or hospitalization for unstable angina.
Over 10,000 symptomatic subjects were randomized in 3.2 years at 193 US and Canadian cardiology, radiology, primary care, urgent care and anesthesiology sites.
Multi-specialty community practice enrollment into a large pragmatic trial of diagnostic testing strategies is both feasible and efficient. PROMISE will compare the clinical effectiveness of an initial strategy of functional testing against an initial strategy of anatomic testing in symptomatic patients with suspected CAD. Quality of life, resource use, cost effectiveness and radiation exposure will be assessed.
ClinicalTrials.gov identifier NCT01174550
Angina; Coronary computed tomography angiogram; Diagnostic strategy; Stress ECG; Stress echocardiography; Stress nuclear
The lack of standardized reporting of the magnitude of ischemia on noninvasive imaging contributes to variability in translating the severity of ischemia across stress imaging modalities. We identified the risk of coronary artery disease (CAD) death or myocardial infarction (MI) associated with ≥10% ischemic myocardium on stress nuclear imaging as the risk threshold for stress echocardiography and cardiac magnetic resonance. A narrative review revealed that ≥10% ischemic myocardium on stress nuclear imaging was associated with a median rate of CAD death or MI of 4.9%/year (interquartile range: 3.75% to 5.3%). For stress echocardiography, ≥3 newly dysfunctional segments portend a median rate of CAD death or MI of 4.5%/year (interquartile range: 3.8% to 5.9%). Although imprecisely delineated, moderate-severe ischemia on cardiac magnetic resonance may be indicated by ≥4 of 32 stress perfusion defects or ≥3 dobutamine-induced dysfunctional segments. Risk-based thresholds can define equivalent amounts of ischemia across the stress imaging modalities, which will help to translate a common understanding of patient risk on which to guide subsequent management decisions.
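The cross-modality equivalence above amounts to a small lookup of risk-based thresholds. A hedged sketch encoding the cut-points quoted in the text; the dictionary layout and function name are illustrative, not from the review itself:

```python
# Modality-specific thresholds quoted above, each corresponding to a
# roughly equivalent CAD death/MI risk (~4-5%/year). Encoding is
# illustrative; names are not from the source review.
ISCHEMIA_THRESHOLDS = {
    "nuclear": 10,          # >=10% ischemic myocardium
    "stress_echo": 3,       # >=3 newly dysfunctional segments
    "cmr_perfusion": 4,     # >=4 of 32 stress perfusion defects
    "cmr_dobutamine": 3,    # >=3 dobutamine-induced dysfunctional segments
}

def meets_risk_threshold(modality, measured_value):
    """True when the measurement meets or exceeds the modality's
    risk-equivalent threshold."""
    return measured_value >= ISCHEMIA_THRESHOLDS[modality]
```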
cardiac imaging; ischemia; prognosis
Examine the relationship between left ventricular mass (LVM) regression and clinical outcomes after transcatheter aortic valve replacement (TAVR).
LVM regression after valve replacement for aortic stenosis (AS) is assumed to be a favorable effect of LV unloading, but its relationship to improved clinical outcomes is unclear.
Of 2115 patients with symptomatic AS at high surgical risk receiving TAVR in the PARTNER randomized trial or continued access registry, 690 had both severe LVH (LVM index [LVMi] ≥149 g/m2 for men, ≥122 g/m2 for women) at baseline and an LVMi measurement 30 days post-TAVR. Clinical outcomes were compared for patients with greater vs. lesser than the median percent change in LVMi between baseline and 30 days, using Cox proportional hazards models to evaluate event rates from 30 to 365 days.
Compared to patients with lesser regression, patients with greater LVMi regression had a similar rate of all-cause mortality (14.1% vs. 14.3%, p=0.99), but a lower rate of rehospitalization (9.5% vs. 18.5%, HR 0.50; 95% CI, 0.32–0.78; p=0.002) and a lower rate of rehospitalizations specifically for heart failure (7.3% vs. 13.6%, p=0.01). The association with a lower rate of hospitalizations was consistent across sub-groups and remained significant after multivariable adjustment (HR 0.53; 95% CI, 0.34–0.84; p=0.007). Patients with greater LVMi regression had lower BNP (p=0.002) and a trend toward better quality of life (p=0.06) at 1 year compared to those with lesser regression.
In high-risk patients with severe AS and severe LVH undergoing TAVR, those with greater early LV mass regression had half the rate of rehospitalization over the subsequent year.
aortic stenosis; transcatheter aortic valve replacement; hypertrophic LV remodeling; hospitalizations; heart failure
Erectile dysfunction (ED) is a major adverse effect of radical prostatectomy (RP). We conducted a randomized controlled trial to examine the efficacy of aerobic training (AT) compared with usual care (UC) on ED prevalence in 50 men (n = 25 per group) after RP. AT consisted of five walking sessions per week at 55–100% of peak oxygen uptake (VO2peak) for 30–60 min per session, following a nonlinear prescription. The primary outcome was change in the prevalence of ED, as measured by the International Index of Erectile Function (IIEF), from baseline to 6 mo. Secondary outcomes were brachial artery flow-mediated dilation (FMD), VO2peak, cardiovascular (CV) risk profile (eg, lipid profile, body composition), and patient-reported outcomes (PROs). The prevalence of ED (IIEF score ≤21) decreased by 20% in the AT group and by 24% in the UC group (difference: p = 0.406). There were no significant between-group differences in any erectile function subscale (p > 0.05). Significant between-group differences were observed for changes in FMD and VO2peak, favoring AT. There were no group differences in other markers of CV risk profile or PROs. In summary, nonlinear AT does not improve ED in men with localized prostate cancer in the acute period following RP.
Exercise training; Prostate cancer; Efficacy; Exercise capacity; Endothelial function
New percutaneous coronary intervention (PCI) device technologies are often rapidly adopted into clinical practice, yet few studies have examined the overall impact of these new technologies on patient outcomes in community practice.
To determine temporal trends in PCI outcomes, we used data from the Centers for Medicare & Medicaid Services' Chronic Condition Warehouse (n = 3,250,836), comparing patient characteristics and rates of 3-year major adverse cardiac events (MACE) across the balloon angioplasty (POBA) era (01/1991-09/1995), the bare metal stent (BMS) era (02/1998-04/2003), and the drug-eluting stent (DES) era (05/2004-10/2006). The adjusted association between era and outcomes was determined with Cox proportional hazards modeling (POBA era as reference).
Compared with the POBA era, patients undergoing PCI were significantly older and had more medical comorbidities, and the risk for 3-year MACE was significantly lower during the BMS and DES eras (BMS vs. POBA adjusted HR [95% CI]: 0.930 [0.926–0.935]; DES vs. BMS: 0.831 [0.827–0.835]). Compared with males, the adjusted risk for 3-year MACE among females was lower during the POBA era, but slightly higher during the BMS and DES eras. Across all three eras, patients ≥75 years of age had higher adjusted risk for MACE compared with younger patients, and the risk for revascularization was lower for both females and older patients.
Despite its application in older and sicker Medicare beneficiaries, there has been a significant decrease in post-PCI MACE over time. The risk for death or myocardial infarction is higher among females and older patients compared with males and younger patients; therefore, future studies should focus on improving clinical outcomes in these high-risk subgroups.
The purpose of this study was to investigate the extent of pre-exercise participation (“preparticipation”) cardiovascular screening in a heterogeneous cohort of adult cancer patients. The patient risk-stratification profile strongly suggests that the use of formalized preparticipation cardiovascular screening is required in all oncology scenarios, but risk of an exercise-induced event is low, suggesting that the use of exercise testing is not required for pre-exercise clearance in the majority of patients.
The purpose of this study was to investigate the extent of pre-exercise participation (“preparticipation”) health screening in a heterogeneous cohort of adult cancer patients.
Patients (n = 413) with histologically confirmed solid or hematologic malignancy were categorized into preparticipation health screening risk stratification based on American College of Sports Medicine (ACSM) recommendations. Risk of an exercise-related event was evaluated during a symptom-limited cardiopulmonary exercise test (CPET) with 12-lead electrocardiography (ECG).
Participant risk was categorized as low risk (n = 59, 14%), moderate risk (n = 217, 53%), and high risk (n = 137, 33%). Mean peak oxygen consumption was 21.7 ± 6.7 mL·kg−1·min−1, or 19.5 ± 21.7% below age- and sex-predicted sedentary values. No major serious adverse events or fatal events were observed during CPET procedures. A total of 31 positive ECG tests were observed, for an event rate of 8%. ACSM risk stratification did not predict the risk of a positive test. Age, statin use, antiplatelet therapy use, cardiovascular disease, prior treatment with anthracycline or radiation therapy, and being sedentary were predictors of a positive test (all p < .10).
The patient risk-stratification profile strongly suggests that the use of formalized preparticipation health screening is required in all oncology scenarios; however, risk of an exercise-induced event is low, suggesting that the use of exercise testing is not required for pre-exercise clearance in the majority of patients.
Exercise tolerance testing; Safety; Pre-exercise clearance; Cancer; Oncology; Adverse events
Cardiac stress testing, particularly with imaging, has been the focus of debates about rising health care costs, inappropriate use, and patient safety in the context of radiation exposure.
To determine whether U.S. trends in cardiac stress test use may be attributable to population shifts in demographics, risk factors, and provider characteristics and evaluate whether racial/ethnic disparities exist in physician decision making.
Analyses of repeated cross-sectional data.
National Ambulatory Medical Care Survey and National Hospital Ambulatory Medical Care Survey (1993 to 2010).
Adults without coronary heart disease.
Cardiac stress test referrals and inappropriate use.
Between 1993 to 1995 and 2008 to 2010, the annual number of U.S. ambulatory visits in which a cardiac stress test was ordered or performed increased from 28 per 10 000 visits to 45 per 10 000 visits. No trend was found toward more frequent testing after adjustment for patient characteristics, risk factors, and provider characteristics (P = 0.134). Cardiac stress tests with imaging comprised a growing portion of all tests, increasing from 59% in 1993 to 1995 to 87% in 2008 to 2010. At least 34.6% were probably inappropriate, with associated annual costs and harms of $501 million and 491 future cases of cancer. Authors found no evidence of a lower likelihood of black patients receiving a cardiac stress test (odds ratio, 0.91 [95% CI, 0.69 to 1.21]) than white patients, although some evidence of disparity in Hispanic patients was found (odds ratio, 0.75 [CI, 0.55 to 1.02]).
Cross-sectional design with limited clinical data.
National growth in cardiac stress test use can largely be explained by population and provider characteristics, but use of imaging cannot. Physician decision making about cardiac stress test use does not seem to contribute to racial/ethnic disparities in cardiovascular disease.
Exercise; Breast cancer; Autonomic function; Cardiovascular disease; Cardiotoxicity
The recent Chu et al. (2012) manuscript discusses two key findings regarding feature selection (FS): (1) data-driven FS was no better than using whole-brain voxel data, and (2) a priori biological knowledge was effective in guiding FS. Use of FS is highly relevant in neuroimaging-based machine learning, as the number of attributes can greatly exceed the number of exemplars. We strongly endorse their demonstration of both of these findings, and we provide additional practical and theoretical arguments as to why, in their case, the data-driven FS methods they implemented did not result in improved accuracy. Further, we emphasize that the data-driven FS methods they tested performed approximately as well as the all-voxel case. We discuss why a sparse model may be favored over a complex one with similar performance. We caution readers that the findings in the Chu et al. report should not be generalized to all data-driven FS methods.
Feature selection; Machine learning; Neuroimaging
The application of machine learning to epilepsy can be used both to develop clinically useful computer-aided diagnostic tools, and to reveal pathologically relevant insights into the disease. Such studies most frequently use neurologically normal patients as the control group to maximize the pathologic insight yielded from the model. This practice yields potentially inflated accuracy because the groups are quite dissimilar. A few manuscripts, however, opt to mimic the clinical comparison of epilepsy to non-epileptic seizures, an approach we believe to be more clinically realistic. In this manuscript, we describe the relative merits of each control group. We demonstrate that in our clinical quality FDG-PET database the performance achieved was similar using each control group. Based on these results, we find that the choice of control group likely does not hinder the reported performance. We argue that clinically applicable computer-aided diagnostic tools for epilepsy must directly address the clinical challenge of distinguishing patients with epilepsy from those with non-epileptic seizures.
controls; FDG-PET; epilepsy; non-epileptic seizures; machine learning; neuroimaging
Exercise stress testing is commonly obtained after percutaneous coronary intervention (PCI) performed for acute coronary syndromes (ACS). We compared exercise echocardiography with nuclear stress testing after ACS-related PCI with respect to outcomes and resource use.
Longitudinal observational study using fee-for-service Medicare claims to identify patients undergoing outpatient exercise stress testing with imaging within 15 months after PCI performed for ACS between 2003 and 2004.
Of 63,100 patients undergoing stress testing 3 to 15 months post-PCI, 31,731 (50.3%) underwent an exercise stress test with imaging. Among 29,279 patients undergoing exercise stress testing with imaging, 15.5% received echocardiography. Echocardiography recipients had higher rates of repeat stress testing (adjusted hazard ratio [HR] 2.60, CI 2.19–3.10) than nuclear imaging recipients in the 90 days after testing, but lower rates of revascularization (adjusted HR 0.87, CI 0.76–0.98) and coronary angiography (adjusted HR 0.88, CI 0.80–0.97). None of these differences persisted beyond 90 days after stress testing. Rates of death and readmission for myocardial infarction were similar. Total Medicare payments were initially lower after echocardiography (incremental difference $498, CI 488–507), an effect attributed primarily to lower reimbursement for the stress test itself, but were not significantly different by 14 months after testing.
In this study using administrative data, echocardiography recipients initially had fewer invasive procedures but higher rates of repeat testing than nuclear testing recipients. However, these differences between echo and nuclear testing did not persist over longer time frames.
Aerobic exercise training (AET) is an effective adjunct therapy to attenuate the adverse side-effects of adjuvant chemotherapy in women with early breast cancer. Whether AET interacts with the antitumor efficacy of chemotherapy has received scant attention. We carried out a pilot study to explore the effects of AET in combination with neoadjuvant doxorubicin–cyclophosphamide (AC+AET), relative to AC alone, on: (i) host physiology [exercise capacity (VO2 peak), brachial artery flow-mediated dilation (BA-FMD)], (ii) host-related circulating factors [circulating endothelial progenitor cells (CEPs), cytokines and angiogenic factors (CAFs)], and (iii) tumor phenotype [tumor blood flow (15O–water PET), tissue markers (hypoxia and proliferation), and gene expression] in 20 women with operable breast cancer. AET consisted of three supervised cycle ergometry sessions/week at 60% to 100% of VO2 peak, 30 to 45 min/session, for 12 weeks. There were significant time × group interactions for VO2 peak and BA-FMD, favoring the AC+AET group (P < 0.001 and P = 0.07, respectively). These changes were accompanied by significant time × group interactions in CEPs and select CAFs [placenta growth factor, interleukin (IL)-1β, and IL-2], also favoring the AC+AET group (P < 0.05). 15O–water PET imaging revealed a 38% decrease in tumor blood flow in the AC+AET group. There were no differences in any tumor tissue markers (P > 0.05). Whole-genome microarray tumor analysis revealed significant differential modulation of 57 pathways (P < 0.01), including many that converge on NF-κB. Data from this exploratory study provide initial evidence that AET can modulate several host- and tumor-related pathways during standard chemotherapy. The biologic and clinical implications remain to be determined.
To compare echocardiographic findings in patients with critical aortic stenosis following surgical (SAVR) or transcatheter aortic valve replacement (TAVR).
The Placement of Aortic Transcatheter Valves trial randomized patients 1:1 to SAVR or TAVR.
Echocardiograms were obtained at baseline, discharge, 30 days, 6 months, 1 year, and 2 years post-procedure and analyzed in a core laboratory. For the analysis of post-implant variables, the first interpretable study (≤6 months) was used.
Both groups showed a decrease in aortic valve gradients and an increase in effective orifice area (EOA) (p < 0.0001), which remained stable over 2 years. Compared to SAVR, TAVR resulted in: larger indexed EOA (p = 0.038), less prosthesis-patient mismatch (p = 0.019), and more total and paravalvular aortic regurgitation (AR) (p < 0.0001). Baseline echocardiographic univariate predictors of death were: lower peak transaortic gradient in TAVR patients; low left ventricular diastolic volume (LVDV), low stroke volume, and greater severity of mitral regurgitation in SAVR patients. Post-implantation echocardiographic univariate predictors of death were: larger LVDV, left ventricular systolic volume (LVSV), and EOA, decreased ejection fraction, and greater AR in TAVR patients; smaller LVSV and LVDV, low stroke volume, smaller EOA, and prosthesis-patient mismatch in SAVR patients.
Patients randomized to either SAVR or TAVR experience enduring, significant reductions in transaortic gradients and increases in EOA. Compared to SAVR, TAVR patients had a higher indexed EOA, less prosthesis-patient mismatch, and more AR. Univariate predictors of death differed between the TAVR and SAVR groups and may allow future refinement of patient selection.
transcatheter aortic valve replacement; aortic stenosis; echocardiography
We examined cardiorespiratory fitness (CRF) levels in early-stage breast cancer patients and determined whether CRF differs as a function of adjuvant therapy regimen.
Patients and methods
A total of 180 early breast cancer patients representing three treatment groups (surgery only, single-modality adjuvant therapy, and multi-modality adjuvant therapy) in the Cooper Center Longitudinal Study (CCLS) were studied. A non-cancer control group (n=180) matched by sex, age, and date of the CCLS visit was included. All subjects underwent an incremental exercise tolerance test to symptom limitation to assess CRF (i.e., peak METs and time to exhaustion).
The mean time from breast cancer diagnosis to exercise tolerance testing was 7.4 ± 6.2 years. In adjusted analyses, time to exhaustion and peak METs were incrementally impaired with the addition of surgery, single-modality, and multi-modality adjuvant therapy compared to matched controls (p=0.006 and p=0.028, respectively). CRF was lower in the multi-modality group than in all other groups (all p<0.05).
Despite being approximately seven years post-diagnosis, asymptomatic early breast cancer survivors have marked reductions in CRF. Patients treated with multi-modality adjuvant therapy have the greatest impairment in CRF.
cardiorespiratory fitness; cardiovascular risk; adjuvant therapy; breast cancer
To determine diagnostic testing patterns after percutaneous coronary intervention (PCI).
Little is known about patterns of diagnostic testing after PCI in the U.S. or the relationship of these patterns with clinical outcomes.
We linked Centers for Medicare & Medicaid Services inpatient and outpatient claims to the National Cardiovascular Data Registry® CathPCI Registry® data from 2005–2007. Hospital quartiles of the cumulative incidence of diagnostic testing use within 12 and 24 months post-PCI were compared for patient characteristics, repeat revascularization, acute myocardial infarction (AMI), and death.
A total of 247,052 patients underwent PCI at 656 institutions. Patient and site characteristics were similar across testing use quartiles. The adjusted risk of repeat revascularization was 9% and 20% higher in Quartile 3 and Quartile 4 (highest testing rate), respectively, than in Quartile 1 (lowest testing rate) (p=0.020 and p<0.0001, respectively). The adjusted risk of death or AMI did not differ among quartiles.
While patient characteristics were largely independent of rates of post-PCI testing, higher testing rates were not associated with lower risks of myocardial infarction or death, and repeat revascularization was significantly higher at high-testing sites. Additional studies should examine whether increased testing is a marker of improved quality of post-PCI care or simply of increased healthcare utilization.
stress testing; diagnostic catheterization; site-level patterns; patient outcomes
To assess the prognostic and predictive value of bevacizumab-related hypertension, the association between hypertension and efficacy outcomes was analyzed in seven company-sponsored placebo-controlled phase III studies of bevacizumab. Early treatment-related BP increases were not found to have general prognostic importance for patients with advanced cancer.
Hypertension is associated with anti-vascular endothelial growth factor treatment, but its clinical implications are uncertain. To assess the prognostic and predictive value of bevacizumab-related hypertension, a comprehensive analysis of the association between hypertension and efficacy outcomes was conducted across seven company-sponsored placebo-controlled phase III studies of bevacizumab.
Patient-specific data were available from 6,486 patients with metastatic colorectal, breast, non-small cell lung, pancreatic, and renal cell cancers. The primary hypertension endpoint was a blood pressure (BP) increase of >20 mmHg systolic or >10 mmHg diastolic within the first 60 days of treatment. Additional endpoints included other predefined thresholds of change in BP and severity of hypertension graded using the National Cancer Institute's Common Terminology Criteria for Adverse Events. To analyze the general prognostic importance of an early BP increase, multivariate Cox regression models were used to assess the correlation between BP changes and progression-free survival (PFS) and overall survival (OS) outcomes in the control groups. To analyze whether early BP increases could predict benefit from bevacizumab, similar analyses were conducted in the bevacizumab-treated and control groups.
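The primary hypertension endpoint described above reduces to a simple classification rule. The sketch below is illustrative only (the function name and data are ours, not from the study's analysis code): it flags a patient whose BP within the first 60 days of treatment rises >20 mmHg systolic or >10 mmHg diastolic over baseline.

```python
def early_bp_increase(baseline, readings, window_days=60,
                      sys_delta=20.0, dia_delta=10.0):
    """Return True if any BP reading within `window_days` of starting
    treatment shows a systolic rise > sys_delta mmHg or a diastolic rise
    > dia_delta mmHg over baseline (the primary hypertension endpoint).
    `baseline` is (systolic, diastolic); `readings` is a list of
    (day, systolic, diastolic) tuples."""
    base_sys, base_dia = baseline
    for day, sys_bp, dia_bp in readings:
        if day <= window_days and (sys_bp - base_sys > sys_delta
                                   or dia_bp - base_dia > dia_delta):
            return True
    return False

# A 25 mmHg systolic rise on day 30 meets the endpoint...
print(early_bp_increase((120, 80), [(30, 145, 82)]))  # True
# ...but the same rise on day 90 falls outside the 60-day window.
print(early_bp_increase((120, 80), [(90, 145, 82)]))  # False
```

The resulting binary indicator is what would then enter the Cox models as a covariate, separately in the control arm (prognostic question) and across arms (predictive question).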
In six of seven studies, early BP increase was neither predictive of clinical benefit from bevacizumab nor prognostic for the course of the disease. For study AVF2107g, early increased BP was associated with longer PFS and OS times in the bevacizumab group but shorter OS time in the control group.
Early treatment-related BP increases do not predict clinical benefit from bevacizumab based on PFS or OS outcomes. BP increases do not appear to have general prognostic importance for patients with advanced cancer.
Bevacizumab; Hypertension; Clinical outcome; Prognostic factor; Predictive factor