Despite the growing literature on co-morbidity risks in psoriasis, there remains a critical knowledge gap on the degree to which objectively measured psoriasis severity may affect the prevalence of major medical co-morbidities.
To examine the prevalence of major medical co-morbidities in patients with mild, moderate, and severe psoriasis, classified objectively based on body surface area involvement, compared to patients without psoriasis.
Population-based, cross-sectional study.
United Kingdom-based electronic medical records.
9,035 patients aged 25 to 64 years with psoriasis and 90,350 age- and practice-matched patients without psoriasis.
Psoriasis diagnosis and severity, based on body surface area involvement, as determined by provider-based questionnaires.
Main Outcomes and Measures
Prevalence of major co-morbidities comprising the Charlson co-morbidity index.
Among patients with psoriasis, 51.8%, 35.8%, and 12.4% had mild, moderate, and severe disease, respectively. The mean Charlson co-morbidity index was higher in patients with mild (0.375 vs. 0.347), moderate (0.398 vs. 0.342), and severe psoriasis (0.450 vs. 0.348) than in their respective controls (each p < 0.05). Psoriasis overall was associated with a higher prevalence of chronic pulmonary disease (adjusted odds ratio 1.08; 95% CI, 1.02–1.15), diabetes (1.22; 1.11–1.35), diabetes with systemic complications (1.34; 1.11–1.62), mild liver disease (1.41; 1.12–1.76), myocardial infarction (1.34; 1.07–1.69), peptic ulcer disease (1.27; 1.03–1.58), peripheral vascular disease (1.38; 1.07–1.77), renal disease (1.28; 1.11–1.48), and rheumatologic disease (2.04; 1.71–2.42). Trend analysis revealed significant associations between psoriasis severity and each of the above co-morbidities (each p < 0.05).
Conclusions and Relevance
The burdens of overall medical co-morbidity and of specific co-morbid diseases are greater among psoriasis patients with increasing disease severity. Physicians should be aware of these associations in providing comprehensive care to patients with psoriasis, especially those presenting with more severe disease.
The aim of this study was to compare the accuracy of genetic tables and formal pharmacogenetic algorithms for warfarin dosing.
Pharmacogenetic algorithms based on regression equations can predict warfarin dose, but they require detailed mathematical calculations. A simpler alternative, recently added to the warfarin label by the U.S. Food and Drug Administration, is to use genotype-stratified tables to estimate warfarin dose. Such a table may increase the use of pharmacogenetic warfarin dosing in clinical practice; however, its accuracy has not been quantified.
A retrospective cohort study of 1,378 patients from 3 anticoagulation centers was conducted. Inclusion criteria were stable therapeutic warfarin dose and complete genetic and clinical data. Five dose prediction methods were compared: 2 methods using only clinical information (empiric 5 mg/day dosing and a formal clinical algorithm), 2 genetic tables (the new warfarin label table and a table based on mean dose stratified by genotype), and 1 formal pharmacogenetic algorithm, using both clinical and genetic information. For each method, the proportion of patients whose predicted doses were within 20% of their actual therapeutic doses was determined. Dosing methods were compared using McNemar’s chi-square test.
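The accuracy comparison described above, the proportion of predicted doses falling within 20% of the actual therapeutic dose, compared between paired methods with McNemar's chi-square test, can be sketched in a few lines. This is an illustrative, standard-library-only implementation on synthetic data, not the study's actual analysis code; the function names and the omission of a continuity correction are assumptions.

```python
import math

def within_20(predicted, actual):
    """True if the predicted dose is within 20% of the actual therapeutic dose."""
    return abs(predicted - actual) <= 0.20 * actual

def mcnemar(correct_a, correct_b):
    """McNemar's chi-square (1 df, no continuity correction) comparing paired
    accurate/inaccurate classifications from two dose-prediction methods."""
    b = sum(1 for x, y in zip(correct_a, correct_b) if x and not y)
    c = sum(1 for x, y in zip(correct_a, correct_b) if not x and y)
    if b + c == 0:
        return 0.0, 1.0  # no discordant pairs: methods indistinguishable
    chi2 = (b - c) ** 2 / (b + c)
    # survival function of a chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p
```

For example, with 8 discordant pairs favoring one method and 2 favoring the other, the statistic is 3.6 and p is roughly 0.06.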
Warfarin dose prediction was significantly more accurate (all p < 0.001) with the pharmacogenetic algorithm (52%) than with all other methods: empiric dosing (37%; odds ratio [OR]: 2.2), clinical algorithm (39%; OR: 2.2), warfarin label (43%; OR: 1.8), and genotype mean dose table (44%; OR: 1.9).
Although genetic tables predicted warfarin dose better than empiric dosing, formal pharmacogenetic algorithms were the most accurate.
coumarins; dose prediction; dosing algorithms; FDA label; genetic tables; pharmacogenetics; warfarin
The long-term durability and prognostic significance of improvement in renal function after mechanical circulatory support (MCS) have yet to be characterized in a large multicenter population. The primary goals of this analysis were to describe serial post-MCS changes in estimated glomerular filtration rate (eGFR) and to determine their association with all-cause mortality.
Methods and Results
Adult patients enrolled in the Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) with serial creatinine levels available (n=3363) were studied. Early post-MCS, eGFR improved substantially (median improvement, 48.9%; P<0.001) with 22.3% of the population improving their eGFR by ≥100% within the first few weeks. However, in the majority of patients, this improvement was transient, and by 1 year, eGFR was only 6.7% above the pre-MCS value (P<0.001). This pattern of early improvement followed by deterioration in eGFR was observed with both pulsatile and continuous-flow devices. Interestingly, poor survival was associated with both marked improvement (adjusted hazard ratio [HR], 1.64; 95% confidence interval [CI], 1.19–2.26; P=0.002) and worsening in eGFR (adjusted HR, 1.63; 95% CI, 1.15–2.13; P=0.004).
Post-MCS, early improvement in renal function is common but seems to be largely transient and not necessarily indicative of an improved prognosis. This pattern was observed with both pulsatile and continuous-flow devices. Additional research is necessary to better understand the mechanistic basis for these complex post-MCS changes in renal function and their associated survival disadvantage.
Clinical Trial Registration
URL: http://www.clinicaltrials.gov. Unique identifier: NCT00119834.
cardio-renal syndrome; heart failure; heart-assist devices; transplantation
Identifying reversible renal dysfunction (RD) in the setting of heart failure is challenging. The goal of this study was to evaluate whether elevated admission blood urea nitrogen/creatinine ratio (BUN/Cr) could identify decompensated heart failure patients likely to experience improvement in renal function (IRF) with treatment.
Methods and Results
Consecutive hospitalizations with a discharge diagnosis of heart failure were reviewed. IRF was defined as ≥20% increase and worsening renal function as ≥20% decrease in estimated glomerular filtration rate. IRF occurred in 31% of the 896 patients meeting eligibility criteria. Higher admission BUN/Cr was associated with in-hospital IRF (odds ratio, 1.5 per 10 increase; 95% confidence interval [CI], 1.3–1.8; P<0.001), an association persisting after adjustment for baseline characteristics (odds ratio, 1.4; 95% CI, 1.1–1.8; P=0.004). However, higher admission BUN/Cr was also associated with post-discharge worsening renal function (odds ratio, 1.4; 95% CI, 1.1–1.8; P=0.011). Notably, in patients with an elevated admission BUN/Cr, the risk of death associated with RD (estimated glomerular filtration rate <45) was substantial (hazard ratio, 2.2; 95% CI, 1.6–3.1; P<0.001). However, in patients with a normal admission BUN/Cr, RD was not associated with increased mortality (hazard ratio, 1.2; 95% CI, 0.67–2.0; P=0.59; P interaction=0.03).
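The renal-function categories used above (IRF as a ≥20% increase and worsening renal function as a ≥20% decrease in estimated GFR) amount to a simple threshold classifier. A minimal sketch, with function and label names chosen for illustration rather than taken from the study:

```python
def classify_renal_change(egfr_baseline, egfr_followup, threshold=0.20):
    """Label the change in estimated GFR between two time points.

    IRF = improvement in renal function (>= 20% increase in eGFR)
    WRF = worsening renal function      (>= 20% decrease in eGFR)
    """
    if egfr_baseline <= 0:
        raise ValueError("baseline eGFR must be positive")
    change = (egfr_followup - egfr_baseline) / egfr_baseline
    if change >= threshold:
        return "IRF"
    if change <= -threshold:
        return "WRF"
    return "stable"
```

For example, `classify_renal_change(50, 65)` returns "IRF" (a 30% rise), while `classify_renal_change(50, 38)` returns "WRF" (a 24% fall).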
An elevated admission BUN/Cr identifies decompensated patients with heart failure likely to experience IRF with treatment, providing proof of concept that reversible RD may be a discernible entity. However, this improvement seems to be largely transient, and RD, in the setting of an elevated BUN/Cr, remains strongly associated with death. Further research is warranted to develop strategies for the optimal detection and treatment of these high-risk patients.
cardiorenal syndrome; heart failure; mortality
The clinical utility of genotype-guided (pharmacogenetically based) dosing of warfarin has been tested only in small clinical trials or observational studies, with equivocal results.
We randomly assigned 1015 patients to receive doses of warfarin during the first 5 days of therapy that were determined according to a dosing algorithm that included both clinical variables and genotype data or to one that included clinical variables only. All patients and clinicians were unaware of the dose of warfarin during the first 4 weeks of therapy. The primary outcome was the percentage of time that the international normalized ratio (INR) was in the therapeutic range from day 4 or 5 through day 28 of therapy.
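The primary outcome, percentage of time with the INR in the therapeutic range, is conventionally computed by Rosendaal-style linear interpolation between successive INR measurements. The trial's exact computation is not detailed in this abstract; the sketch below assumes linear INR change between visits and a 2.0 to 3.0 target range.

```python
def time_in_range(times, inrs, low=2.0, high=3.0):
    """Fraction of follow-up time with INR in [low, high], assuming the INR
    varies linearly between consecutive measurements (Rosendaal method)."""
    assert len(times) == len(inrs) >= 2
    total = in_range = 0.0
    for t0, i0, t1, i1 in zip(times, inrs, times[1:], inrs[1:]):
        dt = t1 - t0
        if i0 == i1:
            frac = 1.0 if low <= i0 <= high else 0.0
        else:
            lo, hi = min(i0, i1), max(i0, i1)
            # because INR is linear in time, the time fraction in range equals
            # the overlap of [lo, hi] with [low, high] divided by the INR span
            frac = max(0.0, min(hi, high) - max(lo, low)) / (hi - lo)
        total += dt
        in_range += dt * frac
    return in_range / total
```

For measurements of 1.5 on day 0 and 3.5 on day 10, half of the interpolated values fall between 2 and 3, giving a time in therapeutic range of 50%.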
At 4 weeks, the mean percentage of time in the therapeutic range was 45.2% in the genotype-guided group and 45.4% in the clinically guided group (adjusted mean difference [genotype-guided group minus clinically guided group], −0.2 percentage points; 95% confidence interval, −3.4 to 3.1; P=0.91). There was also no significant between-group difference among patients with a predicted dose difference between the two algorithms of 1 mg per day or more. There was, however, a significant interaction between dosing strategy and race (P=0.003). Among black patients, the mean percentage of time in the therapeutic range was less in the genotype-guided group than in the clinically guided group. The rates of the combined outcome of any INR of 4 or more, major bleeding, or thromboembolism did not differ significantly according to dosing strategy.
Genotype-guided dosing of warfarin did not improve anticoagulation control during the first 4 weeks of therapy. (Funded by the National Heart, Lung, and Blood Institute and others; COAG ClinicalTrials.gov number, NCT00839657.)
Warfarin therapy has been used clinically for over 60 years, yet continues to be problematic because of its narrow therapeutic index and large inter-individual variability in patient response. As a result, warfarin is a leading cause of serious medication-related adverse events, and its efficacy is also suboptimal.
To review factors that are responsible for variable response to warfarin, including clinical, environmental, and genetic factors, and to explore some possible approaches to improving warfarin therapy.
Recent efforts have focused on developing dosing algorithms that incorporate genetic information to improve warfarin dosing. These dosing algorithms hold promise but have not been fully validated or tested in rigorous clinical trials. Perhaps equally important, adherence to warfarin is a major problem that should be addressed with innovative and cost-effective interventions.
Additional research is needed to further test whether interventions can be used to improve warfarin dosing and outcomes.
adherence; pharmacogenetics; prediction; warfarin
Digitalis glycosides are known to improve the hemodynamic and neurohormonal perturbations that contribute to heart failure (HF)-induced renal dysfunction (RD). The objective of this study was to determine whether randomization to digoxin is associated with improvement in renal function (IRF) and to evaluate whether patients with digoxin-induced IRF have improved clinical outcomes.
Methods and Results
Patients in the Digitalis Investigation Group (DIG) dataset with protocol-driven 12-month serum creatinine levels (performed in a central laboratory, n=980) were studied. IRF was defined as a post-randomization ≥20% increase in estimated glomerular filtration rate (eGFR). IRF occurred in 15.5% of the population (mean improvement in eGFR 34.5 ± 15.4%) and was more common in patients randomized to digoxin (adjusted OR=1.6, p=0.02). In patients without IRF, digoxin was not associated with reduced death or hospitalization (adjusted HR=0.96, 95% CI 0.8–1.2, p=0.67). However, in the group with IRF, digoxin was associated with substantially improved hospitalization-free survival (adjusted HR=0.49, 95% CI 0.3–0.8, p=0.006, p interaction=0.026).
In this subset of the DIG trial, digoxin was associated with long-term improvement in kidney function and, in patients demonstrating this favorable renal response, a reduction in death or hospitalization. Additional research is necessary to confirm these hypothesis-generating findings.
Cardio-renal syndrome; improved renal function; digoxin; mortality
The purpose of this study was to investigate whether a surrogate for renal neurohormonal activation, blood urea nitrogen (BUN), could identify patients destined to experience adverse outcomes associated with the use of high-dose loop diuretics (HDLD).
Loop diuretics are commonly used to control congestive symptoms in heart failure; however, these agents cause neurohormonal activation and are associated with worsened survival.
Subjects in the Beta-Blocker Evaluation of Survival Trial receiving loop diuretics at baseline were analyzed (n=2456). The primary outcome was the interaction between BUN and HDLD-associated mortality.
In the overall cohort, HDLD use (≥160 mg/day) was associated with increased mortality (HR=1.56, 95% CI 1.35 to 1.80). However, after extensively controlling for baseline characteristics, this association did not persist (HR=1.06, 95% CI 0.89 to 1.25). In subjects with BUN levels above the median (21.0 mg/dl), both the unadjusted (HR=1.59, 95% CI 1.34 to 1.88) and adjusted (HR=1.29, 95% CI 1.07 to 1.60) risks for death were higher in the HDLD group. In patients with BUN levels below the median, HDLD use carried no associated risk (HR=0.99, 95% CI 0.75 to 1.34), and after controlling for baseline characteristics, the HDLD group had significantly improved survival (HR=0.71, 95% CI 0.49 to 0.96) (p interaction=0.018).
The risk associated with HDLD use is strongly dependent on BUN concentrations with reduced survival in patients with elevated BUN and improved survival in patients with normal BUN. These data suggest a role for neurohormonal activation in loop diuretic associated mortality.
Congestive heart failure; Loop diuretics; Kidney; Mortality
Proton pump inhibitors (PPIs) and corticosteroids are commonly prescribed drugs; however, each has been associated with fracture and community acquired pneumonia. How physicians select patients for co-therapy may have implications for potential additive or synergistic toxicities.
We conducted a retrospective cohort study of 13,749 incident corticosteroid users with no prior PPI exposure using the HealthCore Integrated Research Database℠. We used logistic regression to evaluate the association between PPI initiation in the first 30 days of steroid therapy and corticosteroid dose, clinical risk factors including co-morbid diseases, and medication use including prescription nonsteroidal anti-inflammatory drugs (NSAIDs).
1,050 (7.6%) patients filled a new PPI prescription within 30 days of starting corticosteroids. PPI use was associated with the number of baseline co-morbid conditions (OR 1.21 for each additional condition, CI 1.13–1.28), recent hospitalization (OR 4.71, CI 4.02–5.52), prednisone dose above 40 mg/day (OR 1.87, CI 1.45–2.41), history of gastroesophageal reflux or gastric ulcer disease (OR 1.54, CI 1.24–1.91), renal insufficiency (OR 2.06, CI 1.73–2.46), and liver disease (OR 1.82, CI 1.45–2.28). Concomitant use of prescription NSAIDs was also associated with PPI use (OR 1.89, CI 1.32–2.70); however, the total use of PPIs in this group was low (6.3%, CI 4.4–8.2%).
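The adjusted odds ratios above come from logistic regression; for a single binary factor, the unadjusted analogue reduces to the cross-product ratio of a 2x2 table with a Wald confidence interval on the log scale. A standard-library-only sketch on synthetic counts (the cell values and argument names are illustrative, not the study's data):

```python
import math

def odds_ratio_ci(exposed_cases, exposed_noncases,
                  unexposed_cases, unexposed_noncases, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from 2x2 cell counts.

    SE of log(OR) = sqrt(1/a + 1/b + 1/c + 1/d); CI is exponentiated
    back from the log scale.
    """
    a, b = exposed_cases, exposed_noncases
    c, d = unexposed_cases, unexposed_noncases
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

With 20 of 100 exposed and 10 of 100 unexposed subjects having the outcome, the odds ratio is 2.25 with a wide CI that just includes 1, which is why the study's multivariable adjustment matters for the moderate associations reported.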
Overall, PPI therapy among corticosteroid users was uncommon, even among those with risk factors for gastrointestinal toxicity. PPI use was significantly more common among patients who had recently been hospitalized, had a greater burden of co-morbid illness, or were receiving high daily doses of corticosteroids.
proton pump inhibitors; corticosteroids; gastroprotection; adverse events
Poor adherence to efficacious cardiovascular medications has led to considerable morbidity, mortality, and avoidable health care costs. This paper provides the results of a recent think tank meeting in which stakeholder groups, including consumers, community health providers, academic experts, decision-making government officials (FDA, NIH, etc.), and industry scientists, met to evaluate the current status of medication adherence and provide recommendations for improving outcomes. Below, we review the magnitude of the problem of medication non-adherence, including its prevalence, impact, and cost. We then summarize proven effective approaches and conclude with a discussion of recommendations to address this growing and significant public health issue.
Antipsychotic drugs have been linked to QT-interval prolongation, a presumed marker of cardiac risk, and torsade de pointes.
To examine the associations between antipsychotics and 1) outpatient-originated sudden cardiac death and ventricular arrhythmia (SD/VA) and 2) all-cause death.
Two retrospective cohort studies.
Medicaid programs of California, Florida, New York, Ohio, and Pennsylvania.
Incident antipsychotic users aged 30–75 years.
Main Outcome Measures
1) Incident, first-listed emergency department or principal inpatient SD/VA diagnoses; and 2) death reported in the Social Security Administration Death Master File.
Among 459,614 incident antipsychotic users, the incidences of SD/VA and death were 3.4 and 35.1 per 1,000 person-years, respectively. Compared to olanzapine as the referent, adjusted hazard ratios (HRs) for SD/VA were 2.06 (95% CI, 1.20–3.53) for chlorpromazine, 1.72 (1.28–2.31) for haloperidol, and 0.73 (0.57–0.93) for quetiapine. Adjusted HRs for perphenazine and risperidone were consistent with unity. In a subanalysis limited to first prescription exposures, HRs for chlorpromazine and haloperidol were further elevated (2.54 [1.07–5.99] and 2.68 [1.59–4.53], respectively), with the latter exhibiting a dose-response relationship. Results for death were similar.
Haloperidol and chlorpromazine had less favorable cardiac safety profiles than olanzapine. Among atypical agents, risperidone had a similar cardiac safety profile to olanzapine, whereas quetiapine was associated with 30% and 20% lower risks of SD/VA and death, respectively, compared to olanzapine. These measured risks do not correlate well with average QT prolongation, further supporting the notion that average QT prolongation may be a poor surrogate of antipsychotic arrhythmogenicity.
Antipsychotic agents; Cardiac arrhythmias; Cohort studies; Death; International classification of diseases; Medicaid; Pharmacoepidemiology; Proportional hazards models; Sudden death; Torsades de pointes
To compare the incidence rates of serious cardiovascular events in adult initiators of amphetamines or atomoxetine to rates in non-users.
This was a retrospective cohort study of new users of amphetamines (n = 38,586) or atomoxetine (n = 20,995). Each medication user was matched to up to four non-users on age, gender, data source, and state (n = 238,183). The primary outcomes of interest were: 1) sudden death or ventricular arrhythmia, 2) stroke, 3) myocardial infarction, and 4) a composite endpoint of stroke or myocardial infarction. Cox proportional hazards regression was used to calculate propensity-adjusted hazard ratios for amphetamines versus matched non-users and for atomoxetine versus matched non-users, with intracluster dependence within matched sets accounted for using a robust sandwich estimator.
The propensity-score adjusted hazard ratio for amphetamines use versus non-use was 1.18 (95% CI: 0.55–2.54) for sudden death/ventricular arrhythmia, 0.80 (95% CI: 0.44–1.47) for stroke, 0.75 (95% CI: 0.42–1.35) for myocardial infarction, and 0.78 (95% CI: 0.51–1.19) for stroke/myocardial infarction. The propensity-score adjusted hazard ratio for atomoxetine use versus non-use was 0.41 (95% CI: 0.10–1.75) for sudden death/ventricular arrhythmia, 1.30 (95% CI: 0.52–3.29) for stroke, 0.56 (95% CI: 0.16–2.00) for myocardial infarction, and 0.92 (95% CI: 0.44–1.92) for stroke/myocardial infarction.
Initiation of amphetamines or atomoxetine was not associated with an elevated risk of serious cardiovascular events. However, some of the confidence intervals do not exclude modest elevated risks, e.g. for sudden death/ventricular arrhythmia.
In the setting of acute decompensated heart failure, worsening renal function (WRF) and improved renal function (IRF) have been associated with similar hemodynamic derangements and poor prognosis. Our aim was to further characterize IRF and its associated mortality risk.
Methods and Results
Consecutive patients with a discharge diagnosis of congestive heart failure at the Hospital of the University of Pennsylvania were reviewed. IRF was defined as a ≥20% improvement and WRF as a ≥20% deterioration in glomerular filtration rate. Overall, 903 patients met eligibility criteria, with 31.4% experiencing IRF. Baseline venous congestion/right-sided cardiac dysfunction was more common (p≤0.04) and volume of diuresis greater (p=0.003) in patients with IRF. IRF was associated with a greater incidence of pre-admission (OR=4.2, 95% CI 2.6–6.7, p<0.0001) and post-discharge (OR=1.8, 95% CI 1.2–2.7, p=0.006) WRF. IRF was associated with increased mortality (adjusted HR=1.3, 95% CI 1.1–1.7, p=0.011), a finding largely restricted to patients with post-discharge recurrence of renal dysfunction (p interaction=0.038).
IRF is associated with significantly worsened survival and may represent the resolution of venous congestion-induced pre-admission WRF. Unlike WRF, the renal dysfunction in IRF patients occurs independently of the confounding effects of acute decongestion and may provide incremental information for the study of cardio-renal interactions.
Cardio-renal syndrome; Worsening renal function; Venous congestion
Renal neurohormonal activation leading to a reduction in glomerular filtration rate (GFR) has been suggested as a mechanism for renal insufficiency (RI) in the setting of heart failure. We hypothesized that RI occurring in the presence of renal neurohormonal activation may be prognostically more important than RI in the absence of renal neurohormonal activation.
Methods and results
Subjects in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial (n = 429), Beta-Blocker Evaluation of Survival Trial (BEST) (n = 2691), and Studies Of Left Ventricular Dysfunction (SOLVD) trial (n = 6782) limited datasets were studied. The blood urea nitrogen to creatinine ratio (BUN/Creatinine) was employed as a surrogate for renal neurohormonal activation and the primary outcome was the interaction between BUN/Creatinine and RI associated mortality. Baseline RI (GFR < 60 mL/min/1.73 m²) was associated with mortality in all study populations (P < 0.001). In patients with higher BUN/Creatinine, the risk of mortality was consistently greater in patients with RI [adjusted hazard ratio (HR) ESCAPE = 2.8, 95% confidence interval (CI) 1.3–14.3, P = 0.019; BEST = 1.6, 95% CI 1.2–2.2, P = 0.002; SOLVD = 1.6, 95% CI 1.3–2.0, P = 0.001]. However, in patients with lower BUN/Creatinine, the risk of mortality was not elevated in patients with RI (adjusted HR ESCAPE = 0.94, 95% CI 0.35–2.4, P = 0.90, P interaction = 0.005; BEST = 0.97, 95% CI 0.64–1.4, P = 0.90, P interaction = 0.02; SOLVD = 1.0, 95% CI 0.8–1.3, P = 0.71, P interaction = 0.005).
The association between RI and poor survival observed in heart failure populations appears to be contingent not simply on the presence of a reduced GFR, but possibly on the mechanism by which GFR is reduced.
Cardio-renal syndrome; Heart Failure; Chronic kidney disease; Neurohormonal activation; Mortality
Worsening renal function (WRF) in the setting of heart failure has been associated with increased mortality. However, it is unclear whether this decreased survival is a direct result of the reduction in glomerular filtration rate (GFR) or whether the mechanism underlying the deterioration in GFR is driving prognosis. Given that WRF in the setting of angiotensin-converting enzyme inhibitor (ACE-I) initiation is likely mechanistically distinct from spontaneously occurring WRF, we sought to compare early WRF-associated mortality in subjects randomized to ACE-I or placebo.
Methods and Results
Subjects in the Studies Of Left Ventricular Dysfunction limited data set were studied (6,377 patients). The interaction between early WRF (a decrease in estimated GFR ≥20% at 14 days), randomization to enalapril, and mortality was the primary endpoint. In the overall population, early WRF was associated with increased mortality (adjusted HR=1.2, 95% CI 1.0–1.4, p=0.037). When analysis was restricted to the placebo group, this association strengthened (adjusted HR=1.4, 95% CI 1.1–1.8, p=0.004). However, in the enalapril group, early WRF had no adverse prognostic significance (adjusted HR=1.0, 95% CI 0.8–1.3, p=1.0, p interaction=0.09). In patients who continued study drug despite early WRF, a survival advantage remained with enalapril therapy (adjusted HR=0.66, 95% CI 0.5–0.9, p=0.018).
These data support the notion that the mechanism underlying WRF is important in determining its prognostic significance. Specifically, early WRF in the setting of ACE-I initiation appears to represent a benign event that is not associated with a loss of benefit from continued ACE-I therapy.
cardio-renal syndrome; worsening renal function; kidney; ACE inhibitor
To examine the association between exposure to antidepressants and emergency department or inpatient admission for sudden cardiac death and ventricular arrhythmia (SD/VA), and to examine the impact of dose and cytochrome P-450 inhibition.
A cohort study was conducted within 1999–2003 Medicaid claims data from beneficiaries of five large states, supplemented with Medicare claims for dually-eligible individuals. Exposures were prescription claims for antidepressants of interest or a reference antidepressant. Outcomes were incident first-listed emergency department or principal inpatient diagnoses indicative of SD/VA originating in the outpatient setting, an outcome previously found to have a positive predictive value of 85%.
In 1.3 million person-years of antidepressant exposure, we identified 4,222 SD/VA outcomes, for a rate of 3.3/1,000 person-years (95% CI, 3.2–3.4). Compared to paroxetine (a referent with a putatively favorable cardiovascular risk profile), adjusted hazard ratios (HRs) were 0.80 (0.67–0.95) for bupropion, 1.24 (0.93–1.65) for doxepin, 0.79 (0.55–1.15) for lithium, and 1.26 (1.11–1.42) for mirtazapine. HRs for amitriptyline, citalopram, fluoxetine, nefazodone, nortriptyline, sertraline, trazodone, and venlafaxine were near unity. For antidepressants having non-null risks (bupropion and mirtazapine), we observed no relationship with antidepressant dose and some relationships with concomitant cytochrome P-450 inhibition.
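The incidence rate reported above (4,222 outcomes over roughly 1.3 million person-years, about 3.3 per 1,000 person-years) follows from a simple Poisson rate with a normal-approximation confidence interval. A sketch using the rounded figures from the abstract; the exact person-time denominator is not given, so the reproduced CI will differ slightly from the published one:

```python
import math

def rate_per_1000_py(events, person_years, z=1.96):
    """Incidence rate per 1,000 person-years with a normal-approximation
    confidence interval (Poisson assumption: var(events) = events)."""
    rate = events / person_years * 1000
    se = math.sqrt(events) / person_years * 1000
    return rate, rate - z * se, rate + z * se
```

`rate_per_1000_py(4222, 1.3e6)` gives roughly 3.25 (3.15 to 3.35), consistent with the reported 3.3 (3.2–3.4) given rounding of the person-time denominator.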
Of antidepressants studied, only mirtazapine had a statistically significantly greater SD/VA risk versus paroxetine. However, baseline differences between these users suggest that this finding may be attributable to residual confounding. Eleven other antidepressants had SD/VA risks no greater than that of paroxetine, thereby providing reassurance regarding the comparative cardiovascular safety of antidepressants.
Antidepressive Agents; Death, Sudden, Cardiac; Arrhythmias, Cardiac; Medicaid; Medicare; Centers for Medicare and Medicaid Services; Cohort Studies; Pharmacoepidemiology
Increasing epidemiological evidence suggests independent associations between psoriasis and cardiovascular and metabolic disease. Our objective was to test the hypothesis that directly-assessed psoriasis severity relates to the prevalence of metabolic syndrome and its components.
Population-based, cross-sectional study using computerized medical records from The Health Improvement Network. The study population included individuals aged 45-65 years with psoriasis and practice-matched controls. Psoriasis diagnosis and extent were determined using provider-based questionnaires. Metabolic syndrome was defined using National Cholesterol Education Program (NCEP) Adult Treatment Panel (ATP) III criteria.
44,715 individuals were included: 4,065 with psoriasis and 40,650 controls. 2,044 participants had mild psoriasis (≤2% body surface area (BSA)), 1,377 had moderate (3-10% BSA), and 475 had severe psoriasis (>10% BSA). Psoriasis was associated with metabolic syndrome, adjusted odds ratio (OR) 1.41 (95% CI 1.31-1.51), varying in a “dose-response” manner, from mild (adj. OR 1.22, 95% CI 1.11-1.35) to severe psoriasis (adj. OR 1.98, 95% CI 1.62-2.43).
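The severity strata above map directly from body surface area involvement. A minimal lookup, illustrating the reported cutoffs; how values between 2% and 3% BSA were binned is not stated in the abstract, so the boundary handling below is an assumption:

```python
def psoriasis_severity(bsa_percent):
    """Severity category from % body surface area involved, per the cutoffs
    reported in the study (mild <=2%, moderate 3-10%, severe >10%)."""
    if bsa_percent < 0 or bsa_percent > 100:
        raise ValueError("BSA must be between 0 and 100 percent")
    if bsa_percent <= 2:
        return "mild"
    if bsa_percent <= 10:
        return "moderate"  # values between 2% and 3% fall here by assumption
    return "severe"
```

For example, 1% BSA maps to "mild", 5% to "moderate", and 15% to "severe", the stratum carrying the highest metabolic syndrome odds ratio above.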
Psoriasis is associated with metabolic syndrome and the association increases with increasing disease severity. Furthermore, associations with obesity, hypertriglyceridemia and hyperglycemia increase with increasing disease severity independent of other metabolic syndrome components. These findings suggest that screening for metabolic disease should be considered for psoriasis, especially when extensive.
One of the primary determinants of blood flow in regional vascular beds is perfusion pressure. Our aim was to investigate if reduction in blood pressure during the treatment of decompensated heart failure would be associated with worsening renal function (WRF). Our secondary aim was to evaluate the prognostic significance of this potentially treatment-induced form of WRF.
Methods and results
Subjects included in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial limited data set were studied (386 patients). Reduction in systolic blood pressure (SBP) was greater in patients experiencing WRF (−10.3 ± 18.5 vs. −2.8 ± 16.0 mmHg, P < 0.001) with larger reductions associated with greater odds for WRF (odds ratio = 1.3 per 10 mmHg reduction, P < 0.001). Systolic blood pressure reduction (relative change > median) was associated with greater doses of in-hospital oral vasodilators (P ≤ 0.017), thiazide diuretic use (P = 0.035), and greater weight reduction (P = 0.023). In patients with SBP-reduction, WRF was not associated with worsened survival [adjusted hazard ratio (HR) = 0.76, P = 0.58]. However, in patients without SBP-reduction, WRF was strongly associated with increased mortality (adjusted HR = 5.3, P < 0.001, P interaction = 0.001).
During the treatment of decompensated heart failure, significant blood pressure reduction is strongly associated with WRF. However, WRF that occurs in the setting of SBP-reduction is not associated with an adverse prognosis, whereas WRF in the absence of this provocation is strongly associated with increased mortality. These data suggest that WRF may represent the final common pathway of several mechanistically distinct processes, each with potentially different prognostic implications.
Cardio-renal syndrome; Worsening renal function; Kidney; Decompensated heart failure; Blood pressure
The objective of this study was to compare the rate of severe cardiovascular events and death in children who use attention-deficit/hyperactivity disorder (ADHD) medications versus nonusers.
Patients and Methods
We performed a large cohort study using data from 2 administrative databases. All children aged 3 to 17 years with a prescription for an amphetamine, atomoxetine, or methylphenidate were included and matched with up to 4 nonusers on the basis of data source, gender, state, and age. Cardiovascular events were validated using medical records. Proportional hazards regression was used to calculate hazard ratios.
We identified 241,417 incident users (primary cohort). No statistically significant difference between incident users and nonusers was observed in the rate of validated sudden death or ventricular arrhythmia (hazard ratio: 1.60 [95% confidence interval (CI): 0.19–13.60]) or all-cause death (hazard ratio: 0.76 [95% CI: 0.52–1.12]). None of the strokes identified during exposed time to ADHD medications were validated. No myocardial infarctions were identified in ADHD medication users. No statistically significant difference between prevalent users and nonusers (secondary cohort) was observed (hazard ratios for validated sudden death or ventricular arrhythmia: 1.43 [95% CI: 0.31–6.61]; stroke: 0.89 [95% CI: 0.11–7.11]; stroke/myocardial infarction: 0.72 [95% CI: 0.09–5.57]; and all-cause death: 0.77 [95% CI: 0.56–1.07]).
The rate of cardiovascular events in exposed children was very low and in general no higher than that in unexposed control subjects. Because of the low number of events, we have limited ability to rule out relative increases in rate.
children; adolescents; amphetamines; atomoxetine; methylphenidate; cardiovascular; death
Worsening renal function (WRF) commonly complicates the treatment of acute decompensated heart failure. Despite considerable investigation in this area, it remains unclear to what degree WRF is a reflection of treatment-related versus patient-related factors. We hypothesized that if WRF is significantly influenced by factors intrinsic to the patient, then WRF during an index hospitalization should predict WRF during a subsequent hospitalization.
Consecutive admissions to the Hospital of the University of Pennsylvania with a discharge diagnosis of congestive heart failure were reviewed. Patients with >1 hospitalization were retained for analysis.
In total 181 hospitalization pairs met the inclusion criteria. Baseline patient characteristics demonstrated significant correlation between hospitalizations (p≤0.002 for all) but minimal association with WRF. In contrast, variables related to the aggressiveness of diuresis were weakly correlated between hospitalizations but significantly associated with WRF (p≤0.024 for all). Consistent with the primary hypothesis, WRF during the index hospitalization was strongly associated with WRF during subsequent hospitalization (OR=2.7, p=0.003). This association was minimally altered after controlling for traditional baseline characteristics (OR=2.5, p=0.006) and in-hospital treatment related parameters (OR=2.8, p=0.005).
A prior history of WRF is strongly associated with subsequent episodes of WRF, independent of in-hospital treatment received. These results suggest that baseline factors intrinsic to the patient’s cardiorenal pathophysiology have substantial influence on the subsequent development of WRF.
By guiding initial warfarin dose, pharmacogenetic (PGx) algorithms may improve the safety of warfarin initiation. However, once the international normalized ratio (INR) response is known, the contribution of PGx to dose refinements is uncertain. This study sought to develop and validate clinical and PGx dosing algorithms for warfarin dose refinement on days 6–11 after therapy initiation.
Materials and Methods
An international sample of 2,022 patients at 13 medical centers on 3 continents provided clinical, INR, and genetic data at treatment days 6–11 to predict therapeutic warfarin dose. Independent derivation and retrospective validation samples were composed by randomly dividing the population (80%/20%). Prior warfarin doses were weighted by their expected effect on S-warfarin concentrations using an exponential-decay pharmacokinetic model. The INR divided by that “effective” dose constituted a treatment response index.
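The dose-weighting step and the treatment response index defined above can be sketched as follows. This is a minimal illustration of exponential-decay weighting, not the study's actual model; the half-life value and the normalization are assumptions.

```python
import math

def effective_dose(prior_doses_mg, half_life_h=30.0):
    """Weight prior daily warfarin doses by their expected remaining
    contribution to S-warfarin concentration, using simple first-order
    exponential decay. `prior_doses_mg` lists daily doses oldest-first;
    the last entry is the most recent dose (~24 h before the INR draw).
    The half-life of 30 h is an illustrative assumption."""
    k = math.log(2) / half_life_h  # elimination rate constant (1/h)
    weighted = sum(dose * math.exp(-k * days_ago * 24)
                   for days_ago, dose in enumerate(reversed(prior_doses_mg), 1))
    # Normalize so a constant daily dose maps back to roughly that dose
    norm = sum(math.exp(-k * d * 24) for d in range(1, len(prior_doses_mg) + 1))
    return weighted / norm

def response_index(inr, prior_doses_mg):
    """INR divided by the 'effective' dose, per the abstract's definition."""
    return inr / effective_dose(prior_doses_mg)
```

With a constant prior dose the effective dose equals that dose, so the index reduces to INR per mg of steady daily dose; uneven dosing shifts weight toward the most recent doses.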
Treatment response index, age, amiodarone, body surface area, warfarin indication, and target INR were associated with dose in the derivation sample. A clinical algorithm based on these factors was remarkably accurate: in the retrospective validation cohort its R2 was 61.2% and its median absolute error (MAE) was 5.0 mg/week. Accuracy and safety were confirmed in a prospective cohort (N=43). CYP2C9 variants and VKORC1-1639 G→A were significant dose predictors in both the derivation and validation samples. In the retrospective validation cohort, the PGx algorithm had an R2 of 69.1% (P<0.05 vs. the clinical algorithm) and an MAE of 4.7 mg/week.
A pharmacogenetic warfarin dose-refinement algorithm based on clinical, INR, and genetic factors explained 69.1% of therapeutic warfarin dose variability after about one week of therapy.
warfarin; VKORC1; CYP2C9; pharmacogenetic
Rofecoxib has been proposed to increase the risk of myocardial infarction (MI) through suppression of cyclooxygenase (COX)-2 mediated prostacyclin. Estrogen may have protective effects through augmenting COX-2 expression and subsequently increasing prostacyclin. Estrogen may attenuate the association between rofecoxib and MI. We used 1999–2002 Medicaid claims data to measure the MI-hazard ratio (HR) attributed to rofecoxib exposure in estrogen exposed and unexposed 45–65 year old women. We identified 184,169 female rofecoxib users who contributed 309,504 person-years and experienced 1217 first MIs. Estrogen exposure appeared protective (MI-HR 0.72, 95% CI 0.62–0.84) in this cohort. Rofecoxib was associated with an elevated MI-HR in both estrogen exposed (2.01, 95% CI 1.60–2.54) and estrogen unexposed women (1.69, 95% CI 1.43–1.99). The rofecoxib-estrogen interaction ratio was not significantly different from 1 (1.19, 95% CI 0.91–1.57). Although estrogen use was associated with a lower risk of MI, it did not appear to attenuate the association between rofecoxib and MI.
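The interaction ratio reported above is consistent with the ratio of the two stratum-specific hazard ratios, which can be verified directly (a worked check of the abstract's arithmetic, not the study's modeling code):

```python
# Stratum-specific hazard ratios for rofecoxib-associated MI, from the abstract
hr_estrogen_exposed = 2.01
hr_estrogen_unexposed = 1.69

# The multiplicative interaction ratio: how much the rofecoxib HR changes
# with estrogen exposure. A value near 1 indicates no effect modification.
interaction_ratio = hr_estrogen_exposed / hr_estrogen_unexposed
print(round(interaction_ratio, 2))  # → 1.19, matching the reported estimate
```

Since the 95% CI (0.91–1.57) spans 1, the data do not support estrogen attenuating the rofecoxib–MI association.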
myocardial infarction; hormone replacement therapy; pharmacoepidemiology; cyclooxygenase inhibitor
Worsening renal function (RF) and improved RF during the treatment of decompensated heart failure have traditionally been thought of as hemodynamically distinct events. We hypothesized that if pulmonary artery catheter derived measures are relevant in the evaluation of cardiorenal interactions, comparison of patients with improved vs. worsening RF should highlight any important hemodynamic differences. All subjects in the ESCAPE trial limited data set with admission and discharge creatinine values available were included (401 patients). There were no differences in baseline, final, or change in pulmonary artery catheter derived hemodynamic variables, inotrope and intravenous vasodilator use, or survival between patients with improved and worsening RF (p=NS for all). Both groups were equally likely to be in the bottom quartile of cardiac index (CI) (p=0.32), to have a 25% improvement in CI (p=0.97), or to have any worsening in CI (p=0.90). When patients with any significant change in renal function (positive or negative) were compared to patients with stable renal function, strong associations with variables such as reduced CI (OR=2.2, p=0.02), increased intravenous inotrope and vasodilator use (OR=2.9, p<0.001), and worsened all-cause mortality (HR=1.8, p=0.01) became apparent. Contrary to traditionally held views, patients with improved RF and worsening RF have similar hemodynamic parameters and outcomes. Combining these groups identifies a hemodynamically compromised population with significantly worse survival than patients with stable renal function. In conclusion, changes in renal function, regardless of direction, likely identify a population with an advanced disease state and poor prognosis.
Cardio-renal syndrome; worsening renal function; improved renal function; acute heart failure; kidney
The purpose of this study was to test the discriminant validity of International Society for Heart and Lung Transplantation (ISHLT) primary graft dysfunction (PGD) grades in terms of lung injury biomarker profiles and survival.
The study sample consisted of a multicenter prospective cohort for the biomarker analysis and a 450-patient cohort for the mortality analyses. PGD was defined according to ISHLT consensus at 24, 48, and 72 hours after transplantation. We compared the changes in plasma markers of acute lung injury between PGD grades using longitudinal data models. To test predictive validity, we compared differences in 30-day mortality and long-term survival according to PGD grade.
PGD grade 3 demonstrated greater differences in plasma ICAM-1, protein C, and PAI-1 levels than did PGD grades 0–2 at 24, 48, and 72 hours after lung transplantation (p<0.05 for each). Grade 3 had the highest 30-day (test for trend p<0.001) and overall mortality (log rank p<0.001), with PGD grades 1 and 2 demonstrating intermediate risks of mortality. The ability to discriminate both 30-day and overall mortality improved as the time of grading moved away from the time of transplantation (test for trend p<0.001).
The ISHLT grading system has good discriminant validity, based on plasma markers of lung injury and mortality. Grade 3 PGD was associated with the most severely altered plasma biomarker profile and the worst outcomes, regardless of the time point of grading. PGD grade at 48 and 72 hours discriminated mortality better than PGD grade at 24 hours.
Lung Transplantation; complications; Acute lung injury; Primary graft dysfunction; reperfusion injury
Overly aggressive diuresis leading to intravascular volume depletion has been proposed as a cause for worsening renal function (WRF) during the treatment of decompensated heart failure. If diuresis occurs at a rate greater than that at which extravascular fluid can refill the intravascular space, intravascular substances such as hemoglobin and plasma proteins increase in concentration. We hypothesized that hemoconcentration would be associated with WRF and possibly provide insight into the relationship between aggressive decongestion and outcomes.
Methods and Results
Subjects in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial limited data set with a baseline/discharge pair of hematocrit, albumin, or total protein values were included (336 patients). Baseline-to-discharge increases in these parameters were evaluated, and patients with ≥2 in the top tertile were considered to have evidence of hemoconcentration. The group experiencing hemoconcentration received higher doses of loop diuretics, lost more weight/fluid, and had greater reductions in filling pressures (p<0.05 for all). Hemoconcentration was strongly associated with WRF (OR=5.3, p<0.001), whereas change in right atrial pressure (p=0.36) and change in pulmonary capillary wedge pressure (p=0.53) were not. Patients with hemoconcentration had significantly lower 180-day mortality (HR=0.31, p=0.013). This relationship persisted after adjustment for baseline characteristics (HR=0.16, p=0.001).
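The "≥2 of 3 changes in the top tertile" rule can be sketched as follows. This is a minimal illustration under stated assumptions, not the study's analysis code; the field names (`hct`, `albumin`, `protein`) and the simple tertile cutoff are illustrative.

```python
def top_tertile_cutoff(values):
    """Approximate boundary of the top tertile: the value at or above
    which roughly the top third of observations fall."""
    ordered = sorted(values)
    return ordered[(2 * len(ordered)) // 3]

def flag_hemoconcentration(deltas):
    """deltas: one dict per patient with baseline-to-discharge changes
    in 'hct', 'albumin', and 'protein'. A patient is flagged as having
    hemoconcentration when >=2 of the three changes fall in the cohort's
    top tertile for that parameter."""
    params = ('hct', 'albumin', 'protein')
    cutoffs = {k: top_tertile_cutoff([d[k] for d in deltas]) for k in params}
    return [sum(d[k] >= cutoffs[k] for k in params) >= 2 for d in deltas]
```

Because the tertiles are defined within the cohort, the flag is a relative (rank-based) measure of fluid removal rather than an absolute laboratory threshold.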
Hemoconcentration is significantly associated with measures of aggressive fluid removal and deterioration in renal function. Despite this relationship, hemoconcentration is associated with substantially improved survival. These observations raise the question whether aggressive decongestion, even in the setting of WRF, can positively impact survival.
Diuretics; Heart failure; Kidney