Results 1-25 (53)
1.  Home screening for sexually transmitted diseases in high‐risk young women: randomised controlled trial 
Sexually Transmitted Infections  2007;83(4):286-291.
Objective
Home screening tests could eliminate several barriers to testing for sexually transmitted diseases (STDs).
Aim
To determine whether offering repeated home screening tests would increase the rate of testing for chlamydia and gonorrhoea in a high‐risk sample of young women.
Methods
In this randomised controlled trial, 403 young women (mean age 18.9 years, 70% black) with a recent STD or with STD‐related risk factors were enrolled. Participants were recruited from clinics and high‐prevalence neighbourhoods and then randomly assigned to receive either a home testing kit or an invitation to attend a medical clinic for testing at 6, 12 and 18 months after enrollment. Over 80% of women were followed for 2 years. The trial is registered with ClinicalTrials.gov, number NCT 00177437.
Results
Of 197 women in the intervention group, 140 (71%) returned at least one home test and 25 of 249 (10%) home tests were positive. Women who received home screening tests completed significantly more STD tests overall (1.94 vs 1.41 tests per woman‐year, p<0.001) and more STD tests in the absence of symptoms (1.18 vs 0.75 tests per woman‐year, p<0.001). More women in the intervention group completed at least one test when asymptomatic (162 (82.2%) vs 117 (61.3%), p<0.001). The intervention was most effective among women recruited outside medical clinics. There was no significant difference in the overall rate of STDs detected.
Conclusions
Home screening significantly increased the utilisation of chlamydia and gonorrhoea testing in this sample of high‐risk young women, and thus represents a feasible strategy to facilitate STD testing in young women.
doi:10.1136/sti.2006.023762
PMCID: PMC2598665  PMID: 17301105
2.  Smoking, death, and Alzheimer’s disease: A case of competing risks 
If smoking is a risk factor for Alzheimer’s disease (AD) but a smoker dies of another cause before developing or manifesting AD, smoking-related mortality may mask the relationship between smoking and AD. This phenomenon, referred to as competing risk, complicates efforts to model the effect of smoking on AD. Typical survival regression models assume that censorship from analysis is unrelated to an individual’s probability of developing AD (i.e., that censoring is noninformative). However, if individuals who die before developing AD are younger than those who survive long enough to develop AD, and if they include a higher percentage of smokers than nonsmokers, the incidence of AD will appear to be higher in older individuals and in nonsmokers. Further, age-specific mortality rates are higher in smokers because they die earlier than nonsmokers. Therefore, if we fail to take into account the competing risk of death when we estimate the effect of smoking on AD, we bias the results and are really only comparing the incidence of AD in nonsmokers with that in the healthiest smokers. In this study, we demonstrate that the effect of smoking on AD differs in models that are and are not adjusted for competing risks.
doi:10.1097/WAD.0b013e3182420b6e
PMCID: PMC3321062  PMID: 22185783
Alzheimer disease; competing risks; elderly; mortality; smoking
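A note on the competing-risks point in item 2: the distortion it describes can be reproduced with simulated data. The sketch below (Python with lifelines; hypothetical hazards, not the study's data or code) contrasts the naive 1 − Kaplan-Meier estimate, which treats death as noninformative censoring, with the Aalen-Johansen cumulative incidence, which accounts for death as a competing event.

    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, AalenJohansenFitter

    rng = np.random.default_rng(0)
    n = 5000
    smoker = rng.binomial(1, 0.4, n)
    # Hypothetical hazards: smoking strongly raises mortality, AD hazard only modestly.
    t_ad = rng.exponential(scale=20 / 1.2 ** smoker)
    t_death = rng.exponential(scale=15 / 2.0 ** smoker)
    time = np.minimum(t_ad, t_death)
    event = np.where(t_ad <= t_death, 1, 2)        # 1 = AD, 2 = death without AD
    df = pd.DataFrame({"time": time, "event": event, "smoker": smoker})

    for grp, d in df.groupby("smoker"):
        # Naive analysis: deaths treated as ordinary (noninformative) censoring.
        km = KaplanMeierFitter().fit(d["time"], d["event"] == 1)
        sf = km.survival_function_
        naive10 = 1 - sf[sf.index <= 10].iloc[-1, 0]
        # Competing-risks analysis: Aalen-Johansen cumulative incidence of AD.
        aj = AalenJohansenFitter().fit(d["time"], d["event"], event_of_interest=1)
        cd = aj.cumulative_density_
        cif10 = cd[cd.index <= 10].iloc[-1, 0]
        print(f"smoker={grp}: naive estimate at 10 y {naive10:.2f} "
              f"vs competing-risks estimate {cif10:.2f}")

The two estimates diverge more in the group with the higher competing mortality, which is the crux of the problem the abstract describes: ignoring the competing risk of death misstates how AD incidence compares between smokers and nonsmokers.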
3.  Sustained Benzodiazepine Use in a Community Sample of Older Adults 
OBJECTIVES
To identify factors associated with sustained benzodiazepine use in older adults.
DESIGN
12-year cohort study.
SETTING
Community-based epidemiologic survey.
PARTICIPANTS
1,342 individuals aged 65+ years.
MEASUREMENTS
Demographics, medication use, depressive symptoms, sleep complaints, alcohol use, and smoking, assessed at two-year intervals; descriptive analysis to characterize benzodiazepine users and identify factors associated with sustained benzodiazepine use (use at two consecutive waves); longitudinal lag-time analysis to determine characteristics that predicted sustained use.
RESULTS
Initially, 5.5% of men and 9.8% of women were using benzodiazepines. Users were significantly more likely than non-users to be women, to be less educated, to report more depressive and anxiety symptoms, to use more prescription medications, to have lower self-rated health, and to have difficulty maintaining sleep, and they were less likely to consume alcohol. Approximately 50%, 44%, and 25% of users aged 65–74, 75–84, and 85+ years, respectively, were sustained users at follow-up. Being female, using two or more non-benzodiazepine prescription medications, and smoking were independently associated with subsequent sustained benzodiazepine use.
CONCLUSION
At the population level, women, smokers, and users of at least two prescription drugs have elevated probabilities of sustaining benzodiazepine use once started. This information can facilitate risk assessment and counseling of older adults before prescribing benzodiazepines.
doi:10.1111/j.1532-5415.2008.02011.x
PMCID: PMC2739598  PMID: 19093928
aging; anxiolytics; MoVIES
4.  Minimizing attrition bias: a longitudinal study of depressive symptoms in an elderly cohort 
Background
Attrition from mortality is common in longitudinal studies of the elderly. Ignoring the resulting non-response or missing data can bias study results.
Methods
1,260 elderly participants underwent biennial follow-up assessments over 10 years. Many missed one or more assessments over this period. We compared three statistical models to evaluate the impact of missing data on an analysis of depressive symptoms over time. The first analytic model (a generalized mixed model) treated non-response as data missing at random. The other two models used shared parameter methods; each specified the dropout process differently, but both jointly modeled outcome and dropout through a common random effect.
Results
The presence of depressive symptoms was associated with being female, having less education, functional impairment, using more prescription drugs, and taking antidepressant drugs. In all three models, the same variables were significantly associated with depression and in the same direction. However, the strength of the associations differed widely between the generalized mixed model and the shared parameter models. Although the two shared parameter models had different assumptions about the dropout process, they yielded similar estimates for the outcome. One model fitted the data better, and the other was computationally faster.
Conclusions
Dropout does not occur randomly in longitudinal studies of the elderly. Thus, simply ignoring it can yield biased results. Shared parameter models are a powerful, flexible, and easily implemented tool for analyzing longitudinal data while minimizing bias due to nonrandom attrition.
doi:10.1017/S104161020900876X
PMCID: PMC2733930  PMID: 19288971
discrete failure time model; dropout; non-ignorable nonresponse; shared parameter model; Weibull model
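A schematic for the shared parameter approach in item 4 (my notation, not the authors' exact specification): the repeated depression outcome and the dropout process are tied together through a common random effect,

    \operatorname{logit}\Pr(Y_{ij}=1 \mid b_i) = x_{ij}^{\top}\beta + b_i            (outcome submodel)
    h_i(t \mid b_i) = h_0(t)\exp(z_i^{\top}\gamma + \lambda b_i)                     (dropout submodel)
    b_i \sim N(0,\sigma_b^2).

Here λ captures the dependence between the depression trajectory and dropout; λ = 0 recovers the missing-at-random generalized mixed model. The keyword list (Weibull model, discrete failure time model) suggests the two dropout specifications compared for the hazard h_0.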
5.  Racial Variation in End-of-Life Intensive Care Use: A Race or Hospital Effect? 
Health Services Research  2006;41(6):2219-2237.
Objective
To determine if racial and ethnic variations exist in intensive care (ICU) use during terminal hospitalizations, and, if variations do exist, to determine whether they can be explained by systematic differences in hospital utilization by race/ethnicity.
Data Source
1999 hospital discharge data from all nonfederal hospitals in Florida, Massachusetts, New Jersey, New York, and Virginia.
Design
We identified all terminal admissions (N = 192,705) among adults. We calculated crude rates of ICU use among non-Hispanic whites, blacks, Hispanics, and those with “other” race/ethnicity. We performed multivariable logistic regression on ICU use, with and without adjustment for clustering of patients within hospitals, to calculate adjusted differences in ICU use by race/ethnicity. We explored both a random-effects (RE) and a fixed-effects (FE) specification to adjust for hospital-level clustering.
Data Collection
The data were collected by each state.
Principal Findings
ICU use during the terminal hospitalization was highest among nonwhites, varying from 64.4 percent among Hispanics to 57.5 percent among whites. Compared to white women, the risk-adjusted odds of ICU use were higher for white men and for nonwhites of both sexes (odds ratios [ORs] and 95 percent confidence intervals: white men = 1.16 (1.14–1.19), black men = 1.35 (1.17–1.56), Hispanic men = 1.52 (1.27–1.82), black women = 1.31 (1.25–1.37), Hispanic women = 1.53 (1.43–1.63)). Additional adjustment for within-hospital clustering of patients using the RE model did not change the estimate for white men, but markedly attenuated the observed differences for blacks (OR for men = 1.12 (0.96–1.31), women = 1.10 (1.03–1.17)) and Hispanics (OR for men = 1.19 (1.00–1.42), women = 1.18 (1.09–1.27)). Results from the FE model were similar to those from the RE model (OR for black men = 1.10 (0.95–1.28), black women = 1.07 (1.02–1.13), Hispanic men = 1.17 (0.96–1.42), and Hispanic women = 1.14 (1.06–1.24)).
Conclusions
The majority of observed differences in terminal ICU use among blacks and Hispanics were attributable to their use of hospitals with higher ICU use rather than to racial differences in ICU use within the same hospital.
doi:10.1111/j.1475-6773.2006.00598.x
PMCID: PMC1955321  PMID: 17116117
Intensive care units; terminal care; life support care; ethnic groups; hospitals
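For the clustering adjustment in item 5, a minimal sketch (simulated data with illustrative variable names, not the study's model) of how fixed-effects and random-effects logistic specifications can attenuate a crude race difference that is driven by which hospitals patients use:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    rng = np.random.default_rng(1)
    n_hosp, n_per = 40, 200
    hospital = np.repeat(np.arange(n_hosp), n_per)
    hosp_eff = rng.normal(0, 0.8, n_hosp)[hospital]      # hospital-level ICU propensity
    # Black patients concentrated in high-ICU hospitals; no within-hospital race effect.
    black = rng.binomial(1, 1 / (1 + np.exp(-(-1.4 + hosp_eff))))
    icu = rng.binomial(1, 1 / (1 + np.exp(-(-0.3 + hosp_eff))))
    df = pd.DataFrame({"icu": icu, "black": black, "hospital": hospital})

    # Crude model ignoring clustering: race appears associated with ICU use.
    print(smf.logit("icu ~ black", data=df).fit(disp=0).params["black"])

    # Fixed-effects adjustment: one indicator per hospital absorbs between-hospital differences.
    print(smf.logit("icu ~ black + C(hospital)", data=df).fit(disp=0).params["black"])

    # Random-effects adjustment: hospital random intercepts, fit by variational Bayes.
    re = BinomialBayesMixedGLM.from_formula(
        "icu ~ black", {"hospital": "0 + C(hospital)"}, df).fit_vb()
    print(re.summary())

In this setup the crude coefficient is clearly positive while both cluster-adjusted estimates are near zero, mirroring the attenuation reported in the abstract.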
6.  Prehypertension, Hypertension, and the Risk of Acute Myocardial Infarction in HIV-Infected and -Uninfected Veterans 
We found increased acute myocardial infarction risk among hypertensive and prehypertensive HIV-infected veterans compared to normotensive uninfected veterans, independent of confounding comorbidities.
Background. Compared to uninfected people, human immunodeficiency virus (HIV)–infected individuals may have an increased risk of acute myocardial infarction (AMI). Currently, HIV-infected people are treated to the same blood pressure (BP) goals (<140/90 or <130/80 mm Hg) as their uninfected counterparts. Whether HIV-infected people with elevated BP have excess AMI risk compared to uninfected people is not known. This study examines whether the association between elevated BP and AMI risk differs by HIV status.
Methods. The Veterans Aging Cohort Study Virtual Cohort (VACS VC) consists of HIV-infected and -uninfected veterans matched 1:2 on age, sex, race/ethnicity, and clinical site. For this analysis, we analyzed 81 026 people with available BP data from VACS VC, who were free of cardiovascular disease at baseline. BP was the average of the 3 routine outpatient clinical measurements performed closest to baseline (first clinical visit after April 2003). BP categories used in the analyses were based on criteria of the Seventh Report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure. Analyses were performed using Cox proportional hazards regression.
Results. Over 5.9 years (median), 860 incident AMIs occurred. Low/high prehypertensive and untreated/treated hypertensive HIV-infected individuals had increased AMI risk compared to uninfected, untreated normotensive individuals (hazard ratio [HR], 1.60 [95% confidence interval {CI}, 1.07–2.39]; HR, 1.81 [95% CI, 1.22–2.68]; HR, 2.57 [95% CI, 1.76–3.76]; and HR, 2.76 [95% CI, 1.90–4.02], respectively).
Conclusions. HIV, prehypertensive BP, and hypertensive BP were associated with an increased risk of AMI in a cohort of HIV-infected and -uninfected veterans. Future studies should prospectively investigate whether HIV interacts with BP to further increase AMI risk.
doi:10.1093/cid/cit652
PMCID: PMC3864500  PMID: 24065316
blood pressure; prehypertension; HIV; myocardial infarction
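A small illustration for item 6 of the kind of exposure definition described (the JNC7 thresholds are standard; the split of prehypertension into low and high, and all names below, are assumptions made for the sketch):

    import pandas as pd

    def bp_category(sbp, dbp, on_treatment):
        """Classify an averaged blood pressure roughly along JNC7 lines."""
        if on_treatment:
            return "treated hypertension"
        if sbp >= 140 or dbp >= 90:
            return "untreated hypertension"
        if sbp >= 130 or dbp >= 85:          # assumed low/high prehypertension split
            return "high prehypertension"
        if sbp >= 120 or dbp >= 80:
            return "low prehypertension"
        return "normotensive"

    # Average the three routine outpatient readings closest to baseline, then classify.
    readings = pd.DataFrame({"sbp": [118, 126, 131], "dbp": [76, 82, 84]})
    avg = readings.mean()
    print(bp_category(avg["sbp"], avg["dbp"], on_treatment=False))

The resulting category, together with HIV status, would then enter a Cox proportional hazards model for time to AMI of the kind described in the methods.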
7.  Vascular Risk Factors and Cognitive Decline in a Population Sample 
We examined several vascular factors in relation to rates of decline in five cognitive domains in a population-based cohort. In an age-stratified random sample (N=1982) aged 65+ years, we assessed at baseline the cognitive domains of attention, executive function, memory, language, and visuospatial function, and also vascular, inflammatory, and metabolic indices. Random effects models generated slopes of cognitive decline over the next four years; linear models identified vascular factors associated with these slopes, adjusting for demographics, baseline cognition, and potential interactions. Several vascular risk factors (history of stroke, diabetes, central obesity, C-reactive protein), although associated with lower baseline cognitive performance, did not predict rate of subsequent decline. APOE*4 genotype was associated with accelerated decline in language, memory, and executive functions. Homocysteine elevation was associated with faster decline in executive function. Hypertension (history or systolic blood pressure >140 mm Hg) was associated with slower decline in memory. Baseline alcohol consumption was associated with slower decline in attention, language, and memory. Different indices of vascular risk are associated with low performance and with rates of decline in different cognitive domains. Cardiovascular mechanisms explain at least some of the variance in cognitive decline. Selective survival may also play a role.
doi:10.1097/WAD.0000000000000004
PMCID: PMC3945071  PMID: 24126216
8.  Survival after Acute Hemodialysis in Pennsylvania, 2005–2007: A Retrospective Cohort Study 
PLoS ONE  2014;9(8):e105083.
Background
Little is known about acute hemodialysis in the US. Here we describe predictors of receipt of acute hemodialysis in one state and estimate the marginal impact of acute hemodialysis on survival after accounting for confounding due to illness severity.
Materials and Methods
This is a retrospective cohort study of acute-care hospitalizations in Pennsylvania from October 2005 to December 2007 using data from the Pennsylvania Health Care Cost Containment Council. The exposure variable is acute hemodialysis; the dependent variable is survival following acute hemodialysis. We used multivariable logistic regression to determine propensity to receive acute hemodialysis and then, for a Cox proportional hazards model, matched acute hemodialysis and non-acute hemodialysis patients 1:5 on this propensity.
Results
In 2,131,248 admissions of adults without end-stage renal disease, there were 6,657 instances of acute hemodialysis. In analyses adjusted for predicted probability of death upon admission plus other covariates and stratified on age, being male, black, and insured were independent predictors of receipt of acute hemodialysis. One-year post-admission mortality was 43% for those receiving acute hemodialysis, compared to 13% among those not receiving acute hemodialysis. After matching on propensity to receive acute hemodialysis and adjusting for predicted probability of death upon admission, patients who received acute hemodialysis had a higher risk of death over at least 1 year of follow-up than patients who did not (hazard ratio 1.82, 95% confidence interval 1.68–1.97).
Conclusions
In a populous US state, receipt of acute hemodialysis varied by age, sex, race, and insurance status even after adjustment for illness severity. In a comparison of patients with similar propensity to receive acute hemodialysis, those who did receive it were less likely to survive than those who did not. These findings raise questions about reasons for lack of benefit.
doi:10.1371/journal.pone.0105083
PMCID: PMC4139312  PMID: 25141028
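A minimal sketch of the general workflow in item 8 (simulated data; not the study's code): model the propensity to receive acute hemodialysis, match each treated admission to 5 untreated admissions on that propensity, then fit a Cox model on the matched sample.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 20000
    severity = rng.normal(size=n)                       # stand-in for illness severity
    hd = rng.binomial(1, 1 / (1 + np.exp(-(-4 + 1.5 * severity))))
    time = rng.exponential(scale=365 / np.exp(0.8 * severity + 0.3 * hd))
    event = (time < 365).astype(int)
    time = np.minimum(time, 365)
    df = pd.DataFrame({"hd": hd, "severity": severity, "time": time, "event": event})

    # Step 1: propensity score from a logistic model of treatment on confounders.
    df["ps"] = LogisticRegression().fit(df[["severity"]], df["hd"]) \
                                   .predict_proba(df[["severity"]])[:, 1]

    # Step 2: 1:5 nearest-neighbour matching on the propensity score (with replacement,
    # for simplicity; the study's exact matching algorithm is not described here).
    treated, control = df[df.hd == 1], df[df.hd == 0]
    nn = NearestNeighbors(n_neighbors=5).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # Step 3: Cox proportional hazards model on the matched sample.
    cph = CoxPHFitter().fit(matched[["time", "event", "hd", "severity"]],
                            duration_col="time", event_col="event")
    cph.print_summary()

The hazard ratio on hd in the matched, adjusted model is the analogue of the 1.82 estimate reported above.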
9.  Mild cognitive impairment 
Neurology  2013;80(23):2112-2120.
Objective:
We examined the incidence of mild cognitive impairment (MCI) and its potential vascular risk factors in a prospective population-based study.
Methods:
An age-stratified random population-based cohort (baseline n = 1,982), followed for up to 4 years, was annually assessed for cognitive and everyday functioning. Incidence rates were calculated for both cognitive (neuropsychological [NP]-MCI) and functional (Clinical Dementia Rating [CDR] = 0.5) definitions of MCI. Several measures of vascular, metabolic, and inflammatory risk were assessed at baseline. Risk factor analyses used interval censoring survival models, followed by joint modeling of both MCI and attrition due to mortality and illness.
Results:
Incidence rates for NP-MCI and CDR = 0.5 were 95 and 55 per 1,000 person-years. In individual joint models, risk factors for NP-MCI were diabetes and adiposity (waist: hip ratio), while APOE ε4 genotype and heart failure increased risk of attrition. Adiposity, stroke, heart failure, and diabetes were risk factors for nonamnestic MCI. For CDR = 0.5, risk factors were stroke and heart failure; heart failure and adiposity increased risk of attrition. In multivariable joint models combining all risk factors, adiposity increased risk of NP-MCI, while stroke and heart failure increased risk for CDR = 0.5. Current alcohol use appeared protective against all subtypes.
Conclusion:
Incidence of MCI increased with age regardless of definition and did not vary by sex or education. Several vascular risk factors elevated the risk of incident MCI, whether defined cognitively or functionally, but most were associated with nonamnestic MCI and CDR = 0.5. Controlling vascular risk may potentially reduce risk of MCI.
doi:10.1212/WNL.0b013e318295d776
PMCID: PMC3716350  PMID: 23658380
10.  Clinical and demographic covariates of chronic opioid and non-opioid analgesic use in rural-dwelling older adults: the MoVIES project 
International Psychogeriatrics / IPA  2013;25(11):1801-1810.
Background
To describe covariates and patterns of late-life analgesic use in the rural, population-based MoVIES cohort from 1989 to 2002.
Methods
Secondary analysis of an epidemiologic survey of elderly people conducted over six biennial assessment waves. Potential covariates of analgesic use included age, gender, depression, sleep, arthritis, smoking, alcohol, and general health status. Of the original cohort of 1,681, this sample comprised 1,109 individuals with complete data on all assessments. Using trajectory analysis, participants were characterized as chronic or non-chronic users of opioid and non-opioid analgesics. Multivariable regression was used to model predictors of chronic analgesic use.
Results
The cohort was followed for a mean (SD) of 7.3 (2.7) years. Chronic use of opioid analgesics was reported by 7.2%, while non-opioid use was reported by 46.1%. In the multivariable model, predictors of chronic use of both opioid and non-opioid analgesics included female sex, taking ≥2 prescription medications, and “arthritis” diagnoses. Chronic opioid use was also associated with age 75–84 years; chronic non-opioid use was also associated with sleep continuity disturbance.
Conclusions
These epidemiological data confirm clinical observations and generate hypotheses for further testing. Future studies should investigate whether addressing sleep problems might lead to decreased use of non-opioid analgesics and possibly enhanced pain management.
doi:10.1017/S104161021300121X
PMCID: PMC4020176  PMID: 23883528
aging; epidemiology; medical comorbidity; pain; rural; sleep
11.  Engagement in Social Activities and Progression from Mild to Severe Cognitive Impairment: The MYHAT Study 
Background
It is of considerable public health importance to prevent or delay the progression of mild cognitive impairment (MCI) to more severely impaired cognitive states. This study examines the risk of progression from mild to severe cognitive impairment in relation to engagement in social activities while mildly impaired and the concurrence of subsequent change in engagement with MCI progression.
Methods
Participants were 816 older adults with cognitively defined MCI (mean age 78.0 [SD = 7.4] years) from the Monongahela-Youghiogheny Healthy Aging Team (MYHAT) Study, a prospective cohort study of MCI in the community. Over three years of follow-up, 78 individuals progressed from MCI to severe cognitive impairment while 738 did not progress. Risk of progression was estimated using discrete time survival analyses. The main predictors were standardized composite measures of the variety and frequency of engagement in social activities.
Results
Lower risk of progression from mild to severe cognitive impairment was associated both with a higher frequency of engagement in social activities while mildly impaired (OR = 0.72, 95% CI: 0.55–0.93, p = 0.01) and with a slower rate of decline in the variety of activities over time (OR = 0.01, 95% CI: <0.001–0.38, p = 0.02).
Conclusions
Greater engagement in social activities may potentially be beneficial for preventing or delaying further cognitive decline among older adults with MCI. Alternatively, lesser engagement in social activities may be a marker of impending cognitive decline in MCI.
doi:10.1017/S1041610212002086
PMCID: PMC3578022  PMID: 23257280
MCI; leisure activities; social engagement; cognitive decline
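For the discrete time survival analysis in item 11, a bare-bones sketch (simulated person-period data; variable names are illustrative): each participant contributes one record per annual interval until progression or censoring, and a logistic model of the interval-specific event indicator yields discrete-time hazard odds ratios.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 500
    engagement = rng.normal(size=n)          # standardized engagement composite
    rows = []
    for i in range(n):
        for t in (1, 2, 3):                  # three annual follow-up intervals
            # Hypothetical discrete-time hazard of progression, lower with engagement.
            h = 1 / (1 + np.exp(-(-2.0 + 0.2 * t - 0.5 * engagement[i])))
            event = rng.binomial(1, h)
            rows.append({"id": i, "period": t,
                         "engagement": engagement[i], "event": event})
            if event:                        # no person-periods after progression
                break
    pp = pd.DataFrame(rows)

    # Baseline hazard varies freely by period via C(period); exp(coef) are odds ratios.
    fit = smf.logit("event ~ C(period) + engagement", data=pp).fit(disp=0)
    print(np.exp(fit.params))

An odds ratio below 1 on the engagement term corresponds to the protective association reported above.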
12.  Myocardial Damage Detected by Late Gadolinium Enhancement Cardiovascular Magnetic Resonance Is Associated With Subsequent Hospitalization for Heart Failure 
Background
Hospitalization for heart failure (HHF) is among the most important problems confronting medicine. Late gadolinium enhancement (LGE) cardiovascular magnetic resonance (CMR) robustly identifies intrinsic myocardial damage. LGE may indicate inherent vulnerability to HHF, regardless of etiology, across the spectrum of heart failure stage or left ventricular ejection fraction (LVEF).
Methods and Results
We enrolled 1068 consecutive patients referred for CMR, of whom 448 (42%) exhibited LGE. After a median of 1.4 years (Q1 to Q3: 0.9 to 2.0 years), 57 HHF events occurred, 15 deaths followed HHF, and 43 deaths occurred without antecedent HHF (58 total deaths). Using multivariable Cox regression adjusting for LVEF, heart failure stage, and other covariates, LGE was associated with first HHF after CMR (HR: 2.70, 95% CI: 1.32 to 5.50), death (HR: 2.13, 95% CI: 1.08 to 4.21), and either death or HHF (HR: 2.52, 95% CI: 1.49 to 4.25). Quantifying LGE extent yielded similar results: more extensive LGE was associated with higher risk. LGE improved model discrimination (IDI: 0.016, 95% CI: 0.005 to 0.028, P=0.002) and reclassification of individuals at risk (continuous NRI: 0.40, 95% CI: 0.05 to 0.70, P=0.024). Adjustment for the competing risk of death, which shares common risk factors with HHF, strengthened the association between LGE and HHF (HR: 4.85, 95% CI: 1.40 to 16.9).
Conclusions
The presence and extent of LGE are associated with vulnerability to HHF, including higher risks of HHF across the spectrum of heart failure stage and LVEF. Even when LVEF is severely decreased, those without LGE appear to fare reasonably well. LGE may enhance risk stratification for HHF and may enhance both clinical and research efforts to reduce HHF through targeted treatment.
doi:10.1161/JAHA.113.000416
PMCID: PMC3886781  PMID: 24249712
late gadolinium enhancement; magnetic resonance imaging; myocardial delayed enhancement; myocardial fibrosis; myocardial infarction
13.  Subjective cognitive complaints of older adults at the population level: An item response theory analysis 
Subjective cognitive complaints (SCCs) are increasingly a focus in studies of prodromal Alzheimer disease (AD) and risk for dementia. Little is known about the optimal approach to measure SCCs. We used item response theory (IRT) to examine characteristics of 24 SCC items in a sample of 3,495 older adults pooled from four community-based studies. We investigated the potential advantages of IRT scoring over conventional scoring, based on participants' item response patterns. Items most likely to be endorsed by individuals low in SCC severity relate to word retrieval and general subjective memory decline. Items likely to be endorsed only by individuals high in SCC severity relate to non-episodic memory changes, such as decline in comprehension, judgment and executive functions, praxis and procedural memory, and social behavior changes. IRT scoring of SCCs was associated with objective cognitive test performance above and beyond total SCC scores, and was associated with objective cognitive test performance among participants endorsing only one SCC item. Thus, IRT scoring captures additional information beyond a simple sum of SCC symptoms. Modern psychometric approaches including IRT may be useful in developing 1) brief community screening questionnaires, and 2) more sensitive measures of very subtle subjective decline for use in prodromal AD research.
doi:10.1097/WAD.0b013e3182420bdf
PMCID: PMC3337955  PMID: 22193355
Subjective memory; item response theory; dementia; neuropsychological tests; subjective cognitive impairment
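A note on item 13: the abstract does not state which IRT model was fitted; the two-parameter logistic (2PL) model is a common choice and illustrates how the item properties described there enter the score,

    P(X_{ij}=1 \mid \theta_i) = \frac{1}{1+\exp\{-a_j(\theta_i - b_j)\}},

where θ_i is person i's latent SCC severity, b_j is the location of item j (items endorsed only by people high in severity, such as decline in judgment, have large b_j), and a_j is its discrimination. The IRT score is the estimate of θ_i given the whole response pattern, which is why it carries information beyond a simple count of endorsed items.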
14.  Mild cognitive deficits and everyday functioning among older adults in the community: The Monongahela-Youghiogheny Healthy Aging Team (MYHAT) Study 
Objective
A key component of successful aging in old age is the ability to independently perform instrumental activities of daily living (IADLs). We examined the ability to perform multiple IADL tasks in relation to mild cognitive impairment (MCI) defined on purely neuropsychological grounds.
Design
Cross-sectional study.
Setting
Population-based cohort in Southwestern Pennsylvania.
Participants
1,737 community-dwelling adults aged 65 years and older.
Measurements
Classification of MCI based on performance with reference to norms in the cognitive domains of memory, language, attention, executive and visuospatial function. The ability to perform seven IADL tasks (travel, shopping, meal preparation, housework, taking medications, handling personal finances, and telephone use) as assessed by the Older Americans Resources and Services (OARS) scale.
Results
Those with cognitively defined MCI were more likely to be dependent in at least one IADL task, and in each individual IADL task, than cognitively normal participants. Better memory and executive functioning were associated with lower odds of IADL dependence in MCI. Across the subtypes of MCI, those with the multiple-domain amnestic subtype were the most likely to be dependent in all IADL tasks, with better executive functioning associated with lower risk of dependence in select IADL tasks in this group.
Conclusions
Mild impairment in cognition is associated with difficulty performing IADL tasks at the population level. Understanding these associations may help improve prediction of the outcomes of MCI. It may also allow appropriate targeting of cognitive interventions in MCI to potentially help preserve functional independence.
doi:10.1097/JGP.0b013e3182423961
PMCID: PMC3445790  PMID: 22337146
cognition; mild cognitive impairment; everyday functioning; instrumental activities of daily living; epidemiology; community; population
15.  Is Survival Better at Hospitals With Higher “End-of-Life” Treatment Intensity? 
Medical care  2010;48(2):125-132.
Background
Concern regarding wide variations in spending and intensive care unit use for patients at the end of life hinges on the assumption that such treatment offers little or no survival benefit.
Objective
To explore the relationship between hospital “end-of-life” (EOL) treatment intensity and postadmission survival.
Research Design
Retrospective cohort analysis of Pennsylvania Health Care Cost Containment Council discharge data (April 2001 to March 2005) linked to vital statistics data through September 2005, using hospital-level correlation, admission-level marginal structural logistic regression, and pooled logistic regression to approximate a Cox survival model.
Subjects
A total of 1,021,909 patients ≥65 years old, incurring 2,216,815 admissions in 169 Pennsylvania acute care hospitals.
Measures
EOL treatment intensity (a summed index of standardized intensive care unit and life-sustaining treatment use among patients with a high predicted probability of dying [PPD] at admission) and 30- and 180-day postadmission mortality.
Results
There was a nonlinear negative relationship between hospital EOL treatment intensity and 30-day mortality among all admissions, although patients with higher PPD derived the greatest benefit. Compared with admission at an average-intensity hospital, admission to a hospital 1 standard deviation below versus 1 standard deviation above average intensity resulted in an adjusted odds ratio of mortality for admissions at low PPD of 1.06 (1.04–1.08) versus 0.97 (0.96–0.99); average PPD: 1.06 (1.04–1.09) versus 0.97 (0.96–0.99); and high PPD: 1.09 (1.07–1.11) versus 0.97 (0.95–0.99). By 180 days, the benefits of intensity attenuated (low PPD: 1.03 [1.01–1.04] vs. 1.00 [0.98–1.01]; average PPD: 1.03 [1.02–1.05] vs. 1.00 [0.98–1.01]; and high PPD: 1.06 [1.04–1.09] vs. 1.00 [0.98–1.02]).
Conclusions
Admission to higher EOL treatment intensity hospitals is associated with small gains in postadmission survival. The marginal returns to intensity diminish for admission to hospitals above average EOL treatment intensity and wane with time.
doi:10.1097/MLR.0b013e3181c161e4
PMCID: PMC3769939  PMID: 20057328
intensive care; terminal care; mortality; hospitals; efficiency; quality
16.  Population Attributable Risk of Aflatoxin-Related Liver Cancer: Systematic Review and Meta-Analysis 
Background
Over 4 billion people worldwide are exposed to dietary aflatoxins, which cause liver cancer (hepatocellular carcinoma, HCC) in humans. However, the population attributable risk (PAR) of aflatoxin-related HCC remains unclear.
Methods
In our systematic review and meta-analysis of epidemiological studies, summary odds ratios (ORs) of aflatoxin-related HCC with 95% confidence intervals were calculated in HBV+ and HBV− individuals, as well as the general population. We calculated the PAR of aflatoxin-related HCC for each study as well as the combined studies, accounting for HBV status.
Results
17 studies with 1680 HCC cases and 3052 controls were identified from 479 articles. All eligible studies were conducted in China, Taiwan, or sub-Saharan Africa. The PAR of aflatoxin-related HCC was estimated at 17% (14–19%) overall, and was higher in HBV+ (21%) than HBV− (8.8%) populations. If the one study that contributed most to heterogeneity in the analysis is excluded, the summary OR of HCC with 95% CI is 73.0 (36.0–148.3) for the combined effects of aflatoxin and HBV, 11.3 (6.75–18.9) for HBV only, and 6.37 (3.74–10.86) for aflatoxin only. The PAR of aflatoxin-related HCC then increases to 23% (21–24%). The PAR has decreased over time in certain Taiwanese and Chinese populations.
Conclusions
In high exposure areas, aflatoxin multiplicatively interacts with HBV to induce HCC; reducing aflatoxin exposure to non-detectable levels could reduce HCC cases in high-risk areas by about 23%. The decreasing PAR of aflatoxin-related HCC reflects the benefits of public health interventions to reduce aflatoxin and HBV.
doi:10.1016/j.ejca.2012.02.009
PMCID: PMC3374897  PMID: 22405700
Aflatoxin; hepatocellular carcinoma; hepatitis B virus; population attributable risk; systematic review; meta-analysis
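For the attributable-risk calculation in item 16, a standard estimator from case-control data is Levin's formula with the odds ratio standing in for the relative risk (the review's exact estimator may differ):

    \mathrm{PAR} = \frac{p_e(\mathrm{OR}-1)}{1 + p_e(\mathrm{OR}-1)},

where p_e is the prevalence of aflatoxin exposure in the source population. As a purely illustrative calculation, p_e = 0.25 and OR = 2 give PAR = 0.25/1.25 = 0.20, i.e. 20% of cases attributable to exposure.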
17.  Disability, Race/ethnicity, and Medication Adherence Among Medicare Myocardial Infarction Survivors 
American heart journal  2012;164(3):425-433.e4.
Background
Long-term medication therapy for post-myocardial infarction (MI) patients can prolong life. However, recent data on long-term adherence are limited, particularly among some subpopulations. We compared medication adherence among Medicare MI survivors by disability status, race/ethnicity, and income.
Methods
We examined 100% of Medicare fee-for-service beneficiaries discharged post-MI in 2008. The outcomes were adherence to β-blockers, statins, and angiotensin-converting enzyme inhibitors (ACEIs)/angiotensin II receptor blockers (ARBs), for 1-year and 6-month post-discharge. Adherence was defined as having prescriptions in possession for ≥75% of days.
Results
Among aged beneficiaries who survived 1 year, 1-year adherence to β-blockers was 68%, 66%, 61%, 58%, and 57% for Whites, Asians, Hispanics, Native Americans, and Blacks, respectively; among the disabled, 1-year adherence was worse for each group: 59%, 54%, 52%, 47%, and 43%, respectively. The racial/ethnic difference persisted after adjustment for age, gender, income, drug coverage, location, and health status. Patterns of adherence to statins and ACEIs/ARBs were similar. Among beneficiaries with close-to-full drug coverage, minorities were still less likely to adhere relative to Whites: OR=0.70 (95% CI 0.65-0.75) for Blacks and OR=0.70 (95% CI 0.55-0.90) for Native Americans.
Conclusions
Although β-blocker prescribing at discharge has improved since the National Committee for Quality Assurance implemented quality measures, long-term adherence remains problematic, especially among disabled and minority beneficiaries. Quality measures for long-term adherence should be created to improve outcomes in post-MI patients. Even among those with close-to-full drug coverage, racial differences remain, suggesting that policies relying simply on cost reduction cannot eliminate racial differences.
doi:10.1016/j.ahj.2012.05.021
PMCID: PMC3445297  PMID: 22980311
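A sketch of the adherence definition used in item 17 (proportion of days with the drug in possession over the window, adherent if at least 75%); the field names and fill records below are made up for illustration:

    import pandas as pd

    fills = pd.DataFrame({
        "fill_date": pd.to_datetime(["2008-01-10", "2008-02-12", "2008-04-20"]),
        "days_supply": [30, 30, 90],
    })
    # A 6-month post-discharge window, matching the abstract's 6-month outcome.
    start, end = pd.Timestamp("2008-01-10"), pd.Timestamp("2008-07-10")

    covered = pd.Series(False, index=pd.date_range(start, end))
    for _, f in fills.iterrows():
        supply_days = pd.date_range(f["fill_date"], periods=f["days_supply"])
        covered.loc[covered.index.intersection(supply_days)] = True

    pdc = covered.mean()                       # proportion of days covered
    print(f"PDC = {pdc:.2f}; adherent = {pdc >= 0.75}")

The same calculation over a 1-year window gives the 1-year adherence outcome.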
18.  Comorbid diabetes and the risk of progressive chronic kidney disease in HIV-infected adults: Data from the Veterans Aging Cohort Study 
Introduction
Approximately 15% of HIV-infected individuals have comorbid diabetes. Studies suggest that HIV and diabetes have an additive effect on chronic kidney disease (CKD) progression; however, this observation may be confounded by differences in traditional CKD risk factors.
Methods
We studied a national cohort of HIV-infected and matched HIV-uninfected individuals who received care through the Veterans Healthcare Administration. Subjects were divided into four groups based on baseline HIV and diabetes status, and the rate of progression to an estimated glomerular filtration rate (eGFR) < 45ml/min/1.73m2 was compared using Cox-proportional hazards modeling to adjust for CKD risk factors.
Results
31,072 veterans with baseline eGFR ≥ 45ml/min/1.73m2 (10,626 with HIV only, 5,088 with diabetes only, and 1,796 with both) were followed for a median of 5 years. Mean baseline eGFR was 94ml/min/1.73m2, and 7% progressed to an eGFR < 45ml/min/1.73m2. Compared to those without HIV or diabetes, the relative rate of progression was increased in individuals with diabetes only [adjusted hazard ratio (HR) 2.48; 95% confidence interval (CI) 2.19–2.80], HIV only [HR 2.80, 95% CI 2.50–3.15], and both HIV and diabetes [HR 4.47, 95% CI 3.87–5.17].
Discussion
Compared to patients with only HIV or diabetes, patients with both diagnoses are at significantly increased risk of progressive CKD even after adjusting for traditional CKD risk factors. Future studies should evaluate the relative contribution of complex comorbidities and accompanying polypharmacy to the risk of CKD in HIV-infected individuals, and prospectively investigate the use of cART, glycemic control, and adjunctive therapy to delay CKD progression.
doi:10.1097/QAI.0b013e31825b70d9
PMCID: PMC3392432  PMID: 22592587
non-AIDS complications; HIV; chronic kidney disease; diabetes; risk factors
19.  Cross-Validation of Brain Structural Biomarkers and Cognitive Aging in a Community-Based Study 
International Psychogeriatrics / IPA  2012;24(7):1065-1075.
Background
Population-based studies face challenges in measuring brain structure relative to cognitive aging. We examined the feasibility of acquiring state-of-the-art brain MRI images at a community hospital, and attempted to cross-validate two independent approaches to image analysis.
Methods
Participants were 49 older adults (29 cognitively normal and 20 with mild cognitive impairment, MCI) drawn from an ongoing cohort study, with annual clinical assessments within one month of the scan, without overt cerebrovascular disease, and without dementia (Clinical Dementia Rating [CDR] <1). Brain MRI images, acquired at the local hospital using the Alzheimer's Disease Neuroimaging Initiative protocol, were analyzed using (1) a visual atrophy rating scale and (2) a semi-automated voxel-level morphometric method. Atrophy and volume measures were examined in relation to cognitive classification (any MCI and amnestic MCI vs. normal cognition), CDR (0.5 vs. 0), and presumed etiology.
Results
Measures indicating greater atrophy or smaller volume of the hippocampal formation and medial temporal lobe, and greater dilation of the ventricular space, were significantly associated with cognitive classification, CDR = 0.5, and presumed neurodegenerative etiology, independent of the image analytic method. Statistically significant correlations were also found between the visual ratings of medial temporal lobe atrophy and the semi-automated ratings of brain structural integrity.
Conclusions
High quality MRI data can be acquired and analyzed from older adults in population studies, enhancing their capacity to examine imaging biomarkers in relation to cognitive aging and dementia.
doi:10.1017/S1041610212000191
PMCID: PMC3391579  PMID: 22420888
20.  Gray's Time-Varying Coefficients Model for Posttransplant Survival of Pediatric Liver Transplant Recipients with a Diagnosis of Cancer 
Transplantation is often the only viable treatment for pediatric patients with end-stage liver disease. Making well-informed decisions on when to proceed with transplantation requires accurate predictors of transplant survival. The standard Cox proportional hazards (PH) model assumes that covariate effects on right-censored failure time are time-invariant; however, this assumption may not always hold. Gray's piecewise constant time-varying coefficients (PC-TVC) model offers greater flexibility to capture temporal changes in covariate effects without losing the mathematical simplicity of the Cox PH model. In the present work, we examined the Cox PH and Gray PC-TVC models in a posttransplant survival analysis of 288 pediatric liver transplant patients diagnosed with cancer. We obtained potential predictors through univariable (P < 0.15) and multivariable models with forward selection (P < 0.05); the selected predictors coincided for the Cox PH and Gray PC-TVC models. While the Cox PH model provided reasonable average results in estimating covariate effects on posttransplant survival, the Gray model using piecewise constant penalized splines showed in more detail how those effects change over time.
doi:10.1155/2013/719389
PMCID: PMC3665233  PMID: 23762197
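For item 20, a rough sketch of the piecewise-constant idea behind the Gray PC-TVC model, using simulated data and lifelines' counting-process Cox machinery (an approximation with hand-picked intervals, not Gray's penalized-spline estimator):

    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter
    from lifelines.utils import to_episodic_format

    rng = np.random.default_rng(4)
    n = 1000
    x = rng.binomial(1, 0.5, n)                       # a binary predictor
    # Hypothetical truth: x raises the hazard in the first 2 years, not afterwards.
    t_early = rng.exponential(scale=6 / np.exp(1.0 * x))
    t_late = 2 + rng.exponential(scale=6, size=n)
    time = np.where(t_early < 2, t_early, t_late)
    event = (time < 8).astype(int)
    time = np.minimum(time, 8)
    df = pd.DataFrame({"time": time, "event": event, "x": x})

    # Expand each subject into yearly (start, stop] episodes.
    long = to_episodic_format(df, duration_col="time", event_col="event", time_gaps=1.0)
    # Piecewise-constant coefficient: separate x effects before and after year 2.
    long["x_early"] = long["x"] * (long["stop"] <= 2)
    long["x_late"] = long["x"] * (long["stop"] > 2)
    long = long.drop(columns=["x"])

    ctv = CoxTimeVaryingFitter().fit(long, id_col="id", event_col="event",
                                     start_col="start", stop_col="stop")
    ctv.print_summary()

The fitted coefficient on x_early should be near 1 and on x_late near 0, recovering a time-varying effect that a single Cox PH coefficient would average away.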
21.  The impact of menopause on health-related quality of life: results from the STRIDE longitudinal study 
Quality of Life Research  2011;21(3):535-544.
Purpose
We examine the impact of menopausal status, beyond menopausal symptoms, on health-related quality of life (HRQoL).
Methods
Seven hundred thirty-two women aged 40–65, regardless of health condition or menopausal status, were enrolled from a single general internal medicine practice. Women completed annual questionnaires assessing HRQoL, menopausal status, and menopausal symptoms.
Results
The physical health composite of the RAND-36 is lower in late peri (45.6, P<.05), early post (45.4, P<.05), and late postmenopausal women (44.6, P<.01), and in those who report a hysterectomy (44.2, P<.01), compared to premenopausal women (47.1), with effect sizes of Cohen’s d = .12–.23. The mental health composite of the RAND-36 is lower in late peri (44.7, P<.01), early post (44.9, P<.01), and late postmenopausal women (45.0, P<.05), and in those who report a hysterectomy (44.2, P<.01), compared to premenopausal women (46.8), with effect sizes of Cohen’s d = .15–.20. Findings are comparable after adjustment for menopausal symptom frequency and bother.
Conclusions
Over a 5-year follow-up period, we found a negative impact of menopause on some domains of HRQoL, regardless of menopausal symptoms. Clinicians should be aware of this relationship and work to improve HRQoL, rather than expect it to improve spontaneously when menopausal symptoms resolve.
doi:10.1007/s11136-011-9959-7
PMCID: PMC3252474  PMID: 21755412
Menopause; Health-related quality of life; Hot flashes; Vaginal dryness; Women’s health
22.  Perceived Discrimination Predicts Longer Time to be Accepted for Kidney Transplant 
Transplantation  2012;93(4):423-429.
Background
Although end-stage kidney disease (ESKD) is four times as common in African Americans (AAs) as in Whites, AAs are less than half as likely to undergo kidney transplantation (KT). This racial disparity has been found even after controlling for clinical factors such as co-morbid conditions, dialysis vintage and type, and availability of potential living donors. Therefore, studying non-medical factors is critical to understanding disparities in KT.
Design, Setting, and Participants
We conducted a longitudinal cohort study with 127 AA and White patients with ESKD undergoing evaluation for KT (12/06 – 7/07) to determine whether, after controlling for medical factors, differences in time to acceptance for transplant are explained by patients’ cultural factors (e.g., perceived racism and discrimination, medical mistrust, religious objections to living donor KT), psychosocial characteristics (e.g., social support, anxiety, depression), or transplant knowledge. Participants completed 2 telephone interviews (shortly after initiation of transplant evaluation and after being accepted or found ineligible for transplant).
Results
Results indicated that AA patients reported higher levels of the cultural factors than did Whites. We found no differences in co-morbidity or availability of potential living donors. AAs took significantly longer to get accepted for transplant than did Whites (HR=1.49, p=0.005). After adjustment for demographic, psychosocial, and cultural factors, the association of race with longer time for listing was no longer significant.
Conclusions
We suggest that interventions to address racial disparities in KT incorporate key non-medical risk factors in patients.
doi:10.1097/TP.0b013e318241d0cd
PMCID: PMC3275654  PMID: 22228417
kidney transplantation; disparities; discrimination
23.  Delay in Seeking Care for Sexually Transmitted Diseases in Young Men and Women Attending a Public STD Clinic 
The Open AIDS Journal  2013;7:7-13.
Background:
Delay in seeking care for sexually transmitted diseases (STDs) has adverse consequences for both the individual and population. We sought to identify factors associated with delay in seeking care for STDs.
Methods:
Subjects included 300 young men and women (aged 15-24) attending an urban STD clinic for a new STD-related problem due to symptoms or referral for an STD screening. Subjects completed a structured interview that evaluated STD history, attitudes and beliefs about STDs, depression, substance use, and other factors possibly associated with delay. Delay was defined as waiting > 7 days to seek and obtain care for STDs.
Results:
Nearly one-third of participants delayed seeking care for > 7 days. Significant predictors of delay included self-referral for symptoms as the reason for visit (OR 5.3, 95% CI: 2.58 – 10.98), and the beliefs “my partner would blame me if I had an STD” (OR 2.44, 95% CI: 1.30 – 4.60) and “it’s hard to find time to get checked for STDs” (OR 3.62, 95% CI: 1.95 – 6.69), after adjusting for age, race, sex, and other factors. Agreeing with the statement “would use a STD test at home if one were available” was associated with lower odds of delay (OR 0.24, 95% CI: 0.09 – 0.60).
Conclusions:
Many young persons delay seeking care for STDs for a number of reasons. Strategies to improve STD care-seeking include encouragement of symptomatic persons to seek medical care more rapidly, reduction of social stigmas, and improved access to testing options.
doi:10.2174/1874613620130614002
PMCID: PMC3785038  PMID: 24078858
Delay; healthcare-seeking behavior; men; sexually transmitted diseases; symptoms; women.
24.  Adherence, virological and immunological outcomes for HIV-infected veterans starting combination antiretroviral therapies 
AIDS (London, England)  2007;21(12):1579-1589.
Objectives
We aimed to determine adherence, virological, and immunological outcomes one year after starting a first combination antiretroviral therapy (ART) regimen.
Design
Observational; synthesis of administrative, laboratory, and pharmacy data. Antiretroviral regimens were divided into efavirenz, nevirapine, boosted protease inhibitor (PI), and single PI categories. Propensity scores were used to control for confounding by treatment assignment. Adherence was estimated from pharmacy refill records.
Setting
Veterans Affairs Healthcare System, all sites.
Participants
HIV-infected individuals starting combination ART with a low likelihood of previous antiretroviral exposure.
Interventions
None.
Outcomes
The proportion of antiretroviral prescriptions filled as prescribed, change in log HIV-RNA, the proportion achieving HIV-RNA suppression, and change in CD4 cell count.
Results
A total of 6394 individuals unlikely to have previous antiretroviral exposure started combination ART between 1996 and 2004, and were eligible for analysis. Adherence overall was low (63% of prescriptions filled as prescribed), and adherence with efavirenz (67%) and nevirapine (65%) regimens was significantly greater than adherence with boosted PI (59%) or single PI (61%) regimens (P < 0.001). Efavirenz regimens were more likely to suppress HIV-RNA at one year (74%) compared with nevirapine (62%), boosted PI (63%), or single PI (53%) regimens (all P < 0.001), and this superiority was maintained when analyses were adjusted for baseline clinical characteristics and propensity for treatment assignment. Efavirenz also yielded more favorable immunological outcomes.
Conclusion
HIV-infected individuals initiating their first combination ART using an efavirenz-based regimen had improved virological and immunological outcomes and greater adherence levels.
doi:10.1097/QAD.0b013e3281532b31
PMCID: PMC3460378  PMID: 17630553
Adherence; resistance; ART; Veterans Affairs Healthcare System
25.  Terminal decline and practice effects in older adults without dementia 
Neurology  2011;77(8):722-730.
Objective:
To track cognitive change over time in dementia-free older adults and to examine terminal cognitive decline.
Methods:
A total of 1,230 subjects who remained free from dementia over 14 years of follow-up were included in a population-based epidemiologic cohort study. First, we compared survivors and decedents on their trajectories of 5 cognitive functions (learning, memory, language, psychomotor speed, executive functions), dissociating practice effects, which can mask clinically significant decline, from age-associated cognitive decline. We used longitudinal mixed-effects models with penalized linear splines. Second, limiting the sample to 613 subjects who died during follow-up, we identified the inflection points at which the rate of cognitive decline accelerated, in relation to time of death, controlling for practice effects. We used a mixed-effects model with a change point.
Results:
Age-associated cognitive trajectories were similar between decedents and survivors without dementia. However, substantial differences were observed between the trajectories of practice effects of survivors and decedents, resembling those usually observed between normal and mildly cognitively impaired elderly. Executive and language functions showed the earliest terminal declines, more than 9 years prior to death, independent of practice effects.
Conclusions:
Terminal cognitive decline in older adults without dementia may reflect presymptomatic disease which does not cross the clinical threshold during life. Alternatively, cognitive decline attributed to normal aging may itself represent underlying neurodegenerative or vascular pathology. Although we cannot conclude definitively from this study, the separation of practice effects from age-associated decline could help identify preclinical dementia.
doi:10.1212/WNL.0b013e31822b0068
PMCID: PMC3164394  PMID: 21832224
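A schematic of the change-point model in item 25 (simplified notation; the authors' models also used penalized linear splines and additional random effects):

    Y_{ij} = \beta_0 + b_{0i} + \beta_1\,\mathrm{age}_{ij} + \beta_2(\tau - d_{ij})_+ + \beta_3\,\mathbb{1}[j>1] + \varepsilon_{ij},
    \qquad b_{0i} \sim N(0,\sigma_b^2), \quad \varepsilon_{ij} \sim N(0,\sigma^2),

where d_{ij} is the time remaining until death at visit j and (τ − d_{ij})_+ = max(0, τ − d_{ij}) switches on only within τ years of death, so β_2 carries the extra terminal-decline slope and the change point τ is estimated from the data (more than 9 years before death for executive and language functions in this study). The indicator term is a simple stand-in for practice effects on repeated assessments.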
