Results 1-25 (28)
1.  Estimating Minimally Important Differences for Two Vision-Specific Quality of Life Measures 
Purpose.
To estimate minimally important differences (MIDs) for the Visual Activities Questionnaire (VAQ) and the National Eye Institute-Visual Function Questionnaire (NEI-VFQ).
Methods.
A total of 607 subjects with newly diagnosed open-angle glaucoma (OAG) were enrolled in the Collaborative Initial Glaucoma Treatment Study (CIGTS) and randomized to initial treatment with medications or surgery. Subjects underwent an ophthalmic examination and a telephone-administered quality of life (QOL) interview before randomization and every six months thereafter. The VAQ and NEI-VFQ were used to assess participants' perceptions of their visual function. Clinical measures included the mean deviation (MD) from Humphrey 24-2 full threshold visual field (VF) testing and best-corrected visual acuity (VA) measured using the Early Treatment Diabetic Retinopathy Study (ETDRS) protocol. Anchor-based (using MD and VA) and distribution-based methods were used to estimate MIDs.
Results.
Anchor-based cross-sectional analyses at 66 months follow-up found a 10-letter increment in better eye VA corresponded to MIDs of 5.2 units for VAQ and 3.8 units for NEI-VFQ total scores. A 3-dB increment in the better eye MD yielded MIDs of 2.6 and 2.3 units for the same two questionnaires. In longitudinal analyses, MIDs for the VAQ were 3.2 units for a 10-letter change of VA and 3.4 units for a 3-dB change in the MD. Distribution-based MIDs were larger.
Conclusions.
A range of MIDs for the VAQ (2.6–6.5 units) and NEI-VFQ (2.3–3.8 units) was found. Although similar in magnitude, MIDs were sensitive to the MID estimation method, the anchor chosen, and differences between questionnaires. (ClinicalTrials.gov number, NCT00000149.)
Minimally important differences (MIDs) were estimated for two vision-specific quality of life measures, VAQ and NEI-VFQ, using anchor-based cross-sectional, anchor-based longitudinal, and distribution-based methods. Subscale MIDs for VAQ (2.6–6.5 units) and NEI-VFQ (2.3–3.8 units) were found.
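To make the anchor-based approach concrete, the sketch below regresses change in questionnaire score on change in the clinical anchor and scales the slope to a 10-letter increment; the data, slope, and noise levels are simulated assumptions for illustration, not CIGTS estimates.

```python
# Minimal sketch of anchor-based MID estimation (simulated data; the slope
# of 0.4 and the noise levels are assumptions, not CIGTS values).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
va_change = rng.normal(0, 8, size=200)                     # change in VA (ETDRS letters)
qol_change = 0.4 * va_change + rng.normal(0, 5, size=200)  # change in VAQ score

# Regress the QOL change on the clinical anchor; the MID is then the
# model-implied QOL change for a 10-letter change in acuity.
fit = sm.OLS(qol_change, sm.add_constant(va_change)).fit()
mid_per_10_letters = 10 * fit.params[1]
print(f"anchor-based MID: {mid_per_10_letters:.1f} units per 10 letters")
```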
doi:10.1167/iovs.13-13683
PMCID: PMC4095718  PMID: 24906863
MID; glaucoma; CIGTS; NEI-VFQ; VAQ
2.  Sex-Specific Differences in Hemodialysis Prevalence and Practices and the Male-to-Female Mortality Rate: The Dialysis Outcomes and Practice Patterns Study (DOPPS) 
PLoS Medicine  2014;11(10):e1001750.
In this study, Port and colleagues describe hemodialysis prevalence and patient characteristics by sex, compare the male-to-female mortality rate with data from the general population, and evaluate sex interactions with mortality. The results show that women's survival advantage was markedly diminished in hemodialysis patients.
Please see later in the article for the Editors' Summary
Background
A comprehensive analysis of sex-specific differences in the characteristics, treatment, and outcomes of individuals with end-stage renal disease undergoing dialysis might reveal treatment inequalities and targets to improve sex-specific patient care. Here we describe hemodialysis prevalence and patient characteristics by sex, compare the adult male-to-female mortality rate with data from the general population, and evaluate sex interactions with mortality.
Methods and Findings
We assessed the Human Mortality Database and 206,374 patients receiving hemodialysis from 12 countries (Australia, Belgium, Canada, France, Germany, Italy, Japan, New Zealand, Spain, Sweden, the UK, and the US) participating in the international, prospective Dialysis Outcomes and Practice Patterns Study (DOPPS) between June 1996 and March 2012. Among 35,964 sampled DOPPS patients with full data collection, we studied patient characteristics (descriptively) and mortality (via Cox regression) by sex. In all age groups, more men than women were on hemodialysis (59% versus 41% overall), with large differences observed between countries. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than in women. The male-to-female mortality rate ratio in the general population varied from 1.5 to 2.6 for age groups <75 y, but in hemodialysis patients was close to one. Compared to women, men were younger (mean ± standard deviation: 61.9±14.6 versus 63.1±14.5 y), less frequently obese, more frequently married and recipients of a kidney transplant, more frequently affected by coronary artery disease, and less frequently depressed. Interaction analyses showed that the mortality risk associated with several comorbidities and hemodialysis catheter use was lower for men (hazard ratio [HR] = 1.11) than for women (HR = 1.33, interaction p<0.001). This study is limited by its inability to establish causality for the observed sex-specific differences, and it does not provide information about patients not treated with dialysis or dying prior to a planned start of dialysis.
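As an illustration of the interaction analysis described above, the sketch below fits a Cox model with a sex-by-catheter term to synthetic data using the lifelines package (the formula argument assumes a recent lifelines version); the covariates and effect sizes are invented, not DOPPS values.

```python
# Minimal sketch of testing a sex x catheter interaction in a Cox model,
# with synthetic data standing in for DOPPS.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "catheter": rng.integers(0, 2, n),
    "age": rng.normal(62, 14, n),
})
# Hypothetical hazards: catheter use raises risk more in women than in men.
risk = 0.1 * np.exp(0.02 * (df.age - 62) + 0.3 * df.catheter
                    - 0.15 * df.male * df.catheter)
df["time"] = rng.exponential(1 / risk)
df["event"] = (df.time < 5).astype(int)
df["time"] = df.time.clip(upper=5)          # administrative censoring at 5 y

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event",
        formula="age + male + catheter + male:catheter")
cph.print_summary()  # interaction row tests whether the catheter HR differs by sex
```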
Conclusions
Women's survival advantage was markedly diminished in hemodialysis patients. The finding that fewer women than men were being treated with dialysis for end-stage renal disease merits detailed further study, as the large discrepancies in sex-specific hemodialysis prevalence by country and age group are likely explained by factors beyond biology. Modifiable variables, such as catheter use, showing significant sex interactions suggest interventional targeting.
Editors' Summary
Background
Throughout life, the kidneys filter waste products (from the normal breakdown of tissues and from food) and excess water from the blood to make urine. Chronic kidney disease—an increasingly common condition globally—gradually destroys the kidney's filtration units (the nephrons). As the nephrons stop working, the rate at which the blood is filtered (the glomerular filtration rate) decreases, and waste products build up in the blood, eventually leading to life-threatening end-stage kidney (renal) disease. Symptoms of chronic kidney disease, which rarely occur until the disease is advanced, include tiredness, swollen feet and ankles, and frequent urination, particularly at night. Chronic kidney disease cannot be cured, but its progression can be slowed by controlling diabetes and other conditions that contribute to its development. End-stage kidney disease is treated by regular hemodialysis (a process in which blood is cleaned by passing it through a filtration machine) or by kidney transplantation.
Why Was This Study Done?
Like many other long-term conditions, the prevalence (the proportion of the population that has a specific disease) of chronic kidney disease and of end-stage renal disease, and treatment outcomes for these conditions, may differ between men and women. Some of these sex-specific differences may arise because of sex-specific differences in normal biological functions. Other sex-specific differences may be related to sex-specific differences in patient care or in patient awareness of chronic kidney disease. A comprehensive analysis of sex-specific differences among individuals with end-stage renal disease might identify both treatment inequalities and ways to improve sex-specific care. Here, in the Dialysis Outcomes and Practice Patterns Study (DOPPS), the researchers investigate sex-specific differences in the prevalence and practices of hemodialysis and in the characteristics of patients undergoing hemodialysis, and investigate the adult male-to-female mortality (death) rate among patients undergoing hemodialysis. The DOPPS is a prospective cohort study that is investigating the characteristics, treatment, and outcomes of adult patients undergoing hemodialysis in representative facilities in 19 countries (12 countries were available for analysis at the time of the current study).
What Did the Researchers Do and Find?
To investigate sex-specific differences in hemodialysis prevalence, the researchers compared data from the Human Mortality Database, which provides detailed population and mortality data for 37 countries, with data collected by the DOPPS. Forty-one percent of DOPPS patients were women, compared to 52% of the general population in 12 of the DOPPS countries. Next, the researchers used data collected from a randomly selected subgroup of patients to examine sex-specific differences in patient characteristics and mortality. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than women. Moreover, men were more frequently recipients of a kidney transplant than women. Notably, although in the general population in a given age group women were less likely to die than men, among hemodialysis patients, women were as likely to die as men. Finally, the researchers investigated which patient characteristics were associated with the largest sex-specific differences in mortality risk. The use of a hemodialysis catheter (a tube that is inserted into a patient's vein to transfer their blood into the hemodialysis machine) was associated with a lower mortality risk in men than in women.
What Do These Findings Mean?
These findings show that, among patients treated with hemodialysis for end-stage renal disease, women differ from men in many ways. Although some of these sex-specific differences may be related to biology, others may be related to patient care and to patient awareness of chronic kidney disease. Because this is an observational study, these findings cannot prove that the reported differences in hemodialysis prevalence, treatment, and mortality are actually caused by being a man or a woman. Importantly, however, these findings suggest that hemodialysis may abolish the survival advantage that women have over men in the general population and that fewer women than men are being treated for end-stage renal disease, even though chronic kidney disease is more common in women than in men. Finally, the finding that the use of hemodialysis catheters for access to veins is associated with a higher mortality risk among women than among men suggests that, where possible, women should be offered a surgical process called arteriovenous fistula placement, which is recommended for access to veins during long-term hemodialysis but which may, in the past, have been underused in women.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001750.
More information about the DOPPS program is available
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease and about hemodialysis, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease and about hemodialysis (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including information and personal stories about hemodialysis
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
MedlinePlus has pages about chronic kidney disease and about hemodialysis
doi:10.1371/journal.pmed.1001750
PMCID: PMC4211675  PMID: 25350533
3.  High density lipoprotein is targeted for oxidation by myeloperoxidase in rheumatoid arthritis 
Annals of the rheumatic diseases  2013;72(10):1725-1731.
Objective
Phagocyte-derived myeloperoxidase (MPO) and pro-inflammatory high density lipoprotein (HDL) associate with rheumatoid arthritis (RA), but the link between MPO and HDL has not been systematically examined. In this study we investigated whether MPO can oxidize HDL and determined the MPO-specific oxidative signature of apoA1 by peptide mapping in RA subjects with and without known cardiovascular disease (CVD).
Methods
Two MPO oxidation products, 3-chlorotyrosine and 3-nitrotyrosine, were quantified by tandem mass spectrometry (MS/MS) in in vitro model system studies and in plasma and HDL derived from healthy controls and RA subjects. MPO levels and cholesterol efflux were determined. Site-specific nitration and chlorination of apoA1 peptides were quantified by MS/MS.
Results
RA subjects demonstrated higher levels of MPO, MPO-oxidized HDL, and diminished cholesterol efflux. There was a marked increase in MPO-specific 3-chlorotyrosine and 3-nitrotyrosine content in HDL in RA subjects, consistent with specific targeting of HDL, with increased nitration in RA subjects with CVD. Cholesterol efflux capacity was diminished in RA subjects and correlated inversely with HDL 3-chlorotyrosine, suggesting a mechanistic role for MPO. Nitrated HDL was elevated in RA subjects with CVD compared with RA subjects without CVD. Oxidative peptide mapping revealed site-specific unique oxidation signatures on apoA1 for RA subjects without and with CVD.
Conclusion
We report an increase in MPO-mediated HDL oxidation that is regiospecific in RA and accentuated in those with CVD. Decreased cholesterol efflux capacity due to MPO-mediated chlorination is a potential mechanism for atherosclerosis in RA and raises the possibility that oxidant-resistant forms of HDL may attenuate this increased risk.
doi:10.1136/annrheumdis-2012-202033
PMCID: PMC4151549  PMID: 23313808
Rheumatoid Arthritis; Myeloperoxidase; High Density Lipoprotein; Mass Spectrometry; Oxidative stress
4.  Computerized Assessment of Competence-Related Abilities in Living Liver Donors: The Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) 
Clinical transplantation  2013;27(4):633-645.
Background
Despite its importance, determination of competence to consent to organ donation varies widely based on local standards. We piloted a new tool to aid transplant centers in donor assessment.
Methods
We assessed competence-related abilities among potential living liver donors (LDs) in the 9-center A2ALL study. Prospective LDs viewed an educational video, and were queried to assess Understanding, Appreciation, Reasoning, and ability to express a Final Choice using the MacArthur Competence Assessment Tool for Clinical Research, adapted for computerized administration in LDs (“MacLiver”). Videotaped responses were scored by a clinical neuropsychologist (JF).
Results
Ninety-three LDs were assessed. Mean (standard deviation; domain maximum) scores were: Understanding: 18.1 (2.6; max=22), Appreciation: 5.1 (1.0; max=6), Reasoning: 3.1 (0.8; max=4), and Final Choice: 3.8 (0.5; max=4). Scores did not differ by demographics, relationship to the recipient, eligibility to donate, or eventual donation (p>0.4). Higher education was associated with greater Understanding (p=0.004) and Reasoning (p=0.03).
Conclusion
Standardized, computerized education with independent ratings of responses may (1) alert the clinical staff to potential donors who may not be competent to donate, and (2) highlight areas needing further assessment and education, leading to better informed decision-making.
doi:10.1111/ctr.12184
PMCID: PMC4096778  PMID: 23859354
Living Donation; Comprehension; MacArthur Competence Assessment Tool for Clinical Research; Informed Consent; Ethics; Transplantation
5.  A Randomized Controlled Trial of Pretransplant Antiviral Therapy to Prevent Recurrence of Hepatitis C after Liver Transplantation 
Hepatology (Baltimore, Md.)  2013;57(5):1752-1762.
Hepatitis C virus (HCV) infection recurs in liver recipients who are viremic at transplantation. We conducted a randomized controlled trial to test the efficacy and safety of pre-transplant pegylated interferon alfa-2b plus ribavirin (Peg-IFN-a2b/RBV) for prevention of post-transplant HCV recurrence. Enrollees had HCV and were listed for liver transplantation, with either potential living donors or a MELD upgrade for hepatocellular carcinoma. Patients with HCV genotypes (G) 1/4/6 (n=44/2/1) were randomized 2:1 to treatment (n=31) or untreated control (n=16); patients with HCV G2/3 (n=32) were assigned to treatment. Overall, 59 were treated and 20 were not. Peg-IFN-a2b, starting at 0.75 μg/kg/wk, and RBV, starting at 600 mg/d, were escalated as tolerated. Patients assigned to treatment versus control had similar baseline characteristics. Combined virologic response (CVR) included pre-transplant sustained virologic response (SVR12) and post-transplant virologic response (pTVR), defined as undetectable HCV RNA 12 weeks after the end of treatment or after transplant, respectively. In intent-to-treat analyses, 12 (19%) assigned to treatment and 1 (6%) assigned to control achieved CVR (p=0.29); per-protocol values were 13 (22%) and 0 (0%) (p=0.03). Among treated G1/4/6 patients, 23/30 received transplants, of whom 22% had pTVR; among treated G2/3 patients, 21/29 received transplants, of whom 29% had pTVR. pTVR was 0%, 18%, and 50% in patients treated for <8, 8–16, and >16 weeks, respectively (p=0.01). Serious adverse events (SAEs) occurred with similar frequency in treated versus untreated patients (68% vs. 55%, p=0.30), but the number of SAEs per patient was higher in the treated group (2.7 vs. 1.3, p=0.003).
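The reported p-values are consistent with a two-sided Fisher exact test on the abstract's counts; a quick check, assuming that test and the stated denominators (63 randomized to treatment vs. 16 to control; 59 treated vs. 20 untreated per protocol):

```python
# Quick check of the reported p-values from the abstract's counts using a
# two-sided Fisher exact test.
from scipy.stats import fisher_exact

_, p_itt = fisher_exact([[12, 63 - 12], [1, 16 - 1]])  # ITT: 12/63 vs 1/16
_, p_pp = fisher_exact([[13, 59 - 13], [0, 20]])       # PP: 13/59 vs 0/20
print(f"ITT p = {p_itt:.2f}, per-protocol p = {p_pp:.2f}")  # ~0.29, ~0.03
```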
Conclusion
Pretransplant treatment with PEGIFN/RBV prevents post-transplant recurrence of HCV in selected patients. Efficacy is higher with >16 weeks of treatment, but treatment is associated with increased risk of potentially serious complications.
doi:10.1002/hep.25976
PMCID: PMC3510348  PMID: 22821361
Peginterferon alfa-2b; Ribavirin; Cirrhosis; Waiting List; post-transplant virologic response
6.  Functional Elements Associated with Hepatic Regeneration in Living Donors after Right Hepatic Lobectomy 
We quantified rates of hepatic regeneration and functional recovery for 6 months after right hepatic lobectomy in living donors for liver transplantation.
Twelve donors were studied at baseline; eight were retested at (mean±SD) 11±3 days (T1), 10 at 91±9 days (T2), and 10 at 185±17 days (T3) after donation. Liver and spleen volumes were measured by computed tomography (CT) and single photon emission computed tomography (SPECT). Hepatic metabolism was assessed from caffeine and erythromycin, and hepatic blood flow from cholates, galactose, and perfused hepatic mass (PHM, by SPECT).
Regeneration rates (mL liver per kg body weight per day) were 0.60±0.22 from baseline to T1, 0.05±0.02 from T1 to T2, and 0.01±0.01 from T2 to T3 by CT, and 0.54±0.20, 0.04±0.01, and 0.01±0.02, respectively, by SPECT. At T3, liver volume was 84±7% of baseline by CT and 92±13% by SPECT. Changes in hepatic metabolism did not achieve statistical significance. At T1, unadjusted clearance ratios relative to baseline were 0.75±0.07 for intravenous cholate (p=0.0001), 0.88±0.15 for galactose (p=0.0681), 0.84±0.08 (p=0.002) for PHM, and 0.83±0.19 (p=0.056) for estimated hepatic blood flow. These ratios approached 1.00 by T3. At T1, ratios adjusted per L liver were 20%-50% greater than baseline and trended toward baseline by T3. Several findings were consistent with alteration of the portal circulation: increased cholate shunt, increased spleen volume, decreased platelet count, and decreased clearance of orally administered cholate.
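As a unit check on these regeneration rates (mL of liver per kg of body weight per day), the back-of-envelope sketch below uses entirely hypothetical volumes for a single donor, not study data:

```python
# Back-of-envelope illustration of the rate units (mL liver per kg body
# weight per day); all values below are hypothetical, not study data.
baseline_ml = 1500.0        # total liver volume on CT before donation
resected_fraction = 0.6     # assume the right lobe was ~60% of the liver
t1_ml = 900.0               # remnant liver volume measured at T1
weight_kg, days_to_t1 = 70.0, 11

remnant_ml = baseline_ml * (1 - resected_fraction)  # volume left at surgery
rate = (t1_ml - remnant_ml) / weight_kg / days_to_t1
print(f"regeneration rate: {rate:.2f} mL/kg/day")   # 0.39 for these inputs
```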
During the first 2 weeks after donation, hepatic regeneration is rapid and accounts for nearly two-thirds of total regeneration. Increases in hepatic blood flow and uptake of cholate characterize the early phase of regeneration. Right lobe donation alters the portal circulation of living donors, but long-term clinical consequences, if any, are unknown.
doi:10.1002/lt.23592
PMCID: PMC3600052  PMID: 23239552
cholate; SPECT liver-spleen scan; erythromycin; caffeine; galactose
7.  Major bleeding events and risk stratification of antithrombotic agents in hemodialysis: Results from the DOPPS 
Kidney international  2013;84(3):10.1038/ki.2013.170.
Benefits and risks of antithrombotic agents remain unclear in the hemodialysis population. We aimed to determine variation in antithrombotic agent use and rates of major bleeding events, and to identify factors predictive of stroke and bleeding, allowing risk stratification and more rational decisions about the use of antithrombotic agents.
The sample included 48,144 patients in 12 countries in the Dialysis Outcomes and Practice Patterns Study Phases I–IV. Antithrombotic agents included oral anticoagulants (OAC), aspirin (ASA), and antiplatelet agents (APA). OAC prescription, comorbidities, and vascular access were assessed at study entry; data on clinical events, including hospitalization due to bleeding, were collected every four months during follow-up.
There was wide variation in OAC (0.3–18%), APA (3–25%) and ASA use (8–36%), and major bleeding rates (0.05–0.22 events/year) among countries. Rates of all-cause mortality, cardiovascular mortality, and bleeding events requiring hospitalization were elevated in patients prescribed OAC across adjusted models. The CHADS2 score predicted the risk of stroke in atrial fibrillation patients. Gastrointestinal bleeding in the past 12 months was highly predictive of major bleeding events; for patients with previous gastrointestinal bleeding, the rate of bleeding exceeded the rate of stroke by at least 2-fold across categories of CHADS2 score.
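For reference, the CHADS2 score cited above assigns 1 point each for congestive heart failure, hypertension, age 75 or older, and diabetes, and 2 points for prior stroke or TIA; a minimal sketch:

```python
# Minimal sketch of the CHADS2 stroke-risk score for atrial fibrillation:
# 1 point each for congestive heart failure, hypertension, age >= 75, and
# diabetes; 2 points for prior stroke/TIA.
def chads2(chf: bool, hypertension: bool, age: int,
           diabetes: bool, prior_stroke_tia: bool) -> int:
    return (int(chf) + int(hypertension) + int(age >= 75)
            + int(diabetes) + 2 * int(prior_stroke_tia))

# Example: a 78-year-old patient with hypertension and diabetes scores 3.
print(chads2(chf=False, hypertension=True, age=78,
             diabetes=True, prior_stroke_tia=False))
```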
Prescription of antithrombotic agents varied greatly. The CHADS2 score and a history of gastrointestinal bleeding were predictive of stroke and bleeding events, respectively, with bleeding rates substantially exceeding stroke rates in all groups including patients at high stroke risk. Appropriate risk stratification and a cautious approach should be considered before OAC use in the dialysis population.
doi:10.1038/ki.2013.170
PMCID: PMC3885984  PMID: 23677245
8.  Major Depressive Disorder in a Family Study of Obsessive-Compulsive Disorder with Pediatric Probands 
Depression and anxiety  2011;28(6):10.1002/da.20824.
Objective
The study examined the comorbidity of obsessive-compulsive disorder (OCD) with major depressive disorder (MDD) in a family study of OCD with pediatric probands.
Method
The study examined the lifetime prevalence of MDD in 133 first- and 459 second-degree relatives of pediatric probands with OCD and of normal controls, and identified clinical variables associated with MDD in case first-degree relatives (FDR). All available FDR were directly interviewed by interviewers blind to proband status; parents were also interviewed to assess the family psychiatric history of FDR and second-degree relatives (SDR). Best-estimate diagnoses were made using all sources of information. Data were analyzed with logistic regression and robust Cox regression models.
Results
Lifetime MDD prevalence was higher in case than control FDR (30.4% vs. 15.4%). Lifetime MDD prevalence was higher in FDR of case probands with MDD than in either FDR of case probands without MDD (46.3% vs. 19.7%) or control FDR (46.3% vs. 15.4%). MDD in case FDR was associated with MDD in case probands and with age and OCD in those relatives. Lifetime MDD prevalence was similar in case and control SDR. However, lifetime MDD prevalence was higher in SDR of case probands with MDD than in either SDR of case probands without MDD (31.9% vs. 16.8%) or control SDR (31.9% vs. 15.4%).
Conclusions
The results provide further evidence of the heterogeneity of early-onset OCD and suggest that early-onset OCD comorbid with MDD is a complex familial syndrome.
doi:10.1002/da.20824
PMCID: PMC3859241  PMID: 21538726
anxiety; comorbidity; first-degree relatives; second-degree relatives; logistic regression; survival analysis
9.  Hemoglobin A1c Levels and Mortality in the Diabetic Hemodialysis Population 
Diabetes Care  2012;35(12):2527-2532.
OBJECTIVE
Lowering hemoglobin A1c to <7% reduces the risk of microvascular complications of diabetes, but the importance of maintaining this target in diabetes patients with kidney failure is unclear. We evaluated the relationship between A1c levels and mortality in an international prospective cohort study of hemodialysis patients.
RESEARCH DESIGN AND METHODS
Included were 9,201 hemodialysis patients from 12 countries (Dialysis Outcomes and Practice Patterns Study 3 and 4, 2006–2010) with type 1 or type 2 diabetes and at least one A1c measurement during the first 8 months after study entry. Associations between A1c and mortality were assessed with Cox regression, adjusting for potential confounders.
RESULTS
The association between A1c and mortality was U-shaped. Compared with an A1c of 7–7.9%, the hazard ratios (95% CI) for A1c levels were 1.35 (1.09–1.67) for <5%, 1.18 (1.01–1.37) for 5–5.9%, 1.21 (1.05–1.41) for 6–6.9%, 1.16 (0.94–1.43) for 8–8.9%, and 1.38 (1.11–1.71) for ≥9.0%, after adjustment for age, sex, race, BMI, serum albumin, years of dialysis, serum creatinine, 12 comorbid conditions, insulin use, hemoglobin, LDL cholesterol, country, and study phase. Diabetes medications were prescribed for 35% of patients with A1c <6% and not prescribed for 29% of those with A1c ≥9%.
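A minimal sketch of the categorical A1c coding implied by these hazard ratios, with 7–7.9% as the reference level (hypothetical values; the published models adjust for many more covariates):

```python
# Bucket A1c values into the abstract's categories; in a hazard model these
# become indicator variables with 7-7.9% as the reference group.
import numpy as np
import pandas as pd

bins = [0, 5, 6, 7, 8, 9, np.inf]
labels = ["<5", "5-5.9", "6-6.9", "7-7.9", "8-8.9", ">=9"]
a1c = pd.Series([4.8, 6.2, 7.5, 9.4])                   # hypothetical patients
cat = pd.cut(a1c, bins=bins, labels=labels, right=False)
print(cat.tolist())                                     # ['<5', '6-6.9', '7-7.9', '>=9']

# Dummy-code against the reference category before fitting a hazard model:
dummies = pd.get_dummies(cat).drop(columns="7-7.9")
```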
CONCLUSIONS
A1c levels strongly predicted mortality in hemodialysis patients with type 1 or type 2 diabetes. Mortality increased as A1c moved further from 7–7.9%; thus, the target A1c in hemodialysis patients may encompass values higher than those recommended by current guidelines. Adjusting glucose-lowering regimens to target A1c within this range may be a modifiable practice that improves outcomes.
doi:10.2337/dc12-0573
PMCID: PMC3507600  PMID: 22912431
10.  Severe Psychiatric Problems in Right Hepatic Lobe Donors for Living Donor Liver Transplantation 
Transplantation  2007;83(11):1506-1508.
Background
The morbidity and mortality from donation of a right hepatic lobe for living donor liver transplantation (LDLT) are an important issue for this procedure. We report the prevalence of severe psychiatric postoperative complications from the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL), which was established to define the risks and benefits of LDLT for donors and recipients.
Methods
Severe psychiatric complications were assessed in all donors in the A2ALL study who were evaluated between 1998 and February 2003.
Results
Of the 392 donors, 16 (4.1%) had one or multiple psychiatric complications, including three severe psychiatric complications (suicide, accidental drug overdose, and suicide attempt).
Conclusions
Despite extensive preoperative screening, some donors experience severe psychiatric complications, including suicide, after liver donation. Psychiatric assessment and monitoring of liver donors may help to understand and prevent such tragic events.
doi:10.1097/01.tp.0000263343.21714.3b
PMCID: PMC3762598  PMID: 17565325
Transplantation; Psychiatric; Morbidity; Liver
11.  Spironolactone and colitis: Increased mortality in rodents and in humans 
Inflammatory Bowel Diseases  2011;18(7):1315-1324.
Background and Methods
Crohn’s disease causes intestinal inflammation leading to intestinal fibrosis. Spironolactone is an anti-fibrotic medication commonly used in heart failure to reduce mortality. We examined whether spironolactone is anti-fibrotic in the context of intestinal inflammation. In vitro, spironolactone repressed fibrogenesis in TGFβ-stimulated human colonic myofibroblasts. However, spironolactone therapy significantly increased mortality in two rodent models of inflammation-induced intestinal fibrosis, suggesting spironolactone could be harmful during intestinal inflammation. Since IBD patients rarely receive spironolactone therapy, we examined whether spironolactone use was associated with mortality in a common cause of inflammatory colitis, Clostridium difficile infection (CDI).
Results
Spironolactone use during CDI was associated with increased mortality in a retrospective cohort of 4008 inpatients (15.9% vs. 9.1%, n=390 deaths, p<0.0001). In patients without liver disease, the adjusted OR for inpatient mortality associated with 80 mg spironolactone was 1.99 (95% CI: 1.51–2.63). In contrast to the main effect of spironolactone on mortality, multivariable modeling revealed a protective interaction between liver disease and spironolactone dose. The adjusted odds ratio for mortality after CDI was 1.96 (95% CI: 1.50–2.55) for patients without liver disease on spironolactone vs. 1.28 (95% CI: 0.82–2.00) for patients with liver disease on spironolactone, when compared to a reference group without liver disease or spironolactone use.
Conclusions
We propose that discontinuing spironolactone during CDI in patients without liver disease could reduce hospital mortality approximately 2-fold, potentially averting 35,000 CDI-associated deaths annually across Europe and the US.
doi:10.1002/ibd.21929
PMCID: PMC3288762  PMID: 22081497
12.  Antibody Levels to Persistent Pathogens and Incident Stroke in Mexican Americans 
PLoS ONE  2013;8(6):e65959.
Background
Persistent pathogens have been proposed as risk factors for stroke; however, the evidence remains inconclusive. Mexican Americans have an increased risk of stroke especially at younger ages, as well as a higher prevalence of infections caused by several persistent pathogens.
Methodology/Principal Findings
Using data from the Sacramento Area Latino Study on Aging (n = 1621), the authors used discrete-time regression to examine associations between stroke risk and (1) immunoglobulin G antibody levels to Helicobacter pylori (H. pylori), Cytomegalovirus, Varicella Zoster Virus, Toxoplasma gondii, and Herpes simplex virus 1, and (2) concurrent exposure to several pathogens (pathogen burden), defined as (a) summed sero-positivity, (b) number of pathogens eliciting high antibody levels, and (c) average antibody level. Models were adjusted for socio-demographics and stroke risk factors. Antibody levels to H. pylori predicted incident stroke in fully adjusted models (Odds Ratio: 1.58; 95% Confidence Interval: 1.09, 2.28). No significant associations were found between stroke risk and antibody levels to the other four pathogens. No associations were found for pathogen burden and incident stroke in fully adjusted models.
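For readers unfamiliar with discrete-time regression, the sketch below expands synthetic subjects into one row per interval at risk and fits a logistic model to the interval-specific event indicator; the exposure effect and risks are invented, not SALSA estimates.

```python
# Minimal sketch of discrete-time survival analysis: expand each subject
# into one row per interval at risk, then fit a logistic model to the
# interval-specific event indicator (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for i in range(2000):
    hp = int(rng.integers(0, 2))         # elevated H. pylori antibodies (0/1)
    for t in range(1, 11):               # up to 10 annual intervals
        p = 0.01 * (1.6 if hp else 1.0)  # assumed per-interval stroke risk
        event = int(rng.random() < p)
        rows.append({"interval": t, "hp": hp, "stroke": event})
        if event:
            break                        # no person-time after the event

pp = pd.DataFrame(rows)
fit = smf.logit("stroke ~ hp + C(interval)", data=pp).fit(disp=0)
print(np.exp(fit.params["hp"]))          # OR for the exposure, ~1.6 here
```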
Conclusions/Significance
Our results suggest that exposure to H. pylori may be a stroke risk factor in Mexican Americans and may contribute to ethnic differences in stroke risk given the increased prevalence of exposure to H. pylori in this population. Future studies are needed to confirm this association.
doi:10.1371/journal.pone.0065959
PMCID: PMC3682951  PMID: 23799066
13.  Predictors of heart rate variability and its prognostic significance in chronic kidney disease 
Background.
Heart rate variability (HRV), a noninvasive measure of autonomic dysfunction and a risk factor for cardiovascular disease (CVD), has not been systematically studied in nondialysis chronic kidney disease (CKD).
Methods.
HRV was assessed using 24-h Holter monitoring in 305 subjects from the Renal Research Institute-CKD Study, a four-center prospective cohort of CKD (Stages 3–5). Multiple linear regression was used to assess predictors of HRV (both time and frequency domain) and Cox regression used to predict outcomes of CVD, composite of CVD/death and end-stage renal disease (ESRD).
Results.
A total of 47 CVD, 67 ESRD, and 24 death events occurred over a median follow-up of 2.7 years. Lower HRV was significantly associated with older age, female gender, diabetes, higher heart rate, higher C-reactive protein and phosphorus, lower serum albumin, and Stage 5 CKD. Lower HRV (mostly frequency domain) was significantly associated with higher risk of CVD and of the composite end point of CVD or death. Lower HRV (frequency domain) was also significantly associated with higher risk of progression to ESRD, although this effect was relatively weaker.
Conclusions.
This study draws attention to the importance of HRV as a relatively under-recognized predictor of adverse cardiovascular and renal outcomes in patients with nondialysis CKD. Whether interventions that improve HRV will improve these outcomes in this high-risk population deserves further study.
doi:10.1093/ndt/gfr340
PMCID: PMC3616757  PMID: 21765187
autonomic nervous system; cardiovascular disease risk factors; cardiovascular outcomes; cohort study; end-stage renal disease
14.  Clinical Characteristics of Newly Diagnosed Primary, Pigmentary, and Pseudoexfoliative Open-Angle Glaucoma in the Collaborative Initial Glaucoma Treatment Study 
The British journal of ophthalmology  2012;96(9):1180-1184.
Background/Aims
Three types of open-angle glaucoma (OAG) – primary, pigmentary, and pseudoexfoliative – are frequently encountered. The aim of this study was to compare demographic, ocular, and systemic medical information collected on people with these three OAG types at diagnosis, and determine if the OAG type affected prognosis.
Methods
Information on 607 participants of the Collaborative Initial Glaucoma Treatment Study was accessed. Descriptive statistics characterized their demographic, ocular, and medical status at diagnosis. Comparisons were made using analysis of variance (ANOVA), and chi-square or Fisher exact tests. Multinomial, mixed, and logistic regression analyses were also performed.
Results
Relative to people with primary OAG, those with pigmentary OAG were younger, more likely to be white, less likely to have a family history of glaucoma, and more myopic. Those with pseudoexfoliative OAG were older, more likely to be white, more likely to be female, less likely to have bilateral disease, and presented with higher intraocular pressure and better visual acuity. The type of glaucoma was not associated with intraocular pressure or visual field progression during follow-up.
Conclusion
Characteristics of newly-diagnosed enrollees differed by the type of OAG. While some of these differences relate to the pathogenesis of OAG type, other differences are noteworthy for further evaluation within population-based samples of subjects with newly-diagnosed OAG.
doi:10.1136/bjophthalmol-2012-301820
PMCID: PMC3480313  PMID: 22773091
Glaucoma; Epidemiology
15.  Evaluating clinical change and visual function concerns in drivers and non-drivers with glaucoma 
Purpose
To compare drivers and non-drivers, and describe the specific concerns of drivers, among individuals with glaucoma.
Methods
607 newly diagnosed glaucoma patients from 14 clinical centers of the Collaborative Initial Glaucoma Treatment Study were randomly assigned to initial medicine or surgery and followed every six months for up to 5 years. Driving status (drivers vs. non-drivers) and patient-reported visual function were determined by the Visual Activities Questionnaire and the National Eye Institute Visual Function Questionnaire. Clinical evaluation included visual field mean deviation (MD) and visual acuity. Statistical comparisons were made using t, chi-square, and exact tests, regression, and Rasch analyses.
Results
Drivers were more likely than non-drivers to be male, white, married, employed, and have more education, higher income, and fewer co-morbidities. Over 50% of drivers reported at least “some” difficulty performing tasks involving glare, whereas 22% reported at least “some” difficulty with tasks requiring peripheral vision. At 54 months, drivers with moderate/severe bilateral visual field loss (VFL) reported greater difficulty with night driving and tasks involving visual search and visual processing speed than drivers with less bilateral VFL (all p-values <0.05). While those who remained drivers over follow-up had better MD in both eyes than those who became non-drivers due to eyesight, a number of drivers had marked VFL.
Conclusion
Inquiring about specific difficulties with tasks related to glare, visual processing speed, visual search and peripheral vision in driving, especially among patients with substantial bilateral VF damage, will enable physicians to more effectively counsel patients regarding driving.
doi:10.1167/iovs.08-2575
PMCID: PMC3395081  PMID: 19060263
16.  Outcomes for Adult Living Donor Liver Transplantation: Comparison of A2ALL and National Experience 
Objective
To determine if A2ALL (Adult-to-Adult Living Donor Liver Transplantation Cohort Study) findings are reflected in the national experience, and further define risk factors for patient mortality and graft loss in living donor liver transplantation (LDLT).
Background
A2ALL previously identified risk factors for mortality after LDLT, including early center experience, older recipient age and duration of cold ischemia.
Methods
LDLTs at the 9 A2ALL centers (n=702) and 67 non-A2ALL centers (n=1664) from 1/1/98 to 12/31/07 in the Scientific Registry of Transplant Recipients database were analyzed. Potential predictors of time to death or graft failure were tested using multivariable Cox regression, starting at time of transplant.
Results
There was no significant difference in overall mortality between A2ALL and non-A2ALL centers. Higher mortality hazard ratios (HR) were associated with donor age (HR=1.13/10 years, P<0.001), recipient age (HR=1.20/10 years, P<0.001), serum creatinine (HR=1.52 (log scale), P<0.001), diagnosis of HCC (HR=2.12, P<0.001) or HCV (HR=1.18, P=0.03), ICU (HR=2.52, P<0.001) or hospitalized (HR=1.62, P<0.001) vs home, and earlier center experience (LDLT case number ≤15, HR=1.61, P<0.001; HR=2.24, P<0.001 among A2ALL centers, HR=1.45, P=0.005 among non-A2ALL centers). Cold ischemia time (CIT) >4.5 hours was also associated with higher mortality (HR=1.79, P<0.001). Other than for center experience, comparisons of risk factor effects between A2ALL and non-A2ALL centers were not significant. Increased risk of graft failure in early experience was comparable in both groups.
Conclusions
Mortality risk factors were similar at A2ALL and non-A2ALL centers. Variables associated with graft loss were identified and showed similar trends, with some minor differences in degree of significance. These analyses demonstrate that the findings from the A2ALL consortium are relevant to other centers in the U.S. performing LDLT, and conclusions and recommendations from A2ALL may help guide clinical decision making.
doi:10.1002/lt.22288
PMCID: PMC3116058  PMID: 21360649
Liver transplantation; Living donor; Post-operative outcomes; Risk factors
17.  Laboratory test results after living liver donation in the Adult to Adult Living Donor Liver Transplantation Cohort Study (A2ALL) 
Liver Transplantation  2011;17(4):409-417.
Introduction
Information on long-term health among living liver donors is incomplete. Because changes in standard laboratory tests may reflect underlying health among donors, pre- and post-donation results were examined in the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL).
Methods
A2ALL followed 487 living liver donors who donated at nine U.S. transplant centers between 1998 and 2009. Aminotransferase and alkaline phosphatase activities (AST, ALT, AP), bilirubin, INR, albumin, white blood cell count (WBC), hemoglobin, platelet count, ferritin, serum creatinine and BUN were measured at evaluation and post-donation: 1 week, 1 month, 3 months, 1 year and yearly thereafter. Repeated measures models were used to estimate median lab values at each time point and test for differences between values at evaluation (baseline) and post-donation time points.
Results
Platelet counts were significantly decreased at every time point compared to baseline, and at three years were 19% lower. Approximately 10% of donors had a platelet count ≤150 (×1000/mm3) at 2–3 years post-donation. Donors with a platelet count ≤150 (×1000/mm3) at one year already had significantly lower mean platelet counts at evaluation (189±32) than the remainder of the cohort (267±56, p<0.0001). Statistically significant differences from evaluation were noted for AST, AP, INR, and albumin through the first year, although most measurements were in the normal range. Median values for WBC, hemoglobin, ferritin, albumin, serum creatinine, BUN, and INR were not substantially outside the normal range at any time point.
Conclusions
After three months, most laboratory values return to normal among right hepatic lobe liver donors, with slower return to baseline levels for AST, AP, INR and albumin. Persistently decreased platelet counts warrant further investigation.
doi:10.1002/lt.22246
PMCID: PMC3295864  PMID: 21445924
living donor liver transplantation; complications; liver transplantation; liver function tests; hepatectomy
18.  Visual Field Progression in the Collaborative Initial Glaucoma Treatment Study: The Impact of Treatment and other Baseline Factors 
Ophthalmology  2008;116(2):200-207.
Purpose
To evaluate factors associated with visual field (VF) progression, using all available follow-up through nine years after treatment initiation, in the Collaborative Initial Glaucoma Treatment Study (CIGTS).
Design
Longitudinal follow-up of participants enrolled in a randomized clinical trial.
Participants
607 newly diagnosed glaucoma patients.
Methods
In a randomized clinical trial, 607 subjects with newly diagnosed open-angle glaucoma were initially treated with either medication or trabeculectomy. After treatment initiation and early follow-up, subjects were evaluated clinically at 6-month intervals. Study participants in both arms of the CIGTS were treated aggressively in an effort to reduce intraocular pressure (IOP) to a level at or below a predetermined, eye-specific target pressure. VF progression was analyzed using repeated measures models.
Main outcome measures
VF progression, measured by Humphrey 24-2 full threshold testing and assessed by the change in the mean deviation (MD), and an indicator of substantial worsening of the VF (MD decrease of ≥3 dB from baseline), assessed at each follow-up visit.
Results
Follow-up indicates minimal change from baseline in each initial treatment group’s average MD. However, at the eight-year follow-up examination, substantial worsening (≥3 dB) of MD from baseline was found in 21.3% and 25.5% of the initial surgery and initial medicine groups, respectively. The effect of initial treatment on subsequent VF loss was modified by time (P<0.0001), baseline MD (P=0.03), and diabetes (P=0.01). Initial surgery led to less VF progression than initial medicine in subjects with advanced VF loss at baseline, whereas subjects with diabetes had more VF loss over time if treated initially with surgery.
Conclusions
The CIGTS intervention protocol led to a lowering of IOP that persisted over time in both treatment groups. VF progression was seen in a subset of subjects that grew over follow-up to more than 20%. Our findings that initial surgery was beneficial for subjects who presented at diagnosis with more advanced VF loss, but detrimental for patients with diabetes, are noteworthy and warrant independent confirmation.
doi:10.1016/j.ophtha.2008.08.051
PMCID: PMC3316491  PMID: 19019444
19.  IMPROVEMENT IN SURVIVAL ASSOCIATED WITH ADULT-TO-ADULT LIVING DONOR LIVER TRANSPLANTATION 
Gastroenterology  2007;133(6):1806-1813.
Background and Aims
More than 2000 adult-to-adult living donor liver transplants (LDLT) have been performed in the U.S., yet the potential benefit to liver transplant candidates of undergoing LDLT compared to waiting for deceased donor liver transplant (DDLT) is unknown. The aim of this study was to determine if there is a survival benefit of adult LDLT.
Methods
Adults with chronic liver disease who had a potential living donor evaluated from 1/98 to 2/03 at nine university-based hospitals were analyzed. Starting at the time of a potential donor’s evaluation, we compared mortality after LDLT to mortality among those who remained on the waitlist or received DDLT. Median follow-up was 4.4 years. Comparisons were made by hazard ratios (HR) adjusted for LDLT candidate characteristics at the time of donor evaluation.
Results
Among 807 potential living donor recipients, 389 received LDLT, 249 received DDLT, 99 died without transplant, and 70 were awaiting transplant at last follow-up. Receipt of LDLT was associated with an adjusted mortality HR of 0.56 (95% confidence interval [CI] 0.42–0.74; P<0.001) relative to candidates who did not receive LDLT. As centers gained greater experience (> 20 LDLT), LDLT benefit was magnified, with a mortality HR of 0.35 (CI 0.23–0.53; P<0.001).
Conclusions
Adult LDLT was associated with lower mortality than the alternative of waiting for DDLT. This reduction in mortality was magnified as centers gained experience with living donor liver transplantation. This reduction in transplant candidate mortality must be balanced against the risks undertaken by the living donors themselves.
doi:10.1053/j.gastro.2007.09.004
PMCID: PMC3170913  PMID: 18054553
20.  Contrasting the Use of Two Vision-Specific Quality of Life Questionnaires in Subjects with Open-Angle Glaucoma 
Journal of glaucoma  2009;18(5):403-411.
Purpose
To compare two vision-specific functional status measures to each other and to clinical parameters in the Collaborative Initial Glaucoma Treatment Study (CIGTS).
Methods
CIGTS participants completed the Visual Activities Questionnaire (VAQ) and the National Eye Institute-Visual Function Questionnaire (NEI-VFQ) and underwent visual field (VF) and visual acuity (VA) testing. 426 subjects contributed VAQ and NEI-VFQ scores at 54 months. Pearson correlations were used to assess associations.
Results
The VAQ subscales (range 0–100) that assessed light-dark adaptation (mean=66.1), glare disability (66.4), and acuity/spatial vision (67.7) indicated vision-related functions that CIGTS participants found most difficult. On the NEI-VFQ, subjects reported high levels of visual functioning, with mean ≥90 (out of 100) on the total score and in 9 of 12 subscales. General vision (mean=82.6) received the lowest subscale score. Two subscales common to both questionnaires were highly correlated: VA (r=0.68) and peripheral vision (r=0.77) (both p<.0001). Correlations between participants’ perceptions and clinical measures of visual function were in the expected direction, but weaker. Stronger associations were found between clinical measures and the NEI-VFQ than the VAQ. Better eye VF and worse eye VA had the highest number of significant correlations with subjects’ perceptions of their visual function. Increasing VF loss was associated with a significant decrease in the overall and peripheral vision subscale scores from both questionnaires, as well as several other subscales.
Conclusions
These findings will help researchers interested in assessing patients’ perceptions of their visual function make an informed selection when choosing between the VAQ and the NEI-VFQ.
doi:10.1097/IJG.0b013e3181879e63
PMCID: PMC3060041  PMID: 19525733
Glaucoma; quality of life; visual field; visual acuity
21.  Case Report: The University of Michigan Dioxin Exposure Study: A Follow-up Investigation of a Case with High Serum Concentration of 2,3,4,7,8-Pentachlorodibenzofuran 
Environmental Health Perspectives  2010;118(9):1313-1317.
Context
Polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans, and dioxin-like polychlorinated biphenyls that have toxic equivalency factors (TEFs) were measured in serum of 946 subjects in five Michigan counties. The study was motivated by concerns about human exposure to dioxin-contaminated sediments in the Tittabawassee River (TR). Most of the toxic equivalency in TR sediments is from two furan congeners, 2,3,7,8-tetrachlorodibenzofuran and 2,3,4,7,8-pentachlorodibenzofuran (2,3,4,7,8-pentaCDF).
Case presentation
The individual with the highest adjusted (for age, age squared, and body mass index) serum level of 2,3,4,7,8-pentaCDF in the study (42.5 ppt) reported a unique history of raising cattle and vegetables in the floodplain of the TR. Interviews and serum samples were obtained from the index case and 15 other people who ate beef and vegetables raised by the index case. 2,3,4,7,8-pentaCDF in beef lipid was estimated to have been more than three orders of magnitude greater than background (1,780 vs. 1.1 ppt). The mean, median, and 95th percentile for serum 2,3,4,7,8-pentaCDF in the study control population were 6.0, 5.4, and 13.0 ppt, respectively, and were 9.9, 8.4, and 20.5 ppt among beef and vegetable consumers, respectively. Back extrapolation for the index case suggests that his increase in serum concentration of 2,3,4,7,8-pentaCDF above background may have been as high as 146 ppt.
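The back-extrapolation above can be illustrated with a single-compartment, first-order elimination model; the half-life and elapsed time below are assumptions chosen for illustration, so the output is not the study's 146 ppt estimate.

```python
# First-order back-extrapolation of a serum concentration; the half-life
# and elapsed-time values are illustrative assumptions, not study inputs.
measured_ppt = 42.5    # index case's serum 2,3,4,7,8-pentaCDF (from the abstract)
background_ppt = 5.4   # referent-population median (from the abstract)
years_elapsed = 13.0   # assumed time since peak exposure
half_life_y = 7.0      # assumed elimination half-life

excess_now = measured_ppt - background_ppt
excess_then = excess_now * 2 ** (years_elapsed / half_life_y)
print(f"back-extrapolated excess: {excess_then:.0f} ppt")  # ~134 for these inputs
```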
Discussion
Consumption of beef and/or vegetables raised on dioxin-contaminated soil may be an important completed pathway of exposure.
Relevance to public health practice
Animals and crops should not be raised for human consumption in areas contaminated with dioxins.
doi:10.1289/ehp.0901723
PMCID: PMC2944095  PMID: 20813655
dioxins; food; furans; pathway of exposure; polychlorinated biphenyls
22.  Factors Associated with Intraocular Pressure Prior to and during Nine Years of Treatment in the Collaborative Initial Glaucoma Treatment Study 
Ophthalmology  2007;115(6):927-933.
Purpose
To evaluate the role of demographic and clinical factors in intraocular pressure (IOP), both at initial glaucoma diagnosis and during treatment.
Design
Cohort study of patients enrolled in a randomized clinical trial.
Participants
607 patients with newly diagnosed, open-angle glaucoma (OAG) were enrolled at 14 U.S. centers.
Methods
After randomization to initial surgery or medications, patients were followed at six-month intervals. IOP was measured by Goldmann applanation tonometry. Predictive factors for IOP at baseline and during follow-up were analyzed using linear mixed models.
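A minimal sketch of a linear mixed model for repeated IOP measurements, with a random intercept per patient (synthetic data; the CIGTS models include additional fixed effects and clinical-center terms):

```python
# Minimal sketch: random-intercept mixed model for repeated IOP readings.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, visits = 200, 6
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), visits),
    "year": np.tile(np.arange(visits), n),
    "surgery": np.repeat(rng.integers(0, 2, n), visits),
})
subject_effect = np.repeat(rng.normal(0, 2, n), visits)  # per-patient intercept
df["iop"] = 17.2 - 2.2 * df.surgery + subject_effect + rng.normal(0, 2, len(df))

m = smf.mixedlm("iop ~ surgery + year", data=df, groups=df["id"]).fit()
print(m.params["surgery"])  # treatment-group difference in mean IOP (mmHg)
```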
Main Outcome Measure
IOP at baseline and during follow-up.
Results
The mean baseline IOP was 27.5 mmHg (standard deviation, 5.6 mmHg). Predictive factors for higher baseline IOP included younger age (0.7 mmHg per 10 years), male sex (2.4 mmHg higher than females), pseudoexfoliative glaucoma (5.4 mmHg higher than primary OAG), and pupillary defect (2.2 mmHg higher than those without a defect). During nine years of follow-up, both surgery and medications dramatically reduced IOP from baseline levels, but the extent of IOP reduction was consistently greater in the surgery group. Over follow-up years 2–9, mean IOP was 15.0 vs. 17.2 mmHg for surgery vs. medicine, respectively. Predictive associations with higher IOP during follow-up included higher baseline IOP (P<0.0001), worse baseline visual field (mean deviation; P<0.0001), and lower level of education (P=0.0019). Treatment effect was modified by smoking status: non-smokers treated surgically had lower IOP than smokers treated surgically (14.6 vs. 16.7 mmHg, respectively; P=0.0013). Clinical center effects were significant (P<0.0001) in both the baseline and follow-up models.
Conclusions
In this large cohort of newly diagnosed glaucoma patients, predictors of pre-treatment IOP and IOP measurements over nine years of follow-up were identified. Our findings lend credence to the postulate that sociodemographic, economic, compliance, or other environmental influences play a role in IOP control during treatment.
doi:10.1016/j.ophtha.2007.08.010
PMCID: PMC2758572  PMID: 17964655
23.  The University of Michigan Dioxin Exposure Study: Predictors of Human Serum Dioxin Concentrations in Midland and Saginaw, Michigan 
Environmental Health Perspectives  2008;117(5):818-824.
Background
We conducted a population-based human exposure study in response to concerns among the population of Midland and Saginaw counties, Michigan, that discharges by the Dow Chemical Company of dioxin-like compounds into the nearby river and air had led to an increase in residents’ body burdens of polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), and dioxin-like polychlorinated biphenyls (PCBs), here collectively referred to as “dioxins.”
Objectives
We sought to identify factors that explained variation in serum dioxin concentrations among the residents of Midland and Saginaw counties. Exposures to dioxins in soil, river sediments, household dust, historic emissions, and contaminated fish and game were of primary interest.
Methods
We studied 946 people in four populations in the contaminated area and in a referent population, by interview and by collection of serum, household dust, and residential soil. Linear regression was used to identify factors associated with serum dioxins.
Results
Demographic factors explained a large proportion of variation in serum dioxin concentrations. Historic exposures before 1980, including living in the Midland/Saginaw area, hunting and fishing in the contaminated areas, and working at Dow, contributed to serum dioxin levels. Exposures since 1980 in Midland and Saginaw counties contributed little to serum dioxins.
Conclusions
This study provides valuable insights into the relationships between serum dioxins and environmental factors, age, sex, body mass index, smoking, and breast-feeding. These factors together explain a substantial proportion of the variation in serum dioxin concentrations in the general population. Historic exposures to environmental contamination appeared to be of greater importance than recent exposures for dioxins.
doi:10.1289/ehp.11779
PMCID: PMC2685847  PMID: 19479027
epidemiology; exposure pathways; polychlorinated biphenyls; polychlorinated dioxins; polychlorinated furans; soil contamination
24.  The University of Michigan Dioxin Exposure Study: Population Survey Results and Serum Concentrations for Polychlorinated Dioxins, Furans, and Biphenyls 
Environmental Health Perspectives  2008;117(5):811-817.
Background
The University of Michigan Dioxin Exposure Study was undertaken to address concerns that the discharge of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) from the Dow Chemical Company in the Midland, Michigan, area had resulted in contamination of soils in the Tittabawassee River floodplain and the city of Midland, leading to an increase in residents’ body burdens of these compounds.
Objective
In this article we present descriptive statistics from the resident survey and sampling of human serum, household dust, and soil and compare them with other published values.
Methods
From a multistage random sample of populations in four areas of Midland and Saginaw counties and from a distant referent population, we interviewed 946 adults, who also donated blood for analysis of PCDDs, PCDFs, and polychlorinated biphenyls (PCBs). Samples of household dust and house perimeter soil were collected from consenting subjects who owned their property.
Results
All five study populations were comparable in age, race, sex, and length of residence in their current home. Regional differences existed in employment history, personal contact with contaminated soils, and consumption of fish and game from contaminated areas. Median soil concentrations were significantly increased around homes in the Tittabawassee River floodplain (11.4 ppt) and within the city of Midland (58.2 ppt) compared with the referent population (3.6 ppt). Median serum toxic equivalencies were significantly increased in people who lived in the floodplain (23.2 ppt) compared with the referent population (18.5 ppt).
Conclusions
Differences in serum dioxin concentrations among the populations were small but statistically significant. Regression modeling is needed to identify whether the serum concentrations of PCDDs, PCDFs, and PCBs are associated with contaminated soils, household dust, and other factors.
doi:10.1289/ehp.11780
PMCID: PMC2685846  PMID: 19479026
biomonitoring; dioxins; dust; environmental exposure; furans; polychlorinated biphenyls; serum; soil; survey
25.  The University of Michigan Dioxin Exposure Study: Methods for an Environmental Exposure Study of Polychlorinated Dioxins, Furans, and Biphenyls 
Environmental Health Perspectives  2008;117(5):803-810.
Background
The University of Michigan Dioxin Exposure Study (UMDES) was undertaken in response to concerns that the discharge of dioxin-like compounds from the Dow Chemical Company facilities in Midland, Michigan, resulted in contamination of soils in the Tittabawassee River floodplain and areas of the city of Midland, leading to an increase in residents’ body burdens of polychlorinated dibenzodioxins and polychlorinated dibenzofurans.
Objectives
The UMDES is a hypothesis-driven study designed to answer important questions about human exposure to dioxins in the environment of Midland, where the Dow Chemical Company has operated for > 100 years, and in neighboring Saginaw, Michigan. In addition, the UMDES includes a referent population from an area of Michigan in which there are no unusual sources of dioxin exposure and from which inferences regarding the general Michigan population can be derived. A central goal of the study is to determine which factors explain variation in serum dioxin levels and to quantify how much variation each factor explains.
Conclusions
In this article we describe the study design and methods for a large population-based study of dioxin contamination and its relationship to blood dioxin levels. The study collected questionnaire data and blood, dust, and soil samples from 731 people. This study provides a foundation for understanding the exposure pathways by which dioxins in soils, sediments, fish and game, and homegrown produce lead to increased body burdens of these compounds.
doi:10.1289/ehp.11777
PMCID: PMC2685845  PMID: 19479025
biomonitoring; diet; dioxins; environmental exposure; epidemiology; population-based; serum; soil; survey
