Results 1-18 (18)
1.  The subgingival microbiome of clinically healthy current and never smokers 
The ISME Journal  2014;9(1):268-272.
Dysbiotic oral bacterial communities have a critical role in the etiology and progression of periodontal diseases. The goal of this study was to investigate the extent to which smoking increases risk for disease by influencing the composition of the subgingival microbiome in states of clinical health. Subgingival plaque samples were collected from 200 systemically and periodontally healthy smokers and nonsmokers. 16S pyrotag sequencing was performed, generating 1 623 713 classifiable sequences, which were compared with a curated version of the Greengenes database using the Quantitative Insights Into Microbial Ecology (QIIME) pipeline. The subgingival microbial profiles of smokers and never-smokers were different at all taxonomic levels, and principal coordinate analysis revealed distinct clustering of the microbial communities based on smoking status. Smokers demonstrated a highly diverse, pathogen-rich, commensal-poor, anaerobic microbiome that is more closely aligned with a disease-associated community in clinically healthy individuals, suggesting that smoking creates an at-risk-for-harm environment that is primed for a future ecological catastrophe.
doi:10.1038/ismej.2014.114
PMCID: PMC4274424  PMID: 25012901
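For readers unfamiliar with the principal coordinate analysis (PCoA) used above to visualize clustering by smoking status, the sketch below computes Bray-Curtis distances from an OTU count table and performs classical PCoA. It is a minimal illustration on toy data with hypothetical variable names (otu_counts, is_smoker), not the study's QIIME workflow.

```python
# Minimal PCoA sketch: Bray-Curtis dissimilarity + classical scaling.
# Toy data and hypothetical names; not the study's QIIME pipeline.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
otu_counts = rng.poisson(5, size=(40, 200)).astype(float)   # samples x OTUs (toy)
is_smoker = np.array([True] * 20 + [False] * 20)

# Relative abundances, then Bray-Curtis dissimilarity between samples
rel = otu_counts / otu_counts.sum(axis=1, keepdims=True)
d = squareform(pdist(rel, metric="braycurtis"))

# Classical PCoA: double-center the squared distance matrix, then eigendecompose
n = d.shape[0]
j = np.eye(n) - np.ones((n, n)) / n
b = -0.5 * j @ (d ** 2) @ j
evals, evecs = np.linalg.eigh(b)
order = np.argsort(evals)[::-1]
coords = evecs[:, order[:2]] * np.sqrt(np.clip(evals[order[:2]], 0, None))

# Compare group centroids along the first two principal coordinates
print("smokers:", coords[is_smoker].mean(axis=0))
print("never-smokers:", coords[~is_smoker].mean(axis=0))
```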
2.  Comparison of a Mindful Eating Intervention to a Diabetes Self-Management Intervention Among Adults With Type 2 Diabetes: A Randomized Controlled Trial 
Mindful eating may be an effective intervention for increasing awareness of hunger and satiety cues, improving eating regulation and dietary patterns, reducing symptoms of depression and anxiety, and promoting weight loss. Diabetes self-management education (DSME), which addresses knowledge, self-efficacy, and outcome expectations for improving food choices, also may be an effective intervention for diabetes self-care. Yet few studies have compared the impact of mindful eating to a DSME-based treatment approach on patient outcomes. Adults 35 to 65 years old who had had type 2 diabetes for ≥1 year and did not require insulin therapy were recruited from the community and randomly assigned to a treatment group. The impact of a group-based 3-month mindful eating intervention (MB-EAT-D; n = 27) was compared with that of a group-based 3-month DSME “Smart Choices” (SC) intervention (n = 25) at the end of the intervention and at 3-month follow-up. Repeated-measures ANOVA with contrast analysis compared change in outcomes across time. There was no significant difference between groups in weight change. Significant improvement in depressive symptoms, outcome expectations, nutrition and eating-related self-efficacy, and cognitive control and disinhibition of control regarding eating behaviors occurred for both groups (all p < .0125) at 3-month follow-up. The SC group had a greater increase in nutrition knowledge and self-efficacy than the MB-EAT-D group (all p < .05) at 3-month follow-up. MB-EAT-D had a significant increase in mindfulness, whereas the SC group had a significant increase in fruit and vegetable consumption at study end (all p < .0125). Both SC and MB-EAT-D were effective treatments for diabetes self-management. The availability of mindful eating and DSME-based approaches offers patients greater choices in meeting their self-care needs.
doi:10.1177/1090198113493092
PMCID: PMC4217158  PMID: 23855018
meditation; patient education; randomized controlled trial; type 2 diabetes mellitus
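As a rough illustration of the repeated-measures comparison described above, the sketch below fits a mixed-effects model with a group-by-time interaction, one common way to test for differential change between two intervention arms. The data frame, column names, and simulated values are hypothetical, and a mixed model is a stand-in for, not a reproduction of, the paper's ANOVA with contrast analysis.

```python
# Illustrative group-by-time analysis on simulated data; a simplified stand-in
# for repeated-measures ANOVA with contrasts (all names/values hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
records = []
for group in ("MB-EAT-D", "SC"):
    for subj in range(26):
        base = rng.normal(95, 10)                    # baseline weight, kg (toy)
        for t, time in enumerate(("baseline", "3mo", "6mo")):
            records.append({"id": f"{group}-{subj}", "group": group, "time": time,
                            "weight": base - 0.8 * t + rng.normal(0, 1)})
df = pd.DataFrame(records)

# Random intercept per participant; fixed effects for group, time, and interaction
model = smf.mixedlm("weight ~ group * time", data=df, groups=df["id"]).fit()
print(model.summary())
```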
4.  Impact of post-kidney transplant parathyroidectomy on allograft function 
Clinical transplantation  2013;27(3):397-402.
Background
The impact of parathyroidectomy on allograft function in kidney transplant patients is unclear.
Methods
We conducted a retrospective, observational study of all kidney transplant recipients from 1988 to 2008 who underwent parathyroidectomy for uncontrolled hyperparathyroidism (n = 32). Post-parathyroidectomy, changes in estimated glomerular filtration rate (eGFR) and graft loss were recorded. Cross-sectional associations at baseline between eGFR and serum calcium, phosphate, and parathyroid hormone (PTH), and associations between their changes within subjects during the first two months post-parathyroidectomy were assessed.
Results
Post-parathyroidectomy, the mean eGFR declined from 51.19 mL/min/1.73 m2 at parathyroidectomy to 44.78 mL/min/1.73 m2 at two months (p < 0.0001). Subsequently, graft function improved, and by 12 months, mean eGFR recovered to 49.76 mL/min/1.73 m2 (p = 0.035). The decrease in serum PTH was accompanied by a decrease in eGFR (p = 0.0127) in the first two months post-parathyroidectomy. Patients whose eGFR declined by ≥20% (group 1) in the first two months post-parathyroidectomy were distinguished from patients whose eGFR declined by <20% (group 2). The two groups were similar except that group 1 had a higher baseline mean serum PTH than group 2, although the difference was not statistically significant (1046.7 ± 1034.2 vs. 476.6 ± 444.9, p = 0.14). In group 1, eGFR declined at an average rate of 32% (p < 0.0001) during the first month post-parathyroidectomy compared with 7% (p = 0.1399) in group 2, and the difference between the two groups was significant (p = 0.0003). Graft function recovered in both groups by one year. During a median follow-up of 66.00 ± 49.45 months, 6 (18%) patients lost their graft, with a mean time to graft loss from parathyroidectomy of 37.2 ± 21.6 months. The causes of graft loss were rejection (n = 2), pyelonephritis (n = 1), and chronic allograft nephropathy (n = 3). No graft loss occurred during the first year post-surgery.
Conclusion
Parathyroidectomy may lead to transient kidney allograft dysfunction with eventual recovery of graft function by 12 months post-parathyroidectomy. A higher pre-parathyroidectomy serum PTH level is associated with a more profound decrease in eGFR post-parathyroidectomy.
doi:10.1111/ctr.12099
PMCID: PMC3932484  PMID: 23448282
graft dysfunction; hyperparathyroidism; kidney transplant; outcomes; parathyroidectomy
5.  Metabolic Syndrome and Insulin Resistance in Division 1 Collegiate Football Players 
Medicine and science in sports and exercise  2009;41(12):10.1249/MSS.0b013e3181abdfec.
BORCHERS, J. R., K. L. CLEM, D. L. HABASH, H. N. NAGARAJA, L. M. STOKLEY, and T. M. BEST. Metabolic Syndrome and Insulin Resistance in Division 1 Collegiate Football Players. Med. Sci. Sports Exerc., Vol. 41, No. 12, pp. 2105–2110, 2009.
Purpose
To estimate the prevalence of metabolic syndrome and insulin resistance in a cohort of Division 1 collegiate football players.
Methods
Ninety football players were evaluated in a cross-sectional study to estimate the prevalence of metabolic syndrome, insulin resistance, and associated risk factors. Obesity was defined as a body fat ≥25% determined by BOD POD measurements. The National Cholesterol Education Program Adult Treatment Panel III criteria were used to estimate prevalence of metabolic syndrome. Quantitative insulin sensitivity check index calculations were performed to estimate prevalence of insulin resistance. Linear regression techniques were used to determine association between body fat percentage and other measured continuous parameters. Fisher exact test was used to determine association between nominal variables, and one-way ANOVA compared the three groups defined by position.
Results
Summary measures showed a small prevalence of abnormal individual measurements. There was an association between body fat percentage and most evaluated parameters (P < 0.05). The prevalence of obesity, insulin resistance, and metabolic syndrome was 21%, 21%, and 9%, respectively. Obesity was closely associated with metabolic syndrome (P < 0.0001) and insulin resistance (P < 0.0001) in this population. All subjects with metabolic syndrome were obese, and the odds of insulin resistance in the obese group were 10.6 times the odds in the nonobese group. Linemen (n = 29) accounted for all 19 obese subjects, 13 of the 19 subjects with insulin resistance, and all subjects with metabolic syndrome.
Conclusions
There is a strong association between obesity and both metabolic syndrome and insulin resistance in Division 1 collegiate football players. Linemen are at significantly greater risk for metabolic syndrome and insulin resistance than players at other positions. This may be predictive of future health problems in Division 1 collegiate football players, especially linemen.
doi:10.1249/MSS.0b013e3181abdfec
PMCID: PMC3872996  PMID: 19915510
Obesity; Cardiovascular Risk Factors; Exercise; Diet
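The quantitative insulin sensitivity check index (QUICKI) named in the methods above is computed from fasting insulin and glucose as 1 / (log10 insulin + log10 glucose); a minimal calculation is sketched below. The cutoff used to flag insulin resistance is an illustrative assumption, not necessarily the threshold applied in this study.

```python
# QUICKI = 1 / (log10(fasting insulin, uU/mL) + log10(fasting glucose, mg/dL))
# The insulin-resistance cutoff below is assumed for illustration only.
import math

def quicki(fasting_insulin_uU_per_mL: float, fasting_glucose_mg_per_dL: float) -> float:
    return 1.0 / (math.log10(fasting_insulin_uU_per_mL)
                  + math.log10(fasting_glucose_mg_per_dL))

ILLUSTRATIVE_CUTOFF = 0.339  # assumed; lower QUICKI indicates lower insulin sensitivity

value = quicki(fasting_insulin_uU_per_mL=15.0, fasting_glucose_mg_per_dL=92.0)
print(f"QUICKI = {value:.3f}; flagged insulin resistant: {value <= ILLUSTRATIVE_CUTOFF}")
```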
6.  Comparative Effectiveness of a Mindful Eating Intervention to a Diabetes Self-Management Intervention among Adults with Type 2 Diabetes: A Pilot Study 
Mindful eating offers promise as an effective approach for weight management and glycemic control in people with diabetes. Diabetes self-management education (DSME) is an essential component of effective self-care. Yet, little research has compared the effect of mindful eating to DSME-based treatment. This study compared the impact of these two interventions in adults with type 2 diabetes mellitus (T2DM). A prospective randomized controlled trial with two parallel interventions was employed. Participants included adults aged 35–65 with T2DM for ≥ 1 year, body mass index (BMI) ≥ 27.0, and A1c ≥ 7.0% who were randomly assigned to a 3-month mindful eating (MB-EAT-D; n=27) or Smart Choices (SC) DSME-based (n=25) intervention. Follow-up occurred 3 months after intervention completion. Dietary intake, physical activity, weight, glycemia, and fasting insulin were analyzed using repeated-measures ANOVA with contrast analysis. There was no significant difference between groups in the change in weight or glycemia at study end. Significant differences occurred between groups in the change in dietary intake/1000 kcal of trans fats, total fiber, and sugars (all P<0.05). Mean (±SE) reductions in weight (−2.92 ± 0.54 kg for SC vs. −1.53 ± 0.54 kg for MB-EAT-D) and A1c (−0.67 ± 0.24% for SC and −0.83 ± 0.24% for MB-EAT-D) were significant (P<0.01). Significant reductions in energy intake and glycemic load occurred (all P<0.0001) for both groups. Training in mindful eating and diabetes self-management facilitated improvement in dietary intake, modest weight loss, and glycemic control. The availability of effective treatments allows diabetes patients choices in meeting their self-care needs.
doi:10.1016/j.jand.2012.07.036
PMCID: PMC3485681  PMID: 23102183
type 2 diabetes mellitus; meditation; patient education; randomized controlled trial
7.  Deep Sequencing Identifies Ethnicity-Specific Bacterial Signatures in the Oral Microbiome 
PLoS ONE  2013;8(10):e77287.
Oral infections have a strong ethnic predilection, suggesting that ethnicity is a critical determinant of oral microbial colonization. Dental plaque and saliva samples from 192 subjects belonging to four major ethnicities in the United States were analyzed using terminal restriction fragment length polymorphism (t-RFLP) and 16S pyrosequencing. Ethnicity-specific clustering of microbial communities was apparent in saliva and subgingival biofilms, and a machine-learning classifier was capable of identifying an individual’s ethnicity from subgingival microbial signatures. The classifier identified African Americans with 100% sensitivity and 74% specificity and Caucasians with 50% sensitivity and 91% specificity. The data demonstrate a significant association between ethnic affiliation and the composition of the oral microbiome, to the extent that these microbial signatures appear capable of discriminating between ethnicities.
doi:10.1371/journal.pone.0077287
PMCID: PMC3806732  PMID: 24194878
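As a sketch of how per-group sensitivity and specificity can be derived from a machine-learning classifier trained on microbial abundance profiles, the code below uses cross-validated predictions from a random forest on toy data. The classifier type, features, and labels are assumptions for illustration; the abstract does not specify the model used.

```python
# Toy sketch: classify group labels from abundance profiles and report
# per-class sensitivity/specificity. Model and data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
X = rng.gamma(2.0, 1.0, size=(192, 300))                  # samples x taxa (toy)
y = rng.choice(["group_A", "group_B", "group_C", "group_D"], size=192)

pred = cross_val_predict(RandomForestClassifier(n_estimators=200, random_state=0),
                         X, y, cv=5)

for label in np.unique(y):
    tp = np.sum((pred == label) & (y == label))
    fn = np.sum((pred != label) & (y == label))
    tn = np.sum((pred != label) & (y != label))
    fp = np.sum((pred == label) & (y != label))
    print(f"{label}: sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")
```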
8.  The Role of Repeat Transesophageal Echocardiography in Patients without Atrial Thrombus Prior to Cardioversion or Ablation 
Background
Cardioversion (CV) and radiofrequency catheter ablation (RFA) are often used to restore sinus rhythm in patients with atrial fibrillation (AF). These procedures are associated with a risk for stroke. The use of transesophageal echocardiography (TEE) to guide the management of AF is a validated strategy for patients in whom CV is planned, as well as for patients before RFA. For patients in whom the initial procedure fails, repeat TEE is often performed before repeat CV or RFA. The aim of this study was to test the hypothesis that patients with initial negative results on TEE would be unlikely to have thrombi detected on subsequent TEE and thus may avoid repeat procedures.
Methods
A total of 2,999 patients with AF were identified via retrospective review who had undergone TEE before CV or RFA, and 418 of these individuals underwent repeat TEE. After excluding patients who underwent repeat TEE >365 days from the initial study (n = 135) and those with thrombi on initial TEE (n = 20), 263 patients who had undergone two or more examinations were identified and analyzed.
Results
Of 263 eligible patients, two (0.8%; 95% confidence interval, 0.21–2.7%) had thrombi on subsequent TEE.
Conclusions
Fewer than 1% of patients with AF with negative results on baseline TEE had thrombi detected on repeat TEE before subsequent CV or RFA. Thus, it may be possible to selectively screen patients to identify those at low risk for developing thrombi subsequent to negative results on initial TEE, especially if patients are in sinus rhythm. These results suggest the need for a prospective trial to definitively answer the question regarding repeat TEE in low-risk patients. (J Am Soc Echocardiogr 2012;25:1106-12.)
doi:10.1016/j.echo.2012.06.003
PMCID: PMC3742543  PMID: 22749434
Atrial fibrillation; Atrial flutter; Electrical cardioversion; Transesophageal echocardiography
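The 95% confidence interval reported above for 2 thrombi among 263 patients can be reproduced approximately with a Wilson score interval, as sketched below. The abstract does not state which interval method the authors used, so this is a plausible reconstruction rather than their exact calculation.

```python
# 95% CI for 2 events in 263 patients; the Wilson score method is an assumption.
from statsmodels.stats.proportion import proportion_confint

events, n = 2, 263
low, high = proportion_confint(events, n, alpha=0.05, method="wilson")
print(f"{events}/{n} = {events / n:.1%}; 95% CI {low:.2%} to {high:.2%}")
# Prints roughly 0.21% to 2.73%, in line with the reported 0.21-2.7%.
```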
9.  Goal difficulty and goal commitment affect adoption of a lower glycemic index diet in adults with type 2 diabetes 
Objective
Few studies have examined the effect of goal difficulty on behavioral change even though goal setting is widely used in diabetes education. The effect of a goal to consume either 6 or 8 servings/day of low glycemic index (LGI) foods was evaluated in this study.
Methods
Adults 40–65 years old with type 2 diabetes were randomly assigned to the 6 or 8 serving/day treatment group following a 5-week GI intervention. Perceived goal difficulty, commitment, satisfaction, and self-efficacy were evaluated, and four-day food records assessed dietary intake.
Results
Both groups increased consumption of LGI foods (P < 0.001); there were no significant differences in the change in consumption between groups. Participants who were more committed to the goal perceived the goal to be less difficult (P < 0.01). Those with greater efficacy beliefs were more committed to their goal, perceived the goal to be less difficult, and were more satisfied with their performance (all P < 0.05).
Conclusion
A specific goal regarding LGI foods can facilitate the adoption of a lower GI diet. Future research is needed to determine if goal commitment or goal difficulty mediate the process.
Practice implications
Clinicians should help clients set specific goals regarding dietary change.
doi:10.1016/j.pec.2011.03.009
PMCID: PMC3688050  PMID: 21497479
Type 2 diabetes mellitus; Patient education; Health behavior; Goals; Nutrition assessment
10.  A behavioural intervention incorporating specific glycaemic index goals improves dietary quality, weight control and glycaemic control in adults with type 2 diabetes 
Public health nutrition  2011;14(7):1303-1311.
Objective
A lower glycaemic index (GI) diet is associated with a reduction in glycosylated Hb (HbA1c) in people with diabetes. Yet, little research has been conducted to determine the effects of specific goals regarding consumption of low GI (LGI) foods on diabetes outcomes. The present study evaluated a behavioural intervention on dietary intake, weight status and HbA1c, which included a goal to consume either six or eight servings of LGI foods daily.
Design
A parallel two-group design was used. Following the 5-week intervention, participants were randomly assigned to the group of six (n 15) or eight (n 20) servings of LGI foods daily and followed up for 8 weeks. Dietary intake was assessed using the mean of 4 d food records.
Setting
A metropolitan community in the USA.
Subjects
Individuals aged 40–65 years with type 2 diabetes of ≥1 year and HbA1c ≥ 7·0 % were eligible.
Results
There was no significant difference between goal difficulty groups with regard to GI servings at the end of the study. However, mean consumption of LGI foods increased by 2·05 (se 0·47) and 1·65 (se 0·40) servings per 4184 kJ in the six (P< 0·001) and eight (P< 0·001) LGI serving groups, respectively. For all participants combined, there were significant decreases in mean HbA1c (−0·58 (se 0·21) %; P = 0·01), weight (−2·30 (se 0·78) kg; P = 0·01), BMI (−0·80 (se 0·29) kg/m2; P = 0·01) and waist circumference (−2·36 (se 0·81) cm; P = 0·01).
Conclusions
An intervention including a specific goal to consume six to eight servings of LGI foods daily can improve diabetes outcomes. Clinicians should help patients set specific targets for dietary change and identify ways of achieving those goals.
doi:10.1017/S1368980011000085
PMCID: PMC3671479  PMID: 21356150
Type 2 diabetes mellitus; Patient education; Nutritional assessment; Behavioural change
11.  The impact of glycemic control and glycemic variability on mortality in patients hospitalized with congestive heart failure 
Background
Diabetes and congestive heart failure (CHF) are common comorbidities in hospitalized patients but the relationship between glycemic control, glycemic variability, and mortality in patients with both conditions is unclear.
Methods
We used administrative data to retrospectively identify patients with a diagnosis of CHF who underwent frequent glucose assessments. Time-weighted mean glucose (TWMG) was compared to other measures of glycemic control and a time-weighted measure of glycemic variability, the glycemic lability index (GLI). The outcome was hospital mortality.
Results
A total of 748 patients were included in the final analysis. TWMG was lower than the unadjusted mean glucose (137 ± 44.7 mg/dL vs. 167 ± 54.9 mg/dL, p < 0.001), due in part to shorter sampling intervals at higher glucose levels. Hypoglycemia, defined as a glucose level <70 mg/dL, occurred during 6.3% of patient-days in survivors and 8.4% of patient-days among nonsurvivors (p = 0.05). TWMG was similar in nonsurvivors and survivors (128 ± 33.1 vs. 138 ± 45.1 mg/dL, p = 0.19). However, relatively few glucose readings were significantly elevated. Median GLI was higher in nonsurvivors than in survivors (18.1 vs. 6.82, p = 0.0003). Increasing GLI (OR 1.32, 95% CI 1.05-1.65) and hypoglycemia (OR 2.21, 95% CI 1.07-4.65) were independently associated with higher mortality in logistic regression analysis. Respiratory failure, but not the standard deviation of glucose, was also associated with mortality.
Conclusions
Future studies analyzing glycemic control should account for variable sampling intervals. In this analysis, GLI was associated with increased mortality, independent of hypoglycemia. Prospective studies are needed to evaluate these findings.
doi:10.1002/dmrr.1155
PMCID: PMC3058483  PMID: 21218512
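A sketch of the two glycemic measures used above is given below: a time-weighted mean glucose computed by trapezoidal weighting over the sampling times (so closely spaced readings at high glucose levels do not dominate), and a lability index computed as the sum of squared consecutive glucose changes divided by the hours elapsed between them, which is one published formulation of the glycemic lability index. The exact definitions used in this study may differ, so treat this as an illustration of the general approach.

```python
# Illustrative time-weighted mean glucose (TWMG) and glycemic lability index (GLI).
# Formulas follow one common formulation; the study's definitions may differ.
import numpy as np

hours = np.array([0.0, 4.0, 6.0, 12.0, 18.0, 30.0])             # sampling times (toy)
glucose = np.array([150.0, 240.0, 210.0, 130.0, 160.0, 120.0])  # mg/dL (toy)

# TWMG: trapezoidal area under the glucose-time curve divided by total time
dt = np.diff(hours)
twmg = np.sum((glucose[:-1] + glucose[1:]) / 2 * dt) / dt.sum()

# GLI: sum of squared consecutive changes (in mmol/L) per hour elapsed
MGDL_PER_MMOL = 18.0
dg_mmol = np.diff(glucose) / MGDL_PER_MMOL
gli = np.sum(dg_mmol ** 2 / dt)

print(f"unadjusted mean = {glucose.mean():.0f} mg/dL, TWMG = {twmg:.0f} mg/dL, GLI = {gli:.2f}")
```

On this toy series the unadjusted mean exceeds the TWMG because the high readings are sampled over shorter intervals, which mirrors the difference reported in the results above.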
12.  Evaluation of a modified interferon gamma release assay for the diagnosis of latent tuberculosis infection in adult and pediatric populations that enables delayed processing 
The objective of the study was to evaluate the specificity of a modified interferon gamma release assay procedure that allows storage of blood samples for up to 32 hours before processing. A total of 116 subjects were enrolled in the study. Two blood samples were collected from each volunteer; one specimen was processed within 8 hours and analyzed using the T-SPOT.TB test; the second specimen was stored overnight, processed 23 to 32 hours later after addition of the T-Cell Xtend™ reagent, and then analyzed using the T-SPOT.TB test. A total of 108 paired T-SPOT.TB and T-SPOT.TB plus T-Cell Xtend™ tests were analyzed on specimens from 97 adults and 11 children. The median age of the subjects was 28 years; 68.5% were female and 78.7% were White. The overall agreement between the two tests was 98.2% (106/108). Specificity was 99.1% (107/108) for the T-SPOT.TB test and 97.2% for T-SPOT.TB plus T-Cell Xtend™. Results of the two tests were comparable. Increasing the storage time of the collected blood specimen prior to processing provides flexibility for clinicians and laboratories. Additional studies are needed in larger and more diverse patient populations, including immunocompromised patients, children, and patients with active TB disease or latent tuberculosis infection.
doi:10.3109/00365548.2010.498021
PMCID: PMC3249229  PMID: 20608764
13.  Relationship between glycemic control and readmission rates in patients hospitalized with congestive heart failure during the implementation of hospital-wide initiatives 
Objective
To determine the relationship between inpatient glycemic control and hospital readmission in patients with congestive heart failure (CHF).
Methods
We used an electronic data collection tool to identify patients with a discharge diagnosis of CHF who underwent point-of-care glucose assessments. Time-weighted mean glucose (TWMG), HbA1c, and glycemic lability index (GLI) served as glycemic indicators, and readmission for CHF was determined at 30 and 30-90 days.
Results
A total of 748 patients were included in the analysis. After adjustment for significant covariates, increasing TWMG (OR 3.3, p=0.03) and HbA1c (OR 5.5, p=0.04) were independently associated with higher readmission for CHF at 30-90 days, but not at 30 days. Renal disease, African American race, and year of hospital admission were also significantly associated with readmission, but GLI was not. There was no difference in TWMG when analyzed according to race or renal status. There was a decrease in TWMG (p=0.004) and a trend toward a reduction in 30-90 day readmission rates (p=0.06) following hospital-wide interventions to improve glycemic control, but no difference in GLI or hypoglycemia.
Conclusions
Increasing glucose exposure, but not glycemic variability, was associated with a higher risk of 30-90 day CHF readmission. Prospective studies are needed to confirm or refute these results.
doi:10.4158/EP10093.OR
PMCID: PMC3142926  PMID: 20497933
diabetes; glucose; rehospitalization
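For readers less familiar with how the adjusted odds ratios above are obtained, the sketch below fits a logistic regression of a readmission indicator on glycemic measures plus a covariate and exponentiates the coefficients to get odds ratios. All data, variable names, and covariates are hypothetical; this is not the study's model specification.

```python
# Illustrative logistic regression for readmission; exponentiated coefficients
# are odds ratios. Data, column names, and covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 748
df = pd.DataFrame({
    "twmg": rng.normal(140, 35, n),          # time-weighted mean glucose, mg/dL (toy)
    "hba1c": rng.normal(7.5, 1.5, n),
    "renal_disease": rng.integers(0, 2, n),
})
linpred = 0.01 * (df.twmg - 140) + 0.3 * (df.hba1c - 7.5) + 0.5 * df.renal_disease - 1.5
df["readmit_30_90"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

model = smf.logit("readmit_30_90 ~ twmg + hba1c + renal_disease", data=df).fit(disp=False)
print(np.exp(model.params))   # adjusted odds ratios per unit increase in each predictor
```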
14.  A prospective study of protein excretion using short-interval timed urine collections in patients with lupus nephritis 
Kidney international  2009;76(12):1284-1288.
The 24-h urine protein-to-creatinine ratio is the gold standard for evaluating proteinuria in lupus nephritis; however, the urine collection is inconvenient for the patient. Random spot urine protein-to-creatinine ratios, although convenient, have poor agreement with the 24-h ratios in these patients. Here, we sought to define a timed collection interval providing accurate and precise data and patient convenience. Urine from 41 patients with biopsy-proven lupus nephritis at 2 medical centers was collected at 6-h intervals for 24 h. The protein-to-creatinine ratio of each short collection was then compared with that of a 24-h collection made by combining the 6-h samples. A first morning void and spot urine samples were collected before and after the 24-h collection, respectively. There was significant diurnal variation, with peak proteinuria at 6–12 h and a nadir at 18–24 h. Each 6-h collection showed excellent correlation and concordance with the 24-h protein-to-creatinine ratio, but the 12–24-h interval had the best agreement. In contrast to the random spot urines, the first morning void also had excellent correlation and concordance, but underestimated the 24-h protein-to-creatinine ratio. Our study shows that a 12-h overnight urine collection is the best surrogate, with excellent agreement with the 24-h protein-to-creatinine ratio, and it is convenient for patients. There was little variability between centers, an important feature for clinical trials.
doi:10.1038/ki.2009.344
PMCID: PMC3093656  PMID: 19759526
glomerulonephritis; lupus nephritis; nephritis; proteinuria; systemic lupus erythematosus
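Agreement between a short-interval protein-to-creatinine ratio and the 24-h ratio, as discussed above, is often summarized with a concordance statistic; the sketch below computes Lin's concordance correlation coefficient for paired measurements on toy data. The abstract does not name the exact agreement statistic used, so this illustrates the general approach rather than the authors' method.

```python
# Lin's concordance correlation coefficient for paired protein/creatinine ratios.
# Toy data; illustrates concordance in general, not the study's exact statistic.
import numpy as np

def concordance_ccc(x: np.ndarray, y: np.ndarray) -> float:
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

rng = np.random.default_rng(4)
pcr_24h = rng.lognormal(mean=0.5, sigma=0.8, size=41)       # 24-h ratio (toy)
pcr_12h = pcr_24h * rng.normal(1.0, 0.1, size=41)           # 12-h overnight surrogate (toy)
print(f"CCC = {concordance_ccc(pcr_12h, pcr_24h):.3f}")
```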
15.  Pathogenic missense MAPT mutations differentially modulate tau aggregation propensity at nucleation and extension steps 
Journal of neurochemistry  2008;107(4):1113-1123.
Mutations in the MAPT gene encoding tau protein lead to neurofibrillary lesion formation, neurodegeneration, and cognitive decline associated with frontotemporal lobar degeneration. While some pathogenic mutations affect MAPT introns, resulting in abnormal splicing patterns, the majority occur in the tau coding sequence leading to single amino acid changes in tau primary structure. Depending on their location within the polypeptide chain, tau missense mutations have been reported to augment aggregation propensity. To determine the mechanisms underlying mutation-associated changes in aggregation behavior, the fibrillization of recombinant pathogenic mutants R5L, G272V, P301L, V337M, and R406W prepared in a full-length four-repeat human tau background was examined in vitro as a function of time and submicromolar tau concentrations using electron microscopy assay methods. Kinetic constants for nucleation and extension phases of aggregation were then estimated by direct measurement and mathematical simulation. Results indicated that the mutants differ from each other and from wild-type tau in their aggregation propensity. G272V and P301L mutations increased the rates of both filament nucleation and extension reactions, whereas R5L and V337M increased only the nucleation phase. R406W did not differ from wild-type in any kinetic parameter. The results show that missense mutations can directly promote tau filament formation at different stages of the aggregation pathway.
doi:10.1111/j.1471-4159.2008.05692.x
PMCID: PMC2596975  PMID: 18803694
Frontotemporal lobar degeneration; tau; neurofibrillary tangle; aggregation reaction mechanism
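To make the nucleation-versus-extension distinction above concrete, the sketch below integrates a toy two-step aggregation model in which free monomer is consumed by a slow nucleation step and a faster filament-extension step. It is a didactic caricature with made-up rate constants, not the kinetic model fitted in the paper.

```python
# Toy nucleation-extension aggregation model (didactic; not the paper's model).
# m = free monomer, n = filament number, p = monomer incorporated into filaments.
import numpy as np
from scipy.integrate import solve_ivp

K_NUC = 1e-4      # nucleation rate constant (arbitrary units)
K_EXT = 5e-2      # extension rate constant (arbitrary units)
NUCLEUS_SIZE = 2  # monomers consumed per nucleation event (assumed)

def rates(t, y):
    m, n, p = y
    nucleation = K_NUC * m ** NUCLEUS_SIZE   # new filaments formed from monomer
    extension = K_EXT * m * n                # monomer addition at filament ends
    return [-NUCLEUS_SIZE * nucleation - extension,   # dm/dt
            nucleation,                               # dn/dt
            NUCLEUS_SIZE * nucleation + extension]    # dp/dt

sol = solve_ivp(rates, (0.0, 500.0), [1.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 500.0, 6)
print(np.round(sol.sol(t)[2], 3))   # polymerized monomer fraction at selected times
```

In a model of this form, raising the nucleation rate constant mainly shortens the lag before filaments appear, while raising the extension rate constant mainly speeds filament growth, which parallels the qualitative distinction drawn between the mutants above.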
16.  Inflammasome mRNA Expression in Human Monocytes during Early Septic Shock 
Rationale: Monocytes are central to the initiation of the inflammatory response in sepsis, with caspase-1 activation playing a key role. Monocyte deactivation during sepsis has been linked to poor outcomes.
Objectives: Given the importance of caspase-1 in the immune response, we investigated whether monocytes from patients early in septic shock demonstrate alterations in mRNAs for caspase-1–related molecules.
Methods: Patients with septic shock (n = 26; age >18 years), critically ill intensive care unit patients (n = 20), and healthy volunteers (n = 22) were enrolled in a prospective cohort study in a university intensive care unit. Demographic, biological, physiologic, and plasma cytokine measurements were obtained. Monocytes were assayed for ex vivo tumor necrosis factor-α production, and fresh monocyte mRNA was analyzed by quantitative reverse-transcription polymerase chain reaction for Toll-like receptors, NOD-LRR proteins, cytokines, and nuclear factor-κB–related genes.
Measurements and Main Results: Relative copy numbers for the inflammasome mRNAs for ASC, caspase-1, NALP1, and Pypaf-7 were significantly lower in patients with septic shock compared with critically ill control subjects. NALP1 mRNA levels were linked to survival in patients with sepsis (P = 0.0068) and correlated with SAPS II scores (r = −0.63).
Conclusions: These data suggest that monocyte deactivation occurs during the earliest stages of the systemic inflammatory response and that changes in inflammasome mRNA expression are part of this process.
doi:10.1164/rccm.200703-418OC
PMCID: PMC2361424  PMID: 18263805
inflammasome; monocytes; septic shock; messenger RNA; NALP1
17.  Biomarker Discovery for Lupus Nephritis Through Longitudinal Urine Proteomics 
Kidney international  2008;74(6):799-807.
Lupus nephritis is a frequent and serious complication of systemic lupus erythematosus (SLE). Treatment often requires the use of immunosuppression, and may be associated with severe side effects. The ability to predict relapse, relapse severity, and recovery could be used to more effectively implement therapy and reduce toxicity. We postulated that a proteomic analysis of the low-molecular weight urine proteome using serial urine samples obtained before, during, and after SLE nephritis flares would demonstrate potential biomarkers of SLE renal flare. This study was undertaken to test our hypothesis.
Urine from 25 flare cycles of 19 WHO Class III, IV, and V SLE nephritis patients was used. Urine samples included a baseline, and pre-flare, flare, and post-flare specimens. The urine samples were fractionated to remove proteins larger than 30 kDa and spotted onto weak cation exchanger (CM10) protein chips for analysis by surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS).
SELDI-TOF MS screening showed 176 protein ions between 2 and 20 kDa, of which 27 were found to be differentially expressed between specific flare intervals. On-chip peptide sequencing by integrated tandem mass spectrometry was used to positively identify selected differentially expressed protein ions. The identified proteins included the 20 and 25 amino acid isoforms of hepcidin, a fragment of α1-antitrypsin, and an albumin fragment. Hepcidin 20 increased 4 months pre-flare and returned to baseline at renal flare, whereas hepcidin 25 decreased at renal flare and returned to baseline 4 months post-flare.
Using SELDI-TOF urine protein profiling in lupus nephritis, several candidate biomarkers of renal flare were found. To verify these candidates as true biomarkers, further identification and validation are needed in an independent SLE cohort.
doi:10.1038/ki.2008.316
PMCID: PMC2614389  PMID: 18596723
lupus nephritis; biomarker; SELDI
18.  A polymorphism in the type one complement receptor (CR1) involves an additional cysteine within the C3b/C4b binding domain that inhibits ligand binding 
Molecular immunology  2007;44(14):3510-3516.
The type one complement receptor (CR1) contains a variable number of binding domains for C3b and C4b, formed through a nearly identical set of repeating units known as short consensus repeats (SCRs). Each SCR contains 4 cysteines that, by forming two disulfide bonds, impart a conformation critical for function. In this study, we identified a CR1 single nucleotide polymorphism (1597C>T) that results in an additional cysteine (483R>C) in SCR 8 of the N-terminal C3b/C4b binding domain and, sporadically, in corresponding SCRs of other repeated C3b/C4b binding domains. The normal carrier frequency for 483-C was 6.3% in 175 African Americans and 2.4% in 153 Caucasians. In expression constructs containing one C3b/C4b binding domain, the 483-C residue reduced binding to C3b, C3bi, and C4b by over 80% (each p < 0.0001) versus the wild-type construct. Full-length CR1 from 483-C carriers also exhibited reduced binding to C3b and C4b, although the effect was influenced by the total number of binding domains present. Race-matched comparisons between SLE patients (86 African Americans, 228 Caucasians) and the normal cohort showed that 483-C carrier status alone is not a risk factor for SLE or lupus nephritis. The physiological role of this polymorphism remains to be determined.
doi:10.1016/j.molimm.2007.03.007
PMCID: PMC1978224  PMID: 17467802
CR1; polymorphism; complement; SLE
