1.  Genetic Variants of ABCC10, a Novel Tenofovir Transporter, Are Associated With Kidney Tubular Dysfunction 
The Journal of Infectious Diseases  2011;204(1):145-153.
Background. Tenofovir (TFV) causes kidney tubular dysfunction (KTD) in some patients, but the mechanism is poorly understood. Genetic variants in TFV transporters are implicated; we explored whether ABCC10 transports TFV and whether ABCC10 single-nucleotide polymorphisms (SNPs) are associated with KTD.
Methods. TFV accumulation was assessed in parental and ABCC10-transfected HEK293 cells (HEK293-ABCC10), CD4+ cells and monocyte-derived macrophages (MDMs). Substrate specificity was confirmed by cepharanthine (ABCC10 inhibitor) and small interfering RNA (siRNA) studies. Fourteen SNPs in ABCC10 were genotyped in human immunodeficiency virus–positive patients with KTD (n = 19) or without KTD (controls; n = 96). SNP and haplotype analysis was performed using Haploview.
Results. TFV accumulation was significantly lower in HEK293-ABCC10 cell lines than in parental HEK293 cells (35% lower; P = .02); this was reversed by cepharanthine. siRNA knockdown of ABCC10 resulted in increased accumulation of TFV in CD4+ cells (18%; P = .04) and MDMs (25%; P = .04). Two ABCC10 SNPs (rs9349256: odds ratio [OR], 2.3; P = .02; rs2125739, OR, 2.0; P = .05) and their haplotype (OR, 2.1; P = .05) were significantly associated with KTD. rs9349256 was associated with urine phosphorus wasting (P = .02) and β2 microglobulinuria (P = .04).
Conclusions. TFV is a substrate for ABCC10, and genetic variability within the ABCC10 gene may influence TFV renal tubular transport and contribute to the development of KTD. These results need to be replicated in other cohorts.
doi:10.1093/infdis/jir215
PMCID: PMC3105036  PMID: 21628669
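The case-control association step in article 1 (an odds ratio plus a significance test per SNP) can be sketched in Python. The genotype counts below are hypothetical, invented purely for illustration; the raw 19-case/96-control tables are not given in the abstract, and the actual analysis used Haploview rather than hand-rolled code.

```python
import math

def odds_ratio(case_carriers, case_noncarriers, ctrl_carriers, ctrl_noncarriers):
    """Odds ratio from a 2x2 carrier-status by case-status table."""
    return (case_carriers * ctrl_noncarriers) / (case_noncarriers * ctrl_carriers)

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test on the same 2x2 table (a,b = cases; c,d = controls)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def prob(x):  # hypergeometric probability of x carriers among the cases
        return math.comb(col1, x) * math.comb(n - col1, row1 - x) / math.comb(n, row1)
    p_obs = prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # sum probabilities of all tables at least as extreme as the observed one
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: risk-allele carriers among 19 KTD cases and 96 controls
or_ = odds_ratio(10, 9, 31, 65)   # roughly 2.3, the range the abstract reports
p = fisher_exact_p(10, 9, 31, 65)
```

Fisher's exact test is used here because the case group is small (n = 19); Haploview reports chi-square-based statistics, so the p-values would not match exactly even on the real data.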
2.  Evaluation of the Effect of Cobicistat on the In Vitro Renal Transport and Cytotoxicity Potential of Tenofovir 
Antimicrobial Agents and Chemotherapy  2013;57(10):4982-4989.
A once-daily single-tablet antiretroviral regimen containing tenofovir (TFV) disoproxil fumarate, emtricitabine (FTC), elvitegravir (EVG), and cobicistat (COBI) is an approved combination for the treatment of patients infected with HIV. COBI and TFV have been reported to interact with distinct transporters in renal proximal tubules; while TFV is renally eliminated by a combination of glomerular filtration and tubular secretion via anion transporters OAT1, OAT3, and MRP4, COBI inhibits renal cation transporters, particularly MATE1, resulting in a measurable decrease in the tubular secretion of creatinine. To investigate the potential for a renal drug-drug interaction between TFV and COBI in vitro, the uptake of TFV in the presence and absence of COBI was determined in fresh human renal cortex tissue and in cells expressing the relevant renal transporters. At concentrations exceeding clinical protein-unbound plasma levels, COBI did not significantly inhibit the transport of TFV by the anion transporters OAT1, OAT3, and MRP4 (50% inhibitory concentrations [IC50s] of >15, 6.6, and 8.5 μM, respectively). Conversely, TFV had little or no effect on the cation transporters OCT2 and MATE1 (IC50 > 100 μM). Consistent with studies using individual transporters, no increase in the accumulation of TFV in freshly isolated human renal cortex tissue or renal proximal tubule cells (RPTECs) was observed in the presence of COBI. Finally, COBI alone or in combination with FTC and EVG did not affect the sensitivity to TFV of cultured primary RPTECs or cells coexpressing OAT1 and MRP4. These results illustrate that COBI and TFV interact primarily with distinct renal transporters and indicate a low potential for pharmacokinetic renal drug-drug interaction.
doi:10.1128/AAC.00712-13
PMCID: PMC3811427  PMID: 23896476
3.  Chronic Administration of Tenofovir to Rhesus Macaques from Infancy through Adulthood and Pregnancy: Summary of Pharmacokinetics and Biological and Virological Effects
The reverse transcriptase (RT) inhibitor tenofovir (TFV) is highly effective in the simian immunodeficiency virus (SIV) macaque model of human immunodeficiency virus infection. The current report describes extended safety and efficacy data on 32 animals that received prolonged (≥1- to 13-year) daily subcutaneous TFV regimens. The likelihood of renal toxicity (proximal renal tubular dysfunction [PRTD]) correlated with plasma drug concentrations, which depended on the dosage regimen and age-related changes in drug clearance. Below a threshold area under the concentration-time curve for TFV in plasma of ∼10 μg·h/ml, an exposure severalfold higher than that observed in humans treated orally with 300 mg TFV disoproxil fumarate (TDF), prolonged TFV administration was not associated with PRTD based on urinalysis, serum chemistry analyses, bone mineral density, and clinical observations. At low-dose maintenance regimens, plasma TFV concentrations and intracellular TFV diphosphate concentrations were similar to or slightly higher than those observed in TDF-treated humans. No new toxicities were identified. The available evidence does not suggest teratogenic effects of prolonged low-dose TFV treatment; by the age of 10 years, one macaque, on TFV treatment since birth, had produced three offspring that were healthy by all criteria up to the age of 5 years. Despite the presence of viral variants with a lysine-to-arginine substitution at codon 65 (K65R) of RT in all 28 SIV-infected animals, 6 animals suppressed viremia to undetectable levels for as long as 12 years of TFV monotherapy. In conclusion, these findings illustrate the safety and sustained benefits of prolonged TFV-containing regimens throughout development from infancy to adulthood, including pregnancy.
doi:10.1128/AAC.00350-08
PMCID: PMC2533487  PMID: 18573931
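The safety threshold in article 3 is stated as an area under the plasma concentration-time curve (AUC). As a sketch of how such an exposure is computed from sparse sampling, the linear trapezoidal rule can be applied to a concentration-time profile; the profile below is hypothetical, not data from the macaque study.

```python
def auc_trapezoid(times_h, conc_ug_per_ml):
    """Linear trapezoidal AUC (ug*h/ml) from a plasma concentration-time profile."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times_h, conc_ug_per_ml),
                                             zip(times_h[1:], conc_ug_per_ml[1:])))

# Hypothetical profile after a subcutaneous TFV dose (illustrative values only)
times = [0, 1, 2, 4, 8, 24]           # hours post-dose
conc  = [0.0, 2.0, 1.2, 0.6, 0.2, 0.02]  # ug/ml
auc = auc_trapezoid(times, conc)
# compare against the ~10 ug*h/ml threshold discussed in the abstract
below_threshold = auc < 10.0
```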
4.  No Treatment versus 24 or 60 Weeks of Antiretroviral Treatment during Primary HIV Infection: The Randomized Primo-SHM Trial 
PLoS Medicine  2012;9(3):e1001196.
In a three-arm randomized trial conducted among adult patients in HIV treatment centers in The Netherlands, Marlous Grijsen and colleagues examine the effects of temporary combination antiretroviral therapy during primary HIV infection.
Background
The objective of this study was to assess the benefit of temporary combination antiretroviral therapy (cART) during primary HIV infection (PHI).
Methods and Findings
Adult patients with laboratory evidence of PHI were recruited in 13 HIV treatment centers in the Netherlands and randomly assigned to receive no treatment or 24 or 60 wk of cART (allocation in a 1∶1∶1 ratio); if therapy was clinically indicated, participants were randomized over the two treatment arms (allocation in a 1∶1 ratio). Primary end points were (1) viral set point, defined as the plasma viral load 36 wk after randomization in the no treatment arm and 36 wk after treatment interruption in the treatment arms, and (2) the total time that patients were off therapy, defined as the time between randomization and start of cART in the no treatment arm, and the time between treatment interruption and restart of cART in the treatment arms. cART was (re)started in case of confirmed CD4 cell count <350 cells/mm3 or symptomatic HIV disease. In total, 173 participants were randomized. The modified intention-to-treat analysis comprised 168 patients: 115 were randomized over the three study arms, and 53 randomized over the two treatment arms. Of the 115 patients randomized over the three study arms, mean viral set point was 4.8 (standard deviation 0.6) log10 copies/ml in the no treatment arm, and 4.0 (1.0) and 4.3 (0.9) log10 copies/ml in the 24- and 60-wk treatment arms (between groups: p<0.001). The median total time off therapy in the no treatment arm was 0.7 (95% CI 0.0–1.8) y compared to 3.0 (1.9–4.2) and 1.8 (0.5–3.0) y in the 24- and 60-wk treatment arms (log rank test, p<0.001). In the adjusted Cox analysis, both 24 wk (hazard ratio 0.42 [95% CI 0.25–0.73]) and 60 wk of early treatment (hazard ratio 0.55 [0.32–0.95]) were associated with time to (re)start of cART.
Conclusions
In this trial, temporary cART during PHI was found to transiently lower the viral set point and defer the restart of cART during chronic HIV infection.
Trial registration
Current Controlled Trials ISRCTN59497461
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Every year, nearly three million people become infected with HIV, the virus that causes AIDS. The first stage of HIV infection—primary HIV infection—lasts a few weeks and often goes undetected, although most individuals develop a short, flu-like illness. During this stage of infection, the immune system begins to make antibodies to HIV. The second stage of HIV infection, which lasts many years, also has no major symptoms but, during this stage, HIV slowly destroys immune system cells, including CD4 cells, a type of lymphocyte. Eventually, the immune system is unable to fight off other infections and patients enter the third phase of HIV infection—symptomatic HIV infection. The final stage—AIDS—is characterized by the occurrence of one or more AIDS-defining conditions, which include severe but unusual infections and several types of cancer. Early in the AIDS epidemic, most HIV-positive people died within ten years of infection. Nowadays, although there is still no cure for HIV infection, HIV has become a chronic disease because of the availability of combination antiretroviral therapy (cART; cocktails of several powerful drugs). This means that many HIV-positive people have a near-normal life span.
Why Was This Study Done?
It is currently recommended that people start cART when their CD4 count falls below 350 CD4 cells per cubic millimeter (cells/mm3) of blood, when they develop severe constitutional symptoms such as fever lasting longer than a month, or when they develop an AIDS-defining condition. But could a short course of cART during primary HIV infection be clinically beneficial? Some, but not all, nonrandomized studies have shown that such treatment reduces the viral set point (the stabilized viral load that is reached after the immune system begins to make antibodies to HIV; the viral load is the amount of virus in the blood) and slows the decline of CD4 cell count in patients. In this randomized trial (the Primo-SHM trial), the researchers assess the clinical benefit of temporary cART initiated during primary HIV infection by measuring its effects on the viral set point and on when patients have to restart cART during chronic HIV infection. In a randomized controlled trial, patients are assigned by the play of chance to receive different treatments and then followed to compare the effects of these treatments.
What Did the Researchers Do and Find?
The researchers assigned 168 patients with primary HIV infection to receive no treatment, 24 weeks of cART, or 60 weeks of cART. They measured the viral set point (the viral load in the blood 36 weeks after randomization in the no treatment arm and 36 weeks after cART interruption in the treatment arms) and determined the time off therapy (the time between randomization and the start of cART in the no treatment arm, and the time between treatment interruption and restart of cART in the treatment arms) for each patient. cART was (re)started following two consecutive CD4 counts below 350 cells/mm3 or when symptomatic HIV disease developed. The average viral set point was lower in the patients who received early cART than in those who had no treatment. Moreover, on average, the patients in the no treatment arm started cART 0.7 years after randomization whereas those in the 24- and 60-week treatment arms restarted cART after 3.0 and 1.8 years, respectively. There was no statistically significant difference between the 24-week and 60-week treatment arms in time off therapy.
What Do These Findings Mean?
These findings suggest that temporary cART during primary HIV infection can transiently lower the viral set point and can delay the need to restart cART during chronic HIV infection. They also suggest that 24 weeks of cART during primary HIV infection is as effective as 60 weeks of treatment. These findings need to be confirmed in other settings, and follow-up studies are needed to evaluate the long-term benefits of early temporary cART, but given the short time between cART interruption and treatment restart, the researchers suggest that not interrupting early cART, but instead continuing it for life, should be considered. However, they add, because patients are often physically and emotionally distressed at this stage of HIV infection, adherence to cART during primary HIV infection may be suboptimal, and so patients with primary HIV infection should be advised to start cART only when they feel ready to start treatment.
Additional Information
Please access these web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001196.
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS, including information on the clinical progression of HIV infection
NAM/aidsmap provides information about HIV/AIDS, including a factsheet on primary infection
Information is available from Avert, an international AIDS charity, on many aspects of HIV/AIDS, including detailed information on HIV treatment and care and on the stages of HIV infection (in English and Spanish)
The World Health Organization's 2010 antiretroviral therapy guidelines provide recommendations on when to initiate cART
Information about Primo-SHM is available
Patient stories about living with HIV/AIDS are available through Avert and through the charity website Healthtalkonline
doi:10.1371/journal.pmed.1001196
PMCID: PMC3313945  PMID: 22479156
5.  Mechanism of Active Renal Tubular Efflux of Tenofovir 
Antimicrobial Agents and Chemotherapy  2006;50(10):3297-3304.
Tenofovir (TFV) undergoes renal elimination by a combination of glomerular filtration and active tubular secretion. While transporter-mediated uptake of TFV from the blood into proximal-tubule cells has been well characterized, comparatively little is known about the efflux system responsible for transporting TFV into the lumen during active tubular secretion. Therefore, members of the ATP-binding cassette family of efflux pumps expressed at the apical side of proximal-tubule cells were studied for the ability to transport TFV. Studies in multiple independent in vitro systems show TFV not to be a substrate for P glycoprotein (Pgp) or multidrug resistance protein type 2 (MRP2). In contrast to Pgp and MRP2, TFV was observed to be a substrate for MRP4. TFV accumulated to fivefold lower levels in MRP4-overexpressing cells, and its accumulation could be increased by an MRP inhibitor. Furthermore, MRP4-overexpressing cells were found to be 2.0- to 2.5-fold less susceptible to cytotoxicity caused by TFV. ATP-dependent uptake of TFV was observed in membrane vesicles containing MRP4 but not in vesicles lacking the transporter. On the basis of these and previous results, the molecular transport pathway for the active tubular secretion of TFV through renal proximal-tubule cells involves uptake from the blood mediated by human organic anion transporters 1 and 3 and efflux into urine by MRP4. A detailed understanding of the molecular mechanism of TFV active tubular secretion will facilitate the assessment of potential renal drug-drug interactions with coadministered agents.
doi:10.1128/AAC.00251-06
PMCID: PMC1610069  PMID: 17005808
6.  Early Predictors of Renal Dysfunction in Egyptian Patients with β-Thalassemia Major and Intermedia 
Background
Better survival of thalassemia patients allowed previously unrecognized renal complications to emerge.
Objectives
Assess prevalence and early predictors of renal dysfunction in young β-thalassemia major (β-TM) and intermedia (β-TI) patients.
Subjects
66 β-TM (group I), 26 β-TI (group II) Egyptian patients and 40 healthy controls.
Methods
Clinical assessment and laboratory data, including kidney and liver function tests, serum ferritin, serum bicarbonate, plasma osmolality, urinary total proteins, microalbuminuria (MAU), N-acetyl-β-D-glucosaminidase (NAG), retinol binding protein (RBP), α-1 microglobulin, urinary bicarbonate and osmolality, creatinine clearance (CrCl), and % fractional excretion of bicarbonate (% FE-HCO3).
Results
The most prevalent renal abnormality was proteinuria (71%), followed by increased urinary levels of RBP (69.4%), NAG (58.1%), α-1 microglobulin (54.8%), and microalbuminuria (29%), as well as decreased urinary osmolality (58.1%). CrCl was a better measure of renal function and was significantly lower in thalassemia patients. Tubular dysfunction was more marked in splenectomized β-TM patients, who showed greater elevation of NAG and α-1 microglobulin and lower urinary osmolality. NAG, RBP, and α-1 microglobulin correlated negatively with CrCl and positively with serum ferritin and urinary total protein. Z-score analysis for identifying patients with renal dysfunction showed the superiority of urinary total protein and RBP. Comparative statistics of the different frequencies revealed a significant difference between urinary total protein and both MAU and % FE-HCO3.
Conclusion
Asymptomatic renal dysfunction is prevalent in young β-TM and β-TI patients, necessitating regular screening. Urinary total protein and RBP may be cost-effective markers for early detection.
doi:10.4084/MJHID.2014.057
PMCID: PMC4165495  PMID: 25237470
7.  Effectiveness of Early Antiretroviral Therapy Initiation to Improve Survival among HIV-Infected Adults with Tuberculosis: A Retrospective Cohort Study 
PLoS Medicine  2011;8(5):e1001029.
Molly Franke, Megan Murray, and colleagues report that early cART reduces mortality among HIV-infected adults with tuberculosis and improves retention in care, regardless of CD4 count.
Background
Randomized clinical trials examining the optimal time to initiate combination antiretroviral therapy (cART) in HIV-infected adults with sputum smear-positive tuberculosis (TB) disease have demonstrated improved survival among those who initiate cART earlier during TB treatment. Since these trials incorporated rigorous diagnostic criteria, it is unclear whether these results are generalizable to the vast majority of HIV-infected patients with TB, for whom standard diagnostic tools are unavailable. We aimed to examine whether early cART initiation improved survival among HIV-infected adults who were diagnosed with TB in a clinical setting.
Methods and Findings
We retrospectively reviewed charts for 308 HIV-infected adults in Rwanda with a CD4 count ≤350 cells/µl and a TB diagnosis. We estimated the effect of cART on survival using marginal structural models and simulated 2-y survival curves for the cohort under different cART strategies: start cART 15, 30, 60, or 180 d after TB treatment or never start cART. We conducted secondary analyses with composite endpoints of (1) death, default, or loss to follow-up and (2) death, hospitalization, or serious opportunistic infection. Early cART initiation led to a survival benefit that was most marked for individuals with low CD4 counts. For individuals with CD4 counts of 50 or 100 cells/µl, cART initiation at day 15 yielded 2-y survival probabilities of 0.82 (95% confidence interval: [0.76, 0.89]) and 0.86 (95% confidence interval: [0.80, 0.92]), respectively. These were significantly higher than the probabilities computed under later start times. Results were similar for the endpoint of death, hospitalization, or serious opportunistic infection. cART initiation at day 15 versus later times was protective against death, default, or loss to follow-up, regardless of CD4 count. As with any observational study, the validity of these findings assumes that biases from residual confounding by unmeasured factors and from model misspecification are small.
Conclusions
Early cART reduced mortality among individuals with low CD4 counts and improved retention in care, regardless of CD4 count.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
HIV infection has exacerbated the global tuberculosis (TB) epidemic, especially in sub-Saharan Africa, where in some countries 70% of people with TB are also HIV positive, a condition commonly described as HIV/TB co-infection. The management of patients with HIV/TB co-infection is a major public health concern.
There is relatively little good evidence on the best time to initiate combination antiretroviral therapy (cART) in adults with HIV/TB co-infection. Clinicians sometimes defer cART in individuals initiating TB treatment because of concerns about complications (such as immune reconstitution inflammatory syndrome) and the risk of reduced adherence if patients have to remember to take two sets of pills. However, starting cART later in those patients who are infected with both HIV and TB can result in potentially avoidable deaths during therapy.
Why Was This Study Done?
Several randomized control trials (RCTs) have been carried out, and the results of three of these studies suggest that, among individuals with severe immune suppression, early initiation of cART (two to four weeks after the start of TB treatment) leads to better survival than later ART initiation (two to three months after the start of TB treatment). These results were reported in abstract form, but the full papers have not yet been published. One problem with RCTs is that they are carried out under controlled conditions that might not represent well the conditions in varied settings around the world. Therefore, observational studies that examine how effective a treatment is in routine clinical conditions can provide information that complements that obtained during clinical trials. In this study, the researchers aimed to confirm the results from RCTs among a cohort of adult patients with HIV/TB co-infection in Rwanda, diagnosed under routine program conditions and using routinely collected clinical data. The researchers also wanted to investigate whether early cART initiation reduced the risk of other adverse outcomes, including treatment default and loss to follow-up.
What Did the Researchers Do and Find?
The researchers retrospectively reviewed the charts and other program records of 308 patients with HIV, who had CD4 counts ≤350 cells/µl, were aged 15 years or more, had never previously taken cART, and received their first TB treatment at one of five cART sites (two urban, three rural) in Rwanda between January 2004 and February 2007. Using this method, the researchers collected baseline demographic and clinical variables and relevant clinical follow-up data. They then used these data to estimate the effect of cART on survival by using sophisticated statistical models that calculated the effects of initiating cART at 15, 30, 60, or 180 d after the start of TB treatment or not at all.
The researchers then conducted a further analysis to assess the combined outcomes of (1) death, default, or loss to follow-up and (2) death, hospitalization due to any cause, or occurrence of severe opportunistic infections, such as Kaposi's sarcoma. The researchers used the resulting multivariable model to estimate survival probabilities for each individual, based on his/her baseline characteristics.
The researchers found that when they set their model to initial CD4 cell counts of 50 and 100 cells/µl, with cART starting at day 15, mean survival probabilities at two years were 0.82 and 0.86, respectively, statistically significantly higher than the survival probabilities calculated for each of the other treatment strategies, where cART was started later. They observed a similar pattern for the combined outcome of death, hospitalization, or serious opportunistic infection. In addition, two-year outcomes for death or loss to follow-up were also improved with early cART, regardless of CD4 count at treatment initiation.
What Do These Findings Mean?
These findings show that in a real-world program setting, starting cART 15 d after the start of TB treatment is more beneficial (measured by differences in survival probabilities) among patients with HIV/TB co-infection who have CD4 cell counts ≤100 cells/µl than starting later. Early cART initiation may also increase retention in care for all individuals with CD4 cell counts ≤350 cells/µl.
As the outcomes of this modeling study are based on data from a retrospective observational study, the biases associated with use of these data must be carefully addressed. However, the results support the recommendation of cART initiation after 15 d of TB treatment for patients with CD4 cell counts ≤100 cells/µl and can be used as an advocacy base for TB treatment to be used as an opportunity to refer and retain HIV-infected individuals in care, regardless of CD4 cell count.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001029.
Information is available on HIV/TB co-infection from the World Health Organization, the US Centers for Disease Control and Prevention, and the International AIDS Society
doi:10.1371/journal.pmed.1001029
PMCID: PMC3086874  PMID: 21559327
8.  Age-related differences in plasma and intracellular tenofovir concentrations in HIV-1 infected children, adolescents and adults 
AIDS (London, England)  2013;27(2):221-225.
Introduction
There is limited pediatric information on the complex relationships among the dose of tenofovir disoproxil fumarate (TDF), plasma concentrations of tenofovir (TFV), and intracellular TFV-diphosphate (TFV-DP) concentrations. Our objectives were to describe TFV-DP pharmacokinetics in children and adolescents and investigate the effect of age on TFV and TFV-DP concentrations.
Methods
TFV-DP pharmacokinetics were determined in 47 children and adolescents. TFV and TFV-DP were quantified with validated LC/MS/MS methods. Data were pooled with other studies in HIV-infected adults (N=55). Nonlinear mixed effects modeling was used to develop the population model and explore the influence of covariates on TFV. A two-compartment model, partitioned for slow and fast absorbers by age, with weight allometrically scaled for children and adolescents best described TFV pharmacokinetics. An indirect stimulation of response model best described TFV-DP formation.
Results
Apparent oral TFV clearance (CL/F) was significantly faster in patients <25 versus ≥25 years. The most significant covariate on CL/F and central distribution volume was creatinine clearance. The TFV plasma concentration producing 50% of maximal TFV-DP concentrations (EC50) was almost 2-fold lower in patients <25 versus ≥25 years. The estimated intracellular TFV-DP half-life for these groups was 70 and 87 hours, respectively.
Conclusions
These data demonstrate children and adolescents receiving standard TDF dosing of 300 mg once daily achieve higher intracellular TFV-DP concentrations than adults, despite lower plasma TFV concentrations. This age-related difference appears to arise from an increased sensitivity to formation of TFV-DP.
doi:10.1097/QAD.0b013e32835a9a2d
PMCID: PMC3586236  PMID: 23032419
age difference; antiretroviral therapy; NRTI; tenofovir; tenofovir diphosphate
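The age effect in article 8 (an EC50 for TFV-DP formation almost twofold lower in patients <25 years) can be illustrated with a simple Emax relationship. All parameter values below are hypothetical placeholders, not the model estimates from the paper, which fit an indirect-response model by nonlinear mixed effects.

```python
def tfv_dp_level(c_plasma, emax, ec50):
    """Simple Emax relationship: intracellular TFV-DP as a function of plasma TFV."""
    return emax * c_plasma / (ec50 + c_plasma)

# Hypothetical parameters: same Emax, with EC50 twofold lower in the younger
# group, mirroring the direction of the difference the abstract reports.
emax = 100.0   # arbitrary units of TFV-DP
c = 0.10       # mg/L plasma TFV (illustrative)
young = tfv_dp_level(c, emax, ec50=0.05)  # patients <25 years
adult = tfv_dp_level(c, emax, ec50=0.10)  # patients >=25 years
# young > adult: higher TFV-DP despite an identical plasma concentration
```

This is the qualitative mechanism the conclusion describes: a lower EC50 shifts the concentration-response curve left, so the same (or even a lower) plasma TFV level yields more intracellular TFV-DP.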
9.  Renal handling of Zn-alpha2-glycoprotein as compared with that of albumin and the retinol-binding protein. 
Journal of Clinical Investigation  1976;57(4):945-954.
An unusual electrophoretic pattern of the urine from a patient with malignant lymphoma was observed. One of the major proteins, identified as Zn-alpha2-glycoprotein (Zn-alpha2), was isolated from the urine and partly characterized. The Stokes radius was found to be 3.24 nm and the molecular weight, determined by sodium dodecyl sulfate polyacrylamide electrophoresis, 42,000. The plasma level in healthy individuals was 39 +/- 7 (SD) mg/liter. In 12 of 25 healthy individuals, Zn-alpha2 was measurable in the urine and was found to be 1.0 +/- 1.1 mg/liter. In 23 patients with chronic glomerulonephritis (CGN), in 9 with proximal tubular dysfunction (PTD), in 23 with various renal diseases (VRD), and in 10 with malignant lymphoma, the plasma level and the urinary excretion were compared with those of albumin (mol wt 67,000) and of the retinol-binding protein (RBP, mol wt 21,000). A close correlation was found between the urine-to-plasma (U/P) ratios of Zn-alpha2 and albumin in the patients with CGN, whereas in the PTD patients the U/P ratios of Zn-alpha2 and RBP were correlated. No significant renal arteriovenous difference in Zn-alpha2 could be demonstrated. The Zn-alpha2 excretion was increased also in two patients with malignant lymphoma and proteinuria of a tubular pattern. The plasma Zn-alpha2 varied inversely with the glomerular filtration rate in the patients with renal disease, but was normal in those with malignant lymphoma. The results are consistent with the assumption of a sieving coefficient of Zn-alpha2 substantially exceeding that of albumin, but notably lower than that of smaller low-molecular-weight proteins. An increased excretion of Zn-alpha2 may be due to increased glomerular permeability as well as to defective proximal tubular reabsorption.
PMCID: PMC436738  PMID: 985827
10.  Plasma and Intracellular Tenofovir Pharmacokinetics in the Neonate (ANRS 12109 Trial, Step 2)
The objective of this study was to investigate for the first time tenofovir (TFV) pharmacokinetics in plasma and peripheral blood mononuclear cells (PBMCs) of the neonate. HIV-1-infected pregnant women received two tablets of tenofovir disoproxil fumarate (TDF; 300 mg) and emtricitabine (FTC; 200 mg) at onset of labor and then one tablet daily for 7 days postpartum. A single dose of 13 mg/kg of body weight of TDF was administered to 36 neonates within 12 h of life after the HIV-1-infected mothers had been administered two tablets of TDF-emtricitabine at delivery. A total of 626 samples collected within the 2 days after the drug administration were measured by liquid chromatography-tandem mass spectrometry (LC-MS/MS) and analyzed by a population approach. In the neonate, the median TFV plasma area under the curve and minimal and maximal concentrations, respectively, were 3.73 mg/liter · h and 0.076 and 0.29 mg/liter. In PBMCs, TFV concentrations were detectable in all fetuses, whereas tenofovir diphosphate (TFV-DP) was quantifiable in only two fetuses, suggesting a lag in appearance of TFV-DP. The median TFV-DP neonatal concentration was 146 fmol/106 cells (interquartile range [IQR], 53 to 430 fmol/106 cells); two neonates had very high TFV-DP concentrations (1,530 and 2,963 fmol/106 cells). The 13-mg/kg TDF dose given to neonates produced plasma TFV and intracellular active TFV-DP concentrations similar to those in adults. This dose should be given immediately after birth to reduce the delay before the active compound TFV-DP appears in cells.
doi:10.1128/AAC.01377-10
PMCID: PMC3101430  PMID: 21464249
11.  Pretreatment CD4 Cell Slope and Progression to AIDS or Death in HIV-Infected Patients Initiating Antiretroviral Therapy—The CASCADE Collaboration: A Collaboration of 23 Cohort Studies 
PLoS Medicine  2010;7(2):e1000239.
Analyzing data from several thousand cohort study participants, Marcel Wolbers and colleagues find that the rate of CD4 T cell decline is not useful in deciding when to start HIV treatment.
Background
CD4 cell count is a strong predictor of the subsequent risk of AIDS or death in HIV-infected patients initiating combination antiretroviral therapy (cART). It is not known whether the rate of CD4 cell decline prior to therapy is related to prognosis and should, therefore, influence the decision on when to initiate cART.
Methods and Findings
We carried out survival analyses of patients from the 23 cohorts of the CASCADE (Concerted Action on SeroConversion to AIDS and Death in Europe) collaboration with a known date of HIV seroconversion and with at least two CD4 measurements prior to initiating cART. For each patient, a pre-cART CD4 slope was estimated using a linear mixed effects model. Our primary outcome was time from initiating cART to a first new AIDS event or death. We included 2,820 treatment-naïve patients initiating cART with a median (interquartile range) pre-cART CD4 cell decline of 61 (46–81) cells/µl per year; 255 patients subsequently experienced a new AIDS event or death and 125 patients died. In an analysis adjusted for established risk factors, the hazard ratio for AIDS or death was 1.01 (95% confidence interval 0.97–1.04) for each 10 cells/µl per year reduction in pre-cART CD4 cell decline. There was also no association between pre-cART CD4 cell slope and survival. Alternative estimates of CD4 cell slope gave similar results. In 1,731 AIDS-free patients with >350 CD4 cells/µl from the pre-cART era, the rate of CD4 cell decline was also not significantly associated with progression to AIDS or death (hazard ratio 0.99, 95% confidence interval 0.94–1.03, for each 10 cells/µl per year reduction in CD4 cell decline).
Conclusions
The CD4 cell slope does not improve the prediction of clinical outcome in patients with a CD4 cell count above 350 cells/µl. Knowledge of the current CD4 cell count is sufficient when deciding whether to initiate cART in asymptomatic patients.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
More than 30 million people are currently infected with the human immunodeficiency virus (HIV), the cause of acquired immunodeficiency syndrome (AIDS). Most people who become infected with HIV do not become ill immediately although some develop a short flu-like illness shortly after infection. This illness is called “seroconversion” illness because it coincides with the appearance of antibodies to HIV in the blood. The next stage of HIV infection has no major symptoms and may last up to 10 years. During this time, HIV slowly destroys immune system cells (including CD4 cells, a type of lymphocyte). Without treatment, the immune system loses the ability to fight off infections by other disease-causing organisms and HIV-positive people then develop so-called opportunistic infections, Kaposi sarcoma (a skin cancer), or non-Hodgkin lymphoma (a cancer of the lymph nodes) that determine the diagnosis of AIDS. Although HIV-positive people used to die within 10 years of infection on average, the development in 1996 of combination antiretroviral therapy (cART; cocktails of powerful antiretroviral drugs) means that, at least for people living in developed countries, HIV/AIDS is now a chronic, treatable condition.
Why Was This Study Done?
The number of CD4 cells in the blood is a strong predictor of the likelihood of AIDS or death in untreated HIV-positive individuals and in people starting cART. Current guidelines recommend, therefore, that cART is started in HIV-positive patients without symptoms when their CD4 cell count drops below a specified cutoff level (typically 350 cells/µl). In addition, several guidelines suggest that clinicians should also consider cART in symptom-free HIV-positive patients with a CD4 cell count above the cutoff level if their CD4 cell count has rapidly declined. However, it is not actually known whether the rate of CD4 cell decline (the so-called "CD4 slope") before initiating cART is related to a patient's outcome, so should clinicians consider this measurement when deciding whether to initiate cART? In this study, the researchers use data from CASCADE (Concerted Action on SeroConversion to AIDS and Death in Europe), a large collaborative study of 23 groups of HIV-positive individuals whose approximate date of HIV infection is known, to answer this question.
What Did the Researchers Do and Find?
The researchers undertook survival analyses of patients in the CASCADE collaboration for whom at least two CD4 cell counts had been recorded before starting cART. They calculated a pre-cART CD4 cell count slope from these counts and used statistical methods to investigate whether there was an association between the rate of decline in CD4 cell count and the time from initiating cART to the primary outcome, a first new AIDS-defining event or death. 2,820 HIV-positive patients initiating cART were included in the study; the median pre-cART CD4 cell decline among them was 61 cells/µl/year. 255 of the patients experienced a new AIDS-related event or died after starting cART, but the researchers found no evidence for an association between the primary outcome and the pre-cART CD4 slope or between survival and this slope. In addition, the rate of CD4 cell count decline was not significantly associated with progression to AIDS or death among 1,731 HIV-positive, symptom-free patients with CD4 cell counts above 350 cells/µl who were studied before cART was developed.
What Do These Findings Mean?
These findings suggest that knowledge of the rate of CD4 cell count decline will not improve the prediction of clinical outcome in HIV-positive patients with a CD4 cell count above 350 cells/µl. Indeed, the findings show that the rate of CD4 cell decline in individual patients is highly variable over time. Consequently, a rate measured at one time cannot be used to reliably predict a patient's future CD4 cell count. Because this was an observational study, patients with the greatest rate of decline in their CD4 cell count might have received better care than other patients, a possibility that would lessen the effect of the rate of CD4 cell count decline on outcomes. Nevertheless, the findings of this study strongly suggest that knowledge of the current CD4 cell count and an assessment of other established risk factors for progression to AIDS are sufficient when deciding whether to initiate cART in symptom-free HIV-positive patients.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000239.
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS
HIV InSite has comprehensive information on all aspects of HIV/AIDS, including information on treatments and treatment guidelines
Information is available from Avert, an international AIDS charity, on all aspects of HIV/AIDS, including information on treatments for HIV and AIDS, when to start treatment, and the stages of HIV infection (in English and Spanish)
Information on CASCADE is available
doi:10.1371/journal.pmed.1000239
PMCID: PMC2826377  PMID: 20186270
12.  The Simultaneous Assay of Tenofovir and Emtricitabine in Plasma using LC/MS/MS and Isotopically Labeled Internal Standards 
An LC/MS/MS assay we published for tenofovir (TFV) plasma levels is a useful tool for monitoring the pharmacotherapy of HIV-positive individuals (J. Chromatography B 830, 6–12, 2006). A new combination therapy consisting of the TFV pro-drug (300 mg) and another reverse transcriptase inhibitor, emtricitabine (FTC, 200 mg), has become available in a convenient once-daily dosage form (Truvada). This widely used medication has prompted us to develop and validate a convenient assay to simultaneously determine TFV and FTC plasma concentrations. In view of their chemical similarity to the analytes, stable isotope internal standards (IS) were chosen. These consisted of TFV labeled uniformly with 13C in the adenine moiety (Iso-TFV) and FTC labeled with 13C and 15N in the cytosine moiety (Iso-FTC). Trifluoroacetic acid was added to the patient's EDTA plasma (containing the IS) to produce a deproteinated extract after high-speed centrifugation. The extracts were injected directly into the mobile phase (3% acetonitrile/1% acetic acid, aqueous) stream flowing at 200 μL/min. A Synergi Polar-RP, 2.0 × 150 mm, reversed-phase analytical column was used to achieve the chromatographic separation. The analytes were detected by positive-ion ESI tandem mass spectrometry. The precursor/product transitions (m/z) were 288/176 and 293/181 for TFV and Iso-TFV, respectively, and 248/130 and 251/133 for FTC and Iso-FTC, respectively. When the analyte/IS abundance ratios were plotted against the specified concentrations, the calibration curves were linear over the range 10–1,500 ng/mL for both analytes (250 μL plasma extracted), with a minimum quantifiable limit of 10 ng/mL for both analytes. The inter- and intra-day accuracy and precision for both TFV and FTC were within ±20% at the LLOQ and ±15% at the other QC levels.
We have expanded the method originally designed for the assay of TFV alone to incorporate the simultaneous determination of the latter and FTC using stable isotope IS. This assay has been successfully used for the periodic monitoring of 678 HIV-positive patients being treated with the combination therapy.
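The quantitation step implied by the abstract (analyte/IS abundance ratio regressed against calibrator concentration, then inverted for unknowns) can be sketched as follows; the calibrator ratios below are made-up illustrative values, not data from the published assay:

```python
import numpy as np

def calibrate(ratios, concs):
    """Fit a linear calibration curve of analyte/IS abundance ratio
    versus calibrator concentration (ng/mL)."""
    slope, intercept = np.polyfit(np.asarray(concs, dtype=float),
                                  np.asarray(ratios, dtype=float), 1)
    return slope, intercept

def quantify(ratio, slope, intercept):
    """Back-calculate a plasma concentration (ng/mL) from a measured
    analyte/IS ratio using the fitted calibration curve."""
    return (ratio - intercept) / slope

# Hypothetical calibrators spanning the validated 10-1,500 ng/mL range:
concs = [10, 50, 250, 750, 1500]
ratios = [0.02, 0.10, 0.50, 1.50, 3.00]  # perfectly linear, for illustration
m, b = calibrate(ratios, concs)
print(round(quantify(1.0, m, b)))  # → 500 (ng/mL)
```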
doi:10.1016/j.jchromb.2009.05.029
PMCID: PMC2714254  PMID: 19493710
Tenofovir (TFV); Tenofovir disoproxil fumarate (TDF); Emtricitabine (FTC); LC/MS/MS; Isotopic Internal Standards (IS); Selective Reaction Monitoring (SRM); Pharmacokinetics
13.  Pharmacy Refill Adherence Compared with CD4 Count Changes for Monitoring HIV-Infected Adults on Antiretroviral Therapy 
PLoS Medicine  2008;5(5):e109.
Background
World Health Organization (WHO) guidelines for monitoring HIV-infected individuals taking combination antiretroviral therapy (cART) in resource-limited settings recommend using CD4+ T cell (CD4) count changes to monitor treatment effectiveness. In practice, however, falling CD4 counts are a consequence, rather than a cause, of virologic failure. Adherence lapses precede virologic failure and, unlike CD4 counts, data on adherence are immediately available to all clinics dispensing cART. However, the accuracy of adherence assessments for predicting future or detecting current virologic failure has not been determined. The goal of this study therefore was to determine the accuracy of adherence assessments for predicting and detecting virologic failure and to compare the accuracy of adherence-based monitoring approaches with approaches monitoring CD4 count changes.
Methodology and Findings
We conducted an observational cohort study among 1,982 of 4,984 (40%) HIV-infected adults initiating non-nucleoside reverse transcriptase inhibitor-based cART in the Aid for AIDS Disease Management Program, which serves nine countries in southern Africa. Pharmacy refill adherence was calculated as the number of months of cART claims submitted divided by the number of complete months between cART initiation and the last refill prior to the endpoint of interest, expressed as a percentage. The main outcome measure was virologic failure defined as a viral load > 1,000 copies/ml (1) at an initial assessment either 6 or 12 mo after cART initiation and (2) after a previous undetectable (i.e., < 400 copies/ml) viral load (breakthrough viremia). Adherence levels outperformed CD4 count changes when used to detect current virologic failure in the first year after cART initiation (area under the receiver operating characteristic [ROC] curves [AUC] were 0.79 and 0.68 [difference = 0.11; 95% CI 0.06 to 0.16; χ2 = 20.1] respectively at 6 mo, and 0.85 and 0.75 [difference = 0.10; 95% CI 0.05 to 0.14; χ2 = 20.2] respectively at 12 mo; p < 0.001 for both comparisons). When used to detect current breakthrough viremia, adherence and CD4 counts were equally accurate (AUCs of 0.68 versus 0.67, respectively [difference = 0.01; 95% CI −0.06 to 0.07]; χ2 = 0.1, p > 0.5). In addition, adherence levels assessed 3 mo prior to viral load assessments were as accurate for virologic failure occurring approximately 3 mo later as were CD4 count changes calculated from cART initiation to the actual time of the viral load assessments, indicating the potential utility of adherence assessments for predicting future, rather than simply detecting current, virologic failure. Moreover, combinations of CD4 count and adherence data appeared useful in identifying patients at very low risk of virologic failure.
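The refill-adherence measure defined above is simple arithmetic; a minimal sketch of the calculation:

```python
def refill_adherence(months_claimed, complete_months):
    """Pharmacy refill adherence as defined in the study: months of cART
    claims submitted divided by the number of complete months between
    cART initiation and the last refill before the endpoint of interest,
    expressed as a percentage. Values above 100% can occur if refills
    were collected early."""
    if complete_months <= 0:
        raise ValueError("follow-up must include at least one complete month")
    return 100.0 * months_claimed / complete_months

# e.g. 10 monthly claims over a 12-month interval:
print(refill_adherence(10, 12))  # ≈ 83.3
```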
Conclusions
Pharmacy refill adherence assessments were as accurate as CD4 counts for detecting current virologic failure in this cohort of patients on cART and have the potential to predict virologic failure before it occurs. Approaches to cART scale-up in resource-limited settings should include an adherence-based monitoring approach.
Analyzing pharmacy and laboratory records from 1,982 patients beginning HIV therapy in southern Africa, Gregory Bisson and colleagues find medication adherence superior to CD4 count changes in identifying treatment failure.
Editors' Summary
Background.
Globally, more than 30 million people are infected with the human immunodeficiency virus (HIV), the cause of acquired immunodeficiency syndrome (AIDS). Combinations of antiretroviral drugs that hold HIV in check (viral suppression) have been available since 1996. Unfortunately, most of the people affected by HIV/AIDS live in developing countries and cannot afford these expensive drugs. As a result, life expectancy has plummeted and economic growth has reversed in these poor countries since the beginning of the AIDS pandemic. Faced with this humanitarian crisis, the lack of access to HIV treatment was declared a global health emergency in 2003. Today, through the concerted efforts of governments, international organizations, and funding bodies, about a quarter of the HIV-positive people in developing and transitional countries who are in immediate need of life-saving, combination antiretroviral therapy (cART) receive the drugs they need.
Why Was This Study Done?
To maximize the benefits of cART, health-care workers in developing countries need simple, affordable ways to monitor viral suppression in their patients—a poor virologic response to cART can lead to the selection of drug-resistant HIV, rapid disease progression, and death. In developed countries, virologic response is monitored by measuring the number of viral particles in patients' blood (viral load) but this technically demanding assay is unavailable in most developing countries. Instead, the World Health Organization recommends that CD4+ T cell (CD4) counts be used to monitor patient responses to cART in resource-limited settings. HIV results in loss of CD4 cells (a type of immune system cell), so a drop in a patient's CD4 count often indicates virologic failure (failure of treatment to suppress the virus). However, falling CD4 counts are often a result of virologic failure and therefore monitoring CD4 counts for drops is unlikely to prevent virologic failure from occurring. Rather, falling CD4 counts are often used only to guide a change to new medicines, which may be even more expensive or difficult to take. On the other hand “adherence lapses”—the failure to take cART regularly—often precede virologic failure, so detecting them early provides an opportunity for improvement in adherence that could prevent virologic failure. Because clinics that dispense cART routinely collect data that can be used to calculate adherence, in this study the researchers investigate whether assessing adherence might provide an alternative, low-cost way to monitor and predict virologic failure among HIV-infected adults on cART.
What Did the Researchers Do and Find?
The Aid for AIDS Disease Management Program provides cART to medical insurance fund subscribers in nine countries in southern Africa. Data on claims for antiretroviral drugs made through this program, plus CD4 counts assessed at about 6 or 12 months after initiating cART, and viral load measurements taken within 45 days of a CD4 count, were available for nearly 2,000 HIV-positive adults who had been prescribed a combination of HIV drugs including either efavirenz or nevirapine. The researchers defined adherence as the number of months of cART claims submitted divided by the number of complete months between cART initiation and the last pharmacy refill before a viral load assessment was performed. Virologic failure was defined in two ways: as a viral load of more than 1,000 copies per ml of blood 6 or 12 months after cART initiation, or as a rebound of viral load to similar levels after a previously very low reading (breakthrough viremia). The researchers' statistical analysis of these data shows that at 6 and 12 months after initiation of cART, adherence levels indicated virologic failure more accurately than CD4 count changes. For breakthrough viremia, both measurements were equally accurate. Adherence levels during the first 3 months of cART predicted virologic failure at 6 months as accurately as did CD4 count changes since cART initiation. Finally, the combination of adherence levels and CD4 count changes accurately identified patients at very low risk of virologic failure.
What Do These Findings Mean?
These findings suggest that adherence assessments (based in this study on insurance claims for pharmacy refills) can identify the patients on cART who are at high and low risk of virologic failure at least as accurately as CD4 counts. In addition, they suggest that adherence assessments could be used for early identification of patients at high risk of virologic failure, averting the health impact of treatment failure and the cost of changing to second-line drug regimens. Studies need to be done in other settings (in particular, in public clinics where cART is provided without charge) to confirm the generalizability of these findings. These findings do not change the fact that monitoring CD4 counts plays an important role in deciding when to start cART or indicating when cART is no longer protecting the immune system. But, write the researchers, systematic monitoring of adherence to cART should be considered as an alternative to CD4 count monitoring in patients who are receiving cART in resource-limited settings, or as a way to direct the use of viral load testing where feasible.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050109.
This study is discussed further in a PLoS Medicine Perspective by David Bangsberg
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS
HIV InSite has comprehensive information on all aspects of HIV/AIDS, including an article about adherence to antiretroviral therapy
Information is available from Avert, an international AIDS charity, on HIV and AIDS in Africa and on providing AIDS drug treatment for millions
The World Health Organization provides information about universal access to HIV treatment (in several languages) and on its recommendations for antiretroviral therapy for HIV infection in adults and adolescents
The US Centers for Disease Control and Prevention also provides information on global efforts to deal with the HIV/AIDS pandemic (in English and Spanish)
doi:10.1371/journal.pmed.0050109
PMCID: PMC2386831  PMID: 18494555
14.  The Relationship between Proteinuria and Coronary Risk: A Systematic Review and Meta-Analysis 
PLoS Medicine  2008;5(10):e207.
Background
Markers of kidney dysfunction such as proteinuria or albuminuria have been reported to be associated with coronary heart disease, but the consistency and strength of any such relationship has not been clearly defined. This lack of clarity has led to great uncertainty as to how proteinuria should be treated in the assessment and management of cardiovascular risk. We therefore undertook a systematic review of published cohort studies aiming to provide a reliable estimate of the strength of association between proteinuria and coronary heart disease.
Methods and Findings
A meta-analysis of cohort studies was conducted to obtain a summary estimate of the association between measures of proteinuria and coronary risk. MEDLINE and EMBASE were searched for studies reporting an age- or multivariate-adjusted estimate and standard error of the association between proteinuria and coronary heart disease. Studies were excluded if the majority of the study population had known glomerular disease or were the recipients of renal transplants. Two independent researchers extracted the estimates of association between proteinuria (total urinary protein >300 mg/d), microalbuminuria (urinary albumin 30–300 mg/d), macroalbuminuria (urinary albumin >300 mg/d), and risk of coronary disease from individual studies. These estimates were combined using a random-effects model. Sensitivity analyses were conducted to examine possible sources of heterogeneity in effect size. A total of 26 cohort studies were identified involving 169,949 individuals and 7,117 coronary events (27% fatal). The presence of proteinuria was associated with an approximate 50% increase in coronary risk (risk ratio 1.47, 95% confidence interval [CI] 1.23–1.74) after adjustment for known risk factors. For albuminuria, there was evidence of a dose–response relationship: individuals with microalbuminuria were at 50% greater risk of coronary heart disease (risk ratio 1.47, 95% CI 1.30–1.66) than those without; in those with macroalbuminuria the risk was more than doubled (risk ratio 2.17, 1.87–2.52). Sensitivity analysis indicated no important differences in prespecified subgroups.
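The random-effects pooling described above is most commonly done with the DerSimonian–Laird estimator (an assumption here; the abstract does not name the specific estimator). A self-contained sketch on made-up study inputs:

```python
import math

def dersimonian_laird(log_rrs, ses):
    """Pool per-study log risk ratios with the DerSimonian-Laird
    random-effects estimator; returns the pooled risk ratio and its
    95% confidence interval."""
    w = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    k = len(log_rrs)
    fixed = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_rrs)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Three hypothetical studies (log risk ratios and standard errors):
rr, ci = dersimonian_laird([0.30, 0.45, 0.40], [0.10, 0.15, 0.12])
print(round(rr, 2))  # pooled risk ratio ≈ 1.44 for these made-up inputs
```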
Conclusion
These data confirm a strong and continuous association between proteinuria and subsequent risk of coronary heart disease, and suggest that proteinuria should be incorporated into the assessment of an individual's cardiovascular risk.
Vlado Perkovic and colleagues show, through a systematic review and meta-analysis of cohort studies, that there is a strong and continuous association between proteinuria and subsequent risk of coronary heart disease.
Editors' Summary
Background.
Coronary heart disease (CHD) is the leading cause of death among adults in developed countries. With age, fatty deposits called atherosclerotic plaques coat the walls of arteries, the vessels that nourish the organs of the body by carrying blood and oxygen to them. Because they narrow the arteries, atherosclerotic plaques restrict the blood flow to the body's organs. If these plaques form in the arteries that feed the heart muscle (the coronary arteries), the result is CHD. The symptoms of CHD include shortness of breath and chest pains (angina). In addition, if a plaque breaks off the wall of a coronary artery, it can completely block that artery, which kills part of the heart muscle and causes a potentially fatal heart attack. Smoking, high blood pressure, high blood levels of cholesterol (a type of fat), having diabetes, being overweight, and being physically inactive are established risk factors for CHD. Treatments for CHD include lifestyle changes (for example, losing weight) and medications that lower blood pressure and blood cholesterol. The narrowed arteries can also be widened using a device called a stent or surgically bypassed.
Why Was This Study Done?
In addition to the established risk factors for CHD, several other factors may also increase a person's risk of developing CHD, including kidney disease, which affects one in six adults to some degree. An early sign of kidney dysfunction is high amounts of a protein called albumin or of total proteins in the urine (albuminuria and proteinuria, respectively). Some studies have suggested that proteinuria is associated with an increased risk of CHD, but the results of these studies are inconsistent. Consequently, it is unclear whether proteinuria should be considered when assessing and managing an individual's CHD risk. In this study, the researchers undertake a systematic review (a study in which predefined search criteria are used to identify all the research on a specific topic) and a meta-analysis (a statistical method for combining the results of several studies) of published studies that have investigated the association between proteinuria and CHD.
What Did the Researchers Do and Find?
The researchers' systematic review identified 26 published studies that provided estimates of the association between CHD risk and proteinuria and albuminuria by measuring baseline urinary protein and albumin levels in people who were then followed for several years to see whether they developed CHD. Nearly 170,000 individuals participated in these studies, which recorded more than 7,000 fatal and nonfatal heart attacks and other coronary events. In the meta-analysis, proteinuria (urinary protein of more than 300 mg/d or dipstick 1+ or more) increased CHD risk by 50% after adjustment for other known CHD risk factors. Furthermore, individuals with microalbuminuria (a urinary albumin of 30–300 mg/d) were 50% more likely to develop CHD than those with normal amounts of urinary albumin; people with macroalbuminuria (urinary albumin of more than 300 mg/d) were more than twice as likely to develop CHD. Finally, the association between proteinuria and CHD did not differ substantially between specific subgroups of participants such as people with and without diabetes.
What Do These Findings Mean?
These findings suggest that there is a strong, possibly dose-dependent association between proteinuria and the risk of CHD and that this association is independent of other known CHD risk factors, including diabetes. The finding that people with proteinuria have a 50% or greater increase in the risk of developing CHD compared with people without proteinuria may slightly overestimate the strength of the association between proteinuria and CHD because of publication bias. That is, studies that failed to show an association may not have been published. However, because this systematic review and meta-analysis includes several large population-based studies done in various parts of the world, these findings are likely to be generalizable. Thus, these findings support the inclusion of an evaluation of proteinuria in the assessment of CHD risk and suggest that medications and other strategies that reduce proteinuria might help to reduce the overall burden of CHD.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050207.
The MedlinePlus encyclopedia has pages on coronary heart disease, atherosclerosis, and chronic kidney failure (in English and Spanish)
Information is available from the US National Heart Lung and Blood Institute on coronary heart disease
The UK National Health Service Direct health encyclopedia also provides information about coronary heart disease (in several languages)
Information for patients and caregivers is provided by the American Heart Association on all aspects of heart disease.
The British Heart Foundation also provides information on heart disease and on keeping the heart healthy
doi:10.1371/journal.pmed.0050207
PMCID: PMC2570419  PMID: 18942886
15.  Urine High and Low Molecular Weight Proteins One-Year Post Kidney Transplant: Relationship to Histology and Graft Survival 
Increased urinary protein excretion is common after renal transplantation and portends a worse outcome. In this study we assessed the prognostic contribution of several urinary proteins. Urinary total protein, albumin, retinol binding protein (RBP), α-1-microglobulin, IgG, and IgM were measured in banked urine samples from 221 individuals one year after renal transplantation (age 52 ± 13 years, 55% male, 93% Caucasian, 82% living donor). Levels of all proteins measured were higher than in normal non-transplant populations. Patients with glomerular lesions had higher urinary albumin than those with normal histology, while those with interstitial fibrosis and tubular atrophy plus inflammation (ci>0, cg=0, i>0) had higher levels of IgG, IgM, α-1-microglobulin, and RBP. Concomitant normal levels of urinary albumin, IgM, and RBP identified normal histology (specificity 91%, sensitivity 15%). Urinary levels of the specific proteins were highly correlated, could not differentiate among the histologic groups, and appeared to result from tubulointerstitial damage. Increased urinary excretion of the low molecular weight protein RBP was a sensitive marker of allografts at risk, predicting long-term graft loss independent of histology and urinary albumin. This study highlights the prognostic importance of tubulointerstitial disease for long-term graft loss.
doi:10.1111/ajt.12044
PMCID: PMC3582782  PMID: 23414180
Retinol binding protein; α-1-microglobulin; IgG; IgM; Albumin; Protocol biopsies; Spot urine; Creatinine ratio; Graft survival
16.  Differentiation of glomerular, tubular, and normal proteinuria: determinations of urinary excretion of β2-microglobulin, albumin, and total protein 
Journal of Clinical Investigation  1969;48(7):1189-1198.
A low molecular weight β2-globulin (β2-microglobulin), albumin, and total protein were measured in concentrated 24-hr urine specimens from 20 healthy subjects and 30 patients with clinical proteinuria of glomerular or tubular type. Classification of proteinuria was made on the basis of clinical diagnosis and size distribution of urinary proteins after gel chromatography. The molecular radii (Stokes' radii) of β2-microglobulin and albumin, estimated by gel chromatography, were 15 Å and 35 Å.
The average 24-hr urinary excretion in healthy subjects was 0.12 mg for β2-microglobulin, 10 mg for albumin, and 80 mg for total protein. The patients with renal glomerular disorders had normal or only somewhat increased excretion of β2-microglobulin, despite considerably increased excretion of albumin and total protein. Most of the patients with tubular dysfunction excreted large amounts of β2-microglobulin, although they excreted normal or only slightly increased amounts of albumin and only moderately increased quantities of total protein. Consequently, the ratio of urinary albumin to urinary β2-microglobulin was high in glomerular proteinuria (1,100–14,200), intermediate in normal proteinuria (33–163), and low in tubular proteinuria (1.0–13.3). Determinations of urinary clearances of β2-microglobulin and albumin in four healthy subjects and 11 patients indicated that increased excretions of the two proteins were associated with increased clearances. The results suggest that quantitative determinations of urinary β2-microglobulin and urinary albumin may be useful for detecting disorders of the renal handling of plasma proteins. The findings also seem to suggest a selective tubular reabsorption of the two proteins.
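The albumin/β2-microglobulin ratio pattern above lends itself to a simple screen; the cutoffs in this sketch are drawn from the ranges reported in the abstract and are illustrative assumptions, not validated clinical thresholds:

```python
def classify_proteinuria(urinary_albumin_mg, urinary_b2m_mg):
    """Classify by the urinary albumin / β2-microglobulin ratio, using
    the ranges reported in the abstract (glomerular high, tubular low,
    normal intermediate). The exact cutoffs between those ranges are
    illustrative assumptions only."""
    ratio = urinary_albumin_mg / urinary_b2m_mg
    if ratio >= 1100:
        return "glomerular pattern"
    if ratio <= 13.3:
        return "tubular pattern"
    return "intermediate (normal-range) pattern"

# Average healthy-subject excretion from the abstract:
# 10 mg albumin and 0.12 mg β2-microglobulin per 24 h (ratio ≈ 83):
print(classify_proteinuria(10, 0.12))
```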
Estimates on sera revealed a close correlation between serum levels of β2-microglobulin and creatinine and also a greatly raised serum concentration of β2-microglobulin after bilateral nephrectomy.
PMCID: PMC322340  PMID: 4978446
17.  Natural Substrate Concentrations Can Modulate the Prophylactic Efficacy of Nucleotide HIV Reverse Transcriptase Inhibitors
Journal of Virology  2011;85(13):6610-6617.
Preexposure prophylaxis (PrEP) with antiretroviral drugs is a novel human immunodeficiency virus (HIV) prevention strategy. It is generally thought that high systemic and mucosal drug levels are sufficient for protection. We investigated whether GS7340, a next-generation tenofovir (TFV) prodrug that effectively delivers tenofovir diphosphate (TFV-DP) to lymphoid cells and tissues, could protect macaques against repeated weekly rectal simian-human immunodeficiency virus (SHIV) exposures. Macaques received prophylactic GS7340 treatment 3 days prior to each virus exposure. At 3 days postdosing, TFV-DP concentrations in peripheral blood mononuclear cells (PBMCs) were about 50-fold higher than those seen with TFV disoproxil fumarate (TDF), and they remained above 1,000 fmol/106 cells for as long as 7 days. TFV-DP accumulated in lymphoid and rectal tissues, with concentrations at 3 days exceeding 500 fmol/106 mononuclear cells. Despite high mucosal and systemic TFV levels, GS7340 was not protective. Since TFV-DP blocks reverse transcription by competing with the natural dATP substrate, we measured dATP contents in peripheral lymphocytes, lymphoid tissue, and rectal mononuclear cells. Compared to those in circulating lymphocytes and lymphoid tissue, rectal lymphocytes had 100-fold higher dATP concentrations and dATP/TFV-DP ratios, likely reflecting the activated status of the cells and suggesting that TFV-DP may be less active at the rectal mucosa. Our results identify dATP/TFV-DP ratios as a possible correlate of protection by TFV and suggest that natural substrate concentrations at the mucosa will likely modulate the prophylactic efficacy of nucleotide reverse transcriptase inhibitors.
doi:10.1128/JVI.00311-11
PMCID: PMC3126530  PMID: 21525346
18.  CD4 Cell Count and the Risk of AIDS or Death in HIV-Infected Adults on Combination Antiretroviral Therapy with a Suppressed Viral Load: A Longitudinal Cohort Study from COHERE 
PLoS Medicine  2012;9(3):e1001194.
Using data from the Collaboration of Observational HIV Epidemiological Research Europe, Jim Young and colleagues show that in successfully treated patients the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression.
Background
Most adults infected with HIV achieve viral suppression within a year of starting combination antiretroviral therapy (cART). It is important to understand the risk of AIDS events or death for patients with a suppressed viral load.
Methods and Findings
Using data from the Collaboration of Observational HIV Epidemiological Research Europe (2010 merger), we assessed the risk of a new AIDS-defining event or death in successfully treated patients. We accumulated episodes of viral suppression for each patient while on cART, each episode beginning with the second of two consecutive plasma viral load measurements <50 copies/ml and ending with either a measurement >500 copies/ml, the first of two consecutive measurements between 50 and 500 copies/ml, cART interruption or administrative censoring. We used stratified multivariate Cox models to estimate the association between time-updated CD4 cell count and a new AIDS event or death, or death alone. 75,336 patients contributed 104,265 suppression episodes and were suppressed while on cART for a median of 2.7 years. The mortality rate was 4.8 per 1,000 years of viral suppression. A higher CD4 cell count was always associated with a reduced risk of a new AIDS event or death, with a hazard ratio per 100 cells/µl (95% CI) of: 0.35 (0.30–0.40) for counts <200 cells/µl, 0.81 (0.71–0.92) for counts 200 to <350 cells/µl, 0.74 (0.66–0.83) for counts 350 to <500 cells/µl, and 0.96 (0.92–0.99) for counts ≥500 cells/µl. A higher CD4 cell count became even more beneficial over time for patients with CD4 cell counts <200 cells/µl.
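Because the hazard ratios above are expressed per 100 cells/µl, the implied relative hazard for an arbitrary CD4 difference within a stratum is the stratum HR raised to the power (difference/100), assuming the log-linear relationship holds within the stratum. A sketch under that assumption:

```python
def relative_hazard(hr_per_100: float, cd4_diff: float) -> float:
    """Relative hazard implied by a per-100-cells/ul hazard ratio for a
    CD4 difference of `cd4_diff` cells/ul (log-linear within the stratum)."""
    return hr_per_100 ** (cd4_diff / 100.0)

# Stratum HRs from the abstract (new AIDS event or death, per 100 cells/ul).
hr_low = 0.35   # counts < 200 cells/ul
hr_mid = 0.81   # counts 200 to < 350 cells/ul

print(round(relative_hazard(hr_low, 100), 3))   # by definition, 0.35
print(round(relative_hazard(hr_mid, 150), 3))   # 0.81 ** 1.5 = 0.729
```

The steep HR in the <200 stratum means a 100-cell gain there cuts the hazard by almost two-thirds, whereas the same gain above 500 cells/µl changes it by only a few percent.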
Conclusions
Despite the low mortality rate, the risk of a new AIDS event or death follows a CD4 cell count gradient in patients with viral suppression. A higher CD4 cell count was associated with the greatest benefit for patients with a CD4 cell count <200 cells/µl but still some slight benefit for those with a CD4 cell count ≥500 cells/µl.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Currently, about 34 million people are infected with HIV and every year nearly 3 million people are newly infected with this virus, which causes AIDS. Most people do not become ill immediately after infection with HIV although some develop a short, flu-like illness (a “seroconversion” illness). The next stage of HIV infection, which may last up to 10 years, also has no major symptoms but, during this stage, HIV slowly destroys immune system cells (including CD4 cells, a type of lymphocyte). Eventually, the immune system can no longer fight off infections by other disease-causing organisms and HIV-positive people then develop one or more AIDS-defining condition(s), including severe but unusual infections, Kaposi sarcoma (a skin cancer), and non-Hodgkin lymphoma (a cancer of the lymph nodes). Many of these AIDS-defining conditions are life-threatening and, in the past, HIV-positive people died on average within 10 years of infection. Nowadays, although there is still no cure for HIV infection, combination antiretroviral therapy (cART; a cocktail of powerful antiretroviral drugs) has turned HIV/AIDS into a chronic, treatable condition, at least in developed countries.
Why Was This Study Done?
Most HIV-positive adults achieve viral suppression within a year of starting cART. That is, the number of copies of the virus in their blood drops to below 50 copies/ml. But what is the likely clinical outcome for patients who achieve viral suppression and what is their risk of developing a new AIDS-defining condition or of dying? For people starting cART for the first time, the number of CD4 cells in the blood when cART is initiated provides a strong indication of an individual's likely clinical outcome. Specifically, people who start cART when they have a high CD4 cell count tend to do better than people who start treatment when they have a low CD4 cell count. In this study, the researchers use data collected by the Collaboration of Observational HIV Epidemiological Research in Europe (COHERE) to estimate the association between CD4 cell count and progression to a new AIDS-defining event or death among patients who have achieved viral suppression while on cART.
What Did the Researchers Do and Find?
The researchers identified more than 75,000 patients in the COHERE database who, between them, had had more than 104,000 episodes (periods) of viral suppression while on cART and who had had their CD4 cell count determined shortly before or during their viral suppression episodes. The researchers then used stratified multivariate Cox models (a type of statistical analysis method) to estimate the association between CD4 cell counts and the occurrence of a new AIDS-defining event or death. Among the patients included in the study, the mortality (death) rate was 4.8 per 1,000 years of viral suppression. The highest rates of new AIDS-defining events or death were seen in those patients with less than 50 CD4 cells/µl blood and a higher CD4 cell count was associated with a reduced risk of a new AIDS-defining event or death. Finally, among those patients with a CD4 cell count below 200 cells/µl, the risk of progression decreased over time for those patients with higher CD4 cell counts.
What Do These Findings Mean?
These findings suggest that, although new AIDS-defining events and death are uncommon among patients whose viral load is suppressed by cART, the risk of a new AIDS-defining event or death follows a CD4 cell count gradient with the patients with the highest CD4 cell counts having the lowest risk of a new AIDS-defining event or death. The findings also suggest that higher CD4 cell counts provide the greatest benefit for patients with a CD4 cell count below 200 cells/µl blood. These findings have two main clinical implications. First, they add to the evidence that suggests that, to facilitate immune system recovery, cART should be started when a patient's CD4 cell count is between 350 and 500 cells/µl blood, the current recommended range for cART initiation. Unfortunately, most patients in resource-limited settings only start cART when their CD4 cell count is below 200 cells/µl. Second, these findings suggest that patients with sustained viral suppression but low CD4 cell counts should be monitored regularly to ensure that any life-threatening AIDS-defining events are dealt with quickly and effectively.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001194.
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS
NAM/aidsmap provides basic information about HIV/AIDS, and summaries of recent research findings on HIV care and treatment
Information is available from Avert, an international AIDS charity on many aspects of HIV/AIDS, including detailed information on HIV treatment and care (in English and Spanish)
The World Health Organization's 2010 antiretroviral therapy guidelines provide recommendations on when to initiate cART
Information about COHERE is available
Patient stories about living with HIV/AIDS are available through Avert and through the charity website Healthtalkonline
doi:10.1371/journal.pmed.1001194
PMCID: PMC3308938  PMID: 22448150
19.  Idiopathic recurrent calcium urolithiasis (IRCU): pathophysiology evaluated in light of oxidative metabolism, without and with variation of several biomarkers in fasting urine and plasma - a comparison of stone-free and -bearing male patients, emphasizing mineral, acid-base, blood pressure and protein status 
Background
IRCU is traditionally considered as lifestyle disease (associations with, among others, overweight, obesity, hypertension, type-2 diabetes), arising from excess, in 24 h urine, of calcium (Ca) salts (calcium oxalate (CaOx), calcium phosphate (CaPi)), supersaturation of, and crystallization in, tubular fluid and urine, causing crystal-induced epithelial cell damage, proteinuria, crystal aggregation and uroliths.
Methods
Another picture emerges from the present uncontrolled study of 154 male adult IRCU patients (75 stone-bearing (SB) and 79 age-matched stone-free (SF)), in whom stone-forming and other parameters in fasting urine and plasma were contrasted with five biomarkers (see footnote) of oxidative metabolism (OM), without and with variation of markers.
Results
1) In SB vs. SF unstratified OM biomarkers were statistically unchanged, but the majority of patients were overweight; despite this, in SB vs. SF, urine pH and total and non-albumin protein concentration were elevated, fractional urinary uric acid excretion and blood bicarbonate decreased, whereas urine volume, sodium, supersaturation with CaOx and CaPi (as hydroxyapatite) were unchanged; 2) upon variation of OM markers (strata below and above median) numerous stone parameters differed significantly, among others urine volume, total protein, Ca/Pi ratio, pH, sodium, potassium, plasma Ca/Pi ratio and parathyroid hormone, blood pressure, renal excretion of non-albumin protein and other substances; 3) a significant shift from SF to SB patients occurred with increase of urine pH, decrease of blood bicarbonate, and increase of diastolic blood pressure, whereas increase of plasma uric acid impacted only marginally; 4) in both SF and SB patients a strong curvilinear relationship links a rise of urine Ca/Pi to urine Ca/Pi divided by plasma Ca/Pi, but in SB urine Ca/Pi failed to correlate significantly with urine hydroxyapatite supersaturation; 5) also in SB, plasma Ca/Pi and urinary nitrate were negatively correlated, whereas in SF plasma Ca/Pi ratio, PTH and body mass index correlated positively; 6) multivariate regression analysis revealed that PTH, body mass index and nitrate together could explain 22 (p = 0.002) and only 7 (p = 0.06) per cent of variation of plasma Ca/Pi in SF and SB, respectively.
Conclusions
In IRCU a) numerous constituents of fasting urine, plasma, blood and blood pressure change in response to variation of OM biomarkers, suggesting involvement of OM imbalance as factor in functional deterioration of tissue; b) in the majority of patients a positive exponential relationship links urine Ca/Pi to urine Ca/Pi divided by plasma Ca/Pi, presumably to accumulate Ca outside tubular lumen, thereby minimizing intratubular and urinary Ca salt crystallization; c) alteration of interactions of low urine nitrate, PTH and Ca/Pi in plasma may be of importance in formation of new Ca stone and co-regulation of dynamics of blood vasculature; d) overweight, combined with OM-modified renal interstitial environment appears to facilitate these processes, carrying the risk that CaPi mineral develops within or/and close to blood vessel tissue, and spreads towards urothelium.
For future research focussing on IRCU pathogenesis studies are recommended on the role of affluent lifestyle mediated renal ischemia, mild hypertensive nephropathy, rise of uric acid precursor oxypurines and uricemia, clarifying also why loss of significance of interrelationships of OM biomarkers with traditional Ca stone risk factors is characteristic for SB patients.
OM biomarkers
Plasma uric acid - Discussed as scavenger of reactive oxygen species, but also as donator (via the xanthine oxido-reductase reaction)
Urinary malondialdehyde - Accepted as indicator of peroxidation of lipids within biological cell membranes
Urinary nitrate - Accepted as indicator of vasodilation-mediating nitric oxide production by blood vessel endothelium
Urinary malondialdehyde/Plasma uric acid - Tentative markers of oxidant/antioxidant imbalance
Urinary nitrate/Plasma uric acid - Tentative markers of oxidant/antioxidant imbalance
doi:10.1186/2047-783X-16-8-349
PMCID: PMC3351987  PMID: 21813378
Idiopathic Recurrent Calcium Urolithiasis; New stones absent or present; Oxidative and nitrative metabolism; Variation of biomarkers; State of stone parameters
20.  Urinary NGAL is a useful clinical biomarker of HIV-associated nephropathy 
Nephrology Dialysis Transplantation  2011;26(7):2387-2390.
Background. Urinary neutrophil gelatinase-associated lipocalin (uNGAL) is expressed by kidney tubules that are acutely damaged, but few studies have investigated the association of neutrophil gelatinase-associated lipocalin (NGAL) with different forms of chronic kidney disease (CKD). HIV-associated nephropathy (HIVAN) is a progressive form of CKD characterized by collapsing focal segmental glomerulosclerosis and microcystic tubular dilatation that typically leads to end-stage renal disease (ESRD).
Methods. Previously, we reported that microcystic tubular dilatations specifically expressed NGAL RNA, implying that the detection of uNGAL protein could mark advanced HIVAN. To test this idea, we performed a comparative study of diverse proteinuric glomerulopathies in 25 patients who were HIV positive.
Results. Eighteen patients had HIVAN and seven had other glomerulopathies (four membranoproliferative glomerulonephritis, one membranous glomerulonephritis, one amyloid and one malarial GN). HIVAN and non-HIVAN patients did not differ with respect to age, ethnicity, serum creatinine, estimated GFR, proteinuria or the prevalence of hypocomplementemia (6 versus 29%, P = 0.18), but HIVAN patients were less likely to have HCV infections. HIVAN patients expressed 4-fold higher levels of uNGAL than the patients with other glomerulopathies [387 ± 338 versus 94 ± 101 μg/g urine creatinine (uCr), P = 0.02]. A cutpoint of 121.5 μg uNGAL/g uCr demonstrated 94% sensitivity and 71% specificity for the diagnosis of HIVAN, with an area under the receiver operator characteristic curve of 0.88.
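The reported 94%/71% operating point comes from applying the cutpoint to the two diagnostic groups. A minimal sketch of that computation — the uNGAL values below are illustrative, not the study data:

```python
def sens_spec(values_disease, values_other, cutpoint):
    """Sensitivity/specificity of 'value >= cutpoint' as a positive test."""
    tp = sum(v >= cutpoint for v in values_disease)   # true positives
    tn = sum(v < cutpoint for v in values_other)      # true negatives
    return tp / len(values_disease), tn / len(values_other)

# Hypothetical uNGAL values (ug/g urine creatinine), NOT the study data.
hivan = [387, 250, 130, 95, 500, 140]
other = [94, 60, 130, 40, 210, 80, 50]

sens, spec = sens_spec(hivan, other, 121.5)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

Sweeping the cutpoint over all observed values and plotting sensitivity against (1 − specificity) yields the receiver operating characteristic curve whose area the abstract reports as 0.88.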
Conclusion. In summary, while HIVAN disease is currently diagnosed only by kidney biopsy, uNGAL can distinguish HIVAN from other proteinuric glomerulopathies in the HIV-infected patient, likely because of its specific expression from characteristic microcysts.
doi:10.1093/ndt/gfr258
PMCID: PMC3164447  PMID: 21555394
biomarker; HIV-associated nephropathy; progressive chronic kidney disease; tubular injury; urinary neutrophil gelatinase-associated lipocalin
21.  Renal Function in Hepatosplenic Schistosomiasis – An Assessment of Renal Tubular Disorders 
PLoS ONE  2014;9(12):e115197.
Background
Renal involvement in Schistosoma mansoni infection is not well studied. The aim of this study is to investigate the occurrence of renal abnormalities in patients with hepatosplenic schistosomiasis (HSS), especially renal tubular disorders.
Methods
This is a cross-sectional study with 20 consecutive patients with HSS followed in a medical center in Maceió, Alagoas, Brazil. Urinary acidification and concentration tests were performed using calcium chloride (CaCl2) after a 12-h period of water and food deprivation. The biomarker monocyte chemoattractant protein 1 (MCP-1) was quantified in urine. Fractional excretion of sodium (FENa+), transtubular potassium gradient (TTKG) and solute-free water reabsorption (TcH2O) were calculated. The HSS group was compared to a group of 17 healthy volunteers.
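The three derived indices named above are standard bedside formulas. A sketch of the usual definitions (variable names are ours; urine/plasma pairs must share units):

```python
def fe_na(u_na, p_na, u_cr, p_cr):
    """Fractional excretion of sodium, percent: (UNa x PCr) / (PNa x UCr) x 100."""
    return 100.0 * (u_na * p_cr) / (p_na * u_cr)

def ttkg(u_k, p_k, u_osm, p_osm):
    """Transtubular potassium gradient: (UK / PK) / (Uosm / Posm)."""
    return (u_k / p_k) / (u_osm / p_osm)

def tc_h2o(urine_flow, u_osm, p_osm):
    """Solute-free water reabsorption, same units as urine_flow:
    TcH2O = V x (Uosm/Posm - 1)."""
    return urine_flow * (u_osm / p_osm - 1.0)

# Illustrative values only, not patient data from the study.
print(fe_na(u_na=40, p_na=140, u_cr=100, p_cr=1.0))   # ~0.29 %
print(ttkg(u_k=40, p_k=4.0, u_osm=600, p_osm=300))    # 5.0
print(tc_h2o(urine_flow=1.0, u_osm=600, p_osm=300))   # 1.0
```

A positive TcH2O indicates net water reabsorption by the concentrating segments; the lower TcH2O reported for HSS patients is what points to a concentrating defect.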
Results
Patients' mean age and gender distribution were similar to those of controls. Urinary acidification deficit was found in 45% of HSS patients. Urinary osmolality was significantly lower in HSS patients (588±112 vs. 764±165 mOsm/kg, p = 0.001) after a 12-h period of water deprivation. TcH2O was lower in HSS patients (0.72±0.5 vs. 1.1±0.3, p = 0.04). Urinary concentration deficit was found in 85% of HSS patients. The values of MCP-1 were higher in the HSS group than in the control group (122±134 vs. 40±28 pg/mg-Cr, p = 0.01) and positively correlated with the values of microalbuminuria and proteinuria.
Conclusions
HSS is associated with important kidney dysfunction. The main abnormalities found were impaired urinary concentrating ability and an incomplete distal acidification defect, demonstrating the occurrence of tubular dysfunction. There was also an increase in urinary MCP-1, which appears to be a more sensitive marker of renal damage than the urinary albumin excretion rate.
doi:10.1371/journal.pone.0115197
PMCID: PMC4274079  PMID: 25531759
22.  Pharmacokinetics of Antiretroviral Regimens Containing Tenofovir Disoproxil Fumarate and Atazanavir-Ritonavir in Adolescents and Young Adults with Human Immunodeficiency Virus Infection▿ †  
The primary objective of this study was to measure atazanavir-ritonavir and tenofovir pharmacokinetics when the drugs were used in combination in young adults with human immunodeficiency virus (HIV). HIV-infected subjects ≥18 to <25 years old receiving (≥28 days) 300/100 mg atazanavir-ritonavir plus 300 mg tenofovir disoproxil fumarate (TDF) plus one or more other nucleoside analogs underwent intensive 24-h pharmacokinetic studies following a light meal. Peripheral blood mononuclear cells were obtained at 1, 4, and 24 h postdose for quantification of intracellular tenofovir diphosphate (TFV-DP) concentrations. Twenty-two subjects were eligible for analyses. The geometric mean (95% confidence interval [CI]) atazanavir area under the concentration-time curve from 0 to 24 h (AUC0-24), maximum concentration of drug in serum (Cmax), concentration at 24 h postdose (C24), and total apparent oral clearance (CL/F) values were 35,971 ng·hr/ml (30,853 to 41,898), 3,504 ng/ml (2,978 to 4,105), 578 ng/ml (474 to 704), and 8.3 liter/hr (7.2 to 9.7), respectively. The geometric mean (95% CI) tenofovir AUC0-24, Cmax, C24, and CL/F values were 2,762 ng·hr/ml (2,392 to 3,041), 254 ng/ml (221 to 292), 60 ng/ml (52 to 68), and 49.2 liter/hr (43.8 to 55.3), respectively. Body weight was significantly predictive of CL/F for all three drugs. For every 10-kg increase in weight, there was a 10%, 14.8%, and 6.8% increase in the atazanavir, ritonavir, and tenofovir CL/F, respectively (P ≤ 0.01). Renal function was predictive of tenofovir CL/F. For every 10 ml/min increase in creatinine clearance, there was a 4.6% increase in tenofovir CL/F (P < 0.0001). The geometric mean (95% CI) TFV-DP concentrations at 1, 4, and 24 h postdose were 96.4 (71.5 to 130), 93.3 (68 to 130), and 92.7 (70 to 123) fmol/million cells. 
There were associations between intracellular TFV-DP concentrations and renal function, tenofovir AUC, and tenofovir Cmax, although none reached statistical significance. In these HIV-infected young adults treated with atazanavir-ritonavir plus TDF, the atazanavir AUC was similar to those of older adults treated with the combination. Based on data for healthy volunteers, a higher tenofovir AUC might have been expected, but was not seen in these subjects. This might be due to faster tenofovir CL/F driven by the higher creatinine clearance in this age group. Additional studies of the exposure-response relationships of this regimen in children, adolescents, and adults would advance our knowledge of its pharmacodynamic properties.
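The reported covariate effect implies a multiplicative scaling of apparent clearance with body weight. A sketch, under our assumption that the stated percent change compounds per 10-kg step from a reference individual:

```python
def scaled_clf(clf_ref, pct_per_10kg, weight_diff_kg):
    """Apparent oral clearance scaled from a reference value, assuming the
    reported percent increase compounds per 10-kg weight difference
    (our modeling assumption, not stated explicitly in the abstract)."""
    return clf_ref * (1.0 + pct_per_10kg / 100.0) ** (weight_diff_kg / 10.0)

# Geometric-mean tenofovir CL/F from the abstract: 49.2 liter/hr,
# with a reported 6.8% increase per 10-kg increase in weight.
print(round(scaled_clf(49.2, 6.8, 20), 1))   # 20 kg above the reference weight
```

Under this assumption a subject 20 kg heavier than the reference would clear tenofovir roughly 14% faster, which is consistent with the direction of the body-weight effect the study reports.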
doi:10.1128/AAC.00761-07
PMCID: PMC2224775  PMID: 18025112
23.  Increased Pericardial Fat Volume Measured From Noncontrast CT Predicts Myocardial Ischemia by SPECT 
JACC. Cardiovascular imaging  2010;3(11):1104-1112.
OBJECTIVES
We evaluated the association between pericardial fat and myocardial ischemia for risk stratification.
BACKGROUND
Pericardial fat volume (PFV) and thoracic fat volume (TFV) measured from noncontrast computed tomography (CT) performed for calculating coronary calcium score (CCS) are associated with increased CCS and risk for major adverse cardiovascular events.
METHODS
From a cohort of 1,777 consecutive patients without previously known coronary artery disease (CAD) with noncontrast CT performed within 6 months of single photon emission computed tomography (SPECT), we compared 73 patients with ischemia by SPECT (cases) with 146 patients with normal SPECT (controls) matched by age, gender, CCS category, and symptoms and risk factors for CAD. TFV was automatically measured. Pericardial contours were manually defined within which fat voxels were automatically identified to compute PFV. Computer-assisted visual interpretation of SPECT was performed using standard 17-segment and 5-point score model; perfusion defect was quantified as summed stress score (SSS) and summed rest score (SRS). Ischemia was defined by: SSS – SRS ≥4. Independent relationships of PFV and TFV to ischemia were examined.
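The perfusion scoring described above reduces to summing seventeen 0-4 segment scores at stress and at rest. A sketch of the SSS/SRS computation and the study's ischemia criterion, using synthetic segment scores:

```python
def summed_score(segment_scores):
    """Sum of 17-segment, 5-point (0-4) perfusion scores."""
    assert len(segment_scores) == 17 and all(0 <= s <= 4 for s in segment_scores)
    return sum(segment_scores)

def is_ischemic(stress_scores, rest_scores, threshold=4):
    """Ischemia per the study definition: SSS - SRS >= 4."""
    return summed_score(stress_scores) - summed_score(rest_scores) >= threshold

# Synthetic example: a reversible defect confined to a few segments.
stress = [0] * 12 + [2, 3, 2, 1, 0]   # SSS = 8
rest   = [0] * 12 + [1, 1, 0, 0, 0]   # SRS = 2
print(is_ischemic(stress, rest))      # SSS - SRS = 6, so ischemic
```

The SSS − SRS difference isolates the reversible (stress-induced) component of the defect, which is what distinguishes ischemia from fixed scar.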
RESULTS
Cases had higher mean PFV (99.1 ± 42.9 cm3 vs. 80.1 ± 31.8 cm3, p = 0.0003) and TFV (196.1 ± 82.7 cm3 vs. 160.8 ± 72.1 cm3, p = 0.001) and higher frequencies of PFV >125 cm3 (22% vs. 8%, p = 0.004) and TFV >200 cm3 (40% vs. 19%, p = 0.001) than controls. After adjustment for CCS, PFV and TFV remained the strongest predictors of ischemia (odds ratio [OR]: 2.91, 95% confidence interval [CI]: 1.53 to 5.52, p = 0.001 for each doubling of PFV; OR: 2.64, 95% CI: 1.48 to 4.72, p = 0.001 for TFV). Receiver operating characteristic analysis showed that prediction of ischemia, as indicated by the area under the receiver operating characteristic curve, improved significantly when PFV or TFV was added to CCS (0.75 vs. 0.68, p = 0.04 for both).
CONCLUSIONS
Pericardial fat was significantly associated with myocardial ischemia in patients without known CAD and may help improve risk assessment.
doi:10.1016/j.jcmg.2010.07.014
PMCID: PMC3057558  PMID: 21070997
computed tomography; ischemia; pericardial fat; SPECT; thoracic fat
24.  Indomethacin Reduces Glomerular and Tubular Damage Markers but Not Renal Inflammation in Chronic Kidney Disease Patients: A Post-Hoc Analysis 
PLoS ONE  2012;7(5):e37957.
Under specific conditions non-steroidal anti-inflammatory drugs (NSAIDs) may be used to lower therapy-resistant proteinuria. The potentially beneficial anti-proteinuric, tubulo-protective, and anti-inflammatory effects of NSAIDs may be offset by an increased risk of (renal) side effects. We investigated the effect of indomethacin on urinary markers of glomerular and tubular damage and renal inflammation. We performed a post-hoc analysis of a prospective open-label crossover study in chronic kidney disease patients (n = 12) with mild renal function impairment and stable residual proteinuria of 4.7±4.1 g/d. After a wash-out period of six wks without any RAAS blocking agents or other therapy to lower proteinuria (untreated proteinuria (UP)), patients subsequently received indomethacin 75 mg BID for 4 wks (NSAID). Healthy subjects (n = 10) screened for kidney donation served as controls. Urine and plasma levels of total IgG, IgG4, KIM-1, beta-2-microglobulin, H-FABP, MCP-1 and NGAL were determined using ELISA. Following NSAID treatment, 24-h urinary excretion of glomerular and proximal tubular damage markers was reduced in comparison with the period without anti-proteinuric treatment (total IgG: UP 131 [38–513] vs NSAID 38 [17–218] mg/24 h, p<0.01; IgG4: 50 [16–68] vs 10 [1–38] mg/24 h, p<0.001; beta-2-microglobulin: 200 [55–404] vs 50 [28–110] µg/24 h, p = 0.03; KIM-1: 9 [5–14] vs 5 [2–9] µg/24 h, p = 0.01). Fractional excretions of these damage markers were also reduced by NSAID. The distal tubular marker H-FABP showed a trend to reduction following NSAID treatment. Surprisingly, NSAID treatment did not reduce urinary excretion of the inflammation markers MCP-1 and NGAL, but did reduce plasma MCP-1 levels, resulting in an increased fractional MCP-1 excretion. In conclusion, the anti-proteinuric effect of indomethacin is associated with reduced urinary excretion of glomerular and tubular damage markers, but not with reduced excretion of renal inflammation markers.
Future studies should address whether the short term glomerulo- and tubulo-protective effects as observed outweigh the possible side-effects of NSAID treatment on the long term.
doi:10.1371/journal.pone.0037957
PMCID: PMC3360674  PMID: 22662255
25.  Serum and urinary proteins, lysozyme (muramidase), and renal dysfunction in mono- and myelomonocytic leukemia 
Journal of Clinical Investigation  1970;49(9):1694-1708.
Serum levels, urinary excretion, and clearances of several proteins of different molecular weights were studied in 18 patients with mono- and myelomonocytic leukemia. Nine patients had normal renal function (group A) and nine had impaired renal function with azotemia (group B). The majority of patients in both groups had increased concentration of immunoglobulins, particularly IgG, IgA, and IgM; IgD level was normal. Serum transferrin and α2-macroglobulin were frequently reduced while the level of ceruloplasmin was often increased, especially in patients with azotemia. The activity of lysozyme in the serum was high in all patients, but was considerably higher in group B.
Proteinuria was found in most patients but was more prominent in group B. Almost invariably albumin constituted less than 25% of the total protein excreted. Qualitative analysis of various urinary proteins by immunochemical techniques and clearance studies suggested the presence of glomerular as well as tubular dysfunction. Determination of urinary lysozyme frequently showed no direct correlation between the serum level of the enzyme and its concentration in the urine or its clearance by the kidney. In addition to glomerular filtration, impaired tubular reabsorption may account for the high level of lysozyme in the urine. It is postulated that the very high level of lysozyme in the glomerular filtrate and possibly hypergammaglobulinemia may play a role in the induction of tubular damage. Renal impairment has been correlated with histological changes in the kidneys. From a comparative study of various leukemias, it seems that the combined glomerular-tubular dysfunction is a manifestation unique to mono- and myelomonocytic leukemia.
PMCID: PMC322653  PMID: 5270914
