The aim of this analysis was to estimate biochemical parameters and the costs of treatment of secondary hyperparathyroidism (SHPT) in a subpopulation of the FARO-2 study.
The FARO-2 observational study aimed at evaluating the patterns of treatment for SHPT in naïve hemodialysis patients. Data related to pharmacological treatments and biochemical parameters (parathyroid hormone [PTH], calcium, phosphate) were recorded at entry to hemodialysis (baseline) and 6 months later (second survey). The analysis was performed from the Italian National Health Service perspective.
Two prominent treatment groups were identified, ie, one on oral calcitriol (n=105) and the other on intravenous paricalcitol (n=33); the intravenous calcitriol and intravenous paricalcitol + cinacalcet combination groups were not analyzed due to low patient numbers. At baseline, serum PTH levels were significantly higher in the intravenous paricalcitol group (P<0.0001). At the second survey, the intravenous paricalcitol group showed a higher percentage of patients at target for PTH than the oral calcitriol group, without changing the percentage of patients at target for phosphate. Moreover, between baseline and the second survey, intravenous paricalcitol significantly increased both the percentage of patients at target for PTH (P=0.033) and the percentage of patients at target for the combined endpoint of PTH, calcium, and phosphate (P=0.001). The per-patient weekly pharmaceutical costs related to SHPT treatment, erythropoiesis-stimulating agents (ESAs), and phosphate binders amounted to 186.32€ and 219.94€ at baseline for oral calcitriol and intravenous paricalcitol, respectively; after 6 months, the costs were 180.51€ and 198.79€, respectively. Neither at entry to dialysis nor 6 months later was the total cost of SHPT treatment significantly lower in the oral calcitriol group than in the intravenous paricalcitol group, and the difference between groups decreased by 46% between the two observations. The cost of ESAs at the second survey was lower (−22%) in the intravenous paricalcitol group than in the oral calcitriol group (132.13€ versus 168.36€, respectively).
Intravenous paricalcitol significantly increased the percentage of patients at target for the combined endpoint of PTH, calcium, and phosphate (P=0.001). The total cost of treatment for the patients treated with intravenous paricalcitol 6 months after entry to dialysis was not significantly higher than the cost for patients treated with oral calcitriol.
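As a quick arithmetic check, the reported ~46% narrowing of the between-group cost difference can be reproduced from the per-patient weekly costs given above (a minimal sketch; the variable and function names are ours, the figures are the study's):

```python
# Per-patient weekly pharmaceutical costs in EUR (SHPT treatment +
# erythropoiesis-stimulating agents + phosphate binders), as reported.
ORAL_CALCITRIOL = {"baseline": 186.32, "month6": 180.51}
IV_PARICALCITOL = {"baseline": 219.94, "month6": 198.79}

def cost_gap_narrowing(cheaper, pricier):
    """Percentage by which the between-group weekly cost difference
    shrank between baseline and the 6-month survey."""
    gap_start = pricier["baseline"] - cheaper["baseline"]  # 33.62 EUR
    gap_end = pricier["month6"] - cheaper["month6"]        # 18.28 EUR
    return 100.0 * (gap_start - gap_end) / gap_start
```

Applied to the two groups, the function returns approximately 45.6%, which rounds to the 46% decrease stated in the abstract.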
cost consequences analysis; therapeutic costs; outcomes; SHPT treatments; secondary hyperparathyroidism
Pharmaceutical agents provide diagnostic and therapeutic utility that is central to patient care. However, all agents also carry adverse-effect profiles. While most adverse effects are clinically insignificant, some drugs can cause unacceptable toxicity that negatively impacts patient morbidity and mortality. Recognizing adverse effects is important for administering appropriate drug doses, instituting preventive strategies, and withdrawing the offending agent when toxicity occurs. In the present article, we review drugs that are associated with impaired renal function. By focusing on pharmaceutical agents currently in clinical practice, we provide an overview of the nephrotoxic drugs a treating physician is most likely to encounter. In doing so, we summarize risk factors for nephrotoxicity, describe clinical manifestations, and address preventive and treatment strategies.
acute kidney injury; chronic kidney disease; drug nephrotoxicity; chemotherapy; NSAIDs
Peritoneal dialysis (PD) is an effective renal replacement strategy for patients with end-stage renal disease. PD offers patient survival comparable to or better than that with in-center hemodialysis while preserving residual kidney function, empowering patient autonomy, and reducing the financial burden on payors. The majority of patients with kidney failure are eligible for PD. In patients with cardiorenal syndrome and uncontrolled fluid status, PD is of particular benefit, decreasing hospitalization rates and duration. This review discusses the benefits of chronic PD, performed by the patient or a caregiver at home. Recognition of these benefits is a cornerstone of stimulating the use of this treatment strategy.
peritoneal dialysis; survival; quality of life; cost; home dialysis
Diagnosing the etiology of a rapidly progressive glomerulonephritis is of vital importance to guide appropriate therapeutic management. This case highlights the complexity involved in establishing diagnosis when presentation is atypical. In certain cases diagnosis cannot be established based on clinical presentation or biopsy findings alone, and critical analysis of biopsy findings in context of clinical presentation is crucial to guide the clinical decision-making process.
A 47-year-old Hispanic male with history of granulomatosis with polyangiitis (GPA) in remission on azathioprine, presented with fatigue and lethargy. Physical examination was unremarkable. Laboratory data revealed elevated creatinine and otherwise normal electrolytes. Urinalysis showed numerous dysmorphic red blood cells with few red cell casts. His serologic results were all negative except anti-proteinase-3 antibody at very low titers. Kidney biopsy showed necrotizing crescentic glomerulonephritis with linear immunoglobulin G staining along the basement membrane.
This case presented conflicting serologic and histopathologic findings. The presence of anti-proteinase-3 antibody supported a diagnosis of recurrent GPA. However, linear staining for immunoglobulin G (IgG) on immunofluorescence (IF) of the renal biopsy supported anti-glomerular basement membrane (GBM) disease. Treatment of both anti-GBM disease and GPA involves immunosuppression with prednisone and cyclophosphamide. However, patients with anti-GBM disease are also treated with plasmapheresis early in the disease course to prevent further damage. Patients with GPA, on the other hand, have been shown to benefit from plasmapheresis only in cases of severe renal disease (serum creatinine level more than 5 mg/dL) or pulmonary hemorrhage. In this case, since the patient did not have detectable circulating anti-GBM antibody, the decision was made not to proceed with plasmapheresis. The patient was treated with a standard immunosuppressive regimen of prednisone and cyclophosphamide, with partial renal recovery at 2 months.
necrotizing RPGN; anti-GBM disease; GPA; ANCA-associated vasculitis; dual antibody-positive disease
Emergency and critical care medicine have grown into robust, self-supporting disciplines with an increasing demand for dedicated, highly skilled physicians. In the past, “core” specialists were asked to offer bedside advice in acute care wards. In the same vein, critical care medicine and nephrology long contended with each other before finally merging in the concept of critical care nephrology almost 20 years ago. Indeed, polyvalence is no longer a valid option in modern critical care. Uniting forces between disciplines is the only way to cope with the increasing complexity and accumulating knowledge of the critical care setting. For this reason, the wide array of emerging acute care subspecialties must be committed to unrestricted growth and development. This will require competent manpower, a well-designed technical framework, and sufficient financial support. The worldwide success of critical care nephrology proves the feasibility of this concept.
translational medicine; multidisciplinarity; acute medicine; CRRT; dialysis; critical care nephrology
Blood oxygen level-dependent magnetic resonance imaging (BOLD MRI) has recently emerged as an important noninvasive technique to assess intrarenal oxygenation under physiologic and pathophysiologic conditions. Although this tool represents a major addition to our armamentarium of methodologies to investigate the role of hypoxia in the pathogenesis of acute kidney injury and progressive chronic kidney disease, numerous technical limitations confound interpretation of data derived from this approach. BOLD MRI has been utilized to assess intrarenal oxygenation in numerous experimental models of kidney disease and in human subjects with diabetic and nondiabetic chronic kidney disease, acute kidney injury, renal allograft rejection, contrast-associated nephropathy, and obstructive uropathy. However, confidence in conclusions based on data derived from BOLD MRI measurements will require continuing advances and technical refinements in the use of this technique.
kidney; hypoxia; oxygenation; diabetes mellitus; chronic kidney disease; acute kidney injury; contrast-associated nephropathy; BOLD MRI
This study addresses for the first time the question of whether there is a significant macrophage population in human kidney sections from patients with acute tubular injury (ATI). We therefore examined the interstitial macrophage population in human kidney tissue with a biopsy-proven diagnosis of ATI, minimal change disease (MCD), or MCD with ATI. Kidney biopsies from patients with these diagnoses were stained with antibodies directed against CD68 (general macrophage marker), CD163 (M2 marker), and HLA-DR (M1 marker), and the corresponding electron microscopy samples were evaluated for the presence of interstitial macrophages. Our study shows that patients with ATI have significantly increased numbers of interstitial CD68+ macrophages, with an increase in both HLA-DR+ M1 macrophages and CD163+ M2 macrophages, as compared with patients with MCD alone. Approximately 75% of macrophages were M2 (CD163+), whereas only 25% were M1 (HLA-DR+). M2 macrophages, which are believed to be critical for wound healing, were found to localize close to the tubular basement membrane of injured proximal tubule cells. Ultrastructural examination showed close adherence of macrophages to the basement membrane of injured tubular epithelial cells. We conclude that macrophages accumulate around injured tubules following ATI and exhibit a predominantly M2 phenotype. We further speculate that macrophage-mediated repair may involve physical contact between the M2 macrophage and the injured tubular epithelial cell.
macrophages; acute kidney injury; CD163; HLA-DR; CD68; electron microscopy
Immunoglobulin (Ig) A nephropathy (IgAN) is the most common form of glomerular disease worldwide and is associated with a poor prognosis. Thus, development of a curative treatment and of strategies for early diagnosis and treatment is urgently needed. Pathological analysis of a renal biopsy is the gold standard for diagnosis and assessment of disease activity; however, immediate and frequent assessment based on biopsy specimens is difficult. Therefore, a simple and safe alternative is desirable. It is now widely accepted that multiple hits are required for the continued progression of IgAN: production of aberrantly glycosylated serum IgA1 (first hit); IgG or IgA autoantibodies that recognize glycan-containing epitopes on this aberrantly glycosylated IgA1 (second hit); subsequent immune complex formation (third hit); and glomerular deposition (fourth hit). Although the prognostic and predictive values of several markers have been discussed elsewhere, we recently developed a highly sensitive and specific diagnostic method based on measuring serum levels of aberrantly glycosylated IgA1 and the related IgA immune complex. In addition, we confirmed a significant correlation between serum levels of these essential effector molecules and disease activity after treatment, suggesting that each can be considered a practical surrogate marker of therapeutic effect in this slowly progressive disease. Such a noninvasive method of diagnosis and activity assessment using these disease-oriented specific biomarkers may be useful for early diagnosis of and intervention in IgAN, with appropriate indication for treatment, and may thus aid the future development and dissemination of specific and curative treatments.
galactose-deficient immunoglobulin A1; anti-glycan antibody; immune complex; N-acetylgalactosamine; surrogate marker
Since 2005, an abundance of targeted agents has been approved for the treatment of metastatic renal cell carcinoma (mRCC), without any specification as to what may be the optimal first-line and second-line sequence. Hence, our objective was to critically examine the evidence supporting the use of first-line and second-line agents in the management of mRCC. Our review suggests that, in the first line, sunitinib and pazopanib represent treatment options for patients with favorable- or intermediate-risk features and clear cell histology. Unfortunately, the Phase III trial could not conclusively prove the noninferiority of pazopanib relative to sunitinib; hence, the use of sunitinib as the first-line standard of care remains justified. Pazopanib represents an option for specific patients in whom sunitinib might not be tolerated. In patients with poor-risk features, temsirolimus represents the only option supported by level 1 evidence. Less optimal alternatives include sunitinib and bevacizumab combined with interferon, given the minimal inclusion of poor-risk patients in the pivotal Phase III studies of these two molecules. In patients with non-clear cell mRCC, the use of temsirolimus is supported by Phase III data, unlike any other molecule. In the second line, the options consist of everolimus and axitinib. However, the axitinib data are substantially more robust, given the inclusion of more patients considered as true second-line, and justify the choice of axitinib over everolimus. Nonetheless, the Phase III trial of everolimus may be considered level 1 evidence for its use in the third or subsequent lines of therapy.
targeted therapy; metastatic; renal cell carcinoma; clear cell; sequential therapy
There has been an exponential increase in the incidence of diabetes and hypertension in India in the last few decades, with a proportional increase in chronic kidney disease (CKD). Preventive health care and maintenance of asymptomatic chronic diseases such as CKD are often neglected by patients until they become symptomatic with fluid retention and uremia. Management of hyperphosphatemia in CKD remains one of the challenges of nephrology in India for this reason, as it is almost completely asymptomatic but contributes to renal osteodystrophy, metastatic vascular calcification, and acceleration of cardiovascular disease. Lack of understanding of the dangers of asymptomatic hyperphosphatemia, the huge pill burden of phosphate binders, difficulty with dietary and dialysis compliance, and, most importantly, the added expense of the drugs place additional roadblocks in the treatment of hyperphosphatemia at a population level in developing countries like India. In this review, we address the contribution of hyperphosphatemia to adverse outcomes and discuss the economic, cultural, and societal factors unique to the management of phosphate levels in Indian patients with advanced CKD.
dialysis; chronic kidney disease; vascular calcification
To analyze the complications and costs of minilaparotomies performed by a nephrologist (group A) compared with conventional laparotomies performed by a surgeon (group B) for peritoneal catheter implantation.
Two university hospitals (Santa Sofia and Caldas) in Manizales, Caldas, Colombia.
The study included stage 5 chronic kidney disease patients, with indication of renal replacement therapy, who were candidates for peritoneal dialysis and gave informed consent for a peritoneal catheter implant. Minilaparotomies were performed by a nephrologist in a minor surgery room under local anesthesia. Conventional laparotomies were performed by a surgeon in an operating room under general anesthesia.
Two nephrologists inserted 157 peritoneal catheters, and seven general surgeons inserted 185 peritoneal catheters. The groups had similar characteristics: the mean age was 55 years, 49.5% were men, and the primary diagnoses were diabetic nephropathy, hypertensive nephropathy, and unknown etiology. The implant was successful for 98.09% of group A and 99.46% of group B. There was no procedure-related mortality. The most frequent complications in the first 30 days postsurgery in group A versus group B, respectively, were: peritonitis (6.37% versus 3.78%), exit-site infection (3.82% versus 2.16%), tunnel infection (0% versus 0.54%), catheter entrapment by omentum (1.27% versus 3.24%), peritoneal effluent spillover (1.91% versus 2.16%), draining failure (4.46% versus 6.49%), hematoma (0% versus 1.08%), catheter migration with kinking (3.18% versus 2.70%), hemoperitoneum (1.27% versus 0%), and accidental hollow viscus puncture (1.91% versus 0.54%). There were no statistically significant differences in the number of complications between groups. In 2013, the cost of a surgeon-implanted peritoneal dialysis catheter in Colombia was US $366 (666,000 COP), whereas the cost of a nephrologist-implanted catheter was US $198 (356,725 COP).
Nephrologist-performed minilaparotomies had effectiveness similar to that of surgeon-performed conventional laparotomies and were cost-effective; however, the nonuse of general anesthesia may be related to accidental hollow viscus puncture during the procedure.
catheter implantation; surgical technique; minilaparotomy; complications
Diabetic nephropathy is a significant cause of chronic kidney disease and end-stage renal failure globally. Much research has been conducted in both basic science and clinical therapeutics, which has enhanced understanding of the pathophysiology of diabetic nephropathy and expanded the potential therapies available. This review will examine the current concepts of diabetic nephropathy management in the context of some of the basic science and pathophysiology aspects relevant to the approaches taken in novel, investigative treatment strategies.
diabetes; diabetic nephropathy; albuminuria; kidney disease; inflammation
The objective of the study reported here was to describe dose equivalence and hemoglobin (Hb) stability in a cohort of unselected hemodialysis patients who were switched simultaneously from epoetin alfa to darbepoetin alfa.
This was a multicenter, observational, retrospective study in patients aged ≥18 years who switched from intravenous (IV) epoetin alfa to IV darbepoetin alfa in October 2007 (Month 0) and continued on hemodialysis for at least 24 months. The dose was adjusted to maintain Hb within 1.0 g/dL of baseline.
We included 125 patients (59.7% male, mean [standard deviation (SD)] age 70.4 [13.4] years). No significant changes were observed in Hb levels (mean [SD] 11.9 [1.3] g/dL, 12.0 [1.5], 12.0 [1.5], and 12.0 [1.7] at Months −12, 0, 12 and 24, respectively, P=0.409). After conversion, the erythropoiesis-stimulating agent (ESA) dose decreased significantly (P<0.0001), with an annual mean of 174.7 (88.7) international units (IU)/kg/week for epoetin versus 95.7 (43.4) (first year) and 91.4 (42.7) IU/kg/week (second year) for darbepoetin (65% and 64% reduction, respectively). The ESA resistance index decreased from 15.1 (8.5) IU/kg/week/g/dL with epoetin to 8.1 (3.9) (first year) and 7.9 (4.0) (second year) with darbepoetin (P<0.0001). The conversion rate was 354:1 in patients requiring high (>200 IU/kg/week) doses of epoetin and 291:1 in patients requiring low doses.
In patients on hemodialysis receiving ESAs, conversion from epoetin alfa to darbepoetin alfa was associated with a persistent reduction of approximately 65% in the required dose. To maintain Hb stability, a conversion rate of 300:1 seems appropriate for most patients receiving low doses of epoetin alfa (≤200 IU/kg/week), while 350:1 may be better for patients receiving higher doses.
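The dose-conversion rule suggested in the conclusion can be sketched as a small helper (illustrative only; the function name and unit handling are our assumptions, with darbepoetin doses expressed in µg and epoetin in IU as is conventional):

```python
def darbepoetin_start_dose(epoetin_iu_kg_week):
    """Suggested darbepoetin alfa starting dose (ug/kg/week) derived from
    the weekly epoetin alfa dose, using the conversion rates proposed in
    the study: 300:1 for epoetin doses <=200 IU/kg/week, 350:1 above."""
    rate = 300 if epoetin_iu_kg_week <= 200 else 350
    return epoetin_iu_kg_week / rate
```

For example, a patient on 150 IU/kg/week of epoetin alfa would start on 0.5 µg/kg/week of darbepoetin alfa, with subsequent titration to keep Hb stable.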
chronic kidney disease; darbepoetin alfa; dose equivalence; epoetin alfa; hemodialysis; hemoglobin
Nephrologists face enormous challenges in the management of patients with end-stage renal disease, especially in sub-Saharan Africa, where hemodialysis is the most common modality of renal replacement therapy. We therefore reviewed our 3 years of experience with hemodialysis services in a tertiary hospital located in a rural community of South West Nigeria, with a view to presenting the profile of hemodialysis patients and the challenges they face in sustaining hemodialysis.
We reviewed the case records and hemodialysis registers for 176 patients over the 3 years from November 2010 to December 2013. The data were analyzed using Statistical Package for the Social Sciences version 20 software.
Of the 176 patients, 119 (66.9%) were males. The mean age of the patients was 44.87±17.21 years. Most were semiskilled or unskilled (111; 63.5%) and 29 (16.5%) were students. Twenty-six (14.8%) had acute kidney injury in the failure stage. Chronic glomerulonephritis, hypertensive nephropathy, and diabetic nephropathy accounted for 45.3%, 23.3%, and 12.1%, respectively, of patients with end-stage renal disease. Only 6.8% of patients could afford hemodialysis beyond 3 months.
Sustainability of maintenance hemodialysis is poor in our environment. Efforts should be intensified to improve other modalities of renal replacement therapy, in particular kidney transplantation, which is cost-effective in the long-term. Also, preventive measures such as education for affected patients and the general population would assist in reducing the prevalence and progression to end-stage renal disease.
end-stage renal disease; hemodialysis; sustainability; outcome
Arteriovenous fistula formation remains critical for the provision of hemodialysis to patients with end-stage renal failure. Fistula creation results in a significant increase in cardiac output, with resultant alterations in cardiac stroke volume, systemic blood flow, and vascular resistance. The impact of fistula formation on cardiac and vascular structure and function has not yet been evaluated via “gold standard” imaging techniques in the modern era of end-stage renal failure care.
A total of 24 patients with stage 5 chronic kidney disease undergoing fistula creation were studied in a single-arm pilot study. Cardiovascular magnetic resonance imaging, a gold-standard imaging modality, was undertaken at baseline (prior to fistula creation) and 6 months following fistula creation to evaluate cardiac structure and function and aortic distensibility; endothelial function was assessed via standard brachial flow-mediated dilation techniques.
At follow-up, left ventricular ejection fraction remained unchanged, while mean cardiac output increased by 25.0% (P<0.0001). Significant increases were observed in left and right ventricular end-systolic volumes (21% [P=0.014] and 18% [P<0.01]), left and right atrial area (11% [P<0.01] and 9% [P<0.01]), and left ventricular mass (12.7% [P<0.01]). Endothelium-dependent vasodilation was significantly decreased at follow-up (9.0%±9% versus 3.0%±6%; P=0.01). No significant change in aortic distensibility was identified.
In patients with end-stage renal failure, fistula formation is associated with an increase in cardiac output, dilation of all cardiac chambers, and deterioration in endothelial function.
cardiac; cardiovascular disease; vascular biology
Introduction and objective
While pruritus is a common complication in hemodialysis patients, the pathophysiological mechanisms remain obscure. Recently, B-type (brain) natriuretic peptide (BNP) has been defined as an itch-selective neuropeptide in pruriceptive neurons in mice, and higher serum levels of BNP are frequently observed in hemodialysis patients. The objective of the present study was to evaluate the role of serum BNP in pruritus in patients undergoing hemodialysis.
Patients and methods
The current cross-sectional study was performed on 43 patients undergoing maintenance hemodialysis. A visual analog scale (VAS) measuring the general severity of pruritus (values from 0 to 10, with higher values indicating more severe pruritus) in daytime and at night was self-reported by patients. Each patient’s background and laboratory tests, including serum BNP in the post-hemodialysis period, were collected. The correlation between VAS and clinical parameters was evaluated.
Both daytime and nighttime VAS scores were significantly lower in diabetic than in nondiabetic patients. Multiple regression analysis revealed that pruritus in daytime was worsened by serum BNP (β=2.0, t=2.4, P=0.03), calcium (β=4.4, t=5.2, P<0.0001), and β2-microglobulin (β=2.0, t=3.0, P=0.007), while it was eased by age (β=−2.2, t=−3.2, P=0.0004). Nocturnal pruritus was severe in nondiabetic patients (β=1.7, t=3.8, P=0.0005) and weakened by the total iron binding capacity (β=−2.9, t=−3.1, P=0.004).
It is suggested that a higher level of serum BNP increases the pruritus of hemodialysis patients in daytime and that diabetic patients are less sensitive to itch, especially at nighttime.
B-type brain natriuretic peptide; pruritus; hemodialysis; visual analog scale; itch-selective neuropeptide; pruriceptive neurons; cerebrospinal fluid
Infectious peritonitis (IP) is the most common complication in peritoneal dialysis (PD). The purpose of this study is to assess the prevalence of IP and to determine its clinical, biological, and evolutive characteristics.
Patients and methods
We conducted a five-year, five-month retrospective study from July 2006 to December 2011. All patients who had been followed on PD for a minimum of 3 months and who presented with IP during follow-up were included. Data were analyzed using SPSS 17.0.
A total of 76 episodes of IP were identified in 36 patients. The peritonitis rate (patient-months per episode), as calculated by the Registre de Dialyse Péritonéale de Langue Française (RDPLF) [French-language peritoneal dialysis registry] in December 2011, was 18.59. Time to occurrence of peritonitis from the start of peritoneal exchange was 15.44±10 months. The mean age of our patients was 49.1±16.8 years (range 10–80 years), with a male-to-female ratio of 1.66. In addition, 22% of our patients were diabetic. The mean follow-up on PD was 22.6±14 months. Abdominal pain was present in 79% of cases. Fever and vomiting were noted in 42% and 38% of cases, respectively. The C-reactive protein level was elevated in 77% of cases, and leukocytosis was found in 27% of cases. Bacteriological proof was obtained in 73.68% of cases. Gram-positive cocci were involved in 56.6% of microbiologically proven IP cases; Gram-negative bacilli accounted for 37.7%. The outcome was favorable in 89.4% of cases. The PD catheter was removed in 2.63% of cases, and 7.89% of our patients were transferred to hemodialysis.
The rate of IP remains high in our series. More than one-half of the peritonitis episodes with positive cultures (56.6%) were caused by Gram-positive cocci, with Gram-negative bacilli ranking second (37.7%). These results agree with data in the literature. Moreover, the rate of culture-negative IP in our series is high (26%). The outcome was good in most cases (89%).
Despite the gradual decrease in its rate, peritonitis remains frequent in our center and calls for optimization of preventive measures. The high frequency of culture-negative IP in our study urges better collaboration with microbiologists to target antibiotic therapy and improve IP management.
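The registry-style peritonitis rate reported above (patient-months of PD follow-up per episode, in the RDPLF convention) reduces to a simple ratio; the sketch below is illustrative, and the figures used in the example are hypothetical rather than study data:

```python
def peritonitis_rate(patient_months, episodes):
    """Peritonitis rate in the registry convention used in the abstract:
    patient-months of PD follow-up per peritonitis episode
    (a higher value means fewer episodes per unit of follow-up)."""
    if episodes == 0:
        return float("inf")  # no episodes observed
    return patient_months / episodes
```

For instance, a hypothetical program accumulating 1,440 patient-months with 80 episodes would report a rate of one episode per 18 patient-months.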
peritoneal dialysis; infectious peritonitis; hemodialysis
Chronic allograft nephropathy (CAN) is the leading cause of late allograft loss after renal transplantation (RT) and remains an unresolved problem. A rat model of CAN was first described in 1969 by White et al. Although the rat model of RT can be technically challenging, it is attractive because the pathogenesis of CAN is similar to that following human RT, and the pathological features of CAN develop within months, as compared with years in human RT. The rat model of RT is thus considered a useful investigational tool in experimental transplantation research. We have reviewed the literature on rat RT studies, focusing on the donor and recipient strain combinations used and the resultant survival and histological outcomes. Several combinations of inbred and outbred rat strains have been used to investigate multiple aspects of transplantation, including acute rejection, cellular and humoral rejection mechanisms and their treatment, CAN, and potential targets for its prevention.
interventions; therapy; late allograft loss; renal transplantation
The study aimed to test the hypothesis that therapeutic treatment with a mammalian target of rapamycin complex 1 inhibitor reduces renal cell proliferation and attenuates glomerular and tubulointerstitial injury in the early phase of nephrotoxic serum nephritis (NSN) in rats.
Male Wistar-Kyoto rats received a single tail-vein injection of sheep anti-rat glomerular basement membrane serum (day 0) and were treated with vehicle or sirolimus (0.25 mg/kg/day by subcutaneous injection) from day 1 until day 14.
Treatment with sirolimus attenuated kidney enlargement by 41% (P<0.05), improved endogenous creatinine clearance by 50% (P<0.05), and reduced glomerular and tubulointerstitial cell proliferation by 53% and 70%, respectively (P<0.05 compared to vehicle), in rats with NSN. In glomeruli, sirolimus reduced segmental fibrinoid necrosis by 69%, autologous rat immunoglobulin G deposition, glomerular capillary tuft enlargement, and periglomerular myofibroblast (α-smooth muscle actin-positive cells) accumulation (all P<0.05) but did not significantly affect glomerular crescent formation (P=0.15), macrophage accumulation (P=0.25), or the progression of proteinuria. In contrast, sirolimus preserved tubulointerstitial structure and attenuated all markers of injury (interstitial ED-1- and α-smooth muscle actin-positive cells and tubular vimentin expression; all P<0.05). By immunohistochemistry and Western blot analysis, sirolimus reduced the glomerular and tubulointerstitial expression of phosphorylated (Ser 235/236) S6 ribosomal protein (P<0.05).
Induction monotherapy with sirolimus suppressed target of rapamycin complex 1 activation, renal cell proliferation, and injury during the early stages of rodent NSN, but the degree of histological protection was more consistent in the tubulointerstitium than the glomerular compartment.
glomerulonephritis; proliferation; crescentic; rapamycin; inflammation; kidney
Cystinosis is an autosomal recessive inherited lysosomal storage disease. It is characterized by generalized proximal tubular dysfunction known as renal Fanconi syndrome and causes end-stage renal disease by the age of about 10 years if left untreated. Extrarenal organs are also affected, including the thyroid gland, gonads, pancreas, liver, muscle, and brain. Treatment consists of administration of cysteamine, resulting in depletion of the cystine that is trapped inside the lysosomes. Because cysteamine has a short half-life, it must be administered every 6 hours. Recently, a new delayed-release formulation was marketed that is administered every 12 hours. The first studies comparing the two cysteamine formulations show comparable results regarding white blood cell cystine depletion (which serves as a measure of cystine accumulation in the body), while a slightly lower daily dose of cysteamine can be used.
cystinosis; cysteamine; delayed-release; immediate-release
A well-functioning vascular access (VA) is a mainstay of an efficient hemodialysis (HD) procedure. There are three main types of access: the native arteriovenous fistula (AVF), the arteriovenous graft, and the central venous catheter (CVC). The AVF, first described by Brescia and Cimino, remains the first choice for chronic HD. It is the best access for longevity and has the lowest association with morbidity and mortality, and for this reason AVF use is strongly recommended by guidelines from different countries. Once autogenous options have been exhausted, prosthetic fistulae become the second option for maintenance HD access. CVCs have become an important adjunct in maintaining patients on HD. The preferable locations for insertion are the internal jugular and femoral veins; the subclavian vein is considered the third choice because of its high risk of thrombosis. Complications associated with CVC insertion range from 5% to 19%. Since an increasing number of patients have implanted pacemakers and defibrillators, usually inserted via the subclavian vein and superior vena cava into the right heart, a careful assessment of risks and benefits should be undertaken. Infection is responsible for the removal of about 30%–60% of HD CVCs, and hospitalization rates are higher among patients with CVCs than among those with AVFs. Proper VA maintenance requires the integration of different professionals into a VA team. This team should include a nephrologist, radiologist, vascular surgeon, infectious disease consultant, and members of the dialysis staff, who should pool their experience in order to offer uremic patients the best options and the best care for their VA.
arteriovenous fistula; prosthetic grafts; central venous catheter; infection
The purpose of this study was to conduct a retrospective analysis of serum phosphate level variability in patients new to hemodialysis (HD) and to identify patient characteristics associated with this variability. The medical records of 47,742 incident HD patients attending US outpatient dialysis centers between January 1, 2006 and March 31, 2009 were analyzed. Monthly mean serum phosphate levels determined over a 6-month evaluation period (months 4–9 after HD initiation) were assigned to one of three strata: low (<1.13 mmol/L [<3.5 mg/dL]); target (1.13–1.78 mmol/L [3.5–5.5 mg/dL]); or high (>1.78 mmol/L [>5.5 mg/dL]). Patients were classified into one of six serum phosphate variability groups based on variability among monthly mean phosphate levels over the 6-month evaluation period: consistently target; consistently high; high-to-target; high-to-low; target-to-low; or consistently low. Only 15% of patients (consistently target group) maintained monthly mean serum phosphate levels within the target range throughout the 6-month evaluation period. Age, Charlson comorbidity index, serum phosphate, and intact parathyroid hormone levels prior to HD initiation were strongly associated (P<0.001) with serum phosphate levels after HD initiation. Overall patient-reported phosphate binder usage increased from 35% at baseline to 52% at end of study. The low proportion of patients achieving target phosphate levels and low rates of phosphate binder usage observed during the study suggest that alternative strategies could be developed to control serum phosphate levels. Possible strategies that might be incorporated to help improve the management of hyperphosphatemia in incident HD patients include dietary modification, dialysis optimization, and earlier and sustained use of phosphate binders.
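The stratification and grouping scheme above can be sketched in code. The thresholds are the values quoted in the text; the six-group assignment rule below is a simplified interpretation (based on each patient's first and last monthly stratum), not the study's exact algorithm.

```python
# Assign monthly mean serum phosphate levels (mmol/L) to the strata quoted in
# the text, then bucket a patient by their strata over the 6-month window.
# The six-group rule here is a simplified interpretation, not the study's own.

def stratum(phosphate_mmol_l):
    """Classify one monthly mean phosphate level."""
    if phosphate_mmol_l < 1.13:       # <3.5 mg/dL
        return "low"
    if phosphate_mmol_l <= 1.78:      # 3.5-5.5 mg/dL
        return "target"
    return "high"                     # >5.5 mg/dL

def variability_group(monthly_means):
    """Map 6 monthly mean levels to one of the study's variability groups."""
    strata = [stratum(p) for p in monthly_means]
    for s in ("target", "high", "low"):
        if all(x == s for x in strata):
            return f"consistently {s}"
    first, last = strata[0], strata[-1]
    transitions = {
        ("high", "target"): "high-to-target",
        ("high", "low"): "high-to-low",
        ("target", "low"): "target-to-low",
    }
    # Patterns this simplified rule cannot resolve fall through as "other".
    return transitions.get((first, last), "other")

# Example: a patient starting above target and settling into range.
group = variability_group([1.9, 1.85, 1.7, 1.6, 1.5, 1.4])  # "high-to-target"
```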
chronic kidney disease; end-stage renal disease; hyperphosphatemia
Predicting the timing and number of end-stage renal disease (ESRD) cases from a population of individuals with pre-ESRD chronic kidney disease (CKD) has not previously been reported. The objective of this study was to predict the timing and number of cases of ESRD occurring over the lifetime of a cohort of hypothetical CKD patients in the US based on a range of baseline estimated glomerular filtration rate (eGFR) values and varying rates of eGFR decline.
A three-state Markov model – functioning kidney, ESRD, and death – with an annual cycle length is used to project the effect of changes in baseline eGFR on long-term health outcomes in a hypothetical cohort of CKD patients. Using published eGFR-specific risk equations and adjusting for predictive characteristics, the probability of ESRD (eGFR <10 mL/minute), time to death, and incremental cost-effectiveness ratios for hypothetical treatments (costing US$10, $5, and $2/day) are projected over the cohort’s lifetime under two scenarios: an acute drop in eGFR (mimicking acute kidney injury) and a reduced hazard ratio for ESRD (mimicking an effective intervention).
Among CKD patients aged 50 years, an acute eGFR decrement from 45 mL/minute to 35 mL/minute yields decreases of 1.6 life-years, 1.5 quality-adjusted life-years (QALYs), 0.8 years until ESRD, and an increase of 183 per 1,000 progressing to ESRD. Among CKD patients aged 60 years, lowering the hazard ratio of ESRD to 0.8 yields values of 0.2, 0.2, 0.2, and 46 per 1,000, respectively. Incremental cost-effectiveness ratios are higher (ie, less favorable) for higher baseline eGFR, indicating that interventions occurring later in the course of disease are more likely to be economically attractive.
Both acute kidney injury and slowing the rate of eGFR decline produce substantial shifts in expected numbers and timing of ESRD among CKD patients. This model is a useful tool for planning management of CKD patients.
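The three-state structure described above can be sketched as a simple cohort simulation. The transition probabilities below are hypothetical placeholders, not the published eGFR-specific risk equations the study actually uses.

```python
# Minimal three-state Markov cohort model (functioning kidney, ESRD, death)
# with an annual cycle, as described above. Transition probabilities are
# illustrative placeholders, NOT the study's eGFR-specific risk equations.

def run_cohort(n_cycles, p_kidney_to_esrd, p_kidney_to_death, p_esrd_to_death):
    """Return the fraction of the cohort in each state after each annual cycle."""
    state = {"kidney": 1.0, "esrd": 0.0, "death": 0.0}
    history = [dict(state)]
    for _ in range(n_cycles):
        new_esrd = state["kidney"] * p_kidney_to_esrd
        deaths = (state["kidney"] * p_kidney_to_death
                  + state["esrd"] * p_esrd_to_death)
        state = {
            "kidney": state["kidney"] - new_esrd
                      - state["kidney"] * p_kidney_to_death,
            "esrd": state["esrd"] + new_esrd
                    - state["esrd"] * p_esrd_to_death,
            "death": state["death"] + deaths,
        }
        history.append(dict(state))
    return history

# Example: 30 annual cycles with placeholder probabilities; an intervention
# "lowering the hazard" would simply scale p_kidney_to_esrd.
trace = run_cohort(30, p_kidney_to_esrd=0.02,
                   p_kidney_to_death=0.03, p_esrd_to_death=0.15)
```

Outputs such as expected life-years or cumulative incident ESRD per 1,000 patients follow by summing over the trace; the study's published results come from the full risk-equation-driven model, not this sketch.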
epidemiology; decision model; policy analysis; cost effectiveness; acute kidney injury; disease progression; end-stage renal disease
Hereditary leiomyomatosis and renal cell carcinoma (HLRCC) is an autosomal-dominant hereditary syndrome caused by germline mutations in the FH gene, which encodes the tricarboxylic acid cycle enzyme fumarate hydratase (FH). HLRCC patients are predisposed to develop cutaneous leiomyomas; multiple symptomatic uterine fibroids in young women, resulting in early hysterectomies; and early-onset renal tumors with a type 2 papillary morphology that can progress and metastasize even when small. Since HLRCC-associated renal tumors can be more aggressive than renal tumors in other hereditary renal cancer syndromes, caution is warranted, and surgical intervention is recommended rather than active surveillance. At-risk members of an HLRCC family who test positive for the familial germline FH mutation should undergo surveillance by annual magnetic resonance imaging from the age of 8 years. Biochemical studies have shown that FH-deficient kidney cancer is characterized by a metabolic shift to aerobic glycolysis. It is hoped that ongoing clinical trials evaluating targeted molecular therapies will yield an effective form of treatment for HLRCC-associated kidney cancer that offers affected individuals an improved prognosis.
hereditary leiomyomatosis and renal cell carcinoma; HLRCC; FH mutation; type 2 papillary RCC