Background & Aims
Recently, an association was demonstrated between the single nucleotide polymorphism (SNP) rs11209026, within the interleukin-23 receptor (IL23R) locus, and Crohn’s disease (CD) in a genome-wide association study of this disease in adults. We examined the effects of this and other previously reported SNPs at this locus on CD in children.
Utilizing data from our ongoing genome-wide association study in our cohort of 142 pediatric CD cases and 281 matched controls, we investigated the association of the previously reported SNPs at the IL23R locus with the childhood form of this disease.
The minor allele frequency (MAF) of rs11209026 was 1.75% in cases and 6.61% in controls, yielding a protective odds ratio (OR) of 0.25 (95% CI, 0.10–0.65; one-sided Fisher’s exact P = 9.2×10−4). Of all the SNPs previously reported, rs11209026 was the most strongly associated. A subsequent family-based association test (which is more resistant to population stratification) with 65 sets of trios derived from our initial patient cohort yielded significant association with rs11209026 in a transmission disequilibrium test (one-sided P=0.0017). In contrast, no association was detected between the CARD15 gene and the IBD phenotype.
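The reported OR and CI can be reproduced from the summary statistics above. The sketch below is illustrative only: the allele counts are reconstructed from the reported MAFs and sample sizes (142 cases → 284 alleles, ~5 minor; 281 controls → 562 alleles, ~37 minor), and a Wald (log-normal) approximation is used for the 95% CI rather than the exact method used in the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 allele table:
    a = minor alleles in cases, b = major alleles in cases,
    c = minor alleles in controls, d = major alleles in controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Allele counts reconstructed from the reported MAFs (an assumption):
# cases: 5 of 284 alleles minor (~1.75%); controls: 37 of 562 (~6.61%).
or_, lo, hi = odds_ratio_ci(5, 279, 37, 525)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.25 0.1 0.65
```

This recovers OR = 0.25 (0.10–0.65), matching the abstract, which suggests the reconstructed counts are close to the actual ones.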
The OR of the IL23R variant in our pediatric study is highly comparable with that reported previously in a non-Jewish adult IBD case-control cohort (OR=0.26). As such, variants in the IL23R gene confer a similar magnitude of risk of CD in children as in their adult counterparts.
IL23R; gene; association; Crohn’s Disease
Background & Aims
Bowel perforation is a rare but serious complication of colonoscopy. Its prevalence is increasing with the rapidly growing volume of procedures performed. Although colonoscopies have been performed for decades, the risk factors for perforation are not completely understood. We investigated risk factors for perforation during colonoscopy, assessing variables that included sedation type and endoscopist specialty and level of training.
We performed a retrospective multivariate analysis of risk factors for early perforation (occurring at any point during the colonoscopy but recognized during or immediately after the procedure) in adult patients using the Clinical Outcomes Research Initiative National Endoscopic Database. Risk factors were determined from published articles. Additional variables assessed included endoscopist specialty and years of experience, trainee involvement, and sedation with propofol.
We identified 192 perforation events during 1,144,900 colonoscopies from 85 centers entered into the database from January 2000 through March 2011. On multivariate analysis, increasing age, American Society of Anesthesiologists (ASA) class, female sex, hospital setting, any therapy, and polyps >10 mm were significantly associated with increased risk of early perforation. Colonoscopies performed by surgeons and endoscopists of unknown specialty had higher rates of perforation than those performed by gastroenterologists (odds ratio, 2.00; 95% confidence interval, 1.30–3.08). Propofol sedation did not significantly affect risk for perforation.
In addition to previously established risk factors, non-gastroenterologist specialty was found to affect risk for perforations detected during or immediately after colonoscopy. This finding could result from differences in volume and style of endoscopy training. Further investigation into these observed associations is warranted.
ASA classification; GI; intestine; quality control; endoscopy training
Diarrheal diseases remain a leading cause of morbidity and mortality for children in developing countries while representing an important cause of morbidity worldwide. The WHO-recommended low-osmolarity oral rehydration solution plus zinc saves lives in patients with acute diarrhea, but there are no approved, safe drugs that have been shown to be effective against most causes of acute diarrhea. Identification of abnormalities in electrolyte handling by the intestine in diarrhea, including increased intestinal anion secretion and reduced Na+ absorption, suggests a number of potential drug targets. This is based on the view that successful drug therapy for diarrhea will result from correcting the abnormalities in electrolyte transport that are pathophysiologic for diarrhea. We review the molecular mechanisms of physiologic regulation of intestinal ion transport, the changes that occur in diarrhea, and the status of drugs being developed to correct the abnormalities in Na+ absorption that occur in diarrhea. Mechanisms of Cl− secretion and approaches to anti-Cl− secretory therapies of diarrhea are discussed in a companion review.
In patients with appropriate indications, performance of both colonoscopy and esophagogastroduodenoscopy (EGD) at the same time (bundling) is convenient for patients, efficient for providers, and cost saving for the healthcare system. However, Medicare reimbursement for bundled procedures is at a rate that is less than the sum of the two procedures when charged separately; this may create a disincentive to bundle. The practice patterns of bundling are unknown at a US population-based level.
We examined 2007 to 2009 Medicare claims from the Carrier file in a national, random sample of fee-for-service beneficiaries aged 66 and older. We identified patients who had both a colonoscopy and EGD performed within 180 days of each other and calculated the proportions of patients with both procedures bundled on the same date, within 1 to 30 days, and within 31 to 180 days of each other. We compared patients in these 3 groups for demographics and clinical indications for the procedures (bleeding, lower or upper gastrointestinal [GI] symptoms, surveillance, and screening).
We identified 12,982 Medicare-enrolled individuals who had a colonoscopy and an EGD performed within 180 days of each other. Approximately 35% of procedures were not bundled on the same day: 2,359 (18%) were performed within 30 days of each other, and 2,219 (17%) within 31 to 180 days. There were marked geographic differences in the percentage of bundling, with the lowest in the Northeast and the highest in the West. Patients with bundled procedures were more likely to have GI bleeding and less likely to have screening or surveillance indications.
Although same-day bundling of endoscopic procedures offers a number of advantages, it is not practiced in more than one third of cases in a national sample of Medicare beneficiaries.
Colonic diverticula are common in developed countries and complications of colonic diverticulosis are responsible for a significant burden of disease. Several recent publications have called into question long held beliefs about diverticular disease. Contrary to conventional wisdom, studies have not shown that a high fiber diet protects against asymptomatic diverticulosis. The risk of developing diverticulitis among individuals with diverticulosis is lower than the 10–25% commonly quoted, and may be as low as 1% over 11 years. Nuts and seeds do not increase the risk of diverticulitis or diverticular bleeding. It is unclear whether diverticulosis, absent diverticulitis or overt colitis, is responsible for chronic gastrointestinal symptoms or worse quality of life. The role of antibiotics in acute diverticulitis has been challenged by a large randomized trial that showed no benefit in selected patients. The decision to perform elective surgery should be made on a case-by-case basis and not routinely after a second episode of diverticulitis, when there has been a complication, or in young people. A colonoscopy should be performed to exclude colon cancer after an attack of acute diverticulitis but may not alter outcomes among individuals who have had a colonoscopy prior to the attack. Given these surprising findings, it is time to reconsider conventional wisdom about diverticular disease.
Background & Aims
An association between inflammatory activity and colorectal neoplasia (CRN) has been documented in patients with ulcerative colitis (UC). However, previous studies did not address the duration of inflammation or the effects of therapy on risk for CRN. We investigated the effects of inflammation, therapies, and characteristics of patients with UC on their risk for CRN.
We collected data from 141 patients with UC without CRN (controls), and 59 matched patients with UC who developed CRN (cases), comparing disease extent and duration and patients’ ages. We used a new 6-point histologic inflammatory activity (HIA) scale to score biopsy fragments (n=4449). Information on medications, smoking status, primary sclerosing cholangitis (PSC), and family history of CRN were collected from the University of Chicago Inflammatory Bowel Disease Endoscopy Database. Relationships between HIA, clinical features, and CRN were assessed by conditional logistic regression.
Cases and controls were similar in numbers of procedures and biopsies, exposure to steroids or mesalamine, smoking status, and family history of CRN. They differed in proportion of men vs women, exposure to immune modulators, and PSC prevalence. In univariate analysis, HIA was positively associated with CRN (odds ratio [OR], 2.56 per unit increase; P=.001), whereas immune modulators (including azathioprine, 6-mercaptopurine, and methotrexate) reduced the risk for CRN (OR, 0.35; P<.01). HIA was also associated with CRN in multivariate analysis (OR, 3.68; P=.001).
In a case–control study, we associated increased inflammation with CRN in patients with UC. Use of immune modulators reduced the risk for CRN, indicating that these drugs have chemoprotective effects. Based on these data, we propose new stratified surveillance and treatment strategies to prevent and detect CRN in patients with UC.
IBD; AZA; 6MP; 5-ASA; colorectal cancer; chemoprevention; dysplasia
A 62-year-old man with cirrhosis secondary to hepatitis C and chronic alcohol abuse is admitted to the intensive care unit with hematemesis and mental status changes. Physical examination reveals ascites and stigmata of chronic liver disease. Blood pressure is noted as 87/42 mm Hg, while laboratory studies show a serum creatinine level of 0.8 mg/dL, estimated glomerular filtration rate (GFR) of 84 ml/min/1.73 m2 calculated using the Modification of Diet in Renal Disease (MDRD) Study equation, serum sodium 123 mEq/L, total serum bilirubin 4.3 mg/dL, and international normalized ratio (INR) 1.6. The patient is resuscitated with packed red blood cells and fresh frozen plasma and bleeding is controlled. However, on the third day of admission, creatinine rises to 1.5 mg/dL. Examination of urine sediment reveals 1–5 bilirubin-stained granular casts per high-power field and a few renal tubular epithelial cells. Urine sodium is 21 mEq/L and fractional excretion of sodium (FENa) is 0.43%.
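The FENa quoted in the vignette follows from the standard formula FENa = (UNa × SCr) / (SNa × UCr) × 100. A minimal sketch: the urine creatinine is not stated in the vignette, so the value of 60 mg/dL below is purely hypothetical, chosen to show the arithmetic is consistent with the reported 0.43%.

```python
def fena(urine_na, serum_na, urine_cr, serum_cr):
    """Fractional excretion of sodium (%): (UNa * SCr) / (SNa * UCr) * 100."""
    return (urine_na * serum_cr) / (serum_na * urine_cr) * 100

# From the vignette: urine Na 21 mEq/L, serum Na 123 mEq/L, serum Cr 1.5 mg/dL.
# Urine creatinine is NOT given; 60 mg/dL is an assumed illustrative value.
print(round(fena(21, 123, 60, 1.5), 2))  # 0.43
```

A FENa below 1% in this setting is what makes the distinction between hepatorenal physiology and acute tubular injury clinically difficult, which is the point of the vignette.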
Background & Aims
Recent studies have shown geographic and seasonal variation in hospital admissions for diverticulitis. Because this variation parallels differences in ultraviolet light exposure, the most important contributor to vitamin D status, we examined the association of pre-diagnostic serum levels of vitamin D with diverticulitis.
Among patients within the Partners Healthcare System who had blood drawn and serum levels of 25-hydroxyvitamin D (25-[OH]D) measured from 1993 through 2012, we identified 9116 patients with uncomplicated diverticulosis and 922 patients who developed diverticulitis that required hospitalization. We used multivariate logistic regression to estimate relative risks (RRs) and 95% confidence intervals (CIs) to compare serum 25(OH)D levels between these groups.
Patients with uncomplicated diverticulosis had significantly higher mean pre-diagnostic serum levels of 25(OH)D (29.1 ng/mL) than patients with diverticulitis that required hospitalization (25.3 ng/mL; P<.0001). Compared to patients in the lowest quintile of 25(OH)D, the multivariate-adjusted RR for diverticulitis hospitalization was 0.49 (95% CI, 0.38–0.62; P for trend <.0001) among patients in the highest quintile of 25(OH)D. Compared to patients with uncomplicated diverticulosis, the mean level of 25(OH)D was significantly lower for patients with acute diverticulitis without other sequelae (25.9 ng/mL; P<.0001; n=594), for patients with diverticulitis with abscess (25.8 ng/mL; P=.0095; n=124), for patients with diverticulitis requiring emergent laparotomy (22.7 ng/mL; P=.002; n=65), and for patients with recurrent diverticulitis (23.5 ng/mL; P<.0001; n=139).
Among patients with diverticulosis, higher pre-diagnostic levels of 25(OH)D are significantly associated with a lower risk of diverticulitis. These data indicate that vitamin D deficiency could be involved in the pathogenesis of diverticulitis.
Vitamin D; diverticulitis; diverticulosis; risk factor; epidemiology
Background & Aims
Asymptomatic diverticulosis is commonly attributed to constipation secondary to a low-fiber diet, although evidence for this mechanism is limited. We examined the associations between constipation and low dietary fiber intake with risk of asymptomatic diverticulosis.
We performed a cross-sectional study, analyzing data from 539 individuals with diverticulosis and 1569 without (controls). Participants underwent colonoscopy and assessment of diet, physical activity, and bowel habits. We limited our analysis to participants with no knowledge of their diverticular disease, to reduce the risk of biased responses.
Constipation was not associated with an increased risk of diverticulosis. Participants with less frequent bowel movements (BM: <7/wk) had reduced odds of diverticulosis compared to those with regular (7/wk) BM (odds ratio [OR] 0.56, 95% confidence interval [CI], 0.40–0.80). Those reporting hard stools also had a reduced odds (OR, 0.75; 95% CI, 0.55–1.02). There was no association between diverticulosis and straining (OR, 0.85; 95% CI, 0.59–1.22) or incomplete BM (OR, 0.85; 95% CI, 0.61–1.20). We found no association between dietary fiber intake and diverticulosis (OR, 0.96; 95% CI, 0.71–1.30) in comparing the highest quartile to the lowest (mean intake 25 versus 8 g/day).
In our cross-sectional, colonoscopy-based study, neither constipation nor a low-fiber diet was associated with an increased risk of diverticulosis.
diverticular disease; risk factors; database analysis
Background & Aims
There is uncertainty about the efficacy and safety of treatment for hepatitis C virus (HCV) infection in patients with inflammatory bowel disease (IBD). IBD can become exacerbated during treatment with interferon (IFN) and serious adverse events, such as pancytopenia or hepatotoxicity, can be compounded by drug interactions. We investigated the risk of exacerbation of IBD during HCV therapy and the rate of adverse effects of concomitant therapy for HCV and IBD. We also evaluated the efficacy of HCV treatment in the IBD population.
We conducted a retrospective review of all patients who underwent IFN-based treatment for HCV at the Mayo Clinic in Rochester, Minnesota from 2001 through 2012. Exacerbation of IBD was evaluated by clinical, endoscopic, and histologic parameters during antiviral therapy and the ensuing 12 months. Hematologic toxicity was assessed by levels of all 3 cell lineages at baseline and during therapy. Efficacy of antiviral treatment was assessed by serum levels of HCV RNA until 24 weeks after completion of therapy. We also conducted a detailed Medline database search and reviewed the literature on this topic.
We identified 15 subjects with concomitant IBD (8 with ulcerative colitis and 7 with Crohn’s disease). Only 1 patient experienced an exacerbation of the disease during therapy; symptoms were controlled with mesalamine enemas. Another patient developed a flare shortly after completing antiviral therapy; symptoms returned spontaneously to baseline 2 weeks later. All subjects experienced an anticipated degree of pancytopenia while on IFN-based therapy. The rate of sustained virologic response was 67%. A concise review of available literature regarding the safety and efficacy of HCV treatment in IBD patients is also presented; although limited, the published data appear to support the safety of treatment with IFN in patients whose IBD is under control.
In conjunction with data from the literature, our findings indicate that the efficacy and safety of HCV therapy with IFN and ribavirin for patients with IBD are comparable to those of subjects without IBD.
UC; CD; SVR; inflammation
Background & Aims
Liver stiffness measurement (LSM), using elastography, can independently predict outcomes of patients with chronic liver diseases (CLDs). However, there is much variation in reporting and consistency of findings. We performed a systematic review and meta-analysis to evaluate the association between LSM and outcomes of patients with CLDs.
We performed a systematic review of the literature, through February 2013, for studies that followed up patients with CLDs prospectively for at least 6 months and reported the association between baseline LSM and subsequent development of decompensated cirrhosis or hepatocellular carcinoma (HCC), as well as mortality. Summary relative risk (RR) estimates per unit of LSM and 95% confidence intervals (CIs) were estimated using the random effects model.
Our final analysis included 17 studies, reporting on 7058 patients with CLDs. Baseline LSM was associated significantly with risk of hepatic decompensation (6 studies; RR, 1.07; 95% CI, 1.03–1.11), HCC (9 studies; RR, 1.11; 95% CI, 1.05–1.18), death (5 studies; RR, 1.22; 95% CI, 1.05–1.43), or a composite of these outcomes (7 studies; RR, 1.32; 95% CI, 1.16–1.51). We observed considerable heterogeneity among studies—primarily in the magnitude of effect, rather than the direction of effect. This heterogeneity could not be explained by variations in study locations, etiologies and stages of CLD, techniques to measure liver stiffness, adjustment for covariates, or method of imputing relationship in the meta-analysis.
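The pooled RRs above come from a random-effects model. A minimal DerSimonian–Laird sketch is shown below; the per-study log-RRs and standard errors are made up for illustration and are not the 17 studies in this meta-analysis.

```python
import math

def dersimonian_laird(log_rrs, ses):
    """Pool per-study log relative risks with DerSimonian-Laird random effects."""
    w = [1 / se**2 for se in ses]                          # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (yi - y_fixed)**2 for wi, yi in zip(w, log_rrs))  # Q statistic
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)          # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]            # random-effects weights
    y = sum(wi * yi for wi, yi in zip(w_star, log_rrs)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return (math.exp(y),
            math.exp(y - 1.96 * se_pooled),
            math.exp(y + 1.96 * se_pooled))

# Hypothetical per-study log-RRs per unit of liver stiffness and their SEs:
rr, lo, hi = dersimonian_laird([0.05, 0.09, 0.12, 0.07], [0.02, 0.03, 0.04, 0.025])
print(rr, lo, hi)  # pooled RR per unit LSM with 95% CI
```

When between-study heterogeneity (tau2) is large, the random-effects CI widens relative to a fixed-effect analysis, which matches the authors' observation that heterogeneity affected the magnitude rather than the direction of effect.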
Based on a meta-analysis of cohort studies, the degree of liver stiffness is associated with risk of decompensated cirrhosis, HCC, and death in patients with CLDs. LSM therefore might be used in risk stratification.
Elastography; Prognosis; Outcomes; Cirrhosis; Cancer
Background & Aims
US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE is conventionally monitored via white-light endoscopy (WLE) and collection of random biopsies. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase detection of dysplasia and cancer. We investigated whether these can increase the diagnostic yield for detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsies.
We performed a systematic review, using Medline and Embase to identify relevant peer-reviewed studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired-risk difference (RD), defined as the difference in yield of detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired-RD and 95% confidence interval (CI) were obtained using random effects models. Heterogeneity was assessed by means of the Q statistic and I2 statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or sources of heterogeneity.
Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%–56%; P<.0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased diagnostic yield (RD=0.34; 95% CI, 0.14–0.56; P<.0001). The RD for chromoendoscopy was 0.35 (0.13–0.56; P=.0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on Student t test analysis (P=.45).
Based on a meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase diagnostic yield for identification of dysplasia or cancer in patients with BE.
Barrett's esophagus; PRISMA; QUADAS; advanced imaging; risk difference; esophageal adenocarcinoma
Background & Aims
Little is known about the effects of family history of hepatocellular carcinoma (HCC) on hepatitis B progression or risk of HCC. We examined how family HCC history and presence or stage of hepatitis B virus (HBV) infection affect risk for HCC.
We performed a population-based cohort study of 22,472 participants from 7 townships in Taiwan who underwent evaluation for liver disease from 1991 through 1992. Those who received a first diagnosis of HCC from January 1, 1991, to December 31, 2008, were identified from the Taiwanese cancer registry.
There were 374 cases of incident HCC over 362,268 person-years of follow-up evaluation. The cumulative risk of HCC was 0.62% in hepatitis B surface antigen (HBsAg)-seronegative patients without a family history of HCC, 0.65% in HBsAg-seronegative patients with a family history, 7.5% in HBsAg-seropositive patients without a family history, and 15.8% in HBsAg-seropositive patients with a family history (P < .001). The multivariate-adjusted hazard ratio for HBsAg-seropositive individuals with family history, compared with HBsAg-seronegative individuals without a family history of HCC, was 32.33 (95% confidence interval, 20.8–50.3; P < .001). The relative excess risk owing to interaction was 19, the attributable proportion was 0.59, and the synergy index value was 2.54. These findings indicate synergy between family HCC history and HBsAg serostatus. The synergy between these factors remained significant in stratification analyses by HBeAg serostatus and serum level of HBV DNA.
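The additive-interaction measures quoted here (relative excess risk owing to interaction [RERI], attributable proportion [AP], and synergy index [S]) follow from standard formulas. The abstract does not report the single-exposure hazard ratios, so the values 13.0 (HBsAg alone) and 1.33 (family history alone) below are hypothetical, chosen only so that, together with the reported joint HR of 32.33, they reproduce the published measures.

```python
def additive_interaction(hr_both, hr_a_only, hr_b_only):
    """Additive-interaction measures from hazard ratios (reference group HR = 1):
    RERI = HR11 - HR10 - HR01 + 1
    AP   = RERI / HR11
    S    = (HR11 - 1) / (HR10 + HR01 - 2)"""
    reri = hr_both - hr_a_only - hr_b_only + 1
    ap = reri / hr_both
    s = (hr_both - 1) / (hr_a_only + hr_b_only - 2)
    return reri, ap, s

# HR for both exposures (32.33) is from the abstract;
# the two single-exposure HRs are assumed values for illustration.
reri, ap, s = additive_interaction(32.33, 13.0, 1.33)
print(round(reri), round(ap, 2), round(s, 2))  # 19 0.59 2.54
```

RERI > 0, AP > 0, and S > 1 all indicate positive interaction on the additive scale, which is what the abstract means by "synergy" between family history and HBsAg serostatus.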
Family history of HCC multiplies the risk of HCC at each stage of HBV infection. Patients with a family history of HCC require more intensive management of HBV infection and surveillance for liver cancer.
Cirrhosis; Liver Disease; Epidemiology; Cancer Risk Factor
Ulcerative colitis is a chronic inflammatory disease of the colon; as many as 25% of patients with this disease require hospitalization. The goals of hospitalization are to assess disease severity, exclude infection, administer rapidly acting and highly effective medication regimens, and determine response. During the hospitalization, patients should be given venous thromboembolism prophylaxis and monitored for development of toxic megacolon. Patients who do not respond to intravenous corticosteroids should be considered for rescue therapy with infliximab or cyclosporine. Patients who are refractory to medical therapies or who develop toxic megacolon should be evaluated promptly for colectomy. Patients who do respond to medical therapies should be discharged on an appropriate maintenance regimen when they meet discharge criteria. We review practical evidence-based management principles and propose a day-by-day algorithm for managing patients hospitalized for ulcerative colitis.
Background & Aims
Clostridium difficile infection (CDI) can cause life-threatening complications. Severe complicated CDI is characterized by hypotension, shock, sepsis, ileus, megacolon, and colon perforation. We created a model to identify clinical factors associated with severe complicated CDI.
We analyzed data from 1446 inpatient cases of CDI (48.6% female, median age 62.5 y, range 0.1–103.7 y) at the Mayo Clinic from June 28, 2007 through June 25, 2010. Patients with severe complicated CDI (n=487) were identified as those who required admission to the intensive-care unit (ICU) or colectomy, or died, within 30 days of CDI diagnosis. Logistic regression models were used to identify variables that were independently associated with the occurrence of severe complicated CDI in 2 cohorts. One cohort comprised all hospitalized patients; the other comprised a subset of these inpatients who were residents of Olmsted County, MN, to assess the association of comorbid conditions with the development of severe complicated infection in a population-based cohort. The linear combinations of variables identified using logistic regression models provided scores to predict the risk of developing severe-complicated CDI.
In a multivariable model that included all inpatients, increasing age, leukocyte count >15×109/L, increase in serum level of creatinine >1.5-fold from baseline, and use of proton pump inhibitors or narcotic medications were independently associated with severe complicated CDI. In the secondary analysis, which included only patients from Olmsted County, comorbid conditions were not significantly associated with severe complicated CDI.
Older age, high numbers of leukocytes in blood samples, an increased serum level of creatinine, gastric acid suppression, and use of narcotic medications were independently associated with development of severe complicated CDI in hospitalized patients. Early aggressive monitoring and intervention could improve outcomes.
database analysis; antibiotic resistant bacteria; PPI use; risk factor
Background & Aims
RM-131, a synthetic ghrelin agonist, greatly accelerates gastric emptying of solids in patients with type 2 diabetes and delayed gastric emptying (DGE). We investigated the safety and effects of a single dose of RM-131 on gastric emptying and upper gastrointestinal (GI) symptoms in patients with type 1 diabetes and previously documented DGE.
In a double-blind cross-over study, 10 patients with type 1 diabetes (age, 45.7 ± 4.4 y; body mass index, 24.1 ± 1.1 kg/m2) and previously documented DGE were assigned in random order to receive a single dose of RM-131 (100 μg, subcutaneously) or placebo. Thirty minutes later, they ate a radiolabeled solid–liquid meal containing EggBeaters (ConAgra Foods, Omaha, NE), and then underwent 4 hours of gastric emptying and 6 hours of colonic filling analyses by scintigraphy. Upper GI symptoms were assessed using a daily diary, gastroparesis cardinal symptom index (total GCSI-DD) and a combination of nausea, vomiting, fullness, and pain (NVFP) scores (each rated on a 0–5 scale).
At screening, participants' mean level of hemoglobin A1c was 9.1% ± 0.5%; their total GCSI-DD score was 1.66 ± 0.38 (median, 1.71), and their total NVFP score was 1.73 ± 0.39 (median, 1.9). The t1/2 of solid gastric emptying was 84.9 ± 31.6 minutes when subjects were given RM-131 and 118.7 ± 26.7 minutes when they were given placebo. The median difference (Δ) was 33.9 minutes (interquartile range [IQR], −12, −49), or −54.7% (IQR, −21%, 110%). RM-131 decreased gastric retention of solids at 1 hour (P = .005) and 2 hours (P = .019). Numeric differences in t1/2 for gastric emptying of liquids, solid gastric emptying lag time, and colonic filling at 6 hours were not significant. Total GCSI-DD scores were 0.79 on placebo (IQR, 0.75, 2.08) and 0.17 on RM-131 (IQR, 0.00, 0.67; P = .026); NVFP scores were lower on RM-131 (P = .041). There were no significant adverse effects.
RM-131 significantly accelerates gastric emptying of solids and reduces upper GI symptoms in patients with type 1 diabetes and documented DGE.
Gastroparesis; Prokinetic; Drug; Clinical Trial