Objective To assess risks of mortality associated with use of individual antipsychotic drugs in elderly residents in nursing homes.
Design Population based cohort study with linked data from Medicaid, Medicare, the Minimum Data Set, the National Death Index, and a national assessment of nursing home quality.
Setting Nursing homes in the United States.
Participants 75 445 new users of antipsychotic drugs (haloperidol, aripiprazole, olanzapine, quetiapine, risperidone, ziprasidone). All participants were aged ≥65, were eligible for Medicaid, and lived in a nursing home in 2001-5.
Main outcome measures Cox proportional hazards models were used to compare 180 day risks of all cause and cause specific mortality by individual drug, with propensity score adjustment to control for potential confounders.
Results Compared with risperidone, users of haloperidol had an increased risk of mortality (hazard ratio 2.07, 95% confidence interval 1.89 to 2.26) and users of quetiapine a decreased risk (0.81, 0.75 to 0.88). The effects were strongest shortly after the start of treatment, remained after adjustment for dose, and were seen for all causes of death examined. No clinically meaningful differences were observed for the other drugs. There was no evidence of effect measure modification in those with dementia or behavioural disturbances. There was a dose-response relation for all drugs except quetiapine.
Conclusions Though these findings cannot prove causality, and we cannot rule out the possibility of residual confounding, they provide more evidence of the risk of using these drugs in older patients, reinforcing the concept that they should not be used in the absence of clear need. The data suggest that the risk of mortality with these drugs is generally increased with higher doses and seems to be highest for haloperidol and least for quetiapine.
Weekly bisphosphonates are the primary agents used to treat osteoporosis. Although these agents are generally well tolerated, serious gastrointestinal adverse events, including hospitalization for gastrointestinal bleed, may arise. We compared the gastrointestinal safety between weekly alendronate and weekly risedronate and found no important difference between new users of these agents.
Weekly bisphosphonates are the primary agents prescribed for osteoporosis. We examined the comparative gastrointestinal safety between weekly bisphosphonates.
We studied new users of weekly alendronate and weekly risedronate from June 2002 to August 2005 among enrollees in a state-wide pharmaceutical benefit program for seniors. Our primary outcome was hospitalization for upper gastrointestinal bleed. Secondary outcomes included outpatient diagnoses for upper gastrointestinal disease, symptoms, endoscopic procedures, use of gastroprotective agents, and switching between therapies. We used Cox proportional hazard models to compare outcomes between agents within 120 days of treatment initiation, adjusting for propensity score quintiles. We also examined composite safety outcomes and stratified results by age and prior gastrointestinal history.
A total of 10,420 new users were studied; mean age was 79 years (SD 6.9) and 95% were women. We observed 31 hospitalizations for upper gastrointestinal bleed (0.91 per 100 person-years) within 120 days of treatment initiation. Adjusting for covariates, there was no difference in hospitalization for upper gastrointestinal bleed among those treated with risedronate compared with alendronate (HR, 1.12; 95%CI, 0.55 to 2.28). Risedronate switching rates were lower; otherwise, no differences were observed for secondary or composite outcomes.
We found no important difference in gastrointestinal safety between weekly oral bisphosphonates.
Bisphosphonates; Drug evaluation; Drug safety; Osteoporosis; Population studies; Treatment
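The bleed rate quoted in the abstract above (31 events, 0.91 per 100 person-years) is a simple Poisson rate. As an illustrative sketch only — this is not the authors' code, and the ~3 407 person-year denominator is back-calculated from the abstract's two figures — a log-normal confidence interval for such a rate can be computed as:

```python
import math

def rate_ci(events: int, person_years: float, z: float = 1.96):
    """Incidence rate per 100 person-years with a log-normal Wald CI.

    For a Poisson count, the standard error of log(rate) is 1/sqrt(events),
    so the interval is rate * exp(+/- z / sqrt(events)).
    """
    rate = events / person_years          # events per person-year
    se_log = 1.0 / math.sqrt(events)      # SE of log(rate)
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return tuple(100 * x for x in (rate, lo, hi))  # scale to per 100 py

# 31 bleeds over ~3,407 person-years reproduces the abstract's 0.91/100 py
rate, lo, hi = rate_ci(31, 3407)
```

With only 31 events the interval is wide, which is consistent with the study's inability to exclude modest differences between the two agents.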
Partial seizures produce increased cerebral blood flow in the region of seizure onset. These regional cerebral blood flow increases can be detected by single photon emission computed tomography (ictal SPECT), providing a useful clinical tool for seizure localization. However, when partial seizures secondarily generalize, there are often questions of interpretation since propagation of seizures could produce ambiguous results. Ictal SPECT from secondarily generalized seizures has not been thoroughly investigated. We analysed ictal SPECT from 59 secondarily generalized tonic–clonic seizures obtained during epilepsy surgery evaluation in 53 patients. Ictal versus baseline interictal SPECT difference analysis was performed using ISAS (http://spect.yale.edu). SPECT injection times were classified based on video/EEG review as either pre-generalization, during generalization or in the immediate post-ictal period. We found that in the pre-generalization and generalization phases, ictal SPECT showed significantly more regions of cerebral blood flow increases than in partial seizures without secondary generalization. This made identification of a single unambiguous region of seizure onset impossible 50% of the time with ictal SPECT in secondarily generalized seizures. However, cerebral blood flow increases on ictal SPECT correctly identified the hemisphere (left versus right) of seizure onset in 84% of cases. In addition, when a single unambiguous region of cerebral blood flow increase was seen on ictal SPECT, this was the correct localization 80% of the time. In agreement with findings from partial seizures without secondary generalization, cerebral blood flow increases in the post-ictal period and cerebral blood flow decreases during or following seizures were not useful for localizing seizure onset. 
Interestingly, however, hypoperfusion during the generalization phase (but not pre-generalization) was greater on the side opposite to seizure onset in 90% of patients. These findings suggest that, with appropriately cautious interpretation, ictal SPECT in secondarily generalized seizures can help localize the region of seizure onset.
epilepsy; cerebral blood flow; grand mal; surgery; nuclear medicine
Generalized tonic–clonic seizures are among the most dramatic physiological events in the nervous system. The brain regions involved during partial seizures with secondary generalization have not been thoroughly investigated in humans. We used single photon emission computed tomography (SPECT) to image cerebral blood flow (CBF) changes in 59 secondarily generalized seizures from 53 patients. Images were analysed using statistical parametric mapping to detect cortical and subcortical regions most commonly affected in three different time periods: (i) during the partial seizure phase prior to generalization; (ii) during the generalization period; and (iii) post-ictally. We found that in the pre-generalization period, there were focal CBF increases in the temporal lobe on group analysis, reflecting the most common region of partial seizure onset. During generalization, individual patients had focal CBF increases in variable regions of the cerebral cortex. Group analysis during generalization revealed that the most consistent increase occurred in the superior medial cerebellum, thalamus and basal ganglia. Post-ictally, there was a marked progressive CBF increase in the cerebellum which spread to involve the bilateral lateral cerebellar hemispheres, as well as CBF increases in the midbrain and basal ganglia. CBF decreases were seen in the fronto-parietal association cortex, precuneus and cingulate gyrus during and following seizures, similar to the ‘default mode’ regions reported previously to show decreased activity in seizures and in normal behavioural tasks. Analysis of patient behaviour during and following seizures showed impaired consciousness at the time of SPECT tracer injections. Correlation analysis across patients demonstrated that cerebellar CBF increases were related to increases in the upper brainstem and thalamus, and to decreases in the fronto-parietal association cortex. 
These results reveal a network of cortical and subcortical structures that are most consistently involved in secondarily generalized tonic–clonic seizures. Abnormal increased activity in subcortical structures (cerebellum, basal ganglia, brainstem and thalamus), along with decreased activity in the association cortex may be crucial for motor manifestations and for impaired consciousness in tonic–clonic seizures. Understanding the networks involved in generalized tonic–clonic seizures can provide insights into mechanisms of behavioural changes, and may elucidate targets for improved therapies.
default mode; cerebellum; thalamus; SPECT; epilepsy
Myelosuppression has been observed with several multikinase angiogenesis inhibitors in clinical studies, although the frequency and severity varies among the different agents. Inhibitors targeting vascular endothelial growth factor receptor (VEGFR) often inhibit other kinases, which may contribute to their adverse-event profiles.
Kinase selectivity of pazopanib, sorafenib, and sunitinib was evaluated in a panel of 242 kinases. Cellular potency was measured using autophosphorylation assays. Effect on human bone marrow progenitor growth in the presence of multiple growth factors was evaluated and correlated with the kinase selectivity.
Sunitinib inhibited more kinases than pazopanib and sorafenib, at potencies within 10-fold of VEGFR-2. All three compounds potently inhibited VEGFR-2, platelet-derived growth factor receptor-β, and c-Kit; however, pazopanib was less active against Flt-3 in both kinase and cellular assays. The inhibitory properties of pazopanib, sorafenib, and sunitinib were dependent on the growth factor used to initiate bone marrow colony formation. Addition of stem cell factor and/or Flt-3 ligand with granulocyte-macrophage colony stimulating factor resulted in significant shifts in potency for sorafenib and sunitinib but less so for pazopanib.
Activity against c-Kit and Flt-3 by multikinase angiogenesis inhibitors provides a potential explanation for the differences in myelosuppression observed with these agents in patients.
kinase inhibitors; selectivity; myelosuppression
Gastric colonization with Helicobacter pylori is a proposed protective factor against gastroesophageal reflux disease (GERD), but little population-based data exist and other data conflict.
We conducted a community-based case-control study that compared GERD-free subjects with two groups: 1) subjects with a physician-assigned GERD diagnosis and 2) general population subjects with self-described weekly GERD symptoms. Subjects completed interviews, GERD questionnaires, and antibody testing for Helicobacter pylori and its cagA protein.
Serologic data were available for 301 physician-assigned GERD patients, 81 general population patients with GERD symptoms, and 175 subjects from the general population without GERD. Physician-assigned GERD patients were less likely to have Helicobacter pylori antibodies than GERD-free population controls (odds ratio [OR] = 0.27, 95% confidence interval [CI] 0.15-0.47); there was also an inverse association between Helicobacter pylori and GERD symptom severity (OR=0.18, 95%CI 0.08-0.41; severe or very severe symptoms) and GERD frequency (OR=0.18, 95%CI 0.09-0.38; for symptoms at least weekly). The association was stronger among persons with erosive GERD and was similar between Helicobacter pylori positive subjects with and without cagA. There was no association among persons who were cagA positive, but Helicobacter pylori negative. Similar findings were found in analyses of population members with self-described GERD symptoms.
Helicobacter pylori antibody status was inversely associated with a GERD diagnosis and GERD symptoms in a community-based population.
Helicobacter pylori; gastroesophageal reflux; GERD
The present study evaluated the associations between antioxidants, fruit and vegetable intakes and the risk of Barrett’s esophagus, a potential precursor to esophageal adenocarcinoma.
We conducted a case-control study within the Kaiser Permanente Northern California population. Incident Barrett’s esophagus cases (n=296) were matched to persons with gastroesophageal reflux disease (GERD) (GERD controls, n=308) and to population controls (n=309). Nutrient intake was measured using a validated 110-item food frequency questionnaire. The antioxidant results were stratified by dietary vs. total intake of antioxidants.
Comparing cases to population controls, dietary intakes of vitamin C and beta-carotene were inversely associated with the risk of Barrett’s esophagus [4th vs. 1st quartile, adjusted odds ratio [OR]=0.48 95% confidence interval [CI] (0.26–0.90); OR=0.56 95%CI(0.32–0.99), respectively], and the inverse association was strongest for vitamin E [OR=0.25 95%CI (0.11–0.59)]. The inverse trends for antioxidant index (total and dietary) and fruit and vegetable intake were statistically significant, while most total intakes were not associated with reduced risk. The use of antioxidant supplements did not influence the risk of Barrett’s esophagus, and antioxidants and fruits and vegetables were inversely associated with a GERD diagnosis.
Dietary antioxidants and fruit and vegetable intakes are inversely associated with the risk of Barrett’s esophagus, while no association was observed for supplement intake. Our results suggest that fruits and vegetables themselves, or associated undetected confounders, may influence early events in the carcinogenesis of esophageal adenocarcinoma.
epidemiology; nutrition; antioxidants; fruits and vegetables; Barrett’s esophagus
Background & Aims
Little is known about the effects of alcohol use and sociodemographics on the risk of Barrett’s esophagus, a precursor to esophageal adenocarcinoma. We evaluated the association between alcohol use, alcohol type, sociodemographic profiles, other lifestyle factors and the risk of Barrett’s esophagus.
Using a case-control study within the Kaiser Permanente Northern California membership, patients with a new diagnosis of Barrett’s esophagus (n=320) diagnosed between 2002–2005 were matched to persons with gastroesophageal reflux disease (GERD) (n=316) and to population controls (n=317). We collected information using validated questionnaires during direct in-person interviews. Analyses used multivariate unconditional logistic regression.
Total alcohol use was not significantly associated with the risk of Barrett’s esophagus, although stratification by beverage type showed an inverse association for wine drinkers compared to nondrinkers (7+ drinks wine/week vs. none: OR=0.44, 95%CI (0.20–0.99); multivariate analysis). Among population controls, those who preferred wine were more likely to have college degrees and regularly take vitamin supplements than those who preferred beer or liquor, although adjustment for these factors or GERD symptoms did not eliminate the inverse association between wine consumption and Barrett’s esophagus. Education status was significantly inversely associated with the risk of Barrett’s esophagus.
There are associations between alcohol types, socioeconomic status and the risk of Barrett’s esophagus. Although choice of alcoholic beverages was associated with several factors, multiple adjustments (including for GERD) did not eliminate the association between alcohol and Barrett’s esophagus. Further research to evaluate the associations among socioeconomic status, GERD, and Barrett’s esophagus is warranted.
Objective To evaluate the demographics and incidence of Barrett’s oesophagus diagnosis using community-based data.
Setting Kaiser Permanente, Northern California health-care membership, 1994–2006.
Participants Members with an electronic diagnosis of Barrett’s oesophagus.
Main outcome measures Incidence and prevalence of a new Barrett’s oesophagus diagnosis by race, sex, age and calendar year.
4205 persons met the study definition for a diagnosis of Barrett’s oesophagus. The annual incidence in 2006 was highest among non-Hispanic whites (39/100 000 race-specific member-years, 95% confidence interval (95% CI) 35 to 43), with lower rates among Hispanics (22/100 000, 95% CI 16 to 29), Asians (16/100 000, 95% CI 11 to 22), and blacks (6/100 000, 95% CI 2 to 12). The annual incidence was higher among men than women (31 vs 17/100 000, respectively, year 2006; p<0.01). The incidence increased with age from 2 per 100 000 for persons aged 21–30 years, to a peak of 31 per 100 000 member-years for persons aged 61–70 years (year 2006). There was no increase in the incidence of new diagnoses until the last two observation years, which coincided with changes in data collection methods and may be due to bias. The overall prevalence among active members increased almost linearly to 131/100 000 member-years by 2006.
The demographic distributions of Barrett’s oesophagus differ markedly by race, age and sex and were comparable to those for oesophageal adenocarcinoma. Thus, demographic disparities in oesophageal adenocarcinoma risk may arise partly from the risk of having Barrett’s oesophagus, rather than from differing risks of progression from Barrett’s oesophagus to cancer. There has been an almost linear increase in the prevalence of diagnosed disease.
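The sex comparison in the abstract above (31 vs 17 per 100 000 member-years, p<0.01) is a ratio of two Poisson rates. As an illustrative sketch — not the study's analysis; the event counts and member-year denominators below are hypothetical, chosen only to reproduce the reported per-100 000 rates — a rate ratio with a log-normal confidence interval can be computed as:

```python
import math

def rate_ratio(a_events, a_py, b_events, b_py, z=1.96):
    """Incidence rate ratio of group A vs group B with a log-normal CI.

    The standard error of log(RR) for two independent Poisson counts is
    sqrt(1/a_events + 1/b_events).
    """
    rr = (a_events / a_py) / (b_events / b_py)
    se = math.sqrt(1 / a_events + 1 / b_events)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# hypothetical counts matching 31 vs 17 per 100,000 member-years (men vs women)
rr, lo, hi = rate_ratio(310, 1_000_000, 170, 1_000_000)
```

A lower confidence bound above 1 corresponds to the reported p<0.01 for the male-female difference.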
Conditions causing high iron levels, such as hemochromatosis, are proposed risk factors for esophageal adenocarcinoma. Although this hypothesis is supported by animal models, no human data currently exist. We conducted a case-control study of persons with a new Barrett’s esophagus diagnosis (cases), persons with gastroesophageal reflux disease (GERD) (without Barrett’s esophagus), and population controls. Subjects completed detailed examinations and assays for hemochromatosis mutations and serum iron stores. We evaluated 317 cases, 306 GERD patients, and 308 population controls. There was no significant association between Barrett’s esophagus and any hemochromatosis gene defect (odds ratio [OR] = 1.32, 95% confidence interval [CI]: 0.95–1.84), a moderate or severe mutation (OR = 1.54, 95% CI: 0.94–2.52), or a severe mutation (C282Y homozygote or C282Y/H63D heterozygote; OR = 0.77, 95% CI: 0.24–2.48) compared with the population controls. As expected, gene defects were associated with increased iron stores. We conclude that Barrett’s esophagus was not associated with hemochromatosis gene defects, although we cannot exclude small effects.
Keywords Barrett’s esophagus; Esophageal adenocarcinoma; Iron; Hemochromatosis
Gastric colonization with the Helicobacter pylori bacterium is a proposed protective factor against oesophageal adenocarcinoma, but its point of action is unknown. We evaluated its associations with Barrett’s oesophagus, a metaplastic change that is a probable early event in the carcinogenesis of oesophageal adenocarcinoma.
Design A case-control study.
Setting The Kaiser Permanente Northern California population, a large health services delivery organization.
Participants Persons with a new Barrett’s oesophagus diagnosis (cases) were matched to subjects with gastro-oesophageal reflux disease (GORD) without Barrett’s oesophagus and to population controls.
Methods Subjects completed direct in-person interviews and antibody testing for Helicobacter pylori and its cagA protein.
Serologic data were available on 318 Barrett’s oesophagus cases, 312 GORD patients, and 299 population controls. Patients with Barrett’s oesophagus were substantially less likely to have antibodies for Helicobacter pylori (odds ratio [OR] = 0.42, 95% confidence interval [CI] 0.26–0.70) than population controls; this inverse association was stronger among those with lower body mass indexes (BMI<25: OR=0.03, 95% CI 0.00–0.20) and those with cagA+ strains (OR=0.08, 95% CI 0.02–0.35). The associations were diminished after adjustment for GORD symptoms. H. pylori status was not an independent risk factor for Barrett’s oesophagus compared with the GORD controls.
Helicobacter pylori infection and cagA+ status were inversely associated with a new diagnosis of Barrett’s oesophagus. The findings are consistent with the hypothesis that Helicobacter pylori colonization protects against Barrett’s oesophagus and that the association may be at least partially mediated through GORD.
Barrett’s esophagus; Barrett’s oesophagus; helicobacter; GERD; GORD; esophageal adenocarcinoma; oesophageal adenocarcinoma
We examined the association between smoking and the risk of Barrett's esophagus (BE), a metaplastic precursor to esophageal adenocarcinoma.
We conducted a case-control study within the Kaiser Permanente Northern California population. Patients with a new diagnosis of BE (n=320) were matched to persons with gastroesophageal reflux disease (GERD) (n=316) and to population controls (n=317). Information was collected using validated questionnaires from direct in-person interviews and electronic databases. Analyses used multivariate unconditional logistic regression that controlled for age, gender, race and education.
Ever smoking status, smoking intensity (pack-years), and smoking cessation were not associated with the risk of BE. Stratified analyses suggested that ever smoking may be associated with an increased risk of BE among some groups (compared to population controls): persons with long-segment Barrett's esophagus (odds ratio [OR]=1.72, 95% confidence interval [CI] 1.12-2.63); subjects without GERD symptoms (OR=3.98, 95% CI 1.58-10.0); obese subjects (OR=3.38, 95% CI 1.46-7.82); and persons with a large abdominal circumference (OR=3.02, 95% CI 1.18-2.75).
Smoking was not a strong or consistent risk factor for BE in a large community-based study, although associations may be present in some population subgroups.
Smoking; Barrett's esophagus; Gastroesophageal reflux disease; esophageal adenocarcinoma
Background: Several previous studies have found that females and older individuals are at greater risk of having incomplete flexible sigmoidoscopy. However, no prior study has reported the subsequent risk of colorectal cancer (CRC) following incomplete sigmoidoscopy.
Methods: Using data from 55 791 individuals screened as part of the Colon Cancer Prevention (CoCaP) programme of Kaiser Permanente of Northern California, we evaluated the likelihood of having an inadequate (<40 cm) examination by age and sex, and estimated the risk of distal CRC according to depth of sigmoidoscope insertion at the baseline screening examination. Multivariate estimation of risks was performed using Poisson regression.
Results: Older individuals were at a much greater risk of having an inadequate examination (relative risk (RR) for age 80+ years compared with 50–59 years 2.6 (95% confidence interval (CI) 2.3–3.0)), as were females (RR 2.3 (95% CI 2.2–2.5)); these associations were attenuated but remained strong if Poisson models were further adjusted for examination limitations (pain, stool, and angulation). There was an approximate threefold increase in the risk of distal CRC if the baseline sigmoidoscopy did not reach a depth of at least 40 cm; a smaller increase in risk was observed for examinations that reached 40–59 cm.
Conclusions: Older individuals and women are at an increased risk of having inadequate sigmoidoscopy. Because inadequate sigmoidoscopy results in an increased risk of subsequent CRC, physicians should consider steps to maximise the depth of insertion of the sigmoidoscope or, failing this, should consider an alternative screening test.
colorectal cancer; screening; sigmoidoscopy
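The sigmoidoscopy study above estimated relative risks (e.g. RR 2.3 for females) with Poisson regression. As a minimal sketch of that model class — using a hand-rolled Newton-Raphson fit on synthetic aggregated counts, not the study's data or code; the counts below are hypothetical and chosen only so the fitted rate ratio matches the reported 2.3 — a Poisson log-linear model with a person-time offset can be fitted as:

```python
import math

def poisson_fit(X, y, offset, iters=50):
    """Newton-Raphson fit of a two-coefficient Poisson log-linear model.

    Model: y_i ~ Poisson(exp(b0*x0 + b1*x1 + offset_i)), where the offset
    is log person-time. Returns the fitted coefficients (b0, b1).
    """
    b0 = b1 = 0.0
    for _ in range(iters):
        # accumulate the score vector and 2x2 Fisher information
        g0 = g1 = h00 = h01 = h11 = 0.0
        for (x0, x1), yi, off in zip(X, y, offset):
            mu = math.exp(b0 * x0 + b1 * x1 + off)
            r = yi - mu
            g0 += r * x0; g1 += r * x1
            h00 += mu * x0 * x0; h01 += mu * x0 * x1; h11 += mu * x1 * x1
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det  # solve H * step = g by hand
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# two aggregated records: columns are [intercept, female indicator];
# counts and person-time denominators are hypothetical illustrations
X = [(1, 0), (1, 1)]
y = [100, 230]
offset = [math.log(10_000), math.log(10_000)]
b0, b1 = poisson_fit(X, y, offset)
# exp(b1) recovers the crude rate ratio 230/100 = 2.3
```

With a single binary covariate and equal person-time, exp(b1) equals the crude rate ratio exactly; the regression framework earns its keep once further covariates (age group, examination limitations) are added, as in the study's adjusted models.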
Background: Flexible sigmoidoscopy (FS) is a complex technical procedure performed in a variety of settings, by examiners with diverse professional backgrounds, training, and experience. Potential variation in technical quality may have a profound impact on the effectiveness of FS on the early detection and prevention of colorectal cancer.
Aim: We propose a set of consensus and evidence based recommendations to assist the development of continuous quality improvement programmes around the delivery of FS for colorectal cancer screening.
Recommendations: These recommendations address the intervals between FS examinations, documentation of results, training of endoscopists, decision making around referral for colonoscopy, policies for antibiotic prophylaxis and management of anticoagulation, insertion of the FS endoscope, bowel preparation, complications, the use of non-physicians as FS endoscopists, and FS endoscope reprocessing. For each of these areas, continuous quality improvement targets are recommended, and research questions are proposed.
flexible sigmoidoscopy; colorectal cancer screening; endoscopic screening; technical performance quality
This study aimed to evaluate whether patients with advanced non-small-cell lung cancer experience disrupted rest–activity daily rhythms, poor sleep quality, weakness, and other attributes linked to circadian function, such as fatigue. This report describes the rest–activity patterns of 33 non-small-cell lung cancer patients who participated in a randomised clinical trial evaluating the benefits of melatonin. Data are reported on circadian function, health-related quality of life (QoL), subjective sleep quality, and anxiety/depression levels prior to randomisation and treatment. Actigraphy data, an objective measure of circadian function, demonstrated that patients' rest–activity circadian function differed significantly from that of control subjects. Our patients reported poor sleep quality and high levels of fatigue. The Ferrans and Powers QoL Index showed a high level of dissatisfaction with health-related QoL. Data from the European Organisation for Research and Treatment of Cancer questionnaire indicated poor capacity to fulfil the activities of daily living. Patients studied in the hospital during or near chemotherapy had significantly more abnormal circadian function than those studied in the ambulatory setting. Our data indicate that measurement of circadian sleep/activity dynamics should be performed in the outpatient/home setting for a minimum of 4–7 circadian cycles to ensure that the measurements are representative of the patients' true condition. We conclude that the daily sleep/activity patterns of patients with advanced lung cancer are disturbed, and that these disturbances are accompanied by marked disruption of QoL and function. These data argue for investigating how much of this poor functioning and QoL is actually caused by circadian disruption, and whether behavioural, light-based, and/or pharmacologic strategies to correct the circadian/sleep-activity patterns can improve function and QoL.
circadian function; non-small-cell lung cancer; rest/activity function; sleep quality; quality of life; actigraphy
STUDY OBJECTIVE—To investigate the association between drinking water quality and gastrointestinal illness in the elderly of Philadelphia.
DESIGN—Within the general population, children and the elderly are at highest risk for gastrointestinal disease. This study investigates the potential association between daily fluctuations in drinking water turbidity and subsequent hospital admissions for gastrointestinal illness of elderly persons, controlling for time trends, seasonal patterns, and temperature using Poisson regression analysis.
SETTING AND PARTICIPANTS—All residents of Philadelphia aged 65 and older in 1992-1993 were studied through their MEDICARE records.
MAIN RESULTS—For Philadelphia's population aged 65 and older, we found water quality 9 to 11 days before the visit was associated with hospital admissions for gastrointestinal illness, with an interquartile range increase in turbidity being associated with a 9% increase (95% CI 5.3%, 12.7%). In the Belmont service area, there was also an association evident at a lag of 4 to 6 days (9.1% increase, 95% CI 5.2, 13.3). Both associations were stronger in those over 75 than in the population aged 65-74. This association occurred in a filtered water supply in compliance with US standards.
CONCLUSIONS—Elderly residents of Philadelphia remain at risk of waterborne gastrointestinal illness under current water treatment practices. Hospitalisations represent a very small percentage of total morbidity.
Keywords: waterborne disease; drinking water; gastrointestinal illness; elderly
BACKGROUND—Escherichia coli heat stable enterotoxin (STa) is a major cause of secretory diarrhoea in humans.
AIMS—To assess the effects of instilling STa into the ileum on remote fluid secretion in the jejunum and colon in rats in vivo by a gravimetric technique.
METHODS AND RESULTS—Ileal STa (55 ng/ml) stimulated fluid secretion in both ileal and jejunal loops but not in the colon. The fluid secretion induced by ileal STa was inhibited by L-NAME (NG-nitro-L-arginine methyl ester, 40 mg/kg intraperitoneally) but not by D-NAME (NG-nitro-D-arginine methyl ester). Ileal carbachol (183 mg/ml) instilled into the lumen stimulated ileal secretion but not jejunal secretion, and was unaffected by L-NAME. Capsaicin (10 µM), instilled luminally with STa in the ileum, blocked both the ileal and jejunal fluid secretion. Acute bilateral vagotomy prevented luminal ileal STa from inducing jejunal fluid secretion but not from activating ileal secretion.
CONCLUSIONS—E coli STa stimulates remote secretion in the rat jejunum but not in the colon, probably by a nitrinergic, vagal reflex mediated by C fibres. This neural pathway will amplify the action of the toxin in its generation of secretory diarrhoea.
intestinal fluid secretion; Escherichia coli STa; vagus; nitric oxide; C fibres
Doppler ultrasound was used to study the effect of the first intravenous dose of caffeine on splanchnic haemodynamics in preterm neonates. Peak systolic velocity in the superior mesenteric artery and coeliac axis was significantly reduced for 6 hours after caffeine infusion. The effect of this reduction in blood flow to the neonatal gut is not known.
AIM—To assess the effect of enteral feeding on splanchnic blood flow velocity in preterm neonates.
METHODS—Coeliac axis and superior mesenteric artery (SMA) blood flow velocities were measured longitudinally in a cohort of 61 babies using Doppler ultrasound.
RESULTS—Babies fed 1 hourly had significantly higher preprandial SMA peak systolic velocity (PSV) than those fed 3 hourly (70 vs 53 cm/s). Those fed 1 hourly showed no postprandial change, whereas those fed 3 hourly showed significant postprandial hyperaemia. This hyperaemia had longer latency (42 vs 27 mins) and smaller amplitude (31 vs 25 mins) after expressed breast milk compared with preterm formula. The addition of long chain polyunsaturated fatty acids to the formulas had no effect on the postprandial response.
CONCLUSIONS—1 hourly bolus feeding leads to a persistent hyperaemic state in the SMA. The composition of feeds is an important determinant of the postprandial response of the SMA to 3 hourly feeding.
BACKGROUND: The HIV-1 matrix (MA) protein, p17, contains two subcellular localization signals that facilitate both nuclear import of the viral preintegration complex early during infection and virus particle assembly late in infection. The dual role of MA in both the afferent and efferent arms of the HIV-1 life cycle makes it an important target for intracellular immunization-based gene therapy strategies. MATERIALS AND METHODS: Here we report, using a new bicistronic vector, that an intracellular Fab antibody, or Fab intrabody, directed against a carboxy-terminal epitope of MA from the Clade B HIV-1 genotype, can inhibit HIV-1 infection when expressed in the cytoplasm of actively dividing CD4+ T cells. RESULTS: Marked inhibition of proviral gene expression occurred when single-round HIV-1 CAT virus was used for infections. In challenge experiments using both laboratory strains and syncytium-inducing primary isolates of HIV-1, a substantial reduction in the infectivity of virions released from the cells was also observed. CONCLUSIONS: This novel strategy of simultaneously blocking early and late events of the HIV-1 life cycle may prove useful in clinical gene therapy approaches for the treatment of HIV-1 infection and AIDS, particularly when combined with genetic or pharmacologic-based strategies that inhibit other HIV-1 target molecules simultaneously.
Since colchicine-sensitive microtubules regulate the expression and topography of surface glycoproteins on a variety of cells, we sought evidence that colchicine interferes with neutrophil-endothelial interactions by altering the number and/or distribution of selectins on endothelial cells and neutrophils. Extremely low, prophylactic, concentrations of colchicine (IC50 = 3 nM) eliminated the E-selectin-mediated increment in endothelial adhesiveness for neutrophils in response to IL-1 (P < 0.001) or TNF alpha (P < 0.001) by changing the distribution, but not the number, of E-selectin molecules on the surface of the endothelial cells. Colchicine inhibited stimulated endothelial adhesiveness via its effects on microtubules since vinblastine, an agent which perturbs microtubule function by other mechanisms, diminished adhesiveness whereas the photoinactivated colchicine derivative gamma-lumicolchicine was inactive. Colchicine had no effect on cell viability. At higher, therapeutic, concentrations colchicine (IC50 = 300 nM, P < 0.001) also diminished the expression of L-selectin on the surface of neutrophils (but not lymphocytes) without affecting expression of the beta 2-integrin CD11b/CD18. In confirmation, L-selectin expression was strikingly reduced (relative to CD11b/CD18 expression) on neutrophils from two individuals who had ingested therapeutic doses of colchicine. These results suggest that colchicine may exert its prophylactic effects on cytokine-provoked inflammation by diminishing the qualitative expression of E-selectin on endothelium, and its therapeutic effects by diminishing the quantitative expression of L-selectin on neutrophils.
Electrogenic (Cl-) secretion was measured as the short circuit current (Isc, microA/cm2) across muscle-stripped sheets of jejunum and ileum incubated in vitro after removal from fed rats, rats starved for three days, and chronically undernourished rats (50% of fed control intake for 21 days). Concentration-Isc response curves for serially added mucosal Escherichia coli STa enterotoxin showed that the rats which had undergone dietary deprivation had a larger maximum secretory Isc, but the ED50 values were unchanged compared with fed animals. In fed intestine the action of STa was transient, with an Isc peak and subsequent decay to the baseline over 60 minutes, but in the undernourished intestine the response consisted of a significantly greater peak than that of the fed state (jejunum = 94%; ileum = 168%) and the Isc was maintained at or near the peak for at least 60 minutes. The starved intestine showed less well developed maintenance of its enhanced peak Isc. Serosal tetrodotoxin (1 microM) had no effect on the initial peak Isc values but caused the maintained Isc to decay to basal or fed levels in the starved and, especially, the undernourished intestines. Thus, dietary deprivation, especially chronic undernutrition, enhances the maximum electrogenic secretion due to STa and creates a new neural path in the submucosal plexus that, when activated by STa, maintains its enhanced secretory action. Its putative role in exacerbating secretory diarrhoea in malnourished human subjects could be an important component underlying the known relation between malnourishment and the increased severity of diarrhoea.
Fluid transport was gravimetrically measured in vivo in the duodenum, jejunum, and ileum of anaesthetised fed rats, 72 hour starved rats, and 72 hour starved rats refed for up to five days after starvation. Basal unstimulated fluid transport was monitored by instilling 0.9% NaCl into the lumen and measuring the gain or loss in weight of the closed intestinal loop. Fluid was absorbed in all areas of the intestine in the fed rats. Increasing basal fluid absorption was observed in the duodenum over the three days of starvation, but in the jejunum there was no significant change. In the ileum the pattern was very different: on day 1 fluid was absorbed, but on days 2 and 3 there was an increasing secretion of fluid. Refeeding the rats with their normal diet restored the basal absorption of fluid in the duodenum within 24 hours and had no effect in the jejunum, but in the ileum the hypersecretion of fluid observed in the day 3 starved rat was maintained on day 1 of refeeding, increased further on day 2, decreased on day 3, and returned to absorption on day 4. Normal absorption was restored to the ileum on day 5 of refeeding. Fluid secretion was induced in all the rat groups by bethanechol (ip 60 micrograms/kg bw), a stable cholinergic agonist; PGE2 (ip 10 micrograms/kg bw); and E coli STa (luminally instilled, 500 ng/ml), a secretory enterotoxin. All the secretagogues gave enhanced secretion compared with the fed state by day 2 of starvation, which increased considerably on day 3. Refeeding returned their secretion to the fed level in the duodenum within 24 hours and in the jejunum within 48 hours, but in the ileum their induced secretion on day 2 of refeeding was greater than that of the day 3 starved rats and took until day 4 to return to fed levels for bethanechol and PGE2 and until day 5 for E coli STa.
This behaviour of the rat small intestine, showing even greater hypersecretion in the refed state than in the starved state, mimics the human condition of alimentary induced diarrhoea, in which incautious feeding of starved humans induces severe, often lethal, diarrhoea. The refed starved rat therefore appears to be a possible model for this condition.