Results 1-8 (8)
2.  The effect of sham feeding on neurocardiac regulation in healthy human volunteers 
Distension and electrical stimuli in the esophagus alter heart rate variability (HRV) consistent with activation of vagal afferent and efferent pathways. Sham feeding stimulates gastric acid secretion by means of vagal efferent pathways. It is not known, however, whether activation of vagal efferent pathways is organ- or stimulus-specific.
To test the hypothesis that sham feeding increases the high frequency (HF) component of HRV, indicating increased neurocardiac vagal activity in association with the known, vagally mediated, increase in gastric acid secretion.
Continuous electrocardiography recordings were obtained in 12 healthy, semirecumbent subjects during consecutive 45 min baseline, 20 min sham feeding (standard hamburger meal) and 45 min recovery periods. The R-R intervals and beat-to-beat heart rate signal were determined from digitized electrocardiography recordings; power spectra were computed from the heart rate signal to determine sympathetic (low frequency [LF]) and vagal (HF) components of HRV.
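The spectral decomposition described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis pipeline: the heart-rate signal is simulated rather than derived from ECG, the 4 Hz resampling rate is an assumption, and the LF (0.04–0.15 Hz) and HF (0.15–0.4 Hz) band limits are the conventional ones, which the abstract does not state explicitly.

```python
import numpy as np
from scipy.signal import welch

# Simulated beat-to-beat heart-rate signal, resampled at an assumed 4 Hz
# (the study derives this signal from digitized ECG R-R intervals).
fs = 4.0
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(0)
hr = (65
      + 2.0 * np.sin(2 * np.pi * 0.10 * t)   # LF oscillation (~0.1 Hz)
      + 3.0 * np.sin(2 * np.pi * 0.25 * t)   # HF / respiratory oscillation
      + rng.normal(0, 0.5, t.size))          # measurement noise

# Power spectral density of the heart-rate signal
f, psd = welch(hr, fs=fs, nperseg=512)

def band_power(f, psd, lo, hi):
    """Integrate the PSD over a frequency band (rectangle rule)."""
    mask = (f >= lo) & (f < hi)
    return float(np.sum(psd[mask]) * (f[1] - f[0]))

lf = band_power(f, psd, 0.04, 0.15)   # conventional LF (sympathetic) band
hf = band_power(f, psd, 0.15, 0.40)   # conventional HF (vagal) band
print(f"LF/HF ratio: {lf / hf:.2f}")
```

An increase in this LF/HF ratio, as the study reports during sham feeding, is interpreted as a shift away from vagal (HF) dominance.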
Heart rate increased during sham feeding (median 70.8 beats/min, 95% CI 66.0 to 77.6; P<0.001), compared with baseline (63.6 beats/min, 95% CI 60.8 to 70.0), and returned to baseline levels within 45 min. Sham feeding increased the LF to HF area ratio (median 1.55, 95% CI 1.28 to 1.77; P<0.021), compared with baseline (1.29, 95% CI 1.05 to 1.46); this increase in the LF to HF area ratio was associated with a decrease in the HF component of HRV.
Sham feeding produces a reversible increase in heart rate that is attributable to a decrease in neurocardiac parasympathetic activity despite its known ability to increase vagally mediated gastric acid secretion. These findings suggest that concurrent changes in cardiac and gastric function are modulated independently by vagal efferent fibres and that vagally mediated changes in organ function are stimulus- and organ-specific.
PMCID: PMC2658586  PMID: 18026575
Autonomic nervous system; Heart rate variability; Sham feeding; Vagus nerve
3.  Improvement of lumbar bone mass after infliximab therapy in Crohn’s disease patients 
Patients with Crohn’s disease (CD) have a high risk of developing osteoporosis, but the mechanisms underlying bone mass loss are unclear. Elevated proinflammatory cytokines, such as tumour necrosis factor-alpha (TNFα), have been implicated in the pathogenesis of bone resorption.
To assess whether suppression of TNFα with infliximab treatment has a beneficial effect on lumbar bone mass.
Adult CD patients who had received infliximab treatment, and who underwent lumbar densitometric evaluation before and during treatment, were selected. Adult CD patients who had never received infliximab treatment were selected as controls. Information regarding age, sex, weight, duration of CD, use of glucocorticoids and bisphosphonates, and signs of disease activity between both densitometric measurements were collected.
Data from 45 patients were analyzed. The control group (n=30, mean [± SD] 26.7±9 years of age) had a significantly higher increase in body weight between both evaluations (6.26%±8%) than the infliximab group (n=15, 30.6±13 years), which had an increase of 0.3%±7.4%. There was a strong correlation between the final weight and lumbar bone mineral content (BMC) in both groups. The infliximab group had a significant increase in lumbar bone area (4.15%±6.6%), BMC (12.8%±13.6%) and bone mineral density (8.13%±7.7%) between both evaluations (interval 22.6±11 months) compared with the control group. The increase in BMC in patients who had received infliximab treatment was significant when compared with control patients who had received glucocorticoids (n=8) or had evidence of disease activity (n=13).
Infliximab therapy improved lumbar bone mass independent of nutritional status. This finding suggests that TNFα plays a role in bone loss in CD.
PMCID: PMC2658130  PMID: 17948133
Bone mass; Crohn’s disease; Infliximab; Osteopenia; Osteoporosis
4.  The relationship between social deprivation and the quality of primary care: a national survey using indicators from the UK Quality and Outcomes Framework 
The existence of health inequalities between least and most socially deprived areas is now well established.
To use Quality and Outcomes Framework (QOF) indicators to explore the characteristics of primary care in deprived communities.
A two-year study of primary care in England.
QOF data were obtained for each practice in England in 2004–2005 and 2005–2006 and linked with census derived social deprivation data (Index of Multiple Deprivation scores 2004), national urbanicity scores and a database of practice characteristics. Data were available for 8480 practices in 2004–2005 and 8264 practices in 2005–2006. Comparisons were made between practices in the least and most deprived quintiles.
The difference in mean total QOF score between practices in the least and most deprived quintiles was 64.5 points in 2004–2005 (mean score, all practices, 959.9) and 30.4 points in 2005–2006 (mean 1012.6). In 2005–2006, the QOF indicators displaying the largest differences between the least and most deprived quintiles were: recall of patients not attending appointments for injectable neuroleptics (79% versus 58%, respectively), practices opening ≥45 hours/week (90% versus 74%), practices conducting ≥12 significant event audits in the previous three years (93% versus 81%), proportion of patients with epilepsy who were seizure free for ≥12 months (77% versus 65%) and proportion of patients taking lithium with serum lithium within the therapeutic range (90% versus 78%). These differences were smaller in group and training practices.
Overall differences between primary care quality indicators in deprived and prosperous communities were small. However, shortfalls in specific indicators, both clinical and non-clinical, suggest that focused interventions could be applied to improve the quality of primary care in deprived areas.
PMCID: PMC2078188  PMID: 17550668
primary care; quality indicators; social deprivation
7.  Photodynamic therapy for Barrett’s esophagus with high-grade dysplasia: A cost-effectiveness analysis 
To assess the cost-effectiveness of photodynamic therapy (PDT) and esophagectomy (ESO) relative to surveillance (SURV) for patients with Barrett’s esophagus (BE) and high-grade dysplasia (HGD).
A Markov decision tree was constructed to estimate costs and health outcomes of PDT, ESO and SURV in a hypothetical cohort of male patients, 50 years of age, with BE and HGD. Outcomes included unadjusted life-years (LYs) and quality-adjusted LYs (QALYs). Direct medical costs (2003 CDN$) were measured from the perspective of a provincial ministry of health. The time horizon for the model was five years (cycle length three months), and costs and outcomes were discounted at 3%. Model parameters were assigned unique distributions, and a probabilistic analysis with 10,000 Monte Carlo simulations was performed.
SURV was the least costly strategy, followed by PDT and ESO, but SURV was also the least effective. In terms of LYs, the incremental cost-effectiveness ratios were $814/LY for PDT versus SURV and $3,397/LY for ESO versus PDT. PDT dominated ESO for QALYs in the base case. The incremental cost-effectiveness ratio of PDT versus SURV was $879/QALY. In probabilistic analysis, PDT was most likely to be cost-effective at willingness-to-pay (WTP) values between $100/LY and $3,500/LY, and ESO was most likely to be cost-effective for WTP values over $3,500/LY. For quality-adjusted survival, PDT was most likely to be cost-effective for all WTP thresholds above $1,000/QALY. The likelihood that PDT was the most cost-effective strategy reached 0.99 at a WTP ceiling of $25,000/QALY.
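The two quantities reported above, an incremental cost-effectiveness ratio (ICER) and the probability of being cost-effective at a willingness-to-pay (WTP) threshold, can be sketched as follows. This is an illustrative two-strategy example, not the study's model: the cost and QALY distributions and their parameters are invented, since the abstract does not publish them, and only the structure of the probabilistic analysis (Monte Carlo draws, net-benefit comparison) follows the method described.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo draws, matching the abstract's simulation count

# Hypothetical cost ($) and effect (QALY) distributions for two strategies
# (labels echo the abstract's PDT and SURV; the numbers are illustrative).
cost_pdt  = rng.normal(12_000, 1_500, n)
cost_surv = rng.normal(10_000, 1_200, n)
qaly_pdt  = rng.normal(3.6, 0.3, n)
qaly_surv = rng.normal(3.1, 0.3, n)

# ICER from the mean differences: extra dollars per extra QALY gained
icer = ((cost_pdt.mean() - cost_surv.mean())
        / (qaly_pdt.mean() - qaly_surv.mean()))

# Probability PDT is cost-effective at a given WTP threshold: the share of
# draws where the incremental net benefit (WTP * dQALY - dCost) is positive.
wtp = 25_000
net_benefit = wtp * (qaly_pdt - qaly_surv) - (cost_pdt - cost_surv)
prob_ce = float((net_benefit > 0).mean())
print(f"ICER ~ ${icer:,.0f}/QALY; "
      f"P(cost-effective at ${wtp:,}/QALY) = {prob_ce:.2f}")
```

Sweeping `wtp` over a range of thresholds and plotting `prob_ce` against it yields the cost-effectiveness acceptability curve implied by statements such as "reached 0.99 at a WTP ceiling of $25,000/QALY".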
In male patients with BE and HGD, PDT and ESO are cost-effective alternatives to SURV.
PMCID: PMC2657694  PMID: 17431509
Barrett’s esophagus; High-grade dysplasia; Photodynamic therapy
8.  Foot ulcers in the diabetic patient, prevention and treatment 
Lower extremity complications in persons with diabetes have become an increasingly significant public health concern in both the developed and developing world. These complications, beginning with neuropathy and progressing to diabetic foot wounds, frequently lead to infection and lower extremity amputation even in the absence of critical limb ischemia. In order to diminish the detrimental consequences associated with diabetic foot ulcers, a common-sense-based treatment approach must be implemented. Many of the etiological factors contributing to the formation of diabetic foot ulceration may be identified using simple, inexpensive equipment in a clinical setting. Prevention of diabetic foot ulcers can be accomplished in a primary care setting with a brief history and screening for loss of protective sensation via the Semmes-Weinstein monofilament. Specialist clinics may quantify neuropathy and plantar foot pressure, and assess vascular status with Doppler ultrasound and ankle-brachial blood pressure indices. These measurements, in conjunction with other findings from the history and physical examination, may enable clinicians to stratify patients based on risk and help determine the type of intervention. Other effective clinical interventions include patient education, optimizing glycemic control, smoking cessation, and diligent foot care. Recent technological advances, combined with a better understanding of the wound healing process, have resulted in a myriad of advanced wound healing modalities for the treatment of diabetic foot ulcers. However, it is imperative to remember the fundamentals of healing diabetic foot ulcers: adequate perfusion, debridement, infection control, and pressure mitigation. Early recognition of the etiological factors, along with prompt management of diabetic foot ulcers, is essential for a successful outcome.
PMCID: PMC1994045  PMID: 17583176
diabetes; ulcer; prevention; infection; amputation
