1.  Clinical Utility of Vitamin D Testing 
Executive Summary
This report from the Medical Advisory Secretariat (MAS) was intended to evaluate the clinical utility of vitamin D testing in average risk Canadians and in those with kidney disease. As a separate analysis, this report also includes a systematic literature review of the prevalence of vitamin D deficiency in these two subgroups.
This evaluation did not set out to determine the serum vitamin D thresholds that might apply to non-bone health outcomes. For bone health outcomes, no high or moderate quality evidence could be found to support a target serum level above 50 nmol/L. Similarly, no high or moderate quality evidence could be found to support vitamin D’s effects in non-bone health outcomes, other than falls.
Vitamin D
Vitamin D is a lipid-soluble vitamin that acts as a hormone. It stimulates intestinal calcium absorption and is important in maintaining adequate phosphate levels for bone mineralization, bone growth, and remodelling. It is also believed to be involved in the regulation of cell growth, proliferation, and apoptosis (programmed cell death), as well as in modulation of the immune system and other functions. Alone or in combination with calcium, vitamin D has also been shown to reduce the risk of fractures in elderly men (≥ 65 years) and postmenopausal women, and the risk of falls in community-dwelling seniors. However, a comprehensive systematic review found inconsistent results concerning the effects of vitamin D in conditions such as cancer, all-cause mortality, and cardiovascular disease; no high or moderate quality evidence could be found concerning the effects of vitamin D on such non-bone health outcomes. Given these uncertainties, it was decided that this evaluation should focus on the effects of vitamin D on bone health and falls, exclusively within average-risk individuals and patients with kidney disease.
Synthesis of vitamin D occurs naturally in the skin through exposure to ultraviolet B (UVB) radiation from sunlight, but it can also be obtained from dietary sources including fortified foods, and supplements. Foods rich in vitamin D include fatty fish, egg yolks, fish liver oil, and some types of mushrooms. Since it is usually difficult to obtain sufficient vitamin D from non-fortified foods, either due to low content or infrequent use, most vitamin D is obtained from fortified foods, exposure to sunlight, and supplements.
Clinical Need: Condition and Target Population
Vitamin D deficiency may lead to rickets in infants and osteomalacia in adults. Factors believed to be associated with vitamin D deficiency include:
darker skin pigmentation,
winter season,
living at higher latitudes,
skin coverage,
kidney disease,
malabsorption syndromes such as Crohn’s disease, cystic fibrosis, and
genetic factors.
Patients with chronic kidney disease (CKD) are at a higher risk of vitamin D deficiency due to either renal losses or decreased synthesis of 1,25-dihydroxyvitamin D.
Health Canada currently recommends that, until the daily recommended intakes (DRI) for vitamin D are updated, Canada’s Food Guide (Eating Well with Canada’s Food Guide) should be followed with respect to vitamin D intake. Issued in 2007, the Guide recommends that Canadians consume two cups (500 ml) of fortified milk or fortified soy beverages daily in order to obtain a daily intake of 200 IU. In addition, men and women over the age of 50 should take 400 IU of vitamin D supplements daily. Additional recommendations were made for breastfed infants.
A Canadian survey evaluated the median vitamin D intake derived from diet alone (excluding supplements) among 35,000 Canadians, 10,900 of whom were from Ontario. Among Ontarian males ages 9 and up, the median daily dietary vitamin D intake ranged between 196 IU and 272 IU per day. Among females, it varied from 152 IU to 196 IU per day. In boys and girls ages 1 to 3, the median daily dietary vitamin D intake was 248 IU, while among those 4 to 8 years it was 224 IU.
Vitamin D Testing
Two laboratory tests for vitamin D are available: 25-hydroxyvitamin D, referred to as 25(OH)D, and 1,25-dihydroxyvitamin D. Vitamin D status is assessed by measuring serum 25(OH)D levels, which can be assayed using radioimmunoassays, competitive protein-binding assays (CPBA), high pressure liquid chromatography (HPLC), and liquid chromatography-tandem mass spectrometry (LC-MS/MS). These assays may yield different results, with inter-assay variation reaching up to 25% (at lower serum levels) and intra-assay variation reaching 10%.
The optimal serum concentration of vitamin D has not been established, and it may change across different stages of life. Similarly, there is currently no consensus on target serum vitamin D levels. There does, however, appear to be a consensus on the definition of vitamin D deficiency as 25(OH)D < 25 nmol/L, based on the risk of diseases such as rickets and osteomalacia. Higher target serum levels have also been proposed based on subclinical endpoints such as parathyroid hormone (PTH) levels. Therefore, two conservative target serum levels were adopted in this report: 25 nmol/L (based on the risk of rickets and osteomalacia) and 40 to 50 nmol/L (based on vitamin D's interaction with PTH).
Ontario Context
Volume & Cost
The volume of vitamin D tests done in Ontario has been increasing over the past 5 years, with a steep increase from approximately 169,000 tests in 2007 to more than 393,400 tests in 2008. The number of tests continues to rise, with the projected number of tests for 2009 exceeding 731,000. According to the Ontario Schedule of Benefits, the billing cost of each test is $51.70 for 25(OH)D (L606, 100 LMS units, $0.517/unit) and $77.6 for 1,25-dihydroxyvitamin D (L605, 150 LMS units, $0.517/unit). Province-wide, the total annual cost of vitamin D testing has increased from approximately $1.7M in 2004 to over $21.0M in 2008. The projected annual cost for 2009 is approximately $38.8M.
Evidence-Based Analysis
The objective of this report is to evaluate the clinical utility of vitamin D testing in the average risk population and in those with kidney disease. As a separate analysis, the report also sought to evaluate the prevalence of vitamin D deficiency in Canada. The specific research questions addressed were as follows:
What is the clinical utility of vitamin D testing in the average risk population and in subjects with kidney disease?
What is the prevalence of vitamin D deficiency in the average risk population in Canada?
What is the prevalence of vitamin D deficiency in patients with kidney disease in Canada?
Clinical utility was defined as the ability to improve bone health outcomes with the focus on the average risk population (excluding those with osteoporosis) and patients with kidney disease.
Literature Search
A literature search was performed on July 17th, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1998 until July 17th, 2009. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. Articles with unknown eligibility were reviewed with a second clinical epidemiologist, then a group of epidemiologists until consensus was established. The quality of evidence was assessed as high, moderate, low or very low according to GRADE methodology.
Observational studies that evaluated the prevalence of vitamin D deficiency in Canada in the population of interest were included based on the inclusion and exclusion criteria listed below. The baseline values were used in this report in the case of interventional studies that evaluated the effect of vitamin D intake on serum levels. Studies published in grey literature were included if no studies published in the peer-reviewed literature were identified for specific outcomes or subgroups.
Considering that vitamin D status may be affected by factors such as latitude, sun exposure, food fortification, among others, the search focused on prevalence studies published in Canada. In cases where no Canadian prevalence studies were identified, the decision was made to include studies from the United States, given the similar policies in vitamin D food fortification and recommended daily intake.
Inclusion Criteria
Studies published in English
Publications that reported the prevalence of vitamin D deficiency in Canada
Studies that included subjects from the general population or with kidney disease
Studies in children or adults
Studies published between January 1998 and July 17th 2009
Exclusion Criteria
Studies that included subjects defined according to a specific disease other than kidney disease
Letters, comments, and editorials
Studies that measured the serum vitamin D levels but did not report the percentage of subjects with serum levels below a given threshold
Outcomes of Interest
Prevalence of serum vitamin D less than 25 nmol/L
Prevalence of serum vitamin D less than 40 to 50 nmol/L
Serum 25-hydroxyvitamin D was the metabolite used to assess vitamin D status. Results from adult and pediatric studies were reported separately. Subgroup analyses according to factors that affect serum vitamin D levels (e.g., seasonal effects, skin pigmentation, and vitamin D intake) were reported if enough information was provided in the studies.
Quality of Evidence
The quality of the prevalence studies was based on the method of subject recruitment and sampling, possibility of selection bias, and generalizability to the source population. The overall quality of the trials was examined according to the GRADE Working Group criteria.
Summary of Findings
Fourteen prevalence studies examining Canadian adults and children met the eligibility criteria. With the exception of one longitudinal study, the studies had a cross-sectional design. Two studies were conducted among Canadian adults with renal disease but none studied Canadian children with renal disease (though three such US studies were included). No systematic reviews or health technology assessments that evaluated the prevalence of vitamin D deficiency in Canada were identified. Two studies were published in grey literature, consisting of a Canadian survey designed to measure serum vitamin D levels and a study in infants presented as an abstract at a conference. Also included were the results of vitamin D tests performed in community laboratories in Ontario between October 2008 and September 2009 (provided by the Ontario Association of Medical Laboratories).
Different threshold levels were used across the studies; we therefore reported the percentage of subjects with serum levels between 25 and 30 nmol/L and between 37.5 and 50 nmol/L. Some studies stratified the results according to factors affecting vitamin D status, and two used multivariate models to investigate the effects of these characteristics (including age, season, BMI, vitamin D intake, and skin pigmentation) on serum 25(OH)D levels. It is unclear, however, whether these studies were adequately powered for these subgroup analyses.
Study participants generally consisted of healthy, community-dwelling subjects and most excluded individuals with conditions or medications that alter vitamin D or bone metabolism, such as kidney or liver disease. Although the studies were conducted in different parts of Canada, fewer were performed in Northern latitudes, i.e. above 53°N, which is equivalent to the city of Edmonton.
Adults
Serum vitamin D levels of < 25 to 30 nmol/L were observed in 0% to 25.5% of the subjects included in five studies; the weighted average was 3.8% (95% CI: 3.0, 4.6). The preliminary results of the Canadian survey showed that approximately 5% of the subjects had serum levels below 29.5 nmol/L. The results of over 600,000 vitamin D tests performed in Ontarian community laboratories between October 2008 and September 2009 showed that 2.6% of adults (> 18 years) had serum levels < 25 nmol/L.
The prevalence of serum vitamin D levels below 37.5 to 50 nmol/L varied widely among studies, ranging from 8% to 73.6%, with a weighted average of 22.5%. The preliminary results of the Canadian Health Measures Survey (CHMS) showed that between 10% and 25% of subjects had serum levels below 37 to 48 nmol/L. The results of the vitamin D tests performed in community laboratories showed that 10% to 25% of individuals had serum levels between 39 and 50 nmol/L.
In an attempt to explain this inter-study variation, the study results were stratified according to factors affecting serum vitamin D levels, as summarized below. These results should be interpreted with caution as none were adjusted for other potential confounders. Adequately powered multivariate analyses would be necessary to determine the contribution of risk factors to lower serum 25(OH)D levels.
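The "weighted average" prevalences quoted in this section pool study results by sample size. A minimal sketch of such pooling, using hypothetical study data (the counts and prevalences below are illustrative, not the report's actual included studies, and the report does not describe its exact confidence-interval method):

```python
import math

# Hypothetical (sample_size, prevalence) pairs -- illustrative only,
# not the actual studies pooled in the report.
studies = [(250, 0.02), (400, 0.05), (150, 0.01), (600, 0.04), (300, 0.06)]

n_total = sum(n for n, _ in studies)
# Sample-size-weighted average prevalence
p_pooled = sum(n * p for n, p in studies) / n_total

# Approximate 95% CI treating the pooled data as one binomial sample
# (a simplification for illustration).
se = math.sqrt(p_pooled * (1 - p_pooled) / n_total)
ci = (p_pooled - 1.96 * se, p_pooled + 1.96 * se)

print(f"pooled prevalence: {p_pooled:.1%}, 95% CI: ({ci[0]:.1%}, {ci[1]:.1%})")
```

This weighting means large studies dominate the pooled figure, which is one reason the weighted averages can sit well below the upper end of the reported ranges.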
Seasonal variation
Three adult studies evaluating serum vitamin D levels in different seasons observed a trend towards a higher prevalence of serum levels < 37.5 to 50 nmol/L during the winter and spring months, specifically 21% to 39%, compared to 8% to 14% in the summer. The weighted average was 23.6% over the winter/spring months and 9.6% over summer. The difference between the seasons was not statistically significant in one study and not reported in the other two studies.
Skin Pigmentation
Four studies observed a trend toward a higher prevalence of serum vitamin D levels < 37.5 to 50 nmol/L in subjects with darker skin pigmentation compared to those with lighter skin pigmentation, with weighted averages of 46.8% among adults with darker skin colour and 15.9% among those with fairer skin.
Vitamin D intake and serum levels
Four adult studies evaluated serum vitamin D levels according to vitamin D intake and showed an overall trend toward a lower prevalence of serum levels < 37.5 to 50 nmol/L with higher vitamin D intake. One study observed a dose-response relationship with vitamin D intake from supplements, diet (milk), and sun exposure (results not adjusted for other variables). Subjects taking 50 to 400 IU or > 400 IU of vitamin D per day had a 6% and 3% prevalence of serum vitamin D levels < 40 nmol/L, respectively, versus 29% in subjects not taking vitamin D supplements. Similarly, among subjects drinking one or two glasses of milk per day, the prevalence of serum vitamin D levels < 40 nmol/L was 15%, versus 6% in those drinking more than two glasses per day and 21% among those who did not drink milk. In contrast, one study observed little variation in serum vitamin D levels during winter according to milk intake: the proportion of subjects with vitamin D levels < 40 nmol/L was 21% among those drinking 0 to 2 glasses per day, 26% among those drinking > 2 glasses, and 20% among non-milk drinkers.
The overall quality of evidence for the studies conducted among adults was deemed to be low, although it was considered moderate for the subgroups of skin pigmentation and seasonal variation.
Newborn, Children and Adolescents
Five Canadian studies evaluated serum vitamin D levels in newborns, children, and adolescents. In four of these, it was found that between 0 and 36% of children exhibited deficiency across age groups with a weighted average of 6.4%. The results of over 28,000 vitamin D tests performed in children 0 to 18 years old in Ontario laboratories (Oct. 2008 to Sept. 2009) showed that 4.4% had serum levels of < 25 nmol/L.
According to two studies, 32% of infants 24 to 30 months old and 35.3% of newborns had serum vitamin D levels < 50 nmol/L. Two studies of children 2 to 16 years old reported that 24.5% and 34% had serum vitamin D levels below 37.5 to 40 nmol/L. In both studies, older children exhibited a higher prevalence than younger children, with weighted averages of 34.4% and 10.3%, respectively. The overall weighted average of the prevalence of serum vitamin D levels < 37.5 to 50 nmol/L among pediatric studies was 25.8%. The preliminary results of the Canadian survey showed that between 10% and 25% of subjects aged 6 to 11 years (N = 435) had serum levels below 50 nmol/L, while among those 12 to 19 years, 25% to 50% exhibited serum vitamin D levels below 50 nmol/L.
The effects of season, skin pigmentation, and vitamin D intake were not explored in Canadian pediatric studies. A Canadian surveillance study did, however, report 104 confirmed cases (2.9 cases per 100,000 children) of vitamin D-deficient rickets among Canadian children age 1 to 18 between 2002 and 2004, 57 (55%) of which were from Ontario. The highest incidence occurred among children living in the North, i.e., the Yukon, Northwest Territories, and Nunavut. In 92 (89%) cases, skin pigmentation was categorized as intermediate to dark, 98 (94%) had been breastfed, and 25 (24%) were offspring of immigrants to Canada. There were no cases of rickets in children receiving ≥ 400 IU of vitamin D supplementation per day.
Overall, the quality of evidence of the studies of children was considered very low.
Kidney Disease
Adults
Two studies evaluated serum vitamin D levels in Canadian adults with kidney disease. The first included 128 patients with chronic kidney disease stages 3 to 5, 38% of whom had serum vitamin D levels < 37.5 nmol/L (measured between April and July). This is higher than what was reported in Canadian studies of the general population during the summer months (i.e., between 8% and 14%). The second examined 419 subjects who had received a renal transplant (mean time since transplantation: 7.2 ± 6.4 years); the prevalence of serum vitamin D levels < 40 nmol/L was 27.3%. The authors concluded that the prevalence observed in the study population was similar to what is expected in the general population.
Children
No studies evaluating serum vitamin D levels in Canadian pediatric patients with kidney disease could be identified; however, three US studies among children with chronic kidney disease stages 1 to 5 were included. The mean age varied between 10.7 and 12.5 years in two studies but was not reported in the third. Across all three studies, the prevalence of serum vitamin D levels below the range of 37.5 to 50 nmol/L varied between 21% and 39%, which is not considerably different from what was observed in studies of healthy Canadian children (24% to 35%).
Overall, the quality of evidence in adults and children with kidney disease was considered very low.
Clinical Utility of Vitamin D Testing
A high quality comprehensive systematic review published in August 2007 evaluated the association between serum vitamin D levels and different bone health outcomes in different age groups. A total of 72 studies were included. The authors observed that there was a trend towards improvement in some bone health outcomes with higher serum vitamin D levels. Nevertheless, precise thresholds for improved bone health outcomes could not be defined across age groups. Further, no new studies on the association were identified during an updated systematic review on vitamin D published in July 2009.
With regard to non-bone health outcomes, there is no high or moderate quality evidence supporting the effectiveness of vitamin D in outcomes such as cancer, cardiovascular outcomes, and all-cause mortality. Even where residual uncertainty remains, there is no evidence that testing vitamin D levels encourages adherence to Health Canada's guidelines for vitamin D intake. A normal serum vitamin D threshold required to prevent non-bone health related conditions cannot be established until a causal effect or correlation has been demonstrated between vitamin D levels and these conditions. This is an ongoing research issue around which there is currently too much uncertainty to base any conclusions that would support routine vitamin D testing.
For patients with chronic kidney disease (CKD), there is again no high or moderate quality evidence supporting improved outcomes through the use of calcitriol or vitamin D analogs. In the absence of such data, the authors of the guidelines for CKD patients consider it best practice to maintain serum calcium and phosphate at normal levels, while supplementation with active vitamin D should be considered if serum PTH levels are elevated. As previously stated, the authors of guidelines for CKD patients believe that there is not enough evidence to support routine vitamin D [25(OH)D] testing. According to what is stated in the guidelines, decisions regarding the commencement or discontinuation of treatment with calcitriol or vitamin D analogs should be based on serum PTH, calcium, and phosphate levels.
Limitations associated with the evidence on vitamin D testing include ambiguities in the definition of an 'adequate threshold level' and both inter- and intra-assay variability. The MAS considers that both the lack of consensus on target serum vitamin D levels and these assay limitations directly affect and undermine the clinical utility of testing. The evidence supporting the clinical utility of vitamin D testing is thus considered to be of very low quality.
Daily vitamin D intake, either through diet or supplementation, should follow Health Canada’s recommendations for healthy individuals of different age groups. For those with medical conditions such as renal disease, liver disease, and malabsorption syndromes, and for those taking medications that may affect vitamin D absorption/metabolism, physician guidance should be followed with respect to both vitamin D testing and supplementation.
Conclusions
Studies indicate that vitamin D, alone or in combination with calcium, may decrease the risk of fractures and falls among older adults.
There is no high or moderate quality evidence to support the effectiveness of vitamin D in other outcomes such as cancer, cardiovascular outcomes, and all-cause mortality.
Studies suggest that the prevalence of vitamin D deficiency in Canadian adults and children is relatively low (approximately 5%), and between 10% and 25% have serum levels below 40 to 50 nmol/L (based on very low to low grade evidence).
Given the limitations associated with serum vitamin D measurement, ambiguities in the definition of a ‘target serum level’, and the availability of clear guidelines on vitamin D supplementation from Health Canada, vitamin D testing is not warranted for the average risk population.
Health Canada has issued recommendations regarding the adequate daily intake of vitamin D, but current studies suggest that the mean dietary intake is below these recommendations. Accordingly, Health Canada’s guidelines and recommendations should be promoted.
Based on a moderate level of evidence, individuals with darker skin pigmentation appear to have a higher risk of low serum vitamin D levels than those with lighter skin pigmentation and therefore may need to be specifically targeted with respect to optimum vitamin D intake. The causal nature of this association is currently unclear.
Individuals with medical conditions such as renal and liver disease, osteoporosis, and malabsorption syndromes, as well as those taking medications that may affect vitamin D absorption/metabolism, should follow their physician’s guidance concerning both vitamin D testing and supplementation.
PMCID: PMC3377517  PMID: 23074397
2.  Variation in outpatient oral antimicrobial use patterns among Canadian provinces, 2000 to 2010 
One of the most important ways to curb the spread of antibiotic resistance among disease-causing organisms is to ensure the prudent use of antimicrobial agents, thus avoiding selective pressures that drive the spread of resistance. Accordingly, it is important to perform regular surveillance of antibiotic prescribing patterns to identify areas for improvement as well as to act as a benchmark for the measurement of change in future studies. This article, the first of five complementary articles in the current issue of the Journal, summarizes the prescribing pattern of all classes of antibiotics according to time and province in Canada.
BACKGROUND:
The volume and patterns of antimicrobial drug use are key variables to consider when developing guidelines for prescribing, and programs to address stewardship and combat the increasing prevalence of antimicrobial resistant pathogens. Because drug programs are regulated at the provincial level, there is an expectation that antibiotic use may vary among provinces.
OBJECTIVE:
To assess these potential differences according to province and time.
METHODS:
Provincial antimicrobial prescribing data at the individual drug level were acquired from the Canadian Integrated Program for Antimicrobial Resistance Surveillance for 2000 to 2010. Data were used to calculate two yearly metrics: prescriptions per 1000 inhabitant-days and the average defined daily doses per prescription. The proportion of liquid oral prescriptions of total prescriptions was also calculated as a proxy measure for the proportion of prescriptions given to children versus adults. To assess the significance of provincial antimicrobial use, linear mixed models were developed for each metric, accounting for repeated measurements over time.
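The two yearly metrics described above can be sketched as follows. The inputs below are made up for illustration; in the actual program, defined daily doses (DDDs) follow WHO ATC/DDD assignments per drug:

```python
# Sketch of the two surveillance metrics, with hypothetical inputs.

def rx_per_1000_inhabitant_days(prescriptions: int, population: int,
                                days: int = 365) -> float:
    """Prescriptions per 1000 inhabitant-days for one province-year."""
    return prescriptions / (population * days) * 1000

def avg_ddd_per_rx(total_ddds: float, prescriptions: int) -> float:
    """Average defined daily doses dispensed per prescription."""
    return total_ddds / prescriptions

# Hypothetical province-year: 8.2 million prescriptions dispensed,
# population 13 million, 75 million DDDs dispensed in total.
rate = rx_per_1000_inhabitant_days(8_200_000, 13_000_000)
ddd = avg_ddd_per_rx(75_000_000, 8_200_000)
print(f"{rate:.2f} Rx/1000 inhabitant-days, {ddd:.1f} DDD/Rx")
```

Normalizing by inhabitant-days makes rates comparable across provinces of very different populations, which is what allows the between-province contrasts reported in the results.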
RESULTS:
Significant differences among provinces were found, as well as significant changes in use over time. Newfoundland and Labrador was found to have significantly higher prescribing rates than all other provinces (P<0.001) in 2010, as well as the mean of all other provinces (P<0.001). In contrast, Quebec exhibited significantly lower prescribing than all other provinces (P<0.001 for all provinces except British Columbia, where P=0.024) and the mean of all other provinces (P<0.001).
DISCUSSION/CONCLUSION:
Reports of reductions in antimicrobial use at the Canadian level are promising, especially prescribing to children; however, care must be taken to avoid the pitfall of the ecological fallacy. Reductions are not consistent among the provinces or among the classes of antimicrobial drugs dispensed in Canada.
PMCID: PMC4028675  PMID: 24855477
Antimicrobial spending; Antimicrobial stewardship; Antimicrobial use; Prescribing patterns; Surveillance
3.  A Guide for Health Professionals Working with Aboriginal Peoples: Executive Summary 
Objective
to provide Canadian health professionals with a network of information and recommendations regarding Aboriginal health.
Options
health professionals working with Aboriginal individuals and communities in the area of women’s health care.
Outcomes
improved health status of Aboriginal peoples in Canada.
Appropriateness and accessibility of women’s health services for Aboriginal peoples.
Improved communication and clinical skills of health professionals in the area of Aboriginal health.
Improved quality of relationship between health professionals and Aboriginal individuals and communities.
Evidence
recommendations are based on expert opinion and a review of the literature. Published references were identified by a MEDLINE search of all review articles, randomized controlled trials, meta-analyses, and practice guidelines from 1966 to February 1999, using the MeSH headings “Indians, North American or Eskimos” and “Health.” Subsequently published articles were brought to the attention of the authors in the process of writing and reviewing the document. Ancillary and unpublished references were recommended by members of the SOGC Aboriginal Health Issues Committee and the panel of expert reviewers.
Values
information collected was reviewed by the principal author. The social, cultural, political, and historic context of Aboriginal peoples in Canada, systemic barriers to the publication of information by Aboriginal authors, the diversity of Aboriginal peoples in Canada, and the need for a culturally appropriate and balanced presentation were carefully considered, in addition to more traditional scientific evaluation. The majority of the information collected consisted of descriptive health and social information, and evaluation tools such as the evidence guidelines of the Canadian Task Force on the Periodic Health Examination were not appropriate.
Benefits, costs, and harms
utilization of the information and recommendations by Canadian health professionals will enhance understanding, communication, and clinical skills in the area of Aboriginal health. The resulting enhancement of collaborative relationships between Aboriginal peoples and their women’s health providers may contribute to health services that are more appropriate, effective, efficient, and accessible for Aboriginal peoples in Canada. The educational process may require an initial investment of time from the health professional.
Recommendations
Recommendations were grouped according to four themes: sociocultural context, health concerns, cross-cultural understanding, and Aboriginal health resources. Health professionals are encouraged to learn the appropriate names, demographics, and traditional geographic territories and language groups of the various Aboriginal groups in Canada. In addition, sensitivity to the impact of colonization and of current socioeconomic challenges on the health status of Aboriginal peoples is warranted. Health services for Aboriginal peoples should take place as close to home as possible. Governmental obligations and policies regarding self-determination are recognized.

With respect to health concerns, holistic definitions of health, based on Aboriginal perspectives, are put forward. Aboriginal peoples continue to experience a disproportionate burden of health problems, and health professionals are encouraged to become familiar with several key areas of morbidity and mortality.

Relationships between Aboriginal peoples and their care providers need to be based on a foundation of mutual respect. Gaps and barriers in the current health care system for Aboriginal peoples are identified, and health professionals are encouraged to work with Aboriginal individuals and communities to address them. Aboriginal peoples require culturally appropriate health care, including treatment in their own languages when possible; this may require interpreters or Aboriginal health advocates. Health professionals are encouraged to recognize the importance of family and community roles, and to respect traditional medicines and healers. Health professionals can develop their sensitivities towards Aboriginal peoples by participating in workshops, making use of educational resources, and spending time with Aboriginal peoples in their communities.
Aboriginal communities and health professionals are encouraged to support community-based, community-directed health services and health research for Aboriginal peoples. In addition, the education of more Aboriginal health professionals is essential. The need for a preventative approach to health programming in Aboriginal communities is stressed.
Validation
recommendations were reviewed and revised by the SOGC Aboriginal Health Issues Committee, a panel of expert reviewers, and the SOGC Council. In addition, this document was also reviewed and supported by the Assembly of First Nations, Canadian Institute of Child Health, Canadian Paediatric Society, College of Family Physicians of Canada, Congress of Aboriginal Peoples, Federation of Medical Women of Canada, Inuit Tapirisat of Canada, Metis National Council, National Indian and Inuit Community Health Representatives Organization, and Pauktuutit Inuit Women’s Association.
Sponsor
Society of Obstetricians and Gynaecologists of Canada.
PMCID: PMC3653835  PMID: 23682204 CAMSID: cams2752
4.  Implementing the 2009 Institute of Medicine recommendations on resident physician work hours, supervision, and safety 
Long working hours and sleep deprivation have been a facet of physician training in the US since the advent of the modern residency system. However, the scientific evidence linking fatigue with deficits in human performance, accidents and errors in industries from aeronautics to medicine, nuclear power, and transportation has mounted over the last 40 years. This evidence has also spawned regulations to help ensure public safety across safety-sensitive industries, with the notable exception of medicine.
In late 2007, at the behest of the US Congress, the Institute of Medicine embarked on a year-long examination of the scientific evidence linking resident physician sleep deprivation with clinical performance deficits and medical errors. The Institute of Medicine’s report, entitled “Resident duty hours: Enhancing sleep, supervision and safety”, published in January 2009, recommended new limits on resident physician work hours and workload, increased supervision, a heightened focus on resident physician safety, training in structured handovers and quality improvement, more rigorous external oversight of work hours and other aspects of residency training, and the identification of expanded funding sources necessary to implement the recommended reforms successfully and protect the public and resident physicians themselves from preventable harm.
Given that resident physicians comprise almost a quarter of all physicians who work in hospitals, and that taxpayers, through Medicare and Medicaid, fund graduate medical education, the public has a deep investment in physician training. Patients expect to receive safe, high-quality care in the nation’s teaching hospitals. Because it is their safety that is at issue, their voices should be central in policy decisions affecting patient safety. It is likewise important to integrate the perspectives of resident physicians, policy makers, and other constituencies in designing new policies. However, since its release, discussion of the Institute of Medicine report has been largely confined to the medical education community, led by the Accreditation Council for Graduate Medical Education (ACGME).
To begin gathering these perspectives and developing a plan to implement safer work hours for resident physicians, a conference entitled “Enhancing sleep, supervision and safety: What will it take to implement the Institute of Medicine recommendations?” was held at Harvard Medical School on June 17–18, 2010. This White Paper is a product of a diverse group of 26 representative stakeholders bringing relevant new information and innovative practices to bear on a critical patient safety problem. Given that our conference included experts from across disciplines with diverse perspectives and interests, not every recommendation was endorsed by each invited conference participant. However, every recommendation made here was endorsed by the majority of the group, and many were endorsed unanimously. Conference members participated in the process, reviewed the final product, and provided input before publication. Participants provided their individual perspectives, which do not necessarily represent the formal views of any organization.
In September 2010 the ACGME issued new rules to go into effect on July 1, 2011. Unfortunately, they stop considerably short of the Institute of Medicine’s recommendations and those endorsed by this conference. In particular, the ACGME only applied the limitation of 16 hours to first-year resident physicians. Thus, it is clear that policymakers, hospital administrators, and residency program directors who wish to implement safer health care systems must go far beyond what the ACGME will require. We hope this White Paper will serve as a guide and provide encouragement for that effort.
Resident physician workload and supervision
By the end of training, a resident physician should be able to practice independently. Yet much of resident physicians’ time is dominated by tasks with little educational value. The caseload can be so great that inadequate reflective time is left for learning based on clinical experiences. In addition, supervision is often vaguely defined and discontinuous. Medical malpractice data indicate that resident physicians are frequently named in lawsuits, most often for lack of supervision. The recommendations are:
– The ACGME should adjust resident physician workload requirements to optimize educational value. Resident physicians as well as faculty should be involved in work redesign that eliminates nonessential and noneducational activity from resident physician duties.
– Mechanisms should be developed for identifying in real time when a resident physician’s workload is excessive, and processes developed to activate additional providers.
– Teamwork should be actively encouraged in delivery of patient care. Historically, much of medical training has focused on individual knowledge, skills, and responsibility. As health care delivery has become more complex, it will be essential to train resident and attending physicians in effective teamwork that emphasizes collective responsibility for patient care and recognizes the signs, both individual and systemic, of a schedule and working conditions that are too demanding.
– Hospitals should embrace the opportunities that resident physician training redesign offers. Hospitals should recognize and act on the potential benefits of work redesign, eg, increased efficiency, reduced costs, improved quality of care, and resident physician and attending job satisfaction.
– Attending physicians should supervise all hospital admissions. Resident physicians should directly discuss all admissions with attending physicians. Attending physicians should be both cognizant of and have input into the care patients are to receive upon admission to the hospital.
– In-house supervision should be required for all critical care services, including emergency rooms, intensive care units, and trauma services. Resident physicians should not be left unsupervised to care for critically ill patients. In settings in which the acuity is high, physicians who have completed residency should provide direct supervision for resident physicians. Supervising physicians should always be physically in the hospital for supervision of resident physicians who care for critically ill patients.
– The ACGME should explicitly define “good” supervision by specialty and by year of training. Explicit requirements for intensity and level of training for supervision of specific clinical scenarios should be provided.
– Centers for Medicare and Medicaid Services (CMS) should use graduate medical education funding to provide incentives to programs with proven, effective levels of supervision. Although this action would require federal legislation, reimbursement rules would help to ensure that hospitals pay attention to the importance of good supervision and require it from their training programs.
Resident physician work hours
Although the IOM “Sleep, supervision and safety” report provides a comprehensive review and discussion of all aspects of graduate medical education training, the report’s focal point is its recommendations regarding the hours that resident physicians are currently required to work. A considerable body of scientific evidence, much of it cited by the Institute of Medicine report, describes deteriorating performance in fatigued humans, as well as specific studies on resident physician fatigue and preventable medical errors.
The question before this conference was: what work redesign and cultural changes are needed to reform work hours as recommended by the Institute of Medicine’s evidence-based report? Extensive scientific data demonstrate that shifts exceeding 12–16 hours without sleep are unsafe. Several principles should be followed in efforts to reduce consecutive hours below this level and achieve safer work schedules. The recommendations are:
– Limit resident physician work hours to 12–16 hour maximum shifts.
– A minimum of 10 hours off duty should be scheduled between shifts.
– Resident physician input into work redesign should be actively solicited.
– Schedules should be designed that adhere to principles of sleep and circadian science; this includes careful consideration of the effects of multiple consecutive night shifts, and provision of adequate time off after night work, as specified in the IOM report.
– Resident physicians should not be scheduled up to the maximum permissible limits; emergencies frequently occur that require resident physicians to stay longer than their scheduled shifts, and this should be anticipated in scheduling resident physicians’ work shifts.
– Hospitals should anticipate the need for iterative improvement as new schedules are initiated; be prepared to learn from the initial phase-in, and change the plan as needed.
– As resident physician work hours are redesigned, attending physicians should also be considered; a potential consequence of resident physician work hour reduction and increased supervisory requirements may be an increase in work for attending physicians; this should be carefully monitored, and adjustments to attending physician work schedules made as needed to prevent unsafe work hours or working conditions for this group.
– “Home call” should be brought under the overall limits of working hours; work load and hours should be monitored in each residency program to ensure that resident physicians and fellows on home call are getting sufficient sleep.
– Medicare funding for graduate medical education in each hospital should be linked with adherence to the Institute of Medicine limits on resident physician work hours.
Moonlighting by resident physicians
The Institute of Medicine report recommended including external as well as internal moonlighting in working hour limits. The recommendation is: All moonlighting work hours should be included in the ACGME working hour limits and actively monitored. Hospitals should formalize a moonlighting policy and establish systems for actively monitoring resident physician moonlighting
Safety of resident physicians
The “Sleep, supervision and safety” report also addresses fatigue-related harm done to resident physicians themselves. The report focuses on two main sources of physical injury to resident physicians impaired by fatigue, ie, needle-stick exposure to blood-borne pathogens and motor vehicle crashes. Providing safe transportation home for resident physicians is a logistical and financial challenge for hospitals. Educating physicians at all levels on the dangers of fatigue is clearly required to change driving behavior so that safe hospital-funded transport home is used effectively. The recommendations are:
– Fatigue-related injury prevention (including not driving while drowsy) should be taught in medical school and during residency, and reinforced with attending physicians; hospitals and residency programs must be informed that resident physicians’ ability to judge their own level of impairment is itself impaired when they are sleep deprived; hence, leaving decisions about the capacity to drive to impaired resident physicians is not recommended.
– Hospitals should provide transportation to all resident physicians who report feeling too tired to drive safely; in addition, although consecutive work should not exceed 16 hours, hospitals should provide transportation for all resident physicians who, because of unforeseen reasons or emergencies, work for longer than 24 consecutive hours; transportation under these circumstances should be automatically provided to house staff, and should not rely on self-identification or request.
Training in effective handovers and quality improvement
Handover practice for resident physicians, attendings, and other health care providers has long been identified as a weak link in patient safety throughout health care settings. Policies to improve handovers of care must be tailored to fit the appropriate clinical scenario, recognizing that information overload can also be a problem. At the heart of improving handovers is the organizational effort to improve quality, an effort in which resident physicians have typically been insufficiently engaged. The recommendations are:
– Hospitals should train attending and resident physicians in effective handovers of care.
– Hospitals should create uniform processes for handovers that are tailored to meet each clinical setting; all handovers should be done verbally and face-to-face, but should also utilize written tools.
– When possible, hospitals should integrate handover tools into their electronic medical record (EMR) systems; these systems should be standardized to the extent possible across residency programs in a hospital, but may be tailored to the needs of specific programs and services; the federal government should help subsidize adoption of electronic medical records by hospitals to improve signout.
– When feasible, handovers should be a team effort including nurses, patients, and families.
– Hospitals should include residents in their quality improvement and patient safety efforts; the ACGME should specify in its core competency requirements that resident physicians work on quality improvement projects; likewise, the Joint Commission should require that resident physicians be included in quality improvement and patient safety programs at teaching hospitals; hospital administrators and residency program directors should create opportunities for resident physicians to become involved in ongoing quality improvement projects and root cause analysis teams; feedback on successful quality improvement interventions should be shared with resident physicians and broadly disseminated.
– Quality improvement/patient safety concepts should be integral to the medical school curriculum; medical school deans should elevate the topics of patient safety, quality improvement, and teamwork; these concepts should be integrated throughout the medical school curriculum and reinforced throughout residency; mastery of these concepts by medical students should be tested on the United States Medical Licensing Examination (USMLE) steps.
– The federal government should support involvement of resident physicians in quality improvement efforts; initiatives to improve quality by including resident physicians in quality improvement projects should be financially supported by the Department of Health and Human Services.
Monitoring and oversight of the ACGME
While the ACGME is a key stakeholder in residency training, external voices are essential to ensure that public interests are heard in the development and monitoring of standards. Consequently, the Institute of Medicine report recommended external oversight and monitoring through the Joint Commission and Centers for Medicare and Medicaid Services (CMS). The recommendations are:
– Make comprehensive fatigue management a Joint Commission National Patient Safety Goal; fatigue is a safety concern not only for resident physicians, but also for nurses, attending physicians, and other health care workers; the Joint Commission should seek to ensure that all health care workers, not just resident physicians, are working as safely as possible.
– The federal government, including the Centers for Medicare and Medicaid Services and the Agency for Healthcare Research and Quality, should encourage development of comprehensive fatigue management programs which all health systems would eventually be required to implement.
– Make ACGME compliance with working hours a “condition of participation” for reimbursement of direct and indirect graduate medical education costs; financial incentives will greatly increase the adoption of and compliance with ACGME standards.
Future financial support for implementation
The Institute of Medicine’s report estimates that $1.7 billion (in 2008 dollars) would be needed to implement its recommendations. Twenty-five percent of that amount ($376 million) will be required just to bring hospitals into compliance with the existing 2003 ACGME rules. Downstream savings to the health care system could potentially result from safer care, but these benefits typically do not accrue to hospitals and residency programs, who have been asked historically to bear the burden of residency reform costs. The recommendations are:
– The Institute of Medicine should convene a panel of stakeholders, including private and public funders of health care and graduate medical education, to lay down the concrete steps necessary to identify and allocate the resources needed to implement the recommendations contained in the IOM “Resident duty hours: Enhancing sleep, supervision and safety” report. Conference participants suggested several approaches to engage public and private support for this initiative.
– Efforts to find additional funding to implement the Institute of Medicine recommendations should focus more broadly on patient safety and health care delivery reform; policy efforts focused narrowly upon resident physician work hours are less likely to succeed than broad patient safety initiatives that include residency redesign as a key component.
– Hospitals should view the Institute of Medicine recommendations as an opportunity to begin resident physician work redesign projects as the core of a business model that embraces safety and ultimately saves resources.
– Both the Secretary of Health and Human Services and the Director of the Centers for Medicare and Medicaid Services should take the Institute of Medicine recommendations into consideration when promulgating rules for innovation grants.
– The National Health Care Workforce Commission should consider the Institute of Medicine recommendations when analyzing the nation’s physician workforce needs.
Recommendations for future research
Conference participants concurred that convening the stakeholders and agreeing on a research agenda was key. Some observed that some sectors within the medical education community have been reluctant to act on the data. Several logical funders for future research were identified. Above all, the Centers for Medicare and Medicaid Services is the only stakeholder that funds graduate medical education upstream and would reap savings downstream if preventable medical errors are reduced as a result of reform of resident physician work hours.
doi:10.2147/NSS.S19649
PMCID: PMC3630963  PMID: 23616719
resident; hospital; working hours; safety
5.  Internet-Based Device-Assisted Remote Monitoring of Cardiovascular Implantable Electronic Devices 
Executive Summary
Objective
The objective of this Medical Advisory Secretariat (MAS) report was to conduct a systematic review of the available published evidence on the safety, effectiveness, and cost-effectiveness of Internet-based device-assisted remote monitoring systems (RMSs) for therapeutic cardiac implantable electronic devices (CIEDs) such as pacemakers (PMs), implantable cardioverter-defibrillators (ICDs), and cardiac resynchronization therapy (CRT) devices. The MAS evidence-based review was performed to support public financing decisions.
Clinical Need: Condition and Target Population
Sudden cardiac death (SCD) is a major cause of fatalities in developed countries. In the United States almost half a million people die of SCD annually, resulting in more deaths than stroke, lung cancer, breast cancer, and AIDS combined. In Canada each year more than 40,000 people die from a cardiovascular related cause; approximately half of these deaths are attributable to SCD.
Most cases of SCD occur in the general population typically in those without a known history of heart disease. Most SCDs are caused by cardiac arrhythmia, an abnormal heart rhythm caused by malfunctions of the heart’s electrical system. Up to half of patients with significant heart failure (HF) also have advanced conduction abnormalities.
Cardiac arrhythmias are managed by a variety of drugs, ablative procedures, and therapeutic CIEDs. The range of CIEDs includes pacemakers (PMs), implantable cardioverter-defibrillators (ICDs), and cardiac resynchronization therapy (CRT) devices. Bradycardia is the main indication for PMs and individuals at high risk for SCD are often treated by ICDs.
Heart failure (HF) is also a significant health problem and is the most frequent cause of hospitalization in those over 65 years of age. Patients with moderate to severe HF may also have cardiac arrhythmias, although the cause may be related more to heart pump or haemodynamic failure. The presence of HF, however, increases the risk of SCD five-fold, regardless of aetiology. Patients with HF who remain highly symptomatic despite optimal drug therapy are sometimes also treated with CRT devices.
With an increasing prevalence of age-related conditions such as chronic HF and the expanding indications for ICD therapy, the rate of ICD placement has been dramatically increasing. The appropriate indications for ICD placement, as well as the rate of ICD placement, are increasingly an issue. In the United States, after the introduction of expanded coverage of ICDs, a national ICD registry was created in 2005 to track these devices. A recent survey based on this national ICD registry reported that 22.5% (25,145) of patients had received a non-evidence based ICD and that these patients experienced significantly higher in-hospital mortality and post-procedural complications.
In addition to the increased ICD device placement and the upfront device costs, there is the need for lifelong follow-up or surveillance, placing a significant burden on patients and device clinics. In 2007, over 1.6 million CIEDs were implanted in Europe and the United States, which translates to over 5.5 million patient encounters per year if the recommended follow-up practices are considered. A safe and effective RMS could potentially improve the efficiency of long-term follow-up of patients and their CIEDs.
Technology
In addition to being therapeutic devices, CIEDs have extensive diagnostic abilities. All CIEDs can be interrogated and reprogrammed during an in-clinic visit using an inductive programming wand. Remote monitoring would allow patients to transmit information recorded in their devices from the comfort of their own homes. Currently most ICD devices also have the potential to be remotely monitored. Remote monitoring (RM) can be used to check system integrity, to alert on arrhythmic episodes, and to potentially replace in-clinic follow-ups and manage disease remotely. Devices cannot currently be reprogrammed remotely, although this feature is being tested in pilot settings.
Every RMS is specifically designed by a manufacturer for its own cardiac implant devices. For Internet-based device-assisted RMSs, this customization includes details such as web application, multiplatform sensors, custom algorithms, programming information, and types and methods of alerting patients and/or physicians. The addition of peripherals for monitoring weight and pressure or communicating with patients through the onsite communicators also varies by manufacturer. Internet-based device-assisted RMSs for CIEDs are intended to function as a surveillance system rather than an emergency system.
Health care providers therefore need to learn each application, and as more than one application may be used at one site, multiple applications may need to be reviewed for alarms. All RMSs deliver system integrity alerting; however, some systems seem to be better geared to fast arrhythmic alerting, whereas other systems appear to be more intended for remote follow-up or supplemental remote disease management. The different RMSs may therefore have different impacts on workflow organization because of their varying frequency of interrogation and methods of alerts. The integration of these proprietary RM web-based registry systems with hospital-based electronic health record systems has so far not been commonly implemented.
Currently there are 2 general types of RMSs: those that transmit device diagnostic information automatically and without patient assistance to secure Internet-based registry systems, and those that require patient assistance to transmit information. Both systems employ the use of preprogrammed alerts that are either transmitted automatically or at regular scheduled intervals to patients and/or physicians.
The current web applications, programming, and registry systems differ greatly between the manufacturers of transmitting cardiac devices. In Canada there are currently 4 manufacturers—Medtronic Inc., Biotronik, Boston Scientific Corp., and St Jude Medical Inc.—which have regulatory approval for remote transmitting CIEDs. Remote monitoring systems are proprietary to the manufacturer of the implant device. An RMS for one device will not work with another device, and the RMS may not work with all versions of the manufacturer’s devices.
All Internet-based device-assisted RMSs have common components. The implanted device is equipped with a micro-antenna that communicates with a small external device (at bedside or wearable) commonly known as the transmitter. Transmitters are able to interrogate programmed parameters and diagnostic data stored in the patients’ implant device. The information transfer to the communicator can occur at preset time intervals with the participation of the patient (waving a wand over the device) or it can be sent automatically (wirelessly) without their participation. The encrypted data are then uploaded to an Internet-based database on a secure central server. The data processing facilities at the central database, depending on the clinical urgency, can trigger an alert for the physician(s) that can be sent via email, fax, text message, or phone. The details are also posted on the secure website for viewing by the physician (or their delegate) at their convenience.
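The transmission-and-alerting flow described above can be sketched in code. This is a minimal, hypothetical illustration only: the urgency levels, field names, and `route_alert` function are assumptions made for clarity, not any manufacturer's actual system or API.

```python
# Hypothetical sketch of the alerting step of an Internet-based RMS:
# encrypted device data arrives at the central server, is checked against
# preprogrammed alert conditions, and urgent findings are pushed to the
# physician while routine ones are simply posted to the secure website.
from dataclasses import dataclass
from enum import Enum


class Urgency(Enum):
    ROUTINE = 1   # posted to the secure website only
    ELEVATED = 2  # email or fax notification
    URGENT = 3    # text message or phone call


@dataclass
class Transmission:
    patient_id: str
    battery_ok: bool
    lead_impedance_ok: bool
    arrhythmia_detected: bool


def classify(tx: Transmission) -> Urgency:
    """Map a device transmission to a clinical urgency level (illustrative rules)."""
    if tx.arrhythmia_detected:
        return Urgency.URGENT
    if not (tx.battery_ok and tx.lead_impedance_ok):
        return Urgency.ELEVATED
    return Urgency.ROUTINE


def route_alert(tx: Transmission) -> list[str]:
    """Return the notification channels used for one transmission."""
    channels = ["secure website"]  # every transmission is posted for later review
    urgency = classify(tx)
    if urgency is Urgency.ELEVATED:
        channels += ["email", "fax"]
    elif urgency is Urgency.URGENT:
        channels += ["text message", "phone"]
    return channels
```

The key design point this sketch mirrors is that the central server, not the bedside transmitter, decides how aggressively to notify the physician, and that every transmission remains available on the secure website regardless of urgency.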
Research Questions
The research directions and specific research questions for this evidence review were as follows:
To identify the Internet-based device-assisted RMSs available for follow-up of patients with therapeutic CIEDs such as PMs, ICDs, and CRT devices.
To identify the potential risks, operational issues, or organizational issues related to Internet-based device-assisted RM for CIEDs.
To evaluate the safety, acceptability, and effectiveness of Internet-based device-assisted RMSs for CIEDs such as PMs, ICDs, and CRT devices.
To evaluate the safety, effectiveness, and cost-effectiveness of Internet-based device-assisted RMSs for CIEDs compared to usual outpatient in-office monitoring strategies.
To evaluate the resource implications or budget impact of RMSs for CIEDs in Ontario, Canada.
Research Methods
Literature Search
The review included a systematic review of published scientific literature and consultations with experts and manufacturers of all 4 approved RMSs for CIEDs in Canada. Information on CIED cardiac implant clinics was also obtained from Provincial Programs, a division within the Ministry of Health and Long-Term Care with a mandate for cardiac implant specialty care. Various administrative databases and registries were used to outline the current clinical follow-up burden of CIEDs in Ontario. The provincial population-based ICD database developed and maintained by the Institute for Clinical Evaluative Sciences (ICES) was used to review the current follow-up practices with Ontario patients implanted with ICD devices.
Search Strategy
A literature search was performed on September 21, 2010 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from 1950 to September 2010. Search alerts were generated and reviewed for additional relevant literature until December 31, 2010. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search.
Inclusion Criteria
published between 1950 and September 2010;
English language full-reports and human studies;
original reports including clinical evaluations of Internet-based device-assisted RMSs for CIEDs in clinical settings;
reports including standardized measurements on outcome events such as technical success, safety, effectiveness, cost, measures of health care utilization, morbidity, mortality, quality of life or patient satisfaction;
randomized controlled trials (RCTs), systematic reviews and meta-analyses, cohort and controlled clinical studies.
Exclusion Criteria
non-systematic reviews, letters, comments and editorials;
reports not involving standardized outcome events;
clinical reports not involving Internet-based device assisted RM systems for CIEDs in clinical settings;
reports involving studies testing or validating algorithms without RM;
studies with small samples (<10 subjects).
Outcomes of Interest
The outcomes of interest included: technical outcomes, emergency department visits, complications, major adverse events, symptoms, hospital admissions, clinic visits (scheduled and/or unscheduled), survival, morbidity (disease progression, stroke, etc.), patient satisfaction, and quality of life.
Summary of Findings
The MAS evidence review was performed to review available evidence on Internet-based device-assisted RMSs for CIEDs published until September 2010. The search identified 6 systematic reviews, 7 randomized controlled trials, and 19 reports for 16 cohort studies—3 of these being registry-based and 4 being multi-centered. The evidence is summarized in the 3 sections that follow.
1. Effectiveness of Remote Monitoring Systems of CIEDs for Cardiac Arrhythmia and Device Functioning
In total, 15 reports on 13 cohort studies involving investigations with 4 different RMSs for CIEDs in cardiology implant clinic groups were identified in the review. The 4 RMSs were: Care Link Network® (Medtronic Inc., Minneapolis, MN, USA); Home Monitoring® (Biotronik, Berlin, Germany); House Call 11® (St Jude Medical Inc., St Paul, MN, USA); and a manufacturer-independent RMS. Eight of these reports were with the Home Monitoring® RMS (12,949 patients), 3 were with the Care Link® RMS (167 patients), 1 was with the House Call 11® RMS (124 patients), and 1 was with a manufacturer-independent RMS (44 patients). All of the studies, except for 2 in the United States (1 with Home Monitoring® and 1 with House Call 11®), were performed in European countries.
The RMSs in the studies were evaluated with different cardiac implant device populations: ICDs only (6 studies), ICD and CRT devices (3 studies), PM and ICD and CRT devices (4 studies), and PMs only (2 studies). The patient populations were predominately male (range, 52%–87%) in all studies, with mean ages ranging from 58 to 76 years. One study population was unique in that RMSs were evaluated for ICDs implanted solely for primary prevention in young patients (mean age, 44 years) with Brugada syndrome, which carries an inherited increased risk of sudden cardiac death in young adults.
Most of the cohort studies reported on the feasibility of RMSs in clinical settings with limited follow-up. In the short follow-up periods of the studies, the majority of the events were related to detection of medical events rather than system configuration or device abnormalities. The results of the studies are summarized below:
The interrogation of devices on the web platform, both for continuous and scheduled transmissions, was significantly quicker with remote follow-up, both for nurses and physicians.
In a case-control study focusing on a Brugada population–based registry with patients followed-up remotely, there were significantly fewer outpatient visits and greater detection of inappropriate shocks. One death occurred in the control group not followed remotely and post-mortem analysis indicated early signs of lead failure prior to the event.
Two studies examined the role of RMSs in following ICD leads under regulatory advisory in a European clinical setting and noted:
– Fewer inappropriate shocks were administered in the RM group.
– Urgent in-office interrogations and surgical revisions were performed within 12 days of remote alerts.
– No signs of lead fracture were detected at in-office follow-up; all were detected at remote follow-up.
Only 1 study reported evaluating quality of life in patients followed up remotely at 3 and 6 months; no values were reported.
Patient satisfaction was evaluated in 5 cohort studies, all with short-term follow-up: 1 for the Home Monitoring® RMS, 3 for the Care Link® RMS, and 1 for the House Call II® RMS.
– Patients reported receiving a sense of security from the transmitter, a good relationship with nurses and physicians, positive implications for their health, and satisfaction with RM and organization of services.
– Although patients reported that the system was easy to implement and required less than 10 minutes to transmit information, a variable proportion of patients (range, 9%–39%) reported that they needed the assistance of a caregiver for their transmission.
– The majority of patients would recommend RM to other ICD patients.
– Patients with hearing or other physical or mental conditions hindering the use of the system were excluded from studies, but the frequency of this was not reported.
Physician satisfaction was evaluated in 3 studies, all with the Care Link® RMS:
– Physicians reported an ease of use and high satisfaction with a generally short-term use of the RMS.
– Physicians reported being able to address problems identified in unscheduled patient transmissions or physician-initiated transmissions remotely, and were able to handle the majority of troubleshooting calls remotely.
– Both nurses and physicians reported a high level of satisfaction with the web registry system.
2. Effectiveness of Remote Monitoring Systems in Heart Failure Patients for Cardiac Arrhythmia and Heart Failure Episodes
Remote follow-up of HF patients implanted with ICD or CRT devices, generally managed in specialized HF clinics, was evaluated in 3 cohort studies: 1 involved the Home Monitoring® RMS and 2 involved the Care Link® RMS. In these RMSs, in addition to the standard diagnostic features, the cardiac devices continuously assess other variables such as patient activity, mean heart rate, and heart rate variability. Intra-thoracic impedance, a proxy measure for lung fluid overload, was also measured in the Care Link® studies. The overall diagnostic performance of these measures cannot be evaluated, as the information was not reported for patients who did not experience intra-thoracic impedance threshold crossings or did not undergo interventions. The trial results involved descriptive information on transmissions and alerts in patients experiencing high morbidity and hospitalization in the short study periods.
3. Comparative Effectiveness of Remote Monitoring Systems for CIEDs
Seven RCTs were identified evaluating RMSs for CIEDs: 2 were for PMs (1276 patients) and 5 were for ICD/CRT devices (3733 patients). Studies performed in the clinical setting in the United States involved both the Care Link® RMS and the Home Monitoring® RMS, whereas all studies performed in European countries involved only the Home Monitoring® RMS.
3A. Randomized Controlled Trials of Remote Monitoring Systems for Pacemakers
Two trials, both multicenter RCTs, were conducted in different countries with different RMSs and study objectives. The PREFER trial was a large trial (897 patients) performed in the United States examining the ability of Care Link®, an Internet-based remote PM interrogation system, to detect clinically actionable events (CAEs) sooner than the current in-office follow-up supplemented with transtelephonic monitoring transmissions, a limited form of remote device interrogation. The trial results are summarized below:
In the 375-day mean follow-up, 382 patients were identified with at least 1 CAE—111 patients in the control arm and 271 in the remote arm.
The event rate detected per patient for every type of CAE, except for loss of atrial capture, was higher in the remote arm than the control arm.
The median time to first detection of CAEs (4.9 vs. 6.3 months) was significantly shorter in the RMS group compared to the control group (P < 0.0001).
Additionally, only 2% (3/190) of the CAEs in the control arm were detected during a transtelephonic monitoring transmission (the rest were detected at in-office follow-ups), whereas 66% (446/676) of the CAEs were detected during remote interrogation.
The second study, the OEDIPE trial, was a smaller trial (379 patients) performed in France evaluating the ability of the Home Monitoring® RMS to shorten post-operative hospitalization after PM implantation while maintaining the safety of conventional management involving longer hospital stays.
Implementation and operationalization of the RMS was reported to be successful in 91% (346/379) of the patients and represented 8144 transmissions.
In the RM group 6.5% of patients failed to send messages (10 due to improper use of the transmitter, 2 with unmanageable stress). Of the 172 patients transmitting, 108 patients sent a total of 167 warnings during the trial, with a greater proportion of warnings being attributed to medical rather than technical causes.
Forty percent had no warning message transmission and among these, 6 patients experienced a major adverse event and 1 patient experienced a non-major adverse event. Of the 6 patients having a major adverse event, 5 contacted their physician.
The mean medical reaction time was faster in the RM group (6.5 ± 7.6 days vs. 11.4 ± 11.6 days).
The mean duration of hospitalization was significantly shorter (P < 0.001) for the RM group than the control group (3.2 ± 3.2 days vs. 4.8 ± 3.7 days).
Quality of life estimates by the SF-36 questionnaire were similar for the 2 groups at 1-month follow-up.
3B. Randomized Controlled Trials Evaluating Remote Monitoring Systems for ICD or CRT Devices
The 5 studies evaluating the impact of RMSs with ICD/CRT devices were conducted in the United States and in European countries and involved 2 RMSs—Care Link® and Home Monitoring®. The objectives of the trials varied, and 3 of the trials were smaller pilot investigations.
The first of the smaller studies (151 patients) evaluated patient satisfaction, achievement of patient outcomes, and the cost-effectiveness of the Care Link® RMS compared to quarterly in-office device interrogations with 1-year follow-up.
Individual outcomes such as hospitalizations, emergency department visits, and unscheduled clinic visits were not significantly different between the study groups.
Except for a significantly higher detection of atrial fibrillation in the RM group, data on ICD detection and therapy were similar in the study groups.
Health-related quality of life evaluated by the EuroQoL at 6-month or 12-month follow-up was not different between study groups.
Patients were more satisfied with their ICD care in the clinic follow-up group than in the remote follow-up group at 6-month follow-up, but were equally satisfied at 12-month follow-up.
The second small pilot trial (20 patients) examined the impact of RM follow-up with the House Call II® system on work schedules and cost savings in patients randomized to 2 study arms varying in the degree of remote follow-up.
The total time including device interrogation, transmission time, data analysis, and physician time required was significantly shorter for the RM follow-up group.
The in-clinic waiting time was eliminated for patients in the RM follow-up group.
The physician talk time was significantly reduced in the RM follow-up group (P < 0.05).
The time for the actual device interrogation did not differ in the study groups.
The third small trial (115 patients) examined the impact of RM with the Home Monitoring® system compared to scheduled trimonthly in-clinic visits on the number of unplanned visits, total costs, health-related quality of life (SF-36), and overall mortality.
There was a 63.2% reduction in in-office visits in the RM group.
Hospitalizations or overall mortality (values not stated) were not significantly different between the study groups.
Patient-induced visits were higher in the RM group than the in-clinic follow-up group.
The TRUST Trial
The TRUST trial was a large multicenter RCT conducted at 102 centers in the United States involving the Home Monitoring® RMS for ICD devices for 1450 patients. The primary objectives of the trial were to determine if remote follow-up could be safely substituted for in-office clinic follow-up (3 in-office visits replaced) and still enable earlier physician detection of clinically actionable events.
Adherence to the protocol follow-up schedule was significantly higher in the RM group than the in-office follow-up group (93.5% vs. 88.7%, P < 0.001).
Actionability of trimonthly scheduled checks was low (6.6%) in both study groups. Overall, actionable causes were reprogramming (76.2%), medication changes (24.8%), and lead/system revisions (4%), and these were not different between the 2 study groups.
The overall mean number of in-clinic and hospital visits was significantly lower in the RM group than the in-office follow-up group (2.1 per patient-year vs. 3.8 per patient-year, P < 0.001), representing a 45% visit reduction at 12 months.
The median time from onset of first arrhythmia to physician evaluation was significantly shorter (P < 0.001) in the RM group than in the in-office follow-up group for all arrhythmias (1 day vs. 35.5 days).
The median time to detect clinically asymptomatic arrhythmia events—atrial fibrillation (AF), ventricular fibrillation (VF), ventricular tachycardia (VT), and supra-ventricular tachycardia (SVT)—was also significantly shorter (P < 0.001) in the RM group compared to the in-office follow-up group (1 day vs. 41.5 days) and was significantly quicker for each of the clinical arrhythmia events—AF (5.5 days vs. 40 days), VT (1 day vs. 28 days), VF (1 day vs. 36 days), and SVT (2 days vs. 39 days).
System-related problems occurred infrequently in both groups—in 1.5% of patients (14/908) in the RM group and in 0.7% of patients (3/432) in the in-office follow-up group.
The overall adverse event rate over 12 months was not significantly different between the 2 groups and individual adverse events were also not significantly different between the RM group and the in-office follow-up group: death (3.4% vs. 4.9%), stroke (0.3% vs. 1.2%), and surgical intervention (6.6% vs. 4.9%), respectively.
The 12-month cumulative survival was 96.4% (95% confidence interval [CI], 95.5%–97.6%) in the RM group and 94.2% (95% CI, 91.8%–96.6%) in the in-office follow-up group, and was not significantly different between the 2 groups (P = 0.174).
The CONNECT Trial
The CONNECT trial, another major multicenter RCT, involved the Care Link® RMS for ICD/CRT devices in a 15-month follow-up study of 1,997 patients at 133 sites in the United States. The primary objective of the trial was to determine whether automatically transmitted physician alerts decreased the time from the occurrence of clinically relevant events to medical decisions. The trial results are summarized below:
Of the 575 clinical alerts sent in the study, 246 did not trigger an automatic physician alert. Transmission failures were related to technical issues such as the alert not being programmed or not being reset, and/or a variety of patient factors such as not being at home and the monitor not being plugged in or set up.
The overall mean time from the clinically relevant event to the clinical decision was significantly shorter (P < 0.001) by 17.4 days in the remote follow-up group (4.6 days for 172 patients) than the in-office follow-up group (22 days for 145 patients).
– The median time to a clinical decision was shorter in the remote follow-up group than in the in-office follow-up group for an AT/AF burden greater than or equal to 12 hours (3 days vs. 24 days) and a fast VF rate greater than or equal to 120 beats per minute (4 days vs. 23 days).
Although infrequent, similar low numbers of events involving low battery and VF detection/therapy turned off were noted in both groups. More alerts, however, were noted for out-of-range lead impedance in the RM group (18 vs. 6 patients), and the time to detect these critical events was significantly shorter in the RM group (same day vs. 17 days).
Total in-office clinic visits were reduced by 38% from 6.27 visits per patient-year in the in-office follow-up group to 3.29 visits per patient-year in the remote follow-up group.
Health care utilization visits (N = 6,227) that included cardiovascular-related hospitalization, emergency department visits, and unscheduled clinic visits were not significantly higher in the remote follow-up group.
The overall mean length of hospitalization was significantly shorter (P = 0.002) for those in the remote follow-up group (3.3 days vs. 4.0 days) and was shorter both for patients with ICD (3.0 days vs. 3.6 days) and CRT (3.8 days vs. 4.7 days) implants.
The mortality rate was not significantly different between the follow-up groups for the ICDs (P = 0.31) or the CRT devices with defibrillator (P = 0.46).
Conclusions
There is limited clinical trial information on the effectiveness of RMSs for PMs. However, for RMSs for ICD devices, multiple cohort studies and 2 large multicenter RCTs demonstrated feasibility and significant reductions in in-office clinic follow-ups with RMSs in the first year post implantation. The detection rates of clinically significant events (and asymptomatic events) were higher, and the time to a clinical decision for these events was significantly shorter, in the remote follow-up groups than in the in-office follow-up groups. The earlier detection of clinical events in the remote follow-up groups, however, was not associated with lower morbidity or mortality rates in the 1-year follow-up. The substitution of almost all the first year in-office clinic follow-ups with RM was also not associated with an increased health care utilization such as emergency department visits or hospitalizations.
The follow-up in the trials was generally short-term, up to 1 year, providing only a limited assessment of potential longer-term device/lead integrity complications or issues. None of the studies compared the different RMSs, particularly the different RMSs involving patient-scheduled transmissions or automatic transmissions. Patients’ acceptance of and satisfaction with RM were reported to be high, but the impact of RM on patients’ health-related quality of life, particularly the psychological aspects, was not evaluated thoroughly. Patients who are not technologically adept, or who have hearing or other physical/mental impairments, were identified as potentially disadvantaged with remote surveillance. Cohort studies consistently identified subgroups of patients who preferred in-office follow-up. The costs and workflow impact to the health care system were evaluated only in a limited way, in European or American clinical settings.
Internet-based device-assisted RMSs involve a new approach to monitoring patients, their disease progression, and their CIEDs. Remote monitoring also has the potential to improve the current postmarket surveillance systems of evolving CIEDs and their ongoing hardware and software modifications. At this point, however, there is insufficient information to evaluate the overall impact to the health care system, although the time saving and convenience to patients and physicians associated with a substitution of in-office follow-up by RM is more certain. The broader issues surrounding infrastructure, impacts on existing clinical care systems, and regulatory concerns need to be considered for the implementation of Internet-based RMSs in jurisdictions involving different clinical practices.
PMCID: PMC3377571  PMID: 23074419
6.  Self-Administered Outpatient Antimicrobial Infusion by Uninsured Patients Discharged from a Safety-Net Hospital: A Propensity-Score-Balanced Retrospective Cohort Study 
PLoS Medicine  2015;12(12):e1001922.
Background
Outpatient parenteral antimicrobial therapy (OPAT) is accepted as safe and effective for medically stable patients to complete intravenous (IV) antibiotics in an outpatient setting. Since, however, uninsured patients in the United States generally cannot afford OPAT, safety-net hospitals are often burdened with long hospitalizations purely to infuse antibiotics, occupying beds that could be used for patients requiring more intensive services. OPAT is generally delivered in one of four settings: infusion centers, nursing homes, at home with skilled nursing assistance, or at home with self-administered therapy. The first three—termed healthcare-administered OPAT (H-OPAT)—are most commonly used in the United States by patients with insurance funding. The fourth—self-administered OPAT (S-OPAT)—is relatively uncommon, with the few published studies having been conducted in the United Kingdom. With multidisciplinary planning, we established an S-OPAT clinic in 2009 to shift care of selected uninsured patients safely to self-administration of their IV antibiotics at home. We undertook this study to determine whether the low-income mostly non-English-speaking patients in our S-OPAT program could administer their own IV antimicrobials at home with outcomes as good as, or better than, those receiving H-OPAT.
Methods and Findings
Parkland Hospital is a safety-net hospital serving Dallas County, Texas. From 1 January 2009 to 14 October 2013, all uninsured patients meeting criteria were enrolled in S-OPAT, while insured patients were discharged to H-OPAT settings. The S-OPAT patients were trained through multilingual instruction to self-administer IV antimicrobials by gravity, tested for competency before discharge, and thereafter followed at designated intervals in the S-OPAT outpatient clinic for IV access care, laboratory monitoring, and physician follow-up. The primary outcome was 30-d all-cause readmission, and the secondary outcome was 1-y all-cause mortality. The study was adequately powered for readmission but not for mortality. Clinical, sociodemographic, and outcome data were collected from the Parkland Hospital electronic medical records and the US census, constituting a historical prospective cohort study. We used multivariable logistic regression to develop a propensity score predicting S-OPAT versus H-OPAT group membership from covariates. We then estimated the effect of S-OPAT versus H-OPAT on the two outcomes using multivariable proportional hazards regression, controlling for selection bias and confounding with the propensity score and covariates.
Of the 1,168 patients discharged to receive OPAT, 944 (81%) were managed in the S-OPAT program and 224 (19%) by H-OPAT services. In multivariable proportional hazards regression models controlling for confounding and selection bias, the 30-d readmission rate was 47% lower in the S-OPAT group (adjusted hazard ratio [aHR], 0.53; 95% CI 0.35–0.81; p = 0.003), and the 1-y mortality rate did not differ significantly between the groups (aHR, 0.86; 95% CI 0.37–2.00; p = 0.73). The S-OPAT program shifted a median 26 d of inpatient infusion per patient to the outpatient setting, avoiding 27,666 inpatient days. The main limitation of this observational study—the potential bias from the difference in healthcare funding status of the groups—was addressed by propensity score modeling.
Conclusions
S-OPAT was associated with similar or better clinical outcomes than H-OPAT. S-OPAT may be an acceptable model of treatment for uninsured, medically stable patients to complete extended courses of IV antimicrobials at home.
In a propensity score-balanced retrospective cohort study, Kavita Bhavan and colleagues compare health outcomes for patients undergoing self-administered versus healthcare-administered outpatient parenteral antimicrobial therapy.
Editors' Summary
Background
Patients sometimes need lengthy courses of antimicrobial agents to treat life-threatening infections. For example, patients who develop endocarditis (an infection of the inner lining of the heart usually caused by bacteria entering the blood and traveling to the heart) need to be given antimicrobial drugs for up to six weeks. Initially, these patients require intensive diagnostic and therapeutic care in the hospital. But once the antimicrobial treatment starts to work, most patients only need regular intravenous antimicrobial infusions. Patients who stay in the hospital to receive this low intensity care occupy beds that could be used for patients requiring more intensive care. Moreover, they are at risk of catching a hospital-acquired, antibiotic-resistant infection. For these reasons, and because long-term administration of antimicrobial agents in the hospital is costly, outpatient parenteral (injected or infused) antimicrobial therapy (OPAT) is increasingly being used as a safe and effective way for medically stable patients to complete a course of intravenous antibiotics outside the hospital.
Why Was This Study Done?
In the US, OPAT is usually delivered in infusion centers, in nursing homes, or at home by visiting nurses. But healthcare-administered OPAT (H-OPAT) is available only to insured patients (in the US, medical insurance provided by employers or by the government-run Medicare and Medicaid programs funds healthcare). Uninsured people cannot usually afford H-OPAT and have to stay in safety-net hospitals (public hospitals that provide care to low-income, uninsured populations) for intravenous antibiotic treatment. In this propensity-score-balanced retrospective cohort study, the researchers investigate whether uninsured patients discharged from a safety-net hospital in Texas to self-administer OPAT at home (S-OPAT) can achieve outcomes as good as or better than those achieved by patients receiving H-OPAT. A retrospective cohort study compares recorded clinical outcomes in groups of patients who received different treatments. Because the patients were not chosen at random, such studies are subject to selection bias and confounding. Propensity score balancing is used to control for selection bias—the possibility that some members of the population are less likely to be included in a study than others. Adjustment for covariates (patient characteristics that may affect the outcome under study) is used to control for confounding—the possibility that unknown characteristics shared by patients with a specific outcome, rather than any treatment, may be responsible for that outcome.
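The propensity-score balancing described above can be sketched in a few lines of Python. This is an illustrative toy only: the covariates, coefficients, and patients below are invented for demonstration, whereas the study itself estimated its model by multivariable logistic regression on real baseline data.

```python
import math

def propensity_score(covariates, coefficients, intercept):
    """Logistic model: P(treated | x) = 1 / (1 + exp(-(b0 + b . x)))."""
    z = intercept + sum(b * x for b, x in zip(coefficients, covariates))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for two covariates (age in decades,
# comorbidity count); a real study would estimate these from the data.
coefs, intercept = [-0.30, 0.20], 0.5

patients = [
    {"covs": [5.2, 1], "treated": 1},  # e.g., an S-OPAT patient
    {"covs": [7.0, 3], "treated": 0},  # e.g., an H-OPAT patient
]

for p in patients:
    ps = propensity_score(p["covs"], coefs, intercept)
    # Inverse-probability-of-treatment weight used to balance the groups:
    p["weight"] = 1.0 / ps if p["treated"] else 1.0 / (1.0 - ps)
    print(round(ps, 3), round(p["weight"], 3))
```

In an analysis like the one described, these scores (or weights derived from them) enter the outcome regression so that treated and untreated patients with similar covariates carry comparable influence.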
What Did the Researchers Do and Find?
Between 2009 and 2013, 944 uninsured patients were enrolled in the hospital’s S-OPAT program, and 224 insured patients were discharged to an H-OPAT program. Patients in the S-OPAT group were trained to self-administer intravenous antimicrobials, tested for their ability to treat themselves before discharge, and then monitored by weekly visits to the S-OPAT outpatient clinic. The researchers estimated the effect of S-OPAT versus H-OPAT on 30-day all-cause readmission and one-year all-cause mortality (the primary and secondary outcomes, respectively) after adjusting for covariates and controlling for selection bias with a propensity score developed using baseline clinical and sociodemographic information collected from the patients. The 30-day readmission rate was 47% lower in the S-OPAT group than in the H-OPAT group (a significant result unlikely to have arisen by chance), and the one-year mortality rate did not differ significantly between the two groups. Notably, because the S-OPAT program resulted in patients spending fewer days having inpatient infusions, 27,666 inpatient days were avoided over the study period.
What Do These Findings Mean?
These findings indicate that, after adjusting for preexisting differences between those patients receiving S-OPAT and those receiving H-OPAT and for potential confounders, the risk of readmission within 30 days of discharge was lower in the S-OPAT group than in the H-OPAT group and the risk of dying within one year of hospital discharge did not differ significantly between the two groups (the study did not include enough participants to detect any subtle difference that might have existed for this end point). Thus, S-OPAT was associated with similar or better outcomes than H-OPAT. Note that there may be residual selection bias and confounding by characteristics not included in the propensity score. This study did not address whether S-OPAT actually improves outcomes for patients compared with H-OPAT; a randomized controlled trial in which patients are randomly assigned to receive the two treatments is needed to do this. Nevertheless, these findings suggest that S-OPAT might make it possible for uninsured, medically stable patients to have extended courses of intravenous antimicrobials at home rather than remaining in the hospital until their treatment is complete.
Additional Information
This list of resources contains links that can be accessed when viewing the PDF on a device or via the online version of the article at http://dx.doi.org/10.1371/journal.pmed.1001922.
The UK National Health Service Choices website provides basic information about the use of antibiotics, including information about when intravenous antibiotics are needed and about endocarditis
The US National Heart, Lung, and Blood Institute also provides information about endocarditis and its treatment
The Infectious Diseases Society of America provides clinical guidelines for the use of OPAT
The OPAT Initiative of the British Society for Antimicrobial Chemotherapy is a multi-stakeholder project that supports the establishment of standardized OPAT services throughout the UK; it also provides guidelines for the use of OPAT
Wikipedia has a page on propensity score matching (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001922
PMCID: PMC4686020  PMID: 26671467
7.  Canada Chair in hypertension prevention and control: A pilot project 
A five-year pilot project was initiated in Canada to fund an individual to lead the effort in improving hypertension prevention and control. As the initial recipient of the funding, the author’s objectives were to provide leadership to improve the management of hypertension through enhancements to the Canadian Hypertension Education Program, to increase public knowledge of hypertension, to reduce the prevalence of hypertension by reducing dietary sodium additives and to develop a comprehensive hypertension surveillance program. The initiative has received strong support from the hypertension community, the Public Health Agency of Canada, the Heart and Stroke Foundation of Canada, and many Canadian health care professional and scientific organizations. Progress has been made on all objectives. The pilot project was funded by The Canadian Hypertension Society, the Canadian Institutes of Health Research and sanofi-aventis, in partnership with Blood Pressure Canada, and will finish in July 2011.
PMCID: PMC2650759  PMID: 17534462
Education; Epidemiology; High blood pressure; Hypertension; Knowledge translation; Public health; Sodium; Surveillance
8.  Utilization of DXA Bone Mineral Densitometry in Ontario 
Executive Summary
Issue
Systematic reviews and analyses of administrative data were performed to determine the appropriate use of bone mineral density (BMD) assessments using dual energy x-ray absorptiometry (DXA), and the associated trends in wrist and hip fractures in Ontario.
Background
Dual Energy X-ray Absorptiometry Bone Mineral Density Assessment
Dual energy x-ray absorptiometry bone densitometers measure bone density based on differential absorption of 2 x-ray beams by bone and soft tissues. It is the gold standard for detecting and diagnosing osteoporosis, a systemic disease characterized by low bone density and altered bone structure, resulting in low bone strength and increased risk of fractures. The test is fast (approximately 10 minutes) and accurate (exceeds 90% at the hip), with low radiation (1/3 to 1/5 of that from a chest x-ray). DXA densitometers are licensed as Class 3 medical devices in Canada. The World Health Organization has established criteria for osteoporosis and osteopenia based on DXA BMD measurements: osteoporosis is defined as a BMD more than 2.5 standard deviations below the mean BMD for normal young adults (i.e., T-score < −2.5), while osteopenia is defined as a BMD between 1 and 2.5 standard deviations below the mean for normal young adults (i.e., −2.5 ≤ T-score < −1). DXA densitometry is presently an insured health service in Ontario.
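The WHO thresholds translate directly into a short classification rule. A sketch in Python; the young-adult reference mean and standard deviation below are hypothetical illustrative values, since real T-scores use sex- and device-specific reference data.

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score: how many SDs a measured BMD lies from the young-adult mean."""
    return (bmd - young_adult_mean) / young_adult_sd

def classify(t):
    """WHO categories from a DXA T-score."""
    if t < -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "normal"

# Hypothetical reference values (g/cm^2) for illustration only.
mean, sd = 1.0, 0.12
for bmd in (1.02, 0.85, 0.65):
    t = t_score(bmd, mean, sd)
    print(round(t, 2), classify(t))
```

Note the boundary convention: a T-score of exactly −2.5 falls in the osteopenia range, matching the WHO definition quoted above.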
Clinical Need
 
Burden of Disease
The Canadian Multicenter Osteoporosis Study (CaMos) found that 16% of Canadian women and 6.6% of Canadian men have osteoporosis based on the WHO criteria, with prevalence increasing with age. Osteopenia was found in 49.6% of Canadian women and 39% of Canadian men. In Ontario, it is estimated that nearly 530,000 Ontarians have some degree of osteoporosis. Osteoporosis-related fragility fractures occur most often in the wrist, femur and pelvis. These fractures, particularly those in the hip, are associated with increased mortality, and decreased functional capacity and quality of life. A Canadian study showed that at 1 year after a hip fracture, the mortality rate was 20%. Another 20% required institutional care, 40% were unable to walk independently, and there was lower health-related quality of life due to pain, decreased mobility, and decreased ability to self-care. The cost of osteoporosis and osteoporotic fractures in Canada was estimated to be $1.3 billion in 1993.
Guidelines for Bone Mineral Density Testing
With 2 exceptions, almost all guidelines address only women. None of the guidelines recommend blanket population-based BMD testing. Instead, all guidelines recommend BMD testing in people at risk of osteoporosis, predominantly women aged 65 years or older. For women under 65 years of age, BMD testing is recommended only if one major or two minor risk factors for osteoporosis exist. Osteoporosis Canada did not restrict its recommendations to women, and thus their guidelines apply to both sexes. Major risk factors are age greater than or equal to 65 years, a history of previous fractures, family history (especially parental history) of fracture, and medication or disease conditions that affect bone metabolism (such as long-term glucocorticoid therapy). Minor risk factors include low body mass index, low calcium intake, alcohol consumption, and smoking.
Current Funding for Bone Mineral Density Testing
The Ontario Health Insurance Program (OHIP) Schedule presently reimburses DXA BMD at the hip and spine. Measurements at both sites are required if feasible. Patients at low risk of accelerated bone loss are limited to one BMD test within any 24-month period, but there are no restrictions on people at high risk. The total fee including the professional and technical components for a test involving 2 or more sites is $106.00 (Cdn).
Method of Review
This review consisted of 2 parts. The first part was an analysis of Ontario administrative data relating to DXA BMD, wrist and hip fractures, and use of antiresorptive drugs in people aged 65 years and older. The Institute for Clinical Evaluative Sciences extracted data from the OHIP claims database, the Canadian Institute for Health Information hospital discharge abstract database, the National Ambulatory Care Reporting System, and the Ontario Drug Benefit database using OHIP and ICD-10 codes. The data was analyzed to examine the trends in DXA BMD use from 1992 to 2005, and to identify areas requiring improvement.
The second part included systematic reviews and analyses of evidence relating to issues identified in the analyses of utilization data. Altogether, 8 reviews and qualitative syntheses were performed, consisting of 28 published systematic reviews and/or meta-analyses, 34 randomized controlled trials, and 63 observational studies.
Findings of Utilization Analysis
Analysis of administrative data showed a 10-fold increase in the number of BMD tests in Ontario between 1993 and 2005.
OHIP claims for BMD tests are presently increasing at a rate of 6 to 7% per year. Approximately 500,000 tests were performed in 2005/06 with an age-adjusted rate of 8,600 tests per 100,000 population.
Women accounted for 90% of all BMD tests performed in the province.
In 2005/06, there was a 2-fold variation in the rate of DXA BMD tests across Local Health Integration Networks, but a 10-fold variation between the county with the highest rate (Toronto) and the county with the lowest rate (Kenora). The analysis also showed that:
With the increased use of BMD testing, there was a concomitant increase in the use of antiresorptive drugs (as shown in people aged 65 years and older) and a decrease in the rate of hip fractures in people aged 50 years and older.
Repeat BMD made up approximately 41% of all tests. Most of the people (>90%) who had annual BMD tests in a 2-year or 3-year period were coded as being at high risk for osteoporosis.
18% (20,865) of the people who had a repeat BMD test within a 24-month period and 34% (98,058) of the people who had one BMD test in a 3-year period were under 65 years of age, had no fracture in that year, and were coded as low-risk.
Only 19% of people aged over 65 years underwent BMD testing, and 41% received osteoporosis treatment, during the year following a fracture.
Men accounted for 24% of all hip fractures and 21% of all wrist fractures, but only 10% of BMD tests. The rates of BMD tests and treatment in men after a fracture were only half of those in women.
In both men and women, the rate of hip and wrist fractures mainly increased after age 65 with the sharpest increase occurring after age 80 years.
Findings of Systematic Review and Analysis
Serial Bone Mineral Density Testing for People Not Receiving Osteoporosis Treatment
A systematic review showed that the mean rate of bone loss in people not receiving osteoporosis treatment (including postmenopausal women) is generally less than 1% per year. Higher rates of bone loss were reported for people with disease conditions or on medications that affect bone metabolism. To be considered a genuine biological change, the change in BMD between serial measurements must exceed the least significant change of the testing, which ranges from 2.77% to 8% for measurement precisions of 1% to 3%, respectively. Progression in BMD was analyzed using different baseline BMD values, rates of bone loss, measurement precisions, and BMD thresholds for initiating treatment. The analyses showed that serial BMD measurements every 24 months (as per OHIP policy for low-risk individuals) are not necessary for people with no major risk factors for osteoporosis, provided that the baseline BMD is normal (T-score ≥ –1) and the rate of bone loss is no more than 1% per year. For such a person, the change in BMD is not likely to exceed the least significant change (even for a 1% precision) in less than 3 years after the baseline test, and BMD is not likely to drop to a level that requires initiation of treatment in less than 16 years after the baseline test.
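The arithmetic behind these timelines can be sketched as follows. This is an illustrative sketch, not taken from the report: it assumes the conventional 95% least-significant-change formula (LSC = 2.77 × precision), a steady bone-loss rate, and roughly 12% of BMD per T-score unit (a typical population standard deviation); the helper function names are my own.

```python
# Illustrative sketch (assumptions noted in the text above, not the
# report's actual model) of the least-significant-change arithmetic.

def least_significant_change(precision_pct):
    """95% LSC for comparing two serial measurements, as % of baseline BMD."""
    return 2.77 * precision_pct

def years_to_exceed_lsc(precision_pct, loss_pct_per_year):
    """Years of steady bone loss before the change exceeds the LSC."""
    return least_significant_change(precision_pct) / loss_pct_per_year

def years_to_treatment_threshold(baseline_t, treatment_t=-2.5,
                                 loss_pct_per_year=1.0, pct_per_t_unit=12.0):
    """Years until BMD falls from baseline_t to the treatment threshold
    (assumes ~12% of BMD per T-score unit, a hypothetical round figure)."""
    drop_pct = (baseline_t - treatment_t) * pct_per_t_unit
    return drop_pct / loss_pct_per_year

print(least_significant_change(1.0))        # 2.77 (% change needed at 1% precision)
print(years_to_exceed_lsc(1.0, 1.0))        # 2.77 (years, consistent with "not in <3 years")
print(years_to_treatment_threshold(-1.0))   # 18.0 (years, same order as the report's 16)
```

Under these stated assumptions, a borderline-normal baseline (T-score of exactly –1) losing 1% per year takes close to 3 years for a detectable change and well over a decade to reach a typical treatment threshold, which is the shape of the report's argument against 24-month retesting in low-risk people.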
Serial Bone Mineral Density Testing in People Receiving Osteoporosis Therapy
Seven published meta-analyses of randomized controlled trials (RCTs) and 2 recent RCTs on BMD monitoring during osteoporosis therapy showed that although larger increases in BMD were generally associated with reduced risk of fracture, the change in BMD explained only a small percentage of the fracture risk reduction.
Studies showed that some people with small or no increase in BMD during treatment experienced significant fracture risk reduction, indicating that other factors such as improved bone microarchitecture might have contributed to fracture risk reduction.
There is conflicting evidence relating to the role of BMD testing in improving patient compliance with osteoporosis therapy.
Even though BMD may not be a perfect surrogate for reduction in fracture risk when monitoring responses to osteoporosis therapy, experts advised that it is still the only reliable test available for this purpose.
A systematic review conducted by the Medical Advisory Secretariat showed that the magnitude of increases in BMD during osteoporosis drug therapy varied among medications. Although most of the studies yielded mean percentage increases in BMD from baseline that did not exceed the least significant change for a 2% precision after 1 year of treatment, there were some exceptions.
Bone Mineral Density Testing and Treatment After a Fragility Fracture
A review of 3 published pooled analyses of observational studies and 12 prospective population-based observational studies showed that the presence of any prevalent fracture increases the relative risk for future fractures by approximately 2-fold or more. A review of 10 systematic reviews of RCTs and 3 additional RCTs showed that therapy with antiresorptive drugs significantly reduced the risk of vertebral fractures by 40 to 50% in postmenopausal osteoporotic women and osteoporotic men, and 2 antiresorptive drugs also reduced the risk of nonvertebral fractures by 30 to 50%. Evidence from observational studies in Canada and other jurisdictions suggests that patients who had undergone BMD measurements, particularly if a diagnosis of osteoporosis is made, were more likely to be given pharmacologic bone-sparing therapy. Despite these findings, the rate of BMD investigation and osteoporosis treatment after a fracture remained low (<20%) in Ontario as well as in other jurisdictions.
Bone Mineral Density Testing in Men
There are presently no specific Canadian guidelines for BMD screening in men. A review of the literature suggests that risk factors for fracture and the rate of vertebral deformity are similar for men and women, but the mortality rate after a hip fracture is higher in men compared with women. Two bisphosphonates had been shown to reduce the risk of vertebral and hip fractures in men. However, BMD testing and osteoporosis treatment were proportionately low in Ontario men in general, and particularly after a fracture, even though men accounted for 25% of the hip and wrist fractures. The Ontario data also showed that the rates of wrist fracture and hip fracture in men rose sharply in the 75- to 80-year age group.
Ontario-Based Economic Analysis
The economic analysis focused on analyzing the economic impact of decreasing future hip fractures by increasing the rate of BMD testing in men and women age greater than or equal to 65 years following a hip or wrist fracture. A decision analysis showed the above strategy, especially when enhanced by improved reporting of BMD tests, to be cost-effective, resulting in a cost-effectiveness ratio ranging from $2,285 (Cdn) per fracture avoided (worst-case scenario) to $1,981 (Cdn) per fracture avoided (best-case scenario). A budget impact analysis estimated that shifting utilization of BMD testing from the low risk population to high risk populations within Ontario would result in a saving of $0.85 million to $1.5 million (Cdn) to the health system. The potential net saving was estimated at $1.2 million to $5 million (Cdn) when the downstream cost-avoidance due to prevention of future hip fractures was factored into the analysis.
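The two summary measures quoted above can be illustrated with a trivial sketch. All inputs below are made-up round numbers; the report's actual decision model and budget-impact inputs are not reproduced.

```python
# Hypothetical illustration of the report's summary measures:
# cost per fracture avoided, and net saving once downstream
# cost-avoidance from prevented fractures is counted.

def cost_per_fracture_avoided(incremental_program_cost, fractures_avoided):
    """Cost-effectiveness ratio: extra spending per hip fracture avoided."""
    return incremental_program_cost / fractures_avoided

def net_saving(direct_system_saving, downstream_cost_avoided):
    """Total saving when avoided future fracture costs are factored in."""
    return direct_system_saving + downstream_cost_avoided

print(cost_per_fracture_avoided(200_000, 100))  # 2000.0 ($ per fracture avoided)
print(net_saving(1_000_000, 2_500_000))         # 3500000 ($ net saving)
```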
Other Factors for Consideration
There is a lack of standardization for BMD testing in Ontario. Two different standards are presently in use, and experts suggest that variability in results from different facilities may lead to unnecessary testing. There is also no requirement for standardized equipment, procedures, or reporting formats. The current reimbursement policy for BMD testing encourages serial testing in people at low risk of accelerated bone loss; this review showed that testing every 2 years is not necessary in all such cases. The lack of a database to collect clinical data on BMD testing makes it difficult to evaluate the clinical profiles of patients tested and the outcomes of the tests. Ministry initiatives are in progress under the Osteoporosis Program to develop a mandatory standardized requisition form for BMD tests to facilitate data collection and clinical decision-making. Work is also underway to develop guidelines for BMD testing in men and in perimenopausal women.
Conclusion
Increased use of BMD in Ontario since 1996 appears to be associated with increased use of antiresorptive medication and a decrease in hip and wrist fractures.
Data suggest that as many as 20% (98,000) of the DXA BMD tests in Ontario in 2005/06 were performed in people aged less than 65 years who had no fracture in the current year and were coded as being at low risk for accelerated bone loss; this is not consistent with current guidelines. Even though some of these people might have been incorrectly coded as low-risk, the number of tests in people truly at low risk could still be substantial.
Approximately 4% (21,000) of the DXA BMD tests in 2005/06 were repeat BMD tests in low-risk individuals within a 24-month period. Even though this is in compliance with current OHIP reimbursement policies, evidence showed that serial BMD testing every 2 years is not necessary in individuals without major risk factors for fractures, provided that the baseline BMD is normal (T-score ≥ –1). In this population, BMD measurements may be repeated 3 to 5 years after the baseline test to establish the rate of bone loss, and further serial BMD tests may not be necessary for another 7 to 10 years if the rate of bone loss is no more than 1% per year. The precision of the test needs to be considered when interpreting serial BMD results.
Although changes in BMD may not be the perfect surrogate for reduction in fracture risk as a measure of response to osteoporosis treatment, experts advised that it is presently the only reliable test for monitoring response to treatment and to help motivate patients to continue treatment. Patients should not discontinue treatment if there is no increase in BMD after the first year of treatment. Lack of response or bone loss during treatment should prompt the physician to examine whether the patient is taking the medication appropriately.
Men and women who have had a fragility fracture at the hip, spine, wrist or shoulder are at increased risk of having a future fracture, but this population is presently under investigated and under treated. Additional efforts have to be made to communicate to physicians (particularly orthopaedic surgeons and family physicians) and the public about the need for a BMD test after fracture, and for initiating treatment if low BMD is found.
Men had a disproportionately low rate of BMD tests and osteoporosis treatment, especially after a fracture. Evidence and fracture data showed that the risk of hip and wrist fractures in men rises sharply at age 70 years.
Some counties had BMD utilization rates that were only 10% of that of the county with the highest utilization. The reasons for low utilization need to be explored and addressed.
Initiatives such as aligning reimbursement policy with current guidelines, developing specific guidelines for BMD testing in men and perimenopausal women, improving BMD reports to assist in clinical decision making, developing a registry to track BMD tests, improving access to BMD tests in remote/rural counties, establishing mechanisms to alert family physicians of fractures, and educating physicians and the public, will improve the appropriate utilization of BMD tests, and further decrease the rate of fractures in Ontario. Some of these initiatives such as developing guidelines for perimenopausal women and men, and developing a standardized requisition form for BMD testing, are currently in progress under the Ontario Osteoporosis Strategy.
PMCID: PMC3379167  PMID: 23074491
9.  Public-Health and Individual Approaches to Antiretroviral Therapy: Township South Africa and Switzerland Compared 
PLoS Medicine  2008;5(7):e148.
Background
The provision of highly active antiretroviral therapy (HAART) in resource-limited settings follows a public health approach, which is characterised by a limited number of regimens and the standardisation of clinical and laboratory monitoring. In industrialized countries doctors prescribe from the full range of available antiretroviral drugs, supported by resistance testing and frequent laboratory monitoring. We compared virologic response, changes to first-line regimens, and mortality in HIV-infected patients starting HAART in South Africa and Switzerland.
Methods and Findings
We analysed data from the Swiss HIV Cohort Study and two HAART programmes in townships of Cape Town, South Africa. We included treatment-naïve patients aged 16 y or older who had started treatment with at least three drugs since 2001, and excluded intravenous drug users. Data from a total of 2,348 patients from South Africa and 1,016 patients from the Swiss HIV Cohort Study were analysed. Median baseline CD4+ T cell counts were 80 cells/μl in South Africa and 204 cells/μl in Switzerland. In South Africa, patients started with one of four first-line regimens, which was subsequently changed in 514 patients (22%). In Switzerland, 36 first-line regimens were used initially, and these were changed in 539 patients (53%). In most patients HIV-1 RNA was suppressed to 500 copies/ml or less within one year: 96% (95% confidence interval [CI] 95%–97%) in South Africa and 96% (94%–97%) in Switzerland, and 26% (22%–29%) and 27% (24%–31%), respectively, developed viral rebound within two years. Mortality was higher in South Africa than in Switzerland during the first months of HAART: adjusted hazard ratios were 5.90 (95% CI 1.81–19.2) during months 1–3 and 1.77 (0.90–3.50) during months 4–24.
Conclusions
Compared to the highly individualised approach in Switzerland, programmatic HAART in South Africa resulted in similar virologic outcomes, with relatively few changes to initial regimens. Further innovation and resources are required in South Africa to both achieve more timely access to HAART and improve the prognosis of patients who start HAART with advanced disease.
Comparing HIV treatment in Switzerland, where drug selection is individualized, and South Africa, where a programmatic approach is used, Matthias Egger and colleagues find similar virologic outcomes over two years.
Editors' Summary
Background.
Acquired immunodeficiency syndrome (AIDS) has killed more than 25 million people since the first reported case in 1981, and more than 30 million people are now infected with the human immunodeficiency virus (HIV), which causes AIDS. HIV destroys immune system cells (including CD4 cells, a type of lymphocyte), leaving infected individuals susceptible to other infections. Early in the AIDS epidemic, most HIV-infected people died within 10 years of becoming infected. Then, in 1996, highly active antiretroviral therapy (HAART)—a combination of several antiretroviral drugs—was developed. Now, in resource-rich countries, clinicians provide individually tailored care for HIV-infected people by prescribing combinations of antiretroviral drugs chosen from more than 20 approved medicines. The approach to treatment of HIV in developed countries typically also includes frequent monitoring of the amount of virus in patients' blood (viral load), viral resistance testing (to see whether any viruses are resistant to specific antiretroviral drugs), and regular CD4 cell counts (an indication of immune-system health). Since the implementation of these interventions, the health and life expectancy of people with HIV has improved dramatically in these countries.
Why Was This Study Done?
The history of HIV care in resource-poor countries has been very different. Initially, these countries could not afford to provide HAART for their populations. In 2003, however, governments, international agencies, and funding bodies began to implement plans to increase HAART coverage in developing countries. By December 2006, more than a quarter of the HIV-infected people in low- and middle-income countries who urgently needed treatment were receiving HAART. However, instead of individualized treatment, HAART programs in developing countries follow a public-health approach developed by the World Health Organization. That is, drug regimens, clinical decision-making, and clinical and laboratory monitoring are all standardized. This public-health approach takes into account the realities of under-resourced health systems, but is it as effective as the individualized approach? The researchers addressed this question by comparing virologic responses (the effect of treatment on the viral load), changes to first-line (initial) therapy, and deaths in patients receiving HAART in South Africa (public-health approach) and in Switzerland (individualized approach).
What Did the Researchers Do and Find?
The researchers analyzed data collected since 2001 from more than 2,000 patients enrolled in HAART programs in two townships (Gugulethu and Khayelitsha) in Cape Town, South Africa, and from more than 1,000 patients enrolled in the Swiss HIV Cohort Study, a nationwide study of HIV-infected people. The patients in South Africa, who had a lower starting CD4 cell count and were more likely to have advanced AIDS than the patients in Switzerland, started their treatment for HIV infection with one of four first-line therapies, and about a quarter changed to a second-line therapy during the study. By contrast, 36 first-line regimens were used in Switzerland and half the patients changed to a different regimen. Despite these differences, the viral load was greatly reduced within a year in virtually all the patients and viral rebound (an increased viral load after a low measurement) developed within 2 years in a quarter of the patients in both countries. However, more patients died in South Africa than in Switzerland, particularly during the first 3 months of therapy.
What Do These Findings Mean?
These findings suggest that the public-health approach to HAART practiced in South Africa is as effective in terms of virologic outcomes as the individualized approach practiced in Switzerland. This is reassuring because it suggests that “antiretroviral anarchy” (the unregulated use of antiretroviral drugs, interruptions in drug supplies, and the lack of treatment monitoring), which is likely to lead to the emergence of viral resistance, is not happening in South Africa as some experts feared it might. Thus, these findings support the continued rollout of the public-health approach to HAART in resource-poor countries. Conversely, they also suggest that a more standardized approach to HAART could be taken in Switzerland (and in other industrialized countries) without compromising its effectiveness. Finally, the higher mortality in South Africa than in Switzerland, which partly reflects the many patients in South Africa in desperate need of HAART and their more advanced disease at the start of therapy, suggests that HIV-infected patients in South Africa and in other resource-limited countries would benefit from earlier initiation of therapy.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050148.
The World Health Organization provides information about universal access to HIV treatment (in several languages) and on its recommendations for a public-health approach to antiretroviral therapy for HIV infection
More details on the Swiss HIV Cohort Study and on the studies in Gugulethu and Khayelitsha are available
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS
HIV InSite has comprehensive information on all aspects of HIV/AIDS, including detailed information about antiretroviral therapy and links to treatment guidelines for various countries
Information is available from Avert, an international AIDS charity, on HIV and AIDS around the world and on providing AIDS drug treatment for millions
doi:10.1371/journal.pmed.0050148
PMCID: PMC2443185  PMID: 18613745
10.  The Effect of Universal Influenza Immunization on Mortality and Health Care Use 
PLoS Medicine  2008;5(10):e211.
Background
In 2000, Ontario, Canada, initiated a universal influenza immunization program (UIIP) to provide free influenza vaccines for the entire population aged 6 mo or older. Influenza immunization increased more rapidly in younger age groups in Ontario compared to other Canadian provinces, which all maintained targeted immunization programs. We evaluated the effect of Ontario's UIIP on influenza-associated mortality, hospitalizations, emergency department (ED) use, and visits to doctors' offices.
Methods and Findings
Mortality and hospitalization data from 1997 to 2004 for all ten Canadian provinces were obtained from national datasets. Physician billing claims for visits to EDs and doctors' offices were obtained from provincial administrative datasets for four provinces with comprehensive data. Since outcomes coded as influenza are known to underestimate the true burden of influenza, we studied more broadly defined conditions. Hospitalizations, ED use, doctors' office visits for pneumonia and influenza, and all-cause mortality from 1997 to 2004 were modelled using Poisson regression, controlling for age, sex, province, influenza surveillance data, and temporal trends, and used to estimate the expected baseline outcome rates in the absence of influenza activity. The primary outcome was then defined as influenza-associated events, or the difference between the observed events and the expected baseline events. Changes in influenza-associated outcome rates before and after UIIP introduction in Ontario were compared to the corresponding changes in other provinces. After UIIP introduction, influenza-associated mortality decreased more in Ontario (relative rate [RR] = 0.26) than in other provinces (RR = 0.43) (ratio of RRs = 0.61, p = 0.002). Similar differences between Ontario and other provinces were observed for influenza-associated hospitalizations (RR = 0.25 versus 0.44, ratio of RRs = 0.58, p < 0.001), ED use (RR = 0.31 versus 0.69, ratio of RRs = 0.45, p < 0.001), and doctors' office visits (RR = 0.21 versus 0.52, ratio of RRs = 0.41, p < 0.001). Sensitivity analyses were carried out to assess consistency, specificity, and the presence of a dose-response relationship. Limitations of this study include the ecological study design, the nonspecific outcomes, difficulty in modeling baseline events, data quality and availability, and the inability to control for potentially important confounders.
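The study's primary outcome definition, influenza-associated events as the gap between observed events and the modelled no-influenza baseline, can be sketched with toy numbers. This is a simplified illustration with hypothetical weekly counts; the study estimated the baseline with Poisson regression adjusted for age, sex, province, surveillance data, and temporal trends, none of which is reproduced here.

```python
# Sketch of the "influenza-associated events" definition: observed counts
# minus the expected baseline in the absence of influenza activity.
# Inputs are hypothetical; the study derived the baseline from a
# Poisson regression model, not a fixed series as here.

def influenza_associated(observed, expected_baseline):
    """Excess events attributed to influenza in each period (floored at 0)."""
    return [max(o - e, 0) for o, e in zip(observed, expected_baseline)]

weekly_observed = [120, 150, 300, 280, 130]   # hypothetical weekly counts
weekly_baseline = [115, 118, 120, 119, 117]   # hypothetical no-influenza baseline

print(influenza_associated(weekly_observed, weekly_baseline))
# [5, 32, 180, 161, 13]
```

The before-versus-after comparison then contrasts the change in these excess rates in Ontario with the change in the other provinces (the "ratio of RRs" in the abstract).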
Conclusions
Compared to targeted programs in other provinces, introduction of universal vaccination in Ontario in 2000 was associated with relative reductions in influenza-associated mortality and health care use. The results of this large-scale natural experiment suggest that universal vaccination may be an effective public health measure for reducing the annual burden of influenza.
Comparing influenza-related mortality and health care use between Ontario and other Canadian provinces, Jeffrey Kwong and colleagues find evidence that Ontario's universal vaccination program has reduced the burden of influenza.
Editors' Summary
Background.
Seasonal outbreaks (epidemics) of influenza—a viral disease of the nose, throat, and airways—affect millions of people and kill about 500,000 individuals every year. These epidemics occur because of “antigenic drift”: small but frequent changes in the viral proteins to which the human immune system responds mean that an immune response produced one year by exposure to an influenza virus provides only partial protection against influenza the next year. Immunization can boost this natural immunity and reduce a person's chances of catching influenza. That is, an injection of killed influenza viruses can be used to prime the immune system so that it responds quickly and efficiently when exposed to live virus. However, because of antigenic drift, for influenza immunization to be effective, it has to be repeated annually with a vaccine that contains the major circulating strains of the influenza virus.
Why Was This Study Done?
Public-health organizations recommend targeted vaccination programs, so that elderly people, infants, and chronically ill individuals—the people most likely to die from pneumonia and other complications of influenza—receive annual influenza vaccination. Some experts argue, however, that universal vaccination might provide populations with better protection from influenza, both directly by increasing the number of vaccinated people and indirectly through “herd immunity,” which occurs when a high proportion of the population is immune to an infectious disease, so that even unvaccinated people are unlikely to become infected (because infected people rarely come into contact with susceptible people). In this study, the researchers compare the effects of the world's first free universal influenza immunization program (UIIP), which started in 2000 in the Canadian province of Ontario, on influenza-associated deaths and health care use with the effects of targeted vaccine programs on the same outcomes elsewhere in Canada.
What Did the Researchers Do and Find?
Using national records, the researchers collected data on influenza vaccination, on all deaths, and on hospitalizations for pneumonia and influenza in all Canadian provinces between 1997 and 2004. They also collected data on emergency department and doctors' office visits for pneumonia and influenza for Ontario, Quebec, Alberta, and Manitoba. They then used a mathematical model to estimate the baseline rates for these outcomes in the absence of influenza activity, and from these calculated weekly rates for deaths and health care use specifically resulting from influenza. In 1996–1997, 18% of the population was vaccinated against influenza in Ontario whereas in the other provinces combined the vaccination rate was 13%. On average, since 2000—the year in which UIIP was introduced in Ontario—vaccination rates have risen to 38% and 24% in Ontario and the other provinces, respectively. Since the introduction of UIIP, the researchers report, influenza-associated deaths have decreased by 74% in Ontario but by only 57% in the other provinces combined. Influenza-associated use of health care facilities has also decreased more in Ontario than in the other provinces over the same period.
What Do These Findings Mean?
These findings are limited by some aspects of the study design. For example, they depend on the accuracy of the assumptions made when calculating events due specifically to influenza, and on the availability and accuracy of vaccination and clinical outcome data. In addition, it is possible that influenza-associated deaths and health care use may have decreased more in Ontario than in the other Canadian provinces because of some unrecognized health care changes specific to Ontario but unrelated to the introduction of universal influenza vaccination. Nevertheless, these findings indicate that, compared to the targeted vaccination programs in the other Canadian provinces, the Ontarian UIIP is associated with reductions in influenza-associated deaths and health care use, particularly in people younger than 65 years old. This effect is seen at a level of vaccination unlikely to produce herd immunity so might be more marked if the uptake of vaccination could be further increased. Thus, although it is possible that Canada is a special case, these findings suggest that universal influenza vaccination might be an effective way to reduce the global burden of influenza.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050211.
Read the related PLoS Medicine Perspective by Cécile Viboud and Mark Miller
A related PLoS Medicine Research Article by Carline van den Dool and colleagues is also available
The Ontario Ministry of Health provides information on its universal influenza immunization program (in English and French)
The World Health Organization provides information on influenza and on influenza vaccines (in several languages)
The US Centers for Disease Control and Prevention provide information for patients and professionals on all aspects of influenza (in English and Spanish)
MedlinePlus provides a list of links to other information about influenza (in English and Spanish)
The UK National Health Service provides information about the science of immunization, including a simple explanatory animation of immunity
doi:10.1371/journal.pmed.0050211
PMCID: PMC2573914  PMID: 18959473
11.  Pulmonary rehabilitation in Canada: A report from the Canadian Thoracic Society COPD Clinical Assembly 
Pulmonary rehabilitation (PR) is a cornerstone in the management of individuals with chronic respiratory disease. Despite its central role in improving patient self-management and lowering health care costs, access to PR in Canada remains paradoxically low. Although recent investigations of PR in the United States and Europe have been published, these studies were not comprehensive, and did not address several key aspects of PR delivery and care. The most recent survey of PR programs in Canada was conducted a decade ago; however, it focused only on larger facilities and highlighted an extremely low capacity of PR programs. Furthermore, newer, more novel methods of PR delivery have been explored in some provinces since then. This comprehensive, survey-based study aimed to gauge the capacity and program characteristics of all PR programs in Canada.
BACKGROUND:
Pulmonary rehabilitation (PR) is a recommended intervention in the management of individuals with chronic lung disease. It is important to study the characteristics and capacity of programs in Canada to confirm best practices and identify future areas of program improvement and research.
OBJECTIVE:
To identify all Canadian PR programs, regardless of setting, and to comprehensively describe all aspects of PR program delivery. The present article reports the results of the survey related to type of program, capacity and program characteristics.
METHODS:
All hospitals in Canada were contacted to identify PR programs. A representative from each program completed a 175-item online survey encompassing 16 domains, 10 of which are reported in the present article.
RESULTS:
A total of 155 facilities in Canada offered PR, of which 129 returned surveys (83% response rate). PR programs were located in all provinces, but none in the three territories. Most (60%) programs were located in hospital settings, 24% were in public health units and 8% in recreation centres. The national capacity of programs was estimated to be 10,280 patients per year, resulting in 0.4% of all Canadians with chronic obstructive pulmonary disease (COPD) and 0.8% of Canadians with moderate to severe COPD having access to PR. COPD, interstitial lung disease, and asthma were the most common diagnoses of patients. The majority of programs had at least four health care professionals involved; 9% had only one health care professional involved.
CONCLUSION:
The present comprehensive survey of PR in Canada reports an increase in the number of programs and the total number of patients enrolled since the previous survey in 2005. However, PR capacity has not kept pace with demand, with only 0.4% of Canadians with COPD having access.
PMCID: PMC4470547  PMID: 25848802
Canada; Pulmonary rehabilitation; Survey
12.  Prevalence of antimicrobial use in a network of Canadian hospitals in 2002 and 2009 
The Canadian Nosocomial Infection Surveillance Program has been performing surveillance of antibiotic-resistant organisms in Canada since 1994. The authors compared two point-prevalence surveys of antimicrobial use conducted in participating hospitals in 2002 and 2009. The changes in antimicrobial use over time are presented, in addition to potential reasons for and consequences of these changes.
BACKGROUND:
Increasing antimicrobial resistance has been identified as an important global health threat. Antimicrobial use is a major driver of resistance, especially in the hospital sector. Understanding the extent and type of antimicrobial use in Canadian hospitals will aid in developing national antimicrobial stewardship priorities.
METHODS:
In 2002 and 2009, as part of one-day prevalence surveys to quantify hospital-acquired infections in Canadian Nosocomial Infection Surveillance Program hospitals, data were collected on the use of systemic antimicrobial agents in all patients in participating hospitals. Specific agents in use (other than antiviral and antiparasitic agents) on the survey day and patient demographic information were collected.
RESULTS:
In 2002, 2460 of 6747 patients (36.5%) in 28 hospitals were receiving antimicrobial therapy. In 2009, 3989 of 9953 (40.1%) patients in 44 hospitals were receiving antimicrobial therapy (P<0.001). Significantly increased use was observed in central Canada (37.4% to 40.8%) and western Canada (36.9% to 41.1%) but not in eastern Canada (32.9% to 34.1%). In 2009, antimicrobial use was most common on solid organ transplant units (71.0% of patients), intensive care units (68.3%) and hematology/oncology units (65.9%). Compared with 2002, there was a significant decrease in use of first- and second-generation cephalosporins, and significant increases in use of carbapenems, antifungal agents and vancomycin in 2009. Piperacillin-tazobactam, as a proportion of all penicillins, increased from 20% in 2002 to 42.8% in 2009 (P<0.001). There was a significant increase in simultaneous use of >1 agent, from 12.0% of patients in 2002 to 37.7% in 2009.
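The headline comparison (36.5% of 6747 patients in 2002 vs 40.1% of 9953 in 2009) can be checked with a standard pooled two-proportion z-test; this is an illustrative recomputation, not necessarily the test the authors used.

```python
# Illustrative check of the reported 2002-vs-2009 prevalence difference
# (2460/6747 vs 3989/9953) using a pooled two-proportion z-test.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p2 - p1) / se

z = two_proportion_z(2460, 6747, 3989, 9953)
print(round(z, 2))  # z well beyond 3.29, consistent with two-sided P < 0.001
```

A z statistic above about 3.29 corresponds to a two-sided P below 0.001, in line with the abstract's reported significance.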
CONCLUSION:
From 2002 to 2009, the prevalence of antimicrobial agent use in Canadian Nosocomial Infection Surveillance Program hospitals significantly increased; additionally, increased use of broad-spectrum agents and a marked increase in simultaneous use of multiple agents were observed.
PMCID: PMC4419819  PMID: 26015790
Antimicrobial use; Hospital; Prevalence
13.  Public funding for research on antibacterial resistance in the JPIAMR countries, the European Commission, and related European Union agencies: a systematic observational analysis 
The Lancet. Infectious Diseases  2016;16(4):431-440.
Summary
Background
Antibacterial-resistant infections are rising continuously, resulting in increased morbidity and mortality worldwide. With no new antibiotic classes entering the market and the possibility of returning to the pre-antibiotic era, the Joint Programming Initiative on Antimicrobial Resistance (JPIAMR) was established to address this problem. We aimed to quantify the scale and scope of publicly funded antibacterial resistance research across JPIAMR countries and at the European Union (EU) level to identify gaps and future opportunities.
Methods
We did a systematic observational analysis examining antibacterial resistance research funding. Databases of funding organisations across 19 countries and at EU level were systematically searched for publicly funded antibacterial resistance research from Jan 1, 2007, to Dec 31, 2013. We categorised studies on the basis of the JPIAMR strategic research agenda's six priority topics (therapeutics, diagnostics, surveillance, transmission, environment, and interventions) and did an observational analysis. Only research funded by public funding bodies was collected and no private organisations were contacted for their investments. Projects in basic, applied, and clinical research, including epidemiological, public health, and veterinary research and trials were identified using keyword searches by organisations, and inclusion criteria were based on the JPIAMR strategic research agenda's six priority topics, using project titles and abstracts as filters.
Findings
We identified 1243 antibacterial resistance research projects, with a total public investment of €1·3 billion across 19 countries and at EU level, including public investment in the Innovative Medicines Initiative. Of the total amount invested in antibacterial resistance research across the time period, €646·6 million (49·5%) was invested at the national level and €659·2 million (50·5%) at the EU level. When projects were classified under the six priority topics we found that 763 (63%) of 1208 projects funded at national level were within the area of therapeutics, versus 185 (15%) in transmission, 131 (11%) in diagnostics, 53 (4%) in interventions, and only 37 (3%) in environment and 39 (3%) in surveillance.
Interpretation
This was the first systematic analysis of research funding of antibacterial resistance of this scale and scope, which relied on the availability and accuracy of data from the organisations included. Large variation was seen between countries, both in the number of projects and the associated investment, and across the six priority topics. To determine the future direction of JPIAMR countries, a clear picture of the funding landscape across Europe and Canada is needed. Countries should work together to increase the effect of research funding by strengthening national and international coordination and collaboration, harmonising research activities, and collectively pooling resources to fund multidisciplinary projects. The JPIAMR has developed a publicly available database to document the antibacterial resistance research collected, which can be used as a baseline to analyse funding from 2014 onwards.
Funding
JPIAMR and the European Commission.
doi:10.1016/S1473-3099(15)00350-3
PMCID: PMC4802226  PMID: 26708524
14.  Antibiotic Selection Pressure and Macrolide Resistance in Nasopharyngeal Streptococcus pneumoniae: A Cluster-Randomized Clinical Trial 
PLoS Medicine  2010;7(12):e1000377.
Jeremy Keenan and colleagues report that during a cluster-randomized clinical trial in Ethiopia, nasopharyngeal pneumococcal resistance to macrolides was significantly higher in communities randomized to receive azithromycin compared with untreated control communities.
Background
It is widely thought that widespread antibiotic use selects for community antibiotic resistance, though this has been difficult to prove in the setting of a community-randomized clinical trial. In this study, we used a randomized clinical trial design to assess whether macrolide resistance was higher in communities treated with mass azithromycin for trachoma, compared to untreated control communities.
Methods and Findings
In a cluster-randomized trial for trachoma control in Ethiopia, 12 communities were randomized to receive mass azithromycin treatment of children aged 1–10 years at months 0, 3, 6, and 9. Twelve control communities were randomized to receive no antibiotic treatments until the conclusion of the study. Nasopharyngeal swabs were collected from randomly selected children in the treated group at baseline and month 12, and in the control group at month 12. Antibiotic susceptibility testing was performed on Streptococcus pneumoniae isolated from the swabs using Etest strips. In the treated group, the mean prevalence of azithromycin resistance among all monitored children increased from 3.6% (95% confidence interval [CI] 0.8%–8.9%) at baseline to 46.9% (37.5%–57.5%) at month 12 (p = 0.003). In control communities, azithromycin resistance was 9.2% (95% CI 6.7%–13.3%) at month 12, significantly lower than in the treated group (p<0.0001). Penicillin resistance was identified in 0.8% (95% CI 0%–4.2%) of isolates in the control group at 1 year, and in no isolates from the treated group at baseline or 1 year.
Conclusions
This cluster-randomized clinical trial demonstrated that compared to untreated control communities, nasopharyngeal pneumococcal resistance to macrolides was significantly higher in communities randomized to intensive azithromycin treatment. Mass azithromycin distributions were given more frequently than currently recommended by the World Health Organization's trachoma program. Azithromycin use in this setting did not select for resistance to penicillins, which remain the drug of choice for pneumococcal infections.
Trial registration
www.ClinicalTrials.gov NCT00322972
Please see later in the article for the Editors' Summary
Editors' Summary
Background
In 1928, Alexander Fleming discovered penicillin, the first antibiotic (a drug that kills bacteria). By the early 1940s, scientists were able to make large quantities of penicillin and, in the following decades, several other classes of powerful antibiotics were discovered. For example, erythromycin—the first macrolide antibiotic—was developed in the early 1950s. For a time, it looked like bacteria and the diseases that they cause had been defeated. But bacteria rapidly become resistant to antibiotics. Under the “selective pressure” of an antibiotic, bacteria that have acquired a random change in their DNA that allows them to survive in the antibiotic's presence outgrow nonresistant bacteria. What's more, bacteria can transfer antibiotic resistance genes between themselves. Nowadays, antibiotic resistance is a major public health concern. Almost every type of disease-causing bacteria has developed resistance to one or more antibiotic in clinical use and multi-drug resistant bacteria are causing outbreaks of potentially fatal diseases in hospitals and in the community.
Why Was This Study Done?
Although epidemiological studies (investigations of the causes, distribution, and control of disease in populations) show a correlation between antibiotic use and antibiotic resistance in populations, such studies cannot prove that antibiotic use actually causes antibiotic resistance. It could be that the people who use more antibiotics share other characteristics that increase their chance of developing antibiotic resistance (so-called “confounding”). A causal link between antibiotic use and the development of antibiotic resistance can only be established by doing a randomized controlled trial. In such trials, groups of individuals are chosen at random to avoid confounding, given different treatments, and outcomes in the different groups compared. Here, the researchers undertake a randomized clinical trial to assess whether macrolide resistance is higher in communities treated with azithromycin for trachoma than in untreated communities. Azithromycin—an erythromycin derivative—is used to treat common bacterial infections such as middle ear infections caused by Streptococcus pneumoniae. Trachoma—the world's leading infectious cause of blindness—is caused by Chlamydia trachomatis. The World Health Organization's trachoma elimination strategy includes annual azithromycin treatment of at-risk communities.
What Did the Researchers Do and Find?
In this cluster-randomized trial (a study that randomly assigns groups of people rather than individuals to different treatments), 12 Ethiopian communities received mass azithromycin treatment of children aged 1–10 years old at 0, 3, 6, and 9 months, and 12 control communities received the antibiotic only at 12 months. The researchers took nasopharyngeal (nose and throat) swabs from randomly selected treated children at 0 and 12 months and from randomly selected control children at 12 months. They isolated S. pneumoniae from the swabs and tested the isolates for antibiotic susceptibility. 70%–80% of the children tested had S. pneumoniae in their nose or throat. In the treated group, 3.6% of monitored children were carrying azithromycin-resistant S. pneumoniae at 0 months, whereas 46.9% were doing so at 12 months—a statistically significant increase. Only 9.2% of the monitored children in the untreated group were carrying azithromycin-resistant S. pneumoniae at 12 months, a significantly lower prevalence than in the treated group. Importantly, there was no resistance to penicillin in any S. pneumoniae isolates obtained from the treated children at 0 or 12 months; one penicillin-resistant isolate was obtained from the control children.
What Do These Findings Mean?
These findings indicate that macrolide resistance is higher in nasopharyngeal S. pneumoniae in communities receiving intensive azithromycin treatment than in untreated communities. Thus, they support the idea that frequent antibiotic use selects for antibiotic resistance in populations. Although the study was undertaken in Ethiopian communities with high rates of nasopharyngeal S. pneumoniae carriage, this finding is likely to be generalizable to other settings. Importantly, these findings have no bearing on current trachoma control activities, which use less frequent antibiotic treatments and are less likely to select for azithromycin resistance. The lack of any increase in penicillin resistance, which is usually the first-line therapy for S. pneumoniae infections, is also reassuring. However, although these findings suggest that the benefits of mass azithromycin treatment for trachoma outweigh any potential adverse effects, they nonetheless highlight the importance of continued monitoring for the secondary effects of mass antibiotic distributions.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000377.
The Bugs and Drugs website provides information about antibiotic resistance and links to other resources
The US National Institute of Allergy and Infectious Diseases provides information on antimicrobial drug resistance and on diseases caused by S. pneumoniae (pneumococcal diseases)
The US Centers for Disease Control and Prevention also have information on antibiotic resistance (in English and Spanish)
The World Health Organization has information about the global threat of antimicrobial resistance and about trachoma (in several languages)
More information about the trial described in this paper is available on ClinicalTrials.gov
doi:10.1371/journal.pmed.1000377
PMCID: PMC3001893  PMID: 21179434
15.  Modelling the evolution of drug resistance in the presence of antiviral drugs 
BMC Public Health  2007;7:300.
Background
The emergence of drug resistance in treated populations and the transmission of drug resistant strains to newly infected individuals are important public health concerns in the prevention and control of infectious diseases such as HIV and influenza. Mathematical modelling may help guide the design of treatment programs and also may help us better understand the potential benefits and limitations of prevention strategies.
Methods
To explore further the potential synergies between modelling of drug resistance in HIV and in pandemic influenza, the Public Health Agency of Canada and the Mathematics for Information Technology and Complex Systems brought together selected scientists and public health experts for a workshop in Ottawa in January 2007, to discuss the emergence and transmission of HIV antiviral drug resistance, to report on progress in the use of mathematical models to study the emergence and spread of drug resistant influenza viral strains, and to recommend future research priorities.
Results
General lectures and round-table discussions were organized around the issues on HIV drug resistance at the population level, HIV drug resistance in Western Canada, HIV drug resistance at the host level (with focus on optimal treatment strategies), and drug resistance for pandemic influenza planning.
Conclusion
Some of the issues related to drug resistance in HIV and pandemic influenza can possibly be addressed using existing mathematical models, with a special focus on linking the existing models to the data obtained through the Canadian HIV Strain and DR Surveillance Program. Preliminary statistical analysis of these data carried out at PHAC, together with the general model framework developed by Dr. Blower and her collaborators, should provide further insights into the mechanisms behind the observed trends and thus could help with the prediction and analysis of future trends. The remarkable similarity between dynamic, compartmental models for the evolution of wild-type and drug-resistant strains of both HIV and pandemic influenza may provide sufficient common ground to create synergies between modellers working in these two areas. One of the key contributions of mathematical modelling to the control of infectious diseases is the quantification and design of optimal strategies; combining techniques of operations research with dynamic modelling would enhance that contribution to the prevention and control of infectious diseases.
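The dynamic, compartmental models mentioned above can be sketched in a few lines. The toy model below (susceptibles S, wild-type infections Iw, resistant infections Ir, with treatment of wild-type cases occasionally generating resistance) uses entirely hypothetical parameter values and simple Euler integration; it illustrates the modelling style discussed at the workshop, not any specific model from it.

```python
# Minimal compartmental sketch of wild-type vs drug-resistant strain
# dynamics under treatment pressure. All parameters are hypothetical.
def simulate(days=365, dt=0.1):
    S, Iw, Ir = 0.99, 0.01, 0.0   # fractions of the population
    beta = 0.3                    # transmission rate (both strains)
    gamma = 0.1                   # recovery rate without treatment
    treat = 0.5                   # fraction of wild-type cases treated
    cure = 0.2                    # recovery rate under treatment
    mu = 0.02                     # resistance-acquisition rate in treated cases
    for _ in range(int(days / dt)):
        new_w = beta * S * Iw
        new_r = beta * S * Ir
        acq = mu * treat * Iw     # treatment-driven emergence of resistance
        dS = -(new_w + new_r)
        dIw = new_w - gamma * (1 - treat) * Iw - cure * treat * Iw - acq
        dIr = new_r + acq - gamma * Ir   # resistant strain unaffected by the drug
        S += dS * dt
        Iw += dIw * dt
        Ir += dIr * dt
    return S, Iw, Ir

S, Iw, Ir = simulate()
print(Ir > Iw)  # resistant strain outlasts the wild type under this pressure
```

Because treated wild-type cases clear faster while resistant cases do not, the resistant strain gains a fitness advantage at the population level, which is the qualitative behaviour such models are used to quantify.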
doi:10.1186/1471-2458-7-300
PMCID: PMC2148062  PMID: 17953775
16.  HIV, Gender, Race, Sexual Orientation, and Sex Work: A Qualitative Study of Intersectional Stigma Experienced by HIV-Positive Women in Ontario, Canada 
PLoS Medicine  2011;8(11):e1001124.
Mona Loutfy and colleagues used focus groups to examine experiences of stigma and coping strategies among HIV-positive women in Ontario, Canada.
Background
HIV infection rates are increasing among marginalized women in Ontario, Canada. HIV-related stigma, a principal factor contributing to the global HIV epidemic, interacts with structural inequities such as racism, sexism, and homophobia. The study objective was to explore experiences of stigma and coping strategies among HIV-positive women in Ontario, Canada.
Methods and Findings
We conducted a community-based qualitative investigation using focus groups to understand experiences of stigma and discrimination and coping methods among HIV-positive women from marginalized communities. We conducted 15 focus groups with HIV-positive women in five cities across Ontario, Canada. Data were analyzed using thematic analysis to enhance understanding of the lived experiences of diverse HIV-positive women. Focus group participants (n = 104; mean age = 38 years; 69% ethnic minority; 23% lesbian/bisexual; 22% transgender) described stigma/discrimination and coping across micro (intra/interpersonal), meso (social/community), and macro (organizational/political) realms. Participants across focus groups attributed experiences of stigma and discrimination to: HIV-related stigma, sexism and gender discrimination, racism, homophobia and transphobia, and involvement in sex work. Coping strategies included resilience (micro), social networks and support groups (meso), and challenging stigma (macro).
Conclusions
HIV-positive women described interdependent and mutually constitutive relationships between marginalized social identities and inequities such as HIV-related stigma, sexism, racism, and homo/transphobia. These overlapping, multilevel forms of stigma and discrimination are representative of an intersectional model of stigma and discrimination. The present findings also suggest that micro, meso, and macro level factors simultaneously present barriers to health and well being—as well as opportunities for coping—in HIV-positive women's lives. Understanding the deleterious effects of stigma and discrimination on HIV risk, mental health, and access to care among HIV-positive women can inform health care provision, stigma reduction interventions, and public health policy.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
HIV-related stigma and discrimination—prejudice, negative attitudes, abuse, and maltreatment directed at people living with HIV—is a major factor contributing to the global HIV epidemic. HIV-related stigma, which devalues and stereotypes people living with HIV, increases vulnerability to HIV infection by reducing access to HIV prevention, testing, treatment, and support. At the personal (micro) level, HIV-related stigma can make it hard for people to take tests to determine their HIV status or to tell other people that they are HIV positive. At the social/community (meso) level, it can mean that HIV-positive people are ostracized from their communities. At the organizational/political (macro) level, it can mean that health-care workers treat HIV-positive people differently and that governments are deterred from taking fast, effective action against the HIV epidemic. In addition, HIV-related stigma is negatively associated with well-being among people living with HIV. Thus, among HIV-positive people, those who have experienced HIV-related stigma have higher levels of mental and physical illness.
Why Was This Study Done?
Racism (oppression and inequity founded on ethno-racial differences), sexism and gender discrimination (oppression and inequity based on gender bias in attitudes), and homophobia and transphobia (discrimination, fear, hostility, and violence towards nonheterosexual and transgender people, respectively) can also affect access to HIV services. However, little is known about how these different forms of stigma and discrimination interact (intersect). A better understanding of the effect of intersecting stigmas on people living with HIV could help in the development of stigma reduction interventions and HIV prevention, treatment and care programs, and could help to control global HIV infection rates. In this qualitative study (an analysis of people's attitudes and experiences rather than numerical data), the researchers investigate the intersection of HIV-related stigma, racism, sexism and gender discrimination, homophobia and transphobia among marginalized HIV-positive women in Ontario, Canada. As elsewhere in the world, HIV infection rates are increasing among women in Canada. Nearly 25% of people living with HIV in Canada are women and about a quarter of all new infections are in women. Moreover, there is a disproportionately high infection rate among marginalized women in Canada such as sex workers and lesbian, bisexual, and queer women.
What Did the Researchers Do and Find?
The researchers held 15 focus groups with 104 marginalized HIV-positive women who were recruited by word-of-mouth and through flyers circulated in community agencies serving women of diverse ethno-cultural origins. Each focus group explored topics that included challenges in daily life, medical issues and needs, and issues that were silenced within the participants' communities. The researchers analyzed the data from these focus groups using thematic analysis, an approach that identifies, analyzes, and reports themes in qualitative data. They found that women living with HIV in Ontario experienced multiple types of stigma at different levels. So, for example, women experienced HIV-related stigma at the micro (“If you're HIV-positive, you feel shameful”), meso (“The thing I hate most for people that test positive for HIV is that society ostracizes them”), and macro (“A lot of women are not getting employed because they have to disclose their status”) levels. The women also attributed their experiences of stigma and discrimination to sexism and gender discrimination, racism, homophobia and transphobia, and involvement in sex work at all three levels and described coping strategies at the micro (resilience; “I always live with hope”), meso (participation in social networks), and macro (challenging stigma) levels.
What Do These Findings Mean?
These findings indicate that marginalized HIV-positive women living in Ontario experience overlapping forms of stigma and discrimination and that these forms of stigma operate over micro, meso, and macro levels, as do the coping strategies adopted by the women. Together, these results support an intersectional model of stigma and discrimination that should help to inform discussions about the complexity of stigma and coping strategies. However, because only a small sample of nonrandomly selected women was involved in this study, these findings need to be confirmed in other groups of HIV-positive women. If confirmed, the complex system of interplay of different forms of stigma revealed here should help to inform health-care provision, stigma reduction interventions, and public-health policy, and could, ultimately, help to bring the global HIV epidemic under control.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001124.
Information is available from the US National Institute of Allergy and Infectious Diseases on HIV infection and AIDS
NAM/aidsmap provides basic information about HIV/AIDS and summaries of recent research findings on HIV care and treatment; its publication HIV and stigma deals with HIV-related stigma in the UK
Information is available from Avert, an international AIDS charity on many aspects of HIV/AIDS, including information on women, HIV, and AIDS, on HIV and AIDS stigma and discrimination, and on HIV/AIDS statistics for Canada (in English and Spanish)
The People Living with Stigma Index aims to address stigma relating to HIV and advocate on key barriers and issues perpetuating stigma; it has recently published Piecing it together for women and girls, on the gender dimensions of HIV-related stigma; its website will soon include a selection of individual stories about HIV-related stigma
Patient stories about living with HIV/AIDS are available through Avert and through the charity website Healthtalkonline
doi:10.1371/journal.pmed.1001124
PMCID: PMC3222645  PMID: 22131907
17.  Public health preparedness in Alberta: a systems-level study 
BMC Public Health  2006;6:313.
Background
Recent international and national events have brought critical attention to the Canadian public health system and how prepared the system is to respond to various types of contemporary public health threats. This article describes the study design and methods being used to conduct a systems-level analysis of public health preparedness in the province of Alberta, Canada. The project is being funded under the Health Research Fund, Alberta Heritage Foundation for Medical Research.
Methods/Design
We use an embedded, multiple-case study design, integrating qualitative and quantitative methods to measure empirically the degree of inter-organizational coordination existing among public health agencies in Alberta, Canada. We situate our measures of inter-organizational network ties within a systems-level framework to assess the relative influence of inter-organizational ties, individual organizational attributes, and institutional environmental features on public health preparedness. The relative contribution of each component is examined for two potential public health threats: pandemic influenza and West Nile virus.
Discussion
The organizational dimensions of public health preparedness depend on a complex mix of individual organizational characteristics, inter-agency relationships, and institutional environmental factors. Our study is designed to discriminate among these different system components and assess the independent influence of each on the other, as well as the overall level of public health preparedness in Alberta. While all agree that competent organizations and functioning networks are important components of public health preparedness, this study is one of the first to use formal network analysis to study the role of inter-agency networks in the development of prepared public health systems.
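Formal network analysis of the kind described above rests on a few simple measures. The toy example below computes network density and normalised degree centrality for a small inter-agency tie network; the agency names and ties are invented purely for illustration.

```python
# Toy inter-organizational network: density and degree centrality.
# Agency names and ties are hypothetical.
ties = {
    ("ProvincialLab", "RegionalHealthUnit"),
    ("RegionalHealthUnit", "Hospital"),
    ("ProvincialLab", "Hospital"),
    ("RegionalHealthUnit", "EMS"),
}
nodes = {v for edge in ties for v in edge}
n = len(nodes)

# Density: observed ties over the n*(n-1)/2 possible undirected ties.
density = len(ties) / (n * (n - 1) / 2)

# Degree centrality: each node's tie count, normalised by n - 1.
degree = {v: sum(v in e for e in ties) / (n - 1) for v in nodes}

print(round(density, 2))            # 4 of 6 possible ties -> 0.67
print(max(degree, key=degree.get))  # RegionalHealthUnit is most central
```

In a preparedness study, a denser network with no single dominant broker would suggest coordination is less vulnerable to the failure of one agency.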
doi:10.1186/1471-2458-6-313
PMCID: PMC1779785  PMID: 17194305
18.  Evidence for Community Transmission of Community-Associated but Not Health-Care-Associated Methicillin-Resistant Staphylococcus Aureus Strains Linked to Social and Material Deprivation: Spatial Analysis of Cross-sectional Data 
PLoS Medicine  2016;13(1):e1001944.
Background
Identifying and tackling the social determinants of infectious diseases has become a public health priority following the recognition that individuals with lower socioeconomic status are disproportionately affected by infectious diseases. In many parts of the world, epidemiologically and genotypically defined community-associated (CA) methicillin-resistant Staphylococcus aureus (MRSA) strains have emerged to become frequent causes of hospital infection. The aim of this study was to use spatial models with adjustment for area-level hospital attendance to determine the transmission niche of genotypically defined CA- and health-care-associated (HA)-MRSA strains across a diverse region of South East London and to explore a potential link between MRSA carriage and markers of social and material deprivation.
Methods and Findings
This study involved spatial analysis of cross-sectional data linked with all MRSA isolates identified by three National Health Service (NHS) microbiology laboratories between 1 November 2011 and 29 February 2012. The cohort of hospital-based NHS microbiology diagnostic services serves 867,254 usual residents in the Lambeth, Southwark, and Lewisham boroughs in South East London, United Kingdom (UK). Isolates were classified as HA- or CA-MRSA based on whole genome sequencing. All MRSA cases identified over 4 mo within the three-borough catchment area (n = 471) were mapped to small geographies and linked to area-level aggregated socioeconomic and demographic data. Disease mapping and ecological regression models were used to infer the most likely transmission niches for each MRSA genetic classification and to describe the spatial epidemiology of MRSA in relation to social determinants. Specifically, we aimed to identify demographic and socioeconomic population traits that explain cross-area extra variation in HA- and CA-MRSA relative risks following adjustment for hospital attendance data. We explored the potential for associations with the English Indices of Deprivation 2010 (including the Index of Multiple Deprivation and several deprivation domains and subdomains) and the 2011 England and Wales census demographic and socioeconomic indicators (including numbers of households by deprivation dimension) and indicators of population health. Both CA- and HA-MRSA were associated with household deprivation (CA-MRSA relative risk [RR]: 1.72 [1.03–2.94]; HA-MRSA RR: 1.57 [1.06–2.33]), which was correlated with hospital attendance (Pearson correlation coefficient [PCC] = 0.76).
HA-MRSA was also associated with poor health (RR: 1.10 [1.01–1.19]) and residence in communal care homes (RR: 1.24 [1.12–1.37]), whereas CA-MRSA was linked with household overcrowding (RR: 1.58 [1.04–2.41]) and wider barriers, which represent a combined score for household overcrowding, low income, and homelessness (RR: 1.76 [1.16–2.70]). CA-MRSA was also associated with recent immigration to the UK (RR: 1.77 [1.19–2.66]). For the area-level variation in RR for CA-MRSA, 28.67% was attributable to the spatial arrangement of target geographies, compared with only 0.09% for HA-MRSA. An advantage to our study is that it provided a representative sample of usual residents receiving care in the catchment areas. A limitation is that relationships apparent in aggregated data analyses cannot be assumed to operate at the individual level.
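Relative risks with 95% confidence intervals, like those quoted above, are conventionally computed from a 2x2 table using the log-RR method. The counts below are hypothetical, chosen only to illustrate the calculation, not taken from the study.

```python
# Relative risk with a 95% CI via the standard log-RR method.
# The example counts are hypothetical.
import math

def relative_risk(a, n1, b, n2):
    """RR for an exposed group (a/n1 cases) vs a reference group (b/n2)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# e.g. 30 MRSA cases among 1000 residents of deprived areas vs
# 20 among 1000 residents of non-deprived areas (hypothetical)
rr, lo, hi = relative_risk(30, 1000, 20, 1000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI excluding 1.0, as for the household-deprivation RRs reported above, indicates an association at the conventional 5% significance level.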
Conclusions
There was no evidence of community transmission of HA-MRSA strains, implying that HA-MRSA cases identified in the community originate from the hospital reservoir and are maintained by frequent attendance at health care facilities. In contrast, there was a high risk of CA-MRSA in deprived areas linked with overcrowding, homelessness, low income, and recent immigration to the UK, which was not explainable by health care exposure. Furthermore, areas adjacent to these deprived areas were themselves at greater risk of CA-MRSA, indicating community transmission of CA-MRSA. This ongoing community transmission could lead to CA-MRSA becoming the dominant strain types carried by patients admitted to hospital, particularly if successful hospital-based MRSA infection control programmes are maintained. These results suggest that community infection control programmes targeting transmission of CA-MRSA will be required to control MRSA in both the community and hospital. These epidemiological changes will also have implications for effectiveness of risk-factor-based hospital admission MRSA screening programmes.
Community-associated MRSA variants, rather than hospital-associated ones, are more readily transmitted, and control programs should therefore focus on them to limit both hospital and community infections.
Editors' Summary
Background
Addressing health inequality requires understanding the social determinants of poor health. Previous studies have suggested a link between deprived living conditions and infections with methicillin-resistant Staphylococcus aureus (MRSA), that is, strains of the common bacterium S. aureus that have acquired antibiotic resistance and are therefore more difficult to treat. MRSA was first identified in the 1960s and was for years thought of as a dangerous health-care-associated (HA-) pathogen that infects hospital patients who are predominantly older, sick, or undergoing invasive procedures. In the late 1990s, however, community-associated MRSA (CA-MRSA) emerged as a pathogen infecting healthy individuals of all ages and without recent hospital contact. Most CA-MRSA cases are contagious skin infections, and numerous outbreaks have been reported in different communities. The traditional distinction between HA-MRSA and CA-MRSA based on where transmission occurred has become problematic in recent years, because CA-MRSA transmission has also been reported in health care settings. However, as HA- and CA-MRSA strains are genetically distinct, cases can be classified by DNA sequencing regardless of where a patient got infected.
Why Was This Study Done?
With hospitals historically considered the only place of MRSA transmission, prevention efforts remain focused on health care settings. Given the changing patterns of MRSA infections, however, the need to consider HA and CA transmission settings together has been recognized. This study was designed to take a closer look at the relationship between both HA- and CA-MRSA and socioeconomic deprivation, with the ultimate aim of informing prevention efforts. The researchers selected three boroughs in South East London with a highly diverse population of approximately 850,000 residents for whom socioeconomic and demographic data were available at a high level of spatial resolution. They also had data on hospital attendance for the residents and were therefore able to account for this factor in their analysis. The study addressed the following questions: is there a link between socioeconomic deprivation and both HA- and CA-MRSA cases among the residents? What social determinants are associated with HA- and CA-MRSA cases? What are the transmission settings (i.e., community versus health care) for HA- and CA-MRSA?
What Did the Researchers Do and Find?
They analyzed data on all MRSA samples collected over 4 consecutive mo in late 2011 and early 2012 by microbiology laboratories that serve the three boroughs. Of 471 MRSA cases that occurred in residents, 392 could be classified based on genome sequencing. Of these, approximately 72% were HA-MRSA, and 26% were CA-MRSA. Approximately 2% of cases involved both HA- and CA-MRSA. All MRSA cases were mapped to 513 smaller areas (called Lower Layer Super Output Areas, or LSOAs) in the three boroughs for which extensive socioeconomic and demographic data existed. The former included data on income, employment, health, and education, the latter data on the number of individuals per household, their ages and gender, and length of residence in the UK. MRSA cases were detected in just over half of the LSOAs in the study area. The researchers then used mathematical models to determine the most likely transmission settings for each MRSA genetic classification. They also described the spatial distributions of the two in relation to socioeconomic and demographic determinants. Both CA- and HA-MRSA were associated with household deprivation, which was itself correlated with hospital attendance. HA-MRSA was also associated with poor health and with living in communal care homes, whereas CA-MRSA was linked with household overcrowding and a combination of household overcrowding, low income, and homelessness. CA-MRSA was also associated with recent immigration to the UK. Around 27% of local variation in CA-MRSA could be explained by the spatial arrangement of LSOAs, meaning areas of high risk tended to cluster. No such clustering was observed for HA-MRSA.
What Do these Findings Mean?
The results show that residents in the most deprived areas are at greater risk for MRSA. The absence of spatial clusters of HA-MRSA suggests that transmission of genetically determined HA-MRSA occurs in hospitals, with little or no transmission in the community. The most important risk factor for acquiring HA-MRSA is therefore likely to be hospital attendance as a result of deprivation. In contrast, genetically determined CA-MRSA both affects deprived areas disproportionately, and—as the clusters imply—spreads from such areas in the community. This suggests that living in deprived conditions itself is a risk factor for acquiring CA-MRSA, as is living near deprived neighbors. Some of the CA-MRSA cases are also likely imported by recent immigrants. Whereas transmission of CA-MRSA in health care settings has been reported in a number of other studies, data from this study cannot answer whether or to what extent this is the case here. However, because of ongoing transmission in the community, and because deprived residents are both more likely to have CA-MRSA and to attend a hospital, importation of CA-MRSA strains into hospitals is an obvious concern. While the researchers intentionally located the study in an area with a very diverse population, it is not clear how generalizable the findings are to other communities, either in the UK or in other countries. Nonetheless, the results justify special focus on deprived populations in the control of MRSA and are useful for the design of specific strategies for HA-MRSA and CA-MRSA.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001944.
Online information on MRSA from the UK National Health Service: http://www.nhs.uk/conditions/MRSA/Pages/Introduction.aspx
MRSA webpage from the US Centers for Disease Control and Prevention: http://www.cdc.gov/mrsa/
MRSA page from the San Francisco Department of Public Health: http://www.sfcdcp.org/mrsa.html
MedlinePlus provides links to information about MRSA, including sources in languages other than English: https://www.nlm.nih.gov/medlineplus/mrsa.html
doi:10.1371/journal.pmed.1001944
PMCID: PMC4727805  PMID: 26812054
19.  Bias in Research Grant Evaluation Has Dire Consequences for Small Universities 
PLoS ONE  2016;11(6):e0155876.
Federal funding for basic scientific research is the cornerstone of societal progress, economy, health and well-being. There is a direct relationship between financial investment in science and a nation’s scientific discoveries, making it a priority for governments to distribute public funding appropriately in support of the best science. However, research grant proposal success rate and funding level can be skewed toward certain groups of applicants, and such skew may be driven by systemic bias arising during grant proposal evaluation and scoring. Policies to best redress this problem are not well established. Here, we show that funding success and grant amounts for applications to Canada’s Natural Sciences and Engineering Research Council (NSERC) Discovery Grant program (2011–2014) are consistently lower for applicants from small institutions. This pattern persists across applicant experience levels, is consistent among three criteria used to score grant proposals, and therefore is interpreted as representing systemic bias targeting applicants from small institutions. When current funding success rates are projected forward, forecasts reveal that future science funding at small schools in Canada will decline precipitously in the next decade if skews are left uncorrected. We show that a recently adopted pilot program to bolster success by lowering standards for select applicants from small institutions will not erase funding skew, nor will several other post-evaluation corrective measures. Rather, to support objective and robust review of grant applications, it is necessary for research councils to address evaluation skew directly, by adopting procedures such as blind review of research proposals and bibliometric assessment of performance. Such measures will be important in restoring confidence in the objectivity and fairness of science funding decisions.
Likewise, small institutions can improve their research success by more strongly supporting productive researchers and developing competitive graduate programming opportunities.
doi:10.1371/journal.pone.0155876
PMCID: PMC4892638  PMID: 27258385
20.  Mortality and Hospital Stay Associated with Resistant Staphylococcus aureus and Escherichia coli Bacteremia: Estimating the Burden of Antibiotic Resistance in Europe 
PLoS Medicine  2011;8(10):e1001104.
The authors calculate excess mortality, excess hospital stay, and related hospital expenditure associated with antibiotic-resistant bacterial bloodstream infections (Staphylococcus aureus and Escherichia coli) in Europe.
Background
The relative importance of human diseases is conventionally assessed by cause-specific mortality, morbidity, and economic impact. Current estimates for infections caused by antibiotic-resistant bacteria are not sufficiently supported by quantitative empirical data. This study determined the excess number of deaths, bed-days, and hospital costs associated with blood stream infections (BSIs) caused by methicillin-resistant Staphylococcus aureus (MRSA) and third-generation cephalosporin-resistant Escherichia coli (G3CREC) in 31 countries that participated in the European Antimicrobial Resistance Surveillance System (EARSS).
Methods and Findings
The number of BSIs caused by MRSA and G3CREC was extrapolated from EARSS prevalence data and national health care statistics. Prospective cohort studies, carried out in hospitals participating in EARSS in 2007, provided the parameters for estimating the excess 30-d mortality and hospital stay associated with BSIs caused by either MRSA or G3CREC. Hospital expenditure was derived from a publicly available cost model. Trends established by EARSS were used to determine the trajectories for MRSA and G3CREC prevalence until 2015. In 2007, 27,711 episodes of MRSA BSIs were associated with 5,503 excess deaths and 255,683 excess hospital days in the participating countries, whereas 15,183 episodes of G3CREC BSIs were associated with 2,712 excess deaths and 120,065 extra hospital days. The total costs attributable to excess hospital stays for MRSA and G3CREC BSIs were 44.0 and 18.1 million Euros (63.1 and 29.7 million international dollars), respectively. Based on prevailing trends, the number of BSIs caused by G3CREC is likely to rapidly increase, outnumbering the number of MRSA BSIs in the near future.
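The cost figures above follow from multiplying excess bed-days by an average cost per bed-day from the paper's cost model. The sketch below back-derives the implied per-day figures from the reported totals purely for illustration; the per-day costs are not quoted in the paper itself.

```python
# Hedged sketch: the burden arithmetic implied by the figures above,
# assuming total cost = excess bed-days x average cost per bed-day.
# Per-day costs are back-derived here for illustration only.
mrsa_days, mrsa_cost = 255_683, 44.0e6      # excess bed-days, Euros (MRSA)
g3crec_days, g3crec_cost = 120_065, 18.1e6  # excess bed-days, Euros (G3CREC)

cost_per_day_mrsa = mrsa_cost / mrsa_days    # ~172 EUR per excess bed-day
cost_per_day_g3 = g3crec_cost / g3crec_days  # ~151 EUR per excess bed-day
total_cost = mrsa_cost + g3crec_cost         # ~62 million EUR combined

print(round(cost_per_day_mrsa), round(cost_per_day_g3), total_cost / 1e6)
```

The combined total of roughly 62 million Euros matches the aggregate figure quoted in the Editors' Summary below, allowing for rounding of the two components.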
Conclusions
Excess mortality associated with BSIs caused by MRSA and G3CREC is significant, and the prolongation of hospital stay imposes a considerable burden on health care systems. A foreseeable shift in the burden of antibiotic resistance from Gram-positive to Gram-negative infections will exacerbate this situation and is reason for concern.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Antimicrobial resistance—a consequence of the use and misuse of antimicrobial medicines—occurs when a microorganism becomes resistant (usually by mutation or acquiring a resistance gene) to an antimicrobial drug to which it was previously sensitive. Then standard treatments become ineffective, leading to persistent infections, which may spread to other people. With some notable exceptions such as TB, HIV, malaria, and gonorrhea, most of the disease burden attributable to antimicrobial resistance is caused by hospital-associated infections due to opportunistic bacterial pathogens. These bacteria often cause life-threatening or difficult-to-manage conditions such as deep tissue, wound, or bone infections, or infections of the lower respiratory tract, central nervous system, or blood stream. The two most frequent causes of blood stream infections encountered worldwide are Staphylococcus aureus and Escherichia coli.
Why Was This Study Done?
Although hospital-associated infections have gained much attention over the past decade, the overall effect of this growing phenomenon on human health and medical services has still to be adequately quantified. The researchers proposed to fill this information gap by estimating the impact—morbidity, mortality, and demands on health care services—of antibiotic resistance in Europe for two types of resistant organisms that are typically associated with resistance to multiple classes of antibiotics and can be regarded as surrogate markers for multi-drug resistance—methicillin-resistant S. aureus and third-generation cephalosporin-resistant E. coli.
What Did the Researchers Do and Find?
Recently, the Burden of Resistance and Disease in European Nations project collected representative data on the clinical impact of antimicrobial resistance throughout Europe. Combining this information with 2007 prevalence data from the European Antibiotic Resistance Surveillance System, the researchers calculated the burden of disease associated with methicillin-resistant S. aureus and third-generation cephalosporin-resistant E. coli blood stream infections. This burden of disease was expressed as excess number of deaths, excess number of days in hospital, and excess costs. Using statistical models, the researchers predicted trend-based resistance trajectories up to 2015 for the 31 participating countries in the European region.
The researchers included 1,293 hospitals from the 31 countries, covering 47% of all available acute care hospital beds in most countries, in their analysis. For S. aureus, the estimated number of blood stream infections totaled 108,434, of which 27,711 (25.6%) were methicillin-resistant. E. coli caused 163,476 blood stream infections, of which 15,183 (9.3%) were resistant to third-generation cephalosporins. An estimated 5,503 excess deaths were associated with blood stream infections caused by methicillin-resistant S. aureus (with the UK and France predicted to experience the highest excess mortality), and 2,712 excess deaths with blood stream infections caused by third-generation cephalosporin-resistant E. coli (predicted to be the highest in Turkey and the UK). The researchers also found that blood stream infections caused by both methicillin-resistant S. aureus and third-generation cephalosporin-resistant E. coli contributed respective excesses of 255,683 and 120,065 extra bed-days, accounting for an estimated extra cost of 62.0 million Euros (92.8 million international dollars). In their trend analysis, the researchers found that 97,000 resistant blood stream infections and 17,000 associated deaths could be expected in 2015, along with increases in the lengths of hospital stays and costs. Importantly, the researchers estimated that in the near future, the burden of disease associated with third-generation cephalosporin-resistant E. coli is likely to surpass that associated with methicillin-resistant S. aureus.
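As a quick arithmetic check, the resistance proportions quoted above follow directly from the reported counts:

```python
# Hedged sketch: verifying the resistance proportions quoted above
# from the reported infection counts.
sa_total, sa_mrsa = 108_434, 27_711   # S. aureus BSIs, of which MRSA
ec_total, ec_res = 163_476, 15_183    # E. coli BSIs, of which G3C-resistant

print(round(100 * sa_mrsa / sa_total, 1))  # 25.6 (% MRSA)
print(round(100 * ec_res / ec_total, 1))   # 9.3 (% resistant E. coli)
```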
What Do These Findings Mean?
These findings show that even though the blood stream infections studied represent only a fraction of the total burden of disease associated with antibiotic resistance, excess mortality associated with these infections caused by methicillin-resistant S. aureus and third-generation cephalosporin-resistant E. coli is high, and the associated prolonged length of stays in hospital imposes a considerable burden on health care systems in Europe. Importantly, a possible shift in the burden of antibiotic resistance from Gram-positive to Gram-negative infections is concerning. Such forecasts suggest that despite anticipated gains in the control of methicillin-resistant S. aureus, the increasing number of infections caused by third-generation cephalosporin-resistant Gram-negative pathogens, such as E. coli, is likely to outweigh this achievement soon. This increasing burden will have a big impact on already stretched health systems.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001104.
The World Health Organization has a fact sheet on general antimicrobial resistance
The US Centers for Disease Control and Prevention webpage on antibiotic/antimicrobial resistance includes information on educational campaigns and resources
The European Centre for Disease Control provides data about the prevalence of resistance in Europe through an interactive database
doi:10.1371/journal.pmed.1001104
PMCID: PMC3191157  PMID: 22022233
21.  How much are we spending? The estimation of research expenditures on cardiovascular disease in Canada 
Background
Cardiovascular disease (CVD) is a leading cause of death in Canada and is a priority area for medical research. The research funding landscape in Canada has changed considerably over the last few decades, as have funding levels. Our objective was to estimate the magnitude of expenditures on CVD research for the public and charitable (not-for-profit) sectors in Canada between 1975 and 2005.
Methods
To estimate research expenditures for the public and charitable sectors, we compiled a complete list of granting agencies in Canada, contacted each agency and the Canadian Institutes of Health Research (CIHR), and extracted data from the organizations’ annual reports and the Reference Lists of health research in Canada. Two independent reviewers scanned all grant and fellowship/scholarship titles (and summary/key words, when available) of all research projects funded to determine their inclusion in our analysis; only grants and fellowships/scholarships that focused on heart and peripheral vascular diseases were selected.
Results
Public/charitable sector funding increased 7.5 times, from close to $13 million (in constant dollars) in 1975 to almost $96 million (in constant dollars) in 2005 (base year). The Medical Research Council of Canada (MRCC)/CIHR and the Heart & Stroke Foundation of Canada have been the main funders of this type of research during our analysis period; the Alberta Heritage Foundation for Medical Research and the Fonds de la recherche en santé du Quebec have played major roles at the provincial level. The Indirect Costs Research Program and the Canada Foundation for Innovation have played major roles in terms of funding in the final years of our analysis period.
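The growth reported above can be restated as a compound annual growth rate (CAGR). The sketch below uses the rounded endpoint figures quoted in the text, so the result is approximate.

```python
# Hedged sketch: implied average annual growth of CVD research funding,
# computed as a CAGR from the rounded constant-dollar endpoints above.
start, end, years = 13e6, 96e6, 30   # 1975 -> 2005, constant dollars

growth_factor = end / start               # ~7.4x (reported as ~7.5x)
cagr = (end / start) ** (1 / years) - 1   # ~6.9% per year

print(round(growth_factor, 1), round(100 * cagr, 1))
```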
Conclusion
Public/charitable-funded research expenditures devoted to CVD have increased substantially over the last three decades. By international standards, the evidence suggests Canada spends less on health-related research than the UK and the US, at least in absolute terms. However, this may not be too problematic, as Canada is likely to free-ride on research undertaken elsewhere. Understanding these past trends in research funding may provide decision makers with important information for planning future research efforts. Future work in this area should include the use of our coding methods to obtain estimates of funded research for other diseases in Canada.
doi:10.1186/1472-6963-12-281
PMCID: PMC3469373  PMID: 22929001
Cardiovascular disease; Research expenditures; Health policy
22.  Alcohol Sales and Risk of Serious Assault 
PLoS Medicine  2008;5(5):e104.
Background
Alcohol is a contributing cause of unintentional injuries, such as motor vehicle crashes. Prior research on the association between alcohol use and violent injury was limited to survey-based data and the inclusion of cases from a single trauma centre, without adequate controls. Prior researchers were also unable to comprehensively capture most alcohol sales. In Ontario, most alcohol is sold through retail outlets run by the provincial government, and hospitals are financed under a provincial health care system. We assessed the risk of being hospitalized due to assault in association with retail alcohol sales across Ontario.
Methods and Findings
We performed a population-based case-crossover analysis of all persons aged 13 years and older hospitalized for assault in Ontario from 1 April 2002 to 1 December 2004. On the day prior to each assault case's hospitalization, the volume of alcohol sold at the store in closest proximity to the victim's home was compared to the volume of alcohol sold at the same store 7 d earlier. Conditional logistic regression analysis was used to determine the associated relative risk (RR) of assault per 1,000 l higher daily sales of alcohol. Of the 3,212 persons admitted to hospital for assault, nearly 25% were between the ages of 13 and 20 y, and 83% were male. A total of 1,150 assaults (36%) involved the use of a sharp or blunt weapon, and 1,532 (48%) arose during an unarmed brawl or fight. For every 1,000 l more of alcohol sold per store per day, the relative risk of being hospitalized for assault was 1.13 (95% confidence interval [CI] 1.02–1.26). The risk was accentuated for males (1.18, 95% CI 1.05–1.33), youth aged 13 to 20 y (1.21, 95% CI 0.99–1.46), and those in urban areas (1.19, 95% CI 1.06–1.35).
Conclusions
The risk of being a victim of serious assault increases with alcohol sales, especially among young urban men. Akin to reducing the risk of driving while impaired, consideration should be given to novel methods of preventing alcohol-related violence.
In a population-based case-crossover analysis, Joel Ray and colleagues find that the risk of being a victim of serious assault increases with retail alcohol sales, especially among young urban men.
Editors' Summary
Background.
Alcohol has been produced and consumed around the world since prehistoric times. In the Western world it is now the most commonly consumed psychoactive drug (a substance that changes mood, behavior, and thought processes). The World Health Organization reports that there are 76.3 million persons with alcohol use disorders worldwide. Alcohol consumption is an important factor in unintentional injuries, such as motor vehicle crashes, and in violent criminal behavior. In the United Kingdom, for example, a higher proportion of heavy drinkers than light drinkers commit violent criminal offenses. Other figures suggest that people (in particular, young men) have an increased risk of committing a criminally violent offense within 24 h of drinking alcohol. There is also some evidence that suggests that the victims as well as the perpetrators of assaults have often been drinking recently, possibly because alcohol impairs the victim's ability to judge potentially explosive situations.
Why Was This Study Done?
The researchers wanted to know more about the relationship between alcohol and intentional violence. The recognition of a clear link between driving when impaired by alcohol and motor vehicle crashes has led many countries to introduce public awareness programs that stigmatize drunk driving. If a clear link between alcohol consumption by the people involved in violent crime could also be established, similar programs might reduce alcohol-related assaults. The researchers tested the hypothesis that the risk of being hospitalized due to a violent assault increases when there are increased alcohol sales in the immediate vicinity of the victim's place of residence.
What Did the Researchers Do and Find?
The researchers did their study in Ontario, Canada for three reasons. First, Ontario is Canada's largest province. Second, the province keeps detailed computerized medical records, including records of people hospitalized from being violently assaulted. Third, most alcohol is sold in government-run shops, and the province has the infrastructure to allow daily alcohol sales to be tracked. The researchers identified more than 3,000 people over the age of 13 y who were hospitalized in the province because of a serious assault during a 32-mo period. They compared the volume of alcohol sold at the liquor store nearest to the victim's home the day before the assault with the volume sold at the same store a week earlier (this type of study is called a “case-crossover” study). For every extra 1,000 l of alcohol sold per store per day (a doubling of alcohol sales), the overall risk of being hospitalized for assault increased by 13%. The risk was highest in three subgroups of people: men (18% increased risk), youths aged 13 to 20 y (21% increased risk), and those living in urban areas (19% increased risk). At peak times of alcohol sales, the risk of assault was 41% higher than at times when alcohol sales were lowest.
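Because the reported relative risk is per 1,000 l of extra daily sales, a log-linear model of the kind fitted by conditional logistic regression implies that the RR scales multiplicatively with volume. The sketch below illustrates that scaling under this assumption; it is not the authors' fitted model.

```python
# Hedged sketch: scaling the reported RR (1.13 per 1,000 l of extra daily
# sales) to other volumes, assuming the log-linear dose-response implied
# by a conditional logistic regression model.
rr_per_1000l = 1.13

def rr_at(extra_litres):
    """RR for a given extra daily sales volume, in litres."""
    return rr_per_1000l ** (extra_litres / 1000)

print(round(rr_at(1000), 2))  # 1.13, the reported estimate
print(round(rr_at(2000), 2))  # 1.28 for twice the volume increment
```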
What Do These Findings Mean?
These findings indicate that the risk of being seriously assaulted increases with the amount of alcohol sold locally the day before the assault and show that the individuals most at risk are young men living in urban areas. Because the study considers only serious assaults and alcohol sold in shops (i.e., not including alcohol sold in bars), it probably underestimates the association between alcohol and assault. It also does not indicate whether the victim or perpetrator of the assault (or both) had been drinking, and its findings may not apply to countries with different drinking habits. Nevertheless, these findings support the idea that the consumption of alcohol contributes to the occurrence of medical injuries from intentional violence. Increasing the price of alcohol or making alcohol harder to obtain might help to reduce the occurrence of alcohol-related assaults. The researchers suggest that a particularly effective approach may be to stigmatize alcohol-related brawling, analogous to the way that driving under the influence of alcohol has been made socially unacceptable.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050104.
This study is further discussed in a PLoS Medicine Perspective by Bennetts and Seabrook
The US National Institute on Alcohol Abuse and Alcoholism provides information on all aspects of alcohol abuse, including an article on alcohol use and violence among young adults
Alcohol-related assault is examined in the British Crime Survey
Alcohol Concern, the UK national agency on alcohol misuse, provides fact sheets on the health impacts of alcohol, young people's drinking, and alcohol and crime
The Canadian Centre for Addiction and Mental Health in Toronto provides information about alcohol addiction (in English and French)
doi:10.1371/journal.pmed.0050104
PMCID: PMC2375945  PMID: 18479181
23.  Supporting chronic pain management across provincial and territorial health systems in Canada: Findings from two stakeholder dialogues 
Chronic pain is a serious health problem affecting one in five Canadians. To provide better care for patients affected by chronic pain, there is a need to identify how provinces and territories across the country can strengthen its management. In this report, the authors summarize key findings from two stakeholder dialogues that addressed the support of chronic pain management by health system decisionmakers and across health systems. An overview of examples of the progress that has been made since the dialogues is also provided.
BACKGROUND:
Chronic pain is a serious health problem given its prevalence, associated disability, impact on quality of life and the costs associated with the extensive use of health care services by individuals living with it.
OBJECTIVE:
To summarize the research evidence and elicit health system policymakers’, stakeholders’ and researchers’ tacit knowledge and views about improving chronic pain management in Canada, and about engaging provincial and territorial health system decision makers in supporting comprehensive chronic pain management.
METHODS:
For these two topics, the global and local research evidence regarding each of the two problems were synthesized in evidence briefs. Three options were generated for addressing each problem, and implementation considerations were assessed. A stakeholder dialogue regarding each topic was convened (with 29 participants in total) and the deliberations were synthesized.
RESULTS:
To inform the first stakeholder dialogue, the authors found that systematic reviews supported the use of evidence-based tools for strengthening chronic pain management, including patient education, self-management supports, interventions to implement guidelines and multidisciplinary approaches to pain management. While research evidence about patient registries/treatment-monitoring systems is limited, many dialogue participants argued that a registry/system is needed. Many saw a registry as a precondition for moving forward with other options, including creating a national network of chronic pain centres with a coordinating ‘hub’ to provide chronic pain-related decision support and a cross-payer, cross-discipline model of patient-centred primary health care-based chronic pain management. For the second dialogue, systematic reviews indicated that traditional media can be used to positively influence individual health-related behaviours, and that multistakeholder partnerships can contribute to increasing attention devoted to issues on policy agendas. Dialogue participants emphasized the need to mobilize behind an effort to build a national network that would bring together existing organizations and committed individuals.
CONCLUSIONS:
Developing a national network and, thereafter, a national pain strategy are important initiatives that garnered broad-based support during the dialogues. Efforts toward achieving this goal have been made since convening the dialogues.
PMCID: PMC4596635  PMID: 26291124
Canada; Chronic pain; Dialogue; Health systems; Pain management
24.  Ultraviolet Phototherapy Management of Moderate-to-Severe Plaque Psoriasis 
Executive Summary
Objective
The purpose of this evidence based analysis was to determine the effectiveness and safety of ultraviolet phototherapy for moderate-to-severe plaque psoriasis.
Research Questions
The specific research questions for the evidence review were as follows:
What is the safety of ultraviolet phototherapy for moderate-to-severe plaque psoriasis?
What is the effectiveness of ultraviolet phototherapy for moderate-to-severe plaque psoriasis?
Clinical Need: Target Population and Condition
Psoriasis is a common chronic, systemic inflammatory disease affecting the skin, nails and occasionally the joints and has a lifelong waxing and waning course. It has a worldwide occurrence with a prevalence of at least 2% of the general population, making it one of the most common systemic inflammatory diseases. The immune-mediated disease has several clinical presentations with the most common (85% - 90%) being plaque psoriasis.
Characteristic features of psoriasis include scaling, redness, and elevation of the skin. Patients with psoriasis may also present with a range of disabling symptoms such as pruritus (itching), pain, bleeding, or burning associated with plaque lesions and up to 30% are classified as having moderate-to-severe disease. Further, some psoriasis patients can be complex medical cases in which diabetes, inflammatory bowel disease, and hypertension are more likely to be present than in control populations and 10% also suffer from arthritis (psoriatic arthritis). The etiology of psoriasis is unknown but is thought to result from complex interactions between the environment and predisposing genes.
Management of psoriasis is related to the extent of the skin involvement, although its presence on the hands, feet, face or genitalia can present challenges. Moderate-to-severe psoriasis is managed by phototherapy and a range of systemic agents including traditional immunosuppressants such as methotrexate and cyclosporin. Treatment with modern immunosuppressant agents known as biologicals, which more specifically target the immune defects of the disease, is usually reserved for patients with contraindications and those failing or unresponsive to treatments with traditional immunosuppressants or phototherapy.
Treatment plans are based on a long-term approach to managing the disease, the patient's expectations, individual responses, and the risk of complications. There are several treatment goals, primarily to:
1) improve physical signs and secondary psychological effects,
2) reduce inflammation and control skin shedding,
3) control physical signs as long as possible, and to
4) avoid factors that can aggravate the condition.
Approaches are generally individualized because of the variable presentation, quality of life implications, co-existent medical conditions, and triggering factors (e.g. stress, infections and medications). Individual responses and commitments to therapy also present possible limitations.
Phototherapy
Ultraviolet phototherapy units have been licensed since February 1993 as class 2 devices in Canada. Units are available as hand-held devices, hand and foot devices, full-body panels, and booth styles for institutional and home use. Units are also available with a range of ultraviolet A, broad-band, and narrow-band ultraviolet B (BB-UVB and NB-UVB) lamps. After establishing appropriate ultraviolet doses, three-times-weekly treatment schedules of 20 to 25 treatments are generally needed to control symptoms.
Evidence-Based Analysis Methods
The literature search strategy employed keywords and subject headings to capture the concepts of 1) phototherapy and 2) psoriasis. The search involved runs in the following databases: Ovid MEDLINE (1996 to March Week 3 2009), Ovid MEDLINE In-Process and Other Non-Indexed Citations, EMBASE (1980 to 2009 Week 13), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination/International Agency for Health Technology Assessment. Parallel search strategies were developed for the remaining databases. Search results were limited to English-language human studies published between January 1999 and March 31, 2009. Search alerts were generated and reviewed for relevant literature up until May 31, 2009.
Inclusion Criteria

English-language reports and human studies
Ultraviolet phototherapy interventions for plaque-type psoriasis
Reports involving efficacy and/or safety outcome studies
Original reports with defined study methodology
Standardized measurements of outcome events such as technical success, safety, effectiveness, durability, quality of life, or patient satisfaction

Exclusion Criteria

Non-systematic reviews, letters, comments, and editorials
Randomized trials involving side-to-side or half-body comparisons
Randomized trials not involving an ultraviolet phototherapy intervention for plaque-type psoriasis
Trials involving dosing studies or pilot feasibility studies, or lacking control groups
Summary of Findings
A 2000 health technology evidence report on the overall management of psoriasis by the UK National Institute for Health Research (NIHR) Health Technology Assessment Programme was identified in the MAS evidence-based review. The report included 109 RCT studies published between 1966 and June 1999 involving four major treatment approaches: 51 on phototherapy, 32 on oral retinoids, 18 on cyclosporin, and five on fumarates. The absence of RCTs on methotrexate was noted, as the original studies with this agent had been performed prior to 1966.
Of the 51 RCT studies involving phototherapy, 22 involved UVA, 21 involved UVB, five involved both UVA and UVB, and three involved natural light as the source of UV. The RCT studies included comparisons of treatment schedules, ultraviolet sources, addition of adjuvant therapies, and comparisons between phototherapy and topical treatment schedules. Because of heterogeneity, no synthesis or meta-analysis could be performed. Overall, the reviewers concluded that the RCT-based evidence supported the efficacy of only five therapies: photochemotherapy or phototherapy, cyclosporin, systemic retinoids, combination topical vitamin D3 analogues (calcipotriol) and corticosteroids in combination with phototherapy, and fumarates. Although there was no RCT evidence supporting methotrexate, its efficacy for psoriasis is well known and it continues to be a treatment mainstay.
The conclusion of the NIHR evidence review was that both photochemotherapy and phototherapy were effective treatments for clearing psoriasis, although their comparative effectiveness was unknown. Despite the conclusions on efficacy, a number of issues were identified in the evidence review, and several areas for future research were discussed to address these limitations. Trials focusing on comparative effectiveness, either between ultraviolet sources or between classes of treatment such as methotrexate versus phototherapy, were recommended to refine treatment algorithms. The need for better assessment of the cost-effectiveness of therapies, considering systemic drug costs and costs of surveillance as well as drug efficacy, was also noted. Overall, the authors concluded that phototherapy and photochemotherapy had important roles in psoriasis management and were standard therapeutic options offered in dermatology practices.
The MAS evidence-based review focusing on the RCT evidence for ultraviolet phototherapy management of moderate-to-severe plaque psoriasis was performed as an update to the NIHR 2000 systematic review on treatments for severe psoriasis. In this review, an additional 26 RCT reports examining phototherapy or photochemotherapy for psoriasis were identified. Among the studies were two RCTs comparing ultraviolet wavelength sources, five RCTs comparing different forms of phototherapy, four RCTs combining phototherapy with prior spa saline bathing, nine RCTs combining phototherapy with topical agents, two RCTs combining phototherapy with the systemic immunosuppressive agents methotrexate or alefacept, one RCT comparing phototherapy with an additional light source (the excimer laser), and one comparing a combination of phototherapy and psychological intervention involving simultaneous audiotape sessions on mindfulness and stress reduction. Two trials also examined the effect of treatment setting on the effectiveness of phototherapy, one on inpatient versus outpatient therapy and one on outpatient clinic versus home-based phototherapy.
Conclusions
The conclusions of the MAS evidence-based review are outlined in Table ES1. In summary, phototherapy provides good short-term control of clinical symptoms in patients with moderate-to-severe plaque-type psoriasis who have failed or are unresponsive to management with topical agents. However, many of the evidence gaps identified in the NIHR 2000 evidence review on psoriasis management persisted. In particular, the lack of evidence on the comparative effectiveness and/or cost-effectiveness of the major treatment options for moderate-to-severe psoriasis remained. Evidence on the effectiveness and safety of longer-term strategies for disease management has also not been addressed. Evidence for the safety, effectiveness, or cost-effectiveness of phototherapy delivered in various settings is emerging but limited. In addition, because all available treatments for psoriasis – a disease with a high prevalence, chronicity, and cost – are palliative rather than curative, strategies for disease control and improvements in self-efficacy employed in other chronic disease management strategies should be investigated.
Table ES1. RCT Evidence for Ultraviolet Phototherapy Treatment of Moderate-to-Severe Plaque Psoriasis

- Phototherapy is an effective treatment for moderate-to-severe plaque psoriasis
- Narrow-band phototherapy is more effective than broad-band phototherapy for moderate-to-severe plaque psoriasis
- Oral PUVA has a greater clinical response, requires fewer treatments, and has a greater cumulative UV irradiation dose than UVB to achieve treatment effects for moderate-to-severe plaque psoriasis
- Spa salt-water baths prior to phototherapy increased the short-term clinical response of moderate-to-severe plaque psoriasis but did not decrease cumulative UV irradiation dose
- Addition of a topical agent (the vitamin D3 analogue calcipotriol) to NB-UVB did not increase mean clinical response or decrease the number of treatments or cumulative UV irradiation dose
- Methotrexate prior to NB-UVB in high-need psoriasis patients significantly increased clinical response and decreased both the number of treatment sessions and the cumulative UV irradiation dose
- Phototherapy following alefacept increased early clinical response in moderate-to-severe plaque psoriasis
- The effectiveness and safety of home NB-UVB phototherapy were not inferior to NB-UVB phototherapy provided in a clinic to patients with psoriasis referred for phototherapy. Treatment burden was lower and patient satisfaction higher with home therapy, and patients in both groups preferred future phototherapy treatments at home
Ontario Health System Considerations
A 2006 survey of ultraviolet phototherapy services in Canada identified 26 phototherapy clinics in Ontario for a population of over 12 million. At that time, the province had 177 dermatologists, and phototherapy services were provided in 28% (14/50) of its geographic regions. The majority of phototherapy services were reported to be located in densely populated areas; relatively few patients living in rural communities had access to these services. The inconvenience of multiple weekly visits for optimal phototherapy treatment effects poses additional burdens on those with travel difficulties related to health, job, or family responsibilities.
Physician OHIP billing for phototherapy services totaled 117,216 billings in 2007, representing approximately 1,800 patients in the province treated in private clinics. The number of patients treated in hospitals is difficult to estimate, as physician costs are not billed directly to OHIP in this setting. Instead, phototherapy units and services provided in hospitals are funded by hospitals' global budgets. Some hospitals in the province, however, have divested their phototherapy services, so the number of phototherapy clinics and their total capacity are currently unknown.
Technological advances have enabled changes in phototherapy treatment regimens from lengthy hospital inpatient stays to outpatient clinic visits and, more recently, to an at-home basis. When combined with a telemedicine follow-up, home phototherapy may provide an alternative strategy for improved access to service and follow-up care, particularly for those with geographic or mobility barriers. Safety and effectiveness have, however, so far been evaluated for only one phototherapy home-based delivery model. Alternate care models and settings could potentially increase service options and access, but the broader consequences of the varying cost structures and incentives that either increase or decrease phototherapy services are unknown.
Economic Analyses
The focus of the current economic analysis was to characterize the costs associated with providing NB-UVB phototherapy for plaque-type, moderate-to-severe psoriasis in different clinical settings, including home therapy. A literature review found no published cost-effectiveness (cost-utility) analyses in this area.
Hospital, Clinic, and Home Costs of Phototherapy
Costs for NB-UVB phototherapy were based on consultations with equipment manufacturers and dermatologists. Device costs applicable to the provision of NB-UVB phototherapy in hospitals, private clinics and at a patient’s home were estimated. These costs included capital costs of purchasing NB-UVB devices (amortized over 15-20 years), maintenance costs of replacing equipment bulbs, physician costs of phototherapy treatment in private clinics ($7.85 per phototherapy treatment), and medication and laboratory costs associated with treatment of moderate-to-severe psoriasis.
NB-UVB phototherapy services provided in a hospital setting were paid for by hospitals directly. Phototherapy services in private clinic and home settings were paid for by the clinic and patient, respectively, except for physician services covered by OHIP. Indirect funding was provided to hospitals as part of global budgeting and resource allocation. Home therapy services for NB-UVB phototherapy were not covered by the MOHLTC. Coverage for home-based phototherapy, however, was in some cases provided by third-party insurers.
Device costs for NB-UVB phototherapy were estimated for two types of phototherapy units: a “booth unit” consisting of 48 bulbs used in hospitals and clinics, and a “panel unit” consisting of 10 bulbs for home use. The device costs of the booth and panel units were estimated at approximately $18,600 and $2,900, respectively; simple amortization over 15 and 20 years implied yearly costs of approximately $2,500 and $150, respectively. Replacement cost for individual bulbs was about $120, resulting in total annual maintenance costs of about $8,640 and $120 for booth and panel units, respectively.
Estimated Total Costs for Ontario
Average annual cost per patient for NB-UVB phototherapy provided in the hospital, in a private clinic, or at home was estimated to be $292, $810, and $365, respectively. For comparison, treatment of moderate-to-severe psoriasis with methotrexate and cyclosporin amounted to $712 and $3,407 per patient annually, respectively; yearly costs for biological drugs were estimated to be $18,700 for alefacept-based and $20,300 for etanercept-based treatments.
Total annual costs of NB-UVB phototherapy were estimated by applying average costs to the estimated proportion of the population (age 18 or older) eligible for phototherapy treatment. The prevalence of psoriasis was estimated to be approximately 2% of the population, of which about 85% was plaque-type psoriasis and approximately 20% to 30% was considered moderate-to-severe in disease severity. An estimate of 25% for moderate-to-severe cases was used in the current economic analysis, resulting in a range of 29,400 to 44,200 cases. Approximately 21% of these patients were estimated to be using NB-UVB phototherapy, giving between 6,200 and 9,300 cases. The average (7,700 cases) was used to calculate associated costs for Ontario by treatment setting.
Total annual costs were as follows: $2.3 million in a hospital setting, $6.3 million in a private clinic setting, and $2.8 million for home phototherapy. Costs for phototherapy services provided in private clinics were greater ($810 per patient annually; total of $6.3 million annually) and differed from the same services provided in the hospital setting only in terms of additional physician costs associated with phototherapy OHIP fees.
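The case-count and cost arithmetic above can be re-derived with a short calculation. The sketch below is illustrative only: the prevalence, plaque-type, severity, and uptake proportions and the per-patient costs come from the text, while the adult-population figure is an assumption back-solved to be consistent with the reported range of 29,400 to 44,200 moderate-to-severe cases.

```python
# Illustrative re-derivation of the Ontario NB-UVB cost estimates.
# All proportions and per-patient costs are from the report text;
# adult_population is an ASSUMPTION chosen to match the reported case range.

adult_population = 8_650_000      # assumed Ontario population aged 18+ (hypothetical)

psoriasis_prevalence = 0.02       # ~2% of the population has psoriasis
plaque_fraction = 0.85            # ~85% of cases are plaque-type
mod_severe_fraction = 0.25        # 25% assumed moderate-to-severe (midpoint of 20%-30%)
phototherapy_uptake = 0.21        # ~21% estimated to use NB-UVB phototherapy

# Eligible moderate-to-severe plaque psoriasis cases, then phototherapy users
eligible_cases = (adult_population * psoriasis_prevalence
                  * plaque_fraction * mod_severe_fraction)   # ~36,800
nb_uvb_cases = eligible_cases * phototherapy_uptake          # ~7,700

# Average annual per-patient costs by treatment setting (from the text)
annual_cost_per_patient = {"hospital": 292, "private clinic": 810, "home": 365}

total_costs = {setting: nb_uvb_cases * cost
               for setting, cost in annual_cost_per_patient.items()}
# hospital ~$2.3M, private clinic ~$6.3M, home ~$2.8M
```

Under these assumptions the per-setting totals reproduce the reported figures; the clinic total exceeds the hospital total only because of the additional per-treatment physician OHIP fees.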
Keywords
Psoriasis, ultraviolet radiation, phototherapy, photochemotherapy, NB-UVB, BB-UVB, PUVA
PMCID: PMC3377497  PMID: 23074532
25.  Geographic Distribution of Staphylococcus aureus Causing Invasive Infections in Europe: A Molecular-Epidemiological Analysis 
PLoS Medicine  2010;7(1):e1000215.
Hajo Grundmann and colleagues describe the development of a new interactive mapping tool for analyzing the spatial distribution of invasive Staphylococcus aureus clones.
Background
Staphylococcus aureus is one of the most important human pathogens and methicillin-resistant variants (MRSAs) are a major cause of hospital and community-acquired infection. We aimed to map the geographic distribution of the dominant clones that cause invasive infections in Europe.
Methods and Findings
In each country, staphylococcal reference laboratories secured the participation of a sufficient number of hospital laboratories to achieve national geo-demographic representation. Participating laboratories collected successive methicillin-susceptible (MSSA) and MRSA isolates from patients with invasive S. aureus infection using an agreed protocol. All isolates were sent to the respective national reference laboratories and characterised by quality-controlled sequence typing of the variable region of the staphylococcal spa gene (spa typing), and data were uploaded to a central database. Relevant genetic and phenotypic information was assembled for interactive interrogation by a purpose-built Web-based mapping application. Between September 2006 and February 2007, 357 laboratories serving 450 hospitals in 26 countries collected 2,890 MSSA and MRSA isolates from patients with invasive S. aureus infection. A wide geographical distribution of spa types was found with some prevalent in all European countries. MSSA were more diverse than MRSA. Genetic diversity of MRSA differed considerably between countries with dominant MRSA spa types forming distinctive geographical clusters. We provide evidence that a network approach consisting of decentralised typing and visualisation of aggregated data using an interactive mapping tool can provide important information on the dynamics of MRSA populations such as early signalling of emerging strains, cross border spread, and importation by travel.
Conclusions
In contrast to MSSA, MRSA spa types have a predominantly regional distribution in Europe. This finding is indicative of the selection and spread of a limited number of clones within health care networks, suggesting that control efforts aimed at interrupting the spread within and between health care institutions may not only be feasible but ultimately successful and should therefore be strongly encouraged.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
The bacterium Staphylococcus aureus lives on the skin and in the nose of about a third of healthy people. Although S. aureus usually coexists peacefully with its human carriers, it is also an important disease-causing organism or pathogen. If it enters the body through a cut or during a surgical procedure, S. aureus can cause minor infections such as pimples and boils or more serious, life-threatening infections such as blood poisoning and pneumonia. Minor S. aureus infections can be treated without antibiotics—by draining a boil, for example. Invasive infections are usually treated with antibiotics. Unfortunately, many of the S. aureus clones (groups of bacteria that are all genetically related and descended from a single, common ancestor) that are now circulating are resistant to methicillin and several other antibiotics. Invasive methicillin-resistant S. aureus (MRSA) infections are a particular problem in hospitals and other health care facilities (so-called hospital-acquired MRSA infections), but they can also occur in otherwise healthy people who have not been admitted to a hospital (community-acquired MRSA infections).
Why Was This Study Done?
The severity and outcome of an S. aureus infection in an individual depends in part on the ability of the bacterial clone with which the individual is infected to cause disease—the clone's “virulence.” Public-health officials and infectious disease experts would like to know the geographic distribution of the virulent S. aureus clones that cause invasive infections, because this information should help them understand how these pathogens spread and thus how to control them. Different clones of S. aureus can be distinguished by “molecular typing,” the determination of clone-specific sequences of nucleotides in variable regions of the bacterial genome (the bacterium's blueprint; genomes consist of DNA, long chains of nucleotides). In this study, the researchers use molecular typing to map the geographic distribution of MRSA and methicillin-sensitive S. aureus (MSSA) clones causing invasive infections in Europe; an MRSA clone emerges when an MSSA clone acquires antibiotic resistance from another type of bacteria, so it is useful to understand the geographic distribution of both MRSA and MSSA.
What Did the Researchers Do and Find?
Between September 2006 and February 2007, 357 laboratories serving 450 hospitals in 26 European countries collected almost 3,000 MRSA and MSSA isolates from patients with invasive S. aureus infections. The isolates were sent to the relevant national staphylococcal reference laboratory (SRL) where they were characterized by quality-controlled sequence typing of the variable region of a staphylococcal gene called spa (spa typing). The spa typing data were entered into a central database and then analyzed by a public, purpose-built Web-based mapping tool (SRL-Maps), which provides interactive access and easy-to-understand illustrations of the geographical distribution of S. aureus clones. Using this mapping tool, the researchers found that there was a wide geographical distribution of spa types across Europe with some types being common in all European countries. MSSA isolates were more diverse than MRSA isolates and the genetic diversity (variability) of MRSA differed considerably between countries. Most importantly, major MRSA spa types occurred in distinct geographical clusters.
What Do These Findings Mean?
These findings provide the first representative snapshot of the genetic population structure of S. aureus across Europe. Because the researchers used spa typing, which analyzes only a small region of one gene, and characterized only 3,000 isolates, analysis of other parts of the S. aureus genome in more isolates is now needed to build a complete portrait of the geographical abundance of the S. aureus clones that cause invasive infections in Europe. However, the finding that MRSA spa types occur mainly in geographical clusters has important implications for the control of MRSA, because it indicates that a limited number of clones are spreading within health care networks, which means that MRSA is mainly spread by patients who are repeatedly admitted to different hospitals. Control efforts aimed at interrupting this spread within and between health care institutions may be feasible and ultimately successful, suggest the researchers, and should be strongly encouraged. In addition, this study shows how, by sharing typing results on a Web-based platform, an international surveillance network can provide clinicians and infection control teams with crucial information about the dynamics of pathogens such as S. aureus, including early warnings about emerging virulent clones.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000215.
This study is further discussed in a PLoS Medicine Perspective by Franklin D. Lowy
The UK Health Protection Agency provides information about Staphylococcus aureus
The UK National Health Service Choices Web site has pages on staphylococcal infections and on MRSA
The US National Institute of Allergy and Infectious Disease has information about MRSA
The US Centers for Disease Control and Prevention provides information about MRSA for the public and professionals
MedlinePlus provides links to further resources on staphylococcal infections and on MRSA (in English and Spanish)
SRL-Maps can be freely accessed
doi:10.1371/journal.pmed.1000215
PMCID: PMC2796391  PMID: 20084094
