1.  Medication safety in acute care in Australia: where are we now? Part 2: a review of strategies and activities for improving medication safety 2002-2008 
Background
This paper presents Part 2 of a literature review examining medication safety in the Australian acute care setting. This review was undertaken for the Australian Commission on Safety and Quality in Health Care, updating the 2002 national report on medication safety. Part 2 of the review examined the Australian evidence base for approaches to build safer medication systems in acute care.
Methods
A literature search was conducted to identify Australian studies and programs published from 2002 to 2008 which examined strategies and activities for improving medication safety in acute care.
Results and conclusion
Since 2002 there has been significant progress in strategies to improve prescription writing in hospitals with the introduction of a National Inpatient Medication Chart. There are also systems in place to ensure a nationally coordinated approach to the ongoing optimisation of the chart. Progress has been made with Australian research examining the implementation of computerised prescribing systems with clinical decision support. These studies have highlighted barriers and facilitators to the introduction of such systems that can inform wider implementation. However, Australian studies assessing outcomes of this strategy on medication incidents or patient outcomes are still lacking. In studies assessing education for reducing medication errors, academic detailing has been demonstrated to reduce errors in prescriptions for Schedule 8 medicines and a program was shown to be effective in reducing error-prone prescribing abbreviations. Published studies continue to support the role of clinical pharmacist services in improving medication safety. Studies on strategies to improve communication between different care settings, such as liaison pharmacist services, have focussed on implementation issues now that funding is available for community-based services. Double-checking versus single-checking by nurses and patient self-administration in hospital have been assessed in small studies. No new studies were located assessing the impact of individual patient medication supply, adverse drug event alerts or bar coding. There is still limited research assessing the impact of an integrated systems approach on medication safety in Australian acute care.
doi:10.1186/1743-8462-6-24
PMCID: PMC2754989  PMID: 19772663
2.  Effects of Two Commercial Electronic Prescribing Systems on Prescribing Error Rates in Hospital In-Patients: A Before and After Study 
PLoS Medicine  2012;9(1):e1001164.
In a before-and-after study, Johanna Westbrook and colleagues evaluate the change in prescribing error rates after the introduction of two commercial electronic prescribing systems in two Australian hospitals.
Background
Considerable investments are being made in commercial electronic prescribing systems (e-prescribing) in many countries. Few studies have measured or evaluated their effectiveness at reducing prescribing error rates, and interactions between system design and errors are not well understood, despite increasing concerns regarding new errors associated with system use. This study evaluated the effectiveness of two commercial e-prescribing systems in reducing prescribing error rates and their propensities for introducing new types of error.
Methods and Results
We conducted a before and after study involving medication chart audit of 3,291 admissions (1,923 at baseline and 1,368 post e-prescribing system) at two Australian teaching hospitals. In Hospital A, the Cerner Millennium e-prescribing system was implemented on one ward, and three wards, which did not receive the e-prescribing system, acted as controls. In Hospital B, the iSoft MedChart system was implemented on two wards and we compared before and after error rates. Procedural (e.g., unclear and incomplete prescribing orders) and clinical (e.g., wrong dose, wrong drug) errors were identified. Prescribing error rates per admission and per 100 patient days; rates of serious errors (5-point severity scale, those ≥3 were categorised as serious) by hospital and study period; and rates and categories of postintervention “system-related” errors (where system functionality or design contributed to the error) were calculated. Use of an e-prescribing system was associated with a statistically significant reduction in error rates in all three intervention wards (respectively reductions of 66.1% [95% CI 53.9%–78.3%]; 57.5% [33.8%–81.2%]; and 60.5% [48.5%–72.4%]). The use of the system resulted in a decline in errors at Hospital A from 6.25 per admission (95% CI 5.23–7.28) to 2.12 (95% CI 1.71–2.54; p<0.0001) and at Hospital B from 3.62 (95% CI 3.30–3.93) to 1.46 (95% CI 1.20–1.73; p<0.0001). This decrease was driven by a large reduction in unclear, illegal, and incomplete orders. The Hospital A control wards experienced no significant change (respectively −12.8% [95% CI −41.1% to 15.5%]; −11.3% [−40.1% to 17.5%]; −20.1% [−52.2% to 12.4%]). There was limited change in clinical error rates, but serious errors decreased by 44% (0.25 per admission to 0.14; p = 0.0002) across the intervention wards compared to the control wards (17% reduction; 0.30–0.25; p = 0.40). 
Both hospitals experienced system-related errors (0.73 and 0.51 per admission), which accounted for 35% of postsystem errors in the intervention wards; each system was associated with different types of system-related errors.
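The headline percentage reductions can be reproduced from the per-admission rates quoted above. A minimal Python sketch of the arithmetic (not part of the study itself):

```python
def percent_reduction(before: float, after: float) -> float:
    """Relative reduction in an error rate, expressed as a percentage."""
    return (before - after) / before * 100

# Per-admission prescribing error rates reported in the abstract
print(f"Hospital A: {percent_reduction(6.25, 2.12):.1f}% reduction")  # 66.1%
print(f"Hospital B: {percent_reduction(3.62, 1.46):.1f}% reduction")  # 59.7%
```

Hospital A's per-admission figure matches the 66.1% ward-level reduction reported above; Hospital B's per-admission figure sits between its two ward-level reductions (57.5% and 60.5%), as expected for a pooled rate.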
Conclusions
Implementation of these commercial e-prescribing systems resulted in statistically significant reductions in prescribing error rates. Reductions in clinical errors were limited in the absence of substantial decision support, but a statistically significant decline in serious errors was observed. System-related errors require close attention as they are frequent, but are potentially remediable by system redesign and user training. Limitations included a lack of control wards at Hospital B and an inability to randomize wards to the intervention.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Medication errors—for example, prescribing the wrong drug or giving a drug by the wrong route—frequently occur in health care settings and are responsible for thousands of deaths every year. Until recently, medicines were prescribed and dispensed using systems based on hand-written scripts. In hospitals, for example, physicians wrote orders for medications directly onto a medication chart, which was then used by the nursing staff to give drugs to their patients. However, drugs are now increasingly being prescribed using electronic prescribing (e-prescribing) systems. With these systems, prescribers use a computer and order medications for their patients with the help of a drug information database and menu items, free text boxes, and prewritten orders for specific conditions (so-called passive decision support). The system reviews the patient's medication and known allergy list and alerts the physician to any potential problems, including drug interactions (active decision support). Then after the physician has responded to these alerts, the order is transmitted electronically to the pharmacy and/or the nursing staff who administer the prescription.
Why Was This Study Done?
By avoiding the need for physicians to write out prescriptions and by providing active and passive decision support, e-prescribing has the potential to reduce medication errors. But, even though many countries are investing in expensive commercial e-prescribing systems, few studies have evaluated the effects of these systems on prescribing error rates. Moreover, little is known about the interactions between system design and errors despite fears that e-prescribing might introduce new errors. In this study, the researchers analyze prescribing error rates in hospital in-patients before and after the implementation of two commercial e-prescribing systems.
What Did the Researchers Do and Find?
The researchers examined medication charts for procedural errors (unclear, incomplete, or illegal orders) and for clinical errors (for example, wrong drug or dose) at two Australian hospitals before and after the introduction of commercial e-prescribing systems. At Hospital A, the Cerner Millennium e-prescribing system was introduced on one ward; three other wards acted as controls. At Hospital B, the researchers compared the error rates on two wards before and after the introduction of the iSoft MedChart e-prescribing system. The introduction of an e-prescribing system was associated with a substantial reduction in error rates in the three intervention wards; error rates on the control wards did not change significantly during the study. At Hospital A, medication errors declined from 6.25 to 2.12 per admission after the introduction of e-prescribing whereas at Hospital B, they declined from 3.62 to 1.46 per admission. This reduction in error rates was mainly driven by a reduction in procedural error rates and there was only a limited change in overall clinical error rates. Notably, however, the rate of serious errors decreased across the intervention wards from 0.25 to 0.14 per admission (a 44% reduction), whereas the serious error rate only decreased by 17% in the control wards during the study. Finally, system-related errors (for example, selection of an inappropriate drug located on a drop-down menu next to a likely drug selection) accounted for 35% of errors in the intervention wards after the implementation of e-prescribing.
What Do These Findings Mean?
These findings show that the implementation of these two e-prescribing systems markedly reduced hospital in-patient prescribing error rates, mainly by reducing the number of incomplete, illegal, or unclear medication orders. The limited decision support built into both the e-prescribing systems used here may explain the limited reduction in clinical error rates but, importantly, both e-prescribing systems reduced serious medication errors. Finally, the high rate of system-related errors recorded in this study is worrying but is potentially remediable by system redesign and user training. Because this was a “real-world” study, it was not possible to choose the intervention wards randomly. Moreover, there was no control ward at Hospital B, and the wards included in the study had very different specialties. These and other aspects of the study design may limit the generalizability of these findings, which need to be confirmed and extended in additional studies. Even so, these findings provide persuasive evidence of the current and potential ability of commercial e-prescribing systems to reduce prescribing errors in hospital in-patients provided these systems are continually monitored and refined to improve their performance.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001164.
ClinfoWiki has pages on medication errors and on electronic prescribing (note: the Clinical Informatics Wiki is a free online resource that anyone can add to or edit)
Electronic prescribing in hospitals: challenges and lessons learned describes the implementation of e-prescribing in UK hospitals; more information about e-prescribing in the UK is available on the NHS Connecting for Health Website
The Clinician's Guide to e-Prescribing provides up-to-date information about e-prescribing in the USA
Information about e-prescribing in Australia is also available
Information about electronic health records in Australia
doi:10.1371/journal.pmed.1001164
PMCID: PMC3269428  PMID: 22303286
3.  Hospital-at-Home Programs for Patients With Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD) 
Executive Summary
In July 2010, the Medical Advisory Secretariat (MAS) began work on a Chronic Obstructive Pulmonary Disease (COPD) evidentiary framework, an evidence-based review of the literature surrounding treatment strategies for patients with COPD. This project emerged from a request by the Health System Strategy Division of the Ministry of Health and Long-Term Care that MAS provide them with an evidentiary platform on the effectiveness and cost-effectiveness of COPD interventions.
After an initial review of health technology assessments and systematic reviews of COPD literature, and consultation with experts, MAS identified the following topics for analysis: vaccinations (influenza and pneumococcal), smoking cessation, multidisciplinary care, pulmonary rehabilitation, long-term oxygen therapy, noninvasive positive pressure ventilation for acute and chronic respiratory failure, hospital-at-home for acute exacerbations of COPD, and telehealth (including telemonitoring and telephone support). Evidence-based analyses were prepared for each of these topics. For each technology, an economic analysis was also completed where appropriate. In addition, a review of the qualitative literature on patient, caregiver, and provider perspectives on living and dying with COPD was conducted, as were reviews of the qualitative literature on each of the technologies included in these analyses.
The Chronic Obstructive Pulmonary Disease Mega-Analysis series is made up of the following reports, which can be publicly accessed at the MAS website at: http://www.hqontario.ca/en/mas/mas_ohtas_mn.html.
Chronic Obstructive Pulmonary Disease (COPD) Evidentiary Framework
Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Smoking Cessation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Pulmonary Rehabilitation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Long-term Oxygen Therapy for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Acute Respiratory Failure Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Chronic Respiratory Failure Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Hospital-at-Home Programs for Patients With Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Home Telehealth for Patients with Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease Using an Ontario Policy Model
Experiences of Living and Dying With COPD: A Systematic Review and Synthesis of the Qualitative Empirical Literature
For more information on the qualitative review, please contact Mita Giacomini at: http://fhs.mcmaster.ca/ceb/faculty_member_giacomini.htm.
For more information on the economic analysis, please visit the PATH website: http://www.path-hta.ca/About-Us/Contact-Us.aspx.
The Toronto Health Economics and Technology Assessment (THETA) collaborative has produced an associated report on patient preference for mechanical ventilation. For more information, please visit the THETA website: http://theta.utoronto.ca/static/contact.
Objective
The objective of this analysis was to compare hospital-at-home care with inpatient hospital care for patients with acute exacerbations of chronic obstructive pulmonary disease (COPD) who present to the emergency department (ED).
Clinical Need: Condition and Target Population
Acute Exacerbations of Chronic Obstructive Pulmonary Disease
Chronic obstructive pulmonary disease is a disease state characterized by airflow limitation that is not fully reversible. This airflow limitation is usually both progressive and associated with an abnormal inflammatory response of the lungs to noxious particles or gases. The natural history of COPD involves periods of acute-onset worsening of symptoms, particularly increased breathlessness, cough, and/or sputum, that go beyond normal day-to-day variations; these are known as acute exacerbations.
Two-thirds of COPD exacerbations are caused by an infection of the tracheobronchial tree or by air pollution; the cause in the remaining cases is unknown. On average, patients with moderate to severe COPD experience 2 or 3 exacerbations each year.
Exacerbations have an important impact on patients and on the health care system. For the patient, exacerbations result in decreased quality of life, potentially permanent losses of lung function, and an increased risk of mortality. For the health care system, exacerbations of COPD are a leading cause of ED visits and hospitalizations, particularly in winter.
Technology
Hospital-at-home programs offer an alternative for patients who present to the ED with an exacerbation of COPD and require hospital admission for their treatment. Hospital-at-home programs provide patients with visits in their home by medical professionals (typically specialist nurses) who monitor the patients, alter patients’ treatment plans if needed, and in some programs, provide additional care such as pulmonary rehabilitation, patient and caregiver education, and smoking cessation counselling.
There are 2 types of hospital-at-home programs: admission avoidance and early discharge hospital-at-home. In the former, admission avoidance hospital-at-home, after patients are assessed in the ED, they are prescribed the necessary medications and additional care needed (e.g., oxygen therapy) and then sent home where they receive regular visits from a medical professional. In early discharge hospital-at-home, after being assessed in the ED, patients are admitted to the hospital where they receive the initial phase of their treatment. These patients are discharged into a hospital-at-home program before the exacerbation has resolved. In both cases, once the exacerbation has resolved, the patient is discharged from the hospital-at-home program and no longer receives visits in his/her home.
In the models that exist to date, hospital-at-home programs differ from other home care programs because they manage higher-acuity patients who require more intensive care, and because hospitals retain the medical and legal responsibility for patients. Furthermore, patients requiring home care services may require such services for long periods of time or indefinitely, whereas patients in hospital-at-home programs require and receive the services for a short period of time only.
Hospital-at-home care is not appropriate for all patients with acute exacerbations of COPD. Ineligible patients include: those with mild exacerbations that can be managed without admission to hospital; those whose condition requires inpatient hospital care; and those who cannot be safely treated in a hospital-at-home program, whether for medical reasons or because of a lack of, or poor, social support at home.
The proposed possible benefits of hospital-at-home for treatment of exacerbations of COPD include: decreased utilization of health care resources by avoiding hospital admission and/or reducing length of stay in hospital; decreased costs; increased health-related quality of life for patients and caregivers when treated at home; and reduced risk of hospital-acquired infections in this susceptible patient population.
Ontario Context
No hospital-at-home programs for the treatment of acute exacerbations of COPD were identified in Ontario. Patients requiring acute care for their exacerbations are treated in hospitals.
Research Question
What is the effectiveness, cost-effectiveness, and safety of hospital-at-home care compared with inpatient hospital care of acute exacerbations of COPD?
Research Methods
Literature Search
Search Strategy
A literature search was performed on August 5, 2010, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database for studies published from January 1, 1990, to August 5, 2010. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists and health technology assessment websites were also examined for any additional relevant studies not identified through the systematic search.
Inclusion Criteria
English language full-text reports;
health technology assessments, systematic reviews, meta-analyses, and randomized controlled trials (RCTs);
studies performed exclusively in patients with a diagnosis of COPD or studies including patients with COPD as well as patients with other conditions, if results are reported for COPD patients separately;
studies performed in patients with acute exacerbations of COPD who present to the ED;
studies published between January 1, 1990, and August 5, 2010;
studies comparing hospital-at-home and inpatient hospital care for patients with acute exacerbations of COPD;
studies that include at least 1 of the outcomes of interest (listed below).
Cochrane Collaboration reviews have defined hospital-at-home programs as those that provide patients with active treatment for their acute exacerbation in their home by medical professionals for a limited period of time (in this case, until the resolution of the exacerbation). If a hospital-at-home program had not been available, these patients would have been admitted to hospital for their treatment.
Exclusion Criteria
< 18 years of age
animal studies
duplicate publications
grey literature
Outcomes of Interest
Patient/clinical outcomes
mortality
lung function (forced expiratory volume in 1 second)
health-related quality of life
patient or caregiver preference
patient or caregiver satisfaction with care
complications
Health system outcomes
hospital readmissions
length of stay in hospital and hospital-at-home
ED visits
transfer to long-term care
days to readmission
eligibility for hospital-at-home
Statistical Methods
When possible, results were pooled using Review Manager 5 Version 5.1; otherwise, results were summarized descriptively. Data from RCTs were analyzed using intention-to-treat protocols. In addition, a sensitivity analysis was done assigning all missing data/withdrawals to the event. P values less than 0.05 were considered significant. A priori subgroup analyses were planned for the acuity of hospital-at-home program, type of hospital-at-home program (early discharge or admission avoidance), and severity of the patients’ COPD. Additional subgroup analyses were conducted as needed based on the identified literature. Post hoc sample size calculations were performed using STATA 10.1.
Quality of Evidence
The quality of each included study was assessed, taking into consideration allocation concealment, randomization, blinding, power/sample size, withdrawals/dropouts, and intention-to-treat analyses.
The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. The following definitions of quality were used in grading the quality of the evidence:
High: Further research is very unlikely to change confidence in the estimate of effect.
Moderate: Further research is likely to have an important impact on confidence in the estimate of effect and may change the estimate.
Low: Further research is very likely to have an important impact on confidence in the estimate of effect and is likely to change the estimate.
Very low: Any estimate of effect is very uncertain.
Summary of Findings
Fourteen studies met the inclusion criteria and were included in this review: 1 health technology assessment, 5 systematic reviews, and 7 RCTs.
The following conclusions are based on low to very low quality of evidence. The reviewed evidence was based on RCTs that were inadequately powered to observe differences between hospital-at-home and inpatient hospital care for most outcomes, so there is a strong possibility of type II error. Given the low to very low quality of evidence, these conclusions must be considered with caution.
Approximately 21% to 37% of patients with acute exacerbations of COPD who present to the ED may be eligible for hospital-at-home care.
Of the patients who are eligible for care, some may refuse to participate in hospital-at-home care.
Eligibility for hospital-at-home care may be increased depending on the design of the hospital-at-home program, such as the size of the geographical service area for hospital-at-home and the hours of operation for patient assessment and entry into hospital-at-home.
Hospital-at-home care for acute exacerbations of COPD was associated with a nonsignificant reduction in the risk of mortality and hospital readmissions compared with inpatient hospital care during 2- to 6-month follow-up.
Limited, very low quality evidence suggests that hospital readmissions are delayed in patients who received hospital-at-home care compared with those who received inpatient hospital care (mean additional days before readmission comparing hospital-at-home to inpatient hospital care ranged from 4 to 38 days).
There is insufficient evidence to determine whether hospital-at-home care, compared with inpatient hospital care, is associated with improved lung function.
The majority of studies did not find significant differences between hospital-at-home and inpatient hospital care for a variety of health-related quality of life measures at follow-up. However, follow-up may have been too late to observe an impact of hospital-at-home care on quality of life.
A conclusion about the impact of hospital-at-home care on length of stay for the initial exacerbation (defined as days in hospital or days in hospital plus hospital-at-home care for inpatient hospital and hospital-at-home, respectively) could not be determined because of limited and inconsistent evidence.
Patient and caregiver satisfaction with care is high for both hospital-at-home and inpatient hospital care.
PMCID: PMC3384361  PMID: 23074420
4.  Introduction of Medical Emergency Teams in Australia and New Zealand: a multi-centre study 
Critical Care  2008;12(2):R46.
Introduction
Information about Medical Emergency Teams (METs) in Australia and New Zealand (ANZ) is limited to local studies and a cluster randomised controlled trial (the Medical Emergency Response and Intervention Trial [MERIT]). Thus, we sought to describe the timing of the introduction of METs into ANZ hospitals relative to relevant publications and to assess changes in the incidence and rate of intensive care unit (ICU) admissions due to a ward cardiac arrest (CA) and ICU readmissions.
Methods
We used the Australian and New Zealand Intensive Care Society database to obtain the study data. We related MET introduction to publications about adverse events and MET services. We compared the incidence and rate of readmissions and admitted CAs from wards before and after the introduction of an MET. Finally, we identified hospitals without an MET system which had contributed to the database for at least two years from 2002 to 2005 and measured the incidence of adverse events from the first year of contribution to the second.
Results
The MET status was known for 131 of the 172 (76.2%) hospitals that did not participate in the MERIT study. Among these hospitals, 110 (64.1%) had introduced an MET service by 2005. In the 79 hospitals in which the MET commencement date was known, 75% had introduced an MET by May 2002. Of the 110 hospitals in which an MET service was introduced, 24 (21.8%) contributed continuous data in the year before and after the known commencement date. In these hospitals, the mean incidence of CAs admitted to the ICU from the wards changed from 6.33 per year before to 5.04 per year in the year after the MET service began (difference of 1.29 per year, 95% confidence interval [CI] -0.09 to 2.67; P = 0.0244). The incidence of ICU readmissions and the mortality for both ICU-admitted CAs from wards and ICU readmissions did not change. Data were available to calculate the change in ICU admissions due to ward CAs for 16 of 62 (25.8%) hospitals without an MET system. In these hospitals, admissions to the ICU after a ward CA decreased from 5.0 per year in the first year of data contribution to 4.2 per year in the following year (difference of 0.8 per year, 95% CI -0.81 to 3.49; P = 0.3).
Conclusion
Approximately 60% of hospitals in ANZ with an ICU report having an MET service. Most introduced the MET service early and in association with literature related to adverse events. Although available in only a quarter of hospitals, temporal trends suggest an overall decrease in the incidence of ward CAs admitted to the ICU in MET as well as non-MET hospitals.
doi:10.1186/cc6857
PMCID: PMC2447594  PMID: 18394192
5.  Pressure Ulcer Prevention 
Executive Summary
In April 2008, the Medical Advisory Secretariat began an evidence-based review of the literature concerning pressure ulcers.
Please visit the Medical Advisory Secretariat Web site, http://www.health.gov.on.ca/english/providers/program/mas/tech/tech_mn.html to review these titles that are currently available within the Pressure Ulcers series.
Pressure ulcer prevention: an evidence based analysis
The cost-effectiveness of prevention strategies for pressure ulcers in long-term care homes in Ontario: projections of the Ontario Pressure Ulcer Model (field evaluation)
Management of chronic pressure ulcers: an evidence-based analysis (anticipated publication date - mid-2009)
Purpose
A pressure ulcer, also known as a pressure sore, decubitus ulcer, or bedsore, is defined as a localized injury to the skin and/or underlying tissue, occurring most often over a bony prominence and caused by pressure, shear, or friction, alone or in combination. (1) Those at risk for developing pressure ulcers include the elderly and critically ill as well as persons with neurological impairments and those who suffer conditions associated with immobility. Pressure ulcers are graded or staged with a 4-point classification system denoting severity. Stage I represents the beginnings of a pressure ulcer and stage IV, the severest grade, consists of full-thickness tissue loss with exposed bone, tendon, and/or muscle. (1)
In a 2004 survey of Canadian health care settings, Woodbury and Houghton (2) estimated that the prevalence of pressure ulcers of stage 1 or greater in Ontario ranged between 13.1% and 53%, with nonacute health care settings having the highest prevalence rate (Table 1).
Executive Summary Table 1: Prevalence of Pressure Ulcers* (table not reproduced here)
CI indicates confidence interval.
*Nonacute care included sub-acute care, chronic care, complex continuing care, long-term care, and nursing home care; mixed health care includes a mixture of acute, nonacute, and/or community health care delivery settings.
Pressure ulcers have a considerable economic impact on health care systems. In Australia, the cost of treating a single stage IV ulcer has been estimated to be greater than $61,000 (AUD) (approximately $54,000 CDN), (3) while in the United Kingdom the total cost of pressure ulcers has been estimated at £1.4–£2.1 billion annually or 4% of the National Health Service expenditure. (4)
Because of the high physical and economic burden of pressure ulcers, this review was undertaken to determine which interventions are effective at preventing the development of pressure ulcers in an at-risk population.
Review Strategy
The main objective of this systematic review is to determine the effectiveness of pressure ulcer preventive interventions including Risk Assessment, Distribution Devices, Nutritional Supplementation, Repositioning, and Incontinence Management.
A comprehensive literature search was completed for each of the above 5 preventive interventions. The electronic databases searched included MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cochrane Library, and the Cumulative Index to Nursing and Allied Health Literature. As well, the bibliographic references of selected studies were searched. All studies meeting explicit inclusion and exclusion criteria for each systematic review section were retained, and the quality of the body of evidence was determined using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system. (5) Where appropriate, a meta-analysis was undertaken to determine the overall estimate of effect of the preventive intervention under review.
Summary of Findings
Risk Assessment
There is very low quality evidence to support the hypothesis that allocating pressure-relieving equipment according to a person’s level of pressure ulcer risk significantly decreases the incidence of pressure ulcer development. Similarly, there is very low quality evidence that incorporating a risk assessment into nursing practice increases the number of preventive measures used per person and leads to these interventions being initiated earlier in the care continuum.
Pressure Redistribution Devices
There is moderate quality evidence that the use of an alternative foam mattress produces a relative risk reduction (RRR) of 69% in the incidence of pressure ulcers compared with a standard hospital mattress. The evidence does not support the superiority of one particular type of alternative foam mattress.
There is very low quality evidence that the use of an alternating pressure mattress is associated with an RRR of 71% in the incidence of grade 1 or 2 pressure ulcers. Similarly, there is low quality evidence that the use of an alternating pressure mattress is associated with an RRR of 68% in the incidence of deteriorating skin changes.
There is moderate quality evidence of no statistically significant difference in the incidence of grade 2 pressure ulcers between persons using an alternating pressure mattress and those using an alternating pressure overlay.
There is moderate quality evidence that the use of an Australian sheepskin produces an RRR of 58% in the incidence of pressure ulcers grade 1 or greater, although there is also evidence that sheepskins are uncomfortable to use. The Pressure Ulcer Advisory Panel noted that, in general, sheepskins are not a useful preventive intervention because they bunch up in a patient’s bed and may contribute to wound infection if not properly cleaned, which reduces their acceptability.
There is very low quality evidence that the use of a Micropulse System alternating pressure mattress both intraoperatively and postoperatively produces an RRR of 79% in the incidence of pressure ulcers compared with a gel pad used intraoperatively and a standard hospital mattress used postoperatively (standard care). It is unclear whether this effect is due to the use of the alternating pressure mattress intraoperatively or postoperatively, or whether it must indeed be used in both patient care areas.
There is low quality evidence that the use of a visco-elastic polymer pad (gel pad) on the operating table for surgeries of at least 90 minutes’ duration produces a statistically significant RRR of 47% in the incidence of pressure ulcers grade 1 or greater compared with a standard operating table foam mattress.
There is low quality evidence that the use of an air suspension bed in the intensive care unit (ICU) for stays of at least 3 days produces a statistically significant RRR of 76% in the incidence of pressure ulcers compared with a standard ICU bed.
There is very low quality evidence that the use of an alternating pressure mattress does not statistically reduce the incidence of pressure ulcers compared with an alternative foam mattress.
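The RRR figures quoted throughout this section follow from a single definition: RRR = 1 − (incidence in the intervention arm / incidence in the control arm). A minimal sketch, using hypothetical event counts rather than data from the review:

```python
def relative_risk_reduction(events_tx, n_tx, events_ctl, n_ctl):
    """RRR = 1 - RR, where RR is the ratio of cumulative incidences."""
    risk_tx = events_tx / n_tx        # incidence, intervention arm
    risk_ctl = events_ctl / n_ctl     # incidence, control arm
    return 1.0 - risk_tx / risk_ctl

# Hypothetical: 3 ulcers per 100 patients on an alternative foam
# mattress vs 10 per 100 on a standard mattress -> RRR of 70%
rrr = relative_risk_reduction(3, 100, 10, 100)
```

Note that a large RRR can correspond to a small absolute risk reduction when baseline incidence is low, which is why the review also reports the quality of the underlying evidence.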
Nutritional Supplementation
There is very low quality evidence supporting an RRR of 15% in the incidence of pressure ulcers when nutritional supplementation is added to a standard hospital diet.
Repositioning
There is low quality evidence supporting the superiority of a 4-hourly turning schedule on a visco-elastic polyurethane foam mattress over a 2-hourly or 3-hourly turning schedule on a standard foam mattress for reducing the incidence of grade 1 or 2 pressure ulcers.
Incontinence Management
There is very low quality evidence supporting the benefit of a structured skin care protocol to reduce the incidence of grade 1 or 2 pressure ulcers in persons with urinary and/or fecal incontinence.
There is low quality evidence supporting the benefit of a pH-balanced cleanser compared with soap and water to reduce the incidence of grade 1 or 2 pressure ulcers in persons with urinary and fecal incontinence.
Conclusions
There is moderate quality evidence that an alternative foam mattress is effective in preventing the development of pressure ulcers compared with a standard hospital foam mattress.
However, overall there remains a paucity of moderate or higher quality evidence in the literature to support many of the preventive interventions. Until better quality evidence is available, pressure ulcer preventive care must be guided by expert opinion for those interventions where low or very low quality evidence supports the effectiveness of such interventions.
Abbreviations
CI — Confidence interval
GRADE — Grading of Recommendation Assessment, Development, and Evaluation
ICU — Intensive care unit
MAS — Medical Advisory Secretariat
NPUAP — National Pressure Ulcer Advisory Panel
RAS — Risk assessment scale
RCT — Randomized controlled trial
RNAO — Registered Nurses Association of Ontario
RR — Relative risk
RRR — Relative risk reduction
PMCID: PMC3377566  PMID: 23074524
6.  Collagen Cross-Linking Using Riboflavin and Ultraviolet-A for Corneal Thinning Disorders 
Executive Summary
Objective
The main objectives for this evidence-based analysis were to determine the safety and effectiveness of photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A radiation, referred to as CXL, for the management of corneal thinning disease conditions. The comparative safety and effectiveness of corneal cross-linking with other minimally invasive treatments such as intrastromal corneal rings was also reviewed. The Medical Advisory Secretariat (MAS) evidence-based analysis was performed to support public financing decisions.
Subject of the Evidence-Based Analysis
The primary treatment objective for corneal cross-linking is to increase the strength of the corneal stroma, thereby stabilizing the underlying disease process. At the present time, it is the only procedure that treats the underlying disease condition. The proposed advantages for corneal cross-linking are that the procedure is minimally invasive, safe and effective, and it can potentially delay or defer the need for a corneal transplant. In addition, corneal cross-linking does not adversely affect subsequent surgical approaches, if they are necessary, or interfere with corneal transplants. The evidence for these claims for corneal cross-linking in the management of corneal thinning disorders such as keratoconus will be the focus of this review.
The specific research questions for the evidence review were as follows:
Technical: How technically demanding is corneal cross-linking and what are the operative risks?
Safety: What is known about the broader safety profile of corneal cross-linking?
Effectiveness - Corneal Surface Topographic Effects:
What are the corneal surface remodeling effects of corneal cross-linking?
Do these changes interfere with subsequent interventions, particularly corneal transplant known as penetrating keratoplasty (PKP)?
Effectiveness - Visual Acuity:
What impacts does the remodeling have on visual acuity?
Are these impacts predictable, stable, adjustable and durable?
Effectiveness - Refractive Outcomes: What impact does remodeling have on refractive outcomes?
Effectiveness - Visual Quality (Symptoms): What impact does corneal cross-linking have on vision quality such as contrast vision, and decreased visual symptoms (halos, fluctuating vision)?
Effectiveness - Contact lens tolerance: To what extent does contact lens intolerance improve after corneal cross-linking?
Vision-Related QOL: What is the impact of corneal cross-linking on functional visual rehabilitation and quality of life?
Patient satisfaction: Are patients satisfied with their vision following the procedure?
Disease Process:
What impact does corneal cross-linking have on the underlying corneal thinning disease process?
Does corneal cross-linking delay or defer the need for a corneal transplant?
What is the comparative safety and effectiveness of corneal cross-linking compared with other minimally invasive treatments for corneal ectasia such as intrastromal corneal rings?
Clinical Need: Target Population and Condition
Corneal ectasia (thinning) disorders represent a range of disorders involving either primary disease conditions, such as keratoconus (KC) and pellucid marginal corneal degeneration, or secondary iatrogenic conditions, such as corneal thinning occurring after laser in situ keratomileusis (LASIK) refractive surgery.
Corneal thinning is a disease that occurs when the normally round, dome-shaped cornea progressively thins, causing a cone-like bulge or forward protrusion in response to the normal pressure of the eye. The thinning occurs primarily in the stromal layers and is believed to involve a breakdown in the collagen process. This bulging produces an irregular corneal shape, leading to irregular astigmatism. Because the anterior part of the cornea is responsible for most of the focusing of light on the retina, this can then result in loss of visual acuity. The reduced visual acuity can make even simple daily tasks, such as driving, watching television or reading, difficult to perform.
Keratoconus is the most common form of corneal thinning disorder and involves a noninflammatory chronic disease process of progressive corneal thinning. Although the specific cause for the biomechanical alterations in the corneal stroma is unknown, there is a growing body of evidence suggesting that genetic factors may play an important role. Keratoconus is a rare disease (< 0.05% of the population) and is unique among chronic eye diseases because it has an early onset, with a median age of 25 years. Disease management for this condition follows a step-wise approach depending on disease severity. Contact lenses are the primary treatment of choice when there is irregular astigmatism associated with the disease. Patients are referred for corneal transplants as a last option when they can no longer tolerate contact lenses or when lenses no longer provide adequate vision.
Keratoconus is one of the leading indications for corneal transplants and has been so for the last 3 decades. Despite the high success rate of corneal transplants (graft survival of up to 20 years), there are reasons to defer transplantation as long as possible. Patients with keratoconus are generally young, and longer-term graft survival of at least 30 or 40 years may be necessary. The surgery itself involves lengthy time off work, and postsurgical complications can include long-term steroid use, secondary cataracts, and glaucoma. After a corneal transplant, keratoconus may recur, resulting in a need for subsequent interventions. Residual refractive errors and astigmatism can remain challenges after transplantation, and high refractive surgery and regraft rates in KC patients have been reported. Visual rehabilitation or recovery of visual acuity after transplant may be slow and/or unsatisfactory to patients.
Description of Technology/Therapy
Corneal cross-linking involves the use of riboflavin (vitamin B2) and ultraviolet-A (UVA) radiation. A UVA irradiation device known as the CXL® device (license number 77989) by ACCUTECH Medical Technologies Inc. has been licensed by Health Canada as a Class II device since September 19, 2008. An illumination device that emits homogeneous UVA, in combination with any generic form of riboflavin, is licensed by Health Canada for the indication to slow or stop the progression of corneal thinning caused by progressive keratectasia, iatrogenic keratectasia after laser-assisted in situ keratomileusis (LASIK) and pellucid marginal degeneration. The same device is named the UV-X® device by IROC Medical, with approvals in Argentina, the European Union and Australia.
UVA devices all use light-emitting diodes to generate UVA at a wavelength of 360-380 nm but vary in the number of diodes (5 to 25), focusing systems, working distance, beam diameter, beam uniformity and the extent to which the operator can vary the parameters. In Ontario, CXL is currently offered at over 15 private eye clinics by refractive surgeons and ophthalmologists.
The treatment is an outpatient procedure generally performed with topical anesthesia and consists of several well-defined steps. The epithelial cell layer is first removed, often using a blunt spatula, over a 9.0 mm diameter under sterile conditions. This step is followed by the application of topical 0.1% riboflavin (vitamin B2) solution every 3 to 5 minutes for 25 minutes to ensure that the corneal stroma is fully penetrated. A solid-state UVA light source with a wavelength of 370 nm (the maximum absorption of riboflavin) and an irradiance of 3 mW/cm2 is used to irradiate the central cornea. Following treatment, a soft bandage lens is applied and prescriptions are given for oral pain medications, preservative-free tears, anti-inflammatory drops (preferably not nonsteroidal anti-inflammatory drugs, or NSAIDs) and antibiotic eye drops. Patients are recalled 1 week after the procedure to evaluate re-epithelialization and are followed up subsequently.
Evidence-Based Analysis Methods
A literature search was conducted on photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A for the management of corneal thinning disorders, using a search strategy with appropriate keywords and subject headings for CXL, for literature published up until April 17, 2011. The literature search for this Health Technology Assessment (HTA) review was performed using the Cochrane Library, the Emergency Care Research Institute (ECRI) and the Centre for Reviews and Dissemination. The websites of several other health technology agencies were also reviewed, including the Canadian Agency for Drugs and Technologies in Health (CADTH) and the United Kingdom’s National Institute for Clinical Excellence (NICE). The databases searched included OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, and EMBASE.
As the evidence review included an intervention for a rare condition, case series and case reports, particularly for complications and adverse events, were reviewed. A total of 316 citations were identified and all abstracts were reviewed by a single reviewer for eligibility. For those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search.
Inclusion Criteria
English-language reports and human studies
patients with any corneal thinning disorder
reports with CXL procedures used alone or in conjunction with other interventions
original reports with defined study methodology
reports including standardized measurements of outcome events such as technical success, safety, effectiveness, durability, vision quality of life or patient satisfaction
systematic reviews, meta-analyses, randomized controlled trials, observational studies, retrospective analyses, case series, or case reports for complications and adverse events
Exclusion Criteria
nonsystematic reviews, letters, comments and editorials
reports not involving outcome events such as safety, effectiveness, durability, vision quality or patient satisfaction following an intervention with corneal implants
reports not involving corneal thinning disorders and an intervention involving CXL
Summary of Evidence Findings
In the Medical Advisory Secretariat evidence review on corneal cross-linking, 65 reports (16 case reports) involving 1403 patients were identified on the use of CXL for managing corneal thinning disorders. The reports were summarized according to their primary clinical indication, whether or not secondary interventions were used in conjunction with CXL (referred to as CXL-Plus) and whether or not it was a safety-related report.
The safety review was based on information from the cohort studies evaluating effectiveness, clinical studies evaluating safety, treatment response or recovery, and published case reports of complications. Complications, such as infection and noninfectious keratitis (inflammatory response), reported in case reports, generally occurred in the first week and were successfully treated with topical antibiotics and steroids. Other complications, such as the cytotoxic effects on the targeted corneal stroma, occurred as side effects of the photo-oxidative process generated by riboflavin and ultraviolet-A and were usually reversible.
The reports on treatment effectiveness involved 15 pre-post longitudinal cohort follow-up studies, ranging from follow-up of patients’ treated eye only, to follow-up of both the treated eye and the untreated fellow eye, to follow-up of the treated eye alongside a control group not receiving treatment. One study was a 3-arm randomized controlled trial (RCT) with 2 comparators: a sham treatment in which one eye was treated with riboflavin only, and the untreated fellow eye. The outcomes reported across the studies involved statistically significant and clinically relevant improvements in corneal topography and refraction after CXL. In addition, improvements in treated eyes were accompanied by worsening outcomes in the untreated fellow eyes. Improvements in corneal topography reported at 6 months were maintained at 1- and 2-year follow-up. Although visual acuity did not always improve, vision loss was infrequently reported. Additional procedures such as intrastromal corneal ring segments, intraocular lenses and refractive surgery were reported to result in further improvements in topography and visual acuity after CXL.
Considerations for Ontario Health System
The total costs of providing CXL therapy to keratoconus patients in Ontario was calculated based on estimated physician, clinic, and medication costs. The total cost per patient was approximately $1,036 for the treatment of one eye, and $1,751 for the treatment of both eyes. The prevalence of keratoconus was estimated at 4,047 patients in FY2011, with an anticipated annual incidence (new cases) of about 148 cases. After distributing the costs of CXL therapy for the FY2011 prevalent keratoconus population over the next 3 years, the estimated average annual cost was approximately $2.1 million, of which about $1.3 million would be physician costs specifically.
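The projection above is straightforward arithmetic over the quoted figures. The sketch below is an illustrative upper bound that assumes every patient is treated in both eyes at $1,751; because the review's $2.1 million estimate blends one-eye and two-eye treatments, this assumption yields a somewhat higher number (roughly $2.6 million). The function name is hypothetical:

```python
def annual_budget_impact(prevalent, incident_per_year, cost_per_patient,
                         rollout_years=3):
    """Average annual cost: the prevalent pool treated once, spread over
    the rollout period, plus new incident cases treated each year."""
    backlog_per_year = prevalent * cost_per_patient / rollout_years
    incident_cost = incident_per_year * cost_per_patient
    return backlog_per_year + incident_cost

# Quoted inputs: 4,047 prevalent cases, ~148 new cases/year,
# $1,751 per patient if both eyes are treated
total = annual_budget_impact(4_047, 148, 1_751)   # ~ $2.6 million/year
```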
Conclusion
Corneal cross-linking effectively stabilizes the underlying disease and, in some cases, reverses disease progression as measured by key corneal topographic measures. The effects of CXL on visual acuity are less predictable, and adjunct interventions with CXL, such as intrastromal corneal ring segments, refractive surgery, and intraocular lens implants, are increasingly employed to both stabilize disease and restore visual acuity. Although the use of adjunct interventions has been shown to result in additional clinical benefit, the order, timing, and risks of performing them have not been well established.
Although there is potential for serious adverse events with corneal UVA irradiation and photochemical reactions, there have been few reported complications. Those that have occurred tended to be related to side effects of the induced photochemical reactions and were generally reversible. However, to ensure that there are minimal complications with the use of CXL and irradiation, strict adherence to defined CXL procedural protocols is essential.
Keywords
Keratoconus, corneal cross-linking, corneal topography, corneal transplant, visual acuity, refractive error.
PMCID: PMC3377552  PMID: 23074417
7.  Airway Clearance Devices for Cystic Fibrosis 
Executive Summary
Objective
The purpose of this evidence-based analysis is to examine the safety and efficacy of airway clearance devices (ACDs) for cystic fibrosis and attempt to differentiate between devices, where possible, on grounds of clinical efficacy, quality of life, safety and/or patient preference.
Background
Cystic fibrosis (CF) is a common, inherited, life-limiting disease that affects multiple systems of the human body. Respiratory dysfunction is the primary complication and leading cause of death due to CF. CF causes abnormal mucus secretion in the airways, leading to airway obstruction and mucus plugging, which in turn can lead to bacterial infection and further mucus production. Over time, this almost cyclical process contributes to severe airway damage and loss of respiratory function. Removal of airway secretions, termed airway clearance, is thus an integral component of the management of CF.
A variety of methods are available for airway clearance, some requiring mechanical devices, others physical manipulation of the body (e.g. physiotherapy). Conventional chest physiotherapy (CCPT), performed with the assistance of a caregiver, is the current standard of care for achieving airway clearance, particularly in young patients up to the age of six or seven. CF patients are, however, living much longer now than in decades past. The median age of survival in Canada has risen to 37.0 years for the period of 1998-2002 (5-year window), up from 22.8 years for the 5-year window ending in 1977. The prevalence has also risen accordingly, last recorded as 3,453 in Canada in 2002, up from 1,630 in 1977. With individuals living longer, there is a greater need for independent methods of airway clearance.
Airway Clearance Devices
There are at least three classes of airway clearance devices: positive expiratory pressure devices (PEP), airway oscillating devices (AOD; either handheld or stationary) and high frequency chest compression (HFCC)/mechanical percussion (MP) devices. Within these classes are numerous different brands of devices from various manufacturers, each with subtle iterations. At least 10 devices are licensed by Health Canada (ranging from Class 1 to Class 3 devices).
Evidence-Based Analysis of Effectiveness
Research Questions
Does long-term use of ACDs improve outcomes of interest in comparison to CCPT in patients with CF?
Does long-term use of one class of ACD improve outcomes of interest in comparison to another class of ACD in CF patients?
Literature Search
A comprehensive literature search was performed on March 7, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1950 to March 7, 2009.
Inclusion Criteria
All randomized controlled trials including those of parallel and crossover design,
Systematic reviews and/or meta-analyses.
Exclusion Criteria
Abstracts were generally excluded because their methods could not be examined; however, abstract data were included in several Cochrane meta-analyses presented in this paper;
Studies of less than seven days duration (including single treatment studies);
Studies that did not report primary outcomes;
Studies in which less than 10 patients completed the study.
Outcomes of Interest
Primary outcomes under review were percent-predicted forced expiratory volume (FEV-1), forced vital capacity (FVC), and forced expiratory flow between 25%-75% (FEF25-75). Secondary outcomes included number of hospitalizations, adherence, patient preference, quality of life and adverse events. All outcomes were decided a priori.
Summary of Findings
Literature searching and back-searching identified 13 RCTs meeting the inclusion criteria, along with three Cochrane systematic reviews. The Cochrane reviews were identified in preliminary searching and used as the basis for formulating this review. Results were subgrouped by comparison and according to the available literature. For example, results from the Cochrane meta-analyses included abstract data; therefore, additional meta-analyses were also performed on trials reported as full publications only (MAS generally excludes abstracted data when full publications are available, as the methodological quality of trials reported only in abstract cannot be properly assessed).
Executive Summary Table 1 summarizes the results across all comparisons and subgroupings for primary outcomes of pulmonary function. Only two comparisons yielded evidence of moderate or high quality according to GRADE criteria – the comparisons of CCPT vs. PEP and handheld AOD vs. PEP – but only the comparison of CCPT vs. PEP noted a significant difference between treatment groups. In comparison to CCPT, there was a significant difference in favour of PEP for % predicted FEV-1 and FVC according to one long-term, parallel RCT. This trial was accepted as the best available evidence for the comparison. The body of evidence for the remaining comparisons was low to very low according to GRADE criteria, downgraded most often because of poor methodological quality and low generalizability. Specifically, trials were likely not adequately powered (low sample sizes), did not conduct intention-to-treat analyses, were conducted primarily in children and young adolescents, and were outdated (conducted more than 10 years ago).
Secondary outcomes were poorly or inconsistently reported, and were generally not of value to decision-making. Of note, there were a significantly higher number of hospitalizations among participants undergoing AOD therapy in comparison to PEP therapy.
Summarization of results for primary outcomes by comparison and subgrouping [table not reproduced]
Bolding indicates a significant difference; positive summary statistics favour the former intervention.
Abbreviations: AOD, airway oscillating device; CCPT, conventional chest physiotherapy; CI, confidence interval; HFCC, high frequency chest compression; MP, mechanical percussion; N/A, not applicable; PEP, positive expiratory pressure
Economic Analysis
Devices ranged in cost from around $60 for PEP devices and handheld AODs to upwards of $18,000 for a HFCC vest device. Although the majority of device costs are paid out-of-pocket by patients or their parents, or are covered by third-party medical insurance, Ontario did provide funding assistance through the Assistive Devices Program (ADP) for postural drainage boards and MP devices. These technologies, however, are either obsolete or their clinical efficacy is not supported by evidence. ADP provided roughly $16,000 in funding for the 2008/09 fiscal year. Using device costs and prevalent and incident cases of CF in Ontario, budget impact projections were generated. The prevalence of CF in Ontario for patients aged 6 to 71 was cited as 1,047 cases in 2002, while incidence was estimated at 46 new cases diagnosed per year in 2002. Budget impact projections indicated that PEP devices and handheld AODs were highly economically feasible, costing around $90,000 for the entire prevalent population and less than $3,000 per year to cover new incident cases. HFCC vest devices were by far the most expensive, costing in excess of $19 million to cover the prevalent population alone.
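The projection method here is simple unit-cost arithmetic over the prevalent and incident populations. A sketch using the quoted device costs (the review's slightly higher headline figures presumably also cover replacement units and accessories; the function name is hypothetical):

```python
def coverage_cost(unit_cost, prevalent, incident_per_year):
    """One-time cost to equip the prevalent population, plus the
    annual cost of equipping newly diagnosed (incident) cases."""
    return unit_cost * prevalent, unit_cost * incident_per_year

# Quoted inputs: 1,047 prevalent cases, 46 new cases/year in Ontario
pep_total, pep_annual = coverage_cost(60, 1_047, 46)        # PEP / handheld AOD
vest_total, vest_annual = coverage_cost(18_000, 1_047, 46)  # HFCC vest
```

At $60 per device the annual incident cost comes out under $3,000, consistent with the figure quoted above, while the vest device's prevalent-population cost lands near the $19 million order of magnitude.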
Conclusions
There is currently a lack of sufficiently powered, long-term, parallel randomized controlled trials investigating the use of ACDs in comparison to other airway clearance techniques. While much of the current evidence suggests no significant difference between various ACDs and alternative therapies/technologies, at least according to outcomes of pulmonary function, there is a strong possibility that past trials were not sufficiently powered to identify a difference. Unfortunately, it is unlikely that there will be any future trials comparing ACDs to CCPT as withholding therapy using an ACD may be seen as unethical at present.
Conclusions of clinical effectiveness are as follows:
Moderate quality evidence suggests that PEP is at least as effective as or more effective than CCPT, according to primary outcomes of pulmonary function.
Moderate quality evidence suggests that there is no significant difference between PEP and handheld AODs, according to primary outcomes of pulmonary function; however, secondary outcomes may favour PEP.
Low quality evidence suggests that there is no significant difference between AODs or HFCC/MP and CCPT, according to both primary and secondary outcomes.
Very low quality evidence suggests that there is no significant difference between handheld AOD and CCPT, according to primary outcomes of pulmonary function.
Budget impact projections show PEP and handheld AODs to be highly economically feasible.
PMCID: PMC3377547  PMID: 23074531
8.  The Effectiveness of Community Action in Reducing Risky Alcohol Consumption and Harm: A Cluster Randomised Controlled Trial 
PLoS Medicine  2014;11(3):e1001617.
In a cluster randomized controlled trial, Anthony Shakeshaft and colleagues measure the effectiveness of a multi-component community-based intervention for reducing alcohol-related harm.
Background
The World Health Organization, governments, and communities agree that community action is likely to reduce risky alcohol consumption and harm. Despite this agreement, there is little rigorous evidence that community action is effective: of the six randomised trials of community action published to date, all were US-based and focused on young people (rather than the whole community), and their outcomes were limited to self-report or alcohol purchase attempts. The objective of this study was to conduct the first non-US randomised controlled trial (RCT) of community action to quantify the effectiveness of this approach in reducing risky alcohol consumption and harms measured using both self-report and routinely collected data.
Methods and Findings
We conducted a cluster RCT comprising 20 communities in Australia that had populations of 5,000–20,000, were at least 100 km from an urban centre (population ≥ 100,000), and were not involved in another community alcohol project. Communities were pair-matched, and one member of each pair was randomly allocated to the experimental group. Thirteen interventions were implemented in the experimental communities from 2005 to 2009: community engagement; general practitioner training in alcohol screening and brief intervention (SBI); feedback to key stakeholders; media campaign; workplace policies/practices training; school-based intervention; general practitioner feedback on their prescribing of alcohol medications; community pharmacy-based SBI; web-based SBI; Aboriginal Community Controlled Health Services support for SBI; Good Sports program for sports clubs; identifying and targeting high-risk weekends; and hospital emergency department–based SBI. Primary outcomes based on routinely collected data were alcohol-related crime, traffic crashes, and hospital inpatient admissions. Routinely collected data for the entire study period (2001–2009) were obtained in 2010. Secondary outcomes based on pre- and post-intervention surveys (n = 2,977 and 2,255, respectively) were the following: long-term risky drinking, short-term high-risk drinking, short-term risky drinking, weekly consumption, hazardous/harmful alcohol use, and experience of alcohol harm. At the 5% level of statistical significance, there was insufficient evidence to conclude that the interventions were effective in the experimental, relative to control, communities for alcohol-related crime, traffic crashes, and hospital inpatient admissions, and for rates of risky alcohol consumption and hazardous/harmful alcohol use. 
Although respondents in the experimental communities reported statistically significantly lower average weekly consumption (1.90 fewer standard drinks per week, 95% CI = −3.37 to −0.43, p = 0.01) and less alcohol-related verbal abuse (odds ratio = 0.58, 95% CI = 0.35 to 0.96, p = 0.04) post-intervention, the low survey response rates (40% and 24% for the pre- and post-intervention surveys, respectively) require conservative interpretation. The main limitations of this study are as follows: (1) that the study may have been under-powered to detect differences in routinely collected data outcomes as statistically significant, and (2) the low survey response rates.
Conclusions
This RCT provides little evidence that community action significantly reduces risky alcohol consumption and alcohol-related harms, other than potential reductions in self-reported average weekly consumption and experience of alcohol-related verbal abuse. Complementary legislative action may be required to more effectively reduce alcohol harms.
Trial registration
Australian New Zealand Clinical Trials Registry ACTRN12607000123448
Please see later in the article for the Editors' Summary
Editors' Summary
Background
People have consumed alcoholic beverages throughout history, but alcohol use is now an increasing global public health problem. According to the World Health Organization's 2010 Global Burden of Disease Study, alcohol use is the third leading risk factor (after high blood pressure and smoking) for disease and is responsible for 3.9% of the global disease burden. Alcohol use contributes to heart disease, liver disease, depression, some cancers, and many other health conditions. Alcohol also affects the well-being and health of people around those who drink, through alcohol-related crimes and road traffic crashes. The impact of alcohol use on disease and injury depends on the amount of alcohol consumed and the pattern of drinking. Most guidelines define long-term risky drinking as more than four drinks per day on average for men or more than two drinks per day for women (a “drink” is, roughly speaking, a can of beer or a small glass of wine), and short-term risky drinking (also called binge drinking) as seven or more drinks on a single occasion for men or five or more drinks on a single occasion for women. However, recent changes to the Australian guidelines acknowledge that a lower level of alcohol consumption is considered risky (with lifetime risky drinking defined as more than two drinks a day and binge drinking defined as more than four drinks on one occasion).
Why Was This Study Done?
In 2010, the World Health Assembly endorsed a global strategy to reduce the harmful use of alcohol. This strategy emphasizes the importance of community action, a process in which a community defines its own needs and determines the actions that are required to meet these needs. Although community action is highly acceptable to community members, few studies have looked at the effectiveness of community action in reducing risky alcohol consumption and alcohol-related harm. Here, the researchers undertake a cluster randomized controlled trial (the Alcohol Action in Rural Communities [AARC] project) to quantify the effectiveness of community action in reducing risky alcohol consumption and harms in rural communities in Australia. A cluster randomized trial compares outcomes in clusters of people (here, communities) who receive alternative interventions assigned through the play of chance.
What Did the Researchers Do and Find?
The researchers pair-matched 20 rural Australian communities according to the proportion of their population that was Aboriginal (rates of alcohol-related harm are disproportionately higher among Aboriginal individuals than among non-Aboriginal individuals in Australia; they are also higher among young people and males, but the proportions of these two groups across communities were comparable). They randomly assigned one member of each pair to the experimental group and implemented 13 interventions in these communities by negotiating with key individuals in each community to define and implement each intervention. Examples of interventions included general practitioner training in screening for alcohol use disorders and in implementing a brief intervention, and a school-based interactive session designed to reduce alcohol harm among young people. The researchers quantified the effectiveness of the interventions using routinely collected data on alcohol-related crime and road traffic crashes, and on hospital inpatient admissions for alcohol dependence or abuse (which were expected to increase in the experimental group if the intervention was effective because of more people seeking or being referred for treatment). They also examined drinking habits and experiences of alcohol-related harm, such as verbal abuse, among community members using pre- and post-intervention surveys. After implementation of the interventions, the rates of alcohol-related crime, road traffic crashes, and hospital admissions, and of risky and hazardous/harmful alcohol consumption (measured using a validated tool called the Alcohol Use Disorders Identification Test) were not statistically significantly different in the experimental and control communities (a difference in outcomes that is not statistically significant may have occurred by chance).
However, the reported average weekly consumption of alcohol was 20% lower in the experimental communities after the intervention than in the control communities (equivalent to 1.9 fewer standard drinks per week per respondent) and there was less alcohol-related verbal abuse post-intervention in the experimental communities than in the control communities.
What Do These Findings Mean?
These findings provide little evidence that community action reduced risky alcohol consumption and alcohol-related harms in rural Australian communities. Although there was some evidence of significant reductions in self-reported weekly alcohol consumption and in experiences of alcohol-related verbal abuse, these findings must be interpreted cautiously because they are based on surveys with very low response rates. A larger or differently designed study might provide statistically significant evidence for the effectiveness of community action in reducing risky alcohol consumption. However, given their findings, the researchers suggest that legislative approaches that are beyond the control of individual communities, such as alcohol taxation and restrictions on alcohol availability, may be required to effectively reduce alcohol harms. In other words, community action alone may not be the most effective way to reduce alcohol-related harm.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001617.
The World Health Organization provides detailed information about alcohol; its fact sheet on alcohol includes information about the global strategy to reduce the harmful use of alcohol; the Global Information System on Alcohol and Health provides further information about alcohol, including information on control policies around the world
The US National Institute on Alcohol Abuse and Alcoholism has information about alcohol and its effects on health
The US Centers for Disease Control and Prevention has a website on alcohol and public health that includes information on the health risks of excessive drinking
The UK National Health Service Choices website provides detailed information about drinking and alcohol, including information on the risks of drinking too much, tools for calculating alcohol consumption, and personal stories about alcohol use problems
MedlinePlus provides links to many other resources on alcohol
More information about the Alcohol Action in Rural Communities project is available
doi:10.1371/journal.pmed.1001617
PMCID: PMC3949675  PMID: 24618831
9.  Multiple Intravenous Infusions Phase 1b 
Background
Minimal research has been conducted into the potential patient safety issues related to administering multiple intravenous (IV) infusions to a single patient. Previous research has highlighted that there are a number of related safety risks. In Phase 1a of this study, an analysis of 2 national incident-reporting databases (Institute for Safe Medication Practices Canada and United States Food and Drug Administration MAUDE) found that a high percentage of incidents associated with the administration of multiple IV infusions resulted in patient harm.
Objectives
The primary objectives of Phase 1b of this study were to identify safety issues with the potential to cause patient harm stemming from the administration of multiple IV infusions; and to identify how nurses are being educated on key principles required to safely administer multiple IV infusions.
Data Sources and Review Methods
A field study was conducted at 12 hospital clinical units (sites) across Ontario, and telephone interviews were conducted with program coordinators or instructors from both the Ontario baccalaureate nursing degree programs and the Ontario postgraduate Critical Care Nursing Certificate programs. Data were analyzed using Rasmussen’s 1997 Risk Management Framework and a Health Care Failure Modes and Effects Analysis.
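A Failure Modes and Effects Analysis of this kind typically scores each identified failure mode by severity and probability, then prioritizes the highest-scoring modes for mitigation. A minimal sketch of that scoring step (the failure modes and scores below are hypothetical illustrations, not the study's actual data):

```python
# Hypothetical FMEA-style hazard scoring (illustrative only; not the
# study's actual failure modes or scores). Each failure mode gets a
# hazard score = severity x probability on 1-4 scales, and high-scoring
# modes are prioritized for mitigation.
failure_modes = [
    # (failure mode, severity 1-4, probability 1-4)
    ("secondary infusion clamp left closed", 3, 3),
    ("IV lines traced to wrong pump channel", 4, 2),
    ("bolus rate entered into wrong field", 4, 1),
]

scored = sorted(
    ((name, sev * prob) for name, sev, prob in failure_modes),
    key=lambda x: x[1],
    reverse=True,
)
for name, score in scored:
    print(f"{score:2d}  {name}")
```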
Results
Twenty-two primary patient safety issues were identified with the potential to directly cause patient harm. Seventeen of these (critical issues) were categorized into 6 themes. A cause-consequence tree was established to outline all possible contributing factors for each critical issue. Clinical recommendations were identified for immediate distribution to, and implementation by, Ontario hospitals. Future investigation efforts were planned for Phase 2 of the study.
Limitations
This exploratory field study identifies the potential for errors; apart from a few directly observed cases, it does not document actual errors. Not all issues could be anticipated in advance, and error frequencies were too low to be captured within the observation time allotted and the limited sample of observations.
Conclusions
The administration of multiple IV infusions to a single patient is a complex task with many potential associated patient safety risks. Improvements to infusion and infusion-related technology, education standards, clinical best practice guidelines, hospital policies, and unit work practices are required to reduce the risk potential. This report makes several recommendations to Ontario hospitals so that they can develop an awareness of the issues highlighted in this report and minimize some of the risks. Further investigation of mitigating strategies is required and will be undertaken in Phase 2 of this research.
Plain Language Summary
Patients, particularly in critical care environments, often require multiple intravenous (IV) medications via large volumetric or syringe infusion pumps. The infusion of multiple IV medications is not without risk; unintended errors during these complex procedures have resulted in patient harm. However, the range of associated risks and the factors contributing to these risks are not well understood.
Health Quality Ontario’s Ontario Health Technology Advisory Committee commissioned the Health Technology Safety Research Team at the University Health Network to conduct a multi-phase study to identify and mitigate the risks associated with multiple IV infusions. Some of the questions addressed by the team were as follows: What is needed to reduce the risk of errors for patients receiving many medications? What strategies work best?
The initial report, Multiple Intravenous Infusions Phase 1a: Situation Scan Summary Report, summarizes the interim findings based on a literature review, an incident database review, and a technology scan.
The Health Technology Safety Research Team worked in close collaboration with the Institute for Safe Medication Practices Canada on an exploratory study to understand the risks associated with multiple IV infusions and the degree to which nurses are educated to help mitigate them. The current report, Multiple Intravenous Infusions Phase 1b: Practice and Training Scan, presents the findings of a field study of 12 hospital clinical units across Ontario, as well as 13 interviews with educators from baccalaureate-level nursing degree programs and postgraduate Critical Care Nursing Certificate programs. It makes 9 recommendations that emphasize best practices for the administration of multiple IV infusions and pertain to secondary infusions, line identification, line set-up and removal, and administering IV bolus medications.
The Health Technology Safety Research Team has also produced an associated report for hospitals entitled Mitigating the Risks Associated With Multiple IV Infusions: Recommendations Based on a Field Study of Twelve Ontario Hospitals, which highlights the 9 interim recommendations and provides a brief rationale for each one.
PMCID: PMC3377572  PMID: 23074426
10.  Impact of Extended-Duration Shifts on Medical Errors, Adverse Events, and Attentional Failures 
PLoS Medicine  2006;3(12):e487.
Background
A recent randomized controlled trial in critical-care units revealed that the elimination of extended-duration work shifts (≥24 h) reduces the rates of significant medical errors and polysomnographically recorded attentional failures. This raised the concern that the extended-duration shifts commonly worked by interns may contribute to the risk of medical errors being made, and perhaps to the risk of adverse events more generally. Our current study assessed whether extended-duration shifts worked by interns are associated with significant medical errors, adverse events, and attentional failures in a diverse population of interns across the United States.
Methods and Findings
We conducted a Web-based survey, across the United States, in which 2,737 residents in their first postgraduate year (interns) completed 17,003 monthly reports. The association between the number of extended-duration shifts worked in the month and the reporting of significant medical errors, preventable adverse events, and attentional failures was assessed using a case-crossover analysis in which each intern acted as his/her own control. Compared to months in which no extended-duration shifts were worked, during months in which between one and four extended-duration shifts and five or more extended-duration shifts were worked, the odds ratios of reporting at least one fatigue-related significant medical error were 3.5 (95% confidence interval [CI], 3.3–3.7) and 7.5 (95% CI, 7.2–7.8), respectively. The respective odds ratios for fatigue-related preventable adverse events, 8.7 (95% CI, 3.4–22) and 7.0 (95% CI, 4.3–11), were also increased. Interns working five or more extended-duration shifts per month reported more attentional failures during lectures, rounds, and clinical activities, including surgery, and reported 300% more fatigue-related preventable adverse events resulting in a fatality.
Conclusions
In our survey, extended-duration work shifts were associated with an increased risk of significant medical errors, adverse events, and attentional failures in interns across the United States. These results have important public policy implications for postgraduate medical education.
During months in which medical interns worked extended shifts, the chances of their reporting at least one fatigue-related significant medical error increased more than 3-fold compared to months with no extended shifts.
Editors' Summary
Background.
In the United States, medical school graduates doing their internship (the first year of postgraduate clinical training) regularly work in the clinic for longer than 24 h at a time. It is already known that doctors or students who work long shifts make more medical errors and are less able to pay attention to what they are doing. Many thousands of adverse medical events per year, including, in the extreme, deaths of patients, are thought to result from medical errors, but it is not clear whether long shifts, as opposed to, for example, an increase in the total number of hours worked, are the cause of many or any of these errors.
Why Was This Study Done?
This research group wanted to find out whether long shifts worked by interns had an effect on reported medical errors, and hence patient safety, and specifically whether any harm that happened to patients might otherwise have been preventable.
What Did the Researchers Do and Find?
The researchers contacted all US medical school graduates beginning their internships from one particular year-group by email, and asked each person whether they wanted to take part in a confidential survey. Individuals who agreed to participate were directed to a secure website to enter basic information about themselves and then to complete a form each month. On that form the interns gave information about their working hours, hours of sleep, and number of extended-duration shifts worked, and completed questions about medical errors in the past month. Then, for each intern in the study, researchers compared month by month the number of medical errors and the number of extended-duration shifts that had been worked. A total of 2,737 interns took part in the survey.
Compared to months in which no extended-duration shifts were worked, in months in which between one and four, or five or more, extended-duration shifts were worked, the doctors were, respectively, three and seven times more likely to report at least one fatigue-related significant medical error. Similarly, fatigue-related preventable adverse events increased by around nine and seven times, respectively, compared with months in which no extended-duration shifts were worked. Fatigue-related preventable adverse events associated with the death of the patient increased by ∼300% in interns working five or more extended-duration shifts per month; these interns were also more likely to fall asleep during lectures, rounds, and clinical activities, including surgery.
What Do These Findings Mean?
Guidelines for graduate medical education in the United States still allow up to nine marathon shifts (30 h at a stretch) per month, even though the total number of hours worked is capped. This study shows that the long shifts worked by interns are bad for patient safety, as they are more likely to cause harm that would not otherwise happen.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/doi:10.1371/journal.pmed.0030487.
The US Food and Drug Administration has resources on its website about medication errors
The National Sleep Foundation aims to improve public health and safety by achieving understanding of sleep and sleep disorders
Wikipedia (an internet encyclopedia anyone can edit) has a page about residency training in the United States
doi:10.1371/journal.pmed.0030487
PMCID: PMC1705824  PMID: 17194188
11.  Effect of Patient- and Medication-Related Factors on Inpatient Medication Reconciliation Errors 
ABSTRACT
Background
Little research has examined the incidence, clinical relevance, and predictors of medication reconciliation errors at hospital admission and discharge.
Objective
To identify patient- and medication-related factors that contribute to pre-admission medication list (PAML) errors and admission order errors, and to test whether such errors persist in the discharge medication list.
Design, Participants
We conducted a cross-sectional analysis of 423 adults with acute coronary syndromes or acute decompensated heart failure admitted to two academic hospitals who received pharmacist-assisted medication reconciliation during the Pharmacist Intervention for Low Literacy in Cardiovascular Disease (PILL–CVD) Study.
Main Measures
Pharmacists assessed the number of total and clinically relevant errors in the PAML and admission and discharge medication orders. We used negative binomial regression and report incidence rate ratios (IRR) of predictors of reconciliation errors.
Key Results
On admission, 174 of 413 patients (42%) had ≥1 PAML error, and 73 (18%) had ≥1 clinically relevant PAML error. At discharge, 158 of 405 patients (39%) had ≥1 discharge medication error, and 126 (31%) had ≥1 clinically relevant discharge medication error. Clinically relevant PAML errors were associated with older age (IRR = 1.46; 95% CI, 1.00–2.12) and number of pre-admission medications (IRR = 1.17; 95% CI, 1.10–1.25), and were less likely when a recent medication list was present in the electronic medical record (EMR) (IRR = 0.54; 95% CI, 0.30–0.96). Clinically relevant admission order errors were also associated with older age and number of pre-admission medications. Clinically relevant discharge medication errors were more likely for every PAML error (IRR = 1.31; 95% CI, 1.19–1.45) and number of medications changed prior to discharge (IRR = 1.06; 95% CI, 1.01–1.11).
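Incidence rate ratios from a negative binomial (log-linear) model act multiplicatively on the expected error count, so effects compound across units of the predictor. A small sketch using the IRRs reported above (the example patient and the per-patient calculation are hypothetical illustrations):

```python
# IRRs from the abstract: each additional pre-admission medication
# multiplies the expected number of clinically relevant PAML errors
# by 1.17; a recent EMR medication list multiplies it by 0.54.
IRR_PER_MED = 1.17
IRR_EMR_LIST = 0.54

def relative_error_rate(extra_meds: int, has_emr_list: bool) -> float:
    """Expected error rate relative to a reference patient, assuming
    the multiplicative (log-linear) structure of the fitted model."""
    rate = IRR_PER_MED ** extra_meds
    if has_emr_list:
        rate *= IRR_EMR_LIST
    return rate

# A hypothetical patient on 5 more medications than the reference,
# without a recent EMR list, has ~2.2x the expected error rate.
print(round(relative_error_rate(5, False), 2))  # ~2.19
print(round(relative_error_rate(5, True), 2))   # ~1.18
```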
Conclusions
Medication reconciliation errors are common at hospital admission and discharge. Errors in preadmission medication histories are associated with older age and number of medications and lead to more discharge reconciliation errors. A recent medication list in the EMR is protective against medication reconciliation errors.
doi:10.1007/s11606-012-2003-y
PMCID: PMC3403136  PMID: 22350761
medication reconciliation; hospital; medication errors; admission; discharge
12.  Endovascular Laser Therapy for Varicose Veins 
Executive Summary
Objective
The objective of the MAS evidence review was to conduct a systematic review of the available evidence on the safety, effectiveness, durability and cost–effectiveness of endovascular laser therapy (ELT) for the treatment of primary symptomatic varicose veins (VV).
Background
The Ontario Health Technology Advisory Committee (OHTAC) met on November 27, 2009 to review the safety, effectiveness, durability and cost-effectiveness of ELT for the treatment of primary VV based on an evidence-based review by the Medical Advisory Secretariat (MAS).
Clinical Condition
VV are tortuous, twisted, or elongated veins. This can be due to pre-existing (inherited) valve dysfunction or decreased vein elasticity (primary venous reflux), or valve damage from prior thrombotic events (secondary venous reflux). The end result is pooling of blood in the veins, increased venous pressure and subsequent vein enlargement. As a result of high venous pressure, branch vessels balloon out, leading to varicosities (varicose veins).
Symptoms typically affect the lower extremities and include (but are not limited to): aching, swelling, throbbing, night cramps, restless legs, leg fatigue, itching and burning. Left untreated, venous reflux tends to be progressive, often leading to chronic venous insufficiency (CVI).
A number of complications are associated with untreated venous reflux, including superficial thrombophlebitis as well as variceal rupture and haemorrhage. CVI often results in chronic skin changes referred to as stasis dermatitis. Stasis dermatitis is comprised of a spectrum of cutaneous abnormalities including edema, hyperpigmentation, eczema, lipodermatosclerosis and stasis ulceration. Ulceration represents the disease end point for severe CVI.
CVI is associated with a reduced quality of life, particularly in relation to pain, physical function and mobility. In severe cases (VV with ulcers), QOL has been rated as bad as or worse than that of other chronic diseases such as back pain and arthritis.
Lower limb VV is a common disease affecting adults and is estimated to be the seventh most common reason for physician referral in the US. There is a strong familial predisposition to VV, with the risk in offspring being 90% if both parents are affected, 20% when neither is affected, and 45% (25% boys, 62% girls) if one parent is affected. Globally, the prevalence of VV ranges from 5% to 15% among men and 3% to 29% among women, varying by the age, gender, and ethnicity of the study population, survey methods, and disease definition and measurement. The annual incidence of VV estimated from the Framingham Study was reported to be 2.6% among women and 1.9% among men and did not vary within the age range (40-89 years) studied.
Approximately 1% of the adult population has a stasis ulcer of venous origin at any one time, with 4% at risk. The majority of leg ulcer patients are elderly with simple superficial vein reflux. Stasis ulcers are often lengthy medical problems that can last for several years and, despite effective compression therapy and multilayer bandaging, are associated with high recurrence rates. Recent trials involving surgical treatment of superficial vein reflux have resulted in healing and significantly reduced recurrence rates.
Endovascular Laser Therapy for VV
ELT is an image-guided, minimally invasive treatment alternative to surgical stripping of superficial venous reflux. It does not require an operating room or general anesthesia and has been performed in outpatient settings by a variety of medical specialties including surgeons (vascular or general), interventional radiologists and phlebologists. Rather than surgically removing the vein, ELT works by destroying, cauterizing or ablating the refluxing vein segment using heat energy delivered via laser fibre.
Prior to ELT, colour-flow Doppler ultrasonography is used to confirm and map all areas of venous reflux to devise a safe and effective treatment plan. The ELT procedure involves the introduction of a guide wire into the target vein under ultrasound guidance followed by the insertion of an introducer sheath through which an optical fibre carrying the laser energy is advanced. A tumescent anesthetic solution is injected into the soft tissue surrounding the target vein along its entire length. This serves to anaesthetize the vein so that the patient feels no discomfort during the procedure. It also serves to insulate the heat from damaging adjacent structures, including nerves and skin. Once satisfactory positioning has been confirmed with ultrasound, the laser is activated. Both the laser fibre and the sheath are simultaneously, slowly and continuously pulled back along the length of the target vessel. At the end of the procedure, hemostasis is achieved by applying pressure to the entry point.
Adequate and proper compression stockings and bandages are applied after the procedure to reduce the risk of venous thromboembolism, and to reduce postoperative bruising and tenderness. Patients are encouraged to walk immediately after the procedure and most patients return to work or usual activity within a few days. Follow-up protocols vary, with most patients returning 1-3 weeks later for an initial follow-up visit. At this point, the initial clinical result is assessed and occlusion of the treated vessels is confirmed with ultrasound. Patients often have a second follow-up visit 1-3 months following ELT at which time clinical evaluation and ultrasound are repeated. If required, sclerotherapy may be performed during the ELT procedure or at any follow-up visits.
Regulatory Status
Endovascular laser for the treatment of VV was approved by Health Canada as a class 3 device in 2002. The treatment has been an insured service in Saskatchewan since 2007; Saskatchewan is the only province to insure ELT. Although the treatment is not an insured service in Ontario, it has been provided by various medical specialties since 2002 in over 20 private clinics.
Methods
Literature Search
The MAS evidence-based review was performed as an update to the 2007 health technology review performed by the Australian Medical Services Advisory Committee (MSAC) to support public financing decisions. The literature search was performed on August 18, 2009 using standard bibliographic databases for studies published from January 1, 2007 to August 15, 2009. Search alerts were generated and reviewed for additional relevant literature up until October 1, 2009.
Inclusion Criteria
English language full-reports and human studies
Original reports with defined study methodology
Reports including standardized measurements on outcome events such as technical success, safety, effectiveness, durability, quality of life or patient satisfaction
Reports involving ELT for VV (great or small saphenous veins)
Randomized controlled trials (RCTs), systematic reviews and meta-analyses
Cohort and controlled clinical studies involving > 1 month ultrasound imaging follow-up
Exclusion Criteria
Non systematic reviews, letters, comments and editorials
Reports not involving outcome events such as safety, effectiveness, durability, or patient satisfaction following an intervention with ELT
Reports not involving interventions with ELT for VV
Pilot studies or studies with small samples (< 50 subjects)
Summary of Findings
The MAS evidence search identified 14 systematic reviews, 29 cohort studies on safety and effectiveness, four cost studies and 12 randomized controlled trials involving ELT, six of these comparing endovascular laser with surgical ligation and saphenous vein stripping.
Since 2007, 22 cohort studies involving 10,883 patients undergoing ELT of the great saphenous vein (GSV) have been published. Imaging-defined treatment effectiveness, measured as mean vein closure rate, was reported to be greater than 90% (range 93%–99%) at short-term follow-up. Follow-up longer than one year was reported in five studies, with life-table analysis performed in four, but follow-up remained limited at three and four years. The overall pooled major adverse event rate, including DVT, PE, skin burn, or nerve damage events extracted from these studies, was 0.63% (69/10,883).
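The pooled rate above is simply the event count over total patients. A minimal sketch reproducing the 0.63% figure, with a normal-approximation 95% CI added for illustration (the CI is our addition; it is not reported in the review):

```python
import math

events, n = 69, 10_883   # pooled major adverse events across the cohort studies

p = events / n
se = math.sqrt(p * (1 - p) / n)          # normal-approximation standard error
lo, hi = p - 1.96 * se, p + 1.96 * se

print(f"pooled rate = {p:.2%} (95% CI {lo:.2%} to {hi:.2%})")
```

For rare events like this, an exact or Wilson interval would be slightly wider and is often preferred; the normal approximation is shown only for brevity.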
The overall level of evidence of the randomized trials comparing ELT with surgical ligation and vein stripping (n = 6) was graded as moderate to high. Recovery after treatment was significantly quicker after ELT (median return to work, 4 vs. 17 days; p = .005). Major adverse events were more frequent after surgery (1.8% [n = 4] vs. 0.4% [n = 1]), but not significantly so. Treatment effectiveness, as measured by imaging-confirmed vein absence or closure, symptom relief, or quality of life, was similar in the two treatment groups, and both treatments resulted in statistically significant improvements in these outcomes. Recurrence at follow-up was low after both treatments, but neovascularization (growth of new vessels), a key predictor of long-term recurrence, was significantly more common after surgery (18% vs. 1%; p = .001). Although patient satisfaction was reported to be high (>80%) with both treatments, patient preferences evaluated through the recruitment process, physician reports, and consumer groups were strongly in favour of ELT. For patients, minimal complications, quick recovery, and the dependability of outpatient scheduling were key considerations.
As clinical effectiveness of the two treatments was similar, a cost-analysis was performed to compare differences in resources and costs between the two procedures. A budget impact analysis for introducing ELT as an insured service was also performed. The average case cost (based on Ontario hospital costs and medical resources) for surgical vein stripping was estimated to be $1,799. Because of uncertainties in the resources associated with ELT, hospital costs (in addition to the device-related costs) were varied and assumed to be either the same as those for surgery or 40% lower, resulting in an average ELT case cost of $2,025 or $1,602, respectively.
Based on the historical pattern of surgical vein stripping for varices, a 5-year projection was made for annual volumes and costs. In Ontario in 2007/2008, 3,481 surgical vein stripping procedures were performed, 28% of them repeat procedures. Annual volumes of ELT currently performed in the province's more than 20 private clinics were estimated at approximately 840. If ELT were publicly reimbursed, it was assumed that it would capture 35% of the vein stripping market in the first year, increasing to 55% in subsequent years. Based on these assumptions, if ELT were not publicly reimbursed, the province would pay approximately $5.9 million; if ELT were reimbursed, the province would pay $8.2 million if the hospital costs for ELT were the same as for surgery, and $7.1 million if they were 40% lower.
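The projection above reduces to a blended-cost calculation: public spend = volume × (capture × ELT cost + (1 − capture) × surgery cost). A sketch using figures from the review (annual volume and first-year capture only; the report's 5-year totals also reflect volume trends and repeat procedures, so the outputs here are illustrative rather than a reproduction of the published figures):

```python
VOLUME = 3481            # surgical vein stripping procedures, Ontario 2007/08
SURGERY_COST = 1799      # average case cost, $
ELT_COST_HIGH = 2025     # ELT, hospital costs assumed equal to surgery
ELT_COST_LOW = 1602      # ELT, hospital costs assumed 40% lower

def annual_public_spend(capture: float, elt_cost: float) -> float:
    """Blended annual cost to the public payer if ELT captures the
    given share of the vein-stripping market."""
    return VOLUME * (capture * elt_cost + (1 - capture) * SURGERY_COST)

baseline = annual_public_spend(0.0, ELT_COST_HIGH)    # no ELT reimbursement
year1_high = annual_public_spend(0.35, ELT_COST_HIGH)  # first-year capture 35%
year1_low = annual_public_spend(0.35, ELT_COST_LOW)
print(f"${baseline:,.0f} vs ${year1_high:,.0f} / ${year1_low:,.0f}")
```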
The conclusions on the comparative outcomes between laser ablation and surgical ligation and saphenous vein stripping are summarized in the table below (ES Table 1).
Outcome comparisons of ELT vs. surgery for VV
The outcomes of the evidence-based review on these treatments based on three different perspectives are summarized below:
Patient Outcomes – ELT vs. Surgery
ELT has a quicker recovery, attributable to decreased pain, fewer minor complications, and the use of local anesthesia with immediate ambulation.
ELT is as effective as surgery in the short term, as assessed by anatomic imaging outcomes, symptomatic relief and HRQOL outcomes.
Recurrence is similar after both treatments, but neovascularization, a key predictor of long-term recurrence, is significantly more common after surgery.
Patient satisfaction is equally high after both treatments, but patient preference strongly favours ELT. Surgeons performing ELT are satisfied with treatment outcomes and regularly offer ELT as a treatment alternative to surgery.
Clinical or Technical Advantages – ELT Over Surgery
An endovascular approach can treat multilevel disease and difficult-to-treat areas more easily and more precisely.
ELT is an effective and less invasive treatment for elderly patients with VV and those with venous leg ulcers.
System Outcomes – ELT Replacing Surgery
ELT may offer system advantages in that the treatment can be provided by several medical specialties in outpatient settings and does not require an operating theatre or general anesthesia.
The treatment may result in fewer pre-surgical investigations, decanting of patients from the OR, decreased demand on anesthetists' time, shorter hospital stays, shorter wait times for VV treatment, and more reliable outpatient scheduling.
Depending on the reimbursement mechanism for the treatment, however, it may also result in closure of outpatient clinics, with increasing centralization of procedures in selected hospitals with large capital budgets, resulting in larger and longer waiting lists.
Procedure costs may be similar for the two treatments, but the budget impact may be greater if ELT is insured, because of the transfer of cases from the private market to the public payer system.
PMCID: PMC3377531  PMID: 23074409
13.  Relationship between Vehicle Emissions Laws and Incidence of Suicide by Motor Vehicle Exhaust Gas in Australia, 2001–06: An Ecological Analysis 
PLoS Medicine  2010;7(1):e1000210.
In an ecological study, David Studdert and colleagues show that areas of Australia with fewer vehicles pre-dating stringent carbon monoxide emission laws have lower rates of suicide due to asphyxiation by motor vehicle exhaust gas.
Background
Globally, suicide accounts for 5.2% of deaths among persons aged 15 to 44 years and its incidence is rising. In Australia, suicide rates peaked in 1997 and have been declining since. A substantial part of that decline stems from a plunge in suicides by one particular method: asphyxiation by motor vehicle exhaust gas (MVEG). Although MVEG remains the second most common method of suicide in Australia, its incidence decreased by nearly 70% in the decade to 2006. The extent to which this phenomenon has been driven by national laws passed in 1986 and 1999 that lowered permissible levels of carbon monoxide (CO) emissions is unknown. The objective of this ecological study was to test the relationship by investigating whether areas of Australia with fewer noxious vehicles per capita experienced lower rates of MVEG suicide.
Methods and Findings
We merged data on MVEG suicides in Australia (2001–06) with data on the number and age of vehicles in the national fleet, as well as socio-demographic data from the national census. Poisson regression was used to analyse the relationship between the incidence of suicide within two levels of geographical area—postcodes and statistical subdivisions (SSDs)—and the population density of pre-1986 and pre-1999 passenger vehicles in those areas. (There was a mean population of 8,302 persons per postcode in the study dataset and 87,413 persons per SSD.) The annual incidence of MVEG suicides nationwide decreased by 57% (from 2.6 per 100,000 in 2001 to 1.1 in 2006) during the study period; the population density of pre-1986 and pre-1999 vehicles decreased by 55% (from 14.2 per 100 persons in 2001 to 6.4 in 2006) and 26% (from 44.5 per 100 persons in 2001 to 32.9 in 2006), respectively. Area-level regression analysis showed that the suicide rates were significantly and positively correlated with the presence of older vehicles. A percentage point decrease in the population density of pre-1986 vehicles was associated with a 6% decrease (rate ratio [RR] = 1.06; 95% confidence interval [CI] 1.05–1.08) in the incidence of MVEG suicide within postcode areas; a percentage point decrease in the population density of pre-1999 vehicles was associated with a 3% decrease (RR = 1.03; 95% CI 1.02–1.04) in the incidence of MVEG suicide.
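As a rough illustration of what rate ratios of this size imply, the per-percentage-point rate ratios can be compounded over the observed declines in vehicle density. This is a back-of-the-envelope calculation under an assumed multiplicative model, not part of the study's adjusted area-level analysis.

```python
# Back-of-the-envelope use of the reported rate ratios (illustrative only;
# the study's Poisson models adjust for area-level covariates omitted here).
RR_PRE1986 = 1.06  # rate ratio per percentage point of pre-1986 vehicle density
RR_PRE1999 = 1.03  # rate ratio per percentage point of pre-1999 vehicle density

def incidence_multiplier(rr_per_point: float, density_start: float,
                         density_end: float) -> float:
    """Predicted multiplicative change in MVEG suicide incidence when
    vehicle density (per 100 persons) moves from density_start to density_end."""
    return rr_per_point ** (density_end - density_start)

# National declines over 2001-06 reported in the abstract.
m86 = incidence_multiplier(RR_PRE1986, 14.2, 6.4)   # ~0.63
m99 = incidence_multiplier(RR_PRE1999, 44.5, 32.9)  # ~0.71

print(f"pre-1986 fleet decline alone -> x{m86:.2f} incidence")
print(f"pre-1999 fleet decline alone -> x{m99:.2f} incidence")
```

On this crude reading, turnover of the pre-1986 fleet alone would predict roughly a one-third fall in MVEG suicide incidence, against the 57% decline actually observed, leaving room for other contributing factors.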
Conclusions
Areas of Australia with fewer vehicles predating stringent CO emission laws experience lower rates of MVEG suicide. Although those emission laws were introduced primarily for environmental reasons, countries that lack them may miss the benefits of a serendipitous suicide prevention strategy.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Suicide (self-inflicted death) is a major, preventable public-health problem. About 1 million people die each year from suicide and about 20 times as many people attempt suicide. Globally, suicide rates have increased by nearly a half over the past 45 years and suicide is now among the three leading causes of death in people aged 15–44 years. Within this age group, 1 in 20 deaths is a suicide. Most people who commit suicide have a mental illness, usually depression or substance abuse, but suicide can also be triggered by a stressful event such as losing a partner. Often warning signs are present—a person who talks about killing themselves must always be taken seriously. Adequate prevention and treatment of mental illness and interventions that teach young people coping skills and improve their self-esteem have shown promise in reducing suicide rates, as have strategies (for example, restrictions on the sale of pain killers) that reduce access to common methods of suicide.
Why Was This Study Done?
In Australia, the suicide rate has been declining since 1997, when a record 2,722 suicides occurred. Fewer suicides by asphyxiation (oxygen deprivation) by motor vehicle exhaust gas (MVEG) account for much of this decline. MVEG contains carbon monoxide, a toxic gas that blocks oxygen transport around the body. Although MVEG suicide is still the second most common means of suicide in Australia, its incidence has dropped by two-thirds since 1997. But why? One possibility is that national laws passed in 1986 and 1999, which lowered the permissible level of carbon monoxide in vehicle exhaust for environmental reasons, have driven the decline in MVEG suicides. Evidence from other countries suggests that this might be the case, but no one has directly investigated the relationship between MVEG suicide and the use of vehicles with reduced carbon monoxide emissions. In this ecological study (a study in which the effect of an intervention is studied on groups of people rather than on individuals), the researchers ask whether the number of pre-1986 and pre-1999 vehicles within particular geographic areas in Australia is correlated with the rates of MVEG suicide in those areas between 2001 and 2006.
What Did the Researchers Do and Find?
The researchers obtained data on MVEG suicides from the Australian National Coroners Information System and data on the number and age of vehicles on the road from the Australian Bureau of Statistics. MVEG suicides dropped from 498 in 2001 to 231 in 2006, they report. In 2001, 28% of passenger vehicles registered in Australia were made before 1986, but by 2006 only 12% were; the percentage of registered vehicles made before 1999 fell from 89% to 60% over the same period. The researchers then used a statistical technique called Poisson regression to analyze the relationship within postcode areas between the incidence of MVEG suicide and the presence of pre-1986 and pre-1999 vehicles. This analysis showed that in areas where older vehicles were more numerous there were more MVEG suicides (a positive correlation). Specifically, the researchers calculate that if the proportion of pre-1986 vehicles on the road in Australia had stayed at 2001 levels throughout their study period, 621 extra MVEG suicides would have occurred in the country over that time.
What Do These Findings Mean?
These findings show that in areas of Australia that had fewer vehicles on the road predating stringent vehicle emission laws, there were lower rates of MVEG suicide between 2001 and 2006. Unfortunately, this study cannot provide any information on the actual age of vehicles used in MVEG suicides or on the relationship between vehicle age and attempted MVEG suicides. It also cannot reveal whether those areas that had the sharpest decreases in the density of older vehicles had the sharpest decreases in suicide rates because very few suicides occurred in most postcodes during the study. Most importantly, the design of this study means that the researchers cannot discount the possibility that the changes in Australia's emission laws have steered people towards other methods of taking their own lives. Nevertheless, the findings of this study suggest that the introduction of stringent vehicle emission laws for environmental reasons might, serendipitously, be a worthwhile long-term suicide prevention strategy in countries where MVEG suicide is common.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000210.
Another PLoS Medicine research article, by Shu-Sen Chang and colleagues, investigates the evolution of the epidemic of charcoal-burning suicide in Taiwan
The US National Institute of Mental Health provides information on suicide and suicide prevention
The UK National Health Service Choices Web site has detailed information about suicide and its prevention
The World Health Organization provides information on the global burden of suicide and on suicide prevention (in several languages)
MedlinePlus provides links to further resources about suicide (in English and Spanish)
Suicide Prevention Australia is a nonprofit, nongovernmental organization working as a public-health advocate in suicide prevention
The Australian Institute of Health and Welfare has recently published a review of suicide statistics in Australia
The National Coroners Information System is a database that contains information on every death reported to an Australian coroner since July 2000 (January 2001 for Queensland)
doi:10.1371/journal.pmed.1000210
PMCID: PMC2796388  PMID: 20052278
14.  Erectile Dysfunction Severity as a Risk Marker for Cardiovascular Disease Hospitalisation and All-Cause Mortality: A Prospective Cohort Study 
PLoS Medicine  2013;10(1):e1001372.
In a prospective Australian population-based study linking questionnaire data from 2006–2009 with hospitalisation and death data to June 2010 for 95,038 men aged ≥45 years, Banks and colleagues found that more severe erectile dysfunction was associated with higher risk of cardiovascular disease.
Background
Erectile dysfunction is an emerging risk marker for future cardiovascular disease (CVD) events; however, evidence on dose response and specific CVD outcomes is limited. This study investigates the relationship between severity of erectile dysfunction and specific CVD outcomes.
Methods and Findings
We conducted a prospective population-based Australian study (the 45 and Up Study) linking questionnaire data from 2006–2009 with hospitalisation and death data to 30 June and 31 Dec 2010 respectively for 95,038 men aged ≥45 y. Cox proportional hazards models were used to examine the relationship of reported severity of erectile dysfunction to all-cause mortality and first CVD-related hospitalisation since baseline in men with and without previous CVD, adjusting for age, smoking, alcohol consumption, marital status, income, education, physical activity, body mass index, diabetes, and hypertension and/or hypercholesterolaemia treatment. There were 7,855 incident admissions for CVD and 2,304 deaths during follow-up (mean time from recruitment, 2.2 y for CVD admission and 2.8 y for mortality). Risks of CVD and death increased steadily with severity of erectile dysfunction. Among men without previous CVD, those with severe versus no erectile dysfunction had significantly increased risks of ischaemic heart disease (adjusted relative risk [RR] = 1.60, 95% CI 1.31–1.95), heart failure (8.00, 2.64–24.2), peripheral vascular disease (1.92, 1.12–3.29), “other” CVD (1.26, 1.05–1.51), all CVD combined (1.35, 1.19–1.53), and all-cause mortality (1.93, 1.52–2.44). For men with previous CVD, corresponding RRs (95% CI) were 1.70 (1.46–1.98), 4.40 (2.64–7.33), 2.46 (1.63–3.70), 1.40 (1.21–1.63), 1.64 (1.48–1.81), and 2.37 (1.87–3.01), respectively. Among men without previous CVD, RRs of more specific CVDs increased significantly with severe versus no erectile dysfunction, including acute myocardial infarction (1.66, 1.22–2.26), atrioventricular and left bundle branch block (6.62, 1.86–23.56), and (peripheral) atherosclerosis (2.47, 1.18–5.15), with no significant difference in risk for conditions such as primary hypertension (0.61, 0.16–2.35) and intracerebral haemorrhage (0.78, 0.20–2.97).
Conclusions
These findings give support for CVD risk assessment in men with erectile dysfunction who have not already undergone assessment. The utility of erectile dysfunction as a clinical risk prediction tool requires specific testing.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Erectile dysfunction is the medical term used when a man is unable to achieve or sustain an erection of his penis suitable for sexual intercourse. Although a sensitive topic that can cause much embarrassment and distress, erectile dysfunction is very common, with an estimated 40% of men over the age of 40 years experiencing frequent or occasional difficulties. The most common causes of erectile dysfunction are medications, chronic illnesses such as diabetes, and drinking too much alcohol. Stress and mental health problems can also cause or worsen erectile dysfunction. There is also increasing evidence that erectile dysfunction may actually be a symptom of cardiovascular disease—a leading cause of death worldwide—as erectile dysfunction could indicate a problem with blood vessels or poor blood flow commonly associated with cardiovascular disease.
Why Was This Study Done?
Although previous studies have suggested that erectile dysfunction can serve as a marker for cardiovascular disease in men not previously diagnosed with the condition, few studies to date have investigated whether erectile dysfunction could also indicate worsening disease in men already diagnosed with cardiovascular disease. In addition, previous studies have typically been small and have not graded the severity of erectile dysfunction or investigated the specific types of cardiovascular disease associated with erectile dysfunction. In this large study conducted in Australia, the researchers investigated the relationship of the severity of erectile dysfunction with a range of cardiovascular disease outcomes among men with and without a previous diagnosis of cardiovascular disease.
What Did the Researchers Do and Find?
The researchers used information from the established 45 and Up Study, a large cohort study that includes 123,775 men aged 45 and over, selected at random from the general population of New South Wales, a large region of Australia. A total of 95,038 men were included in this analysis. The male participants completed a postal questionnaire that included a question on erectile functioning, which allowed the researchers to define erectile dysfunction as none, mild, moderate, or severe. Using information captured in the New South Wales Admitted Patient Data Collection—a complete record of all public and private hospital admissions, including the reasons for admission and the clinical diagnosis—and the government death register, the researchers were able to determine health outcomes of all study participants. They then used a statistical model to estimate hospital admissions for cardiovascular disease events for different levels of erectile dysfunction.
The researchers found that the rates of severe erectile dysfunction among study participants were 2.2% for men aged 45–54 years, 6.8% for men aged 55–64 years, 20.2% for men aged 65–74 years, 50.0% for men aged 75–84 years, and 75.4% for men aged 85 years and over. During the study period, the researchers recorded 7,855 hospital admissions related to cardiovascular disease and 2,304 deaths. The researchers found that among men without previous cardiovascular disease, those with severe erectile dysfunction were more likely to develop ischemic heart disease (risk 1.60), heart failure (risk 8.00), peripheral vascular disease (risk 1.92), and other causes of cardiovascular disease (risk 1.26) than men without erectile dysfunction. The risks of heart attacks and heart conduction problems were also increased (1.66 and 6.62, respectively). Furthermore, the combined risk of all cardiovascular disease outcomes was 1.35, and the overall risk of death was also higher (risk 1.93) in these men. The researchers found that these increased risks were similar in men with erectile dysfunction who had previously been diagnosed with cardiovascular disease.
What Do These Findings Mean?
These findings suggest that compared to men without erectile dysfunction, there is an increasing risk of ischemic heart disease, peripheral vascular disease, and death from all causes in those with increasing degrees of severity of erectile dysfunction. The authors emphasize that erectile dysfunction is a risk marker for cardiovascular disease, not a risk factor that causes cardiovascular disease. These findings add to previous studies and highlight the need to consider erectile dysfunction in relation to the risk of different types of cardiovascular disease, including heart failure and heart conduction disorders. However, the study's reliance on the answer to a single self-assessed question on erectile functioning limits the findings. Nevertheless, these findings provide useful information for clinicians: men with erectile dysfunction are at higher risk of cardiovascular disease, and the worse the erectile dysfunction, the higher the risk of cardiovascular disease. Men with erectile dysfunction, even at mild or moderate levels, should be screened and treated for cardiovascular disease accordingly.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001372.
Wikipedia defines erectile dysfunction (note that Wikipedia is a free online encyclopedia that anyone can edit)
MedlinePlus also has some useful patient information on erectile dysfunction
The Mayo Clinic has patient-friendly information on the causes of, and treatments for, erectile dysfunction, and also includes information on the link with cardiovascular disease
The National Heart Foundation of Australia provides information for health professionals, patients, and the general public about how to prevent and manage cardiovascular disease, including assessment and management of cardiovascular disease risk
doi:10.1371/journal.pmed.1001372
PMCID: PMC3558249  PMID: 23382654
15.  Year in Review: Medication Mishaps in the Elderly 
Objective
This paper reviews articles from the past year that examined medication mishaps (i.e., medication errors and adverse drug events [ADEs]) in the elderly.
Methods
The MEDLINE and EMBASE databases were searched for English-language articles published in 2010 using a combination of search terms including: medication errors, medication adherence, medication compliance, suboptimal prescribing, monitoring, adverse drug events, adverse drug withdrawal events, therapeutic failures, and aged. A manual search of the reference lists of the identified articles and the authors’ article files, book chapters and recent reviews was conducted to identify additional publications. Five studies of note were selected for annotation and critique. From this literature search, this paper also provides a selected bibliography of manuscripts published in 2010 (excluding those previously published in the American Journal of Geriatric Pharmacotherapy or by one of the authors) that address various types of medication errors and ADEs in the elderly.
RESULTS
Three studies addressed types of medication errors. One study examined underuse (due to prescribing) as a type of medication error. This before-and-after study from the Netherlands reported that those who received comprehensive geriatric assessments had a reduction in the rate of under-treatment of chronic conditions of over a third (from 32.9% to 22.3%, p < 0.05). A second study focused on reducing medication errors due to the prescribing of potentially inappropriate medications. This quasi-experimental study found that a computerized provider order entry clinical decision support system decreased the number of potentially inappropriate medications ordered for patients ≥ 65 years of age who were hospitalized (from 11.56 to 9.94 orders per day, p < 0.001). The third medication error study was a cross-sectional phone survey of managed-care elders. This study found that more blacks than whites had low antihypertensive medication adherence as per a self-reported measure (18.4% vs. 12.3%, respectively; p < 0.001). Moreover, blacks used more complementary and alternative medicine (CAM) than whites for the treatment of hypertension (30.5% vs. 24.7%, respectively; p = 0.005). In multivariable analyses stratified by race, blacks who used CAM were more likely than those who did not to have low antihypertensive medication adherence (prevalence rate ratio 1.56, 95% confidence interval 1.14–2.15, p = 0.006).
The remaining two articles each addressed some form of medication adverse events. A case-control study of Medicare Advantage patients revealed for the first time that skeletal muscle relaxant use was significantly associated with an increased fracture risk (adjusted odds ratio 1.40, 95% confidence interval 1.15–1.72; p < 0.001). This increased risk was even more pronounced with the concomitant use of benzodiazepines. Finally, a randomized controlled trial across 16 centers in France used a one-week educational intervention about high-risk medications and ADEs directed at rehabilitation health care teams. They found that the rate of ADEs in the intervention group was lower than that in the usual care group (22% vs. 36%, respectively, p = 0.004).
CONCLUSION
Information from these studies may be used to advance health professionals’ understanding of medication errors and ADEs and may help guide research and clinical practices in years to come.
doi:10.1016/j.amjopharm.2011.01.003
PMCID: PMC3457784  PMID: 21459304
medication errors; suboptimal prescribing; medication adherence; drug monitoring; adverse drug events; aged
16.  Does the implementation of an electronic prescribing system create unintended medication errors? A study of the sociotechnical context through the analysis of reported medication incidents 
Background
Even though electronic prescribing systems are widely advocated as one of the most effective means of improving patient safety, they may also introduce new risks that are not immediately obvious. Through the study of specific incidents related to the processes involved in the administration of medication, we sought to find out if the prescribing system had unintended consequences in creating new errors. The focus of this study was a large acute hospital in the Midlands in the United Kingdom, which implemented a Prescribing, Information and Communication System (PICS).
Methods
This exploratory study was based on a survey of routinely collected medication incidents over five months. Data were independently reviewed by two of the investigators, with clinical pharmacology and nursing backgrounds respectively, and grouped into broad types: sociotechnical incidents (related to human interactions with the system) and non-sociotechnical incidents. Sociotechnical incidents were distinguished from the others because they occurred at the point where the system and the professional intersected and would not have occurred in the absence of the system. The day of the week and time of day that an incident occurred were tested using univariable and multivariable analyses. We acknowledge the limitations of analysing data extracted from incident reports: it is widely recognised that most medication errors are not reported, and incident reports may contain inaccurate data. Interpretation of results must therefore be tentative.
Results
Out of a total of 485 incidents, a modest 15% (n = 73) were distinguished as sociotechnical issues and thus may be unique to hospitals that have such systems in place. These incidents were further analysed and subdivided into categories in order to identify aspects of the context which gave rise to adverse situations and possible risks to patient safety. The analysis of sociotechnical incidents by time of day and day of week indicated a trend for increased proportions of these types of incidents occurring on Sundays.
Conclusion
Introducing an electronic prescribing system has the potential to give rise to new types of risks to patient safety. Being aware of these types of errors is important to the clinical and technical implementers of such systems in order to, where possible, design out unintended problems, highlight training requirements, and revise clinical practice protocols.
doi:10.1186/1472-6947-11-29
PMCID: PMC3116457  PMID: 21569397
17.  Routine Eye Examinations for Persons 20-64 Years of Age 
Executive Summary
Objective
The objective of this analysis was to determine the strength of association between age, gender, ethnicity, family history of disease, and refractive error and the risk of developing glaucoma or ARM.
Clinical Need
A routine eye exam serves a primary, secondary, and tertiary care role. In a primary care role, it allows contact with a doctor who can provide advice about eye care, which may reduce the incidence of eye disease and injury. In a secondary care role, it can, via a case-finding approach, diagnose persons with degenerative eye diseases such as glaucoma or AMD and lead to earlier treatment to slow the progression of disease. Finally, in a tertiary care role, it provides ongoing monitoring and treatment to those with diseases associated with vision loss.
Glaucoma is a progressive degenerative disease of the optic nerve that causes gradual loss of peripheral (side) vision and, in advanced disease states, loss of central vision. Blindness may result if glaucoma is not diagnosed and managed. The prevalence of primary open angle glaucoma (POAG) ranges from 1.1% to 3.0% in Western populations, and from 4.2% to 8.8% in populations of African descent. It is estimated that up to 50% of people with glaucoma are unaware that they have the disease. In Canada, glaucoma is the second leading cause of blindness in people aged 50 years and older. Tonometry, inspection of the optic disc, and perimetry are used concurrently by physicians and optometrists to make the diagnosis of glaucoma. In general, the evidence shows that treating people with increased IOP alone, with increased IOP and clinical signs of early glaucoma, or with normal-tension glaucoma can reduce the progression of disease.
Age-related maculopathy (ARM) is a degenerative disease of the macula, which is a part of the retina. Damage to the macula causes loss of central vision, affecting the ability to read, recognize faces and move about freely. ARM can be divided into an early stage (early ARM) and a late stage (AMD). AMD is the leading cause of blindness in developed countries. The prevalence of AMD increases with increasing age. It is estimated that 1% of people 55 years of age, 5% of those aged 75 to 84 years, and 15% of those 80 years of age and older have AMD. ARM can be diagnosed during fundoscopy (ophthalmoscopy), a visual inspection of the retina by a physician or optometrist, or from a photograph of the retina. There is no cure or prevention for ARM. Likewise, there is currently no treatment to restore vision lost due to AMD. However, there are treatments to delay the progression of the disease and further loss of vision.
The Technology
A periodic oculo-visual assessment is defined “as an examination of the eye and vision system rendered primarily to determine if a patient has a simple refractive error (visual acuity assessment) including myopia, hypermetropia, presbyopia, anisometropia or astigmatism.” This service includes a history of the presenting complaint, past medical history, visual acuity examination, ocular mobility examination, slit lamp examination of the anterior segment, ophthalmoscopy, and tonometry (measurement of IOP) and is completed by either a physician or an optometrist.
Review Strategy
The Medical Advisory Secretariat conducted a computerized search of the literature in the following databases: OVID MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, EMBASE, INAHTA and the Cochrane Library. The search was limited to English-language articles with human subjects, published from January 2000 to March 2006. In addition, a search was conducted for published guidelines, health technology assessments, and policy decisions. Bibliographies of relevant papers were searched for additional references that may have been missed in the computerized database search. The review included studies with participants 20 years and older; population-based prospective cohort studies (and population-based cross-sectional studies when prospective cohort studies were unavailable or insufficient); and studies determining and reporting the strength of association, or risk-specific prevalence or incidence rates, of age, gender, ethnicity, refractive error or family history of disease and the risk of developing glaucoma or AMD. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) system was used to summarize the overall quality of the body of evidence.
Summary of Findings
A total of 498 citations for the period January 2000 through February 2006 were retrieved, and an additional 313 were identified when the search was expanded to include articles published between 1990 and 1999. An additional 6 articles were obtained from bibliographies of relevant articles. Of these, 36 articles were retrieved for further evaluation. Upon review, 1 meta-analysis and 15 population-based epidemiological studies were accepted for this review.
Primary Open Angle Glaucoma
Age
Six cross-sectional studies and 1 prospective cohort study contributed data on the association between age and POAG. From the data it can be concluded that the prevalence and 4-year incidence of POAG increase with increasing age. The odds of having POAG are statistically significantly greater for people 50 years of age and older relative to those 40 to 49 years of age. There is an estimated 7% per year incremental odds of having POAG in persons 40 years of age and older, and 10% per year in persons 49 years of age and older. POAG is undiagnosed in up to 50% of the population. The quality of the evidence is moderate.
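The per-year incremental odds compound multiplicatively. The sketch below shows what a constant 7% or 10% per-year increment implies over a 20-year age gap; the ages and the assumption of a constant increment are illustrative, not findings of the review.

```python
# How per-year incremental odds compound (illustration only; the constant
# per-year increment and the chosen ages are assumptions, not review data).
def odds_multiplier(per_year_increment: float, years: float) -> float:
    """Odds ratio accumulated over `years` at a constant per-year increment."""
    return (1.0 + per_year_increment) ** years

# 7%/year from age 40: odds of POAG at age 60 relative to age 40.
print(round(odds_multiplier(0.07, 20), 2))  # ~3.87

# 10%/year from age 49: odds of POAG at age 69 relative to age 49.
print(round(odds_multiplier(0.10, 20), 2))  # ~6.73
```

This is why even modest annual increments translate into severalfold higher odds of POAG across two decades of ageing.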
Gender
Five cross-sectional studies evaluated the association between gender and POAG. Consistency in estimates is lacking among studies and because of this the association between gender and prevalent POAG is inconclusive. The quality of the evidence is very low.
Ethnicity
Only 1 cross-sectional study compared the prevalence rates of POAG between black and white participants. These data suggest that prevalent glaucoma is statistically significantly greater in a black population 50 years of age and older compared with a white population of similar age. There is an overall 4-fold increase in prevalent POAG in a black population compared with a white population. This increase may be due to a confounding variable not accounted for in the analysis. The quality of the evidence is low.
Refractive Error
Four cross-sectional studies assessed the association of myopia and POAG. These data suggest an association between myopia defined as a spherical equivalent of -1.00D or worse and prevalent POAG. However, there is inconsistency in results regarding the statistical significance of the association between myopia when defined as a spherical equivalent of -0.5D. The quality of the evidence is very low.
Family History of POAG
Three cross-sectional studies investigated the association between family history of glaucoma and prevalent POAG. These data suggest a 2.5- to 3.0-fold increase in the odds of having POAG in persons with a family history (any first-degree relative) of POAG. The quality of the evidence is moderate.
Age-Related Maculopathy
Age
Four cohort studies evaluated the association between age and early ARM and AMD. After 55 years of age, the incidence of both early ARM and AMD increases with increasing age. Progression to AMD occurs in up to 12% of persons with early ARM. The quality of the evidence is low.
Gender
Four cohort studies evaluated the association between gender and early ARM and AMD. Gender differences in incident early ARM and incident AMD are not supported from these data. The quality of the evidence is low.
Ethnicity
One meta-analysis and 2 cross-sectional studies reported the ethnic-specific prevalence rates of ARM. The data suggest that the prevalence of early ARM is higher in a white population than in a black population. Ethnic-specific differences in the prevalence of AMD remain inconclusive.
Refractive Error
Two cohort studies investigated the association between refractive error and the development of incident early ARM and AMD. The quality of the evidence is very low.
Family History
Two cross-sectional studies evaluated the association of family history and early ARM and AMD. Data from one study supports an association between a positive family history of AMD and having AMD. The results of the study indicate an almost 4-fold increase in the odds of any AMD in a person with a family history of AMD. The quality of the evidence, as based on the GRADE criteria is moderate.
Economic Analysis
The prevalence of glaucoma is estimated at 1 to 3% for a Caucasian population and 4.2 to 8.8% for a black population. The incidence of glaucoma is estimated at 0.5 to 2.5% per year in the literature. The percentage of people who go blind per year as a result of glaucoma is approximately 0.55%.
The total population of Ontarians aged 50 to 64 years is estimated at 2.6 million based on the April 2006 Ontario Ministry of Finance population estimates. The range of utilization for a major eye examination in 2006/07 for this age group is estimated at 567,690 to 669,125, were coverage for major eye exams extended to this age group. This would represent a net increase in utilization of approximately 440,116 to 541,551.
The percentage of the Ontario population categorized as black and/or with a family history of glaucoma is approximately 20%. Therefore, the estimated range of utilization for a major eye examination in 2006/07 for this sub-population is 113,538 to 138,727 (20% of the estimated range of utilization in the total population of 50-64 year olds in Ontario), were coverage for major eye exams extended to this sub-group. This would represent a net increase in utilization of approximately 88,023 to 108,310 within this sub-group.
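The utilization figures above follow from simple arithmetic on the quoted estimates. A quick sketch reproducing the implied baseline and the sub-group net increase (all inputs are figures quoted in the text; only the calculation itself is ours):

```python
# Sketch reproducing the Ontario utilization arithmetic quoted above.

total_low, total_high = 567_690, 669_125   # projected exams, ages 50-64
net_low, net_high = 440_116, 541_551       # projected net increase

# Implied current (baseline) utilization in this age group
baseline = total_low - net_low             # 127,574
assert baseline == total_high - net_high   # consistent across the range

# Sub-group: ~20% of the population (black and/or family history of glaucoma)
share = 0.20
sub_net_low = round(share * net_low)       # 88,023
sub_net_high = round(share * net_high)     # 108,310

print(baseline, sub_net_low, sub_net_high)
```

The same subtraction at either end of the range yields an identical baseline, which confirms the two projections were built on one current-utilization estimate.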
Costs
The total cost of a major eye examination by a physician is $42.15, as per the 2006 Schedule of Benefits for Physician Services. (1) The total difference in treatment cost between early-stage and late-stage glaucoma was estimated at $167. The total cost per recipient was estimated at $891/person.
Current Ontario Policy
As of November 1, 2004, persons between 20 and 64 years of age are eligible for an insured eye examination once every year if they have any of the following medical conditions: diabetes mellitus type 1 or 2, glaucoma, cataract(s), retinal disease, amblyopia, visual field defects, corneal disease, or strabismus. Persons between 20 and 64 years of age who have none of these conditions may be eligible for an annual eye examination if they have a valid “request for major eye examination” form completed by a physician (other than the one performing the eye exam) or a nurse practitioner working in a collaborative practice. Persons 20 to 64 years of age who are in receipt of social assistance and who do not have one of the 8 medical conditions listed above are eligible to receive an eye exam once every 2 years as a non-OHIP government-funded service. Persons 19 years of age or younger and 65 years of age or older may receive an insured eye exam once every year.
Considerations for Policy Development
As of July 17, 2006 there were 1,402 practicing optometrists in Ontario. As of December 31, 2005 there were 404 practicing ophthalmologists in Ontario. It is unknown how many third-party payers now cover routine eye exams for persons between 20 and 64 years of age in Ontario.
PMCID: PMC3379534  PMID: 23074485
18.  Large scale organisational intervention to improve patient safety in four UK hospitals: mixed method evaluation 
Objectives To conduct an independent evaluation of the first phase of the Health Foundation’s Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals.
Design Mixed method evaluation involving five substudies, before and after design.
Setting NHS hospitals in the United Kingdom.
Participants Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals.
Intervention The SPI1 was a compound (multi-component) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change.
Results Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration—monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items)—there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for “difference in difference” 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). 
On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
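The "difference in difference" odds ratio quoted above for respiratory-rate recording can be approximated from the raw percentages. (The published 2.1 estimate comes from the authors' adjusted model, so a crude back-of-envelope calculation differs somewhat.)

```python
# Crude, unadjusted difference-in-difference odds ratio for respiratory-rate
# recording, using the raw proportions quoted in the abstract. The published
# 2.1 figure is model-based and adjusted, so this rough value differs.

def odds(p: float) -> float:
    """Convert a proportion to odds."""
    return p / (1.0 - p)

# Control hospitals: 40% -> 69%; SPI1 hospitals: 37% -> 78%
or_control = odds(0.69) / odds(0.40)   # within-group change, controls
or_spi1 = odds(0.78) / odds(0.37)      # within-group change, SPI1

did_or = or_spi1 / or_control          # crude difference-in-difference OR
print(round(did_or, 2))                # ~1.81, near the adjusted 2.1
```

The crude estimate (~1.8) sits comfortably inside the reported 99% confidence interval of 1.0 to 4.3, illustrating how the difference-in-difference comparison nets out the background improvement seen in control hospitals.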
Conclusions The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
doi:10.1136/bmj.d195
PMCID: PMC3033440  PMID: 21292719
19.  Biventricular Pacing (Cardiac Resynchronization Therapy) 
Executive Summary
Issue
In 2002 (before the establishment of the Ontario Health Technology Advisory Committee), the Medical Advisory Secretariat conducted a health technology policy assessment on biventricular (BiV) pacing, also called cardiac resynchronization therapy (CRT). The goal of treatment with BiV pacing is to improve cardiac output for people in heart failure (HF) with conduction defect on ECG (wide QRS interval) by synchronizing ventricular contraction. The Medical Advisory Secretariat concluded that there was evidence of short (6 months) and longer-term (12 months) effectiveness in terms of cardiac function and quality of life (QoL). More recently, a hospital submitted an application to the Ontario Health Technology Advisory Committee to review CRT, and the Medical Advisory Secretariat subsequently updated its health technology assessment.
Background
Chronic HF results from any structural or functional cardiac disorder that impairs the ability of the heart to act as a pump. It is estimated that 1% to 5% of the general population (all ages) in Europe have chronic HF. (1;2) About one-half of the patients with HF are women, and about 40% of men and 60% of women with this condition are aged older than 75 years.
The incidence (i.e., the number of new cases in a specified period) of chronic HF is age dependent: from 1 to 5 per 1,000 people each year in the total population, to as high as 30 to 40 per 1,000 people each year in those aged 75 years and older. Hence, in an aging society, the prevalence (i.e., the number of people with a given disease or condition at any time) of HF is increasing, despite a reduction in cardiovascular mortality.
A recent study revealed 28,702 patients were hospitalized for first-time HF in Ontario between April 1994 and March 1997. (3) Women comprised 51% of the cohort. Eighty-five percent were aged 65 years or older, and 58% were aged 75 years or older.
Patients with chronic HF experience shortness of breath, a limited capacity for exercise, high rates of hospitalization and rehospitalization, and die prematurely. (2;4) The New York Heart Association (NYHA) has provided a commonly used functional classification for the severity of HF (2;5):
Class I: No limitation of physical activity. No symptoms with ordinary exertion.
Class II: Slight limitations of physical activity. Ordinary activity causes symptoms.
Class III: Marked limitation of physical activity. Less than ordinary activity causes symptoms. Asymptomatic at rest.
Class IV: Inability to carry out any physical activity without discomfort. Symptoms at rest.
The National Heart, Lung, and Blood Institute estimates that 35% of patients with HF are in functional NYHA class I; 35% are in class II; 25%, class III; and 5%, class IV. (5) Surveys (2) suggest that from 5% to 15% of patients with HF have persistent severe symptoms, and that the remainder of patients with HF is evenly divided between those with mild and moderately severe symptoms.
Overall, patients with chronic, stable HF have an annual mortality rate of about 10%. (2) One-third of patients with new-onset HF will die within 6 months of diagnosis. These patients do not survive to enter the pool of those with “chronic” HF. About 60% of patients with incident HF will die within 3 years, and there is limited evidence that the overall prognosis has improved in the last 15 years.
To date, the diagnosis and management of chronic HF has concentrated on patients with the clinical syndrome of HF accompanied by severe left ventricular systolic dysfunction. Major changes in treatment have resulted from a better understanding of the pathophysiology of HF and the results of large clinical trials. Treatment for chronic HF includes lifestyle management, drugs, cardiac surgery, or implantable pacemakers and defibrillators. Despite pharmacologic advances, which include diuretics, angiotensin-converting enzyme inhibitors, beta-blockers, spironolactone, and digoxin, many patients remain symptomatic on maximally tolerated doses.
The Technology
Owing to the limitations of drug therapy, cardiac transplantation and device therapies have been used to try to improve QoL and survival of patients with chronic HF. Ventricular pacing is an emerging treatment option for patients with severe HF that does not respond well to medical therapy. Traditionally, indications for pacing include bradyarrhythmia, sick sinus syndrome, atrioventricular block, and other indications, including combined sick sinus syndrome with atrioventricular block and neurocardiogenic syncope. Recently, BiV pacing as a new, adjuvant therapy for patients with chronic HF and mechanical dyssynchrony has been investigated. Ventricular dysfunction is a sign of HF; and, if associated with severe intraventricular conduction delay, it can cause dyssynchronous ventricular contractions resulting in decreased ventricular filling. The therapeutic intent is to activate both ventricles simultaneously, thereby improving the mechanical efficiency of the ventricles.
About 30% of patients with chronic HF have intraventricular conduction defects. (6) These conduction abnormalities progress over time and lead to discoordinated contraction of an already hemodynamically compromised ventricle. Intraventricular conduction delay has been associated with clinical instability and an increased risk of death in patients with HF. (7) Hence, BiV pacing, which involves pacing left and right ventricles simultaneously, may provide a more coordinated pattern of ventricular contraction and thereby potentially reduce QRS duration, and intraventricular and interventricular asynchrony. People with advanced chronic HF, a wide QRS complex (i.e., the portion of the electrocardiogram comprising the Q, R, and S waves, together representing ventricular depolarization), low left ventricular ejection fraction, and contraction dyssynchrony in a viable myocardium and normal sinus rhythm are the target patient group for BiV pacing. One-half of all deaths in HF patients are sudden, and the mode of death is arrhythmic in most cases. Implantable cardioverter defibrillators (ICDs) combined with BiV pacemakers are therefore being increasingly considered for patients with HF who are at high risk of sudden death.
Current Implantation Technique for Cardiac Resynchronization
Conventional dual-chamber pacemakers have only 2 leads: 1 placed in the right atrium and the other in the right ventricle. The technique used for BiV pacemaker implantation also uses right atrial and ventricular pacing leads, in addition to a left ventricle lead advanced through the coronary sinus into a vein that runs along the ventricular free wall. This permits simultaneous pacing of both ventricles to allow resynchronization of the left ventricle septum and free wall.
Mode of Operation
Permanent pacing systems consist of an implantable pulse generator that contains a battery and electronic circuitry, together with 1 (single-chamber pacemaker) or 2 (dual-chamber pacemaker) leads. Leads conduct intrinsic atrial or ventricular signals to the sensing circuitry and deliver the pulse generator charge to the myocardium (muscle of the heart).
Complications of Biventricular Pacemaker Implantation
The complications that may arise when a BiV pacemaker is implanted are similar to those that occur with standard pacemaker implantation, including pneumothorax, perforation of the great vessels or the myocardium, air embolus, infection, bleeding, and arrhythmias. Moreover, left ventricular pacing through the coronary sinus can be associated with rupture of the sinus as another complication.
Conclusion of 2003 Review of Biventricular Pacemakers by the Medical Advisory Secretariat
The randomized controlled trials (RCTs) the Medical Advisory Secretariat retrieved analyzed chronic HF patients who were assessed for up to 6 months. Other studies have been prospective but nonrandomized, not double-blinded, uncontrolled, and/or limited by a small or uncalculated sample size. Short-term studies have focused on acute hemodynamic analyses. The authors of the RCTs reported improved cardiac function and QoL up to 6 months after BiV pacemaker implantation; therefore, there is level 1 evidence that patients in ventricular dyssynchrony who remain symptomatic despite medication might benefit from this technology. Based on evidence made available to the Medical Advisory Secretariat by a manufacturer, (8) it appears that these 6-month improvements are maintained at 12-month follow-up.
To date, however, there is insufficient evidence to support the routine use of combined ICD/BiV devices in patients with chronic HF with prolonged QRS intervals.
Summary of Updated Findings Since the 2003 Review
Since the Medical Advisory Secretariat’s review in 2003 of biventricular pacemakers, 2 large RCTs have been published: COMPANION (9) and CARE-HF. (10) The characteristics of each trial are shown in Table 1. The COMPANION trial had a number of major methodological limitations compared with the CARE-HF trial.
Characteristics of the COMPANION and CARE-HF Trials*
COMPANION; (9) CARE-HF. (10)
BiV indicates biventricular; ICD, implantable cardioverter defibrillator; EF, ejection fraction; QRS, the interval representing the Q, R and S waves on an electrocardiogram; FDA, United States Food and Drug Administration.
Overall, CARE-HF showed that BiV pacing significantly improves mortality, QoL, and NYHA class in patients with severe HF and a wide QRS interval (Tables 2 and 3).
CARE-HF Results: Primary and Secondary Endpoints*
BiV indicates biventricular; NNT, number needed to treat.
Cleland JGF, Daubert J, Erdmann E, Freemantle N, Gras D, Kappenberger L et al. The effect of cardiac resynchronization on morbidity and mortality in heart failure (CARE-HF). New England Journal of Medicine 2005; 352:1539-1549; Copyright 2005 Massachusetts Medical Society. All rights reserved. (10)
CARE-HF Results: NYHA Class and Quality of Life Scores*
Minnesota Living with Heart Failure scores range from 0 to 105; higher scores reflect poorer QoL.
European Quality of Life–5 Dimensions scores range from -0.594 to 1.000; 1.000 indicates fully healthy; 0, dead.
Cleland JGF, Daubert J, Erdmann E, Freemantle N, Gras D, Kappenberger L et al. The effect of cardiac resynchronization on morbidity and mortality in heart failure (CARE-HF). New England Journal of Medicine 2005; 352:1539-1549; Copyright 2005 Massachusetts Medical Society. All rights reserved. (10)
GRADE Quality of Evidence
The quality of these 3 trials was examined according to the GRADE Working Group criteria (12) (Table 4).
Quality refers to criteria such as the adequacy of allocation concealment, blinding, and follow-up.
Consistency refers to the similarity of estimates of effect across studies. If there is an important unexplained inconsistency in the results, confidence in the estimate of effect for that outcome decreases. Differences in the direction of effect, the size of the differences in effect, and the significance of the differences guide the decision about whether important inconsistency exists.
Directness refers to the extent to which the people, interventions, and outcome measures are similar to those of interest. For example, there may be uncertainty about the directness of the evidence if the people of interest are older, sicker, or have more comorbid conditions than do the people in the studies.
As stated by the GRADE Working Group, (12) the following definitions were used in grading the quality of the evidence:
High: Further research is very unlikely to change our confidence in the estimate of effect.
Moderate: Further research is likely to have an important impact on our confidence in the estimate of effect and may change the estimate.
Low: Further research is very likely to have an important impact on our confidence in the estimate of effect and is likely to change the estimate.
Very low: Any estimate of effect is very uncertain.
Quality of Evidence: CARE-HF and COMPANION
Conclusions
Overall, there is evidence that BiV pacemakers are effective for improving mortality, QoL, and functional status in patients with NYHA class III/IV HF, an EF less than 0.35, and a QRS interval greater than 120 ms, who are refractory to drug therapy.
As per the GRADE Working Group, recommendations considered the following 4 main factors:
The tradeoffs, taking into account the estimated size of the effect for the main outcome, the confidence limits around those estimates, and the relative value placed on the outcome
The quality of the evidence (Table 4)
Translation of the evidence into practice in a specific setting, taking into consideration important factors that could be expected to modify the size of the expected effects such as proximity to a hospital or availability of necessary expertise
Uncertainty about the baseline risk for the population of interest
The GRADE Working Group also recommends that incremental costs of health care alternatives should be considered explicitly alongside the expected health benefits and harms. Recommendations rely on judgments about the value of the incremental health benefits in relation to the incremental costs. The last column in Table 5 shows the overall trade-off between benefits and harms and incorporates any risk/uncertainty.
For BiV pacing, the overall GRADE and strength of the recommendation is moderate: the quality of the evidence is moderate/high (because of some uncertainty due to methodological limitations in the study design, e.g., no blinding), but there is also some risk/uncertainty in terms of the estimated prevalence and wide cost-effectiveness estimates (Table 5).
For the combination BiV pacing/ICD, the overall GRADE and strength of the recommendation is weak—the quality of the evidence is low (because of uncertainty due to methodological limitations in the study design), but there is also some risk/uncertainty in terms of the estimated prevalence, high cost, and high budget impact (Table 5). There are indirect, low-quality comparisons of the effectiveness of BiV pacemakers compared with the combination BiV/ICD devices.
A stronger recommendation can be made for BiV pacing alone than for the combination BiV/ICD device in patients with an EF of 0.35 or less, a QRS interval of 120 ms or greater, NYHA class III/IV symptoms, and HF refractory to optimal medical therapy (Table 5).
There is moderate/high-quality evidence that BiV pacemakers significantly improve mortality, QoL, and functional status.
There is low-quality evidence that combined BiV/ICD devices significantly improve mortality, QoL, and functional status.
To date, there are no direct comparisons of the effectiveness of BiV pacemakers compared with the combined BiV/ICD devices in terms of mortality, QoL, and functional status.
Overall GRADE and Strength of Recommendation
BiV refers to biventricular; ICD, implantable cardioverter defibrillator; NNT, number needed to treat.
PMCID: PMC3382419  PMID: 23074464
20.  Strategies for Increasing Recruitment to Randomised Controlled Trials: Systematic Review 
PLoS Medicine  2010;7(11):e1000368.
Patrina Caldwell and colleagues performed a systematic review of randomized studies that compared methods of recruiting individual study participants into trials, and found that strategies that focus on increasing potential participants' awareness of the specific health problem, and that engage them, appeared to increase recruitment.
Background
Recruitment of participants into randomised controlled trials (RCTs) is critical for successful trial conduct. Although there have been two previous systematic reviews on related topics, the results (which identified specific interventions) were inconclusive and not generalizable. The aim of our study was to evaluate the relative effectiveness of recruitment strategies for participation in RCTs.
Methods and Findings
This systematic review, reported according to the PRISMA guideline, included studies that compared methods of recruiting individual study participants into an actual or mock RCT. We searched MEDLINE, Embase, The Cochrane Library, and reference lists of relevant studies. From over 16,000 titles or abstracts reviewed, 396 papers were retrieved and 37 studies were included, in which 18,812 of at least 59,354 people approached agreed to participate in a clinical RCT. Recruitment strategies were broadly divided into four groups: novel trial designs (eight studies), recruiter differences (eight studies), incentives (two studies), and provision of trial information (19 studies). Strategies that increased people's awareness of the health problem being studied (e.g., an interactive computer program [relative risk (RR) 1.48, 95% confidence interval (CI) 1.00–2.18], attendance at an education session [RR 1.14, 95% CI 1.01–1.28], addition of a health questionnaire [RR 1.37, 95% CI 1.14–1.66], or a video about the health condition [RR 1.75, 95% CI 1.11–2.74]), as well as monetary incentives (RR 1.39, 95% CI 1.13–1.64 to RR 1.53, 95% CI 1.28–1.84), improved recruitment. Increasing patients' understanding of the trial process, recruiter differences, and various methods of randomisation and consent design did not show a difference in recruitment. Consent rates were also higher for nonblinded trial designs, but differential loss to follow-up between groups may jeopardise the study findings. The study's main limitation was the necessity of modifying the search strategy with subsequent search updates because of changes in MEDLINE definitions. The abstracts of previous versions of this systematic review were published in 2002 and 2007.
Conclusion
Recruitment strategies that focus on increasing potential participants' awareness of the health problem being studied, its potential impact on their health, and their engagement in the learning process appeared to increase recruitment to clinical studies. Further trials of recruitment strategies that target engaging participants to increase their awareness of the health problems being studied and the potential impact on their health may confirm this hypothesis.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Before any health care intervention—a treatment for a disease or a measure such as vaccination that is designed to prevent an illness—is adopted by the medical community, it undergoes exhaustive laboratory-based and clinical research. In the laboratory, scientists investigate the causes of diseases, identify potential new treatments or preventive methods, and test these interventions in animals. New interventions that look hopeful are then investigated in clinical trials—studies that test these interventions in people by following a strict trial protocol or action plan. Phase I trials test interventions in a few healthy volunteers or patients to evaluate their safety and to identify possible side effects. In phase II trials, a larger group of patients receives an intervention to evaluate its safety further and to get an initial idea of its effectiveness. In phase III trials, very large groups of patients (sometimes in excess of a thousand people) are randomly assigned to receive the new intervention or an established intervention or placebo (dummy intervention). These “randomized controlled trials” or “RCTs” provide the most reliable information about the effectiveness and safety of health care interventions.
Why Was This Study Done?
Patients who participate in clinical trials must fulfill the inclusion criteria laid down in the trial protocol and must be given information about the trial, its risks, and potential benefits before agreeing to participate (informed consent). Unfortunately, many RCTs struggle to enroll the number of patients specified in their trial protocol, which can reduce a trial's ability to measure the effect of a new intervention. Inadequate recruitment can also increase costs and, in the worst cases, prevent trial completion. Several strategies have been developed to improve recruitment but it is not clear which strategy works best. In this study, the researchers undertake a systematic review (a study that uses predefined criteria to identify all the research on a given topic) of “recruitment trials”—studies that have randomly divided potential RCT participants into groups, applied different strategies for recruitment to each group, and compared recruitment rates in the groups.
What Did the Researchers Do and Find?
The researchers identified 37 randomized trials of recruitment strategies into real and mock RCTs (where no actual trial occurred). In all, 18,812 people agreed to participate in an RCT in these recruitment trials out of at least 59,354 people approached. Some of these trials investigated novel strategies for recruitment, such as changes in how patients are randomized. Others looked at the effect of recruiter differences (for example, increased contact between the health care professionals doing the recruiting and the trial investigators), the effect of offering monetary incentives to participants, and the effect of giving more information about the trial to potential participants. Recruitment strategies that improved people's awareness of the health problem being studied—provision of an interactive computer program or a video about the health condition, attendance at an educational session, or inclusion of a health questionnaire in the recruitment process—improved recruitment rates, as did monetary incentives. Increasing patients' understanding about the trial process itself, recruiter differences, and alterations in consent design and randomization generally had no effect on recruitment rates although consent rates were higher when patients knew the treatment to which they had been randomly allocated before consenting. However, differential losses among the patients in different treatment groups in such nonblinded trials may jeopardize study findings.
What Do These Findings Mean?
These findings suggest that trial recruitment strategies that focus on increasing the awareness of potential participants of the health problem being studied and its possible effects on their health, and that engage potential participants in the trial process are likely to increase recruitment to RCTs. The accuracy of these findings depends on whether the researchers identified all the published research on recruitment strategies and on whether other research on recruitment strategies has been undertaken and not published that could alter these findings. Furthermore, because about half of the recruitment trials identified by the researchers were undertaken in the US, the successful strategies identified here might not be generalizable to other countries. Nevertheless, these recruitment strategies should now be investigated further to ensure that the future evaluation of new health care interventions is not hampered by poor recruitment into RCTs.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000368.
The ClinicalTrials.gov Web site is a searchable register of federally and privately supported clinical trials in the US and around the world, providing information about all aspects of clinical trials
The US National Institutes of Health provides information about clinical trials
The UK National Health Service Choices Web site has information for patients about clinical trials and medical research
The UK Medical Research Council Clinical Trials Unit also provides information for patients about clinical trials and links to information on clinical trials provided by other organizations
MedlinePlus has links to further resources on clinical trials (in English and Spanish)
The Australian Government's National Health and Medical Research Council has information about clinical trials
WHO International Clinical Trials Registry Platform aims to ensure that all trials are publicly accessible to those making health care decisions
The Star Child Health International Forum of Standards for Research is a resource center for pediatric clinical trial design, conduct, and reporting
doi:10.1371/journal.pmed.1000368
PMCID: PMC2976724  PMID: 21085696
21.  Characterizing the Complexity of Medication Safety using a Human Factors Approach: An Observational Study in Two Intensive Care Units 
BMJ Quality & Safety 2013;23(1):56-65.
Objective
To examine medication safety in two ICUs and to assess the complexity of medication errors and adverse drug events (ADEs) in ICUs across the stages of the medication-management process.
Methods
Four trained nurse data collectors gathered data on medication errors and ADEs between October 2006 and March 2007. Patient care documents (e.g., medication order sheets, notes) and incident reports were used to identify medication errors and ADEs in a 24-bed adult medical/surgical ICU and an 18-bed cardiac ICU in a tertiary care, community teaching hospital. In this cross-sectional study, a total of 630 consecutive ICU patient admissions were assessed to produce data on the number, rates and types of potential and preventable ADEs across stages of the medication-management process.
Results
An average of 2.9 preventable or potential ADEs occurred in each admission, i.e., 0.4 events per patient-day. Preventable or potential ADEs occurred in 2.6% of the medication orders. The rate of potential ADEs per 1,000 patient-days was 276, whereas the rate of preventable ADEs per 1,000 patient-days was 9.2. Most medication errors occurred at the ordering (32%) and administration (39%) stages. In 16–24% of potential and preventable ADEs, clusters of errors occurred either as a sequence of errors (e.g., a delay in medication dispensing leading to a delay in medication administration) or as grouped errors (e.g., route and frequency errors in the order for a medication). Many of the sequences led to administration errors that were caused by errors earlier in the medication-management process.
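The rate measures used here (events per admission, per patient-day, and per 1,000 patient-days) follow directly from raw counts. A minimal sketch with hypothetical counts (the study's actual event and patient-day totals are not given in the abstract):

```python
# Converting raw event counts into the rate measures used in the abstract.
# The counts below are hypothetical, chosen only to illustrate the formulas.
events = 58            # hypothetical number of potential/preventable ADEs
admissions = 20        # hypothetical number of ICU admissions
patient_days = 145     # hypothetical total patient-days observed

per_admission = events / admissions        # events per admission
per_patient_day = events / patient_days    # events per patient-day
per_1000_days = per_patient_day * 1000     # events per 1,000 patient-days

print(per_admission, round(per_patient_day, 1), round(per_1000_days, 1))
```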
Conclusions
Understanding the complexity of the vulnerabilities of the medication-management process is important to devise solutions to improve patient safety. Electronic health record technology with computerized physician order entry may be one step necessary to improve medication safety in ICUs. Solutions that target multiple stages of the medication-management process are necessary to address sequential errors.
doi:10.1136/bmjqs-2013-001828
PMCID: PMC3938094  PMID: 24050986
medication safety; medication errors; adverse drug events; intensive care unit; human factors engineering
22.  Patient-Safety-Related Hospital Deaths in England: Thematic Analysis of Incidents Reported to a National Database, 2010–2012 
PLoS Medicine  2014;11(6):e1001667.
Sukhmeet Panesar and colleagues classified reports of patient-safety-related hospital deaths in England to identify patterns of cases where improvements might be possible.
Please see later in the article for the Editors' Summary
Background
Hospital mortality is increasingly being regarded as a key indicator of patient safety, yet methodologies for assessing mortality are frequently contested and seldom point directly to areas of risk and solutions. The aim of our study was to classify reports of deaths due to unsafe care into broad areas of systemic failure capable of being addressed by stronger policies, procedures, and practices. The deaths were reported to a patient safety incident reporting system after mandatory reporting of such incidents was introduced.
Methods and Findings
The UK National Health Service database was searched for incidents resulting in a reported death of an adult over the period of the study. The study population comprised 2,010 incidents involving patients aged 16 y and over in acute hospital settings. Each incident report was reviewed by two of the authors, and, by scrutinising the structured information together with the free text, a main reason for the harm was identified and recorded as one of 18 incident types. These incident types were then aggregated into six areas of apparent systemic failure: mismanagement of deterioration (35%), failure of prevention (26%), deficient checking and oversight (11%), dysfunctional patient flow (10%), equipment-related errors (6%), and other (12%). The most common incident types were failure to act on or recognise deterioration (23%), inpatient falls (10%), healthcare-associated infections (10%), unexpected perioperative death (6%), and poor or inadequate handover (5%). Analysis of these 2,010 fatal incidents reveals patterns of issues that point to actionable areas for improvement.
Conclusions
Our approach demonstrates the potential utility of patient safety incident reports in identifying areas of service failure and highlights opportunities for corrective action to save lives.
Editors' Summary
Background
Being admitted to the hospital is worrying for patients and for their relatives. Will the patient recover or die in the hospital? Some seriously ill patients will inevitably die, but in an ideal world, no one should die in the hospital because of inadequate or unsafe care (an avoidable death). No one should die, for example, because healthcare professionals fail to act on signs that indicate a decline in a patient's clinical condition. Hospital mortality (death) is often regarded as a key indicator of patient safety in hospitals, and death rate indicators such as the “hospital standardized mortality ratio” (the ratio of the actual number of acute in-hospital deaths to the expected number of in-hospital deaths) are widely used to monitor and improve hospital safety standards. In England, for example, a 2012 report that included this measure as an indicator of hospital performance led to headlines of “worryingly high” hospital death rates and to a review of the quality of care in the hospitals with the highest death rates.
Why Was This Study Done?
Hospital standardized mortality ratios and other measures of in-patient mortality can be misleading because they can, for example, reflect the burden of disease near the hospital rather than the hospital's quality of care or safety levels. Moreover, comparative data on hospital mortality rates are of limited value in identifying areas of risk to patients or solutions to the problem of avoidable deaths. In this study, to identify areas of service failure amenable to improvement through strengthened clinical policies, procedures, and practices, the researchers undertook a thematic analysis of deaths in hospitals in England that were reported by healthcare staff to a mandatory patient-safety-related incident reporting system. Since 2004, staff in the UK National Health Service (the NHS comprises the publicly funded healthcare systems in England, Scotland, Wales, and Northern Ireland) have been encouraged to report any unintended or unexpected incident in which they believe a patient's safety was compromised. Since June 2010, it has been mandatory for staff in England and Wales to report deaths due to patient-safety-related incidents. A thematic analysis examines patterns (“themes”) within nonnumerical (qualitative) data.
What Did the Researchers Do and Find?
By searching the NHS database of patient-safety-related incidents, the researchers identified 2,010 incidents that occurred between 1 June 2010 and 31 October 2012 that resulted in the death of adult patients in acute hospital settings. By scrutinizing the structured information in each incident report and the associated free text in which the reporter described what happened and why they think it happened, the researchers classified the reports into 18 incident categories. These categories fell into six broad areas of systemic failure—mismanagement of deterioration (35% of incidents), failure of prevention (26%), deficient checking and oversight (11%), dysfunctional patient flow (10%), equipment-related errors (6%), and other (12%, incidents where the problem underlying death was unclear). Mismanagement of deterioration, for example, included the incident categories “failure to act on or recognize deterioration” (23% of reported incidents), “failure to give ordered treatment/support in a timely manner,” and “failure to observe.” Failure of prevention included the incident categories “falls” (10% of reported incidents), “healthcare-associated infections” (also 10% of reported incidents), “pressure sores,” “suicides,” and “deep vein thrombosis/pulmonary embolism.”
What Do These Findings Mean?
Although the accuracy of these findings may be limited by data quality and by other aspects of the study design, they reveal patterns of patient-safety-related deaths in hospitals in England and highlight areas of healthcare that can be targeted for improvement. The finding that the mismanagement of deterioration of acutely ill patients is involved in a third of patient-safety-related deaths identifies an area of particular concern in the NHS and, potentially, in other healthcare systems. One way to reduce deaths associated with the mismanagement of deterioration, suggest the researchers, might be to introduce a standardized early warning score to ensure uniform identification of this population of patients. The researchers also suggest that more effort should be put into designing programs to prevent falls and other incidents and into ensuring that these programs are effectively implemented. More generally, the classification system developed here has the potential to help hospital boards and clinicians identify areas of patient care that require greater scrutiny and intervention and thereby save the lives of many hospital patients.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001667.
The NHS provides information about patient safety, including a definition of a patient safety incident and information on reporting patient safety incidents
The NHS Choices website includes several “Behind the Headlines” articles that discuss patient safety in hospitals, including an article that discusses the 2012 report of high hospital death rates in England, “Fit for the Future?” and another that discusses the Keogh review of the quality of care in the hospitals with highest death rates
The US Agency for Healthcare Research and Quality provides information on patient safety in the US
Wikipedia has pages on thematic analysis and on patient safety (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001667
PMCID: PMC4068985  PMID: 24959751
23.  Errors in medication history at hospital admission: prevalence and predicting factors 
Background
An accurate medication list at hospital admission is essential for the evaluation and further treatment of patients. The objective of this study was to describe the frequency, type and predictors of errors in medication history, and to evaluate the extent to which standard care corrects these errors.
Methods
A descriptive study was carried out in two medical wards in a Swedish hospital using Lund Integrated Medicines Management (LIMM)-based medication reconciliation. A clinical pharmacist identified each patient's most accurate pre-admission medication list by conducting a medication reconciliation process shortly after admission. This list was then compared with the patient's medication list in the hospital medical records. Addition or withdrawal of a drug or changes to the dose or dosage form in the hospital medication list were considered medication discrepancies. Medication discrepancies for which no clinical reason could be identified (unintentional changes) were considered medication history errors.
Results
The final study population comprised 670 of 818 eligible patients. At least one medication history error was identified by pharmacists conducting medication reconciliations for 313 of these patients (47%; 95% CI 43-51%). The most common medication error was an omitted drug, followed by a wrong dose. Multivariate logistic regression analysis showed that a higher number of drugs at admission (odds ratio [OR] per 1 drug increase = 1.10; 95% CI 1.06-1.14; p < 0.0001) and the patient living in their own home without any care services (OR = 1.58; 95% CI 1.02-2.45; p = 0.042) were predictors for medication history errors at admission. The results further indicated that standard care by non-pharmacist ward staff had partly corrected the errors in affected patients by four days after admission, but a considerable proportion of the errors made in the initial medication history at admission remained undetected by standard care (OR for medication errors detected by pharmacists' medication reconciliation carried out on days 4-11 compared to days 0-1 = 0.52; 95% CI 0.30-0.91; p = 0.021).
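The per-drug odds ratio reported above multiplies across additional drugs. A minimal sketch of how such a coefficient is interpreted (assuming, as the logistic model does, a constant effect per additional drug; purely illustrative):

```python
# Interpreting an odds ratio of 1.10 per additional drug at admission:
# under the logistic model, odds multiply, so the OR for k extra drugs
# is 1.10 ** k.  (Illustration of the reported coefficient only.)
or_per_drug = 1.10

for k in (1, 5, 10):
    print(f"{k} extra drugs -> OR {or_per_drug ** k:.2f}")
```

So a patient taking ten more drugs than another has, under this model, roughly 2.6 times the odds of a medication history error.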
Conclusions
Clinical pharmacists conducting LIMM-based medication reconciliations have a high potential for correcting errors in medication history for all patients. In an older Swedish population, those prescribed many drugs seem to benefit most from admission medication reconciliation.
doi:10.1186/1472-6904-12-9
PMCID: PMC3353244  PMID: 22471836
24.  Strongyloides stercoralis: Systematic Review of Barriers to Controlling Strongyloidiasis for Australian Indigenous Communities 
Background
Strongyloides stercoralis infects human hosts mainly through skin contact with contaminated soil. The result is strongyloidiasis, a parasitic disease, with a unique cycle of auto-infection causing a variety of symptoms and signs, with possible fatality from hyper-infection. Australian Indigenous community members, often living in rural and remote settings, are exposed to and infected with S. stercoralis. The aim of this review is to determine barriers to control of strongyloidiasis. The purpose is to contribute to the development of initiatives for prevention, early detection and effective treatment of strongyloidiasis.
Methodology/Principal Findings
A systematic search of research published in 2012 and earlier was conducted. Research articles discussing aspects of strongyloidiasis, the context of infection, and overall health in Indigenous Australians were reviewed. Following the PRISMA statement, the health databases Academic Search Premier, Informit, Medline, PubMed, AMED, CINAHL, and Health Source: Nursing/Academic Edition were searched systematically. Key search terms included strongyloidiasis, Indigenous, Australia, health, and community. Of the 340 articles retrieved, 16 original research articles published between 1969 and 2006 met the inclusion criteria. The review found barriers to control across three key themes: (1) health status, (2) socioeconomic status, and (3) health care literacy and procedures.
Conclusions/Significance
This study identifies five points of intervention: (1) develop reporting protocols between the health care system and communities; (2) test all Indigenous Australian patients, immunocompromised patients, and those exposed to areas where S. stercoralis is present; (3) provide health professionals with detailed information on strongyloidiasis and the potential for exposure among Indigenous Australian people; (4) establish testing and treatment initiatives within communities; and (5) measure and report community-specific prevalence rates and act on the results. By defining the barriers to control of strongyloidiasis in Australian Indigenous people, improved prevention, improved treatment of strongyloidiasis, and better overall health are attainable.
Author Summary
Strongyloides stercoralis, a nematode parasite, has a well-documented history of infecting human hosts in tropical and subtropical regions, mainly through skin contact with contaminated soil. The result is strongyloidiasis, a human parasitic disease with a unique cycle of auto-infection that contributes to a variety of symptoms, of which hyper-infection may be fatal. In Australia, Indigenous community members, often located in rural and remote settings, are exposed to and infected with S. stercoralis. Previous researchers report strongyloidiasis as a recurrent health issue for Indigenous Australians. This systematic review determines the barriers to control of this pernicious pathogen. These barriers fall into three key themes: (1) health status, (2) socioeconomic status, and (3) health care literacy and procedures. By conceptualizing these barriers and addressing the steps to control outlined in this study, prevention and treatment outcomes for strongyloidiasis, and consequently the overall health of Australian Indigenous people, can be improved. This study contributes to furthering the prevention and treatment of strongyloidiasis and to raising awareness of the issue of strongyloidiasis in Australian Indigenous people. It is the intent of this paper to express the need for continued research, and for health policy directed specifically at eradicating strongyloidiasis in Australian Indigenous communities.
doi:10.1371/journal.pntd.0003141
PMCID: PMC4177786  PMID: 25254655
25.  Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD) 
Executive Summary
In July 2010, the Medical Advisory Secretariat (MAS) began work on a Chronic Obstructive Pulmonary Disease (COPD) evidentiary framework, an evidence-based review of the literature surrounding treatment strategies for patients with COPD. This project emerged from a request by the Health System Strategy Division of the Ministry of Health and Long-Term Care that MAS provide them with an evidentiary platform on the effectiveness and cost-effectiveness of COPD interventions.
After an initial review of health technology assessments and systematic reviews of COPD literature, and consultation with experts, MAS identified the following topics for analysis: vaccinations (influenza and pneumococcal), smoking cessation, multidisciplinary care, pulmonary rehabilitation, long-term oxygen therapy, noninvasive positive pressure ventilation for acute and chronic respiratory failure, hospital-at-home for acute exacerbations of COPD, and telehealth (including telemonitoring and telephone support). Evidence-based analyses were prepared for each of these topics. For each technology, an economic analysis was also completed where appropriate. In addition, a review of the qualitative literature on patient, caregiver, and provider perspectives on living and dying with COPD was conducted, as were reviews of the qualitative literature on each of the technologies included in these analyses.
The Chronic Obstructive Pulmonary Disease Mega-Analysis series is made up of the following reports, which can be publicly accessed at the MAS website at: http://www.hqontario.ca/en/mas/mas_ohtas_mn.html.
Chronic Obstructive Pulmonary Disease (COPD) Evidentiary Framework
Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Smoking Cessation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Pulmonary Rehabilitation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Long-term Oxygen Therapy for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Acute Respiratory Failure Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Chronic Respiratory Failure Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Hospital-at-Home Programs for Patients with Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Home Telehealth for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease Using an Ontario Policy Model
Experiences of Living and Dying With COPD: A Systematic Review and Synthesis of the Qualitative Empirical Literature
For more information on the qualitative review, please contact Mita Giacomini at: http://fhs.mcmaster.ca/ceb/faculty_member_giacomini.htm.
For more information on the economic analysis, please visit the PATH website: http://www.path-hta.ca/About-Us/Contact-Us.aspx.
The Toronto Health Economics and Technology Assessment (THETA) collaborative has produced an associated report on patient preference for mechanical ventilation. For more information, please visit the THETA website: http://theta.utoronto.ca/static/contact.
Objective
The objective of this analysis was to determine the effectiveness of the influenza vaccination and the pneumococcal vaccination in patients with chronic obstructive pulmonary disease (COPD) in reducing the incidence of influenza-related illness or pneumococcal pneumonia.
Clinical Need: Condition and Target Population
Influenza Disease
Influenza is a global threat. It is believed that the risk of an influenza pandemic still exists. Three pandemics occurred in the 20th century, resulting in millions of deaths worldwide. A fourth pandemic, of H1N1 influenza, occurred in 2009 and affected countries on all continents.
Rates of serious illness due to influenza viruses are high among older people and patients with chronic conditions such as COPD. Influenza viruses spread from person to person through sneezing and coughing. Infected persons can transmit the virus from a day before their symptoms start. The incubation period is 1 to 4 days, with a mean of 2 days. Symptoms of influenza infection include fever, shivering, dry cough, headache, runny or stuffy nose, muscle ache, and sore throat. Other symptoms, such as nausea, vomiting, and diarrhea, can also occur.
Complications of influenza infection include viral pneumonia, secondary bacterial pneumonia, and other secondary bacterial infections such as bronchitis, sinusitis, and otitis media. In viral pneumonia, patients develop acute fever and dyspnea, and may further show signs and symptoms of hypoxia. The organisms most commonly involved in secondary bacterial pneumonia are Staphylococcus aureus and Haemophilus influenzae. The incidence of secondary bacterial pneumonia is highest in the elderly and in those with underlying conditions such as congestive heart disease and chronic bronchitis.
Healthy people usually recover within one week but in very young or very old people and those with underlying medical conditions such as COPD, heart disease, diabetes, and cancer, influenza is associated with higher risks and may lead to hospitalization and in some cases death. The cause of hospitalization or death in many cases is viral pneumonia or secondary bacterial pneumonia. Influenza infection can lead to the exacerbation of COPD or an underlying heart disease.
Streptococcal Pneumonia
Streptococcus pneumoniae, also known as pneumococcus, is an encapsulated Gram-positive bacterium that often colonizes the nasopharynx of healthy children and adults. Pneumococcus can be transmitted from person to person during close contact. The bacteria can cause illnesses such as otitis media and sinusitis, and may become more aggressive and affect other areas of the body such as the lungs, brain, joints, and blood stream. More severe infections caused by pneumococcus include pneumonia, bacterial sepsis, meningitis, peritonitis, arthritis, osteomyelitis, and, in rare cases, endocarditis and pericarditis.
People with impaired immune systems are susceptible to pneumococcal infection. Young children, elderly people, patients with underlying medical conditions including chronic lung or heart disease, human immunodeficiency virus (HIV) infection, sickle cell disease, and people who have undergone a splenectomy are at a higher risk for acquiring pneumococcal pneumonia.
Technology
Influenza and Pneumococcal Vaccines
Trivalent Influenza Vaccines in Canada
In Canada, 5 trivalent influenza vaccines are currently authorized for use by injection. Four of these are formulated for intramuscular use and the fifth product (Intanza®) is formulated for intradermal use.
The 4 vaccines for intramuscular use are:
Fluviral (GlaxoSmithKline), split virus, inactivated vaccine, for use in adults and children ≥ 6 months;
Vaxigrip (Sanofi Pasteur), split virus inactivated vaccine, for use in adults and children ≥ 6 months;
Agriflu (Novartis), surface antigen inactivated vaccine, for use in adults and children ≥ 6 months; and
Influvac (Abbott), surface antigen inactivated vaccine, for use in persons ≥ 18 years of age.
FluMist is a live attenuated virus vaccine given as an intranasal spray to persons aged 2 to 59 years. Immunization with currently available influenza vaccines is not recommended for infants less than 6 months of age.
Pneumococcal Vaccine
Pneumococcal polysaccharide vaccines were developed more than 50 years ago and have progressed from 2-valent vaccines to the current 23-valent vaccines to prevent diseases caused by 23 of the most common serotypes of S pneumoniae. Canada-wide estimates suggest that approximately 90% of cases of pneumococcal bacteremia and meningitis are caused by these 23 serotypes. Health Canada has issued licenses for 2 types of 23-valent vaccines to be injected intramuscularly or subcutaneously:
Pneumovax 23® (Merck & Co Inc. Whitehouse Station, NJ, USA), and
Pneumo 23® (Sanofi Pasteur SA, Lyon, France) for persons 2 years of age and older.
Other types of pneumococcal vaccines licensed in Canada are for pediatric use. Pneumococcal polysaccharide vaccine is injected only once; a second dose is given only under certain conditions.
Research Questions
What is the effectiveness of the influenza vaccination and the pneumococcal vaccination compared with no vaccination in COPD patients?
What is the safety of these 2 vaccines in COPD patients?
What is the budget impact and cost-effectiveness of these 2 vaccines in COPD patients?
Research Methods
Literature search
Search Strategy
A literature search was performed on July 5, 2010 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 2000 to July 5, 2010. The search was updated monthly through the AutoAlert function of the search up to January 31, 2011. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Articles of unknown eligibility were reviewed with a second clinical epidemiologist, and then with a group of epidemiologists, until consensus was reached. Data extraction was carried out by the author.
Inclusion Criteria
studies comparing clinical efficacy of the influenza vaccine or the pneumococcal vaccine with no vaccine or placebo;
randomized controlled trials published between January 1, 2000 and January 31, 2011;
studies including patients with COPD only;
studies investigating the efficacy of types of vaccines approved by Health Canada;
English language studies.
Exclusion Criteria
non-randomized controlled trials;
studies investigating vaccines for other diseases;
studies comparing different variations of vaccines;
studies in which patients received 2 or more types of vaccines;
studies comparing different routes of administering vaccines;
studies not reporting clinical efficacy of the vaccine or reporting immune response only;
studies investigating the efficacy of vaccines not approved by Health Canada.
Outcomes of Interest
Primary Outcomes
Influenza vaccination: Episodes of acute respiratory illness due to the influenza virus.
Pneumococcal vaccination: Time to the first episode of community-acquired pneumonia either due to pneumococcus or of unknown etiology.
Secondary Outcomes
rate of hospitalization and mechanical ventilation
mortality rate
adverse events
Quality of Evidence
The quality of each included study was assessed taking into consideration allocation concealment, randomization, blinding, power/sample size, withdrawals/dropouts, and intention-to-treat analyses. The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria. The following definitions of quality were used in grading the quality of the evidence:
Summary of Efficacy of the Influenza Vaccination in Immunocompetent Patients With COPD
Clinical Effectiveness
The influenza vaccination was associated with significantly fewer episodes of influenza-related acute respiratory illness (ARI). The incidence density of influenza-related ARI was:
All patients: vaccine group: (total of 4 cases) = 6.8 episodes per 100 person-years; placebo group: (total of 17 cases) = 28.1 episodes per 100 person-years, (relative risk [RR], 0.2; 95% confidence interval [CI], 0.06−0.70; P = 0.005).
Patients with severe airflow obstruction (forced expiratory volume in 1 second [FEV1] < 50% predicted): vaccine group: (total of 1 case) = 4.6 episodes per 100 person-years; placebo group: (total of 7 cases) = 31.2 episodes per 100 person-years, (RR, 0.1; 95% CI, 0.003−1.1; P = 0.04).
Patients with moderate airflow obstruction (FEV1 50%−69% predicted): vaccine group: (total of 2 cases) = 13.2 episodes per 100 person-years; placebo group: (total of 4 cases) = 23.8 episodes per 100 person-years, (RR, 0.5; 95% CI, 0.05−3.8; P = 0.5).
Patients with mild airflow obstruction (FEV1 ≥ 70% predicted): vaccine group: (total of 1 case) = 4.5 episodes per 100 person-years; placebo group: (total of 6 cases) = 28.2 episodes per 100 person-years, (RR, 0.2; 95% CI, 0.003−1.3; P = 0.06).
The Kaplan-Meier survival analysis showed a significant difference between the vaccinated group and the placebo group in the probability of not acquiring influenza-related ARI (log-rank test P value = 0.003). Overall, the vaccine effectiveness was 76%. For the mild, moderate, and severe COPD categories, the vaccine effectiveness was 84%, 45%, and 85%, respectively.
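These quantities follow directly from the incidence densities: the relative risk is the ratio of the two incidence densities, and vaccine effectiveness is conventionally defined as (1 − RR) × 100%. A minimal sketch using the all-patients figures reported above, which approximately reproduces the 76% overall effectiveness:

```python
# Crude relative risk and vaccine effectiveness from incidence densities.
# Figures are the all-patients incidence densities reported in this summary;
# the calculation is an illustrative approximation of the study's analysis.
vaccine_rate = 6.8     # episodes per 100 person-years, vaccine group
placebo_rate = 28.1    # episodes per 100 person-years, placebo group

rr = vaccine_rate / placebo_rate     # crude relative risk
effectiveness = (1 - rr) * 100       # crude vaccine effectiveness, %

print(f"crude RR = {rr:.2f}, crude effectiveness = {effectiveness:.0f}%")
```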
With respect to hospitalization, fewer patients in the vaccine group compared with the placebo group were hospitalized due to influenza-related ARIs, although these differences were not statistically significant. The incidence density of influenza-related ARIs that required hospitalization was 3.4 episodes per 100 person-years in the vaccine group and 8.3 episodes per 100 person-years in the placebo group (RR, 0.4; 95% CI, 0.04−2.5; P = 0.3; log-rank test P value = 0.2). Also, no statistically significant differences between the 2 groups were observed for the 3 categories of severity of COPD.
Fewer patients in the vaccine group compared with the placebo group required mechanical ventilation due to influenza-related ARIs. However, these differences were not statistically significant. The incidence density of influenza-related ARIs that required mechanical ventilation was 0 episodes per 100 person-years in the vaccine group and 5 episodes per 100 person-years in the placebo group (RR, 0.0; 95% CI, 0−2.5; P = 0.1; log-rank test P value = 0.4). In addition, no statistically significant differences between the 2 groups were observed for the 3 categories of severity of COPD. The effectiveness of the influenza vaccine in preventing influenza-related ARIs and influenza-related hospitalization was not related to age, sex, severity of COPD, smoking status, or comorbid diseases.
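The effectiveness figures above follow from the conventional cohort-study formulas: incidence density is cases per 100 person-years, the relative risk is the ratio of the two incidence densities, and vaccine effectiveness is (1 − RR) × 100. A minimal sketch, using the all-patients figures reported above (the function names are illustrative, not from the study):

```python
def incidence_density(cases, person_years):
    """Episodes per 100 person-years."""
    return 100 * cases / person_years

def relative_risk(id_vaccine, id_placebo):
    """Ratio of incidence densities (vaccine vs placebo)."""
    return id_vaccine / id_placebo

def vaccine_effectiveness(rr):
    """VE (%) = (1 - RR) x 100."""
    return (1 - rr) * 100

# All-patients incidence densities reported in the trial:
# vaccine 6.8 and placebo 28.1 episodes per 100 person-years.
rr = relative_risk(6.8, 28.1)
ve = vaccine_effectiveness(rr)
print(f"RR = {rr:.2f}, VE = {ve:.0f}%")  # RR = 0.24, VE = 76%
```

Applying the same formula to the mild and severe subgroups (RR ≈ 0.16 and 0.15) reproduces the reported 84% and 85% effectiveness values.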
Safety
Overall, significantly more patients in the vaccine group than in the placebo group experienced local adverse reactions (vaccine: 17 [27%]; placebo: 4 [6%]; P = 0.002), including swelling (vaccine: 4; placebo: 0; P = 0.04) and itching (vaccine: 4; placebo: 0; P = 0.04). Systemic reactions included headache, myalgia, fever, and skin rash; there were no significant differences between the 2 groups for these reactions (vaccine: 47 [76%]; placebo: 51 [81%]; P = 0.5).
With respect to lung function, dyspneic symptoms, and exercise capacity, there were no significant differences between the 2 groups at 1 week and at 4 weeks in: FEV1, maximum inspiratory pressure at residual volume, oxygen saturation level of arterial blood, visual analogue scale for dyspneic symptoms, and the 6 Minute Walking Test for exercise capacity.
There was no significant difference between the 2 groups in the probability of not acquiring total ARIs (influenza-related and/or non-influenza-related) (log-rank test P value = 0.6).
Summary of Efficacy of the Pneumococcal Vaccination in Immunocompetent Patients With COPD
Clinical Effectiveness
The Kaplan-Meier survival analysis showed no significant differences between the group receiving the pneumococcal vaccination and the control group for time to the first episode of community-acquired pneumonia due to pneumococcus or of unknown etiology (log-rank test 1.15; P = 0.28). Overall, vaccine efficacy was 24% (95% CI, −24 to 54; P = 0.33).
With respect to the incidence of pneumococcal pneumonia, the Kaplan-Meier survival analysis showed a significant difference between the 2 groups (vaccine: 0/298; control: 5/298; log-rank test 5.03; P = 0.03).
Hospital admission rates and median length of hospital stay were lower in the vaccine group, but the differences were not statistically significant. The mortality rate did not differ between the 2 groups.
Subgroup Analysis
When data were analyzed according to patient subgroups (age < 65 years, and severe airflow obstruction with FEV1 < 40% predicted), the Kaplan-Meier survival analysis showed significant differences between the vaccine and control groups for pneumonia due to pneumococcus and pneumonia of unknown etiology. The accumulated percentage of patients without pneumonia (due to pneumococcus or of unknown etiology) over time was significantly higher in the vaccine group than in the control group among patients younger than 65 years of age (log-rank test 6.68; P = 0.0097) and patients with an FEV1 less than 40% predicted (log-rank test 3.85; P = 0.0498).
Vaccine effectiveness was 76% (95% CI, 20−93; P = 0.01) for patients who were less than 65 years of age and −14% (95% CI, −107 to 38; P = 0.8) for those who were 65 years of age or older. Vaccine effectiveness for patients with a FEV1 less than 40% predicted and FEV1 greater than or equal to 40% predicted was 48% (95% CI, −7 to 80; P = 0.08) and −11% (95% CI, −132 to 47; P = 0.95), respectively. For patients who were less than 65 years of age (FEV1 < 40% predicted), vaccine effectiveness was 91% (95% CI, 35−99; P = 0.002).
Cox modeling showed that the effectiveness of the vaccine was dependent on the age of the patient. The vaccine was not effective in patients 65 years of age or older (hazard ratio, 1.53; 95% CI, 0.61−2.17; P = 0.66), but it reduced the risk of acquiring pneumonia by 80% in patients less than 65 years of age (hazard ratio, 0.19; 95% CI, 0.06−0.66; P = 0.01).
Safety
No patients reported any local or systemic adverse reactions to the vaccine.
PMCID: PMC3384373  PMID: 23074431
