1.  A Delphi survey to determine how educational interventions for evidence-based practice should be reported: Stage 2 of the development of a reporting guideline 
BMC Medical Education  2014;14:159.
Background
Undertaking a Delphi exercise is recommended during the second stage of the development process for a reporting guideline. To continue the development of the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET), a Delphi survey was undertaken to determine the consensus opinion of researchers, journal editors and educators in evidence-based practice (EBP) regarding the information items that should be reported when describing an educational intervention for EBP.
Methods
A four-round online Delphi survey was conducted from October 2012 to March 2013. The Delphi panel comprised international researchers, educators and journal editors in EBP. Commencing with an open-ended question, participants were invited to volunteer information they considered important when reporting educational interventions for EBP. Over three subsequent rounds, participants rated the importance of each Delphi item on an 11-point Likert scale (low 0 to 4, moderate 5 to 6, high 7 to 8 and very high >8). Consensus agreement was set a priori as at least 80 per cent participant agreement. Consensus was initially calculated within the four categories of importance (low to very high), prior to these four categories being merged into two (<7 and ≥7). Descriptive statistics were computed for each item, including the mean Likert score, standard deviation (SD), range and median participant score. The mean absolute deviation from the median (MAD-M) was also calculated as a measure of participant disagreement.
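As an illustration of the per-item analysis described above, the following minimal Python sketch computes the consensus and disagreement statistics for a single Delphi item; the ratings are hypothetical, not data from the survey.

from statistics import mean, median, stdev

# Hypothetical panellist ratings for one item on the 11-point (0-10) scale
ratings = [8, 9, 7, 6, 8, 10, 7, 5, 9, 8]

# Consensus after merging the four importance categories into two (<7 vs >=7)
high = [r for r in ratings if r >= 7]
agreement = 100 * len(high) / len(ratings)
reached_consensus = agreement >= 80  # a priori threshold of 80 per cent

# Descriptive statistics reported per item: mean, SD, range and median
item_mean, item_sd = mean(ratings), stdev(ratings)
item_range = (min(ratings), max(ratings))
item_median = median(ratings)

# Mean absolute deviation from the median (MAD-M) as a disagreement measure
mad_m = mean(abs(r - item_median) for r in ratings)

print(f"agreement {agreement:.0f}%, consensus reached: {reached_consensus}")
print(f"mean {item_mean:.1f}, SD {item_sd:.1f}, median {item_median}, MAD-M {mad_m:.2f}")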
Results
Thirty-six experts agreed to participate and 27 (79%) participants completed all four rounds. A total of 76 information items were generated across the four survey rounds. Thirty-nine items (51%) were specific to describing the intervention (as opposed to other elements of study design) and consensus agreement was achieved for two of these items (5%). When the four rating categories were merged into two (<7 and ≥7), 18 intervention items achieved consensus agreement.
Conclusion
This Delphi survey has identified 39 items for describing an educational intervention for EBP. These Delphi intervention items will provide the groundwork for the subsequent consensus discussion to determine the final inclusion of items in the GREET, the first reporting guideline for educational interventions in EBP.
doi:10.1186/1472-6920-14-159
PMCID: PMC4128547  PMID: 25081371
Evidence-based practice; Reporting guideline; Delphi survey
2.  A systematic review of how studies describe educational interventions for evidence-based practice: stage 1 of the development of a reporting guideline 
BMC Medical Education  2014;14:152.
Background
The aim of this systematic review was to identify which information is included when reporting educational interventions used to facilitate foundational skills and knowledge of evidence-based practice (EBP) in training for health professionals. This systematic review comprised the first stage in the three-stage development process for a reporting guideline for educational interventions for EBP.
Methods
The review question was ‘What information has been reported when describing educational interventions targeting foundational evidence-based practice knowledge and skills?’
MEDLINE, Academic Search Premier, ERIC, CINAHL, Scopus, Embase, Informit Health, Cochrane Library and Web of Science databases were searched from inception until the search dates (October to December 2011). Randomised and non-randomised controlled trials reporting original data on educational interventions specific to developing foundational knowledge and skills of evidence-based practice were included.
Studies were not appraised for methodological bias; however, reporting frequency and item commonality were compared between a random selection of studies included in the systematic review and a random selection of studies excluded because they were not controlled trials. Twenty-five data items were extracted by two independent reviewers (consistency > 90%).
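The inter-reviewer consistency reported above is, in essence, simple percent agreement; a short Python sketch, with invented extraction decisions rather than the review's data:

# Each reviewer records whether a study reported each extracted data item
reviewer_a = [True, True, False, True, True]
reviewer_b = [True, True, False, False, True]

matches = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
agreement = 100 * matches / len(reviewer_a)
print(f"percent agreement: {agreement:.0f}%")  # the review reports > 90% over 25 items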
Results
Sixty-one studies met the inclusion criteria (n = 29 randomised, n = 32 non-randomised). The most consistently reported items were the learners' stage of training, professional discipline and the evaluation methods used (100%). The least consistently reported items were the instructors' previous teaching experience (n = 8, 13%) and student effort outside face-to-face contact (n = 1, 2%).
Conclusion
This systematic review demonstrates inconsistencies in the description of educational interventions for EBP in randomised and non-randomised trials. To enable educational interventions to be replicated and compared, improvements in their reporting are required. In the absence of a specific reporting guideline, a range of items is reported with variable frequency. Which items are most important for describing educational interventions for facilitating foundational knowledge and skills in EBP remains to be determined. The findings of this systematic review will be used to inform the next stage in the development of a reporting guideline for educational interventions for EBP.
doi:10.1186/1472-6920-14-152
PMCID: PMC4113129  PMID: 25060160
Systematic review; Evidence-based practice; Educational intervention; Reporting guidelines
3.  Development and validation of the ACE tool: assessing medical trainees’ competency in evidence based medicine 
BMC Medical Education  2014;14:114.
Background
While a variety of instruments have been developed to assess knowledge and skills in evidence based medicine (EBM), few assess all aspects of EBM - including knowledge, skills, attitudes and behaviour - or have been psychometrically evaluated. The aim of this study was to develop and validate an instrument that evaluates medical trainees' competency in EBM across knowledge, skills and attitudes.
Methods
The ‘Assessing Competency in EBM’ (ACE) tool was developed by the authors, with content and face validity assessed by expert opinion. A cross-sectional sample of 342 medical trainees, representing ‘novice’, ‘intermediate’ and ‘advanced’ EBM training levels, was recruited to complete the ACE tool. Construct validity, item difficulty, internal reliability and item discrimination were analysed.
Results
We recruited 98 EBM-novice, 108 EBM-intermediate and 136 EBM-advanced participants. A statistically significant difference in the total ACE score was observed that corresponded to the level of training: on the 0-15 point test, the mean ACE scores were 8.6 for EBM-novice, 9.5 for EBM-intermediate and 10.4 for EBM-advanced participants (p < 0.0001). Individual item discrimination was excellent (Item Discrimination Index ranging from 0.37 to 0.84), with internal reliability consistent across all but three items (Item Total Correlations were all positive, ranging from 0.14 to 0.20).
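The two item statistics named above have standard textbook forms; the abstract does not give the authors' exact formulas, so the following Python sketch uses the common definitions and an invented response matrix.

import statistics  # statistics.correlation requires Python 3.10+

# rows = examinees, columns = items (1 = correct, 0 = incorrect); made-up data
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
]
totals = [sum(row) for row in responses]

def discrimination_index(item, frac=0.27):
    # Proportion correct in the top-scoring group minus the bottom group
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    k = max(1, round(frac * len(order)))
    low, high = order[:k], order[-k:]
    p_low = sum(responses[i][item] for i in low) / len(low)
    p_high = sum(responses[i][item] for i in high) / len(high)
    return p_high - p_low

def corrected_item_total(item):
    # Pearson correlation of item score with the total excluding that item
    scores = [row[item] for row in responses]
    rest = [t - s for t, s in zip(totals, scores)]
    return statistics.correlation(scores, rest)

for j in range(len(responses[0])):
    print(f"item {j}: D = {discrimination_index(j):+.2f}, r_it = {corrected_item_total(j):+.2f}")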
Conclusion
The 15-item ACE tool is a reliable and valid instrument to assess medical trainees’ competency in EBM. The ACE tool provides a novel assessment that measures user performance across the four main steps of EBM. To provide a complete suite of instruments to assess EBM competency across various patient scenarios, future refinement of the ACE instrument should include further scenarios across harm, diagnosis and prognosis.
doi:10.1186/1472-6920-14-114
PMCID: PMC4062508  PMID: 24909434
Evidence based medicine; Assessment; Medical students
4.  Implementation of a blended learning approach to teaching evidence based practice: a protocol for a mixed methods study 
BMC Medical Education  2013;13:170.
Background
Evidence based practice (EBP) requires that health professionals are competent in integrating the best evidence into their decision making. Being 'evidence-based’ requires skills and knowledge in epidemiology, biostatistics and information literacy. EBP is commonly taught in medical and health sciences degrees, yet there is little evidence to guide educators as to which teaching modality best increases learner competency in EBP.
Methods/design
This study has a mixed methods design. A randomised controlled trial (RCT) will examine the effectiveness of a blended learning versus a didactic approach to teaching EBP to medical students. The primary outcome of the RCT is EBP competency as assessed by the Berlin tool. Focus groups will be conducted to explore student perceptions of, and attitudes towards, implementing a blended learning approach in teaching EBP. A concurrent triangulation design will be implemented, permitting the quantitative data to establish the effectiveness of the intervention and the qualitative data to contextualise the results.
Discussion
This study will provide novel evidence on the effectiveness of blended learning in teaching EBP to a cohort of undergraduate and graduate-entry medical students.
doi:10.1186/1472-6920-13-170
PMCID: PMC3878342  PMID: 24351113
5.  Protocol for development of the guideline for reporting evidence based practice educational interventions and teaching (GREET) statement 
BMC Medical Education  2013;13:9.
Background
An increasing number of studies report the efficacy of educational strategies to facilitate the development of the knowledge and skills underpinning evidence based practice (EBP). To date, there is no standardised guideline for describing the teaching, evaluation, context or content of EBP educational strategies. The heterogeneity in the reporting of EBP educational interventions makes comparisons between studies difficult. The aim of this program of research is to develop the Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and an accompanying explanation and elaboration (E&E) paper.
Methods/design
Three stages are planned for the development process. Stage one will comprise a systematic review to identify features commonly reported in descriptions of EBP educational interventions. In stage two, corresponding authors of articles included in the systematic review and the editors of the journals in which these studies were published will be invited to participate in a Delphi process to reach consensus on items to be considered when reporting EBP educational interventions. The final stage of the project will include the development and pilot testing of the GREET statement and E&E paper.
Outcome
The final outcome will be the creation of a Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and E&E paper.
Discussion
The reporting of health research, including EBP educational intervention research, has been criticised for a lack of transparency and completeness. The development of the GREET statement will enable the standardised reporting of EBP educational research, providing a guide for researchers, reviewers and publishers when reporting EBP educational interventions.
doi:10.1186/1472-6920-13-9
PMCID: PMC3599902  PMID: 23347417
Evidence based practice; Education; Reporting guideline
6.  Do you think it's a disease? a survey of medical students 
BMC Medical Education  2012;12:19.
Background
The management of medical conditions is influenced by whether clinicians regard them as "disease" or "not a disease". The aim of the survey was to determine how medical students classify a range of conditions they might encounter in their professional lives, and whether a different name for a condition would influence their categorisation of it as 'disease' or 'not a disease'.
Methods
We surveyed three concurrent years of medical students, asking them to classify 36 candidate conditions into "disease" and "non-disease". The conditions were given 'medical' and lay labels and positioned, where possible, in alternate columns of the survey.
Results
The response rate was 96% (183 of 190 students attending a lecture). At least 80% of students concurred on 16 conditions as "disease" (eg diabetes, tuberculosis) and on four as "non-disease" (eg baldness, menopause, fractured skull and heat stroke). The remaining 16 conditions (21-79% agreement) were more contentious (especially obesity, infertility, hay fever, alcoholism and restless legs syndrome). Three pairs of conditions had both a more and a less medical label: the more medical labels (myalgic encephalomyelitis, hypertension and erectile dysfunction) were more frequently classified as 'disease' than the less medical labels (chronic fatigue syndrome, high blood pressure and impotence), with the difference statistically significant for the first two pairs.
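The abstract reports that the label effect was significant for the first two pairs but does not name the test used. Because each student classified both members of a pair, one plausible analysis is McNemar's exact test on the discordant responses; a Python sketch with invented counts:

from math import comb

# Discordant pairs among students who rated the same condition under both labels:
# b = 'disease' under the medical label only, c = 'disease' under the lay label
# only (these counts are hypothetical, not the survey's data)
b, c = 38, 15

n, k = b + c, min(b, c)
# Two-sided exact McNemar p-value: X ~ Binomial(n, 0.5)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n)
print(f"discordant pairs: {n}, exact p = {p_value:.4f}")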
Conclusions
Some conditions excluded from the classification of "disease" were unexpected (eg fractured skull and heat stroke). Students were mostly concordant on which conditions should be classified as "disease". They were more likely to classify a condition as 'disease' when its label was medical. The findings indicate that, 30 years on, the concept of 'what is a disease' remains problematic, and suggest that such concepts should be addressed with medical students.
doi:10.1186/1472-6920-12-19
PMCID: PMC3383512  PMID: 22471875
7.  Sicily statement on evidence-based practice 
BMC Medical Education  2005;5:1.
Background
A variety of definitions of evidence-based practice (EBP) exist. However, definitions are in themselves insufficient to explain the underlying processes of EBP and to differentiate between an evidence-based process and an evidence-based outcome. There is a need for a clear statement of what EBP means, a description of the skills required to practise in an evidence-based manner, and a curriculum that outlines the minimum requirements for training health professionals in EBP. This consensus statement is based on current literature and incorporates the experience of delegates attending the 2003 Conference of Evidence-Based Health Care Teachers and Developers ("Signposting the future of EBHC").
Discussion
Evidence-based practice has evolved in both scope and definition. EBP requires that decisions about health care are based on the best available, current, valid and relevant evidence. These decisions should be made by those receiving care, informed by the tacit and explicit knowledge of those providing care, within the context of available resources.
Health care professionals must be able to gain, assess, apply and integrate new knowledge and have the ability to adapt to changing circumstances throughout their professional life. Curricula to deliver these aptitudes need to be grounded in the five-step model of EBP, and informed by ongoing research. Core assessment tools for each of the steps should continue to be developed, validated, and made freely available.
Summary
All health care professionals need to understand the principles of EBP, recognise EBP in action, implement evidence-based policies, and have a critical attitude to their own practice and to evidence. Without these skills, professionals and organisations will find it difficult to provide 'best practice'.
doi:10.1186/1472-6920-5-1
PMCID: PMC544887  PMID: 15634359
