Results 1-5 (5)
 

1.  Development and validation of the ACE tool: assessing medical trainees’ competency in evidence based medicine 
BMC Medical Education  2014;14:114.
Background
While a variety of instruments have been developed to assess knowledge and skills in evidence based medicine (EBM), few assess all aspects of EBM - including knowledge, skills, attitudes and behaviour - or have been psychometrically evaluated. The aim of this study was to develop and validate an instrument that evaluates medical trainees’ competency in EBM across knowledge, skills and attitude.
Methods
The ‘Assessing Competency in EBM’ (ACE) tool was developed by the authors, with content and face validity assessed by expert opinion. A cross-sectional sample of 342 medical trainees, representing ‘novice’, ‘intermediate’ and ‘advanced’ EBM training levels, was recruited to complete the ACE tool. Construct validity, item difficulty, internal reliability and item discrimination were analysed.
Results
We recruited 98 EBM-novice, 108 EBM-intermediate and 136 EBM-advanced participants. A statistically significant difference in total ACE score was observed and corresponded to the level of training: on the 0-15-point test, mean ACE scores were 8.6 for EBM-novice, 9.5 for EBM-intermediate and 10.4 for EBM-advanced participants (p < 0.0001). Individual item discrimination was excellent (Item Discrimination Index ranging from 0.37 to 0.84), and internal reliability was consistent across all but three items (Item-Total Correlations were all positive, ranging from 0.14 to 0.20); see the sketch following this record.
Conclusion
The 15-item ACE tool is a reliable and valid instrument to assess medical trainees’ competency in EBM. The ACE tool provides a novel assessment that measures user performance across the four main steps of EBM. To provide a complete suite of instruments to assess EBM competency across various patient scenarios, future refinement of the ACE instrument should include further scenarios across harm, diagnosis and prognosis.
doi:10.1186/1472-6920-14-114
PMCID: PMC4062508  PMID: 24909434
Evidence based medicine; Assessment; Medical students
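
Note (an illustrative sketch, not part of the article): the following Python snippet shows one common way to compute the two item statistics reported in the ACE results, an item discrimination index based on upper and lower scoring groups and a corrected item-total correlation, for a dichotomously scored 15-item test. The simulated responses and the 27% grouping rule are assumptions made for this sketch; the ACE study's own scoring procedure may differ.

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical responses: 342 trainees x 15 items, 1 = correct, 0 = incorrect.
    responses = rng.integers(0, 2, size=(342, 15))
    totals = responses.sum(axis=1)

    # Item Discrimination Index: proportion correct among the top 27% of total
    # scorers minus the proportion correct among the bottom 27%.
    order = np.argsort(totals)
    n_group = int(round(0.27 * len(totals)))
    low, high = order[:n_group], order[-n_group:]
    discrimination_index = responses[high].mean(axis=0) - responses[low].mean(axis=0)

    # Corrected Item-Total Correlation: each item against the total of the
    # remaining items, so an item is not correlated with itself.
    item_total_corr = np.array([
        np.corrcoef(responses[:, i], totals - responses[:, i])[0, 1]
        for i in range(responses.shape[1])
    ])

    print("Discrimination indices:", np.round(discrimination_index, 2))
    print("Item-total correlations:", np.round(item_total_corr, 2))

Conventionally, a discrimination index above roughly 0.3 and a positive corrected item-total correlation indicate that an item separates stronger from weaker candidates, which is broadly the sense in which the abstract describes indices of 0.37 to 0.84 as excellent.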
2.  Implementation of a blended learning approach to teaching evidence based practice: a protocol for a mixed methods study 
BMC Medical Education  2013;13:170.
Background
Evidence based practice (EBP) requires that health professionals are competent in integrating the best evidence in their decision making. Being ‘evidence-based’ requires skills and knowledge in epidemiology, biostatistics and information literacy. EBP is commonly taught in medical and health sciences degrees, yet there is little evidence to guide educators as to the best teaching modality to increase learner competency in EBP.
Methods/design
This study has a mixed methods design. A randomised controlled trial (RCT) will examine the effectiveness of a blended learning approach versus a didactic approach to teaching EBP to medical students. The primary outcome of the RCT is EBP competency, as assessed by the Berlin tool. Focus groups will be conducted to explore students' perceptions of, and attitudes towards, implementing a blended learning approach to teaching EBP. A concurrent triangulation design will be implemented, permitting the quantitative data to inform the effectiveness of the intervention and the qualitative data to contextualise the results.
Discussion
This study will provide novel evidence on the effectiveness of blended learning in teaching EBP to a cohort of undergraduate and graduate-entry medical students.
doi:10.1186/1472-6920-13-170
PMCID: PMC3878342  PMID: 24351113
3.  Protocol for development of the guideline for reporting evidence based practice educational interventions and teaching (GREET) statement 
Background
An increasing number of studies report the efficacy of educational strategies to facilitate the development of knowledge and skills underpinning evidence based practice (EBP). To date, there is no standardised guideline for describing the teaching, evaluation, context or content of EBP educational strategies. The heterogeneity in the reporting of EBP educational interventions makes comparisons between studies difficult. The aim of this program of research is to develop the Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and an accompanying explanation and elaboration (E&E) paper.
Methods/design
Three stages are planned for the development process. Stage one will comprise a systematic review to identify features commonly reported in descriptions of EBP educational interventions. In stage two, corresponding authors of articles included in the systematic review and the editors of the journals in which these studies were published will be invited to participate in a Delphi process to reach consensus on items to be considered when reporting EBP educational interventions. The final stage of the project will include the development and pilot testing of the GREET statement and E&E paper.
Outcome
The final outcome will be the creation of a Guideline for Reporting EBP Educational interventions and Teaching (GREET) statement and E&E paper.
Discussion
The reporting of health research, including EBP educational interventions, has been criticised for a lack of transparency and completeness. The development of the GREET statement will enable the standardised reporting of EBP educational research. This will provide a guide for researchers, reviewers and publishers when reporting EBP educational interventions.
doi:10.1186/1472-6920-13-9
PMCID: PMC3599902  PMID: 23347417
Evidence based practice; Education; Reporting guideline
4.  Do you think it's a disease? a survey of medical students 
BMC Medical Education  2012;12:19.
Background
The management of medical conditions is influenced by whether clinicians regard them as "disease" or "not a disease". The aim of this survey was to determine how medical students classify a range of conditions they might encounter in their professional lives, and whether a different name for a condition would influence their categorisation of it as a 'disease' or 'not a disease'.
Methods
We surveyed 3 concurrent years of medical students, asking them to classify 36 candidate conditions as "disease" or "non-disease". Each condition was given a 'medical' label and a 'lay' label, positioned where possible in alternate columns of the survey.
Results
The response rate was 96% (183 of 190 students attending a lecture): 80% of students concurred on 16 conditions as "disease" (eg diabetes, tuberculosis), and 4 as "non-disease" (eg baldness, menopause, fractured skull and heat stroke). The remaining 16 conditions (with 21-79% agreement) were more contentious (especially obesity, infertility, hay fever, alcoholism, and restless leg syndrome). Three pairs of conditions had both a more, and a less, medical label: the more medical labels (myalgic encephalomyelitis, hypertension, and erectile dysfunction) were more frequently classified as 'disease' than the less medical (chronic fatigue syndrome, high blood pressure, and impotence), respectively, significantly different for the first two pairs.
Conclusions
Some conditions excluded from the classification of "disease" were unexpected (eg fractured skull and heat stroke). Students were mostly concordant on which conditions should be classified as "disease". They were more likely to classify synonyms as 'disease' if the label was medical. The findings indicate that, 30 years on, the concept of 'what is a disease' remains problematic, and suggest that such concepts should be addressed with medical students.
doi:10.1186/1472-6920-12-19
PMCID: PMC3383512  PMID: 22471875
5.  Sicily statement on evidence-based practice 
Background
A variety of definitions of evidence-based practice (EBP) exist. However, definitions are in themselves insufficient to explain the underlying processes of EBP and to differentiate between an evidence-based process and an evidence-based outcome. There is a need for a clear statement of what EBP means, a description of the skills required to practise in an evidence-based manner, and a curriculum that outlines the minimum requirements for training health professionals in EBP. This consensus statement is based on current literature and incorporates the experience of delegates attending the 2003 Conference of Evidence-Based Health Care Teachers and Developers ("Signposting the future of EBHC").
Discussion
Evidence-Based Practice has evolved in both scope and definition. Evidence-Based Practice (EBP) requires that decisions about health care are based on the best available, current, valid and relevant evidence. These decisions should be made by those receiving care, informed by the tacit and explicit knowledge of those providing care, within the context of available resources.
Health care professionals must be able to gain, assess, apply and integrate new knowledge and have the ability to adapt to changing circumstances throughout their professional life. Curricula to deliver these aptitudes need to be grounded in the five-step model of EBP, and informed by ongoing research. Core assessment tools for each of the steps should continue to be developed, validated, and made freely available.
Summary
All health care professionals need to understand the principles of EBP, recognise EBP in action, implement evidence-based policies, and have a critical attitude to their own practice and to evidence. Without these skills, professionals and organisations will find it difficult to provide 'best practice'.
doi:10.1186/1472-6920-5-1
PMCID: PMC544887  PMID: 15634359
