Results 1-6 (6)
 

1.  Students benefit from developing their own emergency medicine OSCE stations: a comparative study using the matched-pair method 
BMC Medical Education  2013;13:138.
Background
Students can improve their learning by writing their own multiple choice questions. If a similar effect occurs when students create OSCE (objective structured clinical examination) stations themselves, it would be beneficial to involve them in station development. This study investigates the effect of students developing emergency medicine OSCE stations on their test performance.
Method
In the 2011/12 winter semester, an emergency medicine OSCE was held for the first time at the Faculty of Medicine at the University of Leipzig. When preparing for the OSCE, 13 students (the intervention group) developed and tested emergency medicine examination stations as a learning experience. Their subsequent OSCE performance was compared to that of 13 other students (the control group), who were matched for age, gender, semester and level of prior knowledge using the matched-pair method. In addition, both groups were compared to 20 students who tested the OSCE prior to regular emergency medicine training (the test OSCE group).
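As a rough illustration only (not the authors' actual procedure), the matched-pair assignment described above could be sketched as greedy nearest-neighbour matching; the field names, the exact-match requirement on gender and semester, and the distance rule are all assumptions:

```python
def match_pairs(intervention, controls):
    """Pair each intervention student with the closest unmatched control
    of the same gender and semester, using age and prior MC-exam score
    as the distance. Greedy matching; each control is used at most once."""
    pairs = []
    available = list(controls)
    for student in intervention:
        candidates = [c for c in available
                      if c["gender"] == student["gender"]
                      and c["semester"] == student["semester"]]
        if not candidates:
            continue  # no admissible partner: student stays unmatched
        best = min(candidates,
                   key=lambda c: abs(c["age"] - student["age"])
                              + abs(c["mc_score"] - student["mc_score"]))
        available.remove(best)
        pairs.append((student["id"], best["id"]))
    return pairs
```

A greedy rule like this is simple but order-dependent; optimal matching would solve an assignment problem over all pairs instead.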
Results
There were no differences between the three groups regarding age (24.3 ± 2.6, 24.2 ± 3.4 and 24 ± 2.3 years) or prior knowledge (29.3 ± 3.4, 29.3 ± 3.2 and 28.9 ± 4.7 points in the multiple choice [MC] exam in emergency medicine). Only the gender distribution differed (8 females and 5 males in both the intervention and control groups vs. 17 females and 3 males in the test OSCE group).
In the exam OSCE, participants in the intervention group scored 233.4 ± 6.3 points (mean ± SD) compared to 223.8 ± 9.2 points (p < 0.01) in the control group. Cohen’s effect size was d = 1.24. The students of the test OSCE group scored 223.2 ± 13.4 points.
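The reported effect size can be checked from the published means and standard deviations. This sketch uses the equal-n pooled-SD form of Cohen's d and lands near the reported d = 1.24; the small gap is consistent with rounding of the published values:

```python
from math import sqrt

def cohens_d(mean1, sd1, mean2, sd2):
    """Cohen's d for two equal-sized groups, using the pooled SD
    (square root of the mean of the two variances)."""
    pooled_sd = sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Reported values: intervention 233.4 +/- 6.3, control 223.8 +/- 9.2
d = cohens_d(233.4, 6.3, 223.8, 9.2)  # roughly 1.22, close to the reported 1.24
```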
Conclusions
Students who actively develop OSCE stations when preparing for an emergency medicine OSCE achieve better exam results.
doi:10.1186/1472-6920-13-138
PMCID: PMC3852440  PMID: 24098996
OSCE; Emergency medicine; Undergraduate education; Assessment of training
2.  Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting? 
BMC Medical Education  2013;13:28.
Background
Medical knowledge encompasses both conceptual knowledge (facts, or “what” information) and procedural knowledge (“how” and “why” information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Medical students learn primarily from textbooks and often struggle to apply their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors that predict performance on procedural knowledge tasks. Which characteristics of the learner predict procedural knowledge performance?
Methods
Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively.
Results
Results in the procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP + PST) was significantly correlated with (1) the results of the conceptual knowledge test (CKT), (2) an intended future career as a hospital-based doctor, (3) the duration of clinical clerkships, and (4) the results of the written German National Medical Examination Part I on preclinical subjects (NME-I). In multiple regression analysis, only clinical clerkship experience and NME-I performance remained independent predictors.
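A minimal sketch of the univariate correlation step (the multiple regression step is not reproduced here); `pearson_r` is an illustrative helper and the sample values are hypothetical, not the study's data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# E.g. correlating clerkship duration (months) with a KFP + PST sum score:
r = pearson_r([2, 4, 6, 9], [31, 35, 38, 44])
```

Note that a strong univariate correlation can vanish in multiple regression once a shared predictor (here, general prior performance) is controlled for, which is the pattern the abstract reports for the CKT.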
Conclusions
Above a certain level, performance in procedural knowledge tests appears to be independent of the degree of domain-specific conceptual knowledge. Procedural knowledge may instead be fostered by clinical experience. More attention should be paid to the interplay between individual clinical clerkship experiences and the structured teaching and assessment of procedural knowledge in medical education curricula.
doi:10.1186/1472-6920-13-28
PMCID: PMC3598785  PMID: 23433202
Conceptual knowledge; Procedural knowledge; Strategic knowledge; Conditional knowledge; Key feature problems; Problem solving task; Clinical experience; Prior cognitive performance
3.  Developing and analysing a curriculum map in Occupational- and Environmental Medicine 
BMC Medical Education  2010;10:60.
Background
During the last five years, a fundamental curriculum reform was carried out at the medical school of the Ludwig-Maximilians-University. Despite these efforts, learning objectives were not defined consistently across the curriculum, and important questions about the curriculum could not be answered. This also applied to Occupational and Environmental Medicine, where the teachers of both courses faced additional problems such as low student attendance at lectures.
The aims of the study were to develop and analyse a curriculum map for Occupational and Environmental Medicine based on learning objectives using a web-based database.
Furthermore, we aimed to evaluate students' perceptions of the curricular structure.
Methods
Using a web-based learning objectives database, a curriculum map for Occupational and Environmental Medicine was developed and analysed. Additionally online evaluations of students for each course were conducted.
Results
The results show a discrepancy between the taught and the assessed curriculum. In both curricula, several learning objectives were not covered, there were overlaps with other content domains, and there were redundancies within each curriculum. 53% of the students in Occupational Medicine and 43% in Environmental Medicine reported a lack of information regarding the learning objectives of the curriculum.
Conclusions
The results of the curriculum mapping and the poor evaluation results for the courses suggest a need for re-structuring both curricula.
doi:10.1186/1472-6920-10-60
PMCID: PMC2944147  PMID: 20840737
4.  Job requirements compared to medical school education: differences between graduates from problem-based learning and conventional curricula 
Background
Problem-based Learning (PBL) has been suggested as a key educational method of knowledge acquisition to improve medical education. We sought to evaluate the differences in medical school education between graduates from PBL-based and conventional curricula and to what extent these curricula fit job requirements.
Methods
Graduates from all German medical schools who graduated between 1996 and 2002 were eligible for this study. Graduates self-assessed nine competencies as required at their day-to-day work and as taught in medical school on a 6-point Likert scale. Results were compared between graduates from a PBL-based curriculum (University Witten/Herdecke) and conventional curricula.
Results
Three schools were excluded because of low response rates. Baseline demographics of graduates from the PBL-based curriculum (n = 101, 49% female) and the conventional curricula (n = 4720, 49% female) were similar. No major differences were observed regarding job requirements, with priorities for "Independent learning/working" and "Practical medical skills". All competencies were rated as better taught in the PBL-based curriculum than in the conventional curricula (all p < 0.001), except for "Medical knowledge" and "Research competence". Comparing competencies required at work with those taught in medical school, PBL was associated with benefits in "Interdisciplinary thinking" (Δ + 0.88), "Independent learning/working" (Δ + 0.57), "Psycho-social competence" (Δ + 0.56), "Teamwork" (Δ + 0.39) and "Problem-solving skills" (Δ + 0.36), whereas "Research competence" (Δ - 1.23) and "Business competence" (Δ - 1.44) needed improvement in the PBL-based curriculum.
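One plausible reading of the deltas is a per-competency gap between the "taught" and "required" Likert means; the sign convention and the input values below are assumptions, chosen only to reproduce two of the reported deltas, not the study's raw data:

```python
def competency_gaps(required, taught):
    """Per-competency delta: mean 'taught in medical school' rating minus
    mean 'required at day-to-day work' rating (6-point Likert means).
    A negative delta flags a competency taught less than the job demands."""
    return {name: round(taught[name] - required[name], 2) for name in required}

# Hypothetical Likert means chosen to reproduce two reported deltas:
gaps = competency_gaps(
    required={"Teamwork": 5.10, "Business competence": 4.00},
    taught={"Teamwork": 5.49, "Business competence": 2.56},
)
# gaps == {"Teamwork": 0.39, "Business competence": -1.44}
```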
Conclusion
Among medical graduates in Germany, PBL demonstrated benefits with regard to competencies which were highly required in the job of physicians. Research and business competence deserve closer attention in future curricular development.
doi:10.1186/1472-6920-10-1
PMCID: PMC2824799  PMID: 20074350
5.  Answer changing in multiple choice assessment: change that answer when in doubt – and spread the word! 
Background
Several studies over the last decades have shown that answer changing in multiple choice examinations is generally beneficial for examinees. In spite of this, the common misbelief still prevails that answer changing results in an increased number of wrong answers rather than an improved score. One suggested consequence of newer studies is that examinees should be informed about this misbelief in the hope that the prejudice might be eradicated. This study aims to confirm data from previous studies on the benefits of answer changing, and to examine whether students informed about these advantageous effects would indeed follow the advice and change significantly more answers. Furthermore, we examine how informing students affects their overall examination performance and mean point increase.
Methods
The answer sheets from the end-of-term exams of 79 third-year medical students at the University of Munich were analysed to confirm the benefits of answer changing. Students taking the test were randomized into two groups. Prior to taking the test, 41 students were informed about the benefits of changing answers after careful reconsideration, while 38 students did not receive such information. Both groups were instructed to mark all answer changes made during the test.
Results
Answer changes were predominantly from wrong to right, in full accordance with the existing literature. Students who had been informed about the benefits of answer changing when in doubt changed answers significantly more often than students who had not been informed. Although instructed students scored higher in their exams than those not instructed, the difference in point increase was not significant.
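The bookkeeping behind a wrong-to-right analysis can be sketched as follows; the tally values are hypothetical and the one-point-per-question scoring is an assumption:

```python
def change_summary(wrong_to_right, right_to_wrong, wrong_to_wrong):
    """Summarize marked answer changes on one sheet: net one-point score
    change, plus the share of all changes that went from wrong to right
    (wrong-to-wrong changes shift no points)."""
    total = wrong_to_right + right_to_wrong + wrong_to_wrong
    net_gain = wrong_to_right - right_to_wrong
    share_beneficial = wrong_to_right / total if total else 0.0
    return net_gain, share_beneficial

# Hypothetical sheet: 7 wrong->right, 3 right->wrong, 2 wrong->wrong changes
net, share = change_summary(7, 3, 2)  # net gain of 4 points
```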
Conclusion
Students should be informed about the benefits of changing initial answers to multiple choice questions when in reasonable doubt about those answers. Furthermore, reconsidering answers should be encouraged, as students so instructed will heed the advice and change more answers than students not so instructed.
doi:10.1186/1472-6920-7-28
PMCID: PMC2020461  PMID: 17718902
6.  Comparison between Long-Menu and Open-Ended Questions in computerized medical assessments. A randomized controlled trial 
Background
Long-menu questions (LMQs) are viewed as an alternative to open-ended questions (OEQs) in computerized assessment. So far, this question type and its influence on examination scores have not been studied sufficiently, yet the growing use of computerized assessments will also increase the use of this question type.
Using a summative online key feature (KF) examination we evaluated whether LMQs can be compared with OEQs in regard to the level of difficulty, performance and response times. We also evaluated the content for its suitability for LMQs.
Methods
We randomized 146 fourth year medical students into two groups. For the purpose of this study we created 7 peer-reviewed KF-cases with a total of 25 questions. All questions had the same content in both groups, but nine questions had a different answer type. Group A answered 9 questions with an LM type, group B with an OE type. In addition to the LM answer, group A could give an OE answer if the appropriate answer was not included in the list.
Results
The average number of correct answers for LMQs and OEQs showed no significant difference (p = 0.93). Among all 630 LM answers only one correct term (0.32%) was not included in the list of answers. The response time for LMQs did not significantly differ from that of OEQs (p = 0.65).
Conclusion
LMQs and OEQs do not differ significantly. Compared to standard multiple-choice questions (MCQs), the response time for LMQs and OEQs is longer, probably because they require active problem-solving skills and more practice. LMQs are more comparable to short-answer questions (SAQs) than to OEQs and should only be used when the answers can be phrased clearly, using only a few precise synonyms.
LMQs can decrease cueing effects and significantly simplify the scoring in computerized assessment.
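A minimal sketch of how a long-menu or short-answer response might be scored against a few precise synonyms, as the conclusion suggests; the normalization rule and the answer key are assumptions, not the study's scoring code:

```python
def score_long_menu(answer, accepted_synonyms):
    """Mark a long-menu / short-answer response correct iff its normalized
    form matches one of a few precise accepted synonyms."""
    def norm(s):
        return " ".join(s.lower().split())  # case- and whitespace-insensitive
    return norm(answer) in {norm(s) for s in accepted_synonyms}

# Hypothetical answer key for one key-feature question:
score_long_menu("Acute Renal  Failure", ["acute renal failure", "acute kidney injury"])  # True
```

Keeping the synonym list short and precise is exactly the condition the conclusion places on using LMQs; vaguer answer domains would demand fuzzier matching and reintroduce rater judgment.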
doi:10.1186/1472-6920-6-50
PMCID: PMC1618389  PMID: 17032439
