
Results 1-3 (3)

1.  Good exams made easy: The item management system for multiple examination formats 
BMC Medical Education  2012;12:63.
The development, implementation and evaluation of assessments require considerable resources and often cannot be carried out by a single faculty or institution. Therefore, some medical faculties have founded cooperation projects, which mainly focus on the exchange of multiple choice questions (MCQs).
Because these cooperation projects do not fully support all relevant processes of assessment preparation, implementation and evaluation, the Medical Assessment Alliance (MAA) was founded in 2006 for mutual support. Beyond MCQs, the MAA began developing innovative assessment formats and sharing content through a coordinated exchange of experience. To support cooperation within this network, the web-based Item Management System (IMS) was developed as an all-in-one working platform covering every process of the assessment workflow.
At present, the Alliance has 28 partner faculties in Europe, with more than 2,800 users collaborating in 750 working groups. Currently, 90,000 questions are stored in the IMS, and since 2007 nearly 4,600 examinations have been successfully conducted.
This article describes in detail the unique features of the IMS and contrasts it with the item management systems of other associations.
PMCID: PMC3441576  PMID: 22857655
Assessment alliance; Quality control
2.  Does Medical Students' Preference of Test Format (Computer-based vs. Paper-based) have an Influence on Performance? 
BMC Medical Education  2011;11:89.
Computer-based examinations (CBE) are more efficient than paper-based examinations (PBE) with respect to test production and scoring. However, students often have objections to CBE and fear obtaining poorer results in a CBE.
The aims of this study were (1) to assess students' readiness for, and objections to, a CBE vs. a PBE, (2) to examine acceptance of and satisfaction with a voluntarily chosen CBE, and (3) to compare the results of examinations conducted in the two formats.
Fifth-year medical students were introduced to an examination player and were free to choose the format of their test. The reasons behind the choice of format, as well as satisfaction with that choice, were evaluated after the test with a questionnaire. Additionally, expected and achieved examination results were measured.
Of 98 students, 36 (37%) voluntarily chose a CBE and 62 (63%) chose a PBE. The groups did not differ in sex, computer experience, achieved examination results, or satisfaction with the chosen format. Reasons for objecting to the CBE included the possibility of making outlines or written notes on paper, the better overview of a paper exam, the additional noise from the keyboard, and the absence of habits normally present in a paper-based exam. Students who took the CBE tended to judge their examination as more clear and understandable, and they considered their results independent of the format.
Voluntary computer-based examinations lead to equal test scores compared to a paper-based format.
PMCID: PMC3213144  PMID: 22026970
computer-based examination; paper-based examination; usability
3.  Improvement of the Educational Process by Computer-based Visualization of Procedures: Randomized Controlled Trial 
Before any invasive procedure, physicians have a legal obligation to inform patients. Traditionally, this involves a discussion with a physician, supplemented by written leaflet information directed at the specific procedure.
To compare the use and effectiveness of computer-based visualization with a standardized conversation for informing patients about forthcoming procedures (coronary catheter or endoscopy procedures).
Prospective, randomized trial with 56 participants allocated to two groups: a Visualization Group (standardized information supported by a tool displaying two-dimensional pictures to explain medical facts, plus an informative leaflet) or a Control Group (standardized information and informative leaflet only). Detailed information was given about the indication, the probable complications, and the details of the forthcoming procedure (coronary catheter or endoscopy procedures). All participants had to reach a Karnofsky score of 70 points and be able to understand German or English. The main outcome measures were patients' satisfaction with the physician-patient conversation, patients' acquired knowledge, and the duration of the intervention as described above.
Patients in the Visualization Group were more satisfied with the conversation and had higher knowledge scores after the conversation. A Mann-Whitney U test showed that these differences between the two groups in satisfaction (P<0.001) and knowledge (P=0.006) were statistically significant. The conversation took slightly longer in the Visualization Group, but this difference was not statistically significant (25 versus 23 min; P=0.441). No differences in the results of the Visualization and Control Groups were found related to age or educational level.
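The group comparison above uses the Mann-Whitney U test, a rank-based test for two independent samples. As a rough sketch of how the statistic itself is computed (pure Python, with made-up toy data rather than the study's scores):

```python
from itertools import chain

def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for two independent samples.

    Pools the observations, assigns ranks (average ranks for ties),
    computes U_a = R_a - n_a*(n_a + 1)/2 from the rank sum R_a of
    sample_a, and returns the smaller of U_a and U_b, which is the
    statistic usually reported.
    """
    pooled = sorted(chain(((x, "a") for x in sample_a),
                          ((x, "b") for x in sample_b)))
    n = len(pooled)
    rank_sum_a = 0.0
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1  # j is one past the last value tied with pooled[i]
        avg_rank = (i + 1 + j) / 2  # average of 1-based ranks i+1 .. j
        rank_sum_a += avg_rank * sum(1 for k in range(i, j)
                                     if pooled[k][1] == "a")
        i = j
    n_a, n_b = len(sample_a), len(sample_b)
    u_a = rank_sum_a - n_a * (n_a + 1) / 2
    return min(u_a, n_a * n_b - u_a)

# Toy example: complete separation of the two samples gives U = 0.
print(mann_whitney_u([1, 2, 3], [4, 5, 6]))  # 0.0
```

The P values reported in the abstract would additionally require the null distribution of U (exact for small samples, normal approximation for larger ones); in practice a library routine such as SciPy's implementation is used rather than hand-rolled code.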
Using computerized visualization increased the satisfaction and knowledge of the patients. The presentation of the visualized information in the Visualization Group did not demand significantly more time than the standard conversation in the Control Group.
PMCID: PMC1550596  PMID: 15249265
Computer-based visualization; evaluation of visualization; patient empowerment; technology assessment