1.  Good exams made easy: The item management system for multiple examination formats 
BMC Medical Education  2012;12:63.
Background
The development, implementation, and evaluation of assessments require considerable resources and often cannot be carried out by a single faculty or institution. Therefore, some medical faculties have founded cooperation projects that focus mainly on the exchange of multiple-choice questions (MCQs).
Methods
Because these cooperation projects do not fully support all relevant processes of assessment preparation, implementation, and evaluation, the Medical Assessment Alliance (MAA) was founded in 2006 for mutual support. Beyond exchanging MCQs, the MAA began developing innovative assessment formats and facilitating content through a coordinated exchange of experience. To support cooperation within this network, the web-based Item Management System (IMS) was developed as an all-in-one working platform covering all processes of the assessment workflow.
Results
At present, the Alliance has 28 partner faculties in Europe. More than 2,800 users in 750 working groups are collaborating. Currently, 90,000 questions are stored in the IMS. Since 2007, nearly 4,600 examinations have been successfully conducted.
Conclusion
This article describes in detail the unique features of the IMS and contrasts it with the item management systems of other associations.
doi:10.1186/1472-6920-12-63
PMCID: PMC3441576  PMID: 22857655
Keywords: Assessment alliance; Quality control
2.  Does Medical Students' Preference of Test Format (Computer-based vs. Paper-based) have an Influence on Performance? 
BMC Medical Education  2011;11:89.
Background
Computer-based examinations (CBE) ensure higher efficiency with respect to production and assessment than paper-based examinations (PBE). However, students often have objections to CBE and fear obtaining poorer results in this format.
The aims of this study were (1) to assess students' readiness for and objections to a CBE vs. a PBE, (2) to examine acceptance of and satisfaction with a CBE taken on a voluntary basis, and (3) to compare the results of the examinations conducted in the two formats.
Methods
Fifth-year medical students were introduced to an examination player and were free to choose the format of their test. The reasons for the chosen format, as well as satisfaction with the choice, were evaluated after the test with a questionnaire. Additionally, the expected and achieved examination results were measured.
Results
Of 98 students, 36 (37%) voluntarily chose a CBE and 62 (63%) chose a PBE. The two groups did not differ in sex, computer experience, achieved examination results, or satisfaction with the chosen format. Students' objections to CBE included the lack of options for outlining or making written notes, a poorer overview, additional noise from the keyboard, and the absence of habits familiar from paper-based exams. Students who took the CBE tended to judge their examination as clearer and more understandable, and they considered their results to be independent of the format.
Conclusions
Voluntary computer-based examinations lead to test scores equal to those of a paper-based format.
doi:10.1186/1472-6920-11-89
PMCID: PMC3213144  PMID: 22026970
Keywords: computer-based examination; paper-based examination; usability
