Results 1-6 (6)
1.  Cumulative assessment: strategic choices to influence students’ study effort 
BMC Medical Education  2013;13:172.
Background
It has been asserted that assessment can and should be used to drive students’ learning. In the current study, we present a cumulative assessment program in which test planning, repeated testing and compensation are combined in order to influence study effort. The program is aimed at helping initially low-scoring students improve their performance during a module, without impairing initially high-scoring students’ performance. We used performance as a proxy for study effort and investigated whether the program worked as intended.
Methods
We analysed students’ test scores in two second-year (n = 494 and n = 436) and two third-year modules (n = 383 and n = 345) in which cumulative assessment was applied. We used t-tests to compare the change in test scores of initially low-scoring students with that of initially high-scoring students between the first and second subtest and again between the combined first and second subtest and the third subtest. During the interpretation of the outcomes we took regression to the mean and test difficulty into account.
Results
Between the first and the second subtest in all four modules, the scores of initially low-scoring students increased more than the scores of initially high-scoring students decreased. Between subtests two and three, we found a similar effect in one module, no significant effect in two modules and the opposite effect in another module.
Conclusion
The results between the first two subtests suggest that cumulative assessment may positively influence students’ study effort. The inconsistent outcomes between subtests two and three may be caused by differences in perceived imminence, impact and workload between the third subtest and the first two. Cumulative assessment may serve as an example of how several evidence-based assessment principles can be integrated into a program for the benefit of student learning.
doi:10.1186/1472-6920-13-172
PMCID: PMC3880587  PMID: 24370117
Summative assessment; Learning effects of assessment; Medical education; Higher education; Knowledge development; Knowledge retention; Test enhanced learning; Cumulative assessment; Repeated testing
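The Methods above compare the score change of initially low-scoring students with that of initially high-scoring students using t-tests. As a minimal sketch of that comparison (the sample values below are hypothetical illustrations, not data from the study), Welch's t statistic for two independent groups with unequal variances can be computed as:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)          # sample variances
    se = math.sqrt(va / len(a) + vb / len(b))  # standard error of the difference
    return (mean(a) - mean(b)) / se

# Hypothetical subtest score changes: gains of initially low scorers
# versus changes of initially high scorers.
low_gain = [5, 7, 6, 8, 4]
high_change = [-2, -1, -3, 0, -2]
t = welch_t(low_gain, high_change)  # t ≈ 8.72
```

As the abstract notes, such a difference must still be interpreted with regression to the mean and test difficulty in mind.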
2.  Which characteristics of written feedback are perceived as stimulating students’ reflective competence: an exploratory study 
BMC Medical Education  2013;13:94.
Background
Teacher feedback on student reflective writing is recommended to improve learners’ reflective competence. To be able to improve teacher feedback on reflective writing, it is essential to gain insight into which characteristics of written feedback stimulate students’ reflection processes. Therefore, we investigated (1) which characteristics can be distinguished in written feedback comments on reflective writing and (2) which of these characteristics are perceived to stimulate students’ reflection processes.
Methods
We investigated written feedback comments from forty-three teachers on their students’ reflective essays. In Study 1, twenty-three medical educators grouped the comments into distinct categories. We used Multiple Correspondence Analysis to determine dimensions in the set of comments. In Study 2, another group of twenty-one medical educators individually judged whether the comments stimulated reflection by rating them on a five-point scale. We used t-tests to investigate whether comments classified as stimulating and not stimulating reflection differed in their scores on the dimensions.
Results
Our results showed that characteristics of written feedback comments can be described in three dimensions: format of the feedback (phrased as statement versus question), focus of the feedback (related to the levels of students’ reflections) and tone of the feedback (positive versus negative). Furthermore, comments phrased as a question and in a positive tone were judged as stimulating reflection more than comments on the opposite side of those dimensions (t(14.5) = 6.48, p < .001 and t(15) = −1.80, p < .10, respectively). The effect sizes were large for format of the feedback comment (r = .86) and medium for tone of the feedback comment (r = .42).
Conclusions
This study suggests that written feedback comments on students’ reflective essays should be formulated as a question, positive in tone and tailored to the individual student’s reflective level in order to stimulate students to reflect on a slightly higher level. Further research is needed to examine whether incorporating these characteristics into teacher training helps to improve the quality of written feedback comments on reflective writing.
doi:10.1186/1472-6920-13-94
PMCID: PMC3750500  PMID: 23829790
Undergraduate medical education; Written feedback; Reflective writing; Professional development
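Assuming the standard conversion from a t statistic to an effect-size correlation, r = √(t² / (t² + df)), the effect sizes reported in this abstract follow directly from its t values, which a quick check reproduces:

```python
import math

def r_from_t(t, df):
    """Effect size r for a t statistic: r = sqrt(t^2 / (t^2 + df))."""
    return math.sqrt(t * t / (t * t + df))

# Reproduce the abstract's reported effect sizes:
print(round(r_from_t(6.48, 14.5), 2))  # 0.86 (format of the feedback)
print(round(r_from_t(-1.80, 15), 2))   # 0.42 (tone of the feedback)
```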
3.  The effect of implementing undergraduate competency-based medical education on students’ knowledge acquisition, clinical performance and perceived preparedness for practice: a comparative study 
BMC Medical Education  2013;13:76.
Background
Little is known about the gains and losses associated with the implementation of undergraduate competency-based medical education. Therefore, we compared knowledge acquisition, clinical performance and perceived preparedness for practice of students from a competency-based active learning (CBAL) curriculum and a prior active learning (AL) curriculum.
Methods
We included two cohorts of both the AL curriculum (n = 453) and the CBAL curriculum (n = 372). Knowledge acquisition was determined by benchmarking each cohort on 24 interuniversity progress tests against parallel cohorts of two other medical schools. Differences in knowledge acquisition were determined by comparing the number of times CBAL and AL cohorts scored significantly higher or lower on progress tests. Clinical performance was operationalized as students’ mean clerkship grade. Perceived preparedness for practice was assessed using a survey.
Results
The CBAL cohorts demonstrated relatively lower knowledge acquisition than the AL cohorts during the first study years, but not at the end of their studies. We found no significant differences in clinical performance. Concerning perceived preparedness for practice, we found no significant differences, except that students from the CBAL curriculum felt better prepared for ‘putting a patient problem in a broad context of political, sociological, cultural and economic factors’ than students from the AL curriculum.
Conclusions
Our data do not support the assumption that competency-based education results in graduates who are better prepared for medical practice. More research is needed before we can draw generalizable conclusions on the potential of undergraduate competency-based medical education.
doi:10.1186/1472-6920-13-76
PMCID: PMC3668236  PMID: 23711403
Medical education; Competency-based education; Undergraduate medical education; Competence; Curriculum development; Curriculum comparison; Active learning; Clinical performance; Self-efficacy; Progress test
4.  Does reflection have an effect upon case-solving abilities of undergraduate medical students? 
BMC Medical Education  2012;12:75.
Background
Reflection on professional experience is increasingly accepted as a critical attribute for health care practice; however, evidence that it has a positive impact on performance remains scarce. This study investigated whether, after allowing for the effects of knowledge and consultation skills, reflection had an independent effect on students’ ability to solve problem cases.
Methods
Data were collected from 362 undergraduate medical students at Ghent University who solved video cases and reflected on the experience of doing so. Results on a progress test and on a course teaching consultation skills were used as measures of knowledge and consultation skills, respectively. Stepwise multiple linear regression analysis was used to test the relationship between the quality of case-solving (dependent variable) and reflection skills, knowledge, and consultation skills (independent variables).
Results
Only students with data available on all variables (n = 270) were included in the analysis. The model was significant (ANOVA F(3,269) = 11.00, p < 0.001, adjusted R² = 0.10), with all predictors contributing significantly.
Conclusion
Medical students’ reflection had a small but significant effect on case-solving, which supports reflection as an attribute for performance. These findings suggest that it would be worthwhile testing the effect of reflection skills training on clinical competence.
doi:10.1186/1472-6920-12-75
PMCID: PMC3492041  PMID: 22889271
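The adjusted R² reported in this abstract corrects the raw R² for the number of predictors in the model. A minimal sketch of that correction (the raw R² of 0.11 is a hypothetical value chosen to illustrate the formula with the study's n = 270 students and three predictors, not a figure reported by the study):

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# With n = 270 students and p = 3 predictors (reflection skills,
# knowledge, consultation skills), a hypothetical raw R^2 of 0.11
# yields an adjusted R^2 near the reported 0.10:
print(round(adjusted_r2(0.11, 270, 3), 2))  # 0.1
```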
5.  Using video-cases to assess student reflection: Development and validation of an instrument 
BMC Medical Education  2012;12:22.
Background
Reflection is a meta-cognitive process, characterized by: 1. Awareness of self and the situation; 2. Critical analysis and understanding of both self and the situation; 3. Development of new perspectives to inform future actions. Assessors can only access reflections indirectly through learners’ verbal and/or written expressions. Being privy to the situation that triggered reflection could place reflective materials into context. Video-cases make that possible and, coupled with a scoring rubric, offer a reliable way of assessing reflection.
Methods
Fourth and fifth year undergraduate medical students were shown two interactive video-cases and asked to reflect on this experience, guided by six standard questions. The quality of students’ reflections was scored using a specially developed Student Assessment of Reflection Scoring rubric (StARS®). Reflection scores were analyzed concerning interrater reliability and ability to discriminate between students. Further, the intra-rater reliability and case specificity were estimated by means of a generalizability study with rating and case scenario as facets.
Results
Reflection scores of 270 students ranged widely and interrater reliability was acceptable (Krippendorff’s alpha = 0.88). The generalizability study suggested 3 or 4 cases were needed to obtain reliable ratings from 4th year students and ≥ 6 cases from 5th year students.
Conclusion
Use of StARS® to assess student reflections triggered by standardized video-cases had acceptable discriminative ability and reliability. We offer this practical method for assessing reflection summatively and for providing formative feedback in training situations.
doi:10.1186/1472-6920-12-22
PMCID: PMC3426495  PMID: 22520632
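The generalizability study above asks how many cases are needed to obtain reliable ratings. In a decision (D) study, this is commonly projected with a Spearman–Brown style formula; a minimal sketch, where the single-case reliability of 0.45 is a hypothetical value and not one reported by the study:

```python
def projected_reliability(r1, k):
    """Spearman-Brown projection: reliability of a mean score over k cases,
    given the reliability r1 of a single case."""
    return k * r1 / (1 + (k - 1) * r1)

# With a hypothetical single-case reliability of 0.45, about 4 cases
# are needed to push the projected reliability above 0.75:
for k in range(1, 7):
    print(k, round(projected_reliability(0.45, k), 2))
```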
6.  Factors confounding the assessment of reflection: a critical review 
BMC Medical Education  2011;11:104.
Background
Reflection on experience is an increasingly critical part of professional development and lifelong learning. There is, however, continuing uncertainty about how best to put principle into practice, particularly as regards assessment. This article explores those uncertainties in order to find practical ways of assessing reflection.
Discussion
We critically review four problems: 1. Inconsistent definitions of reflection; 2. Lack of standards to determine (in)adequate reflection; 3. Factors that complicate assessment; 4. Internal and external contextual factors affecting the assessment of reflection.
Summary
To address the problem of inconsistency, we identified processes that were common to a number of widely quoted theories and synthesised a model, which yielded six indicators that could be used in assessment instruments. We concluded that, until further progress has been made in defining standards, assessment must depend on developing and communicating local consensus between stakeholders (students, practitioners, teachers, supervisors, curriculum developers) about what is expected in exercises and formal tests. Major factors that complicate assessment are the subjective nature of reflection’s content and the dependence on assessees’ own descriptions of their reflection process, without any objective means of verification. To counter these validity threats, we suggest that assessment should focus on generic process skills rather than the subjective content of reflection and, where possible, consider objective information about the triggering situation to verify described reflections. Finally, internal and external contextual factors such as motivation, instruction, the character of assessment (formative or summative) and the ability of individual learning environments to stimulate reflection should be considered.
doi:10.1186/1472-6920-11-104
PMCID: PMC3268719  PMID: 22204704