Medical students’ ability to reflect was a significant, albeit weak, predictor of the quality of their case solving after allowing for the effects of knowledge and consultation skills. This is in line with findings by Sobral [21], who demonstrated a significant but weak correlation (r = 0.003) between undergraduate medical students’ scores on a reflection-in-learning scale and academic achievement. He attributed this relationship to the metacognitive skills underlying reflection, which also affect academic achievement through learning. A similar explanation can be applied to the present study. Reflection includes the ability to relive an experience, analyze it critically, and draw conclusions after careful exploration of alternatives [13]. Using such skills might have helped students with high reflection scores to understand the case content more profoundly and to give more carefully considered answers, resulting in higher case-solving scores.
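As an aside on the statistics involved, a correlation can be both weak and statistically significant when the sample is large. The sketch below is purely illustrative, using the standard t statistic for testing Pearson's r against zero; the sample size and correlation value are assumptions for the illustration, not figures from Sobral's study.

```python
import numpy as np

# Hypothetical illustration: with a large enough sample, even a weak Pearson
# correlation becomes statistically significant. The values of n and r here
# are assumptions chosen for the example.
n = 2000   # hypothetical sample size
r = 0.10   # a weak correlation

# t statistic for testing H0: rho = 0 (standard formula for Pearson's r).
t = r * np.sqrt((n - 2) / (1 - r**2))
print(round(t, 2))  # compare against the ~1.96 critical value for alpha = 0.05
```

With these assumed values the t statistic comfortably exceeds the conventional 5% critical value, which is why "significant but weak" is a coherent description of such a finding.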
Our results demonstrate that case solving both triggers and is affected by reflection. This relationship, however, is not as circular as it might appear. At its heart lies a distinction between the content and the process of reflection. Whereas the content of reflection is context specific and influenced by its triggering experience and by learners’ unique frames of reference, the process of reflection has a more generic character [34]. In the present study, case solving as a triggering experience relates to the content of reflection. The effect of reflection on case solving that we found, however, refers to the process of reflection, which is driven by more generic reflective skills.
A focus on these generic skills makes it possible to assess reflection while recognizing the uniqueness of both a learner’s frame of reference and the context in which the reflection was initiated [4]. It also counters the argument that our results can be accounted for by having measured reflection skills and the quality of case solving in the same context, whilst knowledge and consultation skills were assessed in a different context: the focus on process skills made the influence of context less important.
Although the predictive effect of reflection, knowledge, and consultation skills on the quality of case solving was statistically significant, the model explained only 10% of the total variance. From previous studies we would have expected knowledge and consultation skills to account for more variance than was demonstrated here [22]. First, this inconsistency with earlier studies may be explained by the different methods used to assess case solving. Rather than answering questions about video-cases, other studies used formats derived from objective structured clinical examinations (OSCE), such as clinical performance examinations (CPX) and Integrated Structured Clinical Exams (ISCE). These methods require practical knowledge and executive skills and have been called performance assessment in vitro, whereas the video-based approach in the present study exemplified a clinical-context-based test in which students had to demonstrate theoretical knowledge in writing [36].
Second, the specific indicators of knowledge and consultation skills may have contributed to the modest total variance explained by our model. The Dutch inter-university progress test is designed to test a greater breadth of knowledge than was needed to answer the questions in the video-cases [24]. The scores students received in the course ‘clinical, technical and communicative skills’, used as the variable for consultation skills, also included competence in radiology and pharmacology in addition to consultation and communication skills. Whilst these broader aspects of competence were not included in previous studies, they were clearly relevant to the diagnostic and treatment-planning aspects of the video-cases.
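To make the "10% of total variance" figure concrete, the sketch below shows how explained variance (R²) is computed for a regression with three predictors. The data are simulated, not the study's data; the predictor labels, effect sizes, and sample size are assumptions chosen to illustrate how a model can be meaningful yet leave most variance unexplained.

```python
import numpy as np

# Illustrative sketch (not the study's data): a regression with three
# predictors -- here labeled reflection, knowledge, and consultation -- that
# explains only a small share of the outcome variance.
rng = np.random.default_rng(0)
n = 400  # hypothetical sample size

# Simulated standardized predictor scores (columns: reflection, knowledge,
# consultation -- hypothetical labels for this example).
X = rng.normal(size=(n, 3))

# Outcome driven only weakly by the predictors, mostly by unexplained factors.
y = 0.2 * X[:, 0] + 0.15 * X[:, 1] + 0.15 * X[:, 2] + rng.normal(size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef

# R^2: the share of total variance explained by the model.
r_squared = 1 - resid.var() / y.var()
print(round(r_squared, 3))
```

With the weak effect sizes assumed here, R² comes out in the vicinity of 0.1, mirroring the situation described in the text: each predictor contributes, but individual differences outside the model dominate the outcome.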
The modest amount of variance explained by our regression model suggests that the set of three predictors was incomplete. Factors such as case difficulty, time of testing, and test environment were the same for all students; personal factors, however, could make cases more or less difficult for individual students and thus contribute to variance in the scores. Desmedt [37] identified motivation, beliefs, and self-efficacy as relevant factors, alongside gender, personality, intelligence, and learning style. Future research could address the limitations of the current study by developing a more comprehensive model of case solving. It could also test the generalizability of our findings to a workplace context and from case scores to clinical practice.