At the start of a new graduate-entry medical school in Australia, an electronic Clinical Log was implemented, as part of a learning portfolio, to capture students' learning experiences and foster learning in a range of clinical settings throughout the course. It also aimed to develop a habit of critical reflection while learning was taking place, and to provide feedback to students and the institution on learning progress. As such, the Clinical Log had high face validity and utility for formative assessment, but sound psychometrics were needed for high-stakes summative purposes [20]. To address reliability issues such as high variability of scoring between examiners, the Log assessment was carefully introduced, with clear articulation of criteria to both students and assessors. Experienced, trained assessors, who understood the purpose of the assessment and the expected student performance, were used. This study, the first in a series reporting on outcomes and challenges associated with the Clinical Log, showed that most students had a low initial level of engagement with the Log until motivated by the inclusion of a Clinical Log station in the end-of-Phase OSCEs. This demonstrated the well-known principle that 'assessment drives learning'. However, the initiative encouraged student engagement with the school's aim of fostering student reflection for professional reasons.
The improvement in cohort performance on the Log station with subsequent OSCEs was attributed to the following: assessment criteria were communicated to students and assessors prior to each exam; a formative assessment experience was offered prior to the summative testing; and post-exam feedback and remediation were offered to all students, especially borderline and failing students.
While it seemed that the Log OSCE assessment was the major motivator for recording of clinical experiences, it was pleasing to observe greater use by Phase 3 students throughout the year-long community-based integrated placement. This may have been due to greater exposure to ‘undifferentiated patients’ and a growing appreciation of the value of self-reflection.
For this paper, only the quantity of student reflection in the Log has been analysed. It seemed that, for most students, inclusion of the reflective criteria in the OSCE station was necessary for engagement in this desired professional activity. The quality of the reflections in the Log needs further evaluation. The fact that many students were only motivated to record and reflect on clinical encounters when the OSCE assessment approached suggested that most reflection occurred on, rather than in, professional action [14]. Reflection on action can occur when the student enters and reflects on each patient interaction in the Log, or shares Log experiences with preceptors, peers and tutors. Delayed Log entry may have limited impact on reflection on action, but it is likely to significantly impede reflection in action. The longitudinal placement is valued as it offers students the benefits of long-term patient follow-up (continuity-of-care experiences) and ongoing reflection on diagnosis and management decisions as the presentation unfolds (reflection in action). While the Log allowed recording of continuity of care for individual patients, use of this facility and reflection in action require Log entry closely following the patient interaction, rather than delayed entry as assessment approaches. The planned development of a mobile Clinical Log application may facilitate student engagement with more reflection in, as well as on, action.
The School continues to monitor student and clinician feedback on the Clinical Log, addressing issues that have discouraged its use. More guidance is being offered on expectations and the educational use of the Log, presenting it as an activity for, rather than in competition with, learning. With the growing use of e-portfolios and logs in undergraduate and postgraduate education, and recognition of the value of self-reflection for professional competency [15], the clinicians who supervise and/or mentor students have been offered more training on providing constructive feedback on learners' personal and professional development, and on reflection about this. This was deemed important for those who embraced completion of a dossier of evidence but were less comfortable with the reflective component of the Log and its assessment. Potentially, the investment in faculty professional development will have benefits for learners across the vertical continuum of medical education, and for faculty themselves (as most teachers contribute to both undergraduate and postgraduate medical education, and may use logs or portfolios themselves in continuing medical education).
Further work is underway to review the quality, as well as the quantity, of the reflections and to correlate these with learner academic success. This should help strengthen the evidence base for the use of electronic reflective logs as part of learning portfolios in undergraduate medical education, and build learner confidence in the value of reflection for developing professional artistry. It will be interesting to investigate further at what stage students come to appreciate the value of self-reflection as part of being a competent physician.
Recent work on different aspects of a portfolio approach to competency-based assessment has reported the value of giving students the responsibility of 'interpreting, selecting and combining formative assessments received during the year, to document their performance in a learning portfolio for summative decisions' [28]. The authors advise that this has helped students to internalise the self-regulation process, which is potentially more valuable for professional development than an extrinsic driver of portfolio use, such as assessment.