PMC

Results 1-4 (4)
 

1.  The national portfolio for postgraduate family medicine training in South Africa: a descriptive study of acceptability, educational impact, and usefulness for assessment 
BMC Medical Education  2013;13:101.
Background
Since 2007, a portfolio of learning has been a requirement for the assessment of postgraduate family medicine training by the Colleges of Medicine of South Africa. A uniform portfolio of learning has been developed, and its content validity established across the eight postgraduate programmes. The aim of this study was to investigate the portfolio’s acceptability, educational impact, and perceived usefulness for the assessment of competence.
Methods
Two structured questionnaires containing 35 closed and open-ended questions were delivered to 53 family physician supervisors and 48 registrars who had used the portfolio. Categorical and nominal/ordinal data were analysed using simple descriptive statistics. The open-ended questions were analysed with ATLAS.ti software.
Results
Half of the registrars did not find the portfolio clear, practical, or feasible. Workshops on portfolio use, learning, and supervision were supported, as was brief dedicated daily time for reflection and writing. Most supervisors felt the portfolio reflected an accurate picture of learning, but just over half of the registrars agreed. While the portfolio helped with reflection on learning, participants were less convinced that it helped them plan further learning. Supervisors graded most rotations, suggesting that the summative aspect was understood, while only 61% of registrars reflected on rotations, suggesting that the formative aspects are not yet optimally utilised. Poor feedback, the need for protected academic time, and the pressure of service delivery were reported as impacting negatively on learning.
Conclusion
This first introduction of a national portfolio for postgraduate training in family medicine in South Africa faces challenges similar to those in other countries. Acceptability of the portfolio relates to a clear purpose and guide, flexible format with tools available in the workplace, and appreciating the changing educational environment from university-based to national assessments. The role of the supervisor in direct observations of the registrar and dedicated educational meetings, giving feedback and support, cannot be overemphasized.
doi:10.1186/1472-6920-13-101
PMCID: PMC3733709  PMID: 23885806
2.  Does reflection have an effect upon case-solving abilities of undergraduate medical students? 
BMC Medical Education  2012;12:75.
Background
Reflection on professional experience is increasingly accepted as a critical attribute for health care practice; however, evidence that it has a positive impact on performance remains scarce. This study investigated whether, after allowing for the effects of knowledge and consultation skills, reflection had an independent effect on students’ ability to solve problem cases.
Methods
Data were collected from 362 undergraduate medical students at Ghent University, who solved video cases and reflected on the experience of doing so. Results on a progress test and on a course teaching consultation skills were used as measures of knowledge and consultation skills, respectively. Stepwise multiple linear regression analysis was used to test the relationship between the quality of case-solving (dependent variable) and reflection skills, knowledge, and consultation skills (independent variables).
Results
Only students with data available on all variables (n = 270) were included in the analysis. The model was significant (ANOVA F(3,269) = 11.00, p < 0.001, adjusted R² = 0.10), with all variables contributing significantly.
Conclusion
Medical students’ reflection had a small but significant effect on case-solving, which supports reflection as an attribute for performance. These findings suggest that it would be worthwhile testing the effect of reflection skills training on clinical competence.
doi:10.1186/1472-6920-12-75
PMCID: PMC3492041  PMID: 22889271
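The adjusted R² reported above is related to the raw R² by the standard small-sample correction, adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1). As a minimal sketch, the raw R² of ≈ 0.11 used below is back-calculated for illustration and is not a figure reported in the abstract; n = 270 and k = 3 predictors are taken from the study:

```python
def adjusted_r2(r2, n, k):
    """Small-sample correction of R^2 for n observations and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# n = 270 students, k = 3 predictors (reflection, knowledge, consultation skills).
# An assumed raw R^2 of ~0.11 reproduces the reported adjusted R^2 of ~0.10.
print(round(adjusted_r2(0.11, 270, 3), 2))  # → 0.1
```

The correction matters little at this sample size (269/266 ≈ 1.01), but it prevents the small apparent gain in fit that adding weak predictors would otherwise produce.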
3.  Using video-cases to assess student reflection: Development and validation of an instrument 
BMC Medical Education  2012;12:22.
Background
Reflection is a meta-cognitive process, characterized by: 1. Awareness of self and the situation; 2. Critical analysis and understanding of both self and the situation; 3. Development of new perspectives to inform future actions. Assessors can only access reflections indirectly through learners’ verbal and/or written expressions. Being privy to the situation that triggered reflection could place reflective materials into context. Video-cases make that possible and, coupled with a scoring rubric, offer a reliable way of assessing reflection.
Methods
Fourth- and fifth-year undergraduate medical students were shown two interactive video-cases and asked to reflect on this experience, guided by six standard questions. The quality of students’ reflections was scored using a specially developed Student Assessment of Reflection Scoring rubric (StARS®). Reflection scores were analyzed for interrater reliability and the ability to discriminate between students. Further, the intra-rater reliability and case specificity were estimated by means of a generalizability study with rating and case scenario as facets.
Results
Reflection scores of 270 students ranged widely, and interrater reliability was acceptable (Krippendorff’s alpha = 0.88). The generalizability study suggested that 3 or 4 cases were needed to obtain reliable ratings from fourth-year students and ≥ 6 cases from fifth-year students.
Conclusion
Use of StARS® to assess student reflections triggered by standardized video-cases showed acceptable discriminative ability and reliability. We offer this practical method for assessing reflection summatively and for providing formative feedback in training situations.
doi:10.1186/1472-6920-12-22
PMCID: PMC3426495  PMID: 22520632
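The finding that several cases are needed before ratings become reliable can be illustrated with the Spearman-Brown prophecy formula, r_k = k·r₁ / (1 + (k − 1)·r₁), a standard tool in generalizability-style decision studies. A minimal sketch, assuming an illustrative single-case reliability of r₁ = 0.4 and a target composite reliability of 0.7 (both values are assumptions, not figures from the paper):

```python
def spearman_brown(r1, k):
    """Projected reliability of a composite of k cases, given single-case reliability r1."""
    return k * r1 / (1 + (k - 1) * r1)

def cases_needed(r1, target, max_k=100):
    """Smallest number of cases whose projected composite reliability reaches target."""
    for k in range(1, max_k + 1):
        if spearman_brown(r1, k) >= target:
            return k
    return None

# With an assumed single-case reliability of 0.4, reaching 0.7 takes 4 cases;
# a noisier single case (0.3) pushes the requirement to 6.
print(cases_needed(0.4, 0.7), cases_needed(0.3, 0.7))  # → 4 6
```

The same mechanism explains the pattern in the abstract: when individual case scores carry less information about a student, more cases must be averaged to reach a usable reliability.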
4.  Factors confounding the assessment of reflection: a critical review 
BMC Medical Education  2011;11:104.
Background
Reflection on experience is an increasingly critical part of professional development and lifelong learning. There is, however, continuing uncertainty about how best to put principle into practice, particularly as regards assessment. This article explores those uncertainties in order to find practical ways of assessing reflection.
Discussion
We critically review four problems: 1. Inconsistent definitions of reflection; 2. Lack of standards to determine (in)adequate reflection; 3. Factors that complicate assessment; 4. Internal and external contextual factors affecting the assessment of reflection.
Summary
To address the problem of inconsistency, we identified processes common to a number of widely quoted theories and synthesised them into a model, which yielded six indicators that could be used in assessment instruments. We concluded that, until further progress has been made in defining standards, assessment must depend on developing and communicating local consensus between stakeholders (students, practitioners, teachers, supervisors, curriculum developers) about what is expected in exercises and formal tests. Major factors that complicate assessment are the subjective nature of reflection's content and the dependence on learners' own descriptions of their reflection process, without any objective means of verification. To counter these validity threats, we suggest that assessment focus on generic process skills rather than the subjective content of reflection and, where possible, draw on objective information about the triggering situation to verify described reflections. Finally, internal and external contextual factors such as motivation, instruction, the character of the assessment (formative or summative), and the capacity of individual learning environments to stimulate reflection should be considered.
doi:10.1186/1472-6920-11-104
PMCID: PMC3268719  PMID: 22204704
