Results 1-5 (5)

1.  Relationships between academic performance of medical students and their workplace performance as junior doctors 
BMC Medical Education  2014;14:157.
Background
Little recent published evidence explores the relationship between academic performance in medical school and performance as a junior doctor. Although many forms of assessment are used to demonstrate a medical student’s knowledge or competence, these measures may not reliably predict performance in clinical practice following graduation.
Methods
This descriptive cohort study explores the relationship between academic performance of medical students and workplace performance as junior doctors. It examines the influence of age, gender, ethnicity, clinical attachment, assessment type, and summary score measures (grade point average) on workplace performance as measured by the Junior Doctor Assessment Tool.
Results
There were 200 participants. Performance as a junior doctor (combined overall score) correlated significantly with grade point average (r = 0.229, P = 0.002), the score from the Year 6 Emergency Medicine attachment (r = 0.361, P < 0.001), and the Year 6 written examination score (r = 0.178, P = 0.014). No individual method of assessment in medical school, nor gender or ethnicity, had a significant effect on the junior doctor's overall combined performance score.
Conclusion
Performance on integrated assessments from medical school is correlated with performance as a practicing physician as measured by the Junior Doctor Assessment Tool. These findings support the value of combining undergraduate assessment scores to assess competence and predict future performance.
doi:10.1186/1472-6920-14-157
PMCID: PMC4132279  PMID: 25073426
Workplace based assessment; Junior doctors; Undergraduate medicine
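
The correlations in this abstract are ordinary Pearson correlations. A minimal Python sketch of how such an r and P value are computed, using simulated rather than the study's actual data (all values and variable names below are hypothetical):

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical scores for 200 graduates; the study's raw data are not part of this listing.
gpa = rng.normal(loc=70, scale=8, size=200)                # grade point average
jdat = 0.3 * gpa + rng.normal(loc=50, scale=10, size=200)  # combined Junior Doctor Assessment Tool score

r, p = pearsonr(gpa, jdat)
print(f"r = {r:.3f}, P = {p:.3f}")  # analogous in form to the reported r = 0.229, P = 0.002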
2.  Assessment of Junior Doctor performance: a validation study 
BMC Medical Education  2013;13:129.
Background
In recent years, Australia has developed a National Junior Doctor Curriculum Framework that sets out the expected standards and describes areas of performance for junior doctors, and through this has allowed a national approach to junior doctor assessment to develop. Given the significance of the judgments made, in terms of patient safety, the development of junior doctors, and decisions on whether junior doctors progress to the next stage of training, it is essential to develop and validate assessment tools as rigorously as possible. This paper reports a validation study of the Junior Doctor Assessment Tool as used for PGY1 doctors, evaluating the psychometric properties of the instrument and exploring the effect of length of experience as a PGY1 on assessment scores.
Methods
This validation study of the Australian-developed Junior Doctor Assessment Tool, as used for PGY1 doctors in three public hospitals and associated hospitals in Western Australia across a two-year period, addressed two core aims: (1) to evaluate the psychometric properties of the instrument; and (2) to explore the effect of length of experience as a PGY1 on assessment scores.
Results
The highest mean scores were for professional behaviours, teamwork, and interpersonal skills; the lowest were for procedures. Most junior doctors were assessed three or more times, and scores in the first rotation did not differ from those in subsequent rotations. While statistically significant, the number of times doctors were assessed appeared to have little practical influence on the scores obtained. Principal component analysis identified that two principal components of junior doctor performance are being assessed, rather than the commonly reported three. A Cronbach's alpha of 0.883 was calculated for the 10-item scale.
Conclusions
Now that the components of the tool have been analysed, it becomes more meaningful, and potentially more influential, to consider these factors when weighing the educational impact of this assessment process for monitoring junior doctor development and progression.
doi:10.1186/1472-6920-13-129
PMCID: PMC4015703  PMID: 24053267
Educational assessment; Undergraduate medical education; Internship
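
The two psychometric analyses named above, Cronbach's alpha for a 10-item scale and a principal component analysis, can be sketched in Python as follows. The item scores are simulated, and the eigenvalue > 1 extraction rule is an assumption; the abstract does not quote the authors' retention criterion.

import numpy as np

rng = np.random.default_rng(0)

# 300 hypothetical assessments x 10 items, correlated via a shared latent trait.
latent = rng.normal(size=(300, 1))
items = latent + rng.normal(scale=0.8, size=(300, 10))

def cronbach_alpha(x):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.3f}")

# Principal component analysis on the item correlation matrix;
# count components retained under the (assumed) eigenvalue > 1 rule.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
print("Components with eigenvalue > 1:", int((eigvals > 1).sum()))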
3.  Does self reflection and insight correlate with academic performance in medical students? 
BMC Medical Education  2013;13:113.
Background
Medical students in academic difficulty are often described as lacking insight. The Self Reflection and Insight Scale (SRIS) is a tool for measuring insight which has been validated in medical students. We investigated whether self reflection and insight scores correlate with academic performance in Year 4 medical students from a six-year undergraduate medical degree, and whether self reflection and insight change after one year of clinical training.
Methods
Self reflection and insight scores were measured in 162 students at the start of Year 4 at the University of Western Australia. Performance in end-of-year written and clinical exams was monitored and correlated with SRIS scores. Seventy of the students were surveyed again at the start of Year 5 to determine whether scores had changed or remained stable after one year of full-time clinical training.
Results
We found no correlation between self reflection or insight and academic performance in written and clinical exams. There was a significant increase in recognition of the need for self reflection in Year 5 compared with Year 4.
Conclusions
While no correlation was found between this measure of self reflection and insight and academic performance, there was an increase in students' recognition of the need for reflection after one year of clinical studies. This study is a valuable first step towards a potentially exciting research domain and warrants further longitudinal evaluation with larger cohorts of students using additional measures of achievement.
doi:10.1186/1472-6920-13-113
PMCID: PMC3765283  PMID: 23971859
Self reflection; Insight; Medical students
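
The Year 4 to Year 5 comparison above is a paired before-and-after measurement on the 70 re-surveyed students. A paired t-test is one plausible analysis; the abstract does not state which test the authors used, so the sketch below, with simulated SRIS subscale scores, is an assumption.

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)

# Hypothetical "need for self reflection" subscale scores for 70 students.
year4 = rng.normal(loc=30, scale=5, size=70)
year5 = year4 + rng.normal(loc=1.5, scale=4, size=70)  # simulated increase after a clinical year

t, p = ttest_rel(year5, year4)
print(f"t = {t:.2f}, P = {p:.4f}")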
4.  Potential influence of selection criteria on the demographic composition of students in an Australian medical school 
BMC Medical Education  2011;11:97.
Background
Prior to 1999, students entering our MBBS course were selected on academic performance alone. We have now evaluated how the addition of an aptitude test (UMAT), a highly structured interview, and a rural incentive program to the selection process affected the demographics of subsequent cohorts of our standard-entry students (those entering directly from high school).
Methods
Students entering from 1985 to 1998, selected on academic performance alone (N = 1402), were compared with those entering from 1999 to 2011, selected on a combination of academic performance, interview score, and UMAT score, together with the progressive introduction of a rural special-entry pathway (N = 1437).
Results
Males decreased from 57% to 45% of the cohort, students of NE or SE Asian origin decreased from 30% to 13%, students born in Oceania increased from 52% to 69%, students of rural origin increased from 5% to 21%, and those from independent high schools increased from 56% to 66%. The proportion of students from high schools with relative socio-educational disadvantage remained unchanged at approximately 10%. The changes reflect in part the increasing numbers of female and independent high school applicants and the increasing rural quota. However, they were also associated with higher interview scores in females than in males, and lower interview scores in those of NE and SE Asian origin compared with those born in Oceania or the UK. Total UMAT scores were unrelated to gender or region of origin.
Conclusions
The revised selection processes had no impact on student representation from schools with relative socio-educational disadvantage. However, the introduction of special-entry quotas for students of rural origin and of a structured interview, but not of an aptitude test, was associated with a change in the gender balance and ethnicity of students in an Australian undergraduate MBBS course.
doi:10.1186/1472-6920-11-97
PMCID: PMC3233506  PMID: 22111521
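
Each demographic shift above compares a proportion across the two selection eras (N = 1402 vs N = 1437). A chi-square test on the resulting 2 x 2 table is one standard way to test such a shift; the test choice is an assumption, and the counts below are back-calculated from the reported percentages.

from scipy.stats import chi2_contingency

# Males: 57% of 1402 students (1985-1998) vs 45% of 1437 (1999-2011).
males_before, n_before = round(0.57 * 1402), 1402
males_after, n_after = round(0.45 * 1437), 1437

table = [
    [males_before, n_before - males_before],  # earlier cohort: males, non-males
    [males_after, n_after - males_after],     # later cohort: males, non-males
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, P = {p:.2e}")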
5.  Designing and implementing a skills program using a clinically integrated, multi-professional approach: using evaluation to drive curriculum change 
Medical Education Online  2009.
The essential procedural skills that newly graduated doctors require are rarely defined, do not take into account pre-vocational employer expectations, and differ between universities. This paper describes how one Faculty used local evaluation data to drive curriculum change and implement a clinically integrated, multi-professional skills program. A curriculum restructure included a review of all undergraduate procedural skills training by academic staff and clinical departments, resulting in a curriculum skills map. Undergraduate training was then linked with postgraduate expectations using the Delphi process to identify the skills requiring structured, standardised training. The skills program was designed and implemented without a dedicated simulation center. This paper shows the benefits of an alternative model in which clinical integration of training and multi-professional collaboration encouraged broad ownership of the program and, in turn, shaped the clinical experience obtained.
doi:10.3885/meo.2009.F0000221
PMCID: PMC2779614  PMID: 20165528
multi-professional; skills training; undergraduate medicine
