The General Medical Council (GMC) is holding consultations to decide on proposed changes to the undergraduate medical assessment. In the last round of consultation, only eight medical students nationally responded formally.
To determine the views of a larger proportion of final year medical students across the country on the proposed changes to the undergraduate medical assessment.
An online national survey of 10 medical schools, from which 401 responses from final year medical students were collected.
The results indicate medical students' views on the GMC's proposed changes to standardise the assessment system. The majority of students were in favour of having a say in any changes to their future assessment. They agreed with the principle that there should be consistency between assessments at different medical schools and that their current results did not represent preparedness to practise.
The General Medical Council (GMC) undertook an initial facilitative consultation from 1 July to 1 October 2005 on the strategic options for undergraduate medical education.1 The purpose of this consultation was to improve the quality of undergraduate medical education and to ensure patient safety. Among the issues under consideration were the proposed changes to “undergraduate medical assessment” in addition to “fitness to practise”. The final report on the outcomes of the strategic options consultation has now been published.2 A further period of informal consultation on the changes to the undergraduate medical assessment system closed on 25 October 2006. Within its broader remit this looked into policy implications of “shared questions” in the examinations set by medical schools and/or “a national examination”, success in which could be a condition for graduation and/or provisional registration.3
There are examples from the USA, where the United States Medical Licensing Examination is used as a national assessment for both American and foreign medical graduates. This system helps in grading and ranking medical students and is used by all residency programmes as a basis for shortlisting candidates.
Before the compilation of the final report on “strategic options for undergraduate medical education”, the GMC had consulted many stakeholders. These included medical educators, doctors, medical schools, the Royal Colleges, the British Medical Association, the Conference of Postgraduate Medical Deans, the Department of Health, students/pre‐registration house officers, patient groups, GMC lay members/associates, postgraduate deaneries, equality and diversity groups, members of the public, members of parliament, and healthcare professionals.2 In addition, the GMC held seminars and student debates on these issues. Only eight medical students formally responded to this consultation (personal communication with GMC Education Committee). The proposed changes will have a profound effect on the training and assessment of medical students. They may also change the way medical students are selected for foundation posts. In addition, the GMC has recognised that more evidence is required before any large scale changes are implemented.2 This made it crucial to know the views of a larger number of final year medical students on these issues. This nationwide survey aimed to find out the opinion of final year medical students on the proposed changes to the assessment system.
The survey questionnaire was based on the GMC document for informal consultation.4 The themes to be surveyed were put together and sent to the GMC Education Committee for validation/approval, and the students were then surveyed. The survey statements are summarised in appendix 1. Questions were asked on both the proposed national assessment and the bank of shared questions for the final year medical examination. An online survey form was developed using Weblearn (Oxford University Virtual Learning Environment). Students had an opportunity to read the GMC document on the GMC website if they needed more information before answering the questionnaire. A deadline of 25 October 2006 was set, in line with the GMC's deadline for informal feedback on the strategic proposals on assessment.
Twenty‐seven medical schools nationwide were identified and contacted by telephone. Permission was sought so that an email containing the link to the online survey could be forwarded to all of their final year students. Ten medical schools agreed to take part in the survey (table 1).
Nineteen questions had five response options each, ranging from “strongly agree” to “strongly disagree”. Five questions asked students to choose one of two or three proposed options. The responses were collected automatically by Weblearn and imported into SPSS version 14.02 for Windows for analysis. The numbers of final year medical students in each of the responding medical schools were summed to determine the overall response rate.
Ten medical schools participated out of 27 nationwide, a school-level response rate of 37%. A total of 401 online responses were collected. The total number of final year medical students at the participating medical schools was 2302, giving a calculated student response rate of 17.4%. The respondents represented a fair number of students from different parts of the country, from medical schools established more than 5 years ago (n=8) and less than 5 years ago (n=2).
In response to each question, two patterns were recognised. The first pattern was a strong opinion forming “for” or “against” an option, defined as 75% of responses grouped either in the “strongly agree and agree” or the “strongly disagree and disagree” categories (table 2). The second pattern was a spread of responses distributed over the range of options without any clear opinion forming. This was the case with statements 2, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 21, 22 and 23 (see appendix 1 for details of these statements).
There were five questions where students were asked to choose between two or three options. Again, if more than 75% of responses fell into one category we considered it a strong opinion (table 3); if no such trend was seen we considered it “no clear opinion” (table 4). The raw data and results are tabulated in appendix 2.
On the choice of “shared questions” versus “a national examination” as a tool to achieve the objectives of preparedness, consistency, confidence and accountability in assessment and a fair selection to foundation programmes, 45.4% were in favour of shared questions, 22.9% were in favour of a national examination, while 27.2% thought that these could be achieved by both.
Most (90.8%) of the responding medical students wanted to have a say in any proposed changes to the undergraduate medical assessment; 92.5% wanted a consistent assessment system between medical schools; and 82.8% agreed that the GMC should have a role in determining how medical students are selected for foundation training programmes.
If a national assessment were to be introduced, 84.3% of students were in favour of multiple regional examination centres, and 75.6% of respondents wanted the national assessment to be part of the university examination rather than a separate entity. With regard to the timing of this examination, 72.3% were in favour of holding it near the end of the undergraduate course rather than in the foundation years. Responding students were nearly equally divided between using the results of the national assessment as pass/fail criteria (51.1%) and for ranking and grading of students (45.4%). No clear consensus formed on any of the issues relating to shared questions, or on the remaining questions on national assessment (statements and detailed results are tabulated in the appendices).
We were able to divide the responses into two major groups: one where a clear opinion formed, and another where responses were evenly spread over all options. One thing which clearly emerged was that a majority (90.8%) of students wanted to have a say in the changes bound to affect their future assessment. The GMC has already identified that there are concerns about the need for consistency in outcomes between medical schools and between students.2 Similarly, in our survey a large proportion (92.5%) of respondents were in favour of consistency of assessment between medical schools. More than three quarters (82.8%) of the students agreed that the GMC should be involved in the selection process to foundation posts. Only 9.0% wanted the national assessment to be held on one site nationally, while 84.3% were in favour of holding the examination at multiple sites. More students were in favour of the national assessment as part of the university examination rather than as a separate test. There was a nearly equal divide between students for and against the national assessment being used as a tool for grading and ranking of students.
The GMC has identified a concern about the timing, cost and logistics of a national examination. It has been suggested that medical students are already struggling under a heavy examination load and that a national examination might be better placed at the end of F1. In contrast, 72.3% of our respondents were in favour of this examination being held towards the end of the undergraduate course as a prerequisite for graduation. On the issue of shared questions, no clear opinions were evident and the choices were spread over the range of options.
Some respondents emailed back with comments and suggestions; their views are invaluable. There was a great deal of anxiety about the fact that medical students are not compulsorily consulted in the run-up to new policy implementations. A system of selection for foundation jobs, based on quartile ranking of medical students, has already been put into place. In 2007 this was applied retrospectively. In this system, 40 of 85 points were based on the student's application, while the rest came from quartile ranking by individual medical schools. The method of allocation to quartiles was left to individual institutions.
Students particularly felt that a standardised national assessment is a good option as it would minimise the inconsistencies between examinations at different medical schools; 45.4% of respondents thought that this would lead to a fairer selection process for foundation jobs by allowing grading and ranking. One student suggested that a ranking system would drive teaching forward and reward those who made the effort to get into better medical schools. Another respondent's view was that homogenisation of the medical curriculum nationwide could help to standardise medical education and, in turn, assessment.
The majority of students surveyed were in favour of having a say in any changes to their future assessment. They agreed with the principle that there should be consistency between assessments at different medical schools and that their current results did not represent preparedness to practise. On the choice between a “national examination” and a system of “shared questions”, there was no clear majority in favour of either option. But if a system of national examination were to be introduced, the students clearly indicated, on some of the issues, the direction they wanted the GMC to take. The majority wanted this examination to be held at several regional sites towards the end of the undergraduate course, as part of the university examination rather than as a separate test.
It is fair to say that many issues remain unresolved, but students' opinion must not be ignored, and the information gathered here should contribute to the GMC's consultations before any policy changes are implemented.
We would like to acknowledge the help of all the medical schools and students who participated in the survey, and Oxford University Computer Services for their help with information technology issues.
1. Medical students should have a say in the changes planned for their future assessment.
2. The results you achieve locally now depend too much on the university assessing you rather than your fitness to practise or preparedness.
3. There should be an appropriate degree of consistency of assessment between one school and another.
4. The GMC should be concerned how students are selected in the foundation programmes.
4A. Objectives of preparedness, consistency, confidence and accountability in assessment and fair selection to foundation programmes can be better achieved by:
A: Shared questions
B: A national examination
5. National assessment should be used as a tool for fair selection in the foundation programme.
6. Success in national examination should be a condition for graduation.
7. Success in national examination should be a condition for provisional registration.
8. National assessment should be used for ranking of students.
9. Your skills and preparedness to practise can be assessed by any one national assessment.
10. National examination model should have a precise relationship with PLAB assuring equity in assessment of UK and international medical graduates.
11. National assessment could lead to teaching to and learning for the examination, especially when it could be used for ranking for foundation programmes.
12. If there is national assessment it should take the place of local assessment to prevent duplication.
13. National assessment should be restricted to the use of written questions.
14. If held during the foundation programme it would impact on decisions on acceptance to programmes of GP and other specialist training.
15. If held during foundation, this exam would impact adversely on acquisition of clinical skills, as doctors will focus too much on the examination.
16. Would you like this assessment to be conducted at:
A: One site
B: Several sites locally
17. This test should be taken as:
A: A separate test
B: Part of university exam
18. Results of this test should:
A: Pass/fail only
B: Allow grading and ranking
19. The timing should be:
A: Near the end of undergraduate course
B: Foundation year (condition for full registration)
20. Keeping in view the variability in medical school curricula it will be fair to have one bank of shared questions.
21. Shared questions nationally can be used to achieve the same goals as a national examination.
22. Shared questions and assessment tools can give wholly comparable results between medical schools.
23. Shared questions could be used for grading and ranking.
Competing interests: None.