Postgrad Med J. 2006 April; 82(966): 285–288.
PMCID: PMC2579636

Practical experience of using directly observed procedures, mini clinical evaluation examinations, and peer observation in pre‐registration house officer (FY1) trainees

Abstract

This paper describes an eight month experience with three of the four main assessment tools that will be used to validate the successful completion of the FY1 placement. The practical issues around the implementation of these new tools are of great concern to all involved in the management of postgraduate training, and it is hoped that this paper will contribute some of the practical elements so far unavailable from Department of Health sources.

Keywords: assessment, medical education, trainee doctors

In 2003 the four UK Departments of Health issued a joint statement setting out a requirement to reform the UK postgraduate medical training programme to meet the health needs of contemporary society.1 Further work has led to the policy document MMC The Next Steps2 that outlines reform of the first two postgraduate years into the foundation programme. The first year of the foundation programme (FY1) will replace the current pre‐registration house officer (PRHO) year and will be defined by the principles of “structure, assessment and quality assurance”2 underpinned by a strong educational infrastructure. On completion of the FY1, trainees should be able to demonstrate the competencies outlined by the General Medical Council (GMC)3 that are a mixture of generic attributes and procedural skills. To monitor the success of the foundation programme trainees, “robust assessment methods will be required to ensure standards have been met and that satisfactory progress can be demonstrated”.2

The specifics of style, structure, and educational content of foundation programmes have yet to be defined and “will emerge from pilots”.2 The MMC programme for FY1 started on 1 August 2005 and as yet there seems to be no published work on the official Department of Health (DoH) pilots. In this paper we describe our eight month experience with three of the four main assessment tools that will be used to validate the successful completion of the FY1 placement. This account of piloting the MMC assessment methodology is not intended to be a prospective research study but rather a trial to help prepare our trust for the mandatory implementation of MMC in 2005. The practical issues around the implementation of these new tools are of great concern to all involved in the management of postgraduate training, and it is hoped that this paper will contribute some of the practical elements so far unavailable from DoH sources.

Methods and location

This pilot took place at Whipps Cross Hospital University Trust (WXHUT), an acute site with 745 beds situated in north east London. The hospital hosts 27 PRHOs mainly provided by Queen Mary Westfield College, St Bartholomew's, and The London Medical College. Within WXHUT there is a well developed tradition of teaching and training of both undergraduate and postgraduate medical trainees, and a teaching faculty that includes consultants with sessional commitments (two per week) and two full time clinical skills facilitators (CSFs) with nursing backgrounds. The CSFs lead the assessment programme. We have followed a policy of competency assessment followed by educational support for our new trainees for some years and deem this pilot to be an extension of existing medical education practice within our trust.

PRHOs have historically been provided with a teaching programme that includes many of the generic competencies subsequently outlined in the MMC curriculum. The early phases of this programme include sessions on responsibility for lifelong learning, clinical governance, consent, and communication with patients with special needs. This classroom programme is supported by a practical programme of half day clinical skills teaching provided by CSFs and medical leads. We have adapted this educational framework for the MMC programme as required by the curriculum. In summary, the trainees within this pilot programme were probably much better supported educationally than many will be in MMC programmes currently run within the UK.

The strategy for implementation of the assessment programme was that it would be coordinated by the senior CSF and that three different approaches to locating assessors would be explored. In the first, a proportion of the assessments would be carried out by either the facilitator or a member of the medical faculty. This was to give the faculty team first hand experience of the assessment tools, provide a core group that could be trained to a standard in assessment methods, and reduce the assessment burden on the clinical staff who would normally supervise FY1 trainees. In the second, the task was allocated to the consultant educational supervisor, and in the third, the responsibility for finding an assessor rested with the trainees themselves. Training in assessment procedures consisted of a mixture of self taught research from existing sources combined with sessions from educational experts with some experience or knowledge in the field of assessment*.

All 27 of the PRHOs who were asked to participate in the project agreed. This was done by letter sent out before the start of their posts. In addition, they were all provided with a verbal explanation of the pilot assessment tools and written examples of each. It was explained to them that this pilot was to form the basis of the future MMC induction programme. Three assessment tools were selected for use on the basis of the assessment discussions that were taking place at the London Deanery in early 2004: directly observed procedures (DOPs), mini clinical evaluation examinations (mini‐CEXs), and multisource feedback. It is probable that the final recommendations for FY1 assessment will include four assessment tools; this pilot did not use the last of these, case based discussion. The principal aim of the study was to assess the feasibility and practicalities of large scale teaching and assessments using the three tools.
None of the assessment data were shared beyond the trainees and their assessors and no data were available to referees for future employment citations.

Directly observed procedures

DOPs are not designed to test the person but rather to provide an opportunity to ensure that a particular skill is performed correctly, according to agreed guidelines, using an agreed checklist. The procedures selected are from those outlined as core competencies in the GMC document Tomorrow's Doctors.4 The checklists were taken from those used at this centre for the past four years to assess competencies of incoming PRHOs. They were originally derived from a consensus list from experts in the field5 and have been refined with the experience of frequent use and feedback from both assessors and those assessed. On completion of their hospital induction programme all PRHOs were requested to contact the CSF to arrange assessment DOPs for venepuncture, intravenous cannulation, male catheterisation, and arterial blood gas sampling. The CSF carried a bleep and PRHOs arranged a mutually convenient time and place using this contact arrangement. After completion of the DOPs the PRHOs were asked to complete an evaluation form expressing their views on the DOPs process.

Mini clinical evaluation examinations

The mini‐CEX was originally developed in the USA by the American Board of Internal Medicine. It is designed to assess clinical skills, attitudes, and behaviours essential to providing high quality clinical care. The setting is that of a 15–20 minute snapshot of doctor/patient interaction. Data from the USA have shown it to be reliable and valid.6 The Royal College of Physicians has adopted it as part of its work evaluating tools for performance assessment.7 For this tool, the responsibility for assessment was allocated to the consultant educational supervisors, who were allowed in turn to delegate to a specialist registrar. The instructions for assessing the mini‐CEX were sent to all consultants by email and written letter. It was made clear that such an assessment should take no more than 20 minutes to complete. A training session was organised and all consultants were invited.

Multisource feedback

Multisource feedback, sometimes referred to as 360 degree appraisal, is already used extensively in industry8 and has also been used in medical practice.9 The department produced its own multisource feedback forms using the principles piloted elsewhere10,11 but emphasising simplicity and relevance to non‐medical healthcare workers and patients. Each PRHO was given two feedback forms and asked to choose a healthcare staff member of any professional background to complete one, and a patient to complete the other. Feedback form sources were anonymous. The forms were returned directly to the CSF for collation and were then fed back to the PRHO by the CSF.

Results

DOPs

Altogether 54 DOPs were completed by 27 PRHOs. Each individual DOP took less than 15 minutes. Fifty were observed by the CSF and the remainder by a doctor. Five PRHOs completed one DOP, 14 completed two, and seven completed three. Eighty per cent of DOPs were for cannulation and venepuncture, 13% for arterial blood gas sampling, and 7% for male bladder catheterisation. A number of practical issues arose in arranging DOPs sessions. Firstly, some procedures are not frequently required, so opportunities to observe the skill are difficult to find. When an opportunity did arise it was not always convenient for the assessor to make themselves available at short notice, and some procedures fell outside normal working hours, when assessors may not be present. For other procedures, for example venepuncture or cannulation, there were more opportunities available for observation at a planned time. It was appreciated at an early stage that the accident and emergency department was an area that permitted easy access to patients needing procedures as part of their medical management. Routine theatre lists provided a further opportunity where procedural opportunities could be anticipated in advance. When arrangements were made by PRHOs to meet with an assessor they did not always keep their appointment. In other instances PRHOs failed to make initial contact with an assessor to arrange a DOP. As participation in this programme was not mandatory, as it will be in the foundation programme proper, this placed the emphasis on the assessor to arrange a meeting, resulting in an additional time burden.

In the first working month, eight DOPs were completed but only two PRHOs achieved a level on the scoring sheet judged to be procedurally competent for a PRHO. This judgment was not based on a numerical score but on the weighting of certain points of the checklist, failure to achieve which led to an unsatisfactory sign off; examples include hand washing and sharps disposal. In the third month 13 DOPs were completed and all reached the score sheet competency level equivalent to that expected on completion of the PRHO year. During the first four to six weeks of the job the CSF often had to provide assistance to complete the procedure correctly, but this improved with time. Many of the PRHOs who did not achieve competency were noted to be hurrying, and initially some viewed the DOP as a test. This latter situation also improved with time and a distinct change in attitude was noted after a few weeks. PRHOs also took the opportunity provided by the 1:1 situation of a DOP to relate other concerns or issues; for example, recent sharps injuries were reported by three of the PRHOs. This prompted reminders of safe procedure to be circulated to all PRHOs. Several PRHOs made positive comments about the constructive feedback they received from the CSF and the benefit this had had on their own perception of competency. In five separate instances the CSF assessed a patient selected for a DOP to be sufficiently ill to require more senior intervention, and in some cases this highlighted deficiencies in basic care. Finally, it was noted by assessors that a minority of the PRHOs experienced low self esteem and poor confidence within their early weeks in post and that the 1:1 relationship with an assessor provided an opportunity for support.
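To make this sign off rule concrete, the short sketch below (in Python) shows one way a checklist with weighted, or critical, items can drive a pass/fail decision without any numerical score. It is a minimal illustration only: the item names, the "critical" flag, and the dop_sign_off function are assumptions made for the purpose of the example and are not the checklist or software actually used in this pilot.

# Minimal sketch, not the pilot's actual checklist.
# Assumption: certain checklist points (eg hand washing, sharps disposal)
# are "critical"; missing any of them forces an unsatisfactory sign off,
# and no numerical score is calculated.
from dataclasses import dataclass
from typing import List

@dataclass
class ChecklistItem:
    name: str
    achieved: bool
    critical: bool = False  # weighted point whose failure blocks sign off

def dop_sign_off(items: List[ChecklistItem]) -> str:
    """Return 'satisfactory' only if every critical item was achieved."""
    if all(item.achieved for item in items if item.critical):
        return "satisfactory"
    return "unsatisfactory"

# Hypothetical venepuncture DOP in which sharps disposal was missed.
venepuncture = [
    ChecklistItem("explains procedure and gains verbal consent", True),
    ChecklistItem("washes hands before and after", True, critical=True),
    ChecklistItem("disposes of sharps safely", False, critical=True),
]
print(dop_sign_off(venepuncture))  # prints: unsatisfactory

In this sketch a trainee could complete most of the checklist yet still receive an unsatisfactory outcome, mirroring the way failure on hand washing or sharps disposal led to an unsatisfactory sign off in the early DOPs described above.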

A feedback survey to gain the PRHOs' views on the DOPs process was performed at the completion of the pilot programme. Twenty five of the 27 PRHOs (93%) responded and their views are summarised in table 1.

Table 1 Perceptions of 25 of 26 PRHOs of the DOPs process as an educational tool

The box shows the main learning points relating to DOPs.

Mini‐CEX

It was thought appropriate to ask individual consultants to supervise mini‐CEXs themselves as this is the system envisaged in the foundation programme schemes. Consultants were allowed to delegate this role to their registrar. There was immediate resistance to the role of assessor by some consultants. Some felt they would not participate unless this role was made mandatory. Others were not prepared to take on extra work without financial reimbursement. While it had been made clear that mini‐CEXs should take about 20 minutes, including five minutes for discussion, there was feedback from assessors that this was not feasible using the marking form we adopted.

Only 10 (37%) of the PRHOs managed to go through this process. The reasons for this were mixed and were given as lack of time from both the assessor and the PRHO, inability to match up mutual free time, and difficulty in finding a suitable patient. Comments received from those who successfully completed this part of the assessments included difficulties in using the current format and running overtime. Careful patient selection was also noted to be important. It was agreed that the form should be simplified and that more work needed to be done to enlist the help of more consultants in the future.

Multisource feedback

Twenty five of 27 (93%) feedback forms completed by work colleagues were returned. The PRHOs chose a variety of people to comment, including SHOs, registrars, nursing staff, and pharmacists. It was expected that those completing the forms would have a reasonable awareness of the person concerned, but there was no specific training or guidance as to how to complete the form. Every form returned was complimentary and in some cases comments such as “excellent doctor” and “good to work with” were made. No negative comments were received. Non‐medical staff did volunteer that they felt unable to comment on areas of clinical practice, either because they had not sufficiently observed this for individual PRHOs or because they felt they did not have sufficient knowledge of the level required of a doctor of this grade. In addition, 21 of 26 (81%) feedback forms were received from patients and again every single form was complimentary. Forms that were not returned tended to be from patients who were discharged shortly after receiving the forms. No patient approached refused to participate.

Main learning points from carrying out DOPs

  • The DOP does not take long to complete if done correctly
  • The DOP is useful for ongoing educational and training purposes
  • Some DOPs are, for reasons of frequency of occurrence in normal practice, much easier to carry out than others
  • There is a lack of awareness of DOPs
  • Additional support is required for a minority of PRHOs

Summary

Despite the difficulties, our experience was that all three tools used in this pilot can be applied successfully. The DOPs have the advantage of a short time commitment for the procedural observation itself and a ready supply of practice opportunities for the commonly performed procedures. Nevertheless, the total time involved in advance booking of mutually convenient times and coordinating this with a procedural opportunity is much greater, and it may be that DOPs will not be possible over a wide range of skills unless they are programmed events. Some thought must be given to the coordination of assessor and trainee time and there are various possible solutions to this. Setting aside a regular observation period, for example Thursday afternoons, when a trained observer is available for DOPs is one solution, but if the onus remains with the trainee to arrange these events then the assessor is likely either to spend wasted time awaiting the call or to risk multiple simultaneous calls on their time. It seems that some form of locally organised timetabling of DOPs may be a more efficient option, and the opportunities that out of hours working provides must also not be neglected, with provision of assessors beyond office hours. All of these challenges are yet to be addressed.

Selecting patients can also be difficult and requires some research and preparation. Those who are too ill may be unsuitable, while those who are well may not require procedural interventions. Our experience suggests that A&E is a fertile ground for potential procedures that are clinically indicated. However, this area is also under pressure to meet waiting time targets and there is some potential for conflict between the teaching and the administrative imperatives. Finally, the patients themselves must be considered in all of this. No patient when approached refused to be part of a DOP, but this could occur. In some instances the attention paid to the clinical condition of the patient raised issues about their general medical care that were of concern. Perhaps “random” checks of this kind may also have some clinical benefit to patients. The issues of training the assessors and validating the assessment tools are not the focus of this paper but require urgent attention. Nationally agreed assessment protocols will address the validity and reliability issues, while training of assessors is another less easily solved issue.

The mini‐CEX content is more easily covered by the general patient population, but increasing specialisation within medicine may restrict some variety in subject choice. The learning points from our experience have been that, to adhere to the narrow time frame allocated, the assessment criteria must be limited and focused. More will need to be done to engage the non‐enthusiasts, who will have equal responsibility to perform assessments. The issue of additional payment or sessional time for consultants should be addressed as a matter of urgency through the job planning and appraisal processes. The multisource feedback was well received by appraisers and appraisees alike. Non‐medical healthcare workers did raise issues about their ability to confidently assess the clinical performance of medical colleagues in some areas, but were happy to contribute in areas where they felt confident, such as team working and communication skills. In this pilot all appraisers gave positive comments. This suggests, however, that an opportunity is being lost to provide constructive criticism.

It may be that appraisers find it difficult to give critical feedback, in which case appraisers too will require further training and the resources required suddenly become overwhelming. The lack of negative comments may also reflect the doctors' prerogative to select their own assessors; perhaps a more random or comprehensive sampling technique would reveal a more useful response. If there was one negative comment, it came from more senior medical staff who felt that the appraisal system was rather one sided if doctors were not also given the opportunity to appraise their allied healthcare and nursing colleagues. This pilot also looked at patient involvement in feedback and found that this too is possible, although completion rates were lower than for healthcare professionals.

One issue generic to all these sessions has been the value of even this brief 1:1 time between the trainee and a more senior figure. This opportunity was frequently used to express concerns and self doubts that could then be addressed, permitting support of the trainee that might not otherwise have been given. There seems little doubt also from our anecdotal experience that PRHOs appreciated the continuity of support received from the CSF nurse who organised and participated in much of the assessment.

When considering the three approaches to involving assessors, this pilot suggested that some trainees are slow to arrange sessions; that consultants need convincing that they must participate, and that this participation should be recognised by their employers; and that those most willing to involve themselves in assessment were the healthcare professionals who performed the peer assessment. From our own experience it is our intention to develop further a core faculty of assessors who will take on a proportion of the assessment burden, both teaching colleagues and adding a degree of trained reliability to the assessment system.

Conclusion

This observational study has shown that all three assessment tools can be implemented successfully within a hospital environment. There are a number of practical issues that may hinder or facilitate the implementation of such an assessment programme, and advance awareness of these should be to the advantage of those organising future programmes.

Contributors

AM is a consultant in anaesthetics and intensive care and lead clinician for intensive care, resuscitation services, and clinical skills. He has specific responsibility for medical input into the WXHUT pilot FY1 programme. JH is a registered nurse and manages the clinical skills department. She has been in this post for five years and is now leading the assessment process for the WXHUT FY1 programme. CMR is professor of medical education at St Bartholomew's, The Royal London and Queen Mary Westfield and director of medical education at WXHUT.

Abbreviations

PRHO - pre‐registration house officer

CSF - clinical skills facilitator

DOP - directly observed procedure

mini‐CEX - mini clinical evaluation examination

Footnotes

*Katharine Boursicot, senior lecturer in medical education, Clinical, Communication and Learning Skills Unit, St Bartholomew's and The London School of Medicine, Queen Mary, University of London.

Funding: none.

References

1. Department of Health and others. Modernising medical careers: the response of the four UK health ministers to the consultation on “Unfinished business: proposals for reform of the senior house officer grade”. London: HMSO, 2003.
2. Department of Health and others. Modernising medical careers: the next steps. London: HMSO, 2004.
3. GMC. The new doctor: recommendations on general clinical training. London: GMC, 2005.
4. GMC. Tomorrow's doctors. London: GMC, 2003.
5. Mallett J, Dougherty L, eds. Manual of clinical procedures. 5th ed. Oxford: Blackwell Science, 2000.
6. Norcini J, Blank L, Duffy D, et al. The mini‐CEX: a method for assessing clinical skills. Ann Intern Med 2003;138:476–81.
7. The Royal College of Physicians. Trainees Committee meeting. July 2004. http://www.rcplondon.ac.uk/
8. Bracken D. The handbook of multisource feedback. New Jersey: Pfeiffer, 2001.
9. Violato C. Multisource feedback: a method of assessing surgical practice. BMJ 2003;326:546–8.
10. Benjamin DR. A mini‐CEX pilot study and methods of assessment for senior house officers in medicine in the United Kingdom. London: Education Department, The Royal College of Physicians, 2004.
11. F2 curriculum sub‐committee. Core competencies for second foundation year, final version. London: Academy of Medical Royal Colleges, 2004.
