Br J Gen Pract. Jul 1, 2010; 60(576): e295–e304.
PMCID: PMC2894404
Primary Medical Care Provider Accreditation (PMCPA): pilot evaluation
Stephen M Campbell, PhD, Senior Research Fellow, Umesh Chauhan, PhD, Clinical Research Associate, and Helen Lester, MD, Professor of Primary Care
National Primary Care Research and Development Centre, University of Manchester, Manchester
Address for correspondence: Dr Stephen M Campbell, National Primary Care Research and Development Centre, University of Manchester, Williamson Building, Oxford Road, Manchester, M13 9PL. E-mail: stephen.campbell@manchester.ac.uk
Received July 29, 2009; Revised September 4, 2009; Accepted January 27, 2010.
Background
Although practice-level or team accreditation is not new to primary care in the UK, and the Quality and Outcomes Framework (QOF) includes an organisational domain, there is no universal system for accrediting the quality of organisational aspects of care in the UK.
Aim
To describe the development, content, and piloting of version 1 of the Primary Medical Care Provider Accreditation (PMCPA) scheme, which comprises 112 separate criteria across six domains: health inequalities and health promotion; provider management; premises, records, equipment, and medicines management; provider teams; learning organisation; and patient experience/involvement. The paper also presents results from the pilot service evaluation, focusing on achievement of the 30 core criteria and feedback from practice staff.
Design of study
Observational service evaluation using evidence uploaded onto an extranet system in support of 30 core summative pilot PMCPA accreditation criteria.
Setting
Thirty-six nationally representative practices across England, between June and December 2008.
Method
Study population: interviews with GPs, practice managers, nurses, and other relevant staff from the participating practices were conducted, audiotaped, transcribed, and analysed using a thematic approach. For each practice, the number of core criteria that had received either a 'good' or 'satisfactory' rating from an RCGP-trained assessment team was counted and expressed as a percentage.
Results
Thirty-two practices completed the scheme, with nine practices passing 100% of core criteria (range: 27–100%). There were no statistical differences in achievement between practices of different sizes and in different localities. Practice feedback highlighted seven key issues: (1) overall view of PMCPA; (2) the role of accreditation; (3) different motivations for taking part; (4) practice managers dominated the workload associated with implementing the scheme; (5) facilitators for implementation; (6) patient benefit — relevance of PMCPA to quality improvement; (7) recommendations for improving the scheme.
Conclusion
Version 1 of PMCPA has been piloted as a primary care accreditation scheme and shown to be relevant to different types of practice. The scheme is undergoing revision in accordance with the findings from the pilot and ongoing consultation.
Keywords: accreditation, primary care, quality of health care
There are over 8500 general practices in England, all providing primary medical care services. However, there is no universal system of accreditation of the quality of organisational aspects of care and no contractual levers to promote organisational quality beyond the voluntary indicators within the Quality and Outcomes Framework (QOF) organisational domain, which had a mean practice achievement rate in 2007/2008 of 94.5%.1
Patients have the right to expect and receive high-quality care. Structural features of primary care organisation do not guarantee the provision of safe, high-quality care; however, they can influence the processes of clinical care.2–4 For example, if the necessary equipment or skills are not available, this can affect patient safety; if there are no available appointments, this will influence patient access to care.
Accreditation is a model of external assessment of healthcare providers that is being used increasingly worldwide.5–7 Practice-level or team accreditation is not a new concept in UK primary care, with previous initiatives such as Quality Team Development, the Quality Practice Award, and practice team accreditation.8,9 These schemes have all been voluntary programmes using standards set above the expected minimum. Recent UK government policy documents such as the Next Stage Review have focused on organisational issues in primary care and expressed support for provider accreditation schemes that focus on measures to ensure continuous improvement in the quality of primary and community care.10
In 2008, an accreditation scheme called Primary Medical Care Provider Accreditation (PMCPA), focusing on organisational issues of primary care, was developed and piloted by the authors, in conjunction with the Royal College of General Practitioners (RCGP). The aims of the pilot were to evaluate the experiences of a representative sample of general practices across England in implementing PMCPA, and to analyse the uptake and achievement of the core criteria in each of the domains. This paper focuses on the content and concept of PMCPA, how PMCPA was developed and piloted, and the uptake and achievement of the core criteria within the scheme. It does not describe how it might be implemented since this is the subject of ongoing discussion.
Development of the accreditation scheme
PMCPA focuses on organisational rather than clinical issues, although criteria that are organisational but support clinical care were included. It was developed using the methodology and criteria of the RCGP Quality Team Development scheme8 as a starting point, supplemented by criteria from other international accreditation schemes in primary care, including the European Practice Assessment scheme11–13 and the New Zealand practice accreditation scheme.14 Criteria were also derived from recent primary care policy documents, including the RCGP Road Map15 and Standards for Better Health.16 This enabled the criteria to reflect both currently accepted good practice and key issues for the future development of quality primary care. Six separate quality domains were identified on this evidence base (Table 1).
Table 1. PMCPA domains and number of criteria per domain.
Each criterion was allocated to an appropriate domain, reviewed by an accreditation development group, and divided into those deemed summative (yes/no) and those formative/developmental in nature. Summative assessment represents a judgement of 'pass or fail', which must be underpinned by a clear idea of what constitutes acceptable performance when judged by external assessors. In comparison, formative or educational assessment is an internal reflective process to promote improvement and attainment of a goal, judged as evidence of improvement rather than a simple pass or fail. Each criterion was underpinned by guidance, provided to participants, about the evidence necessary to meet it.
Each domain was created to include a balance of core summative and developmental formative criteria. Indicators in the organisational component of the QOF were excluded, as were criteria that covered legal and contractual requirements in version 1 of PMCPA. The overall framework of PMCPA including the six domains and the number of core (n = 30) and developmental (n = 82) criteria is shown in Table 1. The full set of core criteria and associated evidence is shown in Appendix 1. This paper describes the results from the service evaluation, focused on the 30 core criteria for which data were collected in all participating practices.
Prior to participation in PMCPA, practices were asked to sign a declaration that they complied with a list of statutory, contractual, and legal requirements. Practices participating in the pilot were then asked to self-assess against all 30 core criteria and three or four randomly selected developmental criteria, uploading written documentation in support of each criterion using a web-based extranet system created for the pilot. This also enabled documents to be time-stamped on upload. The extranet suite also had a facility for staff to communicate directly with staff in other participating practices in their primary care trust (PCT).
After the pilot period, practices were visited by one of the three authors and asked to describe their experience and views of PMCPA as part of the service evaluation. Eighty-three practice staff took part in these feedback sessions (31 GPs, 33 practice managers, and 13 practice nurses, as well as six other staff such as information technology/audit personnel).
Sample and method
The study aimed to recruit a total of 40 practices — 10 each in four purposively chosen PCTs, selected to provide a mix of rural/urban and deprived/affluent areas. In each PCT area, a stratified random sample was selected to be representative of all practices in the PCT in terms of deprivation, practice list size, and 2007 QOF score.
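For illustration only, a stratified within-PCT draw of this kind could be sketched in Stata, the package used for the analyses reported below; the dataset and variable names (pct, imd_band, list_band, qof_band) are assumptions, not the study's actual code:

    * Hypothetical sketch of a stratified random draw; variable names are assumed
    set seed 20080501
    egen stratum = group(pct imd_band list_band qof_band)  // one sampling cell per combination
    sample 1, count by(stratum)                            // keep one randomly chosen practice per cell

Replacing decliners with practices drawn from the same cell, as described below, would then preserve the profile of the sample.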
If a practice in a particular sampling cell (that is, a small practice in a deprived area) declined to participate, a practice with a similar profile from the same cell was invited to take part. A criteria and information pack was distributed to each participating practice in May 2008, detailing the scheme, the evidence required, and guidance for meeting the core and developmental criteria, as well as user instructions for the extranet site. Practices were allocated 15 weeks, from 15 June to 30 September 2008, to prepare the self-assessment report and submit written evidence via the web-based extranet site.
An RCGP-trained assessment team, consisting of a clinician, a practice manager, and a lay assessor, was appointed by the respective PCT for each practice. The assessment team reviewed the relevant evidence prior to the visits, which took place between October and December 2008. RCGP assessor training included calibration to help ensure validity, reliability, repeatability, and equity of assessment. At each practice, each of the 30 core criteria was rated as 'good' or 'satisfactory', representing a pass of a summative criterion, or 'borderline' or 'unsatisfactory', representing a fail.
Analyses
For each practice, the number of core criteria (out of all 30) that received either a 'good' or 'satisfactory' rating was counted and expressed as a percentage of the number of criteria that were attempted/applied to that practice.
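As a minimal sketch of this calculation (assuming one row per practice and indicator variables pass_cc1 to pass_cc30, coded 1 for a pass, 0 for a fail, and missing where a criterion was not rated or not applicable):

    * Hypothetical variable names; missing (.) marks criteria not rated/applicable
    egen n_passed = rowtotal(pass_cc*)     // missing values contribute 0 to the total
    egen n_rated  = rownonmiss(pass_cc*)   // denominator: criteria actually rated
    generate score = 100 * n_passed / n_rated

This keeps the denominator practice specific, matching the handling of non-applicable criteria described in the results.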
Practices were classed as either 'single-handed' or 'group' practices; single-handed practices were defined as those with no more than 1.5 full-time-equivalent GPs. In addition, to examine the effect of list size, practices were divided into three roughly equal groups: fewer than 4000 patients (11 practices); 4000 to 6999 (13 practices); and 7000 or more (12 practices). Index of Multiple Deprivation (IMD) scores were used to examine the impact of location on practice scores.
Since the distribution of practice total core criteria scores was highly skewed and the sample size fairly small, the data were analysed using non-parametric statistical methods. Median scores were used as the measure of central tendency, rather than means. Relationships between total criteria scores and the categories of practice type and size were investigated using quantile regression, to allow examination of relationships with and without control for IMD scores as a covariate. Regression coefficients were tested for statistical significance using a non-parametric method based on 1000 bootstrap replications. All analyses were undertaken in Stata (version 10.1).
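A hedged sketch of these analyses in Stata (the authors report version 10.1) might look as follows; score, sizegroup, singlehanded, and imd are assumed variable names, and bsqreg's bootstrap standard errors stand in for the bootstrap-based significance testing described above:

    * Median (quantile) regression of total score on practice size group,
    * bootstrap standard errors with 1000 replications, then adding IMD as a covariate
    xi: bsqreg score i.sizegroup, quantile(.5) reps(1000)
    xi: bsqreg score i.sizegroup imd, quantile(.5) reps(1000)
    * Mann-Whitney U test of IMD score by practice type (single-handed versus group)
    ranksum imd, by(singlehanded)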
Key concepts were identified from the feedback session notes and summarised for each practice using an open coding method. Codes with common elements were merged to form categories, which are reported in this paper.
Results
A total of 80 practices were invited to take part, of which 36 were recruited (45%): 10 in Haringey, London (of 21 invited: 48%); eight in Nottinghamshire (18: 44%); nine in Oldham (19: 47%); and nine in Warwickshire (22: 41%). The characteristics of these 36 practices compared with all practices in England are shown in Table 2.
Table 2. Overall characteristics and representativeness of the PMCPA sample (n = 36 practices) compared to all England.
Four practices withdrew during the piloting process and the remaining 32 practices fully completed the pilot (eight in Haringey; eight in Nottinghamshire; nine in Oldham; and seven in Warwickshire). Thirty of these 32 practices received an assessment visit from their PCT-appointed, three-person team; two were unable to complete assessment visits due to difficulties timetabling a visit with the assessors. There were therefore 30 practices with an assessment rating for each criterion.
Thirty-four practices (including the two that later withdrew) uploaded 1817 documents onto the extranet site (mean 53.4, range 1–165). Over 80% of documents were uploaded in the last month of the pilot, with most practices uploading documents during the last week and some in the last 24 hours, although it is known anecdotally that this did not necessarily reflect when the majority of the work took place.
Overall achievement
At most practices, all 30 criteria were deemed applicable by assessors and assigned a rating. Where all 30 criteria applied and were rated, eight practices scored 30/30, using a rating of 'good' or 'satisfactory' as a pass. Three of these were small practices (list size <4000 patients), two were medium-sized practices (list size 4000–7999 patients), and three were larger practices (list size ≥8000 patients). Of the practices passing 30/30 core criteria, two were in areas with a low deprivation index, five in areas with a medium deprivation index, and one in an area of high deprivation.
In some practices, certain criteria were not given a rating by assessors; for example, core criterion 9 relates to minor surgery and four practices stated that they did not provide a minor surgery service. As such, the denominator for ascribing an overall score to a practice was not always 30. One additional practice scored 28/28, meaning that nine practices (30%) passed 100% of the core criteria for which they were given a rating by their assessors. On this basis, practices passed between 27% and 100% of the core criteria for which they received an assessment rating. Only two practices scored ≤65%, while half of all practices scored ≥90% within the 15-week timeframe.
The frequency with which each of the 30 core criteria was rated good/satisfactory or borderline/unsatisfactory/not applicable across the evaluation sample of practices is shown in Appendix 1.
Only one core criterion relating to confidentiality (CC19) was passed as ‘good’ or ‘satisfactory’ by all participating practices.
Achievement by practice characteristics
Across the 30 core criteria, single-handed practices had a lower median pass (good or satisfactory) score (73.3%) than group practices (93.3%); however, this difference was not statistically significant (P = 0.13). Single-handed practices were more likely to be located in areas with a higher IMD score (Mann–Whitney U test, P<0.01), but adjusting for IMD did not alter this (lack of) relationship with total core criteria scores (P = 0.17). Median total criteria scores in relation to practice size were 73.3% for smaller practices, 96.6% for medium-sized practices, and 86.7% for larger practices. Although the median score was highest for medium-sized practices, under quantile regression the differences between the groups were not significant (P = 0.11) and remained non-significant after adjustment for IMD scores (P = 0.11). Figure 1 shows the individual practice total core criteria pass scores by practice deprivation for single-handed and group practices.
Figure 1. Individual practice total core criteria pass scores by practice IMD scores for single-handed and group practices.
Practice feedback
Practice feedback highlighted seven key issues: (1) overall view of PMCPA; (2) the role of accreditation; (3) different motivations for taking part; (4) practice managers dominated the workload associated with implementing the scheme; (5) facilitators for implementation; (6) patient benefit — relevance of PMCPA to quality improvement; and (7) recommendations for improving the scheme.
Overall view of PMCPA
All practices felt that PMCPA was relevant to and aligned with family practice priorities, reflected quality in primary care, and was a worthwhile use of practice time. PMCPA was seen as promoting improvements in organisational standards. The main risk identified, however, was that externally imposed standards could be treated as a 'tick-box' exercise, with organisations seeking to meet the target without necessarily reflecting on how the issues contained within the standard affect their own setting. Most practices, though, saw PMCPA not as a tick-box exercise but as an opportunity to change and benefit the practice.
Many managers and doctors, however, felt that the scheme was too heavily populated by criteria where the evidence for external assessment focused on demonstrating the existence of a protocol or procedure. These health professionals stated that PMCPA should focus more on evidence of change that benefits the practice.
The role of accreditation
There was general consensus about the value of accreditation, but different interpretations of its role. Some practices fed back that the role of accreditation is to show adherence to an acceptable standard in terms of compliance or conformance with an accepted set of standards. Others explicitly emphasised focusing on formative practice-specific quality improvement as a reflective exercise rather than a box-ticking exercise. The RCGP was, however, seen as the arbiter of professional standards, and their leadership of the scheme was seen as key by almost all practices.
Motivations for taking part
Practices had different reasons for taking part in the accreditation pilot. Most often this was described as a team-development exercise focused on practice quality improvement, but practices also took part to address a perceived problem, such as concerns about current standards or deficiencies in team working, or to demonstrate how good they perceived their practice to be.
Workload
Practice managers carried out 90–95% of the actual workload. GPs tended to make the initial decision to be involved and then took a largely hands-off approach, being used as a 'checking-in point' by most managers. The workload was higher than expected in most practices, although almost all practices emphasised that this reflected the 15-week duration of the pilot, its timing (over the summer months), and the fact that it was implemented when many practices were busy with other concurrent demands such as the information management and technology directed enhanced service.
Facilitators for implementation
While most of the workload was undertaken by managers within practices, both doctor and team engagement at some level were critical to success, and those practices that did not complete PMCPA were usually those where only the manager was truly engaged in the scheme. A combination of team work, designated roles and responsibilities within the team, protected time to work on PMCPA, engagement and ownership by all team members, leadership (from a manager and/or doctor), and doctor buy-in were crucial factors in implementing PMCPA as a learning organisation.
Patient benefit — relevance of PMCPA to quality improvement
Many participants explained how using PMCPA as a learning exercise within the practice, focusing on internal reflective quality-improvement strategies as well as on the targets for external assessment, led to direct patient benefits relevant to their practice population. In many practices, staff explained how having to write practice protocols in order to meet the core criteria formalised tacit knowledge. Other examples included facilitating changes in the structures and processes of care that improved patient safety, made services more responsive to the needs of patients, and gave care a greater community focus. A minority of managers and doctors contextualised the role of schemes like PMCPA from the perspective of 'looking through the eyes of patients'.
Examples of how practices used PMCPA to make their practice more responsive to the needs of their patients included thinking about how to work better with local interpreting services, creating a histology safety net to improve the recall system after minor surgery, training healthcare assistants to understand abnormal results picked up in new patient checks, creating fail-safe systems for non-collection of prescriptions, and a variety of initiatives to improve information giving to and consultation with patients.
Recommendations for improving the scheme
Practice feedback on how to improve PMCPA emphasised 2–3 years as a realistic timeframe for completing the scheme and a focus on using evidence of implementation and learning rather than the simple presence of a written protocol. The experiences of piloting PMCPA and listening to feedback in practices of different sizes clearly showed the need to make PMCPA, and the evidence that is presented to meet standards, practice specific and flexible. For example, evidence used to pass a criterion might not be the same in every practice. Many participants also recommended that PMCPA needs to have a greater emphasis on patient benefit and patient responsiveness.
Discussion
Summary of main findings
PMCPA has been developed as a criteria- and evidence-based accreditation scheme based on the RCGP Quality Team Development model and other international accreditation schemes. It is a scheme that seeks to be applicable equally to all providers working in primary care. Within a representative sample of practices, 30% of participating general practices passed 100% of the core criteria and half of all practices scored 90% or above, despite the short 15-week pilot period. There were no statistical differences in achievement levels between practices of different sizes. All participating practices fed back that PMCPA was relevant to and aligned with primary care priorities.
Strengths and limitations of the study
The strengths of this pilot include the representative nature of the sample of practices. Despite the short time period for piloting, 32 of the 36 recruited practices completed the scheme. However, the sample size of participating practices was relatively small, hence only fairly substantial differences between different kinds of practices would have been detectable. This paper has also only reported the summative components of PMCPA and not the formative aspects of continuous quality improvement inherent within the developmental criteria, because data were not collected in all practices for these criteria.
Comparison with existing literature
While accreditation has acquired different meanings in different healthcare settings,17 it nevertheless represents official recognition, acceptance, or approval of demonstrable compliance against set standards.18 There are five ways in which assessment against set standards might be used.6 These are quality control (mandatory, externally set, minimum predetermined acceptable standards), mandatory regulation (legal or safety standards), continuous quality improvement (to show ongoing excellence above a minimum standard), information giving (to enable comparison between providers by patients and policy makers), and marketing (to showcase a standard of service available). A complex picture emerges from reviews of healthcare accreditation schemes worldwide but two key features are commonplace — promoting change, and professional development.5 The piloted version of PMCPA incorporates both quality control with summative assessment and quality improvement with practice-specific formative assessment. Scrivens recommended over a decade ago that accreditation schemes need to move away from just external peer review and incorporate continuous quality improvement.19 It is increasingly recognised worldwide that a quality-improvement strategy at a primary care practice level needs a balance between quality assurance/regulation (summative) and educational (formative internal improvement) approaches.8,14,20
PMCPA echoes previous UK quality initiatives in which primary healthcare teams generated quality improvement when they had the time and resources to learn, work, and plan together, with clear objectives and under the leadership of a manager or doctor.21
Implications for clinical practice and future research
There are concerns about the effectiveness and appropriateness of practice accreditation, despite mounting international recognition of its importance.6,17 In the UK context, where a number of different quality-improvement initiatives are currently in development, the white paper Our Health, Our Care, Our Say advocated coalescing current administrative requirements on primary care into one scheme, thereby reducing overall burden.22 It will also be important that accreditation is not just a paper exercise but leads to practice-specific and relevant quality improvement and patient benefit. There is a need for a mix of professional, clinical, managerial, and financial approaches to quality improvement.23 In the UK there is an increasing plurality of primary care providers, from 'traditional' general practices to general or personal medical services (PMS), PMS plus, out-of-hours providers, and a range of private provider models. A primary care provider accreditation scheme will need to be responsive to these different modes of delivery.
Quality team development in primary care was found to have positive benefits for practices but participants tended to be a self-selecting innovative minority of practices.8 The challenge now is how best to roll out PMCPA so that it is engaged with by the majority of practices. If PMCPA is to be implemented in a meaningful way, it will need to align and integrate with other ongoing and planned quality-improvement strategies in the UK. In particular, to avoid duplication of effort and data gathering, PMCPA will need to support the registration requirements of the Care Quality Commission, revalidation of doctors, and revisions of the QOF.
PMCPA is currently being revised in light of the findings from the pilot and to ensure the scheme is integrated with and aligned to other national initiatives. However, the findings from this pilot suggest that PMCPA was a positive experience for the participating practices and that practice size and locality do not affect engagement or achievement.
Acknowledgments
The authors would like to thank Dr Janet Hall (Accreditation Development Group), Dr Meeta Kathoria, Dr Evangelos Kontopantelis, Dr David Reeves, Professor Nigel Sparrow, Rehema Shabaya, Dr Bill Taylor (Accreditation Development Group), Caroline Turnbull, and the practice staff in all participating practices, as well as all members of the PMCPA Stakeholder Committee.
Appendix 1. PMCPA core criteria and participating practices' pass rates (n = 30)
For each core criterion below, the evidence required for submission is listed, followed by the number (%) of practices that passed (rated good or satisfactory) and the number (%) that did not achieve the criterion (rated borderline/unsatisfactory, or not attempted).

Health inequalities and health promotion

CC1. New patients are offered a consultation to ascertain details of their past medical and family histories, social factors including occupation and lifestyle, medications, and measurements of risk factors (for example, smoking, alcohol intake, blood pressure, height, weight, and body mass index). Such consultations, suitably adapted, should be offered to newly registered children (to support delivery of the Child Health Promotion Programme).
Evidence required: provider-designed check; evidence of implementation of checks.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).

CC2. The provider has a system in place to collect information on the risk factors particular to their provider population.
Evidence required: written policy regarding the recording of factors that put patients' health at risk, including stating where this information is recorded in case records.
Passed: 29 (96.7%); did not achieve: 1 (3.3%).

Provider management

CC3. The provider operates a system to ensure that a named healthcare professional can be contacted promptly in the case of emergency.
Evidence required: written evidence that the provider operates a system to ensure that a named, appropriately qualified, healthcare professional can be contacted promptly in the case of emergency. This should include a rota identifying the healthcare professional who is available to deal with an emergency. The professional could be a doctor or nurse, or in certain circumstances the ambulance service. A flow chart or decision-making tool should make the system clear to all team members and should give information on what to do if any mobile communication is not responding.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).

CC4. The provider has a written policy for informing patients or, where appropriate, families and carers, of the results of investigations, and the policy is explained to them.
Evidence required: written policy including how the patient can obtain the results of investigations carried out by the provider; for example, patient information leaflet, website (hyperlink may be submitted as evidence), or notice on wall. This information is up to date, revised when changes occur, and reviewed every 12 months. The policy may state that the provider contacts the patient or vice versa. The time and method of contact should be stated. If the patient is contacting the provider, it should be clear who is likely to give out the result. It needs to be clear how patients are made aware of the contents of the policy when an investigation is carried out. These instructions about the policy may be verbal or written.
Passed: 23 (76.7%); did not achieve: 7 (23.3%).

CC5. The provider operates a policy regarding the management of patient care following discharge from hospital, which includes reviewing any amendments to medication.
Evidence required: written policy covering all patients including children and young people; evidence of implementation and follow-up.
Passed: 25 (83.3%); did not achieve: 5 (16.7%).

CC6. Non-collection of prescriptions held by the provider is monitored and followed up by the provider.
Evidence required: written policy, checked by a clinician; evidence of implementation and follow-up.
Passed: 25 (83.3%); did not achieve: 5 (16.7%).

CC7. An arrangement exists for private discussion between patients and non-clinical team members.
Evidence required: written description of arrangement; inspection of premises.
Passed: 29 (96.7%); did not achieve: 1 (3.3%).

CC8. The confidentiality of patient data is respected by the whole team.
Evidence required: (1) written provider policy on patient confidentiality; (2) written description of storage of patient medical records, including evidence that medical records are not stored or left visible in areas where members of the public have unrestricted access; inspection of premises; (3) levels of access to computerised records — password protection for different team members; (4) written policy for shredding any patient details/letters once they have been scanned/dealt with; (5) asking staff how they maintain confidentiality.
Passed: 29 (96.7%); did not achieve: 1 (3.3%).

Premises, equipment, records, and medicines management

CC9. The provider keeps a record or log of their minor operations which has the following information recorded: (1) date; (2) patient name; (3) procedure performed; (4) whether a specimen was sent for histology; (5) patient consent; (6) complications; (7) patient informed of result.
Evidence required: minor surgery template or written policy; informed consent; evidence of implementation and follow-up.
Passed: 24 (80%); did not achieve: 6 (20%).

CC10. The provider will appraise the premises, listing the strengths and weaknesses of the current arrangements and changes they would like to make to improve the working environment, including safety and patient care.
Evidence required: an annual written provider premises improvement plan listing the strengths and weaknesses of the current arrangements and the changes they would like to make to improve the working environment and patient care, including access for specific groups (that is, disabled access), with set objectives and methods; evidence of implementation and follow-up.
Passed: 18 (60%); did not achieve: 12 (40%).

CC11. The provider will appraise all medical equipment to ensure that it is up to date, listing the strengths and weaknesses of the current arrangements and changes they would like to make.
Evidence required: an up-to-date inventory detailing which items of basic medical equipment must always be based on site for that provider (as justified by the provider); all equipment is up to date and present, or revised/renewed when changes occur and reviewed every 12 months; evidence of implementation and follow-up.
Passed: 25 (83.3%); did not achieve: 5 (16.7%).

CC12. The premises are clean, temperature regulated, well lit, and well maintained.
Evidence required: patient user group; evidence of implementation and follow-up of any quality deficits.
Passed: 19 (63.3%); did not achieve: 11 (36.7%).

CC13. All drugs in the emergency bag(s) and stock room are within expiry date.
Evidence required: written policy on supplying and checking all emergency bags and stock room drugs, including controlled drugs; evidence of implementation and follow-up.
Passed: 27 (90%); did not achieve: 3 (10%).

Provider teams

CC14. All first-contact team members have been trained to recognise and respond appropriately to urgent medical matters.
Evidence required: written evidence that all first-contact team members have received training in the last 12 months or at induction; review of training materials.
Passed: 20 (66.7%); did not achieve: 10 (33.3%).

CC15. A first-contact team member trained to recognise and respond appropriately for basic life support is always available.
Evidence required: written monthly team member rota showing that there is always at least one person on the premises competent in basic life support who has attended training/updating in basic life support skills in the preceding 18 months.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).

CC16. The provider will ensure that all team members employed by the provider are competent and have appropriate qualifications and training, and that all health professionals are currently registered with the relevant regulatory body on the appropriate part(s) of its register(s).
Evidence required: evidence of qualifications and references; instructions on how registration is checked.
Passed: 24 (80%); did not achieve: 6 (20%).

CC17. All members of the team are suitably trained and only carry out treatments that are within their competence.
Evidence required: written description of the system(s) available to healthcare team members for demonstrating and maintaining professional competence, for example appraisal, clinical supervision, preceptorship, or reflection on practice; team members asked about their experience.
Passed: 26 (86.7%); did not achieve: 4 (13.3%).

CC18. All team members have training at induction and are refreshed 3-yearly on the principles of the Data Protection Act.
Evidence required: copy of written team member induction policy; written evidence of team member training dates, either as part of appraisal or a log of team member training dates; team members asked about their experience.
Passed: 20 (66.7%); did not achieve: 10 (33.3%).

CC19. All provider team members maintain patient confidentiality at all times and have signed a confidentiality agreement.
Evidence required: (1) written provider policy on patient confidentiality; (2) confidentiality clause in all contracts of employment; (3) all team members have signed a confidentiality agreement/clause; (4) evidence that patient confidentiality forms part of the induction training; (5) question team members about confidentiality.
Passed: 30 (100%); did not achieve: 0 (0%).

CC20. There is a written policy for ensuring team members offering new patient checks are trained in recognising actionable findings and taking relevant action.
Evidence required: evidence of written policy, set objectives, and implementation and follow-up.
Passed: 22 (73.3%); did not achieve: 8 (26.7%).

Learning organisation

CC21. The team identifies possible health and safety risks to team members, takes steps to minimise them, and has policies in place for responding when/if adverse events occur.
Evidence required: written policy of latest team member risk assessment and actions taken in response; evidence of implementation and follow-up.
Passed: 20 (66.7%); did not achieve: 10 (33.3%).

CC22. The team identifies possible health and safety risks to patients, takes steps to minimise them, and has policies in place for responding when adverse events occur.
Evidence required: written policy of latest patient risk assessment and actions taken in response; evidence of implementation and follow-up.
Passed: 20 (66.7%); did not achieve: 10 (33.3%).

CC23. The provider operates a policy to identify and learn from all patient safety incidents and significant events, and to share learning points with all team members and also outside agencies who were stakeholders in the event.
Evidence required: three examples of events (significant event, yellow card report, drug alert) for which there is evidence of shared learning (for example, with a pharmacist); evidence of implementation, such as a case discussion with learning points, and follow-up.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).

CC24. The provider produces an annual development plan that contains clear objectives and timescales and takes account of their local delivery plan.
Evidence required: two to three written objectives for quality improvement set by the provider, for which there is evidence (including evidence taken from a range of patients using the provider) of a pre-existing quality deficit.
Passed: 19 (63.3%); did not achieve: 11 (36.7%).

CC25. The provider has a written quality-improvement strategy for clinical governance which enables quality assurance of its services and promotes quality improvement and enhanced patient safety.
Evidence required: at least quarterly meetings about quality improvement; written agenda and minutes; set objectives; named people responsible for meeting objectives; evidence that objectives are met.
Passed: 23 (76.7%); did not achieve: 7 (23.3%).

CC26. The team works with other agencies, groups, and the community to help improve local public health, prevent disease, and promote the health of their patients.
Evidence required: description of two examples annually where the provider has worked with others to improve local public health, prevent disease, or promote the health of their patients.
Passed: 22 (73.3%); did not achieve: 8 (26.7%).

Patient experience and involvement

CC27. Patients are informed of the arrangements for care outside normal contractual opening hours.
Evidence required: (1) written policy displayed in the provider premises and covered in detail in the patient information leaflet or website (hyperlink may be submitted as evidence), with information provided in a format that is accessible to all patients; (2) up to date, revised when changes occur, and reviewed every 12 months; evidence of implementation and follow-up.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).

CC28. Interpersonal continuity of patient care is made a priority when booking appointments, or patients are asked who their usual doctor is and offered an appointment with that doctor whenever possible.
Evidence required: written policy aligned to written evidence of team member training (that is, team member training log or annual appraisal); evidence of implementation.
Passed: 24 (80%); did not achieve: 6 (20%).

CC29. Relevant information is provided to patients about provider opening hours and appointment availability standards.
Evidence required: (1) patient information leaflet, website, or notice on wall; (2) up to date; evidence of implementation and follow-up.
Passed: 25 (83.3%); did not achieve: 5 (16.7%).

CC30. Relevant information is provided to patients about arrangements for contacting team members directly.
Evidence required: patient information leaflet, website (hyperlink may be submitted as evidence), or notice on wall; information includes opening hours, telephone access hours, rota, flow chart, or decision-making tool; evidence of implementation and follow-up.
Passed: 28 (93.3%); did not achieve: 2 (6.7%).
Notes
Funding body
This work was done at the National Primary Care Research and Development Centre, which receives funding from the Department of Health. It was also supported by the Royal College of General Practitioners. The views expressed are those of the authors and not necessarily those of the Department of Health or the RCGP.
Ethics committee
North West Research Ethics Committee.
Competing interests
The authors have stated that there are none.
Discuss this article
Contribute and read comments about this article on the Discussion Forum: http://www.rcgp.org.uk/bjgp-discuss
1. NHS Information Centre. QOF domain summary. Quality and Outcomes Framework (QOF) for April 2007–March 2008, England. http://www.ic.nhs.uk/webfiles/QOF/2007-08/NewFilesGS/National%20QOF%20tables%202007-08%20-%20domain%20summary.xls (accessed 5 Feb 2010).
2. Donabedian A. Explorations in quality assessment and monitoring. Volume 1: the definition of quality and approaches to its assessment. Ann Arbor, MI: Health Administration Press; 1980.
3. Campbell SM, Roland MO, Buetow S. Defining quality of care. Soc Sci Med. 2000;51(11):1611–1625.
4. Starfield B. Primary care: concept, evaluation and policy. New York, NY: Oxford University Press; 1992.
5. Greenfield D, Braithwaite J. Health sector accreditation research: a systematic review. Int J Qual Health Care. 2008;20(3):172–183.
6. Buetow SA, Wellingham J. Accreditation of general practices: challenges and lessons. Qual Saf Health Care. 2003;12(2):129–135.
7. Shaw C. The external assessment of health services. World Hosp Health Serv. 2004;40(1):24–27.
8. Macfarlane F, Greenhalgh T, Schofield T, Desombre T. RCGP Quality Team Development programme: an illuminative evaluation. Qual Saf Health Care. 2004;13(5):356–362.
9. Walshe K, Walsh N. Accreditation in primary care. In: Walshe K, Walsh N, Schofield T, et al., editors. Accreditation in primary care: towards clinical governance. Oxford: Radcliffe Medical Press; 2000. pp. 1–15.
10. Department of Health. High quality care for all. NHS Next Stage Review final report. London: Department of Health; 2008. http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_085825 (accessed 5 Feb 2010).
11. Engels Y, Campbell S, Dautzenberg M, et al. Developing a framework of, and quality indicators for, general practice management in Europe. Fam Pract. 2005;22(2):215–222.
12. Topas Europe. Easy to use and scientifically developed quality management for general practice. http://www.topaseurope.eu/files/EPA-Information-Paper-English-vs11_0.pdf (accessed 6 Feb 2010).
13. Engels Y, Dautzenberg M, Campbell SM, et al. Testing a European set of indicators for the evaluation of the management of primary care practices. Fam Pract. 2006;23(1):137–147.
14. The Royal New Zealand College of General Practitioners. Aiming for excellence: an assessment tool for New Zealand general practice. 3rd edn. New Zealand: RNZCGP; 2009. http://www.rnzcgp.org.nz/assets/Uploads/qualityprac/Aiming-for-Excellence-2009-including-record-review.pdf (accessed 25 May 2010).
15. Royal College of General Practitioners. The future direction of general practice: a roadmap. London: Royal College of General Practitioners; 2008.
16. Department of Health. Standards for better health. London: HMSO; 2004. http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_4086665 (accessed 6 Feb 2010).
17. Shaw CD. External assessment of health care. BMJ. 2001;322(7290):851–854.
18. Shaw CD. Evaluating accreditation. Int J Qual Health Care. 2003;15(6):455–456.
19. Scrivens E. Putting continuous quality improvement into accreditation: improving approaches to quality assessment. Qual Health Care. 1997;6(4):212–218.
20. Booth B, Hays R, Douglas K. National standards for general practice. Aust Fam Physician. 1998;27(12):1107–1109.
21. Campbell SM, Steiner A, Webb D, et al. Do Personal Medical Services improve quality of care? A multi-method evaluation. J Health Serv Res Policy. 2005;10(1):31–39.
22. Department of Health. Our health, our care, our say. London: HMSO; 2006. http://www.dh.gov.uk/en/Healthcare/Ourhealthourcareoursay/DH_065882 (accessed 6 Feb 2010).
23. Lester H, Roland M. Future of quality measurement. BMJ. 2007;335(7630):1130–1131.