Arch Dis Child. 2007 July; 92(7): 633–636.
PMCID: PMC2083791

Quality improvement of paediatric care in the Netherlands

Abstract

The development of the quality improvement programme of the Paediatric Association of the Netherlands is described within the setting of the national programme of the Dutch government. The programme is based on four pillars: site visits by peers (visitatie), continuous medical and professional education, development of clinical (evidence based) guidelines, and patient safety with complication registration. The site visits by peers play a central role in assessing quality improvement activities in hospital based paediatric care. The self assessment approach and the confidential character of the visits are well received by the surveyed specialists. The recent inclusion of quality criteria in the legally required 5 yearly medical specialist recertification process has boosted attention to quality and could serve as an example for other countries.

Quality improvement has always been one of the core responsibilities of medical doctors and one of the primary tasks of their professional bodies. In the process of forming and formalising the medical profession, as we know it today, many developments were explicitly initiated to assure and improve the quality and safety of patient care.1,2,3 Interest in professional quality improvement is still increasing and international exchange of knowledge about the various systems of medical quality improvement is desirable. Although quality assurance remains to a large extent in the formal domain of national authorities, the development of international policy or legal frameworks may be expected.4 The mutual dependency of the medical community and the government remains: the government is (constitutionally) responsible for the provision of quality care to the public and at the same time it is dependent on the medical profession to ensure delivery of optimal care.

Over the past two decades the Dutch government has financially supported many of the quality initiatives undertaken by the professional associations of the 29 acknowledged medical specialties, coordinated by the Order of Medical Specialists. This paper describes how the Dutch medical community has developed an extensive programme of quality improvement of medical specialist care in hospitals. In particular, we will focus on the approach of the Paediatric Association of the Netherlands, which was the first to start a professional quality management bureau (in 2000) and which has one of the oldest quality improvement programmes.

General background

In the Netherlands, with 16 million inhabitants, secondary medical care is provided in approximately 100 general hospitals and eight academic medical centres. All hospitals and academic centres have a paediatric department where paediatricians work together in a group practice. Training in paediatrics takes 5 years and recertification is legally required once every 5 years. The salaries of nearly all paediatricians are paid by the (subsidised) hospitals and 99% of Dutch paediatricians are members of the Paediatric Association. Most other medical specialists work in private group practices based in hospitals.

Notably, there are no primary care paediatricians in the Netherlands. All primary care, including paediatric primary care, is provided by the general practitioner, who acts as gatekeeper for referrals to secondary care. Preventive care (child health care) is not provided by paediatricians but by child welfare clinic doctors and school doctors, who undergo 2 year in‐service training after medical qualification. These doctors fall under the responsibility of the municipal public health services and are not allowed to deliver curative care or to prescribe medicines.

The four quality pillars

The quality improvement programme of the Paediatric Association is based on four pillars:

  1. site visits by peers (visitatie),
  2. continuous medical and professional education,
  3. clinical (evidence based) guidelines and the development of performance indicators,
  4. complication registration and patient safety.

Peer review site visits (visitatie)

Since 1994 all 29 specialist societies have run a visitatie programme to improve the quality of patient care. Each hospital based specialist group is visited once every 5 years, with 400 groups being visited annually. Survey teams consist of three practising specialists from out‐of‐region hospitals, who have been trained in visitatie procedures, quality improvement approaches and interviewing skills.

Prior to the visit, a questionnaire with approximately 120 questions about practice organisation and care processes has to be filled out. During the visit, which lasts one full day, interviews are held with the specialists concerned, nursing staff, psychologists, registrars if present, the hospital board, the board of the medical staff, other staff members and referring general practitioners. The wards and outpatient facilities are surveyed.

During the past 10 years, each hospital practice in the Netherlands has been visited at least twice by fellow specialists. Dutch specialists expressed doubts about the added value of a third visitatie focused exclusively on “the practice circumstances”.5 There is still little evidence to underpin the assumed relationship between “the circumstances” and the outcomes of patient care. In 2004 a new visitatie model was launched by a group of paediatricians, gynaecologists and surgeons. The model shifted from a general assessment of circumstantial aspects of care delivery towards a more systematic and in‐depth evaluation of (daily) medical practice. The visitatie model now covers four professional quality domains: evaluation of care, patient perspective, professional development and specialist group functioning. Table 1 distinguishes the various professional quality aspects linked to each domain and offers an overview of the (self‐)assessment tools. The new model stresses self‐evaluation: in preparation for the site visit, specialist groups have to evaluate each of the four professional quality domains, for which tools are available on the websites of the professional associations.

Table 1 Visitatie quality domains, aspects and assessment tools

Medical audit is the preferred instrument to evaluate patient care. For each (evidence based) guideline of the Paediatric Association, a set of around 10 review criteria has been formulated by an expert panel. A systematic analysis of a sample of patient records gives insight into the performance of the specialist group under review. At present this analysis is carried out by the visiting peers, but in future specialist groups will be able to evaluate their colleagues' patient records themselves. For paediatricians the guideline on Down syndrome was evaluated in 2005/2006 and, to give an example, one of the review criteria (the need for vaccination against hepatitis B) was not met in one third of patients.

The quick scan is the instrument developed for quick and systematic mapping of the strengths and weaknesses of the specialist team's functioning. Five fields are investigated: policies and procedures, group culture and structure, decision making, reputation and results. Each field contains around 15 statements such as: we know each other's ambitions, qualities and weaknesses; our group functioning is effective and efficient; our decision making process is the result of dialogue; nobody in our group is a lazy performer; I am glad to be a member of this specialist group; our reputation within and outside the hospital is good. Each member rates the actual situation and the assumed importance of each statement on a five point scale. By multiplying the actual situation score and the importance score, a priority list is composed and visualised in a bar diagram for discussion with the visiting committee.
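The scoring step above can be sketched in a few lines of Python. This is a minimal sketch only: the function name and the example statements are illustrative, and the convention that higher products rank first is an assumption, not part of the published instrument.

```python
# Illustrative sketch of the quick scan scoring step.
# Assumption: higher (situation x importance) products are listed first.

def priority_list(responses):
    """responses maps each statement to (situation_score, importance_score),
    both on a five point scale. Returns statements ranked by the product
    of the two scores, as input for the bar diagram."""
    scored = {s: sit * imp for s, (sit, imp) in responses.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

example = {
    "We know each other's ambitions and weaknesses": (2, 5),
    "Our decision making is the result of dialogue": (4, 4),
    "Our reputation within the hospital is good": (5, 3),
}
for statement, score in priority_list(example):
    print(f"{score:2d}  {statement}")
```

A real bar diagram would be drawn from the same ranked pairs; the point here is only the product-and-sort arithmetic described in the text.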

Patient experiences are measured by a one page questionnaire issued in advance to 30 patients per specialist. The answers are entered in a computer system belonging to the Paediatric Association and reported back to the group practice for discussion during the site visit.

General practitioner satisfaction is measured by a one page questionnaire sent in advance to all referring general practitioners. Their representative, who is invited to meet the site visit committee and in our experience never fails to do so, reports the results.

Medical records are audited by a fellow paediatrician approximately 2 months before the site visit. Around 30 medical records are screened according to a standard checklist created by an expert panel of the association.

The results of this (mostly self‐) evaluation process are presented to and discussed with the visiting peers during the site visit. Clearly, the educational approach to quality is dominant in this visitatie model. The specialty associations refuse to play the role of “visitatie police”. They are reluctant to pass judgement on the quality of care and rather assess the quality management efforts a specialist group displays. A 5‐point rating scale has been developed for this purpose (table 2). Each site visit results in practice‐specific recommendations for improvement. Implementation of recommendations is left to the surveyed specialist groups; approximately 50% of all recommendations are actually followed up by the groups.6

Table 2 Ratings of quality management efforts as displayed by specialist groups

As all results are kept confidential and the visiting committee consists of fellow specialists, who, to our pleasant surprise, were invariably keen to evaluate another practice, an open atmosphere is created.

In addition to the group focused visitaties, evaluation of the individual performance of specialists is now also being introduced in Dutch hospitals.7

In 2005/2006, a total of 35 paediatric practices were visited according to the new visitatie model. The self‐assessment approach and the peer character of the visitatie model allow for remarkable openness. Many unexpected quality aspects come up and give rise to improvement programmes. This approach is well accepted even when there is a poor relationship between colleagues. Since January 2006, participation in the visitatie programme has been mandatory for all medical specialists, who need recertification every 5 years by the College for licensure and registration. The College refrains from including the results of the site visit in the recertification process; only participation in the programme is required.

Continuous medical and professional education

All specialist associations in the Netherlands have accredited continuous medical education activities for their members since 1994. Until 2005 continuous medical education was considered a moral and ethical obligation of each individual medical practitioner. From January 2006 onwards, a minimum of 200 h of postgraduate education has become a legal obligation, necessary for the 5 yearly recertification as a medical specialist by the College for licensure and registration. Each specialist association is allowed to go above this minimum; for paediatricians a total of 300 credits (300 h) is required over a period of 5 years, of which 10% can be spent on non‐medical issues such as statistics, management and ethics. The accreditation council of the Paediatric Association has formulated a set of quality criteria for the process of review and approval of educational activities, including the subject, the method of presentation, the time involved, the location and the pre‐ and post‐tests.
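The credit arithmetic for paediatricians can be made concrete with a small check. This is a sketch under the simplest reading of the rules (300 h over 5 years, at most 10% of the requirement counted from non‐medical subjects); the function and constant names are illustrative, not an official API.

```python
# Minimal sketch of the paediatric recertification credit arithmetic.
# Names and the exact rounding of the 10% cap are assumptions.

REQUIRED_CREDITS = 300   # 300 h over a period of 5 years
NON_MEDICAL_CAP = 0.10   # at most 10% on statistics, management, ethics, etc.

def meets_credit_requirement(medical_hours: float, non_medical_hours: float) -> bool:
    # Non-medical hours count only up to 10% of the total requirement (30 h).
    counted_non_medical = min(non_medical_hours, NON_MEDICAL_CAP * REQUIRED_CREDITS)
    return medical_hours + counted_non_medical >= REQUIRED_CREDITS

# 280 h medical + 40 h management: only 30 h of the latter count, total 310.
print(meets_credit_requirement(280, 40))  # True
# 250 h medical + 60 h management: 250 + 30 = 280 < 300.
print(meets_credit_requirement(250, 60))  # False
```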

Since June 2006 all medical postgraduate education is offered, assessed and registered according to a standard format, supervised by the Royal Dutch Medical Association. Each doctor has his personal page on a central computer system, linked to the website of the 29 different professional specialist organisations. Credit applications are digitally submitted and assessed by medical specialists from each scientific organisation. As paediatricians need to update their knowledge on a wide variety of subjects, a portfolio system can be built in, indicating which subjects have not been followed during the previous 5 years. The accreditation council of the Paediatric Association aims to improve the educational quality of postgraduate education and intends to honour courses of high educational quality with more than one point per hour.

Guidelines, implementation and the development of performance indicators

Consensus meetings and protocols have been the subject of quality improvement activities since 1980. The evidence based approach, however, received a major boost after the Dutch government decided to financially support the development of evidence based guidelines. Over the past 5 years, nearly all specialist organisations have begun developing evidence based guidelines, and approximately 30 of the more than 100 guidelines initiated have been finalised. The Dutch Institute for Healthcare Improvement (CBO), a member of the Guidelines International Network, supports the development of guidelines.8,9,10

The Paediatric Association of the Netherlands has published more than 100 protocols and guidelines, but only five evidence based guidelines: neonatal resuscitation, fluid administration in critically ill children, mental retardation, apparent life threatening events and pain management. The development of these evidence based guidelines turned out to be a painstaking effort, lasting nearly 3 years at a cost of approximately €100 000 per guideline. Therefore, it was decided recently to stop the development of Dutch paediatric evidence based guidelines and only adapt existing international evidence based guidelines. A committee of expert paediatricians collects missing evidence through an in‐depth literature search and adapts the guideline to a national standard within 9 months. At present urinary tract infection, head trauma, constipation, fever and sedation in minor investigations are the subject of study.

Improving the implementation of guidelines in (daily) practice is the next step in the quality improvement cycle.11 The development of quality indicators is necessary to measure the degree and quality of implementation. The eagerness of the public and press to get the results of this quality improvement exercise frightens off many medical professionals. A system of internal (confidential) and external (publicly available) quality indicators is being developed,12 but the difference is artificial and most likely untenable.

Complication registration and patient safety

In 2001 the registration of medical complications became the subject of discussion within the paediatric field in The Netherlands. The surgical disciplines have a longer history of standardised complication registration, which is at present applied in over 30% of Dutch hospitals.13 Early in 2003, a committee was set up to design a register for paediatric complications. Within a year a list of 90 complications, divided over 16 organs or systems, with 34 different technical interventions, nine routes of medicine administration and four grades of severity of outcome was composed and pilot tested in four hospitals. Complication registration appeared feasible, especially if it occurred as part of the daily ward round. While paediatric departments in general hospitals registered a complication rate of less than 2% of admitted patients, in intensive care units of academic centres the rate was 20–25%.14 A second series of pilots has started in 12 general hospitals and eight academic subspecialty departments in preparation for national complication registration in paediatrics in 2007.
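As an illustration of the kind of record such a register collects, a single hypothetical entry might be structured as follows. This is a sketch only: the field names and example values are invented, and the real category lists (90 complications, 16 organ systems, 34 interventions, nine administration routes, four severity grades) are maintained by the committee.

```python
# Hypothetical sketch of one entry in a paediatric complication register.
# Field names and example values are invented for illustration.
from dataclasses import dataclass
from datetime import date
from typing import Optional

SEVERITY_GRADES = (1, 2, 3, 4)  # four grades of severity of outcome

@dataclass
class ComplicationRecord:
    patient_id: str
    registered_on: date              # ideally recorded during the daily ward round
    complication: str                # one of the 90 listed complications
    organ_system: str                # one of the 16 organs or systems
    intervention: Optional[str]      # one of the 34 technical interventions, if any
    medication_route: Optional[str]  # one of the nine routes of administration
    severity: int                    # grade 1 (mild) to 4 (severe)

    def __post_init__(self):
        if self.severity not in SEVERITY_GRADES:
            raise ValueError("severity must be a grade from 1 to 4")

record = ComplicationRecord(
    patient_id="NL-0001",
    registered_on=date(2006, 3, 8),
    complication="extravasation of infusion fluid",
    organ_system="skin",
    intervention="peripheral iv line",
    medication_route="intravenous",
    severity=2,
)
print(record.severity)  # → 2
```

Aggregating such records per department would yield the complication rates quoted above (under 2% on general wards, 20–25% in academic intensive care units).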

In the same period a patient safety campaign was launched and at present a patient safety project with blame‐free reporting of (near miss) accidents is running in the 10 neonatal intensive care units in the Netherlands under the name Neosafe.15

It is clear that complication registration and blame‐free reporting of (near miss) accidents are indissolubly connected and will lead to an integrated patient safety management system for all paediatric practices.

Discussion

Considerable progress in the quality improvement of specialist medical care in the Netherlands has been realised by the specialist societies with the support of the Dutch Government, the Order of Medical Specialists and the Dutch Institute for Healthcare Improvement (CBO). With the introduction of quality criteria for the 5 yearly process of recertification as a medical specialist, quality improvement has received an enormous boost. In addition to the pre‐existing criterion of a minimum of 16 h clinical specialist work per week, medical specialists are now legally required to participate in the visitatie programme of their specialty association and to attend at least 200 h of postgraduate education. This is of course no guarantee of better performance or better patient health, but care for quality gives at least some indication of quality of care.

There is an urgent need for evaluation, but there is little research into quality programmes that meets rigorous scientific criteria, and conclusive evidence of effectiveness may never be attainable.16 Process evaluation seems at present to be the best option.17 Examples from leading countries, such as the USA, Canada, the UK and the Nordic countries, give hope for improvement in the quality of medical care, but most initiatives are scattered and not guided by a strong national body aiming at an integrated quality approach.18,19,20,21

The visitatie programme can be considered the core of professional quality improvement practice: it brings together all the efforts and results of professional development, the implementation of (evidence based) guidelines, the structure, culture and functioning of the specialist group, and the use of complication registration with an interest in patient safety. It is unique in the world, developed and standardised in the Netherlands, and digitally available.

With the establishment of a professional office for quality management in 2000, the Paediatric Association of the Netherlands was able to design a total quality management system for all practising paediatricians. It facilitates the establishment of a quality improving culture and contributes to maintaining public trust in the medical profession.

Footnotes

We have not in the past 5 years accepted fees, reimbursements or funds from an organisation that may in any way gain or lose financially from the results of our study or the conclusions of any review, editorial, or letter.

Competing interests: None.

References

1. Freidson E. Professionalism: the third logic. Oxford, UK: Blackwell, 2001.
2. Medical Professionalism Project. Medical professionalism in the new millennium: a physicians' charter. Lancet 2002;359:520–522. [PubMed]
3. Klazinga NS. Quality management of medical specialist care in the Netherlands: an explorative study on its nature and development. Dissertation. Overveen: Belvedere, 1996.
4. Legemaate J. Integrating health law and health policy: a European perspective. Health Policy 2002;60:101–110. [PubMed]
5. Lombarts MJMH, Klazinga NS. Supporting Dutch medical specialists with the implementation of visitatie recommendations: a descriptive evaluation of a two‐year project. Int J Qual Health Care 2003;15(2):119–129. [PubMed]
6. Lombarts MJMH, Klazinga NS, Redekop K. Measuring the perceived impact of facilitation on implementing recommendations from external assessment: lessons from the Dutch visitatie program for medical specialists. J Eval Clin Pract 2005;11(6):587–597. [PubMed]
7. Lombarts MJMH, Overeem K, Kremer HPH. The better doctor. Peers and patients can best evaluate the performance of medical specialists (in Dutch). Medisch Contact 2006;61:656–659.
8. Dutch Institute for Healthcare Improvement (CBO). Evidence‐based guidelines development: manual for guideline committees (in Dutch). Utrecht: CBO, 2006.
9. CBO. Audit and visitation. http://www.cbo.nl/english/folder20030512164118/default_view (accessed 8 March 2007).
10. Guidelines International Network. http://www.g‐i‐n.net (accessed 8 March 2007).
11. The Netherlands Health Care Inspectorate. The result counts: performance indicators as independent measure of the quality of Dutch hospitals (in Dutch). The Hague: IGZ, 2004.
12. Wollersheim H, Burgers J, Grol R. Clinical guidelines to improve patient care. Neth J Med 2005;63(6):188–192. [PubMed]
13. Marang‐van de Mheen PJ, Stadlander MC, Kievit J. Adverse outcomes in surgical patients: implementation of a nationwide reporting system. Qual Saf Health Care 2006;15:320–324. [PMC free article] [PubMed]
14. Schulpen TWJ. Complication or error: complication registration at paediatric departments is possible (in Dutch). Medisch Contact 2005;44:1751–1753.
15. Molendijk A, Borst K, van Dolder R. To err is human: blame‐free reporting increases transparency (in Dutch). Medisch Contact 2004;58:1658–1661.
16. Ovretveit J, Gustafson D. Evaluation of quality improvement programmes. Qual Saf Health Care 2002;11:270–275. [PMC free article] [PubMed]
17. Hulscher MEJL, Laurant MGH, Grol RPTM. Process evaluation on quality improvement interventions. Qual Saf Health Care 2003;12:40–46. [PMC free article] [PubMed]
18. Beal AC, Co JPT, Dougherty D, et al. Quality measures for children's health care. Pediatrics 2004;113:199–209. [PubMed]
19. Cheng SM, Thompson LJ. Cancer Care Ontario and integrated cancer programs: portrait of a performance management system and lessons learned. J Health Organ Manag 2006;20(4):335–343. [PubMed]
20. Ovretveit J. The Norwegian approach to integrated quality development. J Manag Med 2001;15:125–141. [PubMed]
21. Shaller D. Implementing and using quality measures for children's health care: the perspectives on the state of the practice. Pediatrics 2004;113:217–227. [PubMed]

Articles from Archives of Disease in Childhood are provided here courtesy of BMJ Publishing Group