Across the National Health Service, clinical teams from a variety of professions work together in delivering patient care, with traditional professional roles increasingly challenged by new working practices. Shared learning between health care professionals in the NHS has been advocated as a means to improve the ability of individuals to work together in optimising patient outcomes.1 The Department of Health has been keen to incorporate such interprofessional education within its policies. Most notably, the NHS Plan in 2001 highlights the need for cooperation and partnership, with a core curriculum of pivotal communication skills and NHS principles/organisation.2 To support this vision, the Department of Health prompted universities to create common learning pathways delivering the key components of its workforce strategy: teamwork, collaboration, skill mix, flexible working, flexible opportunities to change career trajectory, and new types of worker.3 Yet even as the Department of Health has promoted such shared learning, the medical profession has become more sceptical of its agenda, concerned that interprofessional education seeks to produce cheaper generic health care workers and “de‐professionalise” medicine.
The importance of precision in definition is particularly relevant to any discussion of multi‐professional education. Different authors use the terminology in slightly different ways, so it is important for the reader to understand the common use of key terms. Multiprofessional education (MPE) describes activities in which individuals from two or more professions learn side by side, but not in a collaborative manner. By contrast, interprofessional education (IPE) describes activities in which health care personnel from two or more professions learn together, learn from each other and/or learn about each other's roles in order to facilitate collaboration. For ease of description, this account uses the terms shared learning and multidisciplinary learning interchangeably with IPE.
Barr describes the development of IPE as a result of the evolution of social work, nursing, professions allied to medicine and complementary/alternative therapies. Each group sought to strengthen its relationship with other professions alongside its continued drive to define further its own independent professional identity. At the same time the medical profession was ostensibly focused upon defining further specialist branches. However, the unique multidisciplinary context of general practice stimulated this discipline to forge closer professional development links with other professions.
Such processes within the community and primary care context culminated in the discrete conception of IPE as a means to promote collaboration, combat professional prejudices, improve professional skills and thereby improve patient outcomes. The model seemed particularly relevant to community settings, where traditional hierarchical professional boundaries seemed less pervasive. The movement continued to evolve in the early 1970s as publicised court cases around child protection issues highlighted deficiencies in communication between professionals. IPE pilots became established both within various primary care settings (between general practitioners, health visitors, social workers and therapists) and through academic courses at both undergraduate and postgraduate levels. By the early 1990s the first masters programmes were beginning to feature collaborative practice at their core. At the same time (the mid to late 1990s), economic rationalisation of higher educational programmes placed increasing demands upon educational institutions to devise more cost effective generic courses (often in modular and distance learning form) that could be delivered to a variety of professions. The Department of Health was keen to exploit the potential benefits of IPE as it addressed workforce problems such as the European Working Time Directive.
It would be deceptively straightforward to adopt a positivist, empiricist philosophy to interpreting the research related to the outcomes of IPE and deem that the evidence indicates no justification for such activity. Indeed, the much cited Cochrane systematic review by Zwarenstein et al, published in 2000, reviewed 89 papers and found none meeting stringent methodological standards for inclusion.4 As a consequence the review found no conclusive evidence of the effectiveness of IPE in relation to professional practice and/or health care outcomes. Given that medical education overall shows a relative neglect of such traditionally robust research design, it would be tempting to accept the findings of this systematic review and conclude that this is where the IPE story ends.5 However, further enquiry about the evolution of research activity following the Cochrane review reveals a slightly less pessimistic picture.
Over the last 5 years several of the researchers (the IPE Joint Evaluation Team—JET) who conducted the original Cochrane review have engaged in a further ongoing review of educationalist research on IPE, commissioned by CAIPE (UK Centre for the Advancement of Interprofessional Education).6 At the end of 2005 the review selected 107 out of 353 papers as methodologically robust (though the authors admit these are less stringent methodological standards than those of the original Cochrane review). The vast majority of the papers were from the US or the UK, and 80% involved interventions for postgraduate learners, with a reasonably even distribution between primary and secondary care. The JET reports evaluations across the Kirkpatrick range.7
Most of the evaluation reports are, not surprisingly, focused on learner reaction, and the review found a very significant positive reporting bias. Because the report was compiled on behalf of an organisation focused on delivering IPE, and because the positive reporting bias was so pronounced, these findings are not presented here. However, despite these reservations about the strength of new evidence in support of IPE, the review is important because it demonstrates that the IPE movement is continuing to evolve and develop and has not been deterred by the empiricist rebuff delivered by the Cochrane study. Interestingly, the authors themselves do not promote a quantitative report of their findings but instead highlight the potential of IPE and the importance of continued research. Indeed, in a second paper the JET group issues guidance on constructing methodologically robust evaluation of IPE interventions so that the evidence base can expand further.8
The research relating to IPE provides a fascinating illustration of the core themes underpinning wider debate about best evidence medical education (BEME). Harden's impassioned declaration of the need for good quality evidence upon which to evolve educational policy was the subject of a thought‐provoking critique by Norman.9,10 One of the central tenets of Norman's article was that there was clear evidence that adult educational teaching processes were effective but that empiricist, positivist methodology was not infrequently too blunt a tool to capture the essence of successful educational outcomes. Norman highlighted the need to supplement more conventionally powerful experimental designs with robustly gathered qualitative material. Norman's critique thus sought to challenge the assumption that “quantitative is best”.
It is clearly justifiable to observe that there is a paucity of powerful experimental design support for IPE. However, it is equally true that even allowing for the potential conflict of interest inherent in the JET review, evidence exists from less traditionally powerful research designs in support of IPE across a range of Kirkpatrick evaluative levels. As Norman's critique highlights, the lack of robust empirical support for IPE is not unique, given the problems of establishing the effectiveness of other educational interventions, and yet somehow the literature regarding IPE seems to have been reported with more certainty and pessimism. Here it would appear that the political context in which IPE is being developed is extremely pertinent.
If one assumes the somewhat defensive and reductivist perspective that IPE is being used solely as a means to generate cheaper generic health care workers to replace existing traditional professionals, then one can begin to understand why the research base can be interpreted with such certainty. IPE in such a context is a threat, and the reassuring transparency of the objective positivist paradigm provides a powerfully persuasive rejection of the movement. However, if one instead appreciates the potential political dangers of developing IPE while at the same time seeking to embrace the inherent advantages of more integrated IPE (as community care for chronic conditions in particular becomes ever more complex and team reliant), then the need to evaluate IPE's effectiveness adequately using a range of complementary research methodologies becomes more compelling. In this context it would not be justifiable to claim from the literature that there was no support for the benefits of IPE in terms of challenging professional attitudes/behaviours and organisational practice. Admittedly there is as yet little methodologically robust evidence for benefit in terms of patient outcome, but as outlined above this is not unique to IPE within the context of BEME overall.
It is interesting that there is good evidence to support the effectiveness of multidisciplinary care approaches in terms of patient outcomes (for example, care of the patient with acute stroke).11 However, when researchers apply a positivist philosophy to define the key elements of educational interventions that facilitate better team working, it has proven far more difficult to demonstrate efficacy. One explanation is that IPE is not effective at improving team working and therefore patient care; the other is that the area is so methodologically complex that, at present, the relatively blunt tool of an empirical reductivist study is unable to study the topic adequately. This is not to present a circular argument that IPE cannot be studied by traditional methods, but rather that it requires more intensive research investigation. Such research requires a sequential, rigorous methodological approach. Firstly, research needs to define the key elements in multidisciplinary teams that improve patient outcomes in different contexts. Next, further studies need to define specific educational interventions to promote such elements, along with valid, reliable means to evaluate such teaching–learning activities. Finally, the effectiveness of such educational interventions on patient outcomes could be examined using more traditionally powerful experimental—or at the very least prospective—designs.
Shared learning is a real‐life necessity for a health care system reliant upon effective teamwork. In future, more methodologically robust research must seek to define the critical components of effective IPE interventions. Most early research activity was conceived by enthusiasts seeking to support their fervour with relatively weak descriptive studies; only recently have better, more stringent studies provided tentative support of efficacy. One could be forgiven for thinking such a lack of evidence was unique to IPE, but this is clearly not the case given the paucity of empirical evidence for many fundamental aspects of medical educational practice.12 By challenging the validity of IPE one necessarily challenges the basic tenets of adult educational theory from which IPE, as well as most modern medical education, has essentially been derived. These implications have been neglected in a political context in which professions perceive themselves under threat and the pressures for rapid change intensify defensive boundaries between disciplines. Evidence of the effectiveness of shared learning has nevertheless gathered apace, and the research imperative should now move beyond questioning its efficacy and towards understanding when and how IPE is at its most powerful. Let multidisciplinary learning be used when it is most appropriate rather than as a cheap educational panacea.
BEME - best evidence medical education
CAIPE - Centre for the Advancement of Interprofessional Education
IPE - interprofessional education
JET - Joint Evaluation Team
MPE - multiprofessional education
NHS - National Health Service
Competing interests: The author is not aware of any competing interests.