Milbank Q. Author manuscript; available in PMC June 1, 2010.
PMCID: PMC2698591; NIHMSID: NIHMS109127
Toward a Transdisciplinary Model of Evidence-Based Practice
Jason M. Satterfield, Bonnie Spring, Ross C. Brownson, Edward J. Mullen, Robin P. Newhouse, Barbara B. Walker, and Evelyn P. Whitlock
University of California, San Francisco; Northwestern University; Washington University; Columbia University; University of Maryland; Indiana University; Kaiser Permanente; Oregon Evidence-Based Practice Center
Address correspondence to: Jason M. Satterfield, Division of General Internal Medicine, University of California, 400 Parnassus Ave, A-405, San Francisco, CA 94143-0320 (email: jsatter@medicine.ucsf.edu)
Context
This article describes the historical context and current developments in evidence-based practice (EBP) for medicine, nursing, psychology, social work, and public health, as well as the evolution of the seminal “three circles” model of evidence-based medicine, highlighting changes in EBP content, processes, and philosophies across disciplines.
Methods
The core issues and challenges in EBP are identified by comparing and contrasting EBP models across various health disciplines. Then a unified, transdisciplinary EBP model is presented, drawing on the strengths and compensating for the weaknesses of each discipline.
Findings
Common challenges across disciplines include (1) how “evidence” should be defined and comparatively weighted; (2) how and when the patient’s and/or other contextual factors should enter the clinical decision-making process; (3) the definition and role of the “expert”; and (4) what other variables should be considered when selecting an evidence-based practice, such as age, social class, community resources, and local expertise.
Conclusions
A unified, transdisciplinary EBP model would address historical shortcomings by redefining the contents of each model circle, clarifying the practitioner’s expertise and competencies, emphasizing shared decision making, and adding both environmental and organizational contexts. Implications for academia, practice, and policy also are discussed.
Keywords: Evidence-based practice, clinical decision making, transdisciplinary practice
In 1996, Haynes and colleagues introduced a conceptual model depicting how research could be integrated into the clinical practice of medicine (Haynes et al. 1996). This vanguard “three circles” model has been adapted by the major health disciplines, all of which endorse the policy of evidence-based practice (EBP). Indeed, the Institute of Medicine (IOM) has named EBP as a core competency for health professionals (Greiner and Knebel 2003). The IOM’s guidance aims to speed up the glacial rate at which medical discoveries are translated into practice and to increase the delivery of recommended health care (IOM 2001). The endorsement of EBP also accords with the National Institutes of Health (NIH) Roadmap initiative to break down disciplinary silos and accelerate the translation of research into practice (Zerhouni 2005). Just as the research teams of the future will be interdisciplinary, the practice teams of the future will be interprofessional (Grumbach and Bodenheimer 2004; Zerhouni 2005).
The challenges associated with translational science and interprofessional practice are substantial and call for more unified practice models, a common language, and unifying goals (Stokols 2006; Stokols et al. 2008). Because the training of practitioners in the health professions differs, their vocabulary, conceptual frameworks, and research methods often differ as well, thereby impeding cross-disciplinary translation. A recent review of interprofessional health education (IPHE) highlights these challenges and confirms that although a handful of studies support IPHE, this area is still in its infancy (Reeves et al. 2008). Although interdisciplinary groups like the United States Preventive Services Task Force (USPSTF), the Cochrane Collaboration, and the Campbell Collaboration have offered successful and influential systematic reviews and practice guidelines, these organizations serve as “producers” of evidence-based materials and rarely serve as EBP “consumers,” that is, frontline practitioners or policymakers engaged in clinical decision making who might benefit from translational and transdisciplinary dissemination and training. Readers are directed elsewhere for more information about these important collaboratives or the EBP guidelines (e.g., www.cochrane.org; http://www.ahrq.gov/clinic/uspstfix.htm; Guyatt and Rennie 2007).
The coauthors of this article hail from medicine, nursing, psychology, social work, and public health and have formed the Council on Evidence-Based Behavioral Practice, supported by NIH’s Office of Behavioral and Social Sciences Research (OBSSR). In this article we examine the history and evolution of evidence-based practice policies in our respective professions in order to produce a shared EBP conceptual model and process that draws on the unique strengths of each profession and addresses the common criticisms of evidence-based practice: for example, that the evidence is too narrowly defined, that the role and value of practitioners and their expertise are unclear, that resources and/or contextual factors are ignored, and that not enough attention is paid to the client’s preferences. In each discipline-specific section we present a brief history of EBP, a conceptual model, and a discussion of the EBP process, including what constitutes legitimate “data” or “evidence.” We conclude by offering a harmonized, transdisciplinary model of evidence-based practice that specifies a common language and an enriched process for clinical and/or policy decision making surpassing that of any single disciplinary approach. We hope that this enhanced, hybrid model will support the collaborative dissemination and implementation of evidence-based health practices at the individual, community, and population levels.
Evidence-Based Medicine

In 1992, evidence-based medicine (EBM) was introduced as a “new paradigm” for the practice of clinical medicine (Evidence-Based Medicine Working Group 1992). EBM was intended to develop and promote an explicit and rational process for clinical decision making that deemphasized intuition and unsystematic clinical expertise while emphasizing the importance of incorporating the best research findings into clinical care. The emergence of this new model was made possible by decades-long advances in research and epidemiologic methodologies, medical informatics, and innovations in medical training programs. After important critical exchanges within the medical community, EBM was more explicitly defined as “the conscientious and judicious use of current best evidence from clinical care research in the management of individual patients,” as shown in the three-circle model for evidence-based clinical decisions in figure 1 (Haynes et al. 1996; Sackett et al. 1996, p. 71).
FIGURE 1. Three-Circle Model of Evidence-Based Clinical Decisions
These three circles illustrate the distinct but overlapping sources of data that might be used when making clinical decisions. Moreover, the authors explicitly stated that under certain circumstances, clinical expertise and/or the patient’s preferences may override the research evidence. Note that the three circles are of equal size, or “weight,” with clinical expertise occupying the top, central position. The authors also were careful to state (and restate) that EBM is not “cookbook” medicine, a means of cutting costs by limiting care, or a subversive means for clinical researchers to overemphasize the value of randomized controlled trials (Haynes et al. 1996; Sackett et al. 1996). EBM intentionally deemphasizes the role of expert authority and instead promotes a transparent, rational decision-making process that can be taught, refined, and applied by all clinicians.
Although conceptually appealing, the original model lacked explicit guidance on how the circles, or sources of data, were to be integrated when making decisions, particularly when the research evidence was at odds with either clinical experience or the patient’s preferences. Furthermore, the scope, relative value, and appropriate applications of “clinical expertise” remained unclear. An updated model attempted to address these concerns by changing the clinical expertise circle to “clinical state and circumstances” and moving clinical expertise to the intersection of the new three circles, as shown in figure 2. Clinical expertise now denotes the ability to elicit, appropriately appraise, and then integrate these potentially disparate sources of data (Haynes, Devereaux, and Guyatt 2002). The central placement of clinical expertise also highlights the value of clinical experience in guiding the EBM decision-making process and offers a noteworthy concession regarding the importance of the individual practitioner.
FIGURE 2. An Updated Three-Circle Model of Evidence-Based Clinical Decisions
Although all the circles are again represented equally in figure 2, the seminal EBM texts and training programs focused primarily on medical informatics, clinical epidemiology, biostatistics, and critical appraisal skills. The understanding of patients’ preferences was still regarded as “primitive,” and exactly how clinical expertise would guide the integration of the three circles still was not clear (Straus et al. 2005). The authors reiterated that although the name EBM includes “evidence,” evidence is not meant to be the most important source of information (in contrast to the patient’s preferences or clinical circumstances) but, rather, a necessary yet not sufficient ingredient of clinical decision making. More recent explications of this model have further defined “evidence” and suggested an evidence hierarchy to help EBM users appraise and integrate multiple types of evidence (Guyatt and Rennie 2007; Straus et al. 2005).
As EBM has evolved, the recommended processes for applying it to clinical decision making have grown more explicit (Straus et al. 2005). Building on an early article about “rules of evidence,” the notion of a stepwise “evidence cycle” was created to guide practitioners through the EBM process (Bhandari and Giannoudis 2006; Sackett 1986). The five steps of this cycle were recently renamed to follow a variant of the popular five A’s mnemonic: Assess, Ask, Acquire, Appraise, Apply. Clinicians assess the patient and clinical situation, ask relevant clinical or treatment questions, acquire evidence or other data, appraise the collected data, and apply the indicated treatment.
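Viewed procedurally, the cycle is simply an ordered, repeatable sequence of five steps. The sketch below is a minimal illustration in Python (our own rendering; the enum and function names are hypothetical and not drawn from the EBM literature) of the five A’s as such a sequence:

```python
from enum import Enum

class EvidenceCycleStep(Enum):
    """The five A's of the EBM evidence cycle, in order."""
    ASSESS = "Assess the patient and the clinical situation"
    ASK = "Ask a relevant clinical or treatment question"
    ACQUIRE = "Acquire evidence or other data"
    APPRAISE = "Appraise the collected data"
    APPLY = "Apply the indicated treatment"

def run_evidence_cycle() -> None:
    # Iterate the steps in definition order; in practice the cycle
    # repeats as the patient's state and circumstances evolve.
    for step in EvidenceCycleStep:
        print(f"{step.name.title()}: {step.value}")

if __name__ == "__main__":
    run_evidence_cycle()
```

Rendering the steps as an ordered enumeration underscores the model’s point that the process itself, not any single step, is what can be taught, refined, and audited.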
Evidence-Based Nursing

Professional standards for nursing call for practice based on the best available evidence (ANA 2004a). Because most nursing is practiced in organizations, nursing administrators are responsible for developing an infrastructure to promote evidence-based nursing (EBN) (ANA 2004b). As nursing developed in the 1970s, an approach called research utilization focused on translating research findings into practice (Titler 1997). Although research utilization included a process for critically appraising research, it did not incorporate the patient’s preferences or clinical judgment. Since then, nursing has been heavily influenced by the progress of EBM (Melnyk et al. 2000).
The quest for Magnet accreditation has been a key driver for EBN, especially in acute care. Magnet accreditation is awarded to organizations known to provide good (and evidence-based) nursing care and favorable work environments (ANCC 2007). Magnet standards are based on ANA professional standards and on research about the environments in which nurses best provide, and patients best receive, care. As organizations evaluate their readiness, collect data, and prepare their applications for Magnet status, they often realize that they need to develop or build additional human and material resources.
To prepare nurses for professional practice, educational standards at the baccalaureate, master’s, and doctoral levels all include competencies in EBP (AACN 2008). These competencies are similar to those identified in EBM but are specified by academic degree level. For example, competencies at the baccalaureate level include the integration of best evidence, clinical judgment, interprofessional perspectives, and the patient’s preferences (AACN 1998). At the master’s level, competencies add the use of new knowledge to analyze intervention outcomes, initiate change, and improve practice (AACN 1996). Finally, the doctoral level adds the use of analytic methods to critically appraise existing evidence in order to determine and implement the best practices (AACN 2006).
There is no single EBN model to guide practice. As in EBM, all the nursing models incorporate the patient’s preferences, the provider’s expertise, and the critical appraisal of research evidence. The nurse’s clinical judgment and the patient’s preferences are incorporated into EBN as recommendations are constructed. Compared with EBM, however, EBN usually relies more on evidence from nonrandomized designs. When randomized controlled trials (RCTs) are unavailable, the sources of nursing evidence include quality improvement (QI) data, financial analyses, and/or patient satisfaction data.
Nearly all EBN process models (Newhouse et al. 2007; Stetler 2001; Titler et al. 2001) follow a process similar to EBM’s, in which a practice-relevant question is posed and evidence is acquired, appraised, applied to practice, and evaluated. The EBN models do differ in their specific steps, level of prescriptive detail, and the tools available to support the process.
EBN pushes beyond EBM in the areas of qualitative research and the integration of the patient’s experiences into practice decisions. Partly because of the dearth of RCT evidence, EBN flattens the evidence hierarchy, giving greater weight to qualitative data, patient satisfaction, QI data, and cost-effectiveness. By highlighting contextual and patient-generated responses, EBN strongly underscores assessing and incorporating the patient’s preferences into the clinical decision-making process.
Evidence-Based Practice in Psychology

In 1995, the American Psychological Association (APA) commissioned the Task Force on Promotion and Dissemination of Psychological Procedures. Its objective was to establish rigorous criteria, including replication and the use of treatment manuals, for identifying “empirically supported treatments” (ESTs) and to select treatments that met these criteria. The task force identified eighteen treatments as “empirically supported” (e.g., cognitive-behavioral therapy for panic disorder) and seven as “probably efficacious” (e.g., exposure therapy for social phobia) (Chambless et al. 1996). A later report listed sixteen ESTs that were then widely disseminated to training programs across the country (Chambless et al. 1998).
These ESTs generated both enthusiasm and controversy (Spring et al. 2005). To some, a treatment manual gave the appearance of “cookbook” therapy. Others found problematic the general lack of evidence that specific psychotherapies work better for specific disorders (i.e., the classic “Dodo bird verdict” described by Luborsky, Singer, and Luborsky 1975). The implication, some argued, was that most psychological treatments work chiefly through nonspecific therapeutic elements, such as empathy, catharsis, or the patient’s relationship with the therapist (Wampold 2001).
The APA task force released its report on ESTs in the same year that the McMaster group published its first papers on EBM (Chambless et al. 1996; Haynes et al. 1996). EBM named three domains to be considered in decision making, one of which was research evidence. In contrast, the APA task force focused exclusively on research evidence, singling out those treatments with the best empirical support. The APA task force proposed standards of evidence that could be used to select the psychological treatments to be included in psychology training programs. In so doing, psychology anticipated the general policy of critical appraisal that EBM later used in selecting the best practices for its treatment guidelines.
The APA’s need to align psychology with the other health care professions led it to form a task force on evidence-based practice in 2005. This task force was charged with defining evidence-based practice in psychology (EBPP) by combining the diverse views of scientists and practitioners. The APA’s definition of EBPP resembled both the evidence-based practice definition adopted earlier by the IOM (2001) and the original EBM three-circle model (APA Presidential Task Force on Evidence-Based Practice 2006; Haynes et al. 1996). The task force noted that multiple levels of evidence and research designs could contribute to evidence-based practice and that some research designs were better than others for answering certain questions. It did not, however, endorse a particular pyramid of evidence like the one used in EBM.
The APA task force revised the three-circle model by more precisely defining clinical expertise and patients’ preferences, an area discussed in EBN but not well represented in EBM. In addition to defining psychologists’ clinical expertise as containing eight competencies (e.g., assessment, diagnostic judgment, systematic case formulation, and treatment planning), the task force described how those competencies could be acquired and the role of expertise in the clinical decision-making process. Furthermore, it recognized the limitations of expertise and the inevitable cognitive biases influencing clinicians’ judgment. Patients’ preferences were expanded to include patients’ characteristics, values, and context. The task force also viewed variables such as identity and sociocultural factors (e.g., age, gender, ethnicity, social class, religion, income), functional status (e.g., ability to work), readiness to change, level of social support, and developmental history as germane to the clinical decision-making process. This clear articulation of variables to be considered in the patient’s “circle” represents a substantial step forward from earlier EBM models.
Evidence-Based Social Work Practice

EBM was first introduced into social work in the 1990s, although earlier models for integrating research and practice did exist (e.g., the empirical practice movement and the scientific practitioner model) (Gambrill 1999; Gibbs 2003; Reid 1994). Evidence-based social work practice (EBSWP) is, however, qualitatively different from these earlier efforts and, like EBM, has been seen as a paradigm shift (Gambrill 2003).
The adoption of EBSWP was facilitated by a marked increase in practice research as well as by mechanisms for evidence dissemination. Since 1999, for example, the Campbell Collaboration has promoted the development and dissemination of high-quality systematic reviews in social welfare, criminal justice, and education. Specialized EBP centers in Europe and North America provide important infrastructure supporting EBSWP, as has the growth of partnerships between practitioners and researchers. EBSWP is now required for the accreditation of social work training programs, and the use of research evidence for professional practice is prescribed by the national code of ethics for social work (Institute for the Advancement of Social Work Research 2007).
Haynes, Devereaux, and Guyatt’s three-circle conceptualization (see figure 2) is the most frequently cited model in social work (Haynes, Devereaux, and Guyatt 2002), although Regehr, Stern, and Shlonsky’s more recent conceptualization represents an emerging alternative view of EBSWP with a larger context (Regehr, Stern, and Shlonsky 2007). In this model, professional expertise replaces clinical expertise, reflecting social workers’ roles in management and policy in addition to clinical practice. The latter model places Haynes, Devereaux, and Guyatt’s (2002) three circles at the center of a contextual frame, bounded at its outer edge by broad contextual factors (figure 3). Regehr, Stern, and Shlonsky’s model includes intraorganizational, extraorganizational, and practitioner-level factors that need to be taken into account on the journey from evidence to practice. This more developed, nuanced appreciation of political, economic, organizational, and other contextual factors represents an important perspective absent from other models of evidence-based practice.
FIGURE 3. Elements of Evidence-Based Policy and Practice
The process of EBSWP is similar to the EBM process, with its five steps preceded by a preliminary step: becoming motivated to use EBSWP (Gibbs 2003). The practitioner’s expertise is given a central place because of the complex skills needed to integrate the domains illustrated in figure 3. In keeping with social work’s emphasis on individualization, the EBSWP process stresses assessment early on and continues it throughout. For example, the EBSWP approach to child protective services begins with an actuarial assessment of population risk so as to target resources to those clients at the highest risk, followed by a contextual assessment of an individual client’s strengths, needs, and preferences. Throughout this process, evidence is sought, and the quality of assessment tools and the effectiveness of service options are appraised (Mullen et al. 2005).
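To make the staged character of this example concrete, here is a minimal sketch (our own illustration; the risk threshold, field names, and functions are hypothetical and not drawn from Mullen et al. 2005):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Client:
    actuarial_risk: float                        # population-level risk score, 0.0 to 1.0
    strengths: List[str] = field(default_factory=list)
    needs: List[str] = field(default_factory=list)
    preferences: List[str] = field(default_factory=list)

HIGH_RISK = 0.7  # hypothetical cutoff for targeting scarce resources

def ebswp_assessment(client: Client) -> dict:
    """Staged EBSWP process: actuarial triage first, then an
    individualized contextual assessment; evidence on assessment tools
    and service options would be appraised throughout."""
    plan = {"prioritized": client.actuarial_risk >= HIGH_RISK}
    if plan["prioritized"]:
        # Contextual assessment of the individual client follows triage.
        plan["context"] = {
            "strengths": client.strengths,
            "needs": client.needs,
            "preferences": client.preferences,
        }
    return plan

print(ebswp_assessment(Client(actuarial_risk=0.85, needs=["housing"])))
```

The two-stage structure mirrors the text: population-level triage allocates scarce resources, and individualized assessment then shapes the plan of care.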
Like the other disciplines, social work has debated what evidence is acceptable and what its relative weight or value should be (Mullen and Streiner 2004). An inclusive view of evidence is advocated to serve the diverse needs of social workers engaged in clinical-, community-, and population-focused practice. Social workers must look for findings from a variety of research designs to address their practice questions (Rubin 2007), with an emphasis on practice-based research that examines the practical problems encountered in social work (Roberts and Yeager 2006). Evidence from qualitative research is valued for its insights into clients’ experiences and context, as well as for the thick description it provides. Evidence from quantitative research is valued for its objectivity and precision in addressing questions about the efficacy and cost-effectiveness of alternative intervention options.
Evidence-Based Public Health

Formal discourse on the nature and scope of evidence-based public health (EBPH) originated about a decade ago. In 1997, Jenicek defined EBPH as the “use of epidemiological insight while studying and applying research, clinical, and public health experience and findings in clinical practice, health programs, and health policies” (Jenicek 1997, p. 190). Subsequent definitions have expanded and deepened the concept through attention to EBPH practice questions and to the identification of high-quality evidence (Brownson, Gurney, and Land 1999; Glasziou and Longbottom 1999).
In 2004, Kohatsu extended the definition of EBPH to include communities’ input and preferences in decision making (Kohatsu, Robinson, and Torner 2004). In a model modified from Muir Gray, the three circles of EBM become scientific evidence, population needs and values, and resources. Population needs and values correspond to what EBM calls “patients’ preferences,” but they also encompass what EBM called “clinical state and circumstances.” The resources circle is entirely new and reflects the thinking necessary when addressing the needs of a population. All three circles point to sources of data to be used when making public health decisions.
As the tenets of EBPH have been illuminated, several new components have emerged (Brownson, Fielding, and Maylahn 2009):
  • Making decisions based on the best available scientific evidence (both quantitative and qualitative).
  • Using data and information systems systematically.
  • Applying program-planning frameworks (often based in behavioral science theory).
  • Engaging the community in assessment and decision making.
  • Making sound evaluations.
  • Disseminating what is learned to key stakeholders and decision makers.
The most commonly applied framework in EBPH is probably that shown in figure 4, which uses a seven-stage process (Brownson et al. 2003; Brownson et al. 2007). Note that the quality and volume of evidence differ from those for EBM. Because fewer RCTs are available, public health surveillance, interventions, and policies are more likely to rely on cross-sectional studies, quasi-experimental designs, and time-series analyses. Studies sometimes lack a comparison group, which detracts from the quality of the evidence, and the formal training of public health workers is highly variable. Unlike medicine, public health draws practitioners from many disciplines and thus has no single academic credential (or even a small number of credentials) that “certifies” a public health practitioner. Moreover, probably fewer than half of public health workers have any formal training in a public health discipline like epidemiology or health education (Turnock 2001).
FIGURE 4. The Most Commonly Applied Framework in EBPH
EBPH has made three contributions to the EBP models. First, much like nursing and social work, EBPH expands the types of data to be considered as evidence. Because RCTs often are not available for complex, frontline work, they usually do not inform public health decisions. Second, EBPH addresses the issue of resource allocation in overburdened systems. Third, EBPH has constructed a more detailed, iterative stepwise process that guides both the decision making and the initial questions (figure 4).
Comparing the Disciplinary Models

To gauge the added value of a new EBP model, it is helpful to recall the primary criticisms of EBM: the evidence is too narrowly defined; the role and value of practitioners and their expertise are unclear; resources and/or contextual factors are ignored; and not enough attention is paid to clients’ preferences. These criticisms become particularly relevant to the behavioral and social science aspects of health, whose evidence base is much less extensive than medicine’s and in which causality is nearly always multifactorial. It therefore is important to define evidence broadly. For example, evidence may involve quantitative data (e.g., numerical results of program or policy evaluations) and qualitative data (e.g., nonnumerical observations collected through focus groups). As noted in regard to the discipline-specific evidence-based practice models, evidence may be narrowly defined and placed in a hierarchy (i.e., a pyramid of evidence), or it may draw more broadly from sources like quality improvement or patient satisfaction data and consequently weight those categories more equally (e.g., EBN). The perceived value of evidence also may vary by stakeholder type. Ultimately, the most useful evidence in a particular situation depends on the type of question asked about a specific practice or policy.
It is particularly instructive to see how each discipline has used the original EBM models to address specific shortcomings. Nursing, public health, and social work expanded the scope of what is considered evidence. They confirmed that many different practice questions are important and that the best study design depends on the question asked. Psychology specified criteria for “empirically supported treatments” and has, perhaps, been most successful in introducing these treatments into training programs. Both psychology and social work have emphasized the importance of clients’ characteristics as potential moderators of outcome. Social work also has made important modifications to the EBM model that draw attention to institutional and environmental contexts. Public health addresses the largely ignored issue of how resource availability influences decision making. Finally, both nursing and psychology have recognized the importance of patients’ characteristics and preferences to final decisions regarding clinical care. Although each of these discipline-specific models has particular strengths, none takes into account the vagaries of practice across the health professions.
A Transdisciplinary EBP Model

Our revised EBP model (figure 5) takes a transdisciplinary perspective. It incorporates each discipline’s most important advances and attempts to address remaining deficiencies. The model is grounded in an ecological framework and emphasizes shared decision making. We used an ecological framework because intervening solely with individuals often is insufficient to maximize long-term gains for the population as a whole. Both the population-level impact and the maintenance of health gains can be enhanced by intervening also at the interpersonal, organizational, community, and public policy levels.
FIGURE 5. Our Revised EBP Model
The model’s new external frame contains environmental and organizational factors that create a cultural context moderating an intervention’s acceptability, its feasibility, and the balance between fidelity and adaptation needed for effective implementation. Environment and organization are important to evidence-based decisions in all disciplines, although some disciplines, such as nursing, social work, and public health, may be more likely to choose or modify evidence-based interventions based on context. Nursing’s practices are organizational, and the feasibility of its practice recommendations is modified by governing policies, purchasing agreements, and affiliations. Because it is a social science, social work naturally incorporates attributes of the client’s environment into the plan of care. With the goal of preventing disease in populations, public health interventions must be implemented through organizations and communities. Albeit to a lesser extent, context is incorporated into decision making even when the treatment focuses on the individual patient, as in medicine and psychology, where the diagnosis and treatment of a patient’s disease may require a stronger emphasis on the patient’s and provider’s characteristics, with a diminished role for context.
Consistent with major EBM models, “best available scientific evidence” remains one of the three circles. Evidence is defined as research findings derived from the systematic collection of data through observation and experimentation and the formulation of questions and testing of hypotheses. In accord with the 1996 EBM model but differing from the 2002 version, clinical state and circumstances are no longer a circle (Haynes, Devereaux, and Guyatt 2002; Haynes et al. 1996). Because we regard state and circumstances as attributes of the patient, community, or population, we include them in the circle containing all that entity’s values, preferences, and characteristics.
As in the 1996 template, the practitioner’s expertise occupies a prominent place. We see expertise as one of many resources needed to implement health services and as comprising four categories of skills: competence at performing the EBP process, assessment, communication/collaboration, and engagement/intervention. EBP process skills are proficiency in formulating answerable practical questions, acquiring and appraising relevant evidence, applying that evidence through shared decision making that considers the client’s characteristics and resources, analyzing outcomes, and adjusting as appropriate. Assessment skills are competence in the appraisal of care recipients and expertise in implementing and evaluating the outcome of a needed health procedure. Communication and collaboration skills entail the ability to convey information clearly and to listen, observe, and adjust in order to arrive at an understanding and an agreement on a course of action. Engagement and intervention skills refer, at a minimum, to proficiency at motivating interest, constructive involvement, and positive change among stakeholders.
We have reconceptualized clinical expertise in a particular intervention or technique as a resource to be evaluated as part of the decision-making process. The expert’s role still ranges from educated consumer of EBP recommendations, to producer of primary research evidence, to synthesizer of evidence for EBP guidelines.
At the center of our model is decision making, the cognitive action that turns evidence into contextualized evidence-based practices. We had four reasons for moving decision making to the center of our model and the practitioner’s expertise to a lower circle. First, we found that decision making was not a particular individual’s inherent professional or intuitive skill but, rather, a systematic decisional process combining evidence with the client’s attributes, resources, and context. Second, we felt that the central emphasis on the practitioner’s expertise was inconsistent with the lack of empirical support for the proposition that a practitioner’s performance improves with experience (Choudhry, Fletcher, and Soumerai 2005). Third, we placed decision making in the center of the figure to demonstrate the great difficulties and practical challenges in reconciling the many variables needed to make evidence-based decisions about clinical care, public health, or public policy. The evidence often is at odds with a patient’s or a population’s preferences. Similarly, resources (including expertise) may not be available to deliver what both the evidence and the patient’s/population’s preferences demand. By highlighting the nuances of data collection and decision making in the various disciplines (e.g., elevating patients’ preferences in nursing, more heavily weighting quantitative research evidence in medicine) and by providing a transdisciplinary model that represents all the various inputs equally, the new EBP model enables a practitioner to discuss the conflicts at hand more collaboratively. Moreover, the emergence of these conflicts may help policymakers direct resources to providers’ training, patients’ education, and communities’ development.
Finally, we are committed to a model of collaborative health care practice in which health decisions are not solely the practitioner’s but are shared among the practitioner(s), clients, and other affected stakeholders. Even though current models of shared decision making offer guidance when decisions are made by a dyad (i.e., practitioner and patient), relatively little is known about interprofessional decision making in a team-based or transdisciplinary practice (Légaré et al. 2008; Whitney 2003). Légaré and colleagues observed that true interprofessional decision making would require sharing the goal of health care decisions based on patients’ values, a sense of trust among professionals, and leadership and organizational structures that facilitate shared decision making in clinical care (Légaré et al. 2008).
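Pulling these threads together, the sketch below summarizes the revised model’s structure as data: the three circles, within their contextual frame, feed a single shared-decision step. This is our own illustrative rendering with hypothetical names; the article prescribes no such formalism:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EBPInputs:
    """The three circles of the revised model, within its contextual frame."""
    best_evidence: List[str]             # appraised research findings
    client_attributes: List[str]         # preferences, values, state, circumstances
    resources: List[str]                 # including practitioner expertise
    context: List[str] = field(default_factory=list)  # environmental/organizational frame

def shared_decision(inputs: EBPInputs, participants: List[str]) -> Dict[str, object]:
    """Decision making sits at the center: inputs are weighed together,
    and conflicts among them are surfaced rather than settled by authority."""
    conflicts: List[str] = []
    if inputs.best_evidence and not inputs.resources:
        conflicts.append("evidence-indicated care exceeds available resources")
    if inputs.best_evidence and not inputs.client_attributes:
        conflicts.append("client preferences not yet elicited")
    return {
        "participants": participants,     # practitioner(s), client, other stakeholders
        "inputs": inputs,
        "unresolved_conflicts": conflicts,  # flagged for follow-up or policymakers
    }
```

The framing makes the model’s central claim visible: conflicts among the circles are expected outputs of the decision process, not failures of it.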
Implications

This new EBP model has important implications. First, it provides a useful framework for guiding health services research from an interdisciplinary, real-world perspective. EBM and, more generally, EBP need greater empirical validation as systemic approaches to the delivery of clinical care. On a more modest level, research is needed on the process and impact of shared clinical decision making and on the relative contributions of each “sphere” of data. By using a common language and melding disciplinary philosophies, EBP is intended to support research endeavors across traditional disciplinary silos.
Second, the EBP model may guide evidence-based policy considerations focused on the population’s health but also may influence EBP at the individual level. Policy-level approaches are often more permanent than many health programs focused on individual behavior change (Brownson, Haire-Joshu, and Luke 2006). EBP seeks to increase the effectiveness and efficiency of policy and so may entail both “big P” policies (formal laws, rules, and regulations enacted by elected officials) and “small p” policies (organizational guidelines and social norms guiding behavior). Large-scale policies to support EBP can extend to laws governing behavior (e.g., seat belt laws, regulations of smoking in public places) and regulations focusing on coverage of health care services. These policies often involve a variety of professionals in many disciplines. At an organizational level, EBP might entail licensure or continuing education requirements to ensure adequate training in evidence-based approaches.
Finally, the EBP model has important implications for both academia and practice. On an academic level, support for EBP may guide the curricula of clinical or public health training programs and the preferred approaches to clinical or population-based decision making. Specific EBP competencies can be identified, taught, and assessed. EBP supports team-based, interdisciplinary care and training. In frontline clinics and health departments, EBP can determine standards of care and/or provide an explicit and transparent process that guides each practitioner and patient or community in making informed, evidence-based clinical decisions. By taking into account contextual factors, patients’ preferences, evidence, and expertise, EBP is intended to provide realistic, high-quality, acceptable, and effective care as broadly as possible. As the evidence base deepens and expands across disciplines, the health professions may be pressured to implement those evidence-based practices shown to be most effective. We also anticipate that resource and environmental constraints will significantly influence how widely these evidence-based practices can be disseminated and implemented. Our proposed EBP model highlights the role that health professionals need to play in making collaborative decisions that take into account not only the evidence base but also the available resources and the environmental context. Although no simple solution is evident, transdisciplinary training in contextually sensitive EBP will help providers and policymakers make these complex and important decisions.
Acknowledgments
The project was supported by an NIH/OBSSR contract to Bonnie Spring at Northwestern University (N01-LM-6-3512, Resources for Training in Evidence-Based Behavioral Practice). The authors gratefully acknowledge the administrative support of the EBBP project coordinator, Kristin Hitchcock.
References

  • American Association of Colleges of Nursing (AACN). The Essentials of Master’s Education for Advanced Practice Nursing. 1996. Available at http://www.aacn.nche.edu/Education/pdf/MasEssentials96.pdf (accessed June 3, 2008).
  • American Association of Colleges of Nursing (AACN). The Essentials of Baccalaureate Education for Professional Nursing Practice. 1998. Available at http://www.aacn.nche.edu/Education/pdf/BaccEssentials98.pdf (accessed June 3, 2008).
  • American Association of Colleges of Nursing (AACN). The Essentials of Doctoral Education for Advanced Nursing Practice. 2006. Available at http://www.aacn.nche.edu/DNP/pdf/Essentials.pdf (accessed June 3, 2008).
  • American Association of Colleges of Nursing (AACN). AACN “Essentials” Series. 2008. Available at http://www.aacn.nche.edu/education/essentials.htm (accessed March 2, 2008).
  • American Nurses Association (ANA). Nursing: Scope and Standards of Practice. Washington, D.C.; 2004a.
  • American Nurses Association (ANA). Scope and Standards for Nurse Administrators. 2nd ed. Washington, D.C.: Nursebooks; 2004b.
  • American Nurses Credentialing Center (ANCC). Forces of Magnetism. 2007. Available at http://198.65.134.123/Magnet/ProgramOverview/ForcesofMagnetism.aspx (accessed March 27, 2009).
  • APA Presidential Task Force on Evidence-Based Practice. Evidence-Based Practice in Psychology. American Psychologist. 2006;61:271–285. Available at http://www.sonoma.edu/users/s/smithh/methods/evidence.pdf (accessed March 27, 2009).
  • Bhandari M, Giannoudis PV. Evidence-Based Medicine: What It Is and What It Is Not. Injury. 2006;37(4):302–306.
  • Brownson RC, Baker EA, Leet TL, Gillespie KN. Evidence-Based Public Health. New York: Oxford University Press; 2003.
  • Brownson RC, Diem G, Grabauskas V, Legetic B, Potemkina R, Shatchkute A, Baker EA, et al. Training Practitioners in Evidence-Based Chronic Disease Prevention for Global Health. Promotion and Education. 2007;14(3):159–163.
  • Brownson RC, Fielding JE, Maylahn C. Evidence-Based Public Health: A Fundamental Concept for Public Health Practice. Annual Review of Public Health. 2009. In press.
  • Brownson RC, Gurney JG, Land G. Evidence-Based Decision Making in Public Health. Journal of Public Health Management and Practice. 1999;5:86–97.
  • Brownson RC, Haire-Joshu D, Luke DA. Shaping the Context of Health: A Review of Environmental and Policy Approaches in the Prevention of Chronic Diseases. Annual Review of Public Health. 2006;27:341–370.
  • Chambless DL, Baker MJ, Baucom DH, Beutler LE, Calhoun KS, Crits-Christoph P, Daiuto A, et al. Update on Empirically Validated Therapies, II. The Clinical Psychologist. 1998;51(1):3–16. Available at http://home.comcast.net/~dave.combs/valther.pdf (accessed March 27, 2009).
  • Chambless DL, Sanderson WC, Shoham V, Bennett Johnson S, Pope KS, Crits-Christoph P, Baker M, et al. An Update on Empirically Validated Therapies. The Clinical Psychologist. 1996;49(2):5–18.
  • Choudhry NK, Fletcher RH, Soumerai SB. Systematic Review: The Relationship between Clinical Experience and Quality of Care. Annals of Internal Medicine. 2005;142:260–273.
  • Evidence-Based Medicine Working Group. Evidence-Based Medicine: A New Approach to Teaching the Practice of Medicine. Journal of the American Medical Association. 1992;268:2420–2425.
  • Gambrill ED. Evidence-Based Practice: An Alternative to Authority-Based Practice. Families in Society. 1999;80:341–350.
  • Gambrill ED. Evidence-Based Practice: Sea Change or the Emperor’s New Clothes? Journal of Social Work Education. 2003;39(1):3–23.
  • Gibbs L. Evidence-Based Practice for the Helping Professions: A Practical Guide with Integrated Multimedia. Pacific Grove, Calif.: Brooks/Cole; 2003.
  • Glasziou P, Longbottom H. Evidence-Based Public Health Practice. Australian and New Zealand Journal of Public Health. 1999;23(4):436–440.
  • Greiner AC, Knebel E, editors. Health Professions Education: A Bridge to Quality. Institute of Medicine Quality Chasm series. Washington, D.C.: National Academies Press; 2003.
  • Grumbach K, Bodenheimer T. Can Health Care Teams Improve Primary Care Practice? Journal of the American Medical Association. 2004;291:1246–1251.
  • Guyatt GH, Rennie D. Users’ Guides to the Medical Literature: A Manual for Evidence-Based Clinical Practice. Chicago: American Medical Association; 2007.
  • Haynes RB, Devereaux P, Guyatt GH. Clinical Expertise in the Era of Evidence-Based Medicine and Patient Choice. ACP Journal Club. 2002;136:A11–A14.
  • Haynes RB, Sackett DL, Gray JAM, Cook DJ, Guyatt GH. Transferring Evidence from Research into Practice: The Role of Clinical Care Research Evidence in Clinical Decisions. ACP Journal Club. 1996 (November/December):A14–A16.
  • Institute for the Advancement of Social Work Research. Partnerships to Integrate Evidence-Based Mental Health Practices into Social Work Education. Washington, D.C.; 2007.
  • Institute of Medicine (IOM). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, D.C.: National Academy of Science Press; 2001.
  • Jenicek M. Epidemiology, Evidence-Based Medicine, and Evidence-Based Public Health. Journal of Epidemiology. 1997;7:187–197.
  • Kohatsu ND, Robinson JG, Torner JC. Evidence-Based Public Health: An Evolving Concept. American Journal of Preventive Medicine. 2004;27(5):417–421.
  • Légaré F, Stacey D, Graham ID, Elwyn G, Pluye P, Gagnon M-P, Frosch D, et al. Advancing Theories, Models, and Measurement for an Interprofessional Approach to Shared Decision Making in Primary Care: A Study Protocol. BMC Health Services Research. 2008;8:2.
  • Luborsky L, Singer B, Luborsky L. Comparative Studies of Psychotherapies: Is It True That “Everyone Has Won and All Must Have Prizes”? Archives of General Psychiatry. 1975;32:995–1008.
  • Melnyk BM, Fineout-Overholt E, Stone P, Ackerman M. Evidence-Based Practice: The Past, the Present, and Recommendations for the Millennium. Pediatric Nursing. 2000;26(1):77–80.
  • Mullen EJ, Shlonsky A, Bledsoe SE, Bellamy JL. From Concept to Implementation: Challenges Facing Evidence-Based Social Work. Evidence and Policy: A Journal of Debate, Research, and Practice. 2005;1(1):61–84.
  • Mullen EJ, Streiner DL. The Evidence For and Against Evidence-Based Practice. Brief Treatment and Crisis Intervention. 2004;4(2):111–121.
  • Newhouse RP, Dearholt S, Poe S, Pugh LC, White K. Johns Hopkins Nursing Evidence-Based Practice Model and Guidelines. Indianapolis: Sigma Theta Tau International; 2007.
  • Reeves S, Zwarenstein M, Goldman J, Barr H, Freeth D, Hammick M, Koppel I. Interprofessional Education: Effects on Professional Practice and Health Care Outcomes. Cochrane Database of Systematic Reviews. 2008, issue 1, art. no. CD002213.
  • Regehr C, Stern S, Shlonsky A. Operationalizing Evidence-Based Practice: The Development of an Institute for Evidence-Based Social Work. Research on Social Work Practice. 2007;17(3):408–416.
  • Reid WJ. The Empirical Practice Movement. Social Service Review. 1994;68(2):165–184.
  • Roberts AR, Yeager KR, editors. Foundations of Evidence-Based Social Work Practice. New York: Oxford University Press; 2006.
  • Rubin A. Practitioner’s Guide to Using Research for Evidence-Based Practice. New York: Wiley; 2007.
  • Sackett DL. Rules of Evidence and Clinical Recommendations on the Use of Antithrombotic Agents. Chest. 1986;89(suppl.):2–3.
  • Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence-Based Medicine: What It Is and What It Isn’t. British Medical Journal. 1996;312:71–72.
  • Spring B, Pagoto S, Kaufman PG, Whitlock E, Glasgow RE, Smith TW, Trudeau KJ, Davidson KW. Invitation to a Dialogue between Researchers and Clinicians about Evidence-Based Behavioral Medicine. Annals of Behavioral Medicine. 2005;30(2):125–137.
  • Stetler CB. Updating the Stetler Model of Research Utilization to Facilitate Evidence-Based Practice. Nursing Outlook. 2001;49(6):272–279.
  • Stokols D. Toward a Science of Transdisciplinary Action Research. American Journal of Community Psychology. 2006;38(1–2):63–77.
  • Stokols D, Misra S, Moser RP, Hall KL, Taylor BK. The Ecology of Team Science: Understanding Contextual Influences on Transdisciplinary Collaboration. American Journal of Preventive Medicine. 2008;35:S96–S115.
  • Straus SE, Richardson WS, Glasziou P, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. 3rd ed. New York: Elsevier; 2005.
  • Titler MG. Research Utilization: Necessity or Luxury? In: McCloskey J, Grace H, editors. Current Issues in Nursing. 5th ed. St. Louis: Mosby; 1997. pp. 104–117.
  • Titler MG, Kleiber C, Steelman VJ, Rakel BA, Budreau G, Everett LQ, Buckwalter KC, Tripp-Reimer T, Goode CJ. The Iowa Model of Evidence-Based Practice to Promote Quality Care. Critical Care Nursing Clinics of North America. 2001;13(4):497–509.
  • Turnock BJ. Public Health: What It Is and How It Works. 2nd ed. Gaithersburg, Md.: Aspen Publishers; 2001.
  • Wampold B. The Great Psychotherapy Debate: Models, Methods, and Findings. Mahwah, N.J.: Erlbaum; 2001.
  • Whitney SN. A New Model of Medical Decisions: Exploring the Limits of Shared Decision Making. Medical Decision Making. 2003;23:275–280.
  • Zerhouni E. Translational and Clinical Science—Time for a New Vision. New England Journal of Medicine. 2005;353:1621–1623.