JAMIA - The Journal of the American Medical Informatics Association
J Am Med Inform Assoc. 2010 Mar-Apr; 17(2): 115–123.
PMCID: PMC3000781

Informatics, evidence-based care, and research; implications for national policy: a report of an American Medical Informatics Association health policy conference


There is an increased level of activity in the biomedical and health informatics world (e-prescribing, electronic health records, personal health records) that, in the near future, will yield a wealth of available data that we can exploit meaningfully to strengthen knowledge building and evidence creation, and ultimately improve clinical and preventive care. The American Medical Informatics Association (AMIA) 2008 Health Policy Conference was convened to focus and propel discussions about informatics-enabled evidence-based care, clinical research, and knowledge management. Conference participants explored the potential of informatics tools and technologies to improve the evidence base on which providers and patients can draw to diagnose and treat health problems. The paper presents a model of an evidence continuum that is dynamic, collaborative, and powered by health informatics technologies. The conference's findings are described, and recommendations on terminology harmonization, facilitation of the evidence continuum in a “wired” world, development and dissemination of clinical practice guidelines and other knowledge support strategies, and the role of diverse stakeholders in the generation and adoption of evidence are presented.

Keywords: Informatics, evidence-based care, evidence-based medicine, research


The US healthcare system continues to face multiple challenges related to unsustainable increases in cost, uneven quality of care, and persistent barriers to universal access. Additional pressures are mounting as a result of demographic and other trends: the aging of the US population, which will lead to a more complex and costly disease burden in the coming years; the potentially transformative impact of personalized medicine based on individual genomic information; and the movement toward greater involvement by patients and their families in decision making about health issues.

While biomedical research has yielded many new diagnostic and therapeutic options, it is not always clear which options offer “…the right treatment for the right patient at the right time”.1 Efforts to determine “what works” are hardly new in the study of medicine, but the systematic utilization of “evidence-based medicine” (EBM) began in the 1990s, led by a small group of researchers and educators. As defined by its adherents, EBM “…is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients”.2 The American Medical Informatics Association (AMIA) convened a 2008 Health Policy Conference to focus discussions and advance understanding about the potential for informatics-enabled evidence-based care, clinical research, and knowledge management. Conference participants explored the applicability of informatics tools and technologies to improve the evidence base on which providers and patients can draw to diagnose and treat health problems. This paper, based on the conference findings, presents a model of an evidence continuum that is dynamic, collaborative, and enabled by health informatics technologies.

It has been 10 years since the Institute of Medicine (IOM) Committee on the Quality of Health Care in America released its report, To err is human: building a safer health system.3 Although this report and the subsequent IOM report, Crossing the quality chasm: a new health system for the 21st century,4 have generated significant discussion and research, our healthcare system in general, and the entities and enterprises within it, have been slow to generate, transform, and use evidence. Improved efficiency and effectiveness of care rely on the best information being available and readily accessible to health professionals and patients to use in making decisions. An underlying series of complex processes is required for this to happen via basic, translational, and clinical research: collecting patient data and making it available to researchers and clinicians; organizing the information that is needed for clinical decision making; creating methods to effectively disseminate the information; and capturing the results of decisions so that this information is available for new analyses and future cycles of improvement.5

The Agency for Healthcare Research and Quality (AHRQ) Effective Health Care program conducts research to provide up-to-date, unbiased evidence on healthcare interventions. Evidence-based practice centers (EPCs) are a key part of the AHRQ program; launched in 1997, the EPCs develop evidence reports and technology assessments on topics relevant to clinical, social science/behavioral, economic, and other healthcare organization and delivery issues. The Cochrane Collaboration, an international, independent non-profit organization, is dedicated to producing and disseminating systematic reviews of healthcare interventions; founded in 1993, the Collaboration sets standards for reviews of medical treatments and offers systematic reviews of research by disorder. Research into the comparative effectiveness of different healthcare interventions has been authorized by the American Recovery and Reinvestment Act (ARRA) of 2009, with the creation of a Federal Coordinating Council charged with coordinating research and guiding investments in this research.6 One example of more recent efforts is an AHRQ-funded project entitled “Structuring clinical recommendations for use in clinical decision support applications”, which aims to create structured logic statements from clinical recommendations, such as those provided by the US Preventive Services Task Force (USPSTF) and those underlying clinical measures related to “meaningful use” criteria. The goal is to accelerate, in a scalable fashion, the process whereby such clinical recommendations are converted into locally executable clinical decision-support rules.

Clearly, the push is on now to rapidly increase knowledge about “what works” in order to improve healthcare and contain costs. However, what counts as evidence to determine “what works” varies according to different experts. While the term “evidence” is often defined as findings established through randomized controlled trials and systematic reviews, the EBM paradigm encompasses other methods of establishing evidence as well.7–10 Examples of these methodologies include observational, cohort, and case–control studies; surveys; qualitative research; and expert opinion, among others.

Many caution against an overreliance on any one methodology or approach, because inherent shortcomings can prevent it from meeting the healthcare system's current and future needs for the timely generation of evidence. Reported limitations of randomized controlled trials, for example, include their slow pace, high cost, failure to address many questions of interest to practitioners, lack of inclusion of sizeable numbers of disadvantaged populations, and difficulty in generalizing findings to the general population.9–14 Further, establishing new evidence about the efficacy and safety of clinical interventions through any means is no guarantee that the interventions will be used in actual clinical practice. Practitioners face challenges in staying abreast of new evidence and implementing evidence-based care. While National Institutes of Health (NIH) programs often refer to the importance of dissemination of translational research, even extensive dissemination of such evidence is rarely sufficient to assure that it will be put into practice in a real-world clinical setting. There is evidence demonstrating that even when a best practice is well known and documented in clinical practice guidelines, it is used in patient care only about 50% of the time.15 In the realm of public policy, policy-makers are besieged with information, but results are not easily translated into policy decisions, and interpretation of study findings is sometimes inconclusive and even controversial.16

At the same time as these challenges are being increasingly recognized, there has been a slow but steady rise in adoption of new information and communications technologies (eg, e-prescribing, electronic health records, personal health records) by the healthcare community. Experts posit that health information technology (health IT) will be instrumental in helping to answer many of the pressing questions facing the healthcare system and will facilitate efforts to evaluate the effectiveness of healthcare interventions.1 17 Use of health IT will likely accelerate given the large amounts of money being made available for this purpose through ARRA. While this should lead to a substantial increase in available data that we may be able to use in order to advance evidence creation and improve knowledge building and clinical and preventive care, extracting value from such data repositories is a challenge for researchers as well as for healthcare practitioners.

AMIA's 2008 health policy conference: focus on an informatics-enabled evidence continuum

Recognizing the potential of informatics tools and techniques to contribute to a more robust evidence base, the 2008 AMIA Health Policy Conference took as its theme: “informatics-enabled evidence-based care, clinical research, and knowledge management”. AMIA convened a group of approximately 100 subject matter experts and stakeholders representing a variety of backgrounds, disciplines, and work environments, and posed the following questions to encourage them to think broadly about an approach to an evidence continuum that is dynamic, collaborative, and enabled by informatics:

  • What does the future evidence-based care and delivery system look like in a “connected” world?
  • How can health IT help the healthcare community make the needed strides toward determining “what works”?
  • What research is needed to help inform the development of policies guiding future efforts to determine “what works”, so that policies that promote patient care, protect patient data, and enhance patient participation can keep pace with technical solutions?

This paper, drawing on insights gained from the conference, outlines the approach to the evidence continuum conceptualized by participants, discusses ways in which it can be strengthened by informatics, and presents findings and recommendations for related research activities and policy actions.

Moving toward a dynamic and collaborative evidence continuum

In 2007, the Institute of Medicine (IOM) called for a new “rapid learning healthcare system” to accelerate the generation of new evidence. This holistic paradigm is characterized by continuous learning and improvement, and the evolution of new approaches to rapidly generate, apply, and evaluate evidence. A key feature of this paradigm is a “culture of shared responsibility” in which stakeholders (researchers, providers, patients) embrace the concept of a healthcare system that “learns”; share an understanding of the nature of evidence and the evolution of new methods to generate it; and work together toward the goal of shared decision making that is informed by the best possible evidence.9

With the current approach to the evidence continuum, evidence flows in a top-down manner: researchers generate evidence, consensus panels translate it into clinical practice guidelines, and the guidelines are then disseminated chiefly via medical journals, mailings, and other professional vehicles, with the goal that they will be adopted by clinical practitioners.

Conference participants discussed the ongoing evolution of the EBM concept to meet current and future healthcare system needs and developed a model that illustrates this evolution. The model is characterized by shared responsibility for evidence generation and information exchange; it is dynamic and collaborative, with continuous feedback about effectiveness that emerges in the course of care.9 Conceptualized as a “…continuum of synthesized information”18 it embraces various research methods, acknowledging that evidence is generated from numerous data sources such as practice-based research; large, aggregated databases; modeling and simulation; emerging technologies (implantable devices, geographic information systems technology); and patient contributions.

In this model, research data generated from multiple sources are integrated and harmonized, stored in registries and other public repositories, and converted to computable, user-friendly guidelines along with other useful approaches, which are disseminated through numerous resources: online repositories such as the National Guideline Clearinghouse; email alerts; online courses; social networks and microblogging services such as Twitter; online services focused on medical research (Physicians First Watch, Medscape, DocGuide Weekly); and aggressive efforts such as the American Heart Association's “Get With The Guidelines” program.19 The guidelines and other “knowledge distillations” can then be applied by practitioners, patients, and citizens who use computerized decision support tools based on the evidence. Feedback from the point of care (including patient feedback) is actively sought to determine the comparative effectiveness of treatments, and to identify new research questions and priorities that are then integrated into future research agendas.

This evidence continuum requires participation by many types of stakeholders: academic and pharmaceutical industry researchers, community researchers, healthcare providers, patients, policymakers, consumers and their caregivers, and the health IT industry. Rather than the current linear information flow from bench to bedside, the flow is circular, returning information from the bedside to the bench.20 Technical, policy, and regulatory issues are embedded in this continuum, for example, intellectual property, data stewardship, and data access and use among others.

The model shown in figure 1 outlines the following processes that are characteristic of the evidence continuum conceptualized by conference participants: generating evidence, translating evidence for use in care delivery, disseminating and implementing evidence in the clinical environment, and adopting and assessing effectiveness of evidence-based interventions. The arrows in the illustration flow both ways—with evidence-based findings informing practice, and practice generating feedback by testing findings in real world settings, as well as identifying topics for the research agenda.

Figure 1
Dynamic and collaborative evidence continuum.

Generating evidence

Participants agreed that we must improve our ability to generate knowledge and evidence to support prevention, diagnosis, and treatment. A variety of evidence-generation approaches may need to be pursued to resolve the many unknowns and “semi-knowns” routinely faced by practitioners. As noted below, informatics research methods and techniques offer genuine promise in this regard, and further research will be needed to confirm these impressions.

Translating evidence

As we generate additional knowledge and evidence to support care delivery, we can anticipate that there will be significantly more information that will become accessible to the healthcare system. In its raw form, this information will be difficult to use effectively: data, knowledge, and evidence collected as part of care management or data mining will be more helpful if translated (aggregated and synthesized) for application in clinical care (eg, clinical care guidelines). A primary consideration in this regard is the need for guidelines to be in computer-understandable form to enable a new generation of systems for retrieving, analyzing, and presenting evidence.21
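As a minimal sketch of what “computer-understandable” guideline logic can look like, the fragment below encodes a hypothetical screening recommendation as an executable rule. The patient attributes, age thresholds, and screening interval here are invented purely for illustration and are not drawn from any actual guideline; production systems would typically use a dedicated formalism such as the Arden Syntax or GLIF rather than ad hoc code.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # Hypothetical, illustrative patient attributes
    age: int
    years_since_last_screening: float

def screening_due(p: Patient) -> bool:
    """Hypothetical computable rule: screening is recommended for
    patients aged 50-75 who have not been screened in 10 years.
    (Illustrative thresholds only, not an actual guideline.)"""
    return 50 <= p.age <= 75 and p.years_since_last_screening >= 10
```

Because the logic is structured rather than narrative, a decision-support system can evaluate it directly against electronic record data and surface a reminder at the point of care.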

Disseminating and implementing evidence

Effective strategies, techniques, and incentives are needed to disseminate and implement this new knowledge throughout clinical environments. This need was noted at the “2nd Annual National Institutes of Health Conference on the science of dissemination and implementation: building research capacity to bridge the gap from science to service” (2009). Among the goals of this conference was the identification of methods and approaches to promote research and organizational capacity that will advance dissemination and implementation science.22 NIH conference participants recognized that there is a large gap between what we know can maximize the quality of healthcare and what is currently being delivered in practice, and that we must understand not only how to create the best interventions but also how to ensure that they are effectively delivered by clinical and community practices.

Organizations of practicing physicians are increasingly promoting activities that foster the use of EBM by members and measurably improve practice performance. For example, a key component of the mission of the AMA-convened Physician Consortium for Performance Improvement (PCPI) is “…identifying and developing evidence-based clinical performance measures and measurement resources that enhance quality of patient care and foster accountability”.23 In 2002, the American Academy of Family Physicians (AAFP) established incentives to increase the use of research evidence in continuing medical education activities; in 2011, the Academy will eliminate the incentives, noting that it is now the norm for independent, certified continuing medical education to be based on evidence.24

Adopting and assessing evidence

Adoption refers to the acceptance of an intervention and its integration into practice by providers and patients. Barriers to adoption of EBM include, among others, lack of awareness of evidence; limited applicability of recommendations for patient management; absence of good clinical studies for therapies; organizational culture; cognitive overload; and lack of compensation incentives.25 26 With respect to the last barrier, the Institute of Medicine, in its 2007 report, “Rewarding provider performance: aligning incentives in Medicare”, urged the creation of new payment incentives that support the revamping of health system structures and processes of care with the aim of promoting higher value. Although the report acknowledged that there are gaps in knowledge about the magnitude of incentives that are needed to achieve change while avoiding adverse consequences, it advocated beginning to align payment policies with activities that promote quality.27 In fact, the Centers for Medicare and Medicaid Services (CMS) has initiated pay-for-performance initiatives to encourage improved quality of care in Medicare services. For example, the Medicare Health Care Quality Demonstration includes among its goals “…reducing variations in utilization by appropriate use of evidence-based care and best practice guidelines…”.28

Recently, national attention has increasingly focused on the concept of comparative effectiveness. According to AHRQ, comparative effectiveness is “research that is designed to inform healthcare decisions by providing evidence on the effectiveness, benefits, and harms of different treatment options”.29 The need for ongoing assessment of evidence-based interventions was acknowledged in the ARRA legislation, which includes funding for research to evaluate and compare clinical outcomes, effectiveness, risks, and benefits of two or more medical treatments and services that address a particular medical condition.

Enabling the evidence continuum through informatics: examples from the current environment

In 2001, Bakken, in support of her statement that “…an informatics infrastructure is essential for evidence-based practice”, explained that this infrastructure comprised five building blocks: “standardized terminologies and structures, digital sources of evidence, standards that facilitate healthcare data exchange among heterogeneous systems, informatics processes that support the acquisition and application of evidence to a specific clinical situation, and informatics competencies”.18 Several examples of ways in which fundamental processes of the evidence continuum described above are being facilitated by health IT are outlined below.

  • Practice-based research. A key feature of the learning healthcare system, as envisioned by the IOM Roundtable on EBM, is the recognition that the point of care is the knowledge engine for the continuous learning process.30 Westfall et al argue that beyond the traditional research translation paradigm (basic science research >> human clinical research >> clinical practice), another translational step is needed to solve problems that primary care physicians routinely encounter: research in ambulatory clinical practices, or practice-based research.13 Experts enumerate the potential benefits of practice-based research such as providing access to large groups of patients to serve as subjects in research studies; facilitating the rapid testing of new interventions; promoting the surfacing of research questions and evidence gaps; and providing opportunities to collect longitudinal data documenting management of chronic health problems over the course of years.13 31 Since the late 1970s in the USA, primary care practice-based research networks (PBRNs) have helped to expand the primary care knowledge base and integrate research into practice as well as practice into research.32 Examples of the ways in which practice-based research is being enabled by health IT tools and techniques include recruitment of patients for practice-based research studies via electronic data collection33; translation of research into practice by re-engineering practices through strategic adoption of health IT34; and identification of risk in a patient population using physician-coded data from an electronic medical system.35
  • Data mining. In a 2009 report on computational technology and effective healthcare, the National Research Council lists “data mining capabilities” as one of the four domains of IT in healthcare.36 While electronic administrative data already can be mined to obtain information on resource use, economic endpoints, and some clinical outcome information, data obtained from the wider adoption of interoperable electronic medical record systems have the potential to facilitate real world clinical studies.8 Mining of routinely-collected primary care data that are aggregated into large databases can enable audits and improve quality, guide health service planning, and facilitate epidemiological study and research.37 Examples of the use of data mining to identify evidence include mining microarray data to identify a marker for breast cancer therapeutic response38; mining multimedia electronic health records for imaging data for research use39; and identifying adverse events.40 41

While many experts have noted the potential that mining of large health-related databases offers as a tool to generate evidence, the fact that these sources are both numerous and heterogeneous in structure and content presents challenges as well as opportunities. Data are stored in many locations on multiple systems (eg, regional and national healthcare data repositories, structured databases, legacy systems, and text files behind web forms). Increasing amounts of data are multimedia and high-dimensional (eg, voice, imaging, and continuous biomedical signals). These data, ranging from test results to patient-provided data, have varying degrees of reliability.36 Other challenges are related to database design shortcomings and biases42; difficulties in linking data that lack common patient identification42; medical record fragmentation and lack of tools to explore narrative information in records37 39; and data privacy issues.36 These “silos” must yield to data integration efforts.
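As one concrete, hedged illustration of the kind of adverse-event mining cited above, signal detection over aggregated report counts is often based on simple disproportionality statistics. The sketch below computes the proportional reporting ratio (PRR), a standard pharmacovigilance measure; the counts are invented for illustration, and real systems add stratification, statistical thresholds, and clinical review before treating any value as evidence.

```python
def proportional_reporting_ratio(a: int, b: int, c: int, d: int) -> float:
    """PRR compares how often an adverse event is reported for the drug
    of interest against how often it is reported for all other drugs.
      a: event reports for the drug of interest
      b: all other reports for that drug
      c: event reports for all other drugs
      d: all other reports for all other drugs"""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical report counts; a PRR well above 1 flags disproportionate
# reporting that may merit further review.
signal = proportional_reporting_ratio(a=20, b=80, c=100, d=900)  # 2.0
```

Here the event appears in 20% of reports for the drug of interest versus 10% for all other drugs, yielding a PRR of 2.0.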

  • Patient participation in research. According to Davis et al, one of the attributes of patient-centered primary care practices is the utilization of clinical information systems to support quality improvement, practice-based learning, and overall high-quality care.43 Further, patients are increasingly interested in gaining direct access to their medical information through electronic medical record systems as they strive to become active partners in their care, improve communication with caregivers, and share in making treatment decisions.44 45 Health information and communications technology also offers opportunities for patients to contribute to research at different points in the evidence continuum, including enhancing recruitment of clinical trial participants via an electronic health record-based alert system46; use of the web by disease-oriented patient communities to collect and share outcome-based patient data with researchers; and patient documentation of drug-related problems on the web.47 48

AMIA conference findings and recommendations

Four major findings emerged from the 2008 AMIA Health Policy Conference relating to harmonization of terminology, facilitation of the evidence-based continuum, development and dissemination of clinical practice guidelines, and inclusion of multiple, diverse stakeholders (including patients) in the generation and adoption of evidence. These findings are described below along with related recommendations for further action.

Finding 1: there is a need to harmonize the various terms used to describe aspects of the evidence continuum to clarify their application by research and practice communities

AMIA conference participants noted that there are various approaches to defining and using the terms “evidence-based care/practice” and “translational research”. The term evidence-based practice (or practices; EBP(s)) generally refers to approaches to prevention or treatment that are validated by some form of documented scientific evidence. However, as noted in the introduction to this paper, what the term “evidence” means can vary. Experts have argued for renewed attention to the definition of EBM to reflect the evolution of its scope.49 50

With respect to translational research, there are traditionally two distinct domains: “early-stage” translational research, or the translation of basic science to clinical research, and “late-stage” translational research, or the translation of clinical evidence to health services research. The NIH roadmap describes translational research as bringing new knowledge from “bench to bedside”. While this journey from the scientist's laboratory bench to a patient's bedside arguably covers the full spectrum of basic science, clinical practice, health services, and health policy research, emphasis has, in reality, been placed on the translation of basic science evidence into efficacious and safe clinical interventions.

In contrast, AHRQ describes translational research as the bringing of clinical research into practice settings. For example, the AHRQ TRIP-II (Translating Research Into Practice) program describes translational research as aimed at producing “sustainable improvements in clinical outcomes and patient outcomes” by “translating research findings into diverse applied settings”.51 AHRQ's definition suggests that the problem of delivering appropriate healthcare is much more than simple “dissemination” of scientific evidence; the production of clinical guidelines is not enough to assure adoption in clinical practice. Thus, translational research, in this view, is a process of developing, testing, and implementing strategies for health services beyond the artificial environment of a clinical trial.

There are also issues related to the use of terms involved in dissemination and implementation (D&I) research. Dissemination can be viewed as the targeted distribution of information and intervention materials to a specific public health or clinical practice audience. Implementation is the use of strategies to adopt and integrate evidence-based health interventions and change practice patterns within specific settings. A review of the literature on dissemination and knowledge utilization noted that there is little agreement among researchers on common definitions of key terms in this field, such as “dissemination”.52 Rabin et al highlight inconsistencies in the use and meaning of D&I terms and concepts and provide a glossary for D&I terminology in public health and clinical settings.53


  • Public and private sector organizations should collaborate to build consensus around working definitions of key terms related to EBM. As appropriate, the agencies of the US Department of Health and Human Services (DHHS) should refine and adjudicate existing definitions and terms related to evidence-based care and practice.

Finding 2: additional efforts are needed to facilitate and harmonize the generation, translation, dissemination, and adoption of evidence-based knowledge

In spite of various public and private sector efforts, research is not easily converted into information that can be readily adapted into clinical guidelines or related tools, which in turn are not readily deployed in practice. Moreover, numerous individuals and organizations are working on similar translation initiatives but rarely share approaches or move toward common standards on the form of translation. Some promising efforts have been undertaken to improve communication. The Clinical and Translational Science Award (CTSA) Consortium, a national consortium of medical research institutions funded by the NIH National Center for Research Resources (NCRR), is one example. Within the Consortium, the CTSA Informatics Committee promotes successful implementation of informatics support for CTSA functions through the sharing of knowledge, expertise, and resources. The ARRA legislation includes funds to enable AHRQ to build on its existing collaborative and transparent Effective Health Care program. This program is expected to allow for input from many perspectives into the development of research and implementation of the findings.

There is an increasing role for biomedical and health informatics in supporting all phases of the evidence continuum. Research is needed to study how new and combined sources of data (eg, implantable devices, genetics data); data obtained via remote monitoring and telehealth/telemedicine; and data generated or collected by consumers (eg, personal health records) can contribute to evidence-based practice. Public policies regarding data use and data stewardship need to be reviewed and modified to accommodate the new data sources. For example, it is still not always clear who has the authority to mine records and under what circumstances.

Payment and coverage policies are often not consistent with efforts to establish performance or quality measures based on evidence-based findings. Differences in incentives have engendered a culture clash between the majority of researchers and the practitioners to whom evidence is being pushed; this inhibits evidence adoption. Strategies are needed that will enhance dissemination, adoption, and evaluation of evidence-based care for specific care settings and care recipients including non-academic environments, rural and safety net providers, and special populations who are often under-represented in studies or who may require alternative solutions. In the realm of treatment effectiveness, real-world heterogeneity of patients must be taken into account. Patients differ from one another on a number of levels (genetically, biologically, demographically, geographically, etc) and these differences may impact the effectiveness of treatments for reasons that may be either biological or cultural, or both. In another important area related to effectiveness, attention needs to be focused on the growing body of evidence that demonstrates the pervasiveness of interventions that are excessive, inappropriate, and unscientific.54 For example, the 2009 National Quality Forum Spring Implementation Conference focused on waste and overuse in healthcare.

Questions that can help guide approaches to creating a research and policy agenda that facilitates the generation, translation, dissemination, and adoption of evidence-based knowledge include:

  • How can we better utilize the existing infrastructure that supports clinical care to generate new knowledge and evidence?
  • How can we tap into practice-based research networks and health information exchanges to transform clinical care practitioners into scientific investigators and quality improvement specialists? How do we create incentives that encourage this transformation? How do we integrate, harmonize, standardize, aggregate, and manage how data are collected and used?
  • How do we store this information in registries and other public repositories?
  • How do we mine the aggregated data to identify new knowledge that can impact research and patient care?
  • How do we create evidence and guidelines in ways that support dissemination?
  • How do we use social networks, professional societies, publishers, funders, and informaticians to speed the dissemination of newly discovered knowledge?
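The data-mining questions above can be made concrete with a small example. One standard technique for mining aggregated adverse-event reports is the proportional reporting ratio (PRR), a disproportionality measure used in pharmacovigilance screening. The sketch below is illustrative only; the drug/event counts are hypothetical, and real signal detection involves additional statistical safeguards:

```python
def prr(drug_event, drug_no_event, other_event, other_no_event):
    """Proportional reporting ratio: how disproportionately an adverse
    event is reported for one drug versus all other drugs combined."""
    drug_rate = drug_event / (drug_event + drug_no_event)
    other_rate = other_event / (other_event + other_no_event)
    return drug_rate / other_rate

# Hypothetical aggregated counts pooled from a report registry.
signal = prr(drug_event=20, drug_no_event=180,
             other_event=50, other_no_event=4750)
print(round(signal, 1))  # 9.6 -- well above the conventional screening
                         # threshold of about 2, so worth investigating
```

A PRR near 1 means the event is reported no more often for the drug of interest than for comparators; values well above 2, with a sufficient case count, are conventionally flagged for expert review rather than treated as proof of causation.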


  • Additional and/or redirected funding needs to be allocated to explore and evaluate the application of new informatics tools and evolving technologies (eg, social networking) to the processes that characterize the evidence continuum as it is envisioned in this paper. New and refined data systems and methods to disseminate information must be developed to accommodate feedback loops that inform real-time practice as well as systematically generate new evidence. Approaches and systems are needed to capture data and help researchers and practitioners understand patient heterogeneity to improve treatment selection and outcomes. For example, small area variation and local context can be critical in examining and comparing patient outcomes. Geographic information systems are needed to support health services research on these types of differences and to develop locally sensitive strategies for healthcare delivery.
  • DHHS should fund additional research to study and assess the potential effects of payment and coverage policies on uptake of evidence-based findings by the practicing community.
  • Responsible data stewardship to ensure the confidentiality and security of person-specific health information should be considered as a critical enabler of the evidence continuum.
  • Mechanisms should be developed to encourage collaboration and sharing of results by researchers and practitioners working on the translation of evidence-based findings for application throughout the healthcare delivery system.

Finding 3: further research is needed to improve the creation, translation, dissemination, implementation, and validation of evidence-based clinical practice guidelines and potentially other knowledge management and application strategies

Many feel that informatics-enabled clinical practice guidelines and/or standards of care have the potential to contribute significantly to the uptake of evidence-based knowledge. However, it is important to ensure that guidelines are developed for the clinical practice areas in which they are most needed and are translated effectively into everyday practice by supporting the way clinicians think and act. An increased awareness of the processes of guideline development and translation is likely to be valuable in achieving a more realistic set of encoded guidelines.

Guidelines are frequently developed in isolation, and many steps in the implementation process reveal gaps in the guidelines themselves. It is important to understand what is and what is not executable in guidelines. Assuming that guidelines are evidence based, effective development ultimately requires specification of how data are obtained and represented. The translation process will be greatly facilitated by growth in standards for how evidence is expressed during patient encounters. In order for guidelines to be widely shared and implemented, they need to be encoded in a standardized form; authoring environments have been created to systematize the ways in which encodable guidelines are developed.55 56 Feedback loops between guideline developers and translators should help to accelerate the process. The Quality Enhancement Research Initiative (QUERI) of the Veterans Health Administration (VHA) is an example of a collaboration across key elements of a large healthcare system that has enriched the development, implementation, and evaluation of clinical practice guidelines.57 Strategies for broad dissemination as well as local adaptation of guidelines are critical for successful implementation.19 58 Further, and of great importance, links between guideline implementation and outcomes need to be established. Variability of guideline implementation (eg, what was and was not adhered to, what else was done, etc) needs to be captured, as well as the influence of implementations on outcomes.
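As a toy illustration of what "encoded in a standardized form" demands of a guideline, consider a single recommendation expressed as a computable rule. The rule content, field names, and threshold below are invented for illustration; real formalisms such as GEM and SAGE are standardized, XML-based representations, not Python:

```python
# A guideline statement becomes executable only once the data it needs
# are explicitly specified: which patient fields exist, in what units,
# and what counts as "missing." All names and values here are invented.
RULE = {
    "id": "example-lipid-screen",
    "applies": lambda pt: pt["age_years"] >= 40,       # in-scope population
    "condition": lambda pt: "ldl_mg_dl" not in pt,     # no LDL on record
    "recommendation": "Order a fasting lipid panel.",
}

def evaluate(rule, patient):
    """Return the recommendation text if the patient is in scope and
    the condition fires; otherwise None."""
    if rule["applies"](patient) and rule["condition"](patient):
        return rule["recommendation"]
    return None

print(evaluate(RULE, {"age_years": 52}))                    # fires
print(evaluate(RULE, {"age_years": 52, "ldl_mg_dl": 110}))  # None
```

Even this trivial rule exposes the translation problems noted above: the guideline author must decide how age and laboratory values are represented, how stale data are treated, and what happens when a field is absent, which is precisely why feedback loops between developers and implementers matter.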


  • Research is needed to determine the best methods of communicating and encouraging adherence to standard practices for different practice settings, clinical specialties, and types of clinical providers. Innovative strategies need to be identified to alert and train practitioners about new guidelines/updates, including use of non-traditional diffusion tools such as social networking sites, blogs, etc (one compilation lists 140 healthcare uses for the micro-sharing tool Twitter, including tracking FDA guideline updates).
  • Research should be funded to improve the rigor of all aspects of guideline development, translation, and adoption. Examples of research include enhancing the usability of guidelines by identifying the cognitive processes that experts use as they develop guidelines as well as the diverse cognitive processes of guideline users; and validating guidelines through feedback from actual practice that addresses the degree to which interventions fit within real-world healthcare delivery systems. Guidelines will need to be continuously reviewed to assure that they genuinely enhance actual performance of care and improve health outcomes.
  • A broader array of stakeholders should be engaged in the process of developing practice guidelines. For example, in addition to clinicians, participation by computer programmers, behavioral scientists, usability experts, health informaticians, workflow experts, quality improvement specialists, and patients should be sought.

Finding 4: multiple, diverse stakeholders, including patients, caregivers, and their families, play important roles in the generation, dissemination, and adoption of evidence

As the demand for “what works” in healthcare accelerates, new evidence will be produced not only by the “push” of academic centers, but by the “pull” of clinical need, comparative effectiveness research, and public demand. Achieving success requires sustained, multidisciplinary efforts combining the technical skills of clinicians, informaticians, programmers, health services researchers, epidemiologists, biostatisticians, geneticists, policymakers, and others. For example, cross-training in skill sets by practitioners of different disciplines is an important step in ensuring full participation in research innovations that are enabled by health IT. Health professionals must be trained in informatics so that they have a thorough understanding of the assets and limitations of the health IT tools and systems that are becoming increasingly prevalent throughout the healthcare system and are expected to play a major role in enabling the processes underlying the evidence continuum. Professionals working at all points in the evidence continuum will need to understand the ramifications of policy decisions at the local and national level (eg, data use, payment incentives).

We must also recognize changes in the ways that patients perceive their involvement in evidence generation, dissemination, and evaluation processes. The impact of the patient-centered movement on healthcare is broad and deep at numerous points in the evidence continuum. In 2007, the National Working Group on Evidence-Based Health Care issued a call to action advocating increased involvement of patients and consumers in setting research design priorities and clinical endpoints (eg, quality of life); reviewing evidence and identifying unanswered questions; translating findings for lay populations; and monitoring the impact of findings on patient health, outcomes, and access to care.59 In November 2009, DHHS convened a meeting on consumer health informatics, noting that consumer behavior is an essential contributor to quality improvement in healthcare. The purpose of the meeting was to create a blueprint for improving healthcare quality through enhanced behavioral support for healthcare consumers.

Rapidly growing social networking and collaborative websites are helping to enable this involvement. Examples of patient-driven activities in the research arena are plentiful: maintenance of a network of rare-disease biological resource centers (Eurordis—EuroBioBank); promotion of recruitment into clinical trials and training of activists to participate in trial design and oversight (National Breast Cancer Coalition); funding of researchers who commit to a collaborative rather than a competitive philosophy (The Life Raft Group); and assistance in pinpointing the identity of a gene for a rare inherited disorder (PXE International).


  • Cross-fertilization should be encouraged by all stakeholder groups working at various stages in the evidence continuum through collaborative professional activities, conferences, etc.
  • Education/training activities need to be implemented that help students planning to work in evidence-based research fields gain skills in a broad range of relevant areas including basic computer programming, health informatics, health services research, population health, and comparative effectiveness.60 61
  • Researchers should be encouraged to accept and place a high value on the contributions of patients as partners in the generation, translation, adoption, and evaluation of evidence-based research. Patients (including members of immigrant communities) should be involved, as appropriate, in planning and implementing research efforts, particularly in community-based settings.62


Achieving desirable levels of healthcare quality that incorporate patient safety and centeredness as well as cost effectiveness requires consistent, systematic, and comprehensive application of available evidence-based knowledge. Thus, the conduct of translational health research has become a vital national enterprise. However, multiple barriers prevent the effective translation of basic science discoveries into clinical and community practice. There are many important questions that remain unanswered and the need for evidence to support clinical practice far exceeds the healthcare research enterprise's ability to produce it.

The current framework for the generation and sharing of evidence does not adequately address this need, and AMIA meeting participants agreed that efforts are required to create a more dynamic and collaborative evidence continuum, including new approaches to translate evidence to support clinical care and to study outcomes. They advanced an alternative way to depict the evidence continuum, which we outlined in this paper. Additional refinement of this concept is invited.

One of the ways to accelerate the generation and adoption of evidence is to engage more practicing community-based clinicians as active participants in the research process. Expanding the pool of researchers in this way will help ground research in the “real world” where practitioners deal every day with unanswered questions presented by their patients related to disease prevention and treatment. However, this evidence-generation approach, as well as others discussed in this paper, raises new challenges in assessing the quality of new evidence,63 issues with regard to applying the evidence across broad and specialized populations, and other concerns that will require ongoing thoughtful consideration.

New and emerging technologies and advances in the electronic capture of clinical care information present opportunities for powering the evidence continuum. However, the healthcare profession faces numerous challenges in implementing health IT to drive improvement in health outcomes across a broad array of settings. Funding for research and evaluation of efforts to develop and deploy health IT tools is essential to assuring their successful use and integration into the health delivery systems.

In a recent commentary on the original 1992 article in JAMA that brought the term “EBM” to the attention of the overall medical community, Montori et al discussed a second fundamental principle of EBM outlined by the EBM Working Group in 2000 (the first principle being the hierarchy of evidence): “…whatever the evidence, value and preference judgments are implicit in every clinical decision. A key implication of this second principle is that clinical decisions, recommendations, and practice guidelines must not only attend to the best available evidence, but also to the values and preferences of the informed patient.”63

In keeping with this tenet as well as with the overarching principles regarding shared decision making and customization of care based on patient choice outlined by the IOM in Crossing the quality chasm,4 this paper has described an enhanced role for patients in the evidence continuum—as generators of data and as informed participants in decision making about the application of evidence-based findings to healthcare. Making this a reality will require extensive education of consumers and patients about the principles of evidence-based research and care and about the value of their personal health information for legitimate biomedical and health services research. It will require widespread acceptance of their participation by researchers and providers, as well as practical techniques to encourage their involvement to the greatest extent possible. And, it will require the needed technical strides and public policies to ensure the privacy and security of person-specific data. Communication and information technology each have roles to play in this transformation.

AMIA Board of Directors response and action

By convening the 2008 conference and disseminating this paper, AMIA has further delineated critical issues related to evidence-based practice and care and informatics. The AMIA Board of Directors reviewed the paper and endorsed its findings, conclusions, and recommendations. AMIA will continue to encourage other organizations to work collaboratively to further this important public discourse. AMIA will forward the paper and its recommendations to DHHS organizations and private sector organizations for their review and consideration.


AMIA would like to acknowledge the input of the many participants and presenters from the 2008 AMIA Health Policy Conference on which this article is based. Doug Fridsma and Suzanne Markel-Fox co-chaired the Meeting Steering Committee. Srini Kalluri, Kraig Kinchen, David Leventhal, Joyce Niland, Phil Payne, and Mark Weiner served as members of the Steering Committee. They were actively involved in and provided valuable input to all aspects of the planning processes.

The authors wish to thank Freda Temple for her careful review and editing of all versions of this manuscript. The authors would also like to thank the following individuals who reviewed earlier versions of this summary of the 2008 AMIA meeting: Doug Fridsma, Kraig Kinchen, and Suzanne Markel-Fox. Additionally, AMIA acknowledges and thanks the organizations that generously supported the 2008 meeting: Amplify Public Affairs, GSK, IMO, Lilly, PercipEnz, and Pfizer.


Competing interests: None.

Provenance and peer review: Not commissioned; externally peer reviewed.


1. Clancy CM. AHRQ's research efforts in comparative effectiveness. Statement before the U.S. House of representatives committee on ways and means, subcommittee on health. 2007
2. Sackett DL, Rosenberg WM, Gray JA, et al. Evidence based medicine: what it is and what it isn't. BMJ 1996;312:71–2 [PMC free article] [PubMed]
3. Kohn LT, Corrigan JM, Donaldson MS, eds. Committee on Quality of Health Care in America, Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academies Press, 2000 [PubMed]
4. Committee on Quality of Health Care in America, Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press, 2001 [PubMed]
5. Detmer DE, Steen EB. Information and communications technology and the future health workforce: transformative opportunities and critical challenges. In: Holmes D, ed. From education to regulation: dynamic challenges for the health workforce. Washington, DC: Association of Academic Health Centers, 2008:21–46
6. Comparative effectiveness research funding.
7. Knottnerus JA, Dinant GJ. Medicine based evidence, a prerequisite for evidence based medicine. BMJ 1997;315:1109–10 [PMC free article] [PubMed]
8. Tunis SR. A clinical research strategy to support shared decision making. Health Aff 2005;24:180–4 [PubMed]
9. Olsen L, Aisner D, McGinnis JM, eds. Roundtable on Evidence-Based Medicine, Institute of Medicine. The learning healthcare system: workshop summary. Washington, DC: National Academies Press, 2007
10. Haynes RB. What kind of evidence is it that Evidence-Based Medicine advocates want health care providers and consumers to pay attention to? BMC Health Serv Res 2002;2:3 [PMC free article] [PubMed]
11. Rogers WA. Evidence based medicine and justice: a framework for looking at the impact of EBM upon vulnerable or disadvantaged groups. J Med Ethics 2004;30:141–5 [PMC free article] [PubMed]
12. Herman WH. Evidence-based diabetes care. Clin Diabetes 2002;20:22–3
13. Westfall JM, Mold J, Fagnan L. Practice-based research—“Blue Highways” on the NIH roadmap. JAMA 2007;297:403–6 [PubMed]
14. Tierney WM, Oppenheimer CC, Hudson BL, et al. A national survey of primary care practice-based research networks. Ann Fam Med 2007;5:242–50 [PubMed]
15. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med 2003;348:2635–45 [PubMed]
16. Robert Wood Johnson Foundation, The Synthesis Project. About synthesis.
17. Blumenthal D. Testimony before the committee on small business, subcommittee on regulations and healthcare. 2009
18. Bakken S. An informatics infrastructure is essential for evidence-based practice. J Am Med Inform Assoc 2001;8:199–201 [PMC free article] [PubMed]
19. Smaha LA. American Heart Association. The American Heart Association Get With The Guidelines program. Am Heart J 2004;148(5 Suppl):S46–8 [PubMed]
21. Sim I, Sanders GD, McDonald KM. Evidence-based practice for mere mortals: the role of informatics and health services research. J Gen Intern Med 2002;17:302–8 [PMC free article] [PubMed]
22. 2d Annual NIH Conference on the Science of Dissemination and Implementation. Building research capacity to bridge the gap from science to service. 2009
23. Physician Consortium for Performance Improvement 2008 Report.
24. American Academy of Family Physicians. Evidence-Based CME. 2009
25. Mendelson D, Carino TV. Evidence-based medicine in the United States—de rigueur or dream deferred? Health Aff (Millwood) 2005;24:133–6 [PubMed]
26. Shortell SM, Zazzali JL, Burns LR, et al. Implementing evidence-based medicine: the role of market pressures, compensation incentives, and culture in physician organizations. Med Care 2001;39(7 Suppl 1):I62–78 [PubMed]
27. Committee on Redesigning Health Insurance Performance Measures, Payment, and Performance Improvement Programs, Institute of Medicine. Rewarding provider performance: aligning incentives in Medicare (Pathways to Quality Health Care Series). Washington, DC: National Academies Press, 2007
28. Medicare “Pay For Performance (P4P)” Initiatives.
29. AHRQ Effective Health Care Program. What is comparative effectiveness research?.
30. Roundtable on Evidence-Based Medicine, Institute of Medicine. Learning healthcare system concepts. 2008 annual report. Washington, DC: National Academies of Science, 2008
31. Van Weel C. Longitudinal research and data collection in primary care. Ann Fam Med 2005;3(Suppl 1):S46–51 [PubMed]
32. Lanier D. Primary care practice-based research comes of age in the United States. Ann Fam Med 2005;3(Suppl 1):S2–4 [PubMed]
33. Kho A, Zafar A, Tierney W. Information technology in PBRNs: the Indiana University Medical Group Research Network (IUMG ResNet) experience. J Am Board Fam Med 2007;20:196–203 [PubMed]
34. Nagykaldi Z, Mold JW. The role of health information technology in the translation of research into practice: an Oklahoma Physicians Resource/Research Network (OKPRN) study. J Am Board Fam Med 2007;20:188–95 [PubMed]
35. Stephens MB, Reamy BV. Primary Care Education and Research Learning Network A novel approach using an electronic medical record to identify children and adolescents at risk for dyslipidemia: a study from the Primary Care Education and Research Learning (PEARL) network. J Am Board Fam Med 2008;21:356–7 [PubMed]
36. Stead WW, Lin HS, eds. Committee on Engaging the Computer Science Research Community in Health Care Informatics, National Research Council. Computational technology for effective health care: immediate steps and strategic directions. Washington, DC: National Academies Press, 2009 [PubMed]
37. de Lusignan S, van Weel C. The use of routinely collected computer data for research in primary care: opportunities and challenges. Fam Pract 2006;23:253–63 [PubMed]
38. Tovey S, Dunne B, Witton CJ, et al. Can molecular markers predict when to implement treatment with aromatase inhibitors in invasive breast cancer? Clin Cancer Res 2005;11:4835–42 [PubMed]
39. Lowe HJ. Multimedia electronic medical record systems. Acad Med 1999;74:146–52 [PubMed]
40. Wilson AM, Thabane L, Holbrook A. Application of data mining techniques in pharmacovigilance. Br J Clin Pharmacol 2004;57:127–34 [PMC free article] [PubMed]
41. Honigman B, Lee J, Rothschild J, et al. Using computerized data to identify adverse drug events in outpatients. J Am Med Inform Assoc 2001;8:254–66 [PMC free article] [PubMed]
42. Harrison JH, Jr, Aller RD. Regional and national health care data repositories. Clin Lab Med 2008;28:101–17, vii. [PubMed]
43. Davis K, Schoenbaum SC, Audet AM. A 2020 vision of patient-centered primary care. J Gen Intern Med 2005;20:953–7 [PMC free article] [PubMed]
44. Earnest MA, Ross SE, Wittevrongel L, et al. Use of a patient-accessible electronic medical record in a practice for congestive heart failure: patient and physician experiences. J Am Med Inform Assoc 2004;11:410–7 [PMC free article] [PubMed]
45. Ross SE, Moore LA, Earnest MA, et al. Providing a web-based online medical record with electronic communication capabilities to patients with congestive heart failure: randomized trial. J Med Internet Res 2004;6:e12. [PMC free article] [PubMed]
46. Embi PJ, Jain A, Clark J, et al. Effect of a clinical trial alert system on physician participation in trial recruitment. Arch Intern Med 2005;165:2272–7 [PMC free article] [PubMed]
47. Beck M. Inexact copies: how generics differ from brand names. Wall St J 2008 April 22.
48. Graedon J. E-patients unite to document problems with generic drug.
49. Dawes M, Summerskill W, Glasziou P, et al. Sicily statement on evidence-based practice. BMC Med Educ 2005;5:1 [PMC free article] [PubMed]
50. Buetow S, Kenealy T. Evidence-based medicine: the need for a new definition. J Eval Clin Pract 2000;6:85–92 [PubMed]
51. Agency for Healthcare Research and Quality Translating research into practice (TRIP)-II.
52. National Center for the Dissemination of Disability Research (NCDDR)
53. Rabin BA, Brownson RC, Haire-Joshu D, et al. A glossary for dissemination and implementation research in health. J Public Health Manag Pract 2008;14:117–23 [PubMed]
54. Keyhani S, Siu AL. The underuse of overuse research. Health Serv Res 2008;43:1923–30 [PMC free article] [PubMed]
55. Georg G, Séroussi B, Bouaud J. Does GEM-encoding clinical practice guidelines improve the quality of knowledge bases? A study with the rule-based formalism. AMIA Annu Symp Proc 2003:254–8 [PMC free article] [PubMed]
56. Berg D, Ram P, Glasgow J, et al. SAGEDesktop: an environment for testing clinical practice guidelines. Conf Proc IEEE Eng Med Biol Soc 2004;5:3217–20 [PubMed]
57. Craig TJ, Petzel R. Management perspectives on research contributions to practice through collaboration in the U.S. Veterans Health Administration: QUERI Series. Implement Sci 2009;4:8. [PMC free article] [PubMed]
58. Eagle KA, Gallogly M, Mehta RH, et al. Taking the national guideline for care of acute myocardial infarction to the bedside: developing the guideline applied in practice (GAP) initiative in Southeast Michigan. Jt Comm J Qual Improv 2002;28:5–19 [PubMed]
59. National Working Group on Evidence-Based Health Care. Rebalancing evidence-based healthcare: the central role of patients and consumers. July 23, 2007.
60. Logan JR, Price SL. Computer science education for medical informaticians. Int J Med Inform 2004;73:139–44 [PubMed]
61. Mandl KD, Lee TH. Integrating medical informatics and health services research: the need for dual training at the clinical health systems and policy levels. J Am Med Inform Assoc 2002;9:127–32 [PMC free article] [PubMed]
62. Lin JS, Finlay A, Tu A, et al. Understanding immigrant Chinese Americans' participation in cancer screening and clinical trials. J Community Health 2005;30:451–66 [PubMed]
63. Montori VM, Guyatt GH. Progress in evidence-based medicine. JAMA 2008;300:1814–6 [PubMed]
