Am J Prev Med. Author manuscript; available in PMC 2013 September 1.
Published in final edited form as:
PMCID: PMC3592983

Bridging Research and Practice

Models for Dissemination and Implementation Research



Theories and frameworks (hereafter called models) enhance dissemination and implementation (D&I) research by making the spread of evidence-based interventions more likely. This work organizes and synthesizes these models by: (1) developing an inventory of models used in D&I research; (2) synthesizing this information; and (3) providing guidance on how to select a model to inform study design and execution.

Evidence acquisition

This review began with commonly cited models and model developers and used snowball sampling to collect models developed in any year from journal articles, presentations, and books. All models were analyzed and categorized in 2011 based on three author-defined variables: construct flexibility, focus on dissemination and/or implementation activities (D/I), and the socio-ecological framework (SEF) level. Five-point scales were used to rate construct flexibility from broad to operational and D/I activities from dissemination-focused to implementation-focused. All SEF levels (system, community, organization, and individual) applicable to a model were also extracted. Models that addressed policy activities were noted.

Evidence synthesis

Sixty-one models were included in this review. Each of the five categories in the construct flexibility and D/I scales contained at least four models. Models were distributed across all levels of the SEF; the fewest models (n=8) addressed policy activities. To assist researchers in selecting and utilizing a model throughout the research process, the authors present and explain examples of how models have been used.


These findings may enable researchers to better identify and select models to inform their D&I work.


Vast resources are invested in the development of interventions to prevent and treat disease; however, only a fraction of research products is translated to practice and policy in order to affect population health.1–3 Dissemination and implementation (D&I) science seeks to understand how to systematically facilitate deployment and utilization of evidence-based approaches to improve the quality and effectiveness of health promotion, health services, and health care.4 Although this discussion is framed largely around health fields, much of the work in D&I stems from other industries and disciplines. As the field of D&I research grows, the number of existing theories and frameworks informing this research continues to expand.

Although theories and frameworks are often presented as synonymous, they are distinct concepts. Theories present a systematic way of understanding events or behaviors by providing inter-related concepts, definitions, and propositions that explain or predict events by specifying relationships among variables.5 Moreover, theories are abstract, broadly applicable, and not content- or topic-specific.5 On the other hand, frameworks are strategic or action-planning models that provide a systematic way to develop, manage, and evaluate interventions.6 Despite their differences, theories and frameworks both enhance effectiveness of interventions by helping to focus interventions on the essential processes of behavioral change, which can be quite complex.5,7–10 For example, public health interventions that utilize health behavior theories, such as social cognitive theory and the theory of planned behavior, are more effective than interventions without a theoretic base.5,10

The importance of theories and frameworks in other areas of research (e.g., individual-level, behavioral intervention) suggests that success in D&I research will also benefit from the use of theories and frameworks. This is supported by research that demonstrates that the use of theories and frameworks in D&I research enhances interpretability of study findings and ensures that essential implementation strategies are included.11–13 For simplicity, the current paper refers to theories and frameworks (both of which are important for D&I research) collectively as models.

The roots of D&I research cut across many disciplines, including agriculture, medicine, public health, organizational behavior, psychology, political science, and marketing. The field has grown and changed since its origins several decades ago.14 The mounting interest in transdisciplinary research and increasing ease of information-sharing encourages collaboration of these diverse, but inter-related specialties. Since 2004, at least ten peer-reviewed journals across a range of scientific disciplines have devoted special issues or sections to the topic of dissemination or implementation of evidence-based practices.15 Due to the interdisciplinary nature of D&I research, there is a need to collect, organize, and synthesize the many models used to integrate evidence-based interventions and healthcare information into practice.

This paper seeks to further D&I science by providing a narrative review of models used in D&I research. D&I science is notably different from the simple dissemination of research findings that occurs at the end of a study (e.g., a press release, an issue brief, a peer-reviewed publication). Instead, D&I science seeks to investigate and better understand the complex task of spreading ideas across multiple levels of the socio-ecological framework (SEF), which may include groups at the organizational and community levels.

Models were aggregated from published literature and scientific presentations. To facilitate selection of the most-appropriate model to inform D&I study design and execution by researchers, these models are organized based on: the flexibility of a model’s constructs; whether the model is more focused on dissemination and/or implementation; and the socio-ecological level to which a model is applicable (system, community, organization, or individual); as well as whether or not the model addresses policy creation or use. Additionally, case studies are included to illustrate how models can be used to inform D&I research.

Evidence Acquisition

Dissemination and implementation research is described using several terms, many of which are used interchangeably; for example: knowledge translation, knowledge exchange, and knowledge utilization.16 The diverse range of disciplines contributing models to D&I research leads to a tremendously wide range of sources. These factors prohibited establishing a scope for this review that would comply with traditional systematic review guidelines. Therefore, a narrative approach was determined to be most appropriate for this review. Narrative reviews are useful for summarizing studies and describing “what we know,” informed by reviewers’ experiences and existing theories.17,18 The authors’ aim was to capture and carefully review a large number of existing models within the D&I field. This was accomplished through an approach divided into several phases: initial sampling; snowball sampling from the initial sample; consulting with experts; identifying categories into which models could be placed; arranging the models based on the categories; and contacting a subset of model developers to ensure that the categories were valid.

Without consensus terminology in D&I research, the starting point for the narrative review was determined by two of the study authors, who generated a list of commonly used models and model developers. Snowball sampling was then used to identify new articles through existing reviews, reference lists, and presentations delivered by the authors and available online. The search was not exhaustive and did not attempt to identify every existing model. To increase comprehensiveness, U.S. NIH officials who advise researchers submitting grant proposals for D&I research were queried for additional models.

Models published in peer-reviewed and non–peer-reviewed sources in this review are drawn from many disciplines, including innovation, organizational behavior, and research utilization. Several criteria were used to define the scope of models included in this review. The following parameters were developed by two of the study authors, who are experts in the field, to provide D&I researchers with a succinct list of the most valuable models.

The first criterion was that the model be designed for use by researchers, in contrast to practitioners or clinicians. Although the distinction between researchers and practitioners is ambiguous, researchers have been described as “knowledge creators,” and practitioners have been described as those applying knowledge in service.19 The second criterion was that the model be applicable to local-level dissemination targeting communities and organizations. Thus, models that applied only to national-level plans were excluded; these were models for which the unit of dissemination or implementation would be at a national level (e.g., a country’s dissemination plan).

The authors also excluded models that applied to only individual behavior change with no application to community- or organization-level dissemination. Since this review focuses on models for D&I research, models designed to assist only in the dissemination that occurs at the end of a research study were also excluded. Lastly, the included publications were limited to those written in English. As narrative reviews are best conducted by a team,18 two of the authors reviewed publications as well as reports of D&I research. The authors convened regular meetings to discuss the categorization and inclusion/exclusion of models.

In the process of reviewing the models, several groupings emerged. Therefore, to assist researchers in selecting a model, three author-defined variables were used to categorize the models: construct flexibility, focus on dissemination and/or implementation activities (D/I), and SEF level (Table 1). First, models were categorized based on their construct flexibility on a 1–5 scale, where 1=broad and 5=operational. Models falling between these categories were scored as 2, 3, or 4.

Table 1
Definitions of categories used to sort models

Broad models are those that contain constructs that are more loosely outlined/defined, thereby allowing researchers greater flexibility to apply the model to a wide array of D&I activities and contexts. This also places more responsibility on the researcher to carefully think through how to operationalize, implement, and use the model. Operational models provide detailed, step-by-step actions for completion of D&I research processes; these are clearly defined for a particular context and activity. Models between these two extremes contain constructs that are more detailed than those of broad models but not as detailed as those of operational models. This makes such models less flexible across contexts but more conducive to visualizing how the model may assist with study design.

To further facilitate selection, models were also categorized on a continuum from dissemination to implementation. Dissemination is the active approach of spreading evidence-based interventions to the target audience via determined channels using planned strategies. Implementation is the process of putting to use or integrating evidence-based interventions within a setting.20 Models informing D&I research fall along the spectrum from dissemination to implementation. Therefore, models were split into five categories: models that focus entirely on dissemination (D-only); more on dissemination than implementation (D>I); on both activities equally (D=I); more on implementation than dissemination (I>D); and only on implementation (I-only).

The last variable used to classify these models was the level of the SEF at which the model operates. The use of the SEF recognizes that D&I strategies may focus on changing behavior at a specific level (e.g., clinician, organization) or may cut across multiple levels. Therefore, it is important for future use of models to identify the level at which each model operates. Models were assigned as many SEF levels as were applicable, including individual, organization, community, and system. Models addressing policy, such as policy use and creation of policy, were also labeled as such.
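Taken together, the three categorization variables can be pictured as fields on a simple record, one per model. The sketch below is a minimal illustration in Python; the class name, field names, and the example entry are hypothetical and are not drawn from the review's actual data set.

```python
from dataclasses import dataclass

VALID_DI = {"D-only", "D>I", "D=I", "I>D", "I-only"}
VALID_SEF = {"individual", "organization", "community", "system"}

@dataclass(frozen=True)
class DIModel:
    """One entry in a hypothetical inventory of D&I models."""
    name: str
    construct_flexibility: int           # 1 = broad ... 5 = operational
    di_focus: str                        # position on the D-to-I continuum
    sef_levels: frozenset = frozenset()  # applicable socio-ecological levels
    addresses_policy: bool = False

    def __post_init__(self):
        # Enforce the 1-5 flexibility scale and the five D/I categories
        if not 1 <= self.construct_flexibility <= 5:
            raise ValueError("construct flexibility is rated on a 1-5 scale")
        if self.di_focus not in VALID_DI:
            raise ValueError(f"unknown D/I category: {self.di_focus}")
        if not self.sef_levels <= VALID_SEF:
            raise ValueError("unknown SEF level")

# Hypothetical entry, for illustration only
example = DIModel(
    name="Example Model",
    construct_flexibility=3,
    di_focus="D=I",
    sef_levels=frozenset({"organization", "community"}),
)
```

A record like this makes the later sorting and cross-tabulation of models a matter of grouping on the relevant fields.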

Based on these three categories, models were classified by two independent reviewers. Initial agreement for categorization of models along the spectrum from dissemination to implementation was 84% (kappa coefficient=0.79). Initial agreement for the construct flexibility scale was considerably lower: 43% (kappa coefficient=0.25). These categorizations were discussed by the independent reviewers, and discrepancies were resolved via consensus. To ensure that models were accurately described and that definitions were clear to experts in the field, a sample of model developers was contacted and presented with the category definitions and the assignment for the model they developed. Further, all model developers for whom contact information could be identified were contacted to ensure that the models presented below have accurate names and all appropriate citations.
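The agreement statistics reported above (percent agreement and the kappa coefficient) can be computed directly from two reviewers' category assignments. The sketch below shows one way to do this in Python; the ratings are invented for illustration and are not the review's actual data.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which two raters assigned the same category."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement,
    where chance is estimated from each rater's marginal distribution."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Invented ratings on the five D/I categories, for illustration only
rater1 = ["D-only", "D>I", "D=I", "I>D", "I-only", "D=I"]
rater2 = ["D-only", "D>I", "D=I", "D=I", "I-only", "D=I"]
```

With these invented ratings, the two raters agree on 5 of 6 items (percent agreement 5/6), and kappa works out to 7/9, illustrating how kappa discounts the agreement expected by chance.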

After finalizing the list of models and their categorization (Table 2), additional information about each model was abstracted: the original field in which the model was developed, the number of times the original publication has been cited, and a subset of studies, if any, that used the model to inform their design. The field of origin was ascertained by determining the model developers' stated intended use for the model. Google Scholar was used to determine the number of times the original publication had been cited. Articles identified by Google Scholar as citing the model were abstracted to identify studies in which researchers had used the model to inform the study design. The model's field of origin, the number of articles that cite the model, and studies that use the model are included in Appendix A (available online).

Table 2
Categorization of D&I models for use in research studies

Five examples of model use, selected to represent a broad range of fields, are described in greater detail within this work. As models can be applied retrospectively to inform an evaluation or prospectively to inform study design, examples of both types of model application are provided. One example, or case study, is provided here (Figure 1), with the remaining four available in Appendix B (available online). Each case study provides background about the model; how the model was applied to the specific research setting; and, when possible, information related to construct measurement.

Figure 1
Case Study 1: RE-AIM (clinic-based diabetes intervention)

Evidence Synthesis

From a total of 109 models, 26 were excluded due to a focus on practitioners rather than researchers; 12 were excluded because they were not applicable to local-level dissemination (communities or organizations); and eight were excluded because they focused on dissemination at the end of a research study rather than on D&I research. Two models were identified as duplicates and combined for inclusion. A total of 61 models were included in this review. A complete list of the models, including all three types of categorization, can be found in Table 2. This table also includes the original reference for each model as well as references to publications updating the model. The models in Table 3 are organized first by classification along the D/I continuum, then by construct flexibility. Appendix A (available online) provides additional information about the field of origin of each model, the number of times a model was cited, and studies that use the model (where available).

Table 3
Frameworks in each category when construct flexibility and D/I are cross-tabulated

Table 2 shows that each of the five categories within the construct flexibility variable was assigned to at least four models, with the greatest number of models (n=25) categorized as 3. Similarly, each of the five categories within the D/I variable was assigned to at least five models, with slight skewing toward the dissemination end of the D/I continuum. Models were distributed across all levels of the SEF, with an emphasis on the community (52 models) and organization (59 models) levels. In addition, eight models addressed policy activities.

The models are presented in Table 3 based on their classification in two categories: construct flexibility and D/I. When these two categories are cross-tabulated, several findings become apparent. Models with a greater emphasis on implementation tended to have constructs that are more operational. In contrast, there was a greater quantity and variety of dissemination-focused models (D-only, D>I). Of note, broad models were identified only for D-only or D=I activities. It is important to acknowledge that although these models are presented as distinct from each other, many of them evolved from and/or were informed by other models. Thus, although the models were divided into discrete categories, the differences among models are much more fluid.
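A cross-tabulation of this kind amounts to grouping models by the pair (construct flexibility, D/I category) and listing the models in each non-empty cell. A minimal sketch in plain Python, using hypothetical model names and category assignments:

```python
from collections import defaultdict

# Hypothetical (name, construct_flexibility, D/I category) triples,
# standing in for the categorized inventory of models
models = [
    ("Model A", 1, "D-only"),
    ("Model B", 3, "D=I"),
    ("Model C", 3, "D=I"),
    ("Model D", 5, "I-only"),
]

# Map each (construct_flexibility, D/I) cell to the models it contains
table = defaultdict(list)
for name, flexibility, di in models:
    table[(flexibility, di)].append(name)

# Render the non-empty cells, sorted for stable output
for cell in sorted(table):
    print(cell, "->", ", ".join(table[cell]))
```

Empty cells simply never appear in the mapping, which mirrors how a cross-tabulated table leaves some combinations blank.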

The case studies presented in Figure 1 and Appendixes B–E explain some models in greater detail, discuss how each model was applied to the specific research setting, and, when possible, provide information on measurement. (Note that only two of the studies, shown in Figure 1 and Appendix E, included measures.) The five case studies show the diversity present in D&I research. Within this handful of examples, the fields of study represented include obesity policy, substance use disorder treatment, and teen pregnancy prevention (Appendixes B, C, and E, respectively, available online). Further, the case studies demonstrate the many ways that a model can be applied. In three cases, a model was retrospectively applied to evaluate an existing intervention, whereas in two cases, the researchers prospectively applied models to design an intervention. Further, Appendix A (available online) provides references for studies that use a given model, where they could be identified.


The importance of using models in D&I studies cannot be overstated. Use of models not only makes a study more likely to be successful; if an existing model is used, the application also contributes to the literature on that model and enables continued distillation and better understanding of model constructs.10–13 This paper presents 61 existing models (as well as information regarding the settings and approaches to which these models are suited) to assist researchers seeking to utilize an existing model to inform their work. Although some D&I models are likely missing from this review, the models presented in Table 2 represent the entire spectrum of the construct flexibility, D/I, and SEF categories. At least four models are in each of the five D/I and construct-flexibility groups. Table 3 displays the diversity of the models and suggests the need for guidance on using the information presented in this review. Issues to consider when using Tables 2 and 3 to inform the design of a D&I study are presented below.

Using an Existing Model Versus Developing a New Model

The first consideration is the decision to use an existing model or develop an entirely new model. As the number of models presented in this review shows, researchers can choose from a wealth of existing models. There are many benefits to using an existing model: it encourages researchers to build on previous findings, and demonstrating a new application of the model increases its generalizability, thereby enhancing the field's understanding of the model and its constructs.

Since D&I research crosses numerous disciplines, finding the right fit between research needs in a particular field and existing models can be a challenge. It is possible that no existing model is well suited for a given field. In these cases, the researcher can choose to develop a new model or adapt an existing model. As this review identified 61 models, any researcher considering developing a new model should note the considerable overlap between existing models and document that the new model truly addresses a gap in the literature. Based on face validity and expert experience, when adaptation of an existing model is considered, it is essential to review the goal, setting, population, and other contextual conditions for which the model was originally developed.110 The process for selecting and using or adapting an existing model is described below.

Selecting a Model

By classifying the models using three categories (construct flexibility, D/I, and SEF), the authors sought to provide useful information to aid in the selection of an appropriate model for a D&I study. For scientists new to D&I research, who may need additional support in designing their study, the construct-flexibility variable may assist in selecting models that provide more guidance. Researchers who are considering a study that targets system-, community-, organization-, and/or individual-level changes may select models that include applications at those levels. Studies that are aimed at the entire dissemination-to-implementation spectrum can be informed by models that address both dissemination and implementation research. Lastly, researchers with an interest in policy-related D&I issues may also identify models that will assist with their thinking on policy.

The inclusion in this review of the field of origin of each model provides D&I researchers additional information when selecting a model. The innovation of a research study can be enhanced by utilizing models originally developed in different disciplines, but which may be well suited to an alternative field. This also prevents duplication of models across disciplines.

The authors believe that the provided information will improve the process of selecting an appropriate model for a D&I study. By using Table 3, based on the considerations described above, researchers can identify a list of models most appropriate for their study. If necessary, the list of potential models can be further refined using additional information (such as SEF level and field of origin) found in Table 2 and Appendix A (available online). To envision how a model can be used in their research study, researchers can look to the articles that describe the model as well as studies that have used the model; these papers should provide guidance on how exactly the model is used and on the availability of measures for the model's constructs.

Using the Selected Model

Selection of a model should occur as part of study planning and design. Once the appropriate model has been selected, it should be applied throughout the study. Several resources, including the websites of the Veterans Affairs Quality Enhancement Research Initiative,111 the National Cancer Institute's Implementation Science Team,112 the Training Institute for Dissemination and Implementation Research in Health,113 and the Canadian Knowledge Translation Clearinghouse,114 provide more-detailed guidance on how to use a selected model to inform a D&I study.

In general, the model should be considered in a study's design, aims, activities, methods, measures, and evaluation. Models can be used directly or after some modification to make them more appropriate for the study. If using the model directly, with minimal adaptation, it is important to ensure that the model is appropriate for the proposed intervention and the cultural preferences of the target population. Use of a model primarily implies conversion of the model into measurable components; this allows researchers to quantify mediators, moderators, and outcomes.20,115,116 This is easier when measures that capture the specific model constructs are available. Unfortunately, as discussed below, available measures are often lacking.

Adapting an Existing Model

A researcher will almost always adapt a model in some way; therefore, adaptation is often an important part of using a model. Adaptation often improves the appropriateness of the selected model to the intervention being disseminated or implemented, the population, and the setting.117 Further, adaptation contributes to the field by testing modifications to existing models, such as disregarding pieces shown to be ineffective or adding ones with additional evidence. Models should be viewed as living documents, or works in progress, not as static entities.

For researchers considering adapting an existing model, a number of issues are important to note. Initial identification of a D&I model to adapt should consider factors that influence the fit of a model such as the target population and/or setting (sociodemographics, geography, language, and culture) and the technology and resources needed for intervention delivery (e.g., high-speed Internet connection, media skills). In making adaptations, several types are possible.

Modifications that can be made without much hesitation include changes to wording to suit the audience, to the timeline (based on adaptation guides), or to cultural preferences based on the population. Adaptations that may be possible, but should be made with caution, include substituting activities or changing the order of steps. Adaptations that compromise the core elements of the model should not be attempted without substantial evidence to support the adaptation; this includes changing the health communication model/theory or the health topic/behavior, deleting core elements, or adding strategies that detract from the core elements. When drastic changes are made to a model, this provides an excellent opportunity for model testing, so long as the adaptations do not become a weakness of the proposed study. In studies that adapt a model, adaptations should be documented and monitored so that the impact of changes on model applicability can be reported and incorporated into the literature.

Measuring Constructs

A particularly important aspect to consider in model-informed studies is the availability of measures to assess a model's constructs. Without measures, it is impossible to operationalize a model and conduct D&I research. Because D&I is still a developing field, standard measures are lacking, and many constructs are currently assessed with open-ended questions or not assessed at all.

In addition, the small sample size of many studies prevents the development, evaluation, and use of standard measures. This difficulty is discussed by a number of authors. Damschroder et al. lay out common, overlapping constructs, which are found in many models, and note that reliable and valid measures to assess these common constructs, regardless of the model, would enhance the rigor of D&I research.104 Chamberlain et al. also discuss elements outside the constructs of the individual models that should be measured.118 Use of meta-analysis to enhance D&I measures has been inhibited by weaknesses in information about outcomes, use of dichotomous measures, and unit of analysis.119,120

Given the complexity of the issue of measurement, the authors attempted to provide examples of measurement use in the case studies. Unfortunately, only two of the case studies provide a detailed discussion of measures; this illustrates the difficulty of construct measurement. Readers can refer to the two specific case studies (Figure 1 and Appendix E, available online) for a more detailed discussion of how to measure constructs. Although there are few published studies that discuss in detail the use of construct measures, two new, increasingly important resources for researchers looking for relevant measures are the Seattle Implementation Research Conference Measures Project121 and the Grid-Enabled Measures developed by the National Cancer Institute,122 both of which are initiatives to compile, enhance, and help harmonize D&I measures.

Model Categorization

The models described in this review have been organized using a number of categories. These divisions are intended to assist the reader in model selection, rather than to provide actual classifications for models. There is substantial overlap between models, as the included constructs are often similar. This may be due to the similarity of the theoretic underpinnings (such as organizational theory, diffusion of innovation theory, and political science theory), which broadly inform D&I research.123,124 These common theoretic foundations come from many fields, provide overarching roots for many models, and further emphasize the transdisciplinary nature of the field.


This study is strengthened by the face validity and reliability provided by model developers' agreement, for a subset of models, with the categorization of the models they developed. Further, receiving input from project officers at the NIH, who guide D&I researchers on model selection, ensures that the most commonly recommended models were considered in this review. Contacting all reachable model developers to confirm that the correct model names, original citations, and updated citations were included increases confidence in the findings. Finally, this review drew from models being used across the many disciplines conducting D&I research and will facilitate innovative, transdisciplinary use of models by D&I scientists.


Since this is not a systematic review, it is impossible to ensure that all available models were included. As mentioned above, the lack of consensus terminology in the D&I research field, as well as the diverse range of disciplines contributing models to D&I research, made this type of search prohibitively broad in scope. Further, models from fields outside of health, such as education, business, and political science, may have been missed or under-represented. In addition, as it is difficult to measure the use of models in grant applications and unpublished research projects, the citation number for each model provided in Appendix A (available online) can serve only as a proxy for the popularity and use of any given model. Finally, only models published in English were included.

The current review suggests that much work remains to be done in the field of D&I research. These findings need to be spread not only to D&I researchers but also to scientists who are less versed in D&I research. Nonresearchers would also benefit from this knowledge, so that they become aware of D&I science as a field and of how D&I researchers can help them deliver the best care to those they serve. As it was beyond the scope of this review to include models targeted at practitioners, such models should be similarly inventoried and synthesized. As mentioned above, the science of D&I research is severely limited by the lack of measures available to assess the constructs in the included models; future studies in this area should work to review and compile available measures and identify gaps. There is also a lack of consistency in the terminology used to discuss this type of work. Rabin et al. have created a glossary of terms to clarify this discussion, and consistent use of language would help the field as it moves forward.20

An additional characteristic to assess in future research is whether a model is designed to guide D&I intervention development, to evaluate interventions, or both. Future directions for evidence-based decision making may also look to less-traditional methods, such as dynamic simulation, to inform implementation decision making, as suggested by Hvitfeldt Forsberg et al.125 This is truly transdisciplinary work, which charges researchers with the task of working across fields; this brings both benefits and challenges, which must be tackled as the field of D&I research continues to grow.

Supplementary Material



The authors are grateful to numerous model developers who commented on their models and the variables used for classification. They also appreciate the feedback of the Washington University Network for Dissemination and Implementation Research (WUNDIR).

This project was funded in part by cooperative agreement number U48/DP001903 from the CDC, Prevention Research Centers Program, and Grant 1R01CA124404-01 from the National Cancer Institute at the NIH. It was also supported in part by the National Center for Research Resources and the National Center for Advancing Translational Sciences, NIH, through Grants TL1RR024995 and UL1RR024992. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.


No financial disclosures were reported by the authors of this paper.

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.


1. Woolf SH. The meaning of translational research and why it matters. JAMA. 2008;299(2):211–3. [PubMed]
2. Woolf SH, Johnson RE. The break-even point: when medical advances are less important than improving the fidelity with which they are delivered. Ann Fam Med. 2005;3(6):545–52. [PubMed]
3. Balas EA, Boren SA. Yearbook of Medical Informatics 2000: Patient-centered Systems. Schattauer; Stuttgart, Germany: 2000. Managing clinical knowledge for health care improvement; pp. 65–70.
4. Eccles MP, Mittman BS. Welcome to implementation science. Implementation Science. 2006;1(1):1.
5. Glanz K, Bishop DB. The role of behavioral science theory in development and implementation of public health interventions. Annu Rev Public Health. 2010;31:399–418. [PubMed]
6. Green LW, Kreuter MW. Health program planning: an educational and ecological approach. 4th ed. McGraw-Hill; New York: 2005.
7. Ammerman AS, Lindquist CH, Lohr KN, Hersey J. The efficacy of behavioral interventions to modify dietary fat and fruit and vegetable intake: a review of the evidence. Prev Med. 2002;35(1):25–41. [PubMed]
8. Noar SM, Benac CN, Harris MS. Does tailoring matter? Meta-analytic review of tailored print health behavior change interventions. Psychol Bull. 2007;133(4):673–93. [PubMed]
9. Glasgow RE, Goldstein MG, Ockene JK, Pronk NP. Translating what we have learned into practice - Principles and hypotheses for interventions addressing multiple behaviors in primary care. American Journal of Preventive Medicine. 2004;27(2):88–101. [PubMed]
10. Bartholomew LK, Parcel GS, Kok G, Gottlieb NH, Fernandez ME. Planning health promotion programs: An intervention mapping approach. 3rd ed. Jossey-Bass; San Francisco, CA: 2011.
11. Mitchell SA, Fisher CA, Hastings CE, Silverman LB, Wallen GR. A thematic analysis of theoretical models for translational science in nursing: mapping the field. Nursing Outlook. 2010;58(6):287–300. [PMC free article] [PubMed]
12. Sales A, Smith J, Curran G, Kochevar L. Models, strategies, and tools. Theory in implementing evidence-based findings into health care practice. Journal of General Internal Medicine. 2006;21(Suppl 2):S43–9. [PMC free article] [PubMed]
13. Van Achterberg T, Schoonhoven L, Grol R. Nursing Implementation Science: How Evidence Based Nursing Requires Evidence Based Implementation. Journal of Nursing Scholarship. 2008;40(4):302–310. [PubMed]
14. Chambers D. Foreword. In: Brownson R, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; Oxford; New York: 2012.
15. Dearing JW, Kee KF. Historical Roots of Dissemination and Implementation Science. In: Brownson R, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; Oxford; New York: 2012.
16. Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Brownson RC, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; New York; Oxford: 2012.
17. Kirkevold M. Integrative nursing research: an important strategy to further the development of nursing science and nursing practice. Journal of Advanced Nursing. 1997;25(5):977–984. [PubMed]
18. McPheeters ML, Briss PA, Teutsch SJ, Truman B. Systematic reviews in public health. In: Brownson RC, Petitti DB, editors. Applied epidemiology: theory to practice. 2nd ed. Oxford University Press; New York: 2006. pp. 99–124.
19. Havelock RG. Planning for innovation through dissemination and utilization of knowledge. Centre for Research on Utilization of Scientific Knowledge, Institute for Social Research, University of Michigan; 1969.
20. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14(2):117–23. [PubMed]
21. Rogers EM. Diffusion of innovations. 5th ed. Free Press; New York: 2003.
22. Winkler JD, Lohr KN, Brook RH. Persuasive communication and medical technology assessment. Arch Intern Med. 1985;145(2):314–7. [PubMed]
23. Scullion PA. Effective dissemination strategies. Nurse Res. 2002;10(1):65–77. [PubMed]
24. Anderson M, Cosby J, Swan B, Moore H, Broekhoven M. The use of research in local health service agencies. Soc Sci Med. 1999;49(8):1007–19. [PubMed]
25. Kingdon JW. Agendas, alternatives, and public policies. Little, Brown; Boston: 1984.
26. Kingdon JW. Agendas, alternatives, and public policies. Updated 2nd ed. Longman; Boston: 2010.
27. Lester JP. The utilization of policy analysis by state agency officials. Science Communication. 1993;14(3):267.
28. Kramer DM, Cole DC. Sustained, intensive engagement to promote health and safety knowledge transfer to and utilization by workplaces. Science Communication. 2003;25(1):56.
29. Riley BL, Stachenko S, Wilson E, Harvey D, Cameron R, Farquharson J, et al. Can the Canadian Heart Health Initiative inform the population Health Intervention Research Initiative for Canada? Can J Public Health. 2009;100(1):Suppl):I20–6. [PubMed]
30. Elliott SJ, O’Loughlin J, Robinson K, Eyles J, Cameron R, Harvey D, et al. Conceptualizing dissemination research and activity: the case of the Canadian Heart Health Initiative. Health Educ Behav. 2003;30(3):267–82. discussion 283-6. [PubMed]
31. Owen N, Glanz K, Sallis JF, Kelder SH. Evidence-based approaches to dissemination and diffusion of physical activity interventions. Am J Prev Med. 2006;31(4 Suppl):S35–44. [PubMed]
32. Yuan CT, Nembhard IM, Stern AF, Brush JE, Jr., Krumholz HM, Bradley EH. Blueprint for the dissemination of evidence-based practices in health care. Issue Brief (Commonw Fund) 2010;86:1–16. [PubMed]
33. Jacobson N, Butterill D, Goering P. Development of a framework for knowledge translation: understanding user context. J Health Serv Res Policy. 2003;8(2):94–9. [PubMed]
34. Atun R, de Jongh T, Secci F, Ohiri K, Adeyi O. Integration of targeted health interventions into health systems: a conceptual framework for analysis. Health Policy Plan. 2010;25(2):104–11. [PubMed]
35. Atun RA, Kyratsis I, Jelic G, Rados-Malicbegovic D, Gurol-Urganci I. Diffusion of complex health innovations--implementation of primary health care reforms in Bosnia and Herzegovina. Health Policy Plan. 2007;22(1):28–39. [PubMed]
36. Langley GJ. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. Jossey-Bass; San Francisco: 2009.
37. Nolan K, Schall MW, Erb F, Nolan T. Using a framework for spread: The case of patient access in the Veterans Health Administration. Jt Comm J Qual Patient Saf. 2005;31(6):339–47. [PubMed]
38. Baumbusch JL, Kirkham SR, Khan KB, McDonald H, Semeniuk P, Tan E, et al. Pursuing common agendas: a collaborative model for knowledge translation between research and practice in clinical settings. Res Nurs Health. 2008;31(2):130–40. [PubMed]
39. Lomas J. Retailing research: increasing the role of evidence in clinical services for childbirth. The Milbank Quarterly. 1993:439–475. [PubMed]
40. Funk SG, Tornquist EM, Champagne MT. A model for improving the dissemination of nursing research. West J Nurs Res. 1989;11(3):361–72. [PubMed]
41. Dobbins M, DeCorby K, Robeson P, Tirilis D. Public Health Model. In: Rycroft-Malone J, Bucknall T, editors. Models and frameworks for implementing evidence-based practice: linking evidence to action. Wiley-Blackwell; Chichester: 2010.
42. Dobbins M, Ciliska D, Cockerill R, Barnsley J, DiCenso A. A framework for the dissemination and utilization of research for health-care policy and practice. Online J Knowl Synth Nurs. 2002;9:7. [PubMed]
43. Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: a framework for building evidence on dissemination and implementation in health services research. Adm Policy Ment Health. 2008;35(1-2):21–37. [PMC free article] [PubMed]
44. Robinson K, Elliott SJ, Driedger SM, Eyles J, O’Loughlin J, Riley B, et al. Using linking systems to build capacity and enhance dissemination in heart health promotion: a Canadian multiple-case study. Health Educ Res. 2005;20(5):499–513. [PubMed]
45. Kreuter MW, Casey CM, Bernhardt JM. Enhancing dissemination through marketing and distribution systems: a vision for public health. In: Brownson RC, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; New York; Oxford: 2012.
46. Martin GW, Herie MA, Turner BJ, Cunningham JA. A social marketing model for disseminating research-based treatments to addictions treatment providers. Addiction. 1998;93(11):1703–15. [PubMed]
47. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. [PubMed]
48. Harris JR, Cheadle A, Hannon PA, Forehand M, Lichiello P, Mahoney E, et al. A framework for disseminating evidence-based health promotion practices. Prev Chronic Dis. 2012;9:E22. [PMC free article] [PubMed]
49. Ward V, Smith S, Carruthers S, House A, Hamer S. Knowledge Brokering. Exploring the process of transferring knowledge into action Leeds. University of Leeds; 2010. [PMC free article] [PubMed]
50. Ward V, Smith S, House A, Hamer S. Exploring knowledge exchange: a useful framework for practice and policy. Soc Sci Med. 2012;74(3):297–304. [PubMed]
51. Ward VL, House AO, Hamer S. Knowledge brokering: exploring the process of transferring knowledge into action. BMC Health Serv Res. 2009;9:12. [PMC free article] [PubMed]
52. Ellen ME, Lavis JN, Ouimet M, Grimshaw J, Bedard PO. Determining research knowledge infrastructure for healthcare systems: a qualitative study. Implement Sci. 2011;6(1):60. [PMC free article] [PubMed]
53. Institute for Work & Health (IWH). Knowledge Transfer & Exchange Guides. 2006.
54. Lavis JN, Lomas J, Hamid M, Sewankambo NK. Assessing country-level efforts to link research to action. Bull WHO. 2006;84(8):620–8. [PubMed]
55. Lavis JN, Robertson D, Woodside JM, McLeod CB, Abelson J. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Quarterly. 2003;81(2):221–248. [PubMed]
56. Dearing JW. Social marketing and diffusion-based strategies for communicating with unique populations: HIV prevention in San Francisco. Journal of Health Communication. 1996;1(4):343–364. [PubMed]
57. Dearing JW, Maibach EW, Buller DB. A convergent diffusion and social marketing approach for disseminating proven approaches to physical activity promotion. Am J Prev Med. 2006;31(4 Suppl):S11–23. [PubMed]
58. Dodson EA, Brownson RC, Weiss SW. Policy Dissemination Research. In: Brownson R, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; Oxford; New York: 2012.
59. Orlandi MA. Health promotion technology transfer: organizational perspectives. Can J Public Health. 1996;87(Suppl 2):S28–33. [PubMed]
60. Dopson S, Mark A, editors. Leading Health Care Organizations. Palgrave Macmillan; Houndmills: 2003. Adapted by Chambers D et al. as Leading Clinical Practice Change.
61. Pettigrew AM, Ferlie E, McKee L. Shaping strategic change: making change in large organizations: the case of the National Health Service. Sage Publications; Thousand Oaks CA: 1992.
62. Nieva VF, Murphy R, Ridley N, et al. From Science to Service: A Framework for the Transfer of Patient Safety. 2005. [PubMed]
63. TIDIRH Working Group. Interacting Elements of Integrating Science, Policy, and Practice. Adapted from Ward. Training Institute for Dissemination and Implementation Research in Health; Chapel Hill, NC: Aug 1, 2011.
64. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3-4):171–81. [PubMed]
65. Green LW, Orleans CT, Ottoson JM, Cameron R, Pierce JP, Bettinghaus EP. Inferring strategies for disseminating physical activity policies, programs, and practices from the successes of tobacco control. Am J Prev Med. 2006;31(4 Suppl):S66–81. [PubMed]
66. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–74. [PubMed]
67. Farkas M, Anthony WA. Bridging science to service: using Rehabilitation Research and Training Center program to ensure that research-based knowledge makes a difference. J Rehabil Res Dev. 2007;44(6):879–92. [PubMed]
68. Farkas M, Jette AM, Tennstedt S, Haley SM, Quinn V. Knowledge dissemination and utilization in gerontology: An organizing framework. The Gerontologist. 2003;43(suppl 1):47. [PubMed]
69. Kontos PC, Poland BD. Mapping new theoretical and methodological terrain for knowledge translation: contributions from critical realism and the arts. Implement Sci. 2009;4:1. [PMC free article] [PubMed]
70. Davis D, Evans M, Jadad A, et al. The case for knowledge translation: shortening the journey from evidence to effect. BMJ. 2003;327(7405):33–5. [PMC free article] [PubMed]
71. Pathman DE, Konrad TR, Freed GL, Freeman VA, Koch GG. The awareness-to-adherence model of the steps to clinical guideline compliance: the case of pediatric vaccine recommendations. Medical Care. 1996;34(9):873. [PubMed]
72. Dreisinger ML, Boland EM, Filler CD, Baker EA, Hessel AS, Brownson RC. Contextual factors influencing readiness for dissemination of obesity prevention programs and policies. Health Educ Res. 2011 [PubMed]
73. Gholami J, Majdzadeh R, Nedjat S, Maleki K, Ashoorkhani M, Yazdizadeh B. How should we assess knowledge translation in research organizations; designing a knowledge translation self-assessment tool for research institutes (SATORI) Health Res Policy Syst. 2011;9:10. [PMC free article] [PubMed]
74. Majdzadeh R, Sadighi J, Nejat S, Mahani AS, Gholami J. Knowledge translation for research utilization: design of a knowledge translation model at Tehran University of Medical Sciences. J Contin Educ Health Prof. 2008;28(4):270–7. [PubMed]
75. Frambach RT, Schillewaert N. Organizational innovation adoption: a multi-level framework of determinants and opportunities for future research. Journal of Business Research. 2002;55(2):163–176.
76. Logan J, Graham ID. Toward a comprehensive interdisciplinary model of health care research use. Science Communication. 1998;20(2):227.
77. Logan J, Graham ID. The Ottawa Model of Research Use. In: Rycroft-Malone J, Bucknall T, editors. Models and Frameworks for Implementing Evidence-Based Practice: Linking Evidence to Action. Wiley-Blackwell; Oxford: 2010.
78. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7. [PubMed]
79. Damush TMDV, Bravata DM, Plue L, Woodward-Hagg H, Williams LS. Facilitation of Best Practices (FAB) Framework. Stroke QUERI Center Annual Report. 2008
80. Bauman AE, Nelson DE, Pratt M, Matsudo V, Schoeppe S. Dissemination of physical activity evidence, programs, policies, and surveillance in the international public health arena. Am J Prev Med. 2006;31(4 Suppl):S57–65. [PubMed]
81. Bowen S, Zwi AB. Pathways to “evidence-informed” policy and practice: a framework for action. PLoS Med. 2005;2(7):e166. [PubMed]
82. Collins C, Harshbarger C, Sawyer R, Hamdallah M. The diffusion of effective behavioral interventions project: development, implementation, and lessons learned. AIDS Educ Prev. 2006;18(4 Suppl A):5–20. [PubMed]
83. Collins CB, Jr., Johnson WD, Lyles CM. Linking research and practice: evidence-based HIV prevention. Focus. 2007;22(7):1–5. [PubMed]
84. Neumann MS, Sogolow ED. Replicating effective programs: HIV/AIDS prevention technology transfer. AIDS Educ Prev. 2000;12(5 Suppl):35–48. [PubMed]
85. Sogolow E, Peersman G, Semaan S, Strouse D, Lyles CM. The HIV/AIDS Prevention Research Synthesis Project: scope, methods, and study classification results. J Acquir Immune Defic Syndr. 2002;30(Suppl 1):S15–29. [PubMed]
86. Sogolow ED, Kay LS, Doll LS, et al. Strengthening HIV prevention: application of a research-to-practice framework. AIDS Educ Prev. 2000;12(5 Suppl):21–32. [PubMed]
87. CDC, Division of HIV/AIDS Prevention (DHAP). HIV/AIDS Prevention Research Synthesis Project. Aug 12, 2011.
88. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228–43. [PubMed]
89. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network; Tampa, FL: 2005.
90. Institute FCD National Implementation Research Network. 2008
91. Weiner BJ, Lewis MA, Linnan LA. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Educ Res. 2009;24(2):292–305. [PubMed]
92. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36(1):24–34. [PMC free article] [PubMed]
93. Klein KJ, Conn AB, Sorra JS. Implementing computerized technology: An organizational analysis. Journal of Applied Psychology. 2001;86(5):811. [PubMed]
94. Klein KJ, Sorra JS. The challenge of innovation implementation. Academy of management review. 1996:1055–1080.
95. May C, Murray E, Finch T, Mair F, Treweek S, Ballini L, Macfarlane A, Rapley T. Normalization Process Theory On-line Users’ Manual and Toolkit. 2010.
96. May C, Finch T. Implementing, Embedding, and Integrating Practices: An Outline of Normalization Process Theory. Sociology-the Journal of the British Sociological Association. 2009;43(3):535–554.
97. Murray E, Treweek S, Pope C, et al. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions. BMC Med. 2010;8:63. [PMC free article] [PubMed]
98. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Quality in Health Care. 1998;7(3):149. [PMC free article] [PubMed]
99. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci. 2008;3:1. [PMC free article] [PubMed]
100. Rycroft-Malone J. The PARIHS framework--a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304. [PubMed]
101. Pronovost PJ, Berenholtz SM, Needham DM. Translating evidence into practice: a model for large scale knowledge translation. BMJ. 2008;337. [PubMed]
102. Elwyn G, Taubert M, Kowalczuk J. Sticky knowledge: a possible model for investigating implementation in healthcare contexts. Implement Sci. 2007;2:44. [PMC free article] [PubMed]
103. Szulanski G. Exploring internal stickiness: Impediments to the transfer of best practice within the firm. Strategic management journal. 1996;17:27–43.
104. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. [PMC free article] [PubMed]
105. Damschroder L. Consolidated Framework for Implementation Research (CFIR) Wiki. Nov 8, 2010.
106. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007;2:42. [PMC free article] [PubMed]
107. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Ment Health Serv Res. 2005;7(4):243–59. [PubMed]
108. Glisson C, Schoenwald SK, Hemmelgarn A, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78(4):537–50. [PMC free article] [PubMed]
109. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23. [PMC free article] [PubMed]
110. CDC Division of Reproductive Health. Lezin N, Rolleri LA, Wilson MM, Fuller TR, Firpo-Triplett R, Barth RP. Reducing the Risk Adaptation Kit. ETR Associates; Santa Cruz, CA: 2010.
111. VA. QUERI: Quality Enhancement Research Initiative. Oct 14, 2011.
112. NCI. Implementation Science (IS) Team. Jan 8, 2012.
113. TIDIRH Training Institute for Dissemination and Implementation Research in Health. 2012 [PMC free article] [PubMed]
114. CIHR Canadian Institute of Health Research; KT Clearinghouse: 2012.
115. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am J Prev Med. 2010;38(4):443–56. [PubMed]
116. Glasgow RE, Marcus AC, Bull SS, Wilson KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(S5):1239–1250. [PubMed]
117. Allen JD, Linnan LA, Emmons KM. Fidelity and Its Relationship to Implementation Effectiveness, Adaptation and Dissemination. In: Brownson RC, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; New York; Oxford: 2012.
118. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: The Stages of implementation completion (SIC) Implement Sci. 2011;6:116. [PMC free article] [PubMed]
119. Grimshaw J, Eccles M, Thomas R, et al. Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966-1998. J Gen Intern Med. 2006;21(Suppl 2):S14–20. [PMC free article] [PubMed]
120. Proctor EK, Brownson RC. Measurement Issues in Dissemination and Implementation Research. In: Brownson R, Colditz G, Proctor EK, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. Oxford University Press; Oxford; New York: 2012.
121. SIRC. Seattle Implementation Research Conference Measures Project: A Comprehensive Review of Dissemination and Implementation Science Instruments.
122. CECCR. Grid-Enabled Measures (GEM) Database. Mar 12, 2012.
123. Champagne F, Lemieux-Charles L. Using knowledge and evidence in health care: multidisciplinary perspectives. University of Toronto Press; 2004.
124. Wilson PM, Petticrew M, Calnan MW, Nazareth I. Disseminating research findings: what should researchers do? A systematic scoping review of conceptual frameworks. Implement Sci. 2010;5:91. [PMC free article] [PubMed]
125. Forsberg HH, Aronsson H, Keller C, Lindblad S. Managing health care decisions and improvement through simulation modeling. Qual Manag Health Care. 2011;20(1):15–29. [PubMed]