

Gen Hosp Psychiatry. Author manuscript; available in PMC 2011 September 1.
PMCID: PMC2943485

Efficiency in Mental Health Practice and Research


Limited financial resources, escalating mental health related costs, and opportunities for capitalizing on advances in health information technologies have brought the theme of efficiency to the forefront of mental health services research and clinical practice. In this introductory paper to the journal series stemming from the 20th NIMH Mental Health Services Research Conference, we first delineate the need for a new focus on efficiency in both research and clinical practice. Second, we provide preliminary definitions of efficiency for the field and discuss issues related to measurement. Finally, we explore the interface between efficiency in mental health services research and practice and the NIMH strategic objectives of developing improved interventions for diverse populations and enhancing the public health impact of research. Case examples illustrate how perspectives from dissemination and implementation research may be used to maximize efficiencies in the development and implementation of new service delivery models. Allowing findings from the dissemination and implementation field to permeate and inform clinical practice and research may facilitate more efficient development of interventions and enhance the public health impact of research.

Keywords: efficiency, mental health, health services research, intervention development, implementation


In its 2001 landmark report, Crossing the Quality Chasm, the Institute of Medicine challenged the nation to develop a quality 21st century health care system by achieving six key aims [1]. Although initial progress has arguably been made toward defining and attending to five of these aims in both general and mental health -- safety, effectiveness, patient-centeredness, timeliness, and equity -- the sixth aim, efficiency, has remained largely unaddressed, especially within the realm of mental health services.

In July of 2009, the National Institute of Mental Health (NIMH) focused its 20th Mental Health Services Research Conference on the theme of “Increasing the Efficiency of Research and Mental Health Services Delivery.” The conference convened academic and community researchers and representatives to review progress in mental health services research and to advance the field by strategizing about ways to emphasize efficiency both in future research efforts and in how research informs clinical practice.

Why focus on efficiency?

Why did the NIMH choose to focus on efficiency at the current time, rather than on other key aims? First, although the cost of doing research, as measured by the Biomedical Research and Development Price Index (BRDPI), has been increasing at approximately 3% per year since 2003, the NIMH has experienced smaller increases -- or even declines -- in its operating budget since 2004, resulting in declining purchasing power [2]. In 1998, the average yearly cost of an NIMH research project grant was $234,000, while in 2007 it was $349,000 -- an almost 50% increase in less than 10 years [3]. Clearly, if research is to continue informing science and practice, methods to increase the efficient use of scarce resources are critical.

The second reason for focusing on efficiency is that, despite the skyrocketing costs of mental illness and growing expenditures on mental health care, clinical practices have, on the whole, been largely unable to demonstrate improved outcomes or cost savings. From 1992 to 2002, health care expenditures for serious mental disorders increased from approximately 63 to 100 billion dollars [4]. Despite this significant increase, costs from loss of earnings due to serious mental illness rose from 77 to 193 billion dollars during the same time period, and disability-related costs from social security and disability income rose from 16 to 24 billion dollars [4]. Faced with unsustainable expenditures, state mental health agencies have needed to cut even the most basic services.

Third, focusing on efficiency allows us to capitalize on the many rapid advances in health information technologies in order to improve the ways in which research is conducted and mental health services are delivered. Newly accessible, highly applicable technologies permit the tracking and sharing of data from clinical and community settings in real time; the ability to use synchronous or asynchronous communication to provide quality care for those who would otherwise lack access; and the development of interventions using novel platforms (e.g., home computers, smartphones, social network sites). These technologies may permit efficiencies in research and practice not previously imaginable.

Mental health services research must play a critical role in informing the most efficient use of resources. Research can lead to the development and implementation of new and effective clinical practices; knowledge regarding which key intervention components have the greatest impact; and policy changes that promote the financing and implementation of evidence-based practices. Unfortunately, however, it may take up to 17 years for research findings to impact real world settings [5]. Thus, the challenge for mental health services research is three-fold: it must be conducted more efficiently, it should inform the development and implementation of effective clinical practices, and its findings should be more efficiently translated into clinical practice and health policy.

In addressing this three-fold challenge, we need to be aware of existing incentives and attitudes that may prevent an emphasis on efficiency in research and practice. In the conduct of research, for example, rather than find ways to produce the maximum amount of information quickly for low cost, investigators may feel incentivized to budget projects for the maximum time and dollars allowable. In clinical practice, misaligned financial incentives for providers and healthcare systems may lead to waste. In translational research, although comparative effectiveness research promises to provide important information regarding the relative value of interventions, fears about the possible rationing of care have often led to the omission of cost comparisons. Many may be concerned that emphasizing efficiency will compromise other key healthcare aims – such as equity – as researchers and practitioners aim to achieve the most improvements quickly and at least cost rather than reach out to those populations in most need. Clearly, we will need to address attitudes and realign incentives in order to focus on efficiency.

A focus on efficiency is consistent with the NIMH Strategic Plan from 2008 [6]. In order to transform our understanding and treatment of mental illnesses, the NIMH identified four overarching objectives, two of which are particularly relevant to the themes of the NIMH conference and this paper. The third strategic objective calls for the development of new and improved interventions that incorporate the diverse needs and circumstances of people with mental illnesses. As we strive to accomplish this objective, we can employ what we have learned from our experience with the different -- and typically successive -- stages of model development, intervention testing, real-world implementation, and policy development to move from intervention development to implementation and uptake more quickly and efficiently. In an ensuing paper in this series, Katon and colleagues summarize their experience with the development and implementation of the collaborative care model; lessons learned may help us to maximize the efficiency of new intervention development. The fourth NIMH strategic objective calls for strengthening the public health impact of research. Epidemiologic data, dissemination and implementation experiences, and policy perspectives can be used to inform the efficient development and implementation of interventions with broad public reach. In this paper series, Kolko and colleagues and Engel et al. discuss their experiences developing interventions responsive to trauma, in settings as diverse as schools and military theaters.

Toward a construct of efficiency

The IOM defined efficiency as reducing waste in health care, and thus reducing total costs [1]. This is in keeping with the IOM’s call to improve the quality of health care by addressing overuse, misuse, and underuse of medical services. Although underuse of needed services may arguably be the critical priority area in mental health, overuse or misuse of diagnostic procedures or treatments may incur waste and unnecessary costs. For example, in the area of depression in primary care settings, overuse may occur if extensive diagnostic testing is ordered for medically unexplained symptoms that are part of depression; misuse may occur if antidepressants are prescribed for patients who have minor depression or adjustment disorders [7].

Health economists’ definitions of efficiency have been somewhat more specific, relating efficiency to the use of health care resources in such a way as to get the best value for money spent [8]. In their thorough review, McGlynn and colleagues [9] defined efficiency as the relationship between a specific product or output of the health care system and the resources or inputs used to create that product (see Figure 1). Achieving efficiency involves maximizing output for a given cost or minimizing cost for a given output.

Figure 1
Efficiency Defined as a Relationship of Outputs to Inputs
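The ratio in Figure 1 can be made concrete with a short sketch. This is a deliberately toy illustration: the two programs, remission counts, and dollar figures below are invented for exposition, and a real efficiency measure would require validated data and attention to the quality of services delivered.

```python
# Toy sketch of efficiency as an output-to-input ratio (Figure 1).
# All figures are hypothetical and for illustration only.

def efficiency(output: float, cost: float) -> float:
    """Output produced per unit of input (here, per dollar spent)."""
    return output / cost

# Two hypothetical depression-care programs producing the same
# clinical output (remissions achieved) at different costs.
program_a = {"remissions": 120, "cost_usd": 300_000}
program_b = {"remissions": 120, "cost_usd": 450_000}

eff_a = efficiency(program_a["remissions"], program_a["cost_usd"])
eff_b = efficiency(program_b["remissions"], program_b["cost_usd"])

# For equal output, the lower-cost program yields more output per
# dollar and is the more efficient one under this simplified measure.
assert eff_a > eff_b
```

The same ratio can be read in either direction, matching the definition above: holding cost fixed, a higher ratio means more output; holding output fixed, a higher ratio means lower cost.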

Before proceeding further, it is important to note that perspective is critical to consider in definitions of efficiency. Different stakeholders -- including consumers/patients, providers (physicians, hospitals), intermediaries (health plans, employers), and society as a whole -- each control a particular set of resources, or inputs, and may seek to obtain or deliver a different set of products [9]. What is efficient to an individual patient may differ from what is efficient to a specific provider or third-party payor. For example, a patient receiving medication and psychotherapy may find it most efficient to see the same physician provider for both treatments because of reduced time costs, while his or her insurance company may not, given higher fees associated with physician time relative to other mental health care providers. A physician conducting dementia evaluations may find it most efficient to obtain laboratory and neuroimaging studies on all new patients prior to meeting with them in order to improve diagnostic accuracy and timeliness at the first visit, while third party payors may prefer more targeted diagnostic testing, given the cost of extensive studies and the often low likelihood of pertinent findings. General health care providers may not find it efficient to care for mental health problems if such services are not incentivized; however, from a societal perspective, early diagnosis and treatment of mental disorders in general health care settings may produce savings in unemployment and in other service sector costs. Research funding agencies may value efficiencies in the research process itself (e.g., use of existing information systems or practice research networks, streamlining of recruitment procedures, incorporation of multiple analyses); in the development of new interventions (e.g., informed and efficient intervention development, capacity for real world implementation and sustainability); and in potential impact on public health and policy.

Outputs and inputs in mental health services and research

If efficiency involves maximizing the relationship of outputs to inputs, or resources to outcomes, how are outputs and inputs defined in relation to mental health services? Outputs may include both health services and health outcomes; in fact, health services are an intermediate output that results in specific health outcomes [9]. Measures of mental health services may include the number of persons served or the number of service visits, psychoeducation sessions, prescribed medications, psychotherapy sessions, or evidence-based components delivered. It is important to note that outputs may be measured for a single visit, for variably defined time periods, or for discrete episodes of care. For example, prescriptions may be measured for single visits, for yearly periods, or for an acute illness episode. Research-related outputs may include the development of interventions that are easily adapted by diverse end-users or that have capacity for dissemination and sustainability.

Ultimately, health outcomes may be a more pertinent output to measure, despite the fact that they are impacted not only by health services received but also by a multitude of other factors, including patient characteristics. Diverse outcomes may be disease-specific or include quality of life, patient satisfaction, employment, or academic performance. From a societal or funding agency perspective, outcomes may also include concerns such as building research infrastructure and community capacity, reducing disparities for ethnic minorities, and having policy impact.

Similar to outputs, inputs may include both physical resources and financial costs; physical resources are an intermediary input that results in financial costs [9]. Physical resources required to deliver mental health services and improve outcomes may include clinical and administrative personnel time, with attention to the most efficient mix of needed personnel. Provider mix includes not only mental health professionals but also general practitioners, consumers, and families. Physical resources also include medical equipment, supplies, and information systems. From a research funding agency perspective, resource considerations include investigator time and the potential need to develop health information systems, research infrastructure, and community partnerships.

Ultimately, physical resources, including personnel time, can be converted to costs. Doing so allows for important comparisons to be made across treatment interventions or research projects. As depicted in Figure 1, the ratio of health services or health outcomes to specific costs may eventually be calculated as a measure of efficiency. Notably, although inputs for the most part may be reducible to costs, outputs from certain treatments, interventions, or research studies may be quite varied and not as easily reduced. An intervention may improve clinical and employment outcomes while reducing disparities; a research study may develop academic and community infrastructure while having important policy impact. Although efficiency may be calculated separately for each outcome, methods are being developed in other health care sectors to incorporate multiple outputs [9].

Three types of efficiency

Three types of efficiency have been broadly defined [10]. Technical efficiency is achieved when the maximum output is produced for a given set of physical resources (output/physical resources), or when a given output is produced using fewer resources. If an individual aspect of a multi-faceted intervention is almost entirely responsible for improving outcomes, it is more technically efficient to provide the individual intervention component. If a proposed study can produce valuable knowledge using existing data systems and infrastructure, it is more technically efficient than one needing to develop these resources anew.

Productive efficiency is achieved when the maximum output is produced for a given cost (output/costs), or when a given output is produced at a lower cost. Rather than using physical resources as the measure of input, resources are translated into costs, allowing comparisons to be made. For example, for similar improvement in depressive severity, one might calculate the costs of providing antidepressant treatment or psychotherapy; the less costly treatment would have greater productive efficiency. Alternatively, for a given research cost, a study producing information on both effectiveness and implementation would have greater productive efficiency than one focusing on only a single set of outcomes.
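The distinction between technical efficiency (measured in physical resources) and productive efficiency (measured in costs) can be sketched with a toy example. All intervention names, hours, and hourly rates below are invented; the point is only that the two rankings can diverge once resource prices enter the calculation, echoing the earlier example of physician versus other mental health provider fees.

```python
# Hypothetical comparison of technical vs. productive efficiency.
# Two interventions achieve the same output (one remission) using
# different mixes of personnel time. All numbers are invented.

# Physical resources: hours of clinician time per remission.
resources = {
    "psychiatrist_delivered": {"psychiatrist_hours": 8, "therapist_hours": 0},
    "therapist_delivered":    {"psychiatrist_hours": 1, "therapist_hours": 10},
}

# Technical efficiency compares total physical resources for the
# same output: the psychiatrist-delivered model uses fewer hours.
total_hours = {name: sum(mix.values()) for name, mix in resources.items()}
assert total_hours["psychiatrist_delivered"] < total_hours["therapist_delivered"]

# Productive efficiency translates resources into costs using
# assumed hourly rates, allowing comparison in dollars.
rates = {"psychiatrist_hours": 250, "therapist_hours": 100}

def cost(mix: dict) -> int:
    """Total cost of a personnel mix at the assumed hourly rates."""
    return sum(hours * rates[resource] for resource, hours in mix.items())

costs = {name: cost(mix) for name, mix in resources.items()}

# 8 * 250 = 2000 vs. 1 * 250 + 10 * 100 = 1250: once prices enter,
# the therapist-delivered model becomes the more efficient choice.
assert costs["therapist_delivered"] < costs["psychiatrist_delivered"]
```

Under these assumed rates, the technically efficient option (fewest total hours) is not the productively efficient one (lowest cost), illustrating why the two concepts must be kept distinct.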

Critical to the current health care debate, allocative efficiency refers to maximizing societal good with a given set of resources. The goal is to maximize community welfare; the output is not only improved health outcomes but also how these outcomes are distributed across society. When strictly interpreted, allocative efficiency is achieved only when allocating resources any differently would make at least one person worse off. However, because this would preclude changes that make many people much better off at the expense of making a few people slightly worse off, the definition of allocative efficiency has been adapted to describe a state in which resources -- in this case, both for research and clinical practice -- are allocated in such a way as to maximize community welfare [10].

How do technical, productive, and allocative efficiency relate to one another? As described by Palmer and Torgerson [10], an intervention with allocative efficiency is usually productively efficient, and one with productive efficiency is usually technically efficient. The inverse of these relationships is not necessarily true, however. For example, productively efficient interventions that disproportionately impact a segment of the population may deepen disparities and defy allocative efficiency. Especially in environments with few resources, technically efficient interventions that make the best use of what is available may not be productively efficient.

How can we measure efficiency?

As we develop a focus on efficiency in mental health care practice and research, it will be critical to advance our measurement capacity in this area. Efficiency can only be maximized if it can be assessed and tracked. In their thorough review, Hussey and colleagues [11] conclude that no consensus set of health care efficiency measures currently exists. In addition, those measures that do exist have significant limitations [11]. First, the measures used in real world settings and those that have been scientifically studied and validated diverge widely; measures in common use have typically not undergone validation studies. Second, existing measures have mostly taken the vantage point of the payor or employer and focused on the efficiency of hospitals or providers. Provider, consumer, and societal perspectives have not been widely represented, nor has that of research funding agencies. Third, current measures typically fail to account for the quality of specific services; they may focus on the quantity of services provided or the number of patients served without incorporating a measure of the quality of services rendered. Thus, a system design that results in more patients being served in less time might be considered more efficient than one delivering higher quality services to somewhat fewer persons. Finally, most existing measures are resource and time-intensive, and thus do not lend themselves easily to guiding system improvements in real time.

Advancing the field of efficiency in mental health services practice and research will thus require the development of valid and useful measures. These will need to capture those inputs and outputs that are most important to different stakeholders, and should incorporate concern for quality. Measures will need to be reliable and reproducible; feasible in relation to burden and cost; and actionable so as to be used in real time to impact practice. Measurements should enable the comparison of different treatments, interventions, or system designs in order to enable stakeholders to make meaningful choices regarding mental health care practices.

Integration of the efficiency theme with the NIMH Strategic Plan Objectives

How can efficiency be achieved in the development of mental health interventions and in services research? We previously underscored the three-fold need to more efficiently conduct research; develop and implement effective clinical practices; and translate research findings into clinical practice and health policy. Typically, as depicted in Figure 2, intervention research proceeds along a fairly linear continuum of basic research, efficacy trials, effectiveness studies, and widespread dissemination [12,13,14]. However, starting with the ultimate goal in mind -- dissemination -- may lead to greater efficiency along this continuum. An early understanding of the contextual factors -- relating to patients, providers, organizations, and communities -- that will likely influence successful dissemination can promote the development of those interventions most likely to be implemented and have public health impact. Rather than spending significant resources to develop and test treatments that may ultimately prove difficult to implement and disseminate, greater efficiency may be achieved when considerations related to widespread dissemination permeate the entire process rather than being confined to its final stages. These points are highlighted in the following case study.

Figure 2
Stages of Intervention Research and Potential Feedback from Dissemination and Implementation Research to Inform Early Stages of Intervention Development

How clinical epidemiologic data from real world practice settings can inform efficiencies in the development of small molecules targeting the secondary prevention of PTSD

A 25-year-old Marine in Iraq suffers multiple lumbar and left lower extremity fractures when his Humvee hits an explosive device. Over the course of 3–4 months, he is transferred from Baghdad, Iraq to Landstuhl, Germany to Walter Reed Hospital in Washington, DC and finally to an Army Medical Center on the West Coast of the United States. After being told that his left foot will need to be amputated, he requests a second opinion and is transferred to an urban level I trauma center, where he is seen by the psychiatry consultation service due to symptoms of pain, anxiety, and depression.

What evidence-based treatment exists for this man, and where and when might it be delivered, given his multiple medical treatment sites over the course of several months? In its evidence-based review of treatments for PTSD, the IOM [15] strongly endorses cognitive-behavioral therapy; however, this treatment is very difficult to deliver across multiple, acute medical settings in the early phases of post-injury care. As demonstrated by this case study, we have an urgent need to reach persons who are currently suffering by expediently and efficiently developing empirically supported treatments that can be feasibly delivered in unique post-traumatic contexts such as acute care medical settings.

In a recent review paper, NIMH Director Dr. Thomas Insel outlined the stages of intervention development for new pharmacologic treatments, commencing with the basic scientific development of small molecules and ending with clinical trials and eventually widespread implementation [13]. For the secondary prevention of PTSD, a theoretical rationale has been provided for exploring the use of a diverse group of candidate compounds, including corticosteroids, beta-adrenergic antagonists, and opiate analgesics. Corticosteroids and beta-adrenergic antagonists have both been recently selected for initial efficacy trials. However, using readily available, inexpensive data from population-based trauma registry information systems, Zatzick & Roy-Byrne [16] found that at the time of hospital discharge from inpatient settings, 80–90% of trauma survivors were receiving opiate analgesics and 30–45% were receiving non-opiate analgesics, in contrast to less than 5–10% receiving corticosteroids and beta-adrenergic antagonists.

This pharmacoepidemiologic study underscores the ubiquitous use of analgesic medication in acute care settings and is consistent with population-based phenomenological studies suggesting that patients’ primary concerns in the days and weeks following traumatic injury are physical pain and bodily integrity [17]. A recent investigation found that opiate administration in wounded combat veterans may decrease later PTSD symptoms [18]. These clinical epidemiologic studies suggest that initial efficacy trials should include analgesics for the prevention of secondary PTSD; results might also stimulate basic research on compounds targeting both pain and anxiety. As depicted in Figure 2, this example demonstrates how population-based data from real world practice settings – from contexts in which treatments will be disseminated – can inform the early stages and efficiency of treatment development.

Similarly, dissemination and policy experiences from disparate fields can feed back to inform basic investigations and intervention development in mental health services. At the NIMH conference, Roy Cameron, PhD, from the Centre for Behavioral Research and Program Evaluation (CBRPE) demonstrated how evaluation findings from adolescent tobacco-related programs conducted in Canadian schools informed the development and implementation of future population-level programs. Working closely together, interdisciplinary teams of social actors, researchers, and policy makers can inform program development through a process of intervention implementation, evaluation, and program and policy revision based on findings. Field and research experiences with dissemination and implementation facilitate the more efficient development of interventions for diverse populations while also enhancing the public health impact of mental health research and services delivery.

Introduction to journal series

The plenary sessions of the 20th NIMH Mental Health Services Research Conference convened researchers, practitioners, and policymakers to discuss how to maximize efficiencies in the development and implementation of service delivery models and in response to public health challenges. The following papers in this series present the highlights of those sessions, with implications for future developments in mental health services research and practice.

The first, by Katon et al, describes the development, testing, implementation, and policy implications for the collaborative care model for depression. Lessons learned from different stages in this process suggest opportunities for further efficiencies in mental health service delivery and research.

The second, by Kolko et al, discusses emergent research on the impact of trauma on the mental health of the US population. Drawing from work on interventions for children and adolescents and from studies in response to large-scale traumatic events (e.g. terrorism, hurricanes), authors present ways in which research can maximally benefit public mental health.

Finally, the series includes two commentaries from the leadership of NIMH and from the journal editor. The former contextualizes the theme of efficiency within the mission of the Institute and its recent strategic plan, while the latter applies the theme to the Journal’s mandate.


Limited financial resources, rising costs for both mental health research and clinical practice, and the development of new technologies have spurred a new emphasis on efficiency, one of the six key aims specified by the IOM for developing a quality mental health system [1]. Definitions of efficiency in mental health research and clinical practice are still in preliminary stages, and measurements are in need of further development. Findings from dissemination and implementation studies may facilitate efficiencies in intervention development. By ensuring that available dollars for research and mental health service delivery are most effectively and efficiently spent, scientific discoveries and improvements in technology and field capacity can significantly alter the trajectories of those with mental disorders, and improve the response to the tremendous public health need within our society.




1. Committee on Quality of Health Care in America, Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.
2. Bureau of Economic Analysis, Department of Commerce. Biomedical Research and Development Price Index (BRDPI), Table of Annual Values. NIH Office of Budget; Feb 9, 2010. <>.
3. National Institute of Mental Health (NIMH). Inside NIMH: funding news for current and future NIMH awardees. NIMH; Mar 9, 2008. Accessed Feb 9, 2010. <>.
4. Insel TR. Assessing the economic costs of serious mental illness. Am J Psychiatry. 2008;165:663–5.
5. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT, editors. Yearbook of Medical Informatics 2000: Patient-Centered Systems. Stuttgart, Germany: Schattauer Verlagsgesellschaft mbH; 2000. pp. 65–70.
6. U.S. Department of Health & Human Services, National Institutes of Health. National Institute of Mental Health Strategic Plan. NIH Publication No. 08-6368. Bethesda, Maryland: National Institutes of Health; 2008 (revised).
7. Katon WJ. The Institute of Medicine “chasm” report: implications for depression collaborative care models. Gen Hosp Psychiatry. 2003;25:222–229.
8. Williams A. Priority setting in public and private health care: a guide through the ideological jungle. J Health Economics. 1988;7:173–183.
9. McGlynn E, Shekelle PG, Chen S, et al. Health Care Efficiency Measures: Identification, Categorization, and Evaluation. AHRQ Publication No. 08-0030. Rockville, MD: Agency for Healthcare Research and Quality; Apr 2008.
10. Palmer S, Torgerson DJ. Definitions of efficiency. BMJ. 1999;318:1136.
11. Hussey PS, de Vries H, Romley J, et al. A systematic review of health care efficiency measures. Health Serv Res. 2009;44(3):784–805.
12. Zatzick DF, Galea S. An epidemiologic approach to the development of early trauma focused intervention. J Trauma Stress. 2007;20(4):401–12.
13. Insel TR. Translating scientific opportunity into public health impact: a strategic plan for research on mental illness. Arch Gen Psychiatry. 2009;66(2):128–133.
14. National Institute of Mental Health. Bridging science and service: a report by the National Advisory Mental Health Council’s Clinical Treatment and Services Research Workgroup. NIH Publication No. 99-4353. Bethesda, Maryland: National Institute of Mental Health; 1999.
15. Committee on Treatment of Posttraumatic Stress Disorder (PTSD), Institute of Medicine. Treatment of PTSD: an assessment of the evidence. Washington, DC: National Academy Press; 2008.
16. Zatzick D, Roy-Byrne PP. From bedside to bench: how the epidemiology of clinical practice can inform the secondary prevention of PTSD. Psychiatr Serv. 2006;57(12):1726–30.
17. Zatzick D, Kang SM, Hinton WL, et al. Posttraumatic concerns: a patient-centered approach to outcome assessment after traumatic physical injury. Med Care. 2001;39(4):327–39.
18. Holbrook TL, Galarneau MR, Dye JL, Quinn K, Dougherty AL. Morphine use after combat injury in Iraq and post-traumatic stress disorder. N Engl J Med. 2010;362(2):110–7.