J Behav Health Serv Res. Author manuscript; available in PMC Apr 1, 2011.
PMCID: PMC2850956
NIHMSID: NIHMS47742

Using Administrative Data for Longitudinal Substance Abuse Research

Abstract

The utilization of administrative data in substance abuse research has become more widespread than ever. This selective review synthesizes recent extant research from 31 articles to consider what has been learned from using administrative data to conduct longitudinal substance abuse research in four overlapping areas: (1) service access and utilization, (2) underrepresented populations, (3) treatment outcomes, and (4) cost analysis. Despite several notable limitations, administrative data contribute valuable information, particularly in the investigation of service system interactions and outcomes among substance abusers as they unfold and influence each other over the long term. This critical assessment of the advantages and disadvantages of using existing administrative data within a longitudinal framework should stimulate innovative thinking regarding future applications of administrative data for longitudinal substance abuse research purposes.

Keywords: administrative data, longitudinal research, substance abuse treatment outcomes, health services utilization

INTRODUCTION

Ample evidence indicates that decreases in substance abuse are associated with improvements in functioning across a broad range of areas, and substance abuse itself is widely acknowledged as a disorder that is often chronic and requires long-term care.1–5 To comprehensively assess the impact of chronic abuse of drugs and to track the course of recovery, researchers increasingly affirm the need to focus on the long-term interplay of multiple events associated with substance abuse over time (e.g., health services utilization, psychosocial mediation factors, criminal activity).

While there are notable exceptions,6–8 much of the research on substance abuse treatment and service utilization is limited in that it mainly relies on self-reported information collected from treated patients over a short period of time, usually at or soon after treatment discharge. Furthermore, information is typically gathered via follow-up interviews with participants, an expensive and time-consuming endeavor made even more challenging by inadequate resources for re-locating sufficient numbers of research participants, cognitive and technical factors that influence the accuracy of self-reported data,9 and increasingly restrictive regulatory criteria limiting re-contact methods.10 There are benefits to collecting self-reported information, particularly when studying a wide variety of stigmatized behaviors (e.g., illicit drug use, criminal activity, HIV risk behavior) that can often only be assessed by talking directly with research participants. The validity and limitations of self-reported information have received much attention in the literature.11–16 However, there are few comparable critical assessments of the pros and cons of using administrative data as an alternative or complementary data source for longitudinal research on substance abuse.

Administrative data, defined as existing data routinely collected primarily for non-research purposes, can be thought of as an “official” record of events as they occur. Examples of administrative data include utilization of services for medical or psychiatric problems, receipt of public welfare benefits, insurance claims, and criminal justice system records on arrests, convictions, and incarcerations. Administrative data can be difficult to obtain and link across systems,17 and linkage methods (e.g., deterministic vs. probabilistic) must be carefully considered.18,19 Despite the ethical, legal, and practical limitations to its use,20 one principal advantage of administrative data is that it can be employed to examine events, service system interactions, and outcomes as they unfold and influence one another over the long term. Another benefit is that administrative data can provide information on individuals who may be characterized by unique needs and experiences but who, as a group, usually present too small of a sample for disaggregation in statistical analyses.21 Moreover, mining existing data presents a cost-effective opportunity to take full advantage of readily available resources. Finally, advancements in technology and statistics have diminished technical difficulties associated with sharing and manipulating large, and often messy, administrative datasets. For these and other reasons, the use of administrative data for research purposes has become more widespread than ever.
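The distinction between deterministic and probabilistic linkage mentioned above can be sketched in a few lines. The following is a minimal illustration of the general idea, not drawn from any of the cited studies; all field names, records, and agreement weights are hypothetical:

```python
# Hypothetical sketch: deterministic linkage requires exact agreement on key
# identifiers; probabilistic linkage sums field-level agreement weights and
# accepts pairs whose total score exceeds a chosen threshold.

def deterministic_match(a, b):
    """Exact agreement on a fixed set of identifiers."""
    return (a["id"] == b["id"]
            and a["dob"] == b["dob"]
            and a["last_name"].lower() == b["last_name"].lower())

def probabilistic_score(a, b, weights):
    """Sum agreement/disagreement weights; higher scores mean a likelier match."""
    score = 0.0
    for field, (agree_w, disagree_w) in weights.items():
        score += agree_w if a[field] == b[field] else disagree_w
    return score

# Hypothetical weights, in the spirit of log-likelihood ratios in a
# Fellegi-Sunter-style model (values chosen for illustration only)
WEIGHTS = {"dob": (4.0, -2.0), "last_name": (3.0, -1.5), "zip": (1.5, -0.5)}

rec1 = {"id": "123", "dob": "1970-01-01", "last_name": "Smith", "zip": "98101"}
rec2 = {"id": "123", "dob": "1970-01-01", "last_name": "Smith", "zip": "98101"}

assert deterministic_match(rec1, rec2)
assert probabilistic_score(rec1, rec2, WEIGHTS) > 5.0  # above an accept threshold
```

In practice, probabilistic methods tolerate the partial disagreements (misspellings, transposed digits) that make purely deterministic matching fragile across agency datasets.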

The health and social services field, and the criminal justice system, have long maintained databases for tracking phenomena by unique individual over time, and stakeholders in these disciplines have frequently identified administrative data as a rich resource for policy-relevant research opportunities.22–31 For example, administrative data has been used to examine the impact of managed care on health service access and utilization,32,33 to identify discrepancies between the prevalence of medical illness and service utilization,34 to document and describe the prevalence of criminal restraining orders,35 and to explore a wide range of health care topics, particularly those pertinent to veterans.36–39 Numerous studies have relied upon administrative data to examine issues related to mortality and substance abuse.40–48 Existing data has also been used to conduct cost-effectiveness and cost-offset studies,49 and it is in this area that administrative data was first extensively used in the substance abuse research field.50–52 Since these early economic evaluations, the substance abuse research field has continued to capitalize on administrative data to generate empirical evidence to inform practice and policy (e.g., Hser, Teruya, Brown, Huang, Evans, & Anglin, 2007),2 particularly as part of efforts to implement and evaluate treatment outcome monitoring systems.

Recognizing the need to monitor treatment outcomes,53 in 1998 the Center for Substance Abuse Treatment (CSAT) at the Substance Abuse and Mental Health Services Administration (SAMHSA) initiated the Treatment Outcomes and Performance Pilot Studies Enhancement (TOPPS II), a project that pilot-tested substance abuse treatment outcome monitoring systems in 19 states. Implementing data-driven systems in substance abuse treatment settings was difficult54–58 and relatively expensive.59 Concurrent with TOPPS II, existing administrative data was identified as an alternative to primary data collection for examining service utilization and measuring treatment performance and outcomes,21 an idea that subsequently garnered support among other substance abuse researchers.60–62 Most states that participated in TOPPS II expanded or refined existing state-level drug treatment data collection efforts and implemented collection of self-reported follow-up data; however, a few states (e.g., California, Maryland, Oklahoma, Washington) used administrative data in their research designs.63,64

More recently, the types of administrative data that are available for substance abuse treatment performance measurement have been well-documented,65–67 and some states (e.g., Oklahoma, Washington)68 have made integration of administrative data into evaluation and research efforts ever more routine. Researchers continue to draw on existing data to conduct innovative studies that contribute to and expand what is known about the prevalence and nature of substance abuse and its association with other life events. Curiously, however, since Alterman et al.’s 2001 article,60 there has been no critical assessment of lessons learned from the use of administrative data for substance abuse research purposes, nor of the advantages and disadvantages of using administrative data within a longitudinal framework.

The purpose of this paper is to synthesize recent extant research that used administrative data in longitudinal substance abuse studies so as to (1) consider what has been learned from such use, (2) review the limitations of using administrative data in longitudinal substance abuse studies, and (3) discuss future directions. Findings should stimulate critical thinking regarding the scope of opportunities generated by using administrative data for longitudinal substance abuse research.

METHODS

PubMed and PsycINFO were searched for substance abuse research articles published after 1999. Search terms included “substance abuse,” “administrative data,” “record or data linkage,” and “performance monitoring.” Studies were considered to be longitudinal if they covered at least 1 year of follow-up, although studies with follow-ups of 3 to 6 years were more common.

The 31 articles that were found are organized into four overlapping topic areas (see Table 1): service access and utilization (10 articles), underrepresented populations (11), treatment outcomes (4), and cost analysis (6). Within each domain, the focus is on representative articles that illustrate the depth and breadth of administrative data applications recently conducted within the substance abuse research field. Each study’s main findings are briefly summarized and a critical assessment of the contributions and lessons learned for longitudinal research is provided in each area as well. The review ends with a summary of how administrative data can be applied within a longitudinal research design to enhance knowledge, followed by a discussion of implications for future longitudinal research on substance abuse treatment/service utilization and outcomes.

Table 1
Selected substance abuse research published after 1999 that utilized administrative data

FINDINGS

Service access and utilization

Substance abusers with multiple needs often access care provided by several service systems, and, over time, individuals may have a history of a variety of healthcare experiences with varying results. But because each service system is usually a separate and distinct entity, it can be difficult to document the full constellation of services received, much less understand how utilization of services offered by different systems may have impacted outcomes over time. Several studies have combined administrative data from multiple agencies to broaden and extend our understanding of service access, utilization, and outcomes.

Researchers have used administrative data, combined with self-reported data, to strengthen study designs. Lundgren et al. (2005)69 linked self-reported interview data with statewide claims data on substance abuse treatment and health insurance to examine relationships over 6.5 years between drug treatment, health service use, HIV status, and emergency room and hospital use. The combined dataset generated complementary information on factors that predict costly events. Use of emergency room and hospital services was positively associated with mental health status, drug use severity, and having private health insurance. In another example, Rosen et al. (2007)70 combined interview data with administrative records to determine whether assignment of a payee to receive funds affected clinical outcomes for 1,457 mentally ill individuals in 9 states over 12 months. Results showed that beneficiaries with a payee had more serious substance abuse and mental health problems, received more psychiatric services and a broader range of services, and demonstrated a greater reduction in substance use. Ray, Weisner, and Mertens (2005)71 used administrative data to adjust for a variety of characteristics when examining the relationship between receipt of psychiatric services and 5-year drug treatment outcomes. Analysis of linked data on psychiatric services and drug treatment for 604 individuals served by a California health maintenance organization showed that over 5 years, drug abstinence was more likely among patients who received an average of 2.1 hours of psychiatric services per year.

Researchers have also used administrative data to document changes in patterns of care over time. Maynard et al. (2000)72 linked existing state records from multiple sources to examine service utilization over 4 years by 735 patients involuntarily committed to substance abuse treatment in Washington. Drug treatment completion was associated with a decreased likelihood of using acute care services and an increased likelihood of participating in post-discharge outpatient treatment. Of most interest for the purposes of this paper, utilization of some services was actually greater in the year immediately following treatment discharge than in the prior year; only by analyzing administrative data covering the second and third years after discharge were eventual decreases, to below pre-admission levels, revealed. Similarly, Lundgren et al. (2006)3 used administrative drug treatment data to explore service utilization patterns among 22,006 injection-drug-using treatment repeaters in Massachusetts over a 5-year period. Findings revealed variation in patterns of care, an overuse of detoxification alone, and an underutilization of the state’s continuum-of-care model.

Finally, administrative data has been used to examine the impact of policy changes on the utilization of care. As one prime example, records from Oregon’s substance abuse treatment system and the state’s Medicaid eligibility and enrollment data were linked on more than 500,000 subjects over 4 years to analyze the impact of the shift from fee-for-service financing to managed care on access to publicly funded substance abuse treatment.73 Investigators found that access to drug treatment actually increased significantly under managed care, an unexpected consequence of this statewide health policy change. In subsequent studies, Oregon’s integrated administrative database was instrumental in examining the impact of policy decisions regarding coverage for drug treatment as a Medicaid benefit. When coverage was expanded, the number of opiate users enrolled in methadone maintenance increased.74,75 But when eligibility for benefits was later reduced, administrative data was used to uncover how the neediest patients were the most negatively impacted76 and also how changes in coverage both reduced new admissions to opiate treatment and also decreased the likelihood of a methadone placement for people who did present for treatment.77

Lessons learned for longitudinal research

These studies illustrate several advantages of using administrative data to conduct longitudinal research on substance abuse service access and utilization. First, administrative data provide empirical evidence not just on how substance abuse treatment impacts later drug use, but also how treatment/service utilization relates to changes in other types of behavior and related events over time. As demonstrated by these studies, utilization of services provided by one system can have a “ripple effect,” both in utilization of services provided by a sister system (e.g., the decreased use of medical and mental health treatment due to drug treatment) and also in outcomes felt by other systems (e.g., the impact of changes in Medicaid coverage on drug treatment admissions). Administrative data is also instrumental for observing system-wide impacts of large policy changes (i.e., changes in managed care or Medicaid enrollment policies) that result in unexpected or unintended consequences. Also revealed are relationships between outcomes and degree of service exposure (i.e., treatment retention and completion, not just treatment entry itself), in addition to how those relationships change over time (e.g., service utilization can increase just after drug treatment but then eventually decrease below pretreatment levels when observation periods are extended). Similarly, administrative data allows analyses to incorporate policy-level factors not often included in outcome studies, such as the impacts of private health insurance, use of prescription medication, and HIV status. Finally, administrative data can be used to strengthen study designs, for example, through adjustment for group differences, and can be combined with self-reported information to provide more complete information on participants. In conclusion, synthesis of administrative data from separate but overlapping service systems generates new knowledge that is comprehensive, timely, and policy-relevant, enhancing the ability of researchers to understand longitudinal trends in healthcare utilization and outcomes.

Review of these studies also raises the issue of “history” effects,78 which must be considered when using administrative data for longitudinal research. Changes in behaviors and outcomes observed over time may be a function of changes in access to treatment/services, perhaps due to policy changes that influence funding levels, changes in eligibility for services (e.g., the actual change in SSI eligibility in which addiction was no longer considered a disability), or economic conditions that influence behaviors (e.g., employment, insurance enrollment). Although some studies using administrative data employ an “interrupted time series” design, in which behaviors or outcomes are compared in periods prior to and after a defined event that directly influences the observed outcomes, in other cases, historical events may not be adequately accounted for in the interpretation of findings. It must be kept in mind that administrative data reflects only those events that come to the attention of the system providing the data. That is, administrative data may provide a wealth of information about clients who used a set of services, but an inherent limitation is that individuals who needed the service but did not access it are excluded. It follows that findings resulting from administrative data analysis often reflect those who have been in treatment rather than the general community.
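The interrupted time series logic described above can be illustrated with a minimal segmented-regression sketch. The monthly series and the size of the intervention effect below are simulated purely for illustration; nothing here comes from the reviewed studies:

```python
import numpy as np

# Hypothetical monthly service-utilization counts: 12 months before and
# 12 months after a policy change at t = 12 (simulated data).
months = np.arange(24)
post = (months >= 12).astype(float)              # indicator: after the policy change
time_since = np.where(post == 1, months - 12, 0) # months elapsed since the change

# Simulated series: baseline upward trend, then a level drop at the intervention
rng = np.random.default_rng(0)
y = 100 + 0.5 * months - 15 * post + rng.normal(0, 1, 24)

# Segmented regression: y = b0 + b1*time + b2*post + b3*time_since
# b2 estimates the immediate level change; b3 estimates the change in trend.
X = np.column_stack([np.ones(24), months, post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level change at intervention: {coef[2]:.1f}")
```

The point of the design is that the pre-intervention segment serves as its own counterfactual trend, which is exactly what a single before/after comparison of administrative records cannot provide.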

Underrepresented populations

Administrative data often contains information on the entire population served by the system, facilitating the exploration of issues specific to ethnic minority or other small populations that otherwise might not be examined, partly because of insufficient sample size. As one example, the California Treatment Outcome Project (CalTOP) combined self-reported information with administrative data from four sources (adult lifetime records on arrests, mental health service utilization, drug treatment histories, and driving records) on more than 20,000 individuals admitted to publicly funded drug treatment.63 The resulting dataset not only permitted analysis of emerging substance abuse policy issues,79 but it was also used in several investigations exploring issues unique to a number of populations historically underrepresented in substance abuse research, including Hispanics,80 American Indians,81 Asian Americans,82 methamphetamine users,83 women treated in women-only versus mixed-gender programs,84,85 dually diagnosed patients,86 and mothers involved with the child welfare system.87 Discussing the design and findings of each CalTOP article is beyond the scope of this paper. Instead, CalTOP is cited as an example of how the combination of multiple sources of existing records generates a versatile dataset with enough sample size, depth, and breadth to permit analyses of complex issues unique to often neglected groups.

Green, Rockhill, and Furrer (2006)88 used administrative data to examine issues pertinent to a hard-to-study population, substance abusing women involved with the child welfare system, while also documenting the complex interplay of events, system interactions, and outcomes over time. The study examined 3 years of data on 1,911 women involved in Oregon’s child welfare and drug treatment systems. Data was used to track events over time and also to control for a number of diverse pretreatment characteristics such as treatment and child welfare history, and substance abuse frequency and chronicity. Results indicated that when women entered drug treatment sooner after the date of having a child placed in substitute care, spent more time in treatment, or completed treatment, their children spent fewer days in foster care and were more likely to be reunified with their parents.

In another example of using administrative data to conduct longitudinal research on an understudied population, Claus, Orwin, Kissin, et al. (2007)89 analyzed 6 years of Washington State’s treatment admission and discharge data on more than 1,500 individuals to examine differences in continuity of care between women with children who entered specialized women-only residential treatment versus standard mixed-gender residential treatment. Administrative data was used not only to examine these issues over time, but also to construct propensity scores for addressing group nonequivalence, to control for treatment completion and length of stay, and to examine alternative explanations of observed associations. Results showed that specialized treatment was associated with continuing care, and women who completed specialized treatment with longer stays were most likely to continue care.
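The propensity-score approach used to address group nonequivalence can be sketched as follows. This is a hypothetical illustration of the general technique (a logistic selection model followed by nearest-neighbor matching on the estimated score), not a reconstruction of any study's covariates or model:

```python
import numpy as np

# Hypothetical data: treatment selection depends on two pretreatment covariates,
# so raw group comparisons would be confounded.
rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 2))                       # two pretreatment covariates
p_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
treated = rng.random(n) < p_true                  # selection depends on covariates

# Fit a logistic propensity model by gradient ascent on the log-likelihood
Xd = np.column_stack([np.ones(n), X])
beta = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-Xd @ beta))
    beta += 0.1 * Xd.T @ (treated - p) / n

scores = 1 / (1 + np.exp(-Xd @ beta))             # estimated propensity scores

# Greedy 1:1 nearest-neighbor matching (with replacement) on the score
controls = np.where(~treated)[0]
matches = {i: controls[np.argmin(np.abs(scores[controls] - scores[i]))]
           for i in np.where(treated)[0]}
```

After matching, outcome comparisons are made between treated individuals and their matched comparisons, so that groups are balanced on the observed pretreatment characteristics captured in the administrative record.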

Lessons learned for longitudinal research

These studies highlight how administrative data can be a rich resource for conducting rigorous longitudinal research on understudied populations. With administrative data, not only are sample sizes large enough to support statistical analyses, but, moreover, such data makes it possible to employ strong study designs and to examine complex interactions over time, two advantages that strengthen longitudinal research on underrepresented groups.

These studies also raise issues researchers face when individual variable definitions (e.g., the aggregation of particular ethnicities within larger racial categories), or even entire data systems, change over time, making longitudinal analyses especially problematic. Often, there is little information on changes in the accuracy of data as it is collected over time, and without a means to verify data quality, “dirty” data may be omitted from analysis altogether. Furthermore, events of interest may lack needed specificity or may have been collected inconsistently over time, or data may not be available until some time after events have occurred. Finally, caution must be exercised when making causal attributions simply because events captured in administrative databases are associated in time. External criteria or self-selection forces that influence where and when people enter into different service systems must be considered, especially in terms of the generalizability of findings that are rooted in administrative data.

Treatment outcomes

More than 10 years ago, Washington State recognized the potential of using statewide information systems to provide meaningful data for informing policy and practice,90 and since then Washington researchers have developed an impressive portfolio of treatment outcome studies based on administrative data. Others outside of Washington have also used existing data to measure treatment outcomes,91–93 but because of the state’s long history in this area, in this section the focus is on a few of their studies that illustrate useful lessons in the application of administrative data for longitudinal outcome research on substance abuse treatment.

Luchansky et al. (2000)94 linked three sources of state-level data (substance abuse treatment, criminal histories, and employment wages) on 10,284 individuals to analyze factors related to treatment readmission in Washington. The administrative data, which covered 13 months, was instrumental in documenting the continuum of care and its impact on patterns of readmission over time. Findings revealed that only about a quarter of clients were readmitted to treatment over 1 year. Readmission was more likely for females and for people arrested in the year prior to treatment, and less likely for those receiving a combination of inpatient and outpatient treatments.

Four years later, Maynard and colleagues (2004)95 examined relationships between death, mental illness, and substance abuse among 2,041 individuals discharged from Washington State mental hospitals. Analysis of administrative data from 3 sources (mental health hospitalizations, Medicaid diagnostic data, and cause of death) covering 5 years revealed that patients with a substance use or co-occurring disorder had a 44% higher risk of death after discharge, compared to those with a mental illness diagnosis only, and these individuals died at a younger age, primarily due to injury, accidents, and medical conditions directly related to their addiction.

Most recently, Luchansky and colleagues96 obtained existing data from multiple state-level sources on 8,343 Supplemental Security Income recipients covering 1 year. Investigators examined the association between receipt of needed treatment and subsequent criminal justice involvement. Administrative data was used to capture not only the substance abuse treatment experiences of the sample but also several other related events, including medical care, arrests and convictions, and receipt of other health and social services. Also, identification of the need for substance abuse treatment drew upon administrative data from not just 1 but 3 data sources. Interestingly, as the authors note, more than 68% of the study population met criteria for substance abuse treatment need in more than one source, indicating that most of the study population came into contact with more than one governmental agency. Despite these multiple contacts, only about half of the study population actually entered substance abuse treatment. Furthermore, administrative data on substance abuse treatment histories allowed for analysis of outcomes related to treatment episodes of care (i.e., multiple treatment admissions that occur within 30 or fewer days after discharge from a previous admission), a preferred alternative to analyzing the outcomes of a single treatment admission and discharge, especially within a chronic care context. Finally, administrative data permitted an examination of whether the amount of treatment was associated with outcomes and also whether simply entering treatment had an impact. Findings revealed that reduced risks for re-arrest and conviction were associated with treatment completion and longer retention, as well as simply entering treatment.
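The episode-of-care rule described above (grouping admissions that begin within 30 or fewer days after discharge from a previous admission) can be expressed directly in code. The following is a minimal sketch with hypothetical dates:

```python
from datetime import date

def build_episodes(stays, gap_days=30):
    """Group treatment stays into episodes of care.

    stays: list of (admit_date, discharge_date) tuples, sorted by admit_date.
    A stay that begins within gap_days of the previous discharge continues
    the current episode; otherwise it starts a new one.
    """
    episodes = []
    for stay in stays:
        if episodes and (stay[0] - episodes[-1][-1][1]).days <= gap_days:
            episodes[-1].append(stay)        # continues the current episode
        else:
            episodes.append([stay])          # starts a new episode
    return episodes

stays = [
    (date(2005, 1, 1), date(2005, 2, 1)),    # admission 1
    (date(2005, 2, 20), date(2005, 4, 1)),   # readmitted 19 days later -> same episode
    (date(2005, 8, 1), date(2005, 9, 1)),    # readmitted 122 days later -> new episode
]
assert len(build_episodes(stays)) == 2
```

Analyzing outcomes at the episode level, rather than per admission, keeps closely spaced readmissions from being miscounted as treatment failures within a chronic-care framework.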

A study that combined data from Washington with data from 2 other states provides a final example. The TOPPS II Interstate Cooperative Study Group (2003)64 used employment and drug treatment data on 20,495 drug treatment patients living in Baltimore, Washington State, and Oklahoma, to examine the effect of drug treatment completion on employment and wages in the year after treatment. Employment history prior to treatment was used to adjust for group differences, and all treatment services received within an episode of care, despite changes in modality, were captured. Posttreatment employment was associated with treatment completion and longer treatment stays. Findings were consistent across all three states, despite different populations, treatment delivery systems, and labor markets.

Lessons learned for longitudinal research

One ongoing issue raised by these studies, of particular concern when using administrative data to study treatment outcomes, involves researchers’ ability to link data across datasets and time. Reasons for unlinked data are frequently unknown; however, the absence of a record is often interpreted as a non-occurrence of an event. For example, when no record is found to indicate an arrest, utilization of services, or treatment readmission, it is sometimes assumed that behavioral or medical improvements have occurred. While some absent data can be explained by improvements in subject status, other explanations may include insufficient identifiers needed for data matching,72 technical complications prohibiting linkage such as duplicate records,97 inconsistencies between administrative records and clinical records,98 and systematic or inadvertent data purges. Additionally, different strategies for record matching might be employed when acquiring data repeatedly and from multiple sources, possibly resulting in differences in linkage rates. The problem of missing data has been associated with particular subject characteristics such as being a member of a racial/ethnic minority or infrequent exposure to a particular service system.99 Communication with agency staff who provide data is key to understanding reasons for missing data. Also important are the utilization of different linking strategies (probabilistic vs. deterministic) designed to minimize the amount of unlinked data, the use of thresholds for determining legitimate matches, the examination of possible biases resulting from unlinked records, and the use of administrative data in combination with self-reported data.
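One simple diagnostic implied by these concerns is to compare linkage rates across subgroups before interpreting absent records as non-events. The sketch below uses entirely hypothetical records:

```python
# Hypothetical sketch: if linkage rates differ sharply across subgroups,
# unlinked data is unlikely to be missing at random, and "no record" cannot
# safely be read as "no event" for all clients.
records = [
    {"minority": 0, "linked": True},  {"minority": 0, "linked": True},
    {"minority": 0, "linked": True},  {"minority": 0, "linked": False},
    {"minority": 1, "linked": True},  {"minority": 1, "linked": False},
    {"minority": 1, "linked": False}, {"minority": 1, "linked": False},
]

def linkage_rate(rows, group):
    rows = [r for r in rows if r["minority"] == group]
    return sum(r["linked"] for r in rows) / len(rows)

# A large gap flags potential linkage bias worth investigating with agency staff
gap = linkage_rate(records, 0) - linkage_rate(records, 1)
```

In practice such checks would be run across all the subject characteristics the literature associates with missing data, before any outcome analysis that treats absent records as improvements.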

Despite these issues, Washington’s work demonstrates how existing data provides the means to generate findings on longitudinal treatment outcomes that converge across service systems, over time, and even across states. Furthermore, from study to study, Washington’s administrative data-based treatment outcome studies have involved large sample sizes, multiple datasets, and long observation periods, making it possible to examine how complexities resulting from the same individuals entering different systems of care impact long-term outcomes. Additional examples of Washington’s work with administrative data are available,100–102 but due to space constraints, they cannot be covered in this article. This body of work demonstrates that administrative data is a useful tool for generating knowledge by expanding, lengthening, and strengthening our observation of addiction treatment and its impact, while providing information on the complex interactions and processes that can influence outcomes over time.

Cost analysis

A great deal of empirical evidence suggests that substance abuse is associated with increases in a wide range of costs to society,51,103–108 including costs associated with crime and the criminal justice system;109,110 medical care;109,111–116 infectious diseases;117,118 perinatal care;118 mental health disorders;104 and public benefits programs.118–121 Using administrative data to examine economic issues associated with drug abuse is a widespread practice that has resulted in several notable contributions to the field. Just a few recent examples have been chosen to illustrate some strengths and challenges associated with using administrative data for longitudinal substance abuse cost/benefit studies.

Parthasarathy & Weisner (2005)122 linked service utilization and cost data to information self-reported by 1,204 commercially insured chemical dependency patients to examine 5-year patterns of health care utilization and costs. Administrative data on patients without alcohol and drug problems was used to form a matched comparison group to examine whether patterns were attributable to regional trends. The most significant predictors of long-term utilization and costs were age, gender, employment status, medical and psychiatric severity, dependence type, treatment modality, and abstinence. Administrative data allowed researchers to ascertain that total health care costs increased initially from baseline to 6 months later, but costs then decreased below intake levels by 1 and 5 years post-intake.

In a different but similarly designed study, Polen et al. (2006)123 examined differences in medical care costs over 6 years between 1,472 individuals recommended for substance abuse treatment and 738 people without substance abuse diagnoses or treatment within a health maintenance organization in Oregon. Administrative records provided data on demographic characteristics, psychiatric diagnoses, prior care, service utilization, treatment completion, and costs. Changes in medical care costs over time did not differ between the two groups, and individuals with improved treatment outcomes did not have greater reductions in medical costs.

Ettner et al. (2006)124 combined self-reported information from 2,567 individuals with state-level administrative data from four sources to examine costs/benefits associated with publicly funded drug treatment in California provided through CalTOP over 2 years. Substance abuse treatment yielded a benefit-to-cost ratio greater than 7 to 1, primarily because of reduced crime costs and increased employment earnings following treatment. The authors noted that the most significant monetary benefits occurred in areas (crime, hospitalizations, earnings) that were captured by administrative data, suggesting that similarly designed future cost/benefit analyses that omit self-reported information and rely entirely on administrative data would likely yield reasonable estimates.

Wickizer et al. (2006)125 relied entirely on administrative data from Washington to evaluate the economic impact of substance abuse treatment on medical expenditures for welfare recipients over 4 years. By using linked administrative data from six sources, researchers were able to define a large study population (n=32,919), control for a number of factors (including differences in baseline medical care expenditures, demographics, mental health status, and health risk), and modify the parameters of statistical models to test the robustness of study findings. Substance abuse treatment was associated with a reduction in medical expenses of about $2,500 annually. Additionally, secondary analyses revealed that, compared to an untreated group of welfare recipients, individuals in the treated group who used inpatient mental health services incurred lower costs on average. The treated group was also more likely to use outpatient mental health services and less likely to use adult services such as in-home nursing care and assisted living.
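At its core, the record linkage behind a study like this amounts to joining extracts from several agency systems on a shared client identifier. A minimal sketch, with hypothetical table and field names (real linkages must also handle probabilistic matching, identifier errors, and confidentiality constraints):

```python
import pandas as pd

# Hypothetical agency extracts keyed on a shared client identifier.
welfare = pd.DataFrame({"client_id": [1, 2, 3], "on_tanf": [True, True, False]})
treatment = pd.DataFrame({"client_id": [1, 3], "modality": ["outpatient", "residential"]})
medical = pd.DataFrame({"client_id": [1, 2], "annual_cost": [4200.0, 9100.0]})

# Left-join onto the welfare roster so every recipient is retained;
# missing values then mark "no record in that system".
linked = (
    welfare
    .merge(treatment, on="client_id", how="left")
    .merge(medical, on="client_id", how="left")
)
print(linked)
```

The left join preserves the full study population, which is what lets researchers distinguish "not treated" from "not observed" when defining treated and untreated groups.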

Carey et al. (2006)126 obtained administrative data on drug treatment and numerous criminal justice system interactions to assess costs and benefits associated with California drug courts over 4 years. Administrative data provided information on drug court participants and also on a matched comparison group of offenders who did not enter drug court. The study found that every $1 invested in drug courts was associated with a return of $3.50, mainly due to reduced recidivism rates among participants. Additionally, for each year studied, the state saw a combined net savings of more than $9 million.

Longshore et al. (2007)127 utilized existing state-level records on more than 130,000 individuals covering 8 domains (e.g., criminal justice system interactions, drug treatment, healthcare) to assess the cost implications of California’s initiative to provide community-based substance abuse treatment to eligible drug offenders. Findings showed that over the 2.5 years of observation, the program yielded significant savings, with cost-saving ratios of 1:2.5 for participants generally (i.e., for every $1 invested, $2.50 was saved) and 1:4 for completers (i.e., for every $1 invested, $4 was saved). Additionally, administrative data was used to construct a comparison group, replicate findings using subsequent years of data, examine the impact of degree of treatment participation, and conduct sub-studies on populations of particular interest.
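The cost-saving ratios reported in these studies reduce to simple arithmetic on total benefits and total program costs. The sketch below uses illustrative figures only, not any study's actual totals:

```python
def benefit_cost_ratio(total_benefits, total_costs):
    """Dollars returned (costs avoided) per dollar invested."""
    return total_benefits / total_costs

# Illustrative: a program costing $10M that avoids $25M in crime,
# medical, and welfare costs returns $2.50 per $1 spent (a 1:2.5
# ratio in the notation above) and nets $15M in savings.
ratio = benefit_cost_ratio(25_000_000, 10_000_000)
net_savings = 25_000_000 - 10_000_000
print(ratio, net_savings)  # 2.5 15000000
```

The analytic difficulty lies not in this division but in valuing the benefit side, which is where administrative records on arrests, hospitalizations, and earnings do the real work.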

Lessons learned for longitudinal research

A question raised by these studies is how closely information gathered from self-reported sources corresponds to information drawn from administrative data sources. For example, an individual’s self-reported estimate of income may include sources, such as “under-the-table” wages, not captured by administrative data, a discrepancy that might affect the cost-benefit ratios associated with the program being studied. While some researchers have used administrative data to test the validity of measures128–132 and others have commented on how external forces such as regulatory requirements and billing systems likely influence the accuracy of administrative data,133 there have been no empirical investigations of the degree to which self-reported and administrative data sources complement each other, or of whether information is more accurate or reliable in one source than in the other.
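One straightforward way to quantify such correspondence is an agreement statistic, such as Cohen's kappa, computed over paired self-reported and administrative indicators for the same individuals. A minimal sketch with invented data (a real concordance study would also examine systematic direction of disagreement, not just its rate):

```python
def cohens_kappa(pairs):
    """Chance-corrected agreement between two binary sources.

    `pairs` is a list of (self_report, admin_record) booleans, e.g.
    whether each person reported vs. had a recorded arrest.
    """
    n = len(pairs)
    p_obs = sum(a == b for a, b in pairs) / n          # observed agreement
    p_self = sum(a for a, _ in pairs) / n              # marginal: self-report
    p_admin = sum(b for _, b in pairs) / n             # marginal: admin record
    p_exp = p_self * p_admin + (1 - p_self) * (1 - p_admin)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Illustrative: 10 people, 8 concordant pairs, 2 discordant.
pairs = [(True, True)] * 4 + [(False, False)] * 4 + [(True, False), (False, True)]
print(round(cohens_kappa(pairs), 2))  # 0.6
```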

These cost studies illustrate how administrative data can be used to enhance flexibility in study design, facilitating the measurement of behavior, service system interactions, and associated costs as they unfold over time. Added strengths associated with the utilization of administrative data include large sample sizes, longitudinal study designs, multiple data points, matched comparison groups, risk adjustments, and robustness tests. These studies are broad and deep enough to capture complexities while revealing new policy-relevant knowledge on the intersections between different service systems (i.e., substance abuse treatment, welfare, mental health, adult health services) and their economic impacts.

DISCUSSION

Using administrative data to advance longitudinal research

The results of this review must be considered within the constraints of its design. Only longitudinal substance abuse research articles published in peer-reviewed journals after 1999 were included in this review. However, the results of many studies using administrative data are not necessarily published in peer-reviewed journals, but instead are in the form of reports to state governments or reports on program evaluations.134–136 These reports can provide valuable guidance, especially to researchers seeking assistance from experienced colleagues in navigating the intricacies of accessing and utilizing datasets unique to specific states.

Despite its limitations, administrative data is a valuable resource for conducting longitudinal research on substance abuse treatment, service utilization, and outcomes. Considerable time and resources are required simply to access administrative data and adequately address confidentiality concerns, and researchers are advised to weigh carefully the benefits to be gained from analyzing administrative data against the resources required to obtain and derive adequate information from it. Beyond enhancing the ability of researchers to analyze events and associated costs as they arise and unfold, other advantages, especially applicable to a long-term view, include opportunities to track historical trends, apply flexible follow-up intervals, and draw on large pools of data for matching groups on key characteristics in quasi-experimental designs. Additionally, administrative data can be used to verify some self-reported key events in an individual’s life, enabling researchers to strengthen the validity of findings regarding the causes and effects surrounding particular events. Administrative data is also particularly well-suited to studying the long-term course of health service utilization and outcomes among co-morbid and disadvantaged populations, as these groups are often excluded from traditional randomized clinical trials.137–138 Finally, even as longitudinal research participants age and die, administrative data on them remains available, and utilizing such data is one way to make up for lost opportunities to extend knowledge about key events within a life-course perspective.139

IMPLICATIONS FOR BEHAVIORAL HEALTH

This article makes a contribution to the field by articulating the numerous issues associated with using administrative data for longitudinal research purposes, thereby creating a key resource for various stakeholders. For example, this paper may aid junior investigators who are seeking guidance before utilizing administrative data for the first time, and it may also be of use to more experienced researchers who are seeking evidence to corroborate their own experiences with administrative data or who are, perhaps, seeking new directions for extending or strengthening their current work. This review might also assist state agencies in considering the pros and cons of fostering greater collaboration and data integration, especially between sister agencies interested in analyzing existing administrative data within a framework that employs a longitudinal perspective.

Examining how behavior patterns are shaped and altered by events over time is a key component of longitudinal research. Knowing when and how often an individual engages in behaviors and how the course of those behaviors is altered through interactions with different systems could help to improve the planning and delivery of strategies to change those behaviors for the better. Few substance-abuse-related datasets contain the necessary data elements needed to track participants over time, particularly in complex areas of interest. As illustrated by the articles reviewed in this paper, combining elements on individuals from several administrative datasets, or in combination with complementary self-reported information, increases confidence in findings,140 strengthens research designs, and broadens the scope and flexibility of analyses, providing opportunities to generate empirical evidence on the longitudinal effects of drug abuse and related complex events that would not have been revealed by analysis of single datasets independent of one another.

A comprehensive, integrated administrative dataset, i.e., a multi-dimensional measurement of diverse events over time as they occur, would allow for more complex models that reflect real-life social interactions and phenomena. Research that utilizes data collected by self-contained systems that incorporate multidimensional data under one unique identifier (e.g., data from Health Maintenance Organizations, the Department of Veterans Affairs, public insurance recipients, and some individual states) is promising. But the drug abuse research field is not yet at the point of creating an information system like Denmark’s social registries, which constitute a single coherent source of social statistics.141

Is a “cyberinfrastructure” age on the horizon for addiction research? Touted as a tool that can “enable the development of more realistic models of complex social phenomena” and “the production of and analysis of larger datasets…that more completely record human behavior…” (Berman & Brady, 2005, p. 9),142 a cyberinfrastructure would make it possible to “track change in human behavior at multiple time scales and from multiple perspectives” (Berman & Brady, 2005, p. 13).142 Enthusiasm for technological advancements that rely on identifiable personal data must be tempered by persistent uncertainties regarding the maintenance of individual privacy, the potential for the fraudulent use of such data, and the legal requirements guiding use of administrative data for research purposes.

Admittedly not without its risks, a cyberinfrastructure is certainly an intriguing concept and its application in the addiction field would signal a great technological and conceptual leap forward. Further consideration is needed and, considering that much of the research included in this paper originated as statewide evaluations of specific or localized programs or policies, ongoing discussion regarding the utilization of administrative data for longitudinal research might best occur as part of the continuing national debate regarding different operational, conceptual, and methodological approaches to measuring drug treatment quality, performance, and outcomes.143 As a valuable resource for generating empirical evidence revealing complex events as they unfold and interact over time, administrative data is likely to continue to be a key feature of future longitudinal substance abuse research on treatment/service utilization and outcomes.

Acknowledgments

The project described was supported in part by Grant Number P30 DA016383 from the National Institute on Drug Abuse. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute on Drug Abuse or the National Institutes of Health. Special thanks are due to staff at the UCLA Integrated Substance Abuse Programs for manuscript preparation. We particularly wish to thank Dr. Antoinette Krupski for her generous insights and comments.

REFERENCES

1. Hser YI, Anglin MD, Grella C, et al. Drug treatment careers: a conceptual framework and existing research findings. Journal of Substance Abuse Treatment. 1997;14(6):543–558. [PubMed]
2. Hser YI, Teruya C, Brown A, et al. Impact of California’s Proposition 36 on the drug treatment system: treatment capacity and displacement. American Journal of Public Health. 2007;97(1):104–109. [PubMed]
3. Lundgren LM, Sullivan L, Amodeo M. How do treatment repeaters use the drug treatment system? an analysis of injection drug users in Massachusetts. Journal of Substance Abuse Treatment. 2006;30(2):121–128. [PubMed]
4. McLellan AT. Have we evaluated addiction treatment correctly? implications from a chronic care perspective. Addiction. 2002;97(3):249–252. [PubMed]
5. McLellan AT, Lewis DC, O'Brien CP, et al. Drug dependence, a chronic medical illness: implications for treatment, insurance, and outcomes evaluation. JAMA. 2000;284(13):1689–1695. [PubMed]
6. Hser YI, Hoffman V, Grella CE, et al. A 33-year follow-up of narcotics addicts. Archives of General Psychiatry. 2001;58(5):503–508. [PubMed]
7. Moos RH, Moos BS. Sixteen-year changes and stable remission among treated and untreated individuals with alcohol use disorders. Drug and Alcohol Dependence. 2005;80(3):337–347. [PubMed]
8. Price RK, Risk NK, Spitznagel EL. Remission from drug abuse over a 25-year period: patterns of remission and treatment use. American Journal of Public Health. 2001;91(7):1107–1113. [PubMed]
9. Bhandari A, Wagner T. Self-reported utilization of health care services: improving measurement and accuracy. Medical Care Research and Review. 2006;63(2):217–235. [PubMed]
10. Evans E, Murphy D, Grella C, et al. Regulatory issues encountered when conducting longitudinal substance abuse research. Journal of Drug Issues. In press. [PMC free article] [PubMed]
11. Anglin MD, Hser YI, Chou C. Reliability and validity of retrospective behavioral self-report by narcotics addicts. Evaluation Review. 1993;17(1):91–108.
12. Cherpitel CJ, Ye Y, Bond J, et al. Validity of self-reported drinking before injury compared with a physiological measure: cross-national analysis of emergency-department data from 16 countries. Journal of Studies on Alcohol and Drugs. 2007;68(2):296–302. [PubMed]
13. Del Boca FK, Noll JA. Truth or consequences: the validity of self-report data in health services research on addictions. Addiction. 2000;95:347–360. [PubMed]
14. Fendrich M, Johnson TP, Wislar JS, et al. The utility of drug testing in epidemiological research: results from a general population survey. Addiction. 2004;99:2197–2208. [PubMed]
15. Langenbucher J, Merrill J. The validity of self-reported cost events by substance abusers: limits, liabilities, and future directions. Evaluation Review. 2001;25(2):184–210. [PubMed]
16. Vitale SG, van de Mheen H, van de Wiel A, et al. Substance use among emergency room patients: is self-report preferable to biochemical markers? Addictive Behaviors. 2006;31(9):1661–1669. [PubMed]
17. Saunders RC, Heflinger CA. Integrating data from multiple public sources: opportunities and challenges for evaluators. Evaluation: The International Journal of Theory, Research and Practice. 2004;10(3):349–365.
18. Banks SM, Pandiani JA. Probabilistic population estimation of the size and overlap of data sets based on date of birth. Statistics in Medicine. 2001;20(9–10):1421–1430. [PubMed]
19. Whalen D, Pepitone A, Graver L, et al. Linking Client Records from Substance Abuse, Mental Health and Medicaid State Agencies. Rockville, MD: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration; 2001.
20. Brady H, Powell A, Grand S, et al. National Research Council Studies of Welfare Populations: Data Collection and Research Issues. Washington: National Academies Press; 2001. Access and Confidentiality Issues with Administrative Data.
21. McCarty D, McGuire TG, Harwood HJ, et al. Using state information systems for drug abuse services research. American Behavioral Scientist. 1998;41(8):1090–1106.
22. Billings J. Using administrative data to monitor access, identify disparities, and assess performance of the safety net: tools for monitoring the health care safety net. 2003. [Accessed August 16, 2007]. Available at: http://www.ahrq.gov/data/safetynet/billing2.htm/
23. Dickey B, Normand ST, Drake R, et al. Limiting inpatient substance use treatment: what are the consequences? Medical Care Research and Review. 2003;60(3):332–346. [PubMed]
24. Duran F, Wilson S, Carroll D. Putting Administrative Data to Work: A Toolkit for State Agencies on Advancing Data Integration and Data Sharing Efforts to Support Sound Policy and Program Development. Farmington, CT: Child Health and Development Institute of Connecticut; 2005.
25. Cuellar AE, Snowden LM, Ewing T. Criminal records of persons served in the public mental health system. Psychiatric Services. 2007;58(1):114–120. [PubMed]
26. Garnick DW, Hendricks AM, Comstock CB. Measuring quality of care: fundamental information from administrative datasets. International Journal for Quality in Health Care. 1994;6(2):163–177. [PubMed]
27. Glance LG, Dick AW, Osler T, et al. Accuracy of hospital report cards based on administrative data. Health Services Research. 2006;41(4):1413–1437. [PMC free article] [PubMed]
28. Goerge R. Using administrative data to perform policy-relevant research. 1997. [Accessed January 26, 2006]. Available at: www/jcpr.org/fall97/v14article3.html/
29. Hotz VJ, Goerge R, Balzekas J, et al. A Report of the Advisory Panel on Research Uses of Administrative Data. Northwestern University/University of Chicago: Joint Center for Poverty Research; 1998. Administrative Data for Policy-Relevant Research: Assessment of Current Utility and Recommendations for Development.
30. Smilanick P. Welfare attrition: cases leaving aid what we know from statewide administrative data. 2001. [Accessed December 9, 2005]. Available at: http://www.dss.cahwnet.gov/research/res/pdf/CalWORKsAttrition10_01.pdf.
31. UC Data. A Report by UC DATA to the Northwestern University/University of Chicago Joint Center for Poverty Research. An Inventory of Research Uses of Administrative Data in Social Services Programs in the United States: 1998. Berkeley: Data Archive & Technical Assistance, University of California; 1999.
32. Saunders RC, Heflinger CA. Access to and patterns of use of behavioral health services among children and adolescents in Tenn Care. Psychiatric Services. 2003;54(10):1364–1371. [PubMed]
33. Zingmond DS, Ettner SL, Cunningham WE. The impact of managed care on access to highly active antiretroviral therapy and on outcomes among medicaid beneficiaries with AIDS. Medical Care Research and Review. 2007;64(1):66–82. [PubMed]
34. Young TK, Kliewer E, Blanchard J, et al. Monitoring disease burden and preventive behavior with data linkage: cervical cancer among aboriginal people in Manitoba, Canada. American Journal of Public Health. 2000;90:1466–1468. [PubMed]
35. Sorensen B, Shen H. Restraining orders in California: a look at statewide data. Violence Against Women. 2005;11(7):912–933. [PubMed]
36. Desai MM, Rosenheck RA, Craig TJ. Case-finding for depression among medical outpatients in the veterans health administration. Medical Care. 2006;44(2):175–181. [PubMed]
37. Desai RA, Stefanovics EA, Rosenheck RA. The role of psychiatric diagnosis in satisfaction with primary care: data from the department of veterans affairs. Medical Care. 2005;43(12):1208–1216. [PubMed]
38. Greenberg GA, Rosenheck RA. Continuity of care and clinical outcomes in a national health system. Psychiatric Services. 2005;56(4):427–433. [PubMed]
39. McGinnis KA, Fine MJ, Sharma RK, et al. Understanding racial disparities in HIV using data from the veterans aging cohort 3-site study and VA administrative data. American Journal of Public Health. 2003;93(10):1728–1733. [PubMed]
40. Dickey B, Dembling B, Azeni H, et al. Externally caused deaths for adults with substance use and mental disorders. Journal of Behavioral Health Services & Research. 2004;31(1):75–85. [PubMed]
41. Drescher K, Rosen C, Burling T, et al. Causes of death among male veterans who received residential treatment for PTSD. Journal of Traumatic Stress. 2003;16(6):535–543. [PubMed]
42. Johnson JE, Finney JW, Moos RH. Predictors of 5-year mortality following inpatient/residential group treatment for substance use disorders. Addictive Behaviors. 2005;30(7):1300–1316. [PubMed]
43. Liskow BI, Powell BJ, Penick EC, et al. Mortality in male alcoholics after ten to fourteen years. Journal of Studies on Alcohol. 2000;61(6):853–861. [PubMed]
44. Masudomi I, Isse K, Uchiyama M, et al. Self-help groups reduce mortality risk: a 5-year follow-up study of alcoholics in the Tokyo metropolitan area. Psychiatry and Clinical Neurosciences. 2004;58(5):551–557. [PubMed]
45. Moos RH, Brennan PL, Mertens JR. Mortality rates and predictors of mortality among late-middle-aged and older substance abuse patients. Alcoholism: Clinical and Experimental Research. 1994;18(1):187–195. [PubMed]
46. Schifano F, Oyefeso A, Corkery J, et al. Death rates from ecstasy (MDMA, MDA) and polydrug use in England and Wales, 1996–2002. Human Psychopharmacology: Clinical and Experimental. 2003;18(7):519–524. [PubMed]
47. Smyth B, Fan J, Hoffman V, et al. Years of potential life lost among heroin addicts 33 years after treatment. Preventive Medicine. 2007;44(4):369–374. [PMC free article] [PubMed]
48. Vlahov D, Wang C, Galai N, et al. Mortality risk among new onset injection drug users. Addiction. 2004;99(8):946–954. [PubMed]
49. Katon WJ, Roy-Byrne P, Russo J, et al. Cost-effectiveness and cost offset of a collaborative care intervention for primary care patients with panic disorder. Archives of General Psychiatry. 2002;59(12):1098–1104. [PubMed]
50. Finigan MW. Societal Outcomes and Cost Savings of Drug and Alcohol Treatment in the State of Oregon. Office of Alcohol and Drug Abuse Programs: Oregon Department of Human Resources; 1996.
51. Gerstein DR, Johnson RA, Harwood H, et al. Evaluating Recovery Services. The California Drug and Alcohol Treatment Assessment (CALDATA) Sacramento, CA: State of California Department of Drug and Alcohol Programs; 1994.
52. Longhi D, Brown M, Comtois R. ADATSA Treatment Outcomes: Employment and Cost Avoidance: An Eighteen Month Follow-Up Study of Indigent Persons Served by Washington State’s Alcoholism and Drug Addiction Treatment and Support Act (Report Number 4.19) Washington State Department of Social and Health Services Planning, Research and Development. Office of Research and Data Analysis; 1994. Nov,
53. U. S. Department of Health and Human Services. Developing State Outcomes Monitoring Systems for Alcohol and Other Drug Abuse Treatment: Treatment Improvement Protocol (TIP) Series 14. DHHS Pub. No. SMA 95-3021. Rockville, MD: Substance Abuse and Mental Health Services Administration, Center for Substance Abuse Treatment; 1995.
54. Brown TG, Topp J, Ross D. Rationales, obstacles and strategies for local outcome monitoring systems in substance abuse treatment settings. Journal of Substance Abuse Treatment. 2003;24(1):31–42. [PubMed]
55. Camp JM, Krakow M, McCarty D, et al. Substance abuse treatment management information systems: balancing federal, state, and service provider needs. Journal of Mental Health Adminstration. 1992;19(1):5–20. [PubMed]
56. Ogborne AC, Braun K, Rush BR. Developing an integrated information system for specialized addiction treatment agencies. Journal of Behavioral Health Services & Research. 1998;25(1):100–107. [PubMed]
57. Teruya C, Hardy M, Hser YI, et al. Implementation of a statewide outcome monitoring system: lessons learned from substance abuse treatment provider staff. Qualitative Health Research. 2006;16(3):337–352. [PubMed]
58. Wisdom JP, Ford IH, Hayes RA, et al. Addiction treatment agencies' use of data: a qualitative assessment. Journal of Behavioral Health Services & Research. 2006;33(4):394–407. [PubMed]
59. Tiet QQ, Byrnes HF, Barnett P, et al. A practical system for monitoring the outcomes of substance use disorder patients. Journal of Substance Abuse Treatment. 2006;30(4):337–347. [PubMed]
60. Alterman AI, Langenbucher J, Morrison RL. State-level treatment outcome studies using administrative databases. Evaluation Review. 2001;25(2):162–183. [PubMed]
61. Bailey WP. Tools for monitoring the health care safety net: integrated state data systems. 2007. [Accessed February 6, 2007]. Available at: www.ahrq.gov/data/safetynet/bailey.htm.
62. Garnick DW, Lee MT, Chalk M, et al. Establishing the feasibility of performance measures for alcohol and other drugs. Journal of Substance Abuse Treatment. 2002;23(4):375–385. [PubMed]
63. Evans E, Hser YI. Pilot-testing a statewide outcome monitoring system: overview of the California Treatment Outcome Project (CalTOP) Journal of Psychoactive Drugs, SARC Supplement. 2004;2:109–114. [PubMed]
64. TOPPS-II Interstate Cooperative Study. Drug treatment completion and post-discharge employment in the TOPPS-II Interstate Cooperative Study. Journal of Substance Abuse Treatment. 2003;25(1):9–18. [PubMed]
65. CSAT Performance Management Technical Assistance Coordinating Center. TAP 29: Integrating State Administrative Records to Manage Substance Abuse Treatment System Performance. Rockville, MD: U.S. Department of Health and Human Services: Substance Abuse and Mental Health Services Administration, Center for Substance Abuse Treatment; 2007.
66. Garnick DW, Hodgkin D, Horgan CM. Selecting data sources for substance abuse services research. Journal of Substance Abuse Treatment. 2002;22(1):11–22. [PubMed]
67. Garnick DW, Horgan CM, Chalk M. Performance measures for alcohol and other drug services. Alcohol Research & Health. 2006;29(1):19–26. [PubMed]
68. Oklahoma Department of Mental Health and Substance Abuse Services. Developing an Outcomes Monitoring System Using Secondary Data to Evaluate Substance Abuse Treatment. Final Report. Rockville, MD: Center for Substance Abuse Treatment, Substance Abuse and Mental Health Services Administration; 2000.
69. Lundgren L, Chassler D, Ben-Ami L, et al. Factors associated with emergency room use among injection drug users of African-American, Hispanic and White-European background. American Journal on Addiction. 2005;14(3):268–280. [PubMed]
70. Rosen MI, McMahon TJ, Rosenheck R. Does assigning a representative payee reduce substance abuse? Drug and Alcohol Dependence. 2007;86(2–3):115–122. [PubMed]
71. Ray GT, Weisner CM, Mertens JR. Relationship between use of psychiatric services and five-year alcohol and drug treatment outcomes. Psychiatric Services. 2005;56(2):164–171. [PubMed]
72. Maynard C, Cox GB, Krupski A, et al. Utilization of services by persons discharged from involuntary chemical dependency treatment. Journal of Addictive Diseases. 2000;19(2):83–93. [PubMed]
73. Deck DD, McFarland BH, Titus JM, et al. Access to substance abuse treatment services under the Oregon health plan. JAMA. 2000;284(16):2093–2099. [PubMed]
74. Deck D, Carlson MJ. Access to publicly funded methadone maintenance treatment in two western states. Journal of Behavioral Health Services & Research. 2004;31(2):164–177. [PubMed]
75. Deck D, Carlson MJ. Retention in publicly funded methadone maintenance treatment in two western states. Journal of Behavioral Health Services & Research. 2005;32(1):43–60. [PubMed]
76. Fuller BE, Rieckmann TR, McCarty DJ, et al. Elimination of methadone benefits in the Oregon health plan and its effects on patients. Psychiatric Services. 2006;57(5):686–691. [PubMed]
77. Deck DD, Wiitala WL, Laws KE. Medicaid coverage and access to publicly funded opiate treatment. Journal of Behavioral Health Services & Research. 2006;33(3):324–334. [PubMed]
78. Cook TD, Campbell D. Quasi-experimentation: Design & Analysis Issues for Field Settings. Chicago: Rand McNally College Publishing Co.; 1979.
79. Farabee D, Hser T, Anglin MD, et al. Recidivism among an early cohort of California’s Proposition 36 offenders. Criminology & Public Policy. 2004;3(4):563–584.
80. Niv N, Hser YI. Drug treatment service utilization and outcomes for Hispanic and White methamphetamine abusers. Health Services Research. 2006;41(4):1242–1257. [PMC free article] [PubMed]
81. Evans E, Spear S, Huang Y, et al. Do American Indians benefit from drug and alcohol treatment? treatment outcomes among American Indians in CalTOP. American Journal of Public Health. 2006;96(5):889–896. [PubMed]
82. Niv N, Wong EC, Hser YI. Asian Americans in community-based substance abuse treatment: service needs, utilization, and outcomes. Journal of Substance Abuse Treatment. 2007;33(3):313–319. [PubMed]
83. Hser YI, Evans E, Huang D. Treatment outcomes among women and men methamphetamine abusers in California. Journal of Substance Abuse Treatment. 2005;28:77–85. [PubMed]
84. Hser YI, Niv N. Pregnant women in women-only and mixed-gender substance abuse treatment programs: a comparison of client characteristics and program services. Journal of Behavioral Health Services & Research. 2006;33(4):431–442. [PubMed]
85. Niv N, Hser YI. Women-only and mixed-gender drug abuse treatment programs: service needs, utilization and outcomes. Drug and Alcohol Dependence. 2007;87(2–3):194–201. [PubMed]
86. Hser YI, Grella C, Evans E, et al. Utilization and outcomes of mental health services among patients in drug treatment. Journal of Addictive Diseases. 2006;25(1):73–85. [PubMed]
87. Grella CE, Hser YI, Huang Y. Mothers in substance abuse treatment: differences in characteristics based on involvement with child welfare services. Child Abuse & Neglect. 2006;30(1):55–73. [PubMed]
88. Green BL, Rockhill A, Furrer C. Does substance abuse treatment make a difference for child welfare case outcomes? a statewide longitudinal analysis. Children and Youth Services Review. 2006;29:460–473.
89. Claus RE, Orwin RG, Kissin W, et al. Does gender-specific substance abuse treatment for women promote continuity of care? Journal of Substance Abuse Treatment. 2007;32(1):27–39. [PubMed]
90. Wickizer T, Maynard C, Atherly A, et al. Completion rates of clients discharged from drug and alcohol treatment programs in Washington state. American Journal Public Health. 1994;84(2):215–221. [PubMed]
91. Mertens JR, Weisner CM, Ray GT. Readmission among chemical dependency patients in private, outpatient treatment: patterns, correlates and role in long-term outcome. Journal of Studies on Alcohol. 2005;66(6):842–847. [PubMed]
92. Metsch LR, Pereyra M, Miles CC, et al. Welfare and work outcomes after substance abuse treatment. Social Service Review. 2003;77(2):237–254.
93. Panas L, Caspi Y, Fournier E, et al. Performance measures for outpatient substance abuse services: group versus individual counseling. Journal of Substance Abuse Treatment. 2003;25(4):271–278. [PubMed]
94. Luchansky B, He L, Krupski A, et al. Predicting readmission to substance abuse treatment using state information systems: the impact of client and treatment characteristics. Journal of Substance Abuse. 2000;12(3):255–270. [PubMed]
95. Maynard C, Cox GB, Hall J, et al. Substance use and five-year survival in Washington state mental hospitals. Administration and Policy in Mental Health. 2004;31(4):339–345. [PubMed]
96. Luchansky B, Nordlund D, Estee S, et al. Substance abuse treatment and criminal justice involvement for ssi recipients: results from Washington state. American Journal on Addictions. 2006;15(5):370–379. [PubMed]
97. Bureau of Justice Statistics. Use and Management of Criminal History Record Information: A Comprehensive Report, 2001 Update (Report No. NCJ 187670) Washington, D.C: U. S. Department of Justice, Office of Justice Programs; 2001.
98. Lu M, Ma CT. Consistency in performance evaluation reports and medical records. Journal of Mental Health Policy and Economics. 2003;5(4):141–152. [PubMed]
99. Kressin NR, Chang BH, Hendricks A, et al. Agreement between administrative data and patients' self-reports of race/ethnicity. American Journal of Public Health. 2003;93(10):1734–1739. [PubMed]
100. Luchansky B, Brown M, Longhi D, et al. Chemical dependency treatment and employment outcomes: results from the 'ADATSA' program in Washington state. Drug and Alcohol Dependence. 2000;60(2):151–159. [PubMed]
101. Luchansky B, He L, Longhi D, et al. Treatment readmissions and criminal recidivism in youth following participation in chemical dependency treatment. Journal of Addictive Diseases. 2006;25(1):87–96. [PubMed]
102. Luchansky B, Krupski A, Stark K. Treatment response by primary drug of abuse: does methamphetamine make a difference? Journal of Substance Abuse Treatment. 2007;32(1):89–96. [PubMed]
103. French MT, Salomé HJ, Carney M. Using the DATCAP and ASI to estimate the costs and benefits of residential addiction treatment in the state of Washington. Social Science & Medicine. 2002;55(12):2267–2282. [PubMed]
104. Harwood H, Fountain D, Livermore G. The Economic Costs of Alcohol and Drug Abuse in the United States, 1992. Rockville, MD: National Institute on Drug Abuse; 1998.
105. Holder HD. Cost benefits of substance abuse treatment: an overview of results from alcohol and drug abuse. Journal of Mental Health Policy and Economics. 1998;1:23–29. [PubMed]
106. McCollister KE, French MT. The relative contribution of outcome domains in the total economic benefit of addiction interventions: a review of first findings. Addiction. 2003;98:1647–1659. [PubMed]
107. Salome HJ, French MT, Scott C, et al. Investigating the economic costs and benefits of addiction treatment: econometric analysis of the Chicago target cities project. Evaluation and Program Planning. 2003;26(3):325–338.
108. Sindelar JL, Jofre-Bonet M, French MT, et al. Cost-effectiveness analysis of addiction treatment: paradoxes of multiple outcomes. Drug and Alcohol Dependence. 2004;73(1):41–50. [PubMed]
109. Wall R, Rehm J, Fischer B, et al. Social costs of untreated opioid dependence. Journal of Urban Health: Bulletin of the New York Academy of Medicine. 2000;77(4):688–722. [PMC free article] [PubMed]
110. Vencill C, Sadjadi Z. Allocation of the California drug war costs: direct expenses, externalities, opportunity costs, and fiscal losses. The Justice Policy Journal. 2001;1(1):1–40.
111. Office of National Drug Control Policy. The Economic Costs of Drug Abuse in the United States, 1992–1998. (Publication No. NCJ-190636.) Washington, DC: The Executive Office of the President; 2001.
112. French MT, Salome H, Krupski A, et al. Benefit-cost analysis of residential and outpatient addiction treatment in the state of Washington. Evaluation Review. 2000;24(6):609–634. [PubMed]
113. Hunkeler E, Hung Y, Rice D, et al. Alcohol consumption patterns and health care costs in an HMO. Drug and Alcohol Dependence. 2001;64:181–190. [PubMed]
114. Sturm R. The Costs of Covering Mental Health and Substance Abuse as Medical Care in Private Insurance Plans. (RAND Health Publication No. CT-180.) Chicago, IL: RAND; 2001.
115. Sturm R. The effects of obesity, smoking and drinking on medical problems and costs. Health Affairs. 2002;21(2):245–253. [PubMed]
116. Palepu A, Tyndall MW, Leon H, et al. Hospital utilization and costs in a cohort of injection drug users. Canadian Medical Association Journal. 2001;165:415–420. [PMC free article] [PubMed]
117. Daley M, Argeriou M, McCarty D, et al. The costs of crime and the benefits of substance abuse treatment for pregnant women. Journal of Substance Abuse Treatment. 2000;19(4):445–458. [PubMed]
118. Mark T, Woody G, Juday T, et al. The economic costs of heroin addiction. Drug and Alcohol Dependence. 2001;61:195–206. [PubMed]
119. Merrill J, Fox K. The Impact of Substance Abuse on Federal Spending. In: Cost-Benefit/Cost-Effectiveness Research of Drug Abuse Prevention: Implications for Programming and Policy. Rockville, MD: National Institute on Drug Abuse; 1998.
120. Gresenz C, Watkins K, Podus D. Supplemental security income (SSI), disability insurance (DI), and substance abusers. Community Mental Health Journal. 1998;34(4):337–350. [PubMed]
121. Cook P, Moore M. The economics of alcohol abuse and alcohol-control policies. Health Affairs. 2002;21(2):120–133. [PubMed]
122. Parthasarathy S, Weisner CM. Five-year trajectories of health care utilization and cost in a drug and alcohol treatment sample. Drug and Alcohol Dependence. 2005;80(2):231–240. [PubMed]
123. Polen MR, Freeborn DK, Lynch FL, et al. Medical cost-offset following treatment referral for alcohol and other drug use disorders in a group model HMO. Journal of Behavioral Health Services & Research. 2006;33(3):335–346. [PubMed]
124. Ettner SL, Huang D, Evans E, et al. Benefit-cost in the California treatment outcome project: does substance abuse treatment "pay for itself"? Health Services Research. 2006;41(1):192–213. [PMC free article] [PubMed]
125. Wickizer TM, Krupski A, Stark KD, et al. The effect of substance abuse treatment on Medicaid expenditures among general assistance welfare clients in Washington state. Milbank Quarterly. 2006;84(3):555–576. [PubMed]
126. Carey SM, Finigan M, Crumpton D, et al. California drug courts: outcomes, costs and promising practices: an overview of phase II in a statewide study. Journal of Psychoactive Drugs, SARC Supplement. 2006;3:345–356. [PubMed]
127. Longshore D, Hawken A, Urada D, et al. SACPA Cost-Analysis Report. Submitted to the California Department of Alcohol and Drug Programs. Los Angeles, CA: UCLA Integrated Substance Abuse Programs; 2007.
128. Caspi Y, Turner WM, Panas L, et al. The severity index: an indicator of alcohol and drug dependence using administrative data. Alcoholism Treatment Quarterly. 2001;19(4):49–64.
129. Deck DD, McFarland BH. Medicaid managed care and substance abuse treatment. Psychiatric Services. 2002;53(7):802. [PubMed]
130. Garnick DW, Horgan CM, Lee MT, et al. Are Washington circle performance measures associated with decreased criminal activity following treatment? Journal of Substance Abuse Treatment. 2007;33(4):341–352. [PMC free article] [PubMed]
131. McCamant LE, Zani BG, McFarland BH, et al. Prospective validation of substance abuse severity measures from administrative data. Drug and Alcohol Dependence. 2007;86(1):37–45. [PubMed]
132. McFarland BH, Deck DD, McCamant LE, et al. Outcomes for Medicaid clients with substance abuse problems before and after managed care. Journal of Behavioral Health Services & Research. 2005;32(4):351–367. [PMC free article] [PubMed]
133. Bray J, Vandivort R, Dilonardo J, et al. Healthcare utilization of individuals with opiate use disorders: an analysis of integrated Medicaid and state mental health/substance abuse agency data. Journal of Behavioral Health Services & Research. 2008;35(1):91–106. [PMC free article] [PubMed]
134. Hser YI, Evans E, Teruya C, et al. The California Treatment Outcome Project (CalTOP) Final Report. Submitted to the California Department of Alcohol and Drug Programs. Los Angeles, CA: UCLA Integrated Substance Abuse Programs; 2003.
135. Walker R, Mateyoke-Scrivner A, Cole J, et al. Kentucky treatment outcome study statewide follow-up findings fiscal year 2005. 2007. [Accessed January 25, 2008]. Available at http://cdar.uky.edu//ktos/KTOSFollow.html.
136. Logan TK, Hoyt W, Leukefeld C. Kentucky drug court outcome evaluation: behavior, costs, & avoided costs to society. 2001. [Accessed January 25, 2008]. Available at http://courts.ky.gov/stateprograms/drugcourt/articles.htm.
137. Justice AC, Erdos J, Brandt C, et al. The Veterans Affairs Healthcare System: a unique laboratory for observational and interventional research. Medical Care. 2006;44(8):S7–S12. [PubMed]
138. Walkup JT, Yanos PT. Psychological research with administrative data sets: an underutilized strategy for mental health services research. Professional Psychology Research and Practice. 2005;36(5):551–557. [PMC free article] [PubMed]
139. Robins LN. Explaining when arrests end for serious juvenile offenders: comments on the Sampson and Laub study. The ANNALS of the American Academy of Political and Social Science. 2005;602:57–72.
140. Divorski S, Scheirer MA. Improving data quality for performance measures: results from a GAO study of verification and validation. Evaluation and Program Planning. 2001;24:83–94.
141. Wismer K. Use of registers in social statistics in Denmark. Paper presented at: Expert Group Meeting on Setting the Scope of Social Statistics. New York: United Nations Statistics Division in collaboration with the Siena Group on Social Statistics; 2003.
142. Berman F, Brady H. Cyberinfrastructure and the social sciences: final report of NSF SBE-CISE workshop. 2005. [Accessed August 16, 2007]. Available at: http://vis.sdsc.edu/sbe/reports/SBE-CISE-FINAL.pdf.
143. McLellan AT, Chalk M, Bartlett J. Outcomes, performance, and quality: what's the difference? Journal of Substance Abuse Treatment. 2007;32(4):331–340. [PubMed]