Over the past 10 years, the field of health services and management research has seen renewed interest in the use of qualitative research methods. This article examines the volume and characteristics of qualitative research articles published in nine major health services and management journals between 1998 and 2008. Qualitative research articles comprise 9% of research articles published in these journals. Although the publication rate of qualitative research articles has not kept pace with that of quantitative research articles, citation analysis suggests that qualitative research articles contribute comparably to the field’s knowledge base. A wide range of policy and management topics has been examined using qualitative methods. Case study designs, interviews, and documentary sources were the most frequently used methods. Half of qualitative research articles provided little or no detail about key aspects of the study’s methods. Implications are discussed and recommendations are offered for promoting the publication of qualitative research.
Over the past 10 years, the field of health services and management research has seen renewed interest in the use of qualitative research methods. In 1998, the Agency for Health Care Policy and Research (now the Agency for Healthcare Research and Quality) and the Robert Wood Johnson Foundation jointly sponsored an all-day forum for experienced health services researchers and qualitative methodologists to discuss the role of qualitative methods in examining the challenges facing the health care system in the 21st century. In 1999, Health Services Research published a special supplement in which participants of the 1998 conference described the value of qualitative methods for health services and management research, proposed ways to assess the quality of qualitative research, and presented examples of qualitative research that explored contemporary issues in health care policy and delivery.1 Also that year, the Association for Health Services Research (now AcademyHealth) held a workshop on qualitative research methods in conjunction with its annual meeting, a tradition that has continued to the present day. Since then, health services and management journals have published several articles describing specific approaches to qualitative research, discussing strategies for addressing common methodological issues, and exploring the promise and pitfalls of mixing qualitative and quantitative methods (Bradley, Curry, & Devers, 2007; Collingridge & Gantt, 2008; Ensign, 2003; Hoff & Witt, 2000; Moffatt, White, Mackintosh, & Howel, 2006; Mykhalovskiy et al., 2008; Pope, van Royen, & Baker, 2002; Sofaer, 2002). Other developments, such as journal editorial policies expressing openness to qualitative research and greater exposure to qualitative research methods in doctoral programs, also point to a rising recognition and acceptance of qualitative research.
Although interest seems to have grown, we know little about the extent to which health services and management researchers use qualitative research methods, the kinds of research questions examined with these methods, the types of qualitative methods employed, or the impact that qualitative research articles have had on the field. In an earlier review, Hoff and Witt (2000) explored these issues by examining articles published between 1995 and 1997 in nine major health services and management journals. They found that (a) only one in seven published articles used qualitative methods, (b) two of the nine journals they examined published nearly half of the articles using qualitative methods, (c) qualitative methods were used primarily for description and for articulating stakeholder perspectives, and (d) articles using qualitative methods often presented little or no information about methodological procedures. It is unclear whether this portrait of qualitative research in the mid-1990s still holds or whether the use of qualitative methods has changed over the past 10 years. Given the growing interest in qualitative research, it is timely to revisit these issues in order to provide researchers, journal editors, policy makers, and educators an updated picture of how often, for what purpose, and in what manner qualitative methods are used in health services and management research.
In this article, we examine the volume and characteristics of qualitative research articles published in the same nine health services and management journals reviewed by Hoff and Witt (2000). The time period covered in our review extends from 1998 to 2008. Our review explores the following questions: How prevalent is the use of qualitative methods in health services and management research? Has the use of qualitative methods increased, decreased, or remained steady over the past 10 years? What kinds of qualitative methods are researchers using, and for what purpose? How have articles using qualitative methods contributed to the field’s knowledge base? We examine these and other questions using content analytic methods modeled on those that Hoff and Witt (2000) used in their earlier review.
This article provides updated information to journal editors, reviewers, readers, and researchers on the volume and characteristics of qualitative research articles published in health services and management journals. For journal editors, the review provides information about overall publication rate, publication rate over time, publication rate by journal, and published article content. This would allow editors to compare the volume and type of qualitative research articles that their own journal publishes with those of competing journals in the field. Editors could use this information to examine and, possibly, alter formal policies and informal perceptions about their particular journal’s interest in qualitative research. For editors, reviewers, and readers, this review reveals the amount of methodological detail reported in published qualitative research articles. This information could inform editorial policies and reviewer expectations about the type and amount of methodological detail that qualitative research articles should contain. For researchers, the review indicates the types of qualitative methods that other researchers are using, the purposes and manner in which they are using them, and the journals in which they are publishing their qualitative research findings. This information could inform the choices that researchers using qualitative methods make about study designs, research objectives, and journal submissions. In these ways, this review could influence the scope and substance of future published research in health services and management journals.
Renewed interest in qualitative research methods can be traced to growing concerns about the gap between what we know and what we need to know about health care financing, organization, delivery, and outcomes. Scholars have argued that health policy making, research, and management could benefit from more in-depth, textured descriptions of what actually happens in practice settings, health care markets, and patients’ lives (Beaton & Clark, 2009; Hurley, 1999; Nembhard, 2009; Shortell, 1999; Sinuff, Cook, & Giacomini, 2007; Sofaer, 2002; Steinhauser & Barroso, 2009). Likewise, scholars and practitioners alike increasingly recognize that context matters when transferring evidence-based practices from one setting to another or translating research evidence from randomized controlled trials into everyday clinical practice (Berwick, 2003; Grol, Wensing, & Eccles, 2005; Hoff & Sutcliffe, 2006; Keenan, 2007). Yet our knowledge of which contextual features matter, when they matter, and how much they matter remains limited (Glasgow & Emmons, 2007; Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004; Pope et al., 2002). Finally, scholars and practitioners have noted that rapid changes in health care often outstrip the capacity of existing data sets to address the issues at hand, and that the length of time needed to field new surveys is usually too long to meet the time-sensitive information needs of health care managers and policy makers (Lavis et al., 2002; Shortell, 1999; Walshe & Rundall, 2001).
Qualitative research methods are well suited for addressing such gaps in our knowledge. Qualitative methods are useful for developing rich, detailed description; studying the complex interplay of action and context; exploring new or rapidly emerging phenomena; and generating theoretical insights (Bradley et al., 2007; Creswell, 2007; Devers, 1999; Hoff & Witt, 2000; Hurley, 1999; Nembhard, 2009). Qualitative research can be described as
… an inquiry process of understanding based on distinct methodological traditions of inquiry that explore a social or human problem. The researcher builds a complex, holistic picture, analyzes words, reports detailed views of informants, and conducts the study in a natural setting (Creswell, 1998, p. 15).
As this definition emphasizes, qualitative research embraces a variety of traditions or modes of inquiry, including phenomenology, grounded theory, ethnography, discourse analysis, narrative analysis, and case study. Creswell (1998, p. 2) notes that each of these traditions has “a distinguished history in one of the disciplines” and has “spawned books, journals, and distinct methodologies that characterize its approach.” Different traditions are suited for different purposes (Devers, 1999). For example, grounded theory is useful for generating theory on an inductive basis (Charmaz, 2006; Corbin & Strauss, 2008; Strauss & Corbin, 1998). Discourse analysis, on the other hand, is useful for examining how actors use language to frame issues, actions, and events (Johnstone, 2008; Wodak & Krzyzanowski, 2008). Qualitative research embraces several data collection strategies: interviews, observation, focus groups, field work, and use of archival materials such as reports, memos, agendas, and minutes (Charmaz, 2006; Krueger & Casey, 2009; Marshall & Rossman, 2006; Miles & Huberman, 1994; Ulin, Robinson, Tolley, & Speizer, 2005). These data collection methods may be used separately or they may be combined with each other or with quantitative data collection methods, such as surveys (Creswell & Plano Clark, 2007). Qualitative data can be analyzed in a variety of ways, depending on the research aim. Common analysis strategies include “thick” description, theme identification, and pattern matching. Although qualitative methods are diverse, they share several features: (a) a focus on actors’ perceptions and “lived experience;” (b) close attention to the context embedding a phenomenon; (c) an emphasis on interpretative rather than statistical forms of pattern matching; and (d) an iterative process of sampling, data collection, and data analysis (Creswell, 2007; Marshall & Rossman, 2006; Miles & Huberman, 1994).
Several factors could influence the prevalence and characteristics of qualitative research articles appearing in health services and management journals. One such factor is the number of researchers conducting qualitative health services and management research. This factor is influenced, in turn, by the funding priorities of federal agencies, foundations, and other organizations that sponsor health services and management research. It is also influenced by the availability and quality of training in qualitative research methods in graduate (doctoral) and postgraduate career stages. Another factor is researchers’ choice of publication outlet. Health services and management researchers can submit their qualitative research manuscripts to a growing number of journals, including many topically focused or disciplinary journals that routinely or periodically publish health services and management research (e.g., Implementation Science and Journal of Health and Social Behavior). Researchers’ choices are influenced by the degree to which they perceive a fit between the study’s focus and the journal’s focus, as well as the degree to which they perceive the journal as receptive to qualitative forms of research. A third important factor is the preference of journal editors and reviewers. Editors and reviewers may vary in the extent to which they value qualitative approaches to research, feel comfortable assessing the quality of qualitative research, or both. Last, the prevalence and characteristics of qualitative research articles could be influenced by the fit or lack of fit between the amount of information presented in studies using qualitative research methods and the page-space available in journals. Qualitative researchers often find it challenging to condense richly detailed data. Thus, where researchers publish qualitative articles, and what they report, may be influenced by journals’ page limitations and formatting requirements.
We performed a content analysis of research articles published in selected health services and management journals from 1998 to 2008. We followed Hoff and Witt’s (2000) methods as closely as possible to maximize the comparability of the current review. Specifically, we reviewed the same nine interdisciplinary journals in health services and management research included in the earlier review. The journals included Health Services Research; Medical Care Research and Review; Inquiry; Medical Care; Health Affairs; Milbank Quarterly; Journal of Health Politics, Policy, and Law; Health Care Management Review; and Journal of Healthcare Management. Hoff and Witt (2000) focused on these nine journals for two reasons. First, they argued, these journals exert the greatest influence on what health services researchers study and how they study it. Second, these journals ranked as high-quality journals in published surveys of health administration faculty (Brooks, Walker, & Szorady, 1991; McCracken & Coffey, 1996; see also Williams, Stewart, O’Connor, Savage, & Shewchuk, 2002). Like their review, ours excluded discipline- or topic-specific journals such as Journal of Health Economics and Journal of Patient Safety.
The review proceeded in four stages. First, we assembled a bibliographic library containing the references of all items published in the nine aforementioned journals between January 1998 and December 2008. We then excluded editorials, letters, commentaries, personal narratives, and book reviews. Following Hoff and Witt (2000), we also excluded articles published in special or supplemental issues. These authors justified this decision by arguing that the status of articles appearing in special or supplemental issues is often not clear (e.g., invited vs. regular submission, special vs. normal peer review). Such uncertainty makes it difficult to determine whether such articles are typical of what the journal publishes.
Second, we counted the number of articles that could be classified as research. To make this count, two authors independently reviewed abstracts using predetermined inclusion criteria and reconciled disagreements through discussion and re-review until they reached consensus. Following Hoff and Witt (2000), we defined “research articles” using two criteria: (a) inclusion of empirical questions or hypotheses and (b) analysis of data addressing these questions or hypotheses. We also included articles presenting data about temporal or geographic trends (e.g., cross-national comparisons in medical expenditures). We excluded literature reviews, conceptual articles (i.e., thought-pieces), opinion pieces, and firsthand accounts. Following Hoff and Witt (2000), we also excluded articles that focused on model, method, or measure development. “Methodologically oriented” articles, these authors argued, do not investigate specific research questions or hypotheses, but rather develop or evaluate means by which to investigate a phenomenon.
In the third stage, we counted the number of research articles that could be classified as qualitative research. To make this count, two authors independently reviewed abstracts to identify those articles that used qualitative data collection and analysis methods. When the abstract did not contain enough information to make this determination, the authors independently consulted the full text of the article.2 The two authors reconciled disagreements through re-review and discussion until they reached consensus. We did not count articles as qualitative research if they described the use of structured or standardized interviews (in which the data are encoded as numbers during data collection), the use of quantitative content-analytic techniques (in which the data are encoded and analyzed exclusively as numbers during data analysis), or the use of qualitative techniques limited to survey instrument development and not the empirical investigation of research questions or hypotheses.
In the fourth stage of the analysis, three authors performed a content analysis of the research articles that used qualitative data collection and analysis methods. Using a structured form, we abstracted the following information from the full text of the article: (a) the purpose of the article; (b) the data collection method; (c) the study design; (d) the level of methodological detail in the article; (e) the funding source for the research, if any; and (f) the number of pages. Table 1 shows the “dictionary” that we used to structure the abstraction form.
Our abstraction procedures followed those of the earlier review, with four exceptions. First, we systematically abstracted information about study design. Hoff and Witt (2000) did not abstract such information, although they discussed study design issues in their results. Second, we added a category for funding sources to account for research supported by universities, unions, or other funding sources (e.g., World Health Organization). Hoff and Witt had no such category. Third, we used MeSH terms from PubMed to characterize the topical focus of the articles identified in our review. Hoff and Witt accomplished this task by listing an abbreviated title for the 47 articles for which they abstracted information. Our review contained too many articles to make this approach practical. We therefore used MeSH terms to classify and summarize the topics examined in qualitative research articles because the terms are extensive (i.e., comprehensive), explicitly defined, and consistently applied. Finally, we used the ISI Social Science Citation Index to obtain a count of the number of times each article had been cited, as of March 2009. Hoff and Witt did not examine this information in their earlier review. We added this information to the current review because we wanted to gauge the impact of qualitative research articles in the health services research literature. Specifically, we wanted to know whether qualitative research articles were cited as frequently as quantitative research articles. To make this comparison, we constructed a random sample of research articles using quantitative methods, stratified by journal and year of publication to match the 329 qualitative research articles identified in the review.
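The matched sampling just described can be sketched in code. The following is an illustrative reconstruction, not the authors' actual procedure; the article records, field names, and function name are all hypothetical:

```python
import random
from collections import defaultdict

def stratified_match_sample(quant_articles, qual_articles, seed=0):
    """Draw a random sample of quantitative articles matched to the
    qualitative articles on journal and publication year.

    Each article is a dict with at least 'journal' and 'year' keys.
    Assumes every (journal, year) stratum in qual_articles has at
    least as many quantitative articles available in the pool.
    """
    rng = random.Random(seed)
    # Group the pool of quantitative articles by stratum (journal, year).
    pool = defaultdict(list)
    for article in quant_articles:
        pool[(article["journal"], article["year"])].append(article)
    # Count how many qualitative articles fall in each stratum.
    need = defaultdict(int)
    for article in qual_articles:
        need[(article["journal"], article["year"])] += 1
    # Sample, without replacement, the same number of quantitative
    # articles per stratum as there are qualitative articles.
    sample = []
    for stratum, n in need.items():
        sample.extend(rng.sample(pool[stratum], n))
    return sample
```

Stratifying on journal and year in this way controls for outlet- and period-specific differences in citation practices before comparing citation counts across the two groups.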
To assess the reliability of our study procedures, we examined the level of intercoder agreement at each stage of our review using Cohen’s Kappa or percentage agreement. Unlike Hoff and Witt (2000), we relied primarily on abstracts to determine whether an article counted as a research article and, if so, whether it used qualitative methods of data collection and analysis. To gauge the reliability of using abstracts to make these determinations, we conducted a full-text review of a 20% random sample of articles considered not research in the second stage, and a full-text review of another 20% random sample of articles considered not qualitative research in the third stage.
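Cohen's Kappa adjusts raw intercoder agreement for the agreement expected by chance, given each coder's marginal label frequencies. A minimal sketch of the computation, assuming each coder's decisions are stored as a parallel list of nominal labels (the function name and data layout are illustrative, not taken from the study):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's Kappa for two coders' nominal labels on the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: proportion of items coded identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: expected overlap given each coder's marginals.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)
```

For example, two coders who agree on three of four articles, one coding them research/research/not/not and the other research/not/not/not, have 75% raw agreement but 50% expected chance agreement, yielding a Kappa of .50.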
Figure 1 summarizes the results of the first three stages of our review. We began with 8,377 items. Excluding letters, editorials, commentaries, and the like reduced the count to 6,810 articles. Based on abstract review and reconciliation, we classified 3,637 articles as research articles. Intercoder agreement prior to reconciliation reached “near perfect” levels (Κ = .93).3 Of the 627 articles subjected to both abstract review and full-text review, 32 were reclassified as research articles based on full-text review. Specifically, 23 were reclassified as quantitative and 9 were reclassified as qualitative. In sum, we correctly classified 95% of articles as not research articles based on abstract review alone.
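The 95% figure follows directly from the counts in the preceding paragraph: 627 articles received both abstract and full-text review, and 32 of them were reclassified after full-text review. A quick arithmetic check (all numbers taken from the text):

```python
reviewed = 627       # articles given both abstract and full-text review
reclassified = 32    # moved to "research article" after full-text review
accuracy = (reviewed - reclassified) / reviewed
print(f"{accuracy:.1%}")  # prints "94.9%", i.e., roughly 95%
```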
In the second stage, we reviewed 3,637 articles. Based on abstract review, full-text review in uncertain cases, and reconciliation, we identified 332 articles using qualitative data collection and analysis methods. Intercoder agreement prior to reconciliation reached “near perfect” levels (Κ = .92).4 Of the 727 articles subjected to both abstract review and full-text review, 5 were reclassified as qualitative research articles based on full-text review and 1 was reclassified as not research. In other words, in the second stage, we correctly classified 99% of articles as not qualitative research based on abstract review alone.
In the third stage, we reviewed the full text of 346 articles that used qualitative research methods (i.e., 332 articles identified in the second stage, plus 14 articles identified in the reliability checks). Three authors divided these articles evenly and abstracted the characteristics listed in Table 1. Two other authors each independently re-abstracted a separate 10% random sample of these articles. The abstraction team achieved 82% agreement with one re-abstraction team member and 88% agreement with the other. Disagreements occurred most frequently in coding for article purpose: Many articles have several aims, and distinctions between purpose codes were subtle. We reclassified 17 articles in the third stage based on abstraction results. Nine of these articles used only quantitative methods, although they were characterized by the authors as case studies. The remaining eight articles were not considered empirical research on closer inspection (e.g., no systematic data collection). The final count of articles that used qualitative research methods was 329.
Figure 2 shows the frequency with which research articles using qualitative methods appeared in the nine health services and management research journals included in the review. Overall, 9% of all research articles published in these journals between 1998 and 2008 used qualitative methods. This figure is lower than that reported by Hoff and Witt (2000). They observed that 14% of research articles appearing in these journals between 1995 and 1997 used qualitative methods. Although the number of research articles using qualitative methods fluctuates from year to year, the average number of research articles using these methods over the 10-year period approximates the figure that Hoff and Witt reported for the earlier time period: roughly 30 articles per year. By contrast, the average number of research articles using quantitative methods has grown substantially. Between 1998 and 2008, the nine journals published an average of 330 quantitative research articles per year. Hoff and Witt report that, between 1995 and 1997, these journals published an average of 226 quantitative research articles per year. It appears, then, that the decline in the percentage of published articles using qualitative research methods reflects an increase in the publication of quantitative research articles rather than a decrease in the publication of qualitative research articles.
Figure 3 shows the distribution of the 329 research articles using qualitative methods across the nine health services and management research journals. Health Affairs published the highest percentage of the 329 research articles using qualitative methods (35%), followed by Health Care Management Review (15%), Journal of Healthcare Management (12%), Health Services Research (12%), and Journal of Health Politics, Policy, and Law (12%). Only a small percentage of the 329 research articles using qualitative methods appeared in Medical Care, Medical Care Research and Review, or Inquiry.
These findings generally match those reported in the earlier review, with three exceptions. Milbank Quarterly and Journal of Health Politics, Policy, and Law accounted for a smaller percentage of the qualitative research articles published between 1998 and 2008 than they did between 1995 and 1997 (8% vs. 19%, and 12% vs. 25%, respectively). These trends are partly explained by the fact that both journals published fewer research articles annually between 1998 and 2008 than they did between 1995 and 1997. Milbank Quarterly published an average of 8.4 research articles per year between 1998 and 2008, compared with an average of 10.4 research articles per year between 1995 and 1997. Journal of Health Politics, Policy, and Law showed a similar downward trend. In both journals, however, the average annual publication rate of qualitative research articles showed an even steeper decline. Milbank Quarterly published an average of 2.6 qualitative research articles per year between 1998 and 2008, compared with an average of 6 per year between 1995 and 1997. Similarly, Journal of Health Politics, Policy, and Law published an average of 3.9 qualitative research articles per year between 1998 and 2008, compared with an average of 8 qualitative research articles per year between 1995 and 1997. The reasons for these declines are unclear.
In contrast, Health Services Research accounted for 13% of the qualitative research articles published between 1998 and 2008, whereas in the earlier period (1995 to 1997), the journal published no qualitative research articles. This increase coincided with a change in the journal’s editor in 1997 (Stephen Shortell, PhD) and the publication of the supplemental issue on qualitative research methods in 1999. However, from 2000 to the present, Medical Care Research and Review has had two well-known editors who have published many research articles that used qualitative methods (Jeffrey Alexander, PhD, and Gloria Bazzoli, PhD). Yet only a small percentage of qualitative research articles published during these years appeared in the journal. It seems that it takes more than the appointment of a sympathetic editor to increase the number of qualitative research articles published in a health services and management journal.
Health services and management researchers have used qualitative methods to investigate a wide range of issues. PubMed assigned 600 “topical” MeSH terms to the 329 qualitative research articles identified in this review. (We excluded “human,” “United States,” “female,” and other less descriptive terms from our definition of topics.) Table 2 lists the 25 MeSH terms most frequently assigned to qualitative research articles. As expected, qualitative methods have been used to examine management-oriented topics such as innovation, leadership, decision making, total quality management, and organizational culture. For example, McAlearney (2008) used data from three qualitative studies to examine the potential for leadership development programs to improve quality and efficiency in hospitals and health systems. Similarly, Lukas et al. (2007) used a comparative (multiple) case study design to build and explore an organizational model of transformational change in health care delivery systems. Health services researchers did not use qualitative methods solely to explore management-oriented topics, however. Other, more policy-oriented topics included health care financing, health reform, market competition, health insurance benefits, managed care, and health services access. Devers et al. (2003), for example, used qualitative and quantitative data from the Community Tracking Study to describe how hospitals’ negotiating leverage with managed care plans changed from 1996 to 2001 and to identify factors that explained such changes. Similarly, McCormack, Gabel, Whitmore, Anderson, and Pickreign (2002) used data from national surveys and key informant interviews to examine changes in retiree health benefits from 1998 to 2001.
Other topics examined with qualitative research methods, albeit with less frequency, include physician–patient communication and decision making (McGuire, McCullough, Weller, & Whitney, 2005; Sarkisian, Hays, Berry, & Mangione, 2001; Tucker et al., 2003), team models of care delivery (Best, Hysong, Pugh, Ghosh, & Moore, 2006; Cohen et al., 2004; Dreachslin, Hunt, & Sprainer, 1999; Jacobson, Parker, & Coulter, 1998), consumer use of cost and quality information (O’Day, Palsbo, Dhont, & Scheer, 2002; Smith, Gerteis, Downey, Lewy, & Edgman-Levitan, 2001; Sofaer, Crofton, Goldstein, Hoy, & Crabb, 2005), and consumer engagement in health plan choice (Fraser, Chait, & Brach, 1998; Lave, Peele, Black, Evans, & Amersbach, 1999; Sarkisian et al., 2001).
In addition to looking at a wide range of topics, health services researchers have used qualitative methods for a variety of research purposes (see Table 3). In all, 37% of the qualitative research articles identified in our review focused primarily on description (i.e., the way things are). For example, Barr et al. (2006) conducted interviews with hospital executives to explore how hospitals are using public reports of patient satisfaction to identify and target quality improvement activities, evaluate performance, and monitor progress. A total of 27% of the articles identified in our review focused primarily on process, or how things have changed over time. To illustrate, Felland, Lesser, Staiti, Katz, and Lichiello (2003) used three rounds of site visit data from the Community Tracking Study to determine how the capacity and viability of local health care safety nets had changed from 1996 to 2001. A somewhat smaller percentage of articles (18%) used qualitative methods to explore people’s views and perspectives on issues or events. Often such articles examine people’s views about how things should be or could be. Lave et al. (1999), for example, conducted focus groups to explore what employees of large firms think should or should not change in employer-sponsored health plans. Finally, 13% of the articles in our review sought to explain how or why something happens or exists. In some cases, explanatory articles employed an explicit conceptual model, as in Helfrich, Weiner, McKinney, and Minasian’s (2007) study of the factors affecting the implementation of complex innovations in health care. In other cases, the authors took a more inductive explanatory approach, as in R. S. Brown and Gold’s (1999) multimarket study of what drives Medicare managed care growth. In general, the findings for research purpose observed in the present review corresponded to the findings for research purpose observed in Hoff and Witt’s (2000) review.
The case study was the most common study design. Of the 329 qualitative research articles in our review, 59% used a multiple-case study design and 11% used a single-case study design. It is important to note, however, that we defined “multiple-case study design” loosely as a design involving the collection of data from multiple organizations, markets, or states. Most of the articles classified as multiple case studies did not describe a systematic strategy for comparing cases (e.g., replication logic, qualitative comparative analysis). Some authors made comparative statements in the results section with no mention of a systematic analysis strategy (L. D. Brown & Kraft, 2008; Fossett et al., 2000; Luck & Peabody, 2000; McHugh, Staiti, & Felland, 2004). Other authors did not make any “cross-case” comparisons. Instead, they simply reported aggregate results for the sample as a whole (Chernew, Jacobson, Hofer, Aaronson, & Fendrick, 2004; Hall, 2000; McCue, Hurley, Draper, & Jurgensen, 1999; Rosenthal, Landon, Howitt, Song, & Epstein, 2007).
Fewer than 10 of the qualitative research articles published in the nine journals included in this review employed an ethnographic, grounded theory, phenomenological, or biographical approach. Four studies used ethnographic methods (Dohan, 2002; Ensign, 2004; Malone, 1998; Waitzkin, Schillaci, & Willging, 2008). For example, Malone (1998) conducted ethnographic fieldwork and interviews in two inner-city hospital emergency departments (EDs) over a 12-month period to gain insight into the phenomenon of emergency services overutilization from the perspective of heavy users of the ED. Her study revealed that heavy ED utilization is not merely a function of unmet medical need, but also a function of what historically were considered “alms-house” needs. Many exploratory studies made passing references to grounded theory, but four articles used grounded theory methods systematically in data collection and analysis (Hysong, Best, Pugh, & Moore, 2005; Jacobson, Dalton, Berson-Grand, & Weisman, 2005; Smythe, Malloy, Hadjistavropoulos, Martin, & Bardutz, 2006; Wallack, Weinberg, & Thomas, 2004). Hysong et al. (2005), for example, took a grounded theory approach to explore employees’ mental models of clinical practice guidelines in 15 Veterans Health Administration facilities. While the study did not produce a theory, it did generate hypotheses about the types of mental models that employees hold about clinical practice guidelines and the degree of consensus in employees’ mental models in high-performing versus low-performing facilities. Only one qualitative research article identified in our review took a phenomenological approach. Using the phenomenological concept of embodiment, Hudak, McKeever, and Wright (2004) explored the meaning of patient satisfaction with treatment outcomes among 31 individuals who had undergone elective hand surgery.
In addition to highlighting the emotional and social aspects of patient satisfaction, their findings suggested that interviews that facilitate patients’ experience of body-self unity could promote satisfaction with treatment outcomes. Finally, no study identified in our review used the biographical approach.
Somewhat more than a quarter (28%) of the articles took a general approach that did not fit any of the five qualitative research traditions described by Creswell (1998). For example, Miller and Sim (2004) conducted interviews with 90 physicians and managers in 30 health care delivery organizations to identify barriers to physician use of electronic medical records. Tucker et al. (2003) conducted focus groups to examine cultural sensitivity in physician–patient relationships from the perspectives of ethnically diverse and low-income primary care patients. Finally, Nembhard (2009) used qualitative and quantitative methods to explore the features of quality improvement “collaboratives” that participants valued most.
There is some concern that health services and management researchers have borrowed qualitative research methods without a deep appreciation of the philosophical perspectives, conventions, and customs of established qualitative traditions. Data to support or refute this concern are scarce. However, our review indicates that qualitative research articles that took a general approach reported levels of methodological detail comparable to those of articles that fit one of the five qualitative research traditions described by Creswell (1998).
Qualitative research can employ a variety of data collection methods. Almost half (47%) of the qualitative research articles identified in the current review collected data through interviews. About a third (32%) collected archival data (e.g., reports and other documents). About an equal proportion of studies collected data through focus groups or observations (9% and 11%, respectively). Hoff and Witt (2000) reported a similar pattern, although they observed a higher percentage of qualitative articles using focus groups than using observations.
In case study research, the use of multiple methods of data collection is valued because the convergence of multiple types of evidence (e.g., interviews, documents, and observations) enhances the credibility of the analysis (Yin, 2003). Although this form of “triangulation” is not emphasized in all qualitative research traditions (e.g., phenomenology), authors of general texts on qualitative methods recommend using more than one data-gathering method (Marshall & Rossman, 2006; O’Donoghue & Punch, 2003; Ulin et al., 2005). Contrary to this general advice, half of the qualitative research articles identified in our review used only one data collection method, with interviews comprising the sole method in 80% of these single-method studies. Of concern is that 33% of the single-case studies and 45% of the multiple-case studies in our review used only the interview method of data collection. Interviews can provide valuable insight into people’s intentions, perceptions, causal inferences, and explanations. Yet interviews can suffer from the problems of memory decay, impression management, over- or under-rapport with the interviewer, and poorly articulated questions. Because every type of evidence has its strengths and weaknesses, Yin (2003) recommends that case study researchers use the strengths of one type to compensate for the weaknesses of another.
Social scientists have long discussed the possibilities and pitfalls of combining qualitative and quantitative research methods (Campbell & Fiske, 1959; Denzin, 1978; Jick, 1979). In recent years, interest has grown in “mixed methods” as a distinctive approach to conducting research. There are now several methodological texts on mixed methods design (Creswell, 2009; Hesse-Biber & Leavy, 2006; Newman & Benz, 1998; Tashakkori & Teddlie, 2003) and a journal devoted to publishing mixed methods research (Journal of Mixed Methods Research). Reflecting this increased interest, our results indicate greater use of mixed methods in articles published in health services and management journals. In total, 21% of the 329 articles in the current review combined qualitative and quantitative methods of data collection and analysis, compared with 12% of the articles in the earlier review (Hoff & Witt, 2000). Most mixed methods articles combined surveys with interviews. A few combined surveys with focus groups or archival methods. Publication of articles using mixed methods did not vary systematically by journal or display an obvious yearly trend.
A major finding in Hoff and Witt’s (2000) earlier review was that most of the research articles using qualitative methods presented little or no information about how the data for the study were collected and analyzed. Specifically, only 17% of the articles provided “extensive” methodological detail, which Hoff and Witt defined as information on data collection, analysis, and site selection (if a case study) that spanned at least one journal page. Almost half (49%) of the articles in their review provided “few details,” meaning that the information presented on methods was one page or less in length and omitted one or more of the following: data collection, data analysis, site selection (if a case study), or methodological limitations. A third (34%) of the articles in the earlier review offered no discernible discussion of methods anywhere in the article.
Analysis of the 329 articles in the current review indicates that the level of methodological detail reported in qualitative research articles has improved. The percentage of articles providing “extensive” methodological detail more than doubled to 40%, using Hoff and Witt’s (2000) definition. Although the percentage of articles providing “few details” remained roughly the same (52%), the percentage offering no discernible discussion of methods dropped substantially to 9%.
Editors of some health services and management journals encourage authors to reduce the amount of methodological information in the published form of their research articles to accommodate journal length constraints or a perceived lack of audience interest in methodological detail. The movement toward shorter methods sections, at least in some journals, makes Hoff and Witt’s (2000) page-length criterion for “extensive” methodological information somewhat problematic. As an alternative, we relaxed the definition of “extensive” to mean that the methods section described, even if only briefly, the following three items: (a) the study’s sample (e.g., number, characteristics) and sampling procedures (e.g., inclusion criteria, recruitment); (b) the study’s data collection procedures (e.g., frequency of data collection, types of data collected, data capture/recording methods); and (c) the study’s analytic procedures (e.g., coding, memo writing, theme identification, replication). Using this relaxed definition, more than half (57%) of the articles in the current review offered “extensive” detail about the study’s methods. A little more than a third (35%) did not discuss at least one of the three items noted above; in most of these cases, the missing item was the study’s analytic procedures. The remaining 8% of articles offered no methods discussion at all. Methodological detail did not display any clear trend over time or by journal, regardless of whether Hoff and Witt’s (2000) or the looser definition was used.
Funding is a key factor for promoting or limiting qualitative research. Our review suggests that the Robert Wood Johnson Foundation, the W. K. Kellogg Foundation, and other foundations are the principal source of funding for studies using qualitative methods. About half of the qualitative research articles in our review (48%) acknowledged foundations as a source of financial support. Federal and state agencies appear to play a smaller role in funding qualitative health services and management research. Only 26% of the qualitative research articles identified in our review reported a government funding source. Other possible sources (e.g., universities, unions, World Health Organization) were acknowledged even less frequently. Overall, 10% of articles acknowledged multiple funding sources.
In their earlier review, Hoff and Witt (2000) found that 51% of the articles they reviewed acknowledged no funding source. In the current review, only 30% reported no source of funding. This decline could reflect improvement in the prospects for securing funding for research studies using qualitative methods. Alternatively, it could simply reflect changes in professional norms or journal policies concerning the acknowledgement of funding sources in published articles. It is worth noting that although we did find that some qualitative research was funded on a stand-alone basis, more often we found qualitative research was funded as part of a mixed methods study. Examples include the Community Tracking Study, funded by the Robert Wood Johnson Foundation, and the New Federalism project, funded by several foundations.
A common complaint among qualitative researchers is that the standard-length research article is a poorly suited vehicle for conveying the richness of qualitative research findings. While this might be true, qualitative researchers seem to have adapted to the field’s conventions about article length. The average page length observed in this review (16.5 pages) was shorter than we expected and, surprisingly, shorter than the average page length reported in Hoff and Witt’s (2000) earlier review (20 pages). For the articles in this review, the median page length was 13, with a range from 3 to 58 pages. Hoff and Witt (2000) reported a higher median, but a similar range. Page length exhibited no obvious trend over time. Average page length varied somewhat across journals. Qualitative research articles published in Medical Care and Health Affairs averaged 9 and 11 pages, respectively. Qualitative research articles published in Milbank Quarterly and Journal of Health Politics, Policy, and Law averaged 27 and 30 pages, respectively. Articles published in the other journals included in this review fell in between.
Finally, we wanted to know how articles that use qualitative methods have contributed to the field’s knowledge base. This is a complex question that admits no simple answer. One way to gauge an article’s contribution to the scientific literature is to examine the number of times it is cited in later articles. With the passage of time, articles making substantial contributions to scientific knowledge are likely to be cited with greater frequency than articles making moderate or minor contributions. For the 329 research articles using qualitative methods identified in our review, the average number of times articles were cited was 8.6 (range 0 to 89). The median number of citations was 5 and the mode was 1. For comparison, we constructed a random sample of research articles using quantitative methods, stratified by journal and year of publication to match the 329 qualitative research articles identified in the review. For this matching sample of quantitative research articles, the average number of times the articles were cited was 11.3 (range 0 to 220). The median number of citations was 5 and the mode was 0. Table 4 lists the 25 most frequently cited qualitative research articles appearing in the journals included in this review.
Given the strong positive skew in the citation distributions for both qualitative research articles and quantitative research articles, the mean is a less informative measure of central tendency than either the median or the mode. On the basis of these two measures, our analysis of citation frequencies suggests that qualitative research articles contribute comparably to the field’s scientific knowledge base, even though they appear less frequently in the published literature than quantitative research articles do.
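The effect of positive skew on these summary statistics can be illustrated with a small, entirely hypothetical citation distribution (the counts below are invented for illustration and are not drawn from the review):

```python
import statistics

# Hypothetical citation counts for 12 articles: most cited rarely,
# one cited heavily -- a strongly right-skewed distribution.
citations = [0, 1, 1, 1, 2, 3, 4, 5, 6, 9, 15, 89]

mean = statistics.mean(citations)      # pulled upward by the single outlier
median = statistics.median(citations)  # robust to the heavy right tail
mode = statistics.mode(citations)      # the most common citation count

print(f"mean={mean:.1f}, median={median}, mode={mode}")
```

In a distribution like this, the lone highly cited article inflates the mean well above the median and mode, which is why comparisons of typical impact across the two article samples rest more safely on the latter two measures.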
Ten years ago, Hoff and Witt (2000) explored the use and characteristics of qualitative methods in nine health services research and management journals. The publication of their review, which covered the period 1995 to 1997, followed on the heels of two important events: a 1998 all-day forum on qualitative methods sponsored by the Agency for Health Care Policy and Research and the Robert Wood Johnson Foundation, and the 1999 publication of articles from that forum in Health Services Research. Many hoped that these events would promote greater understanding, acceptance, and use of qualitative methods in a field dominated by quantitative approaches to empirical inquiry. What has happened since then? What has changed and what has not?
Our 10-year review of the same nine health services research and management journals presents a mixed picture of progress. On the one hand, these journals have published more than 300 research articles that used qualitative methods. These articles have enriched our knowledge of a wide range of health policy and management issues. More qualitative research articles published in these journals used mixed methods of data collection and analysis, and more reported funding support than Hoff and Witt (2000) found in their earlier review. Finally, the qualitative research articles published in these journals have been cited about as frequently as quantitative research articles, suggesting that articles using qualitative methods make a comparable contribution to the field’s knowledge base.
On the other hand, qualitative research articles constituted a small proportion of the empirical research literature published in these nine journals. In fact, the proportion declined from 14% during the 1995–1997 period to less than 10% in the 1998–2008 period. Although the qualitative research articles appearing in these journals exhibited substantial topical diversity, they exhibited limited methodological diversity. Case study research designs predominated while other research designs, such as ethnography, grounded theory, phenomenology, and biography, rarely appeared. Finally, many of the qualitative research articles published in these journals omitted basic information about study methods, especially information about analysis procedures. Although methodological reporting has improved since Hoff and Witt’s (2000) review, a high percentage of qualitative research articles did not meet widely held expectations about what research reports should include.
Why has the publication of qualitative research articles in nine core health services and management journals not kept pace with the publication of quantitative research articles? There are a number of possible and related explanations, all of which merit further investigation. The findings might reflect the limited production capacity of a small, but slowly growing, community of health services and management researchers trained and interested in using qualitative methods. Likewise, they might reflect production constraints arising from the challenges that researchers report in obtaining funding, especially federal funding, for studies that rely heavily or exclusively on qualitative methods. Alternatively, they might reflect the growing number of choices that health services and management researchers have for publishing their qualitative research. Many clinical, nursing, management, and social science journals will occasionally publish health services and management research articles, including ones that use qualitative methods. New topically focused journals such as Journal of Quality and Safety, Journal of Patient Safety, and Implementation Science represent additional outlets for both quantitative and qualitative health services and management research. It is not clear what effect the proliferation of journal options has had on the distribution of published qualitative research articles, or whether the publication of such articles “outside” the nine core journals included in this review is cause for concern. It is less of an issue if authors choose to publish their qualitative work in these alternative journals because these journals better fit the research topic or reach a different target audience. It is more of an issue if authors seek alternative outlets because they perceive health services and management journals as unreceptive to qualitative research.
Several steps could be taken to increase the use of qualitative methods in health services and management research and, consequently, the representation of qualitative research articles in health services and management research journals. First, more doctoral programs could require students to take at least one course in qualitative research methods. Four years ago, our institution developed such a course and required doctoral students to take it. Since then, we have seen more students conduct dissertations that involve qualitative or mixed methods. Even students who do not plan to use qualitative methods report a greater appreciation of qualitative approaches to empirical inquiry, as well as greater comfort and skill in appraising the quality and contribution of qualitative research articles. In the future, these students will become reviewers, study section members, and journal editors. Although some might balk at such a requirement, we believe that qualitative research methods will not become fully accepted or commonly used if training in such methods remains an optional component of doctoral education.
Of course, what appears in the published literature is not simply a function of researcher training and interest. It is also a reflection of the research priorities and review processes of funding sources. Although federal agencies played a limited role in funding the qualitative research articles identified in our review, federal support for studies using qualitative methods or mixed methods could improve, as these methods are well suited for addressing current concerns such as patient safety, electronic medical records, patient-centered medical homes, care coordination, and innovation implementation. To assure fair reviews, scientific administrators must ensure that study sections have members with expertise in qualitative methods. Members who do not have such expertise must also be open to qualitative approaches to empirical inquiry.
Finally, editors interested in seeing more qualitative research articles published in their journals might examine whether the journal’s formal policies or informal practices create impediments to the receipt, review, and acceptance of manuscripts using qualitative methods. Authors’ perceptions of a journal’s receptivity to qualitative research depend in large part on the journal’s track record in publishing qualitative research articles. The remarkable increase in the percentage of qualitative research articles appearing in Health Services Research over the past 10 years can likely be attributed in part to Stephen Shortell’s editorial leadership, the journal’s publication of the 1999 special supplement on qualitative methods, and the sustained commitment of subsequent editors.
Increasing the methodological diversity of qualitative research articles that appear in health services and management research journals might prove more challenging. Disciplines are defined in part by their methods of inquiry. Although health services and management research is an interdisciplinary field, it remains to be seen whether ethnographic, phenomenological, and narrative approaches will successfully migrate across the field’s permeable boundaries from their home disciplines in anthropology, psychology, and the humanities. As illustrated by the handful of examples identified in this review, these qualitative methods can be useful for investigating health policy and management issues. A well-conducted phenomenological study of the “lived experience” of primary care physicians, for instance, could inform current debates about health care workforce issues, pay for performance, electronic medical records, and medical homes (Hoff, 2010). Encouraging the submission and publication of health services and management research using these approaches might require extraordinary efforts to send a strong signal of welcome and acceptance. Such efforts might include editorials commenting on the value and need for research using these approaches, special supplements or calls for research papers on topics that lend themselves to investigation using these approaches, and articles that describe the epistemological assumptions, theoretical perspectives, and methodological procedures of these approaches.
In their earlier review, Hoff and Witt (2000) found that fewer than one in five qualitative research articles provided at least one page of detail about sampling, data collection, and analysis procedures. The figure, according to our review, more than doubled to 40% in the past 10 years. Relaxing the one-page requirement raised the figure to 57%. While encouraging, these findings still suggest a troubling lack of adequate methods description in 40% to 60% of qualitative research articles. An article with only a single sentence each for sampling, data collection, and analysis procedures would have met the highly relaxed definition of “extensive” reporting that we added in this review. Yet almost half of the qualitative research articles published in the nine health services and management journals included in this review did not even meet this minimum criterion.
Some journals place less emphasis than others do on methods and, therefore, require less reporting of methodological detail. The methods sections of articles published in Health Affairs, for example, are often very brief, presumably because the journal’s audience has a limited interest in how a particular research study is conducted. However, we saw no clear pattern in the amount of methodological detail found in qualitative research articles published in different journals. This suggests that the problem of inadequate methodological reporting is not limited to journals that publish shorter articles or that minimize methodological detail because of a perceived lack of audience interest. For example, more than half of the qualitative research articles published in Journal of Health Politics, Policy, and Law did not meet Hoff and Witt’s (2000) definition of “extensive” reporting or our more relaxed definition. The average length of a qualitative research article in this journal was 30 pages, nearly three times the average length of articles in Health Affairs. The same is true for Milbank Quarterly.
It is unclear whether authors are simply not reporting such information at all, or whether they are reporting it in the review stage, but then cutting it at the publication stage because of page constraints or editorial pressure. Regardless, it seems reasonable to expect published qualitative research articles to describe, even if only briefly, the study’s sampling, data collection procedures, and data analysis procedures. This expectation should not be controversial. It accords with the general standards for research reporting found in methodological textbooks of both the quantitative and qualitative variety, as well as the professional norms transmitted through education, training, and peer review. Journal conventions can, and should, dictate the amount of methodological detail provided. But any published research article, whether qualitative or quantitative, should provide the reader at least some indication of what the sample includes, how the data were obtained, and how the data were analyzed. For qualitative research articles, it is especially important that published articles contain at least some details about the study’s analysis procedures so that readers can understand how the authors transformed the mountain of transcribed words and documents that they collected into the succinctly stated results and interpretations that appear in the article. Likewise, it is important for studies that have a comparative intent (e.g., multiple case studies) to describe, even if only briefly, the procedures used to make comparisons and, equally important, the rationale for making such comparisons given the study’s research question. Expecting published qualitative research articles to provide a level of methodological detail equivalent to what journals expect of published quantitative research articles would enhance the standing of qualitative methods as legitimate tools for scientific inquiry in the field of health services and management research.
Two limitations merit discussion. First, we focused our review on the nine health services and management journals that Hoff and Witt (2000) included in their earlier review. In making this choice, we sought to maximize the continuity and comparability of the two reviews. Health services and management researchers can, and sometimes do, publish qualitative research articles in clinical, disciplinary, or topically focused journals. This review does not, and could not, provide a complete account of all qualitative research articles published by health services and management researchers over the past 10 years. Apart from the sheer volume of work that such an account would entail, the inclusion of clinical, disciplinary, or topically focused journals in a review would immediately confront two questions for which there are no readily acceptable answers: (a) which of the dozens of journals that might contain health services and management research articles should be included in the review and (b) what would count as a “health services and management research” article?
In the absence of any definitive or defensible answers to these questions, we can only speculate about how our findings might have differed had we expanded the scope of our review. A broader review would certainly identify additional qualitative health services and management research articles. However, given the quantitative orientation of the field, a broader review would probably turn up even more quantitative health services and management research articles. Thus, the proportion of health services and management research articles using qualitative, as opposed to quantitative, methods would likely decline. Whether the characteristics of qualitative research articles would differ if the scope of the review were broadened is debatable, since it would depend on which journals were included and how articles were identified as health services and management research articles, or not. For example, one might see more qualitative research articles focused on theory building if one included disciplinary journals like Administrative Science Quarterly. Likewise, one might see more qualitative research articles using grounded theory approaches if one included the journal Qualitative Health Research, assuming one could distinguish the health services and management research articles from the nursing, health psychology, and social work articles that also appear in that journal.
In their review, Hoff and Witt (2000) characterized the nine journals that they examined as “core” journals within the field of health services and management research. We concur. These are the journals that are affiliated with the field’s professional associations, the journals that people think of when they say “health services and management research,” the journals for which researchers in the field serve as reviewers and editors, the journals that are recognized and valued in tenure and promotion decisions. By focusing on nine “core” journals in the field, our review offers a fairly comprehensive view of the use and characteristics of qualitative methods in health services and management research.
The second limitation has to do with our use of content analysis, which involves an irreducible element of individual and collective judgment. Determining whether an article’s primary purpose is descriptive or explanatory, for example, can be challenging when an article discusses not only what happens but also why it happens. Even distinguishing between research and non-research articles can be difficult. Authors writing in policy-oriented journals, for example, do not always indicate whether the tables and graphs included in the article represent original analysis or simply visual displays of the cited data source. In conducting this review, we sought to maximize consistency and minimize error by developing explicit decision rules and code definitions, training research team members in study procedures, performing periodic calibration checks, and testing reliability in each stage of the review. While these steps reduce the potential for article misclassification or coding error, they do not eliminate it entirely.
The authors received no financial support for the research and/or authorship of this article.
Declaration of Conflicting Interests
The authors declared no conflicts of interest with respect to the authorship and/or publication of this article.
1See the following issue: Health Services Research 34(5 Pt 2); December 1999.
2One author looked up the full-text of 160 articles to determine the method used. The other author looked up the full-text of 142 articles. These articles represent a small fraction of the 3,447 articles that each author examined at this stage of the review. In most cases, article authors were explicit about the methods they used. The movement to structured abstracts helps in this regard.
3Reconciliation took place as follows. We created a library containing all 259 references on which a disagreement occurred. The authors met to discuss and refine the study’s exclusion criteria and definition of “research article” as it pertained to this study. Each author then conducted a second review of abstracts and, when abstracts were ambiguous, full texts of the articles. Based on this reexamination, the authors resolved 97% of disagreements from the first round of the first stage of analysis. The main cause of initial discrepancy was an initial misclassification by one reviewer of some review- and methods-focused articles as “research articles” (86%). The remaining 3% of the original discrepancies were resolved by further clarification of the definition of “research article.”
4Reconciliation took place as follows. We created a spreadsheet listing all 109 references in which a disagreement occurred. Each author then conducted a full-text review of those articles in which he or she disagreed with the other author. Based on the full-text review, the author could revise his or her decision about the article or stick with his or her original decision. If disagreement persisted, the authors met to discuss and resolve their differences. Authors resolved 76% of their disagreements on the first round of review. In most cases, the reason for the discrepancy was obvious (e.g., misapplication of inclusion/exclusion criteria, simple lapse of attention). Discussion of the remaining 23% of the disagreements focused on how to apply the inclusion/exclusion criteria to articles containing little or no information about data collection, data analysis methods, or both.