Ten years ago, Hoff and Witt (2000)
explored the use and characteristics of qualitative methods in nine health services research and management journals. The publication of their review, which covered the period 1995 to 1997, followed on the heels of two important events: a 1998 all-day forum on qualitative methods at the Annual Meeting of the Association for Health Services Research (now AcademyHealth) and the 1999 publication of articles from that forum in Health Services Research
. Many hoped that these events would promote greater understanding, acceptance, and use of qualitative methods in a field dominated by quantitative approaches to empirical inquiry. What has happened since then? What has changed and what has not?
Our 10-year review of the same nine health services research and management journals presents a mixed picture of progress. On the one hand, these journals have published more than 300 research articles that used qualitative methods. These articles have enriched our knowledge of a wide range of health policy and management issues. More qualitative research articles published in these journals used mixed methods of data collection and analysis, and more reported funding support than Hoff and Witt (2000)
found in their earlier review. Finally, the qualitative research articles published in these journals have been cited about as frequently as quantitative research articles, suggesting that studies using qualitative methods make a comparable contribution to the field’s knowledge base.
On the other hand, qualitative research articles constituted a small proportion of the empirical research literature published in these nine journals. In fact, the proportion declined from 14% during the 1995–1997 period to less than 10% in the 1998–2008 period. Although the qualitative research articles appearing in these journals exhibited substantial topical diversity, they exhibited limited methodological diversity. Case study research designs predominated while other research designs, such as ethnography, grounded theory, phenomenology, and biography, rarely appeared. Finally, many of the qualitative research articles published in these journals omitted basic information about study methods, especially information about analysis procedures. Although methodological reporting has improved since Hoff and Witt’s (2000)
review, a high percentage of qualitative research articles did not meet widely held expectations about what research reports should include.
Why has the publication of qualitative research articles in nine core health services and management journals not kept pace with the publication of quantitative research articles? There are a number of possible and related explanations, all of which merit further investigation. The findings might reflect the limited production capacity of a small, but slowly growing, community of health services and management researchers trained and interested in using qualitative methods. Likewise, they might reflect production constraints arising from the challenges that researchers report in obtaining funding, especially federal funding, for studies that rely heavily or exclusively on qualitative methods. Alternatively, they might reflect the growing number of choices that health services and management researchers have for publishing their qualitative research. Many clinical, nursing, management, and social science journals will occasionally publish health services and management research articles, including ones that use qualitative methods. New topically focused journals such as Journal of Quality and Safety, Journal of Patient Safety, and Implementation Science represent additional outlets for both quantitative and qualitative health services and management research. It is not clear what effect the proliferation of journal options has had on the distribution of published qualitative research articles, or whether the publication of such articles “outside” the nine core journals included in this review gives cause for concern. It is less of an issue if authors choose to publish their qualitative work in these alternative journals because those journals better fit the research topic or reach a different target audience. It is more of an issue if authors seek alternative outlets because they perceive health services and management journals as unreceptive to qualitative research.
Several steps could be taken to increase the use of qualitative methods in health services and management research and, consequently, the representation of qualitative research articles in health services and management research journals. First, more doctoral programs could require students to take at least one course in qualitative research methods. Four years ago, our institution developed such a course and required doctoral students to take it. Since then, we have seen more students conduct dissertations that involve qualitative or mixed methods. Even students who do not plan to use qualitative methods report a greater appreciation of qualitative approaches to empirical inquiry, as well as greater comfort and skill in appraising the quality and contribution of qualitative research articles. In the future, these students will become reviewers, study section members, and journal editors. Although some might balk at such a requirement, we believe that qualitative research methods will not become fully accepted or commonly used if training in such methods remains an optional component of doctoral education.
Of course, what appears in the published literature is not simply a function of researcher training and interest. It is also a reflection of the research priorities and review processes of funding sources. Although federal agencies played a limited role in funding the qualitative research articles identified in our review, federal support for studies using qualitative or mixed methods could grow, as these methods are well suited to addressing current concerns such as patient safety, electronic medical records, patient-centered medical homes, care coordination, and innovation implementation. To ensure fair reviews, scientific administrators must make certain that study sections include members with expertise in qualitative methods. Members who do not have such expertise must also be open to qualitative approaches to empirical inquiry.
Finally, editors interested in seeing more qualitative research articles published in their journals might examine whether the journal’s formal policies or informal practices create impediments to the receipt, review, and acceptance of manuscripts using qualitative methods. Authors’ perceptions of a journal’s receptivity to qualitative research depend in large part on the journal’s track record in publishing qualitative research articles. The remarkable increase in the percentage of qualitative research articles appearing in Health Services Research over the past 10 years can likely be attributed in part to Stephen Shortell’s editorial leadership, the journal’s publication of the 1999 special supplement on qualitative methods, and the sustained commitment of subsequent editors.
Increasing the methodological diversity of qualitative research articles that appear in health services and management research journals might prove more challenging. Disciplines are defined in part by their methods of inquiry. Although health services and management research is an interdisciplinary field, it remains to be seen whether ethnographic, phenomenological, and narrative approaches will successfully migrate across the field’s permeable boundaries from their home disciplines in anthropology, psychology, and the humanities. As illustrated by the handful of examples identified in this review, these qualitative methods can be useful for investigating health policy and management issues. A well-conducted phenomenological study of the “lived experience” of primary care physicians, for instance, could inform current debates about health care workforce issues, pay for performance, electronic medical records, and medical homes (Hoff, 2010
). Encouraging the submission and publication of health services and management research using these approaches might require extraordinary efforts to send a strong signal of welcome and acceptance. Such efforts might include editorials commenting on the value and need for research using these approaches, special supplements or calls for research papers on topics that lend themselves to investigation using these approaches, and articles that describe the epistemological assumptions, theoretical perspectives, and methodological procedures of these approaches.
In their earlier review, Hoff and Witt (2000)
found that fewer than one in five qualitative research articles provided at least one page of detail about sampling, data collection, and analysis procedures. The figure, according to our review, more than doubled to 40% in the past 10 years. Relaxing the one-page requirement raised the figure to 57%. While encouraging, these findings still suggest a troubling lack of adequate methods description in 40% to 60% of qualitative research articles. An article with only a single sentence each for sampling, data collection, and analysis procedures would have met the highly relaxed definition of “extensive” reporting that we added in this review. Yet almost half of the qualitative research articles published in the nine health services and management journals included in this review did not even meet this minimum criterion.
Some journals place less emphasis than others do on methods and, therefore, require less reporting of methodological detail. The methods sections of articles published in Health Affairs
, for example, are often very brief, presumably because the journal’s audience has a limited interest in how a particular research study is conducted. However, we saw no clear pattern in the amount of methodological detail found in qualitative research articles published in different journals. This suggests that the problem of inadequate methodological reporting is not limited to journals that publish shorter articles or that minimize methodological detail because of perceived audience interest. For example, more than half of the qualitative research articles published in Journal of Health Politics, Policy, and Law
did not meet Hoff and Witt’s (2000)
definition of “extensive” reporting or our more relaxed definition. The average length of a qualitative research article in this journal was 30 pages, nearly three times the average length of articles in Health Affairs
. The same is true for Milbank Quarterly.
It is unclear whether authors are simply not reporting such information at all, or whether they report it at the review stage but cut it at the publication stage because of page constraints or editorial pressure. Regardless, it seems reasonable to expect published qualitative research articles to describe, even if only briefly, the study’s sampling, data collection, and data analysis procedures. This expectation should not be controversial. It accords with the general standards for research reporting found in methodological textbooks of both the quantitative and qualitative variety, as well as the professional norms transmitted through education, training, and peer review. Journal conventions can, and should, dictate the amount of methodological detail provided. But any published research article, whether qualitative or quantitative, should give the reader at least some indication of what the sample includes, how the data were obtained, and how the data were analyzed. For qualitative research articles, it is especially important that published articles contain at least some details about the study’s analysis procedures so that readers can understand how the authors transformed the mountain of transcribed words and documents that they collected into the succinctly stated results and interpretations that appear in the article. Likewise, it is important for studies with a comparative intent (e.g., multiple case studies) to describe, even if only briefly, the procedures used to make comparisons and, equally important, the rationale for making such comparisons given the study’s research question. Expecting published qualitative research articles to provide a level of methodological detail equivalent to what journals expect of published quantitative research articles would enhance the standing of qualitative methods as legitimate tools for scientific inquiry in the field of health services and management research.
Two limitations merit discussion. First, we focused our review on the nine health services and management journals that Hoff and Witt (2000)
included in their earlier review. In making this choice, we sought to maximize the continuity and comparability of the two reviews. Health services and management researchers can, and sometimes do, publish qualitative research articles in clinical, disciplinary, or topically focused journals. This review does not, and could not, provide a complete account of all qualitative research articles published by health services and management researchers over the past 10 years. Apart from the sheer volume of work that such an account would entail, the inclusion of clinical, disciplinary, or topically focused journals in a review would immediately raise two questions for which there are no easily acceptable answers: (a) which of the dozens of journals that might contain health services and management research articles should be included in the review and (b) what would count as a “health services and management research” article?
In the absence of any definitive or defensible answers to these questions, we can only speculate about how our findings might have differed had we expanded the scope of our review. A broader review would certainly identify additional qualitative health services and management research articles. However, given the quantitative orientation of the field, a broader review would probably turn up even more quantitative health services and management research articles. Thus, the proportion of health services and management research articles using qualitative, as opposed to quantitative, methods would likely decline. Whether the characteristics of qualitative research articles would differ if the scope of the review were broadened is debatable, since it would depend on which journals were included and how articles were identified as health services and management research articles, or not. For example, one might see more qualitative research articles focused on theory building if one included disciplinary journals like Administrative Science Quarterly. Likewise, one might see more qualitative research articles using grounded theory approaches if one included the journal Qualitative Health Research, assuming one could distinguish the health services and management research articles from the nursing, health psychology, and social work articles that also appear in that journal.
In their review, Hoff and Witt (2000)
characterized the nine journals that they examined as “core” journals within the field of health services and management research. We concur. These are the journals that are affiliated with the field’s professional associations, the journals that people think of when they say “health services and management research,” the journals for which researchers in the field serve as reviewers and editors, and the journals that are recognized and valued in tenure and promotion decisions. By focusing on nine “core” journals in the field, our review offers a fairly comprehensive view of the use and characteristics of qualitative methods in health services and management research.
The second limitation has to do with our use of content analysis, which involves an irreducible element of individual and collective judgment. Determining whether an article’s primary purpose is descriptive or explanatory, for example, can be challenging when an article discusses not only what happens but also why and how it happens. Even distinguishing between research and non-research articles can be difficult. Authors writing in policy-oriented journals, for example, do not always indicate whether the tables and graphs included in the article represent original analysis or simply visual displays of the cited data source. In conducting this review, we sought to maximize consistency and minimize error by developing explicit decision rules and code definitions, training research team members in study procedures, performing periodic calibration checks, and testing reliability at each stage of the review. While these steps reduce the potential for article misclassification or coding error, they do not eliminate it entirely.
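The calibration checks and reliability testing described above are commonly quantified with a chance-corrected intercoder agreement statistic such as Cohen’s kappa. The sketch below is purely illustrative (it is not the authors’ actual procedure, and the coder labels are hypothetical); it shows how kappa could be computed for two reviewers independently classifying the same set of articles.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of items on which the two coders agree.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance if each coder assigned codes
    # independently at their own marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codes from two reviewers for eight articles.
reviewer_1 = ["qual", "quant", "qual", "quant",
              "non-research", "qual", "quant", "qual"]
reviewer_2 = ["qual", "quant", "quant", "quant",
              "non-research", "qual", "quant", "qual"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 3))  # → 0.795
```

Raw percent agreement (7 of 8 articles here) overstates reliability when some codes are far more common than others; kappa discounts the agreement the coders would reach by chance alone, which is why review teams typically report it alongside, or instead of, simple agreement rates.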