Health policy makers now have access to a greater number and variety of systematic reviews to inform different stages of the policy making process, including reviews of qualitative research. Systematic reviews of observational studies help identify the magnitude of a particular health problem, reviews of randomized controlled trials provide reliable information on the benefits or harms of policy options being considered, and reviews of economic evaluations help determine the cost-effectiveness of different options [1]. Another form of evidence summary is the systematic review of qualitative research (see e.g. refs [2]), which can be central to exploring patient or user views or experiences of a policy option; understanding health behaviours in relation to an illness, patient decision making, and uptake of an intervention; and identifying barriers and facilitators to implementing specific interventions. Unlike the more established methods for reviewing effectiveness studies, methods for systematically reviewing qualitative research are still in development.
An important question in qualitative research synthesis is what type of study to include. Review authors might include studies where qualitative research is the main focus or where qualitative methods are combined with quantitative in the same study (mixed methods). Researchers tend to use mixed methods for pragmatic reasons, since the design allows for use of a range of methods and can yield more comprehensive findings [5
]. Of course, researchers also use mixed methods designs for strategic reasons. Key funders of health and development research recognise the value of combining qualitative and quantitative research, increasingly call for the inclusion of qualitative methods, and encourage greater inter-disciplinary and social science contributions to proposals.
As the number of mixed methods studies increases, their inclusion in systematic reviews becomes more likely. But there are recognised design issues with mixed methods research that pose challenges for including and appraising these studies in systematic reviews. For example, in some cases the authors' intended rationale for using mixed methods may not match how they actually combined methods in practice, and the risk here is that the study produces unnecessary or redundant data that fail to address the research question [7]. There are also inherent problems with reporting the findings of mixed methods research, particularly the role and sequence of different data collection methods and the integration of analysis and findings [6].
The quality and utility of a systematic review depend on the quality of the included studies, and so the methodological challenges in mixed methods research are important for synthesis [9
]. For reviews of effects, there are strict criteria for inclusion, and stringent criteria for assessing risk of bias in included studies [10
]. In reviews that include qualitative and other research designs, decisions on which studies to include, and whether and how to conduct quality assessment are still widely debated [11
]. Recent developments include a scoring system for appraising the methodological quality of mixed methods studies alongside qualitative and quantitative studies included in ‘mixed studies reviews’ [13
] and guidelines for good reporting of mixed methods studies [7
]. Even when time and effort are invested in quality appraisal, systematic review authors rarely comment on the credibility, rigour of conduct, or contribution that the different studies make to the synthesis outcome. Sometimes reviewers indicate that poorer quality studies contribute less to the synthesis, without further elaboration of which types of studies were considered poorer quality and why. Yet scrutiny of the poorer quality studies might help determine which types of studies it is meaningful to summarise together and which could reasonably be excluded or summarised in other sorts of reviews.
Though there are indications that mixed methods studies tend to be poorly reported and to fare poorly in methodological quality appraisal [14], we could not find any systematic analysis comparing the quality of reporting of qualitative-only research with that of qualitative research from mixed methods studies. In this exploratory study we assess how the quality of reporting and rigour of qualitative-only studies compare with those of mixed methods studies. To do this we examined studies included in two completed systematic reviews: one focusing on factors influencing adherence to tuberculosis treatment [15] and the other on uptake of malaria preventive interventions in pregnancy [16]. Secondary objectives were to determine any differences in the quality of reporting between the malaria and TB focused studies, and to explore any differences in the quality of reporting in qualitative-only and mixed methods studies over time.