Trop Med Int Health. Author manuscript; available in PMC 2012 April 9.
PMCID: PMC3321435
NIHMSID: NIHMS340823

How to Improve the Validity of Sexual Behaviour Reporting: Systematic Review of Questionnaire Delivery Modes in Developing Countries

Summary

Objectives

To systematically review comparative research from developing countries on the effects of questionnaire delivery mode.

Methods

We searched Medline, Embase and PsycINFO, and the ISSTDR conference proceedings. Randomized controlled trials and quasi-experimental studies were included if they compared two or more questionnaire delivery modes, were conducted in a developing country, reported on sexual behaviours, and were published after 1980.

Results

28 articles reporting on 26 studies met the inclusion criteria. Heterogeneity of reported outcomes between studies made it inappropriate to combine them. 18 studies compared audio computer-assisted survey instruments (ACASI) or derivatives (PDA or CAPI) against other self-administered questionnaires, face-to-face interviews, or the random response technique. Despite wide variation in geography and populations sampled, there was strong evidence that computer-assisted interviewing lowered item non-response rates and raised rates of reporting of sensitive behaviours. ACASI also improved data entry quality. A wide range of sexual behaviours was reported, including vaginal, oral, anal and/or forced sex, age of sexual debut, and condom use at first and/or last sex. Validation of self-reports using biomarkers was rare.

Conclusions

These data reaffirm that questionnaire delivery modes do affect self-reported sexual behaviours and that use of ACASI can significantly reduce reporting bias. Its acceptability and feasibility in developing country settings should encourage researchers to consider its use when conducting sexual health research. Triangulation of self-reported data using biomarkers is recommended. Standardising sexual behaviour measures would allow for meta-analysis.

Keywords: developing country, systematic review, validity, method comparison

Background

After almost three decades, the burden of the HIV epidemic remains squarely in developing countries where the disease is primarily spread sexually (UNAIDS 2007; Wilson & Halperin 2008). Sexual behaviour is complex and influenced by many factors such as socio-economic, cultural, biological, and psychological conditions, many of which cannot be easily externally validated or objectively measured.

Another measurement challenge surrounds the private aspect of sexual behaviours. As sexual behaviour cannot ethically be observed directly, we must rely on individuals’ self-reports of their sexual experiences. Although cultural norms differ around the world, most sexual behaviours are socially censured, and strong pressure to conform to societal norms causes self-reports to be fraught with bias, particularly around recall and social desirability.

Over the past two decades a diverse range of studies (on condom use, discordant couples, and condom negotiation) has explored the relationship between risky behaviours and HIV. Researchers have noted discrepancies between self-reported measures and other outcomes, raising questions about their validity and reliability. The ensuing under-reporting of sexual behaviours makes it difficult to interpret trends in HIV prevalence or incidence, to design appropriate behavioural interventions and to interpret their effects. This calls for greater attention to improving measurement techniques. The scope for change ranges from questionnaire wording (Wellings et al. 2001; Elam & Fenton 2003; Mavhu et al. 2008) and ensuring privacy and confidentiality (Catania et al. 1990; Fenton et al. 2001; Tourangeau & Yan 2007), to improving questionnaire delivery modes (Mensch et al. 2003, 2008; Tourangeau & Yan 2007).

Traditionally the field has relied on interviewer-administered questionnaires to collect self-reported sexual behaviour information. Growing concern for greater validity prompted researchers to explore other questionnaire delivery modes. This search has been most radically transformed by the recent advent of computer-assisted questionnaire design (Turner et al. 1998; Jones 2003).

In the early 1990s, Catania and colleagues (1990) reviewed the methodological challenges of assessing sexual behaviour, including the effects of questionnaire delivery mode on measurement error. They called for more rigorous research into the assessment of sexual behaviours and emphasized that the foundation lay in improving the reliability and validity of their measurement. All 20 studies included in their review of response bias in sexual behaviour research were conducted in North America. At the end of the 1990s, Weinhardt and colleagues (1998) re-examined the literature published since Catania's review and assessed the evidence relating questionnaire delivery mode to the reporting of sexual behaviours. Despite developments in the field, such as the use of computer technology, Weinhardt et al. concluded that comparative research on questionnaire delivery modes remained limited. Moreover, of the 30 studies included in their review, only three were from developing countries, and only one compared questionnaire delivery modes (Konings et al. 1995).

Two recent developments have provided the impetus for this systematic review. First, there has been a rise in the number of comparative studies on questionnaire delivery modes in developing country settings. While the bulk (14/26) of the studies reviewed here took place between 2000 and 2004, almost two-thirds (18/26) were published between 2005 and 2008. Secondly, there has been an increase, albeit still tentative, in the use of computer-administered modes in sexual behaviour research in developing country settings. Within the comparative research conducted in developing countries, more than half of the studies include computer-administered modes. These two developments prompted this systematic review, which examines the body of evidence and summarizes the findings on different questionnaire delivery modes for sexual behaviour measurement.

Methods

Inclusion criteria

Studies were selected for review if they met the following criteria: published or cited in peer-reviewed journals; compared two or more questionnaire delivery modes; conducted in a developing country; included data on sexual behaviour (vaginal, anal, or oral sex, condom use, risky sexual behaviours, contraceptive use); and published between 1 January 1980 and 31 December 2008. In addition, abstracts from the conference proceedings of the International Society for Sexually Transmitted Diseases Research (ISSTDR) were examined from 2001 onwards (2001, 2003, 2005, and 2007). In this review, countries listed as having ‘emerging’ or ‘developing economies’ by the International Monetary Fund World Economic Outlook report were considered ‘developing’ countries (IMF 2009). Studies were included if they used an experimental (RCT), quasi-experimental (i.e. non-randomised comparison group) or test-retest design. Studies were excluded if they compared a questionnaire delivery mode only against a biological marker, examined the impact of interviewer gender on questionnaire responses, or compared data reported by married couples rather than between questionnaire methods. Unpublished studies identified from the references of published articles and studies published in non-English language journals were considered for inclusion.

Search strategy

Three databases were searched: Medline, Embase, and PsycINFO, using key MeSH terms and text words relevant to each database (Table 1). Duplicates were manually discarded.

Table 1
Medline search strategy

Analysis

Titles and abstracts were used to screen for relevance to the review. If the questionnaire delivery method was not mentioned in the title or abstract, it was assumed there was no mode comparison. Where the title or abstract was not sufficient to make a determination, the article was downloaded and read. Reference lists of all included articles were examined. 3% of articles were re-examined blind by one co-investigator (FC) to check that the inclusion criteria were being met. Articles where inclusion criteria were unclear (n=119) and all articles included in the review were jointly discussed by two of the investigators (FC & LL).

Results

Of the 6824 references reviewed (Medline=3261; Embase=1761; PsycINFO=1800; ISSTDR=2), 28 articles reporting on 26 studies met the inclusion criteria (Figure 1). Articles reporting on results from more than one study were analysed separately. Where studies were reported in more than one article, the results were combined for analysis. Studies ranged geographically (from China to Hanoi to rural Malawi) and in their selection of respondents (from female sex workers to South African students). Self-administered questionnaires (SAQ) were the comparison in 5 studies (Table 2), whereas interviewer-administered questionnaires were the comparison in 16 studies (Table 3). Seven studies included SAQ, an interviewer-administered mode, and at least one other mode for comparison (Table 4).

Figure 1
Diagram of Systematic Review
Table 2
Method comparison using Self-Administered Questionnaire (SAQ)
Table 3
Method comparison using Face-to-Face Interviewing (FTFI)
Table 4
Multiple Comparisons (SAQ, FTFI, and others)

This analysis makes the general assumption that an increase in reporting of a socially censured behaviour indicates more accurate reporting (reducing social desirability bias and increasing validity) (Konings et al. 1995; Durant & Carey 2000; Kreuter et al. 2008; Brener et al. 2003).

Comparison with SAQ

All 5 studies that compared another method against SAQ used an audio computer-assisted survey instrument (ACASI) or a derivative thereof (e.g. personal digital assistant (PDA) or phone-ACASI) (Table 2). Two were randomized controlled trials (Rumakom et al. 2005; Seebregts et al. 2008), one was quasi-experimental (Fielding et al. 2006), and two focused on test-retest (Jaspan et al. 2007; Bernabe-Ortiz et al. 2008). Three studies included school attenders (Rumakom et al. 2005; Jaspan et al. 2007; Seebregts et al. 2008).

Comparison of response rates

Where non-response rates were reported, SAQ performed poorly against ACASI and its derivatives. In South Africa, SAQ respondents were seven times more likely to have missing items than respondents using PDA (Jaspan et al. 2007). In China, nonresponse rates to items were 8% to 14% with SAQ and 0.4-3.0% with phone-ACASI (Fielding et al. 2006).

Comparisons of reporting of sexual behaviours

In general, SAQ respondents reported lower levels of risk exposure than respondents using computer self-administered modes. In China, for example, bisexual sex (SAQ 0.6% versus ACASI 2.6%; p<0.001), sex in the past year (SAQ 50.3% versus ACASI 56.5%; p=0.006) and belief that their partner had sex with others (SAQ 4.3% versus ACASI 7.0%; p=0.005) were all reported more often with ACASI (Fielding et al. 2006). The exception was among South African youth, where there were no differences between SAQ and PDA except for ‘ever sex’, which was more often reported in SAQ (36%) than PDA (26%; p=0.003) (Jaspan et al. 2007).

Reliability between modes

In three studies respondents used SAQ followed by PDA (Jaspan et al. 2007; Seebregts et al. 2008; Bernabe-Ortiz et al. 2008). Response agreement between the two modes was high (85%), with Kappa values of 0.5 or more.
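
To make these agreement statistics concrete, Cohen's kappa adjusts raw percentage agreement for the agreement expected by chance; a minimal sketch of the standard formula (not specific to any of the studies reviewed here) is:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of identical answers across the two modes (roughly 0.85 here) and \(p_e\) is the agreement expected by chance given each mode's marginal reporting rates; a kappa of 0.5 or more therefore indicates agreement well beyond what chance alone would produce.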

Acceptability of modes

Respondents were more likely to report that ACASI and its derivatives felt more confidential when answering sexual behaviour questions. For example, in the Thai study, 7.8% of college females who used SAQ reported feeling embarrassed answering the sexual behaviour questions, compared with fewer than 1.5% of female ACASI respondents (Rumakom et al. 2005).

Comparisons with Face-to-Face Interviewing (FTFI)

Seven of the 16 studies compared face-to-face interviewing against ACASI (Table 3). Five studies explored an adaptation of face-to-face interviewing which allowed respondents to self-report sensitive questions on a ballot card (Phillips et al. 2007; Gregson et al. 2002, 2004; Hanck et al. 2008; Jaya et al. 2008). One study from India compared an additional mode termed ‘interactive interviewing’ which included a tape-recorded drama and dolls to desensitise respondents around sensitive issues (Jaya et al. 2008). Four other studies compared face-to-face interviewing against in-depth interviewing (Konings et al. 1995), coital diaries (Ramjee et al. 1999; Allen et al. 2007) or used a derivative of SAQ where the questions were read aloud in a group setting (Plummer et al. 2004a,b).

As shown in Table 3, eight studies were randomized controlled trials (Simoes et al. 2006; Allen et al. 2007; Caceres et al. 2007; Minnis et al. 2007; Phillips et al. 2007; Mensch et al. 2008; Jaya et al. 2008; Hewett et al. 2008), five were quasi-experimental studies (Konings et al. 1995; Gregson et al. 2002, 2004; Bernabe-Ortiz et al. 2008; Hanck et al. 2008), and three focused solely on test-retest (Ramjee et al. 1999; Plummer et al. 2004a,b).

Comparison of response rates

As interviewer presence makes it more difficult for respondents to ignore a question, few studies (3/15) reported item non-response rates. In Peru, where data were entered manually (FTFI) or directly into a PDA, missing responses were significantly more common using FTFI (p<0.001) (Bernabe-Ortiz et al. 2008).

Comparisons of reporting of sexual behaviours

Overall, respondents using face-to-face interviewing reported fewer sensitive behaviours than respondents using other questionnaire delivery modes. Six of the 7 studies that compared face-to-face interviewing with ACASI (RCT=6) showed increased reporting of various sexual behaviours in ACASI. For example, urban Brazilian women were more likely to report STI risk behaviours in ACASI (e.g. anal sex in the last six months: 33% versus 24%, p<0.01; no condom use during vaginal sex in the last month: 59% versus 51%, p<0.01) (Hewett et al. 2008).

In a study in Zimbabwe on hormonal contraception where eligibility of respondents was conditional on specific behaviours, ACASI users were more likely to report an undesirable behaviour (Minnis et al. 2007). For example, not getting pregnant was a study condition and more pregnancies were reported in ACASI interviews than in face-to-face ones (OR=1.5; 95% CI 1.1-1.9). ACASI users were also more likely to report multiple partners than FTFI users (OR=5.7; 95% CI 2.1-15.2).

Data from India produced less consistent results where face-to-face interviewing was compared against both ACASI and interactive interviewing (Jaya et al. 2008). Among females, more sexual behaviours were reported using interactive interviewing than face-to-face interviewing, and consistently fewer were reported using ACASI than FTFI. However, males using ACASI were more likely to report having had sex (26.9% versus 21.4%; p=0.03) and having been forcibly touched (26.2% versus 21.4%; p=0.09) than males using FTFI.

In two studies there were no reported differences between ACASI and interviewer data collected using a computer (CAPI) (Jaspan et al. 2007; Bernabe-Ortiz et al. 2008). This was attributed to smaller sample sizes.

When face-to-face interviewing was compared with a non-ACASI mode (ICVI, coital diaries, in-depth interviewing, and assisted self-completion questionnaires; see definitions at the end of Table 3), all 8 studies found lower rates of reporting of sensitive behaviours using face-to-face interviewing. This was true for both socially desirable and socially undesirable behaviours. While coital diaries yielded increased reporting of sexual behaviours (Allen et al. 2007), diary loss or incompletion posed a problem (Ramjee et al. 1999; Sedyaningsih-Mamahit et al. 2003). Four studies examined the informal confidential voting interview (ICVI), in which respondents marked their responses to sensitive questions privately and then posted them into a locked box (Gregson et al. 2002, 2004; Phillips et al. 2007; Hanck et al. 2008). Overall, reporting of sexual behaviours increased among ICVI users (e.g. in Zimbabwe: multiple sex partners, males OR = 1.33, p = 0.028; females OR = 5.21, p < 0.001). Only among transgendered respondents did ICVI increase reporting of some behaviours but not others (Phillips et al. 2007).

Comparison of the same respondents’ answers to different questionnaire delivery modes

In the three studies that compared the same respondents' answers across modes, where discrepancies occurred, behaviours were less likely to have been reported in the face-to-face interview (Konings et al. 1995; Plummer et al. 2004a,b; Mensch et al. 2008). Tanzanian pupils were administered the same questionnaire in two modes (FTFI or ASCQ, in which the questions were read aloud to respondents in a single-gender group setting) five weeks apart (Plummer et al. 2004a,b). 62% of males and 41% of females reported having had sex in both surveys (Plummer et al. 2004a). There was 64.4% agreement in reporting of age at first sex and 47.3% agreement on the number of sexual partners (Plummer et al. 2004b). Respondents were more likely to report condom use, forced sex, and pregnancy in ASCQ. For example, forced sex was reported by only one female in both surveys, but by 12.3% of females in ASCQ and 0.2% in FTFI (Plummer et al. 2004b). In Uganda, 23.3% of respondents reported not engaging in sex when asked in the face-to-face interview but then reported sexual activity in an in-depth interview (Konings et al. 1995).

Comparison of SAQ and FTFI against other modes

There were seven studies that compared both SAQ and FTFI with at least one other mode. Six of these were conducted as randomized controlled trials (Mensch et al. 2003; Hewett et al. 2004; Potdar & Koenig 2005; Le et al. 2006; van Griensven et al. 2006; Langhaug et al. 2007) and one used a quasi-experimental design (Lara et al. 2004) (Table 4).

Comparison of response rates

In both studies that reported on nonresponse rates to items, computer-administered questionnaires had the fewest missing data (van Griensven et al. 2006; Langhaug et al. 2007).

Comparisons of reporting of sexual behaviours

Data on rates of reporting sexual behaviours are less clear. In India, among college males using SAQ, ACASI or FTFI, men responding to ACASI more often reported heterosexual sex (AOR = 1.8; p<0.05), oral sex with a female (AOR = 2.08; p<0.05), homosexual sex (AOR = 8.1; p<0.05) and having experienced coercive sex (AOR = 11.35; p<0.01). When using ACASI rather than FTFI, young men living in slums were more likely to report masturbation (AOR = 22.53; p<0.001) and oral sex with a woman (AOR = 2.4; p<0.01), but less likely to report vaginal sex (AOR = 0.23; p<0.001).

In Viet Nam young people’s attitudes and norms around sexual behaviours and condom use were compared using SAQ, FTFI, and ACASI (Le et al. 2006). Respondents using ACASI were more likely to report liberal attitudes around premarital sex (a socially censured norm) and to report less confidence in their ability to access condoms (a socially condoned behaviour). Female refusals to answer questions about condom use were most frequent among SAQ and FTFI users. ACASI users were also more likely to report sex (e.g. unmarried males’ sex with commercial sex workers OR=2.8, p<0.05; males aged 15-19 having sex OR = 2.79, p<0.05).

Three studies compared four modes (Lara et al. 2004; van Griensven et al. 2006; Langhaug et al. 2007). In the Thai study, data generally showed no difference in reporting of sexual behaviours between SAQ, ACASI, and PASI. However, for self-reports of the most sensitive behaviours, there was a statistically significant difference (p<0.001) between PASI and FTFI (history of oral sex 37.3% versus 13.2%; sex today/yesterday 19.3% versus 6.1%; sold sex 8.2% versus 0.9%; bought sex 8.2% versus 2.5%). In the Zimbabwean study, after adjusting for covariates, Audio-SAQ and ACASI users were twice as likely to report sexual activity as SAQ users (Audio-SAQ AOR = 2.05 [95% CI: 1.2-3.4]; ACASI AOR = 2.0 [95% CI: 1.2-3.2]), with no reporting difference between ICVI and SAQ users (ICVI AOR = 1.0 [95% CI: 0.6-1.8]) (Langhaug et al. 2007). ACASI users reported a younger age at first sex (0.7-1.7 years lower; p<0.045). In a post-survey questionnaire, ACASI users reported improved ability to answer questions honestly (p=0.004) and believed their answers would be kept secret.

ACASI performed poorly only in comparison with the random response technique (RRT). RRT asks a respondent to answer one of two questions selected at random, with the interviewer blinded to which question is being answered. One question is sensitive; the other is not sensitive and has a known probability of a 'yes' answer in the population (e.g. 'Were you born in April?'). Mathematical techniques then allow indirect estimates of the proportion reporting the sensitive behaviour (Lara et al. 2004). In three sub-populations in Mexico (Lara et al. 2004), RRT users reported the highest rates of attempted abortion. Differences were only statistically significant in the hospital survey (RRT = 22%, SAQ = 19%, ACASI = 13%, FTFI = 12%; p = 0.012).
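
To illustrate how such an indirect estimate is obtained, a minimal sketch of the standard randomized response algebra (with hypothetical numbers, not the specific estimator reported by Lara et al. 2004) is:

\[
\lambda = p\,\pi_s + (1 - p)\,q
\qquad\Rightarrow\qquad
\hat{\pi}_s = \frac{\hat{\lambda} - (1 - p)\,q}{p}
\]

where \(p\) is the known probability that a respondent is directed to the sensitive question, \(q\) is the known population probability of a 'yes' to the innocuous question (about 1/12 for an April birthday), \(\pi_s\) is the unknown proportion reporting the sensitive behaviour, and \(\hat{\lambda}\) is the observed overall 'yes' rate. For example, with \(p = 0.7\), \(q = 1/12\) and an observed 'yes' rate of 18%, the estimated prevalence of the sensitive behaviour would be \((0.18 - 0.3 \times 0.083)/0.7 \approx 0.22\), or about 22%.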

Comparison with biological markers

Whilst 5 of the 26 studies used biological markers as external comparators, only 4 examined sexual biomarkers (Mensch et al. 2008; Plummer et al. 2002, 2004; Langhaug et al. 2007; Hewett et al. 2008); one study evaluated drug use (van Griensven et al. 2006). Sexual biomarkers comprised Chlamydia, gonorrhoea, trichomoniasis, HIV, and current pregnancy in females. While the prevalence of biological markers varied among studies involving adolescents, absolute numbers were relatively small. In the Brazilian study among urban women, those who used ACASI showed stronger associations between their reported risk behaviours and STIs (Hewett et al. 2008). STI-positive respondents using FTFI were more likely to under-report sexual behaviours than their STI-negative peers.

Discussion

The results outlined here reaffirm that questionnaire delivery modes do affect self-reported sexual behaviour (Tourangeau & Smith 1996; Turner et al. 1998; des Jarlais et al. 1999; Metzger et al. 2000; Tourangeau & Yan 2007). This systematic review is the first to examine data from developing countries comparing self-reports of sexual behaviours between various questionnaire delivery modes. Despite wide variation in geography and populations sampled, we found strong evidence that computer-assisted interviewing decreases non-response rates to items and increases rates of reporting of sexual behaviours. This was true when ACASI and its derivatives (PASI, phone-ACASI, PDA) were compared with other self-administered and interviewer-administered questionnaire delivery modes. Comparative research using ACASI was predominantly conducted among young people (11 of the 16 studies were exclusively with young people and the other 5 included individuals aged 18 and older), whose reporting of sexual behaviours is very likely to be socially censured (Mensch et al. 2003, 2008; Hewett et al. 2004; Rumakom et al. 2005; Potdar & Koenig 2005; Le et al. 2006; van Griensven et al. 2006; Jaspan et al. 2007; Langhaug et al. 2007; Seebregts et al. 2008). Data entry errors were also reduced when controlled by a computer programme: in studies with no differences in reporting of sexual behaviours, ACASI still improved the quality of data entry (Jaspan et al. 2007; Seebregts et al. 2008; Bernabe-Ortiz et al. 2008).

Validation of self-reports against biomarkers for sexual activity was rarely available. Where sample sizes were sufficient, results suggest more accurate reporting using ACASI than face-to-face interviewing.

These studies also support the acceptability and feasibility of using computers in developing country settings. In those studies where it was examined, ACASI and its derivatives were found to be acceptable and easy to use, and respondents, particularly young women, reported feeling more comfortable using a computer to report sensitive behaviours than they did with other methods (Rumakom et al. 2005; Le et al. 2006; Langhaug et al. 2007). Similar findings have emerged from Zimbabwe and the US (Millstein & Irwin 1983; Kissinger et al. 1999; Metzger et al. 2000; van de Wijgert et al. 2000; Kurth et al. 2004). Acceptability of computer technologies may vary geographically and be related to level of exposure; generally, an increased sense of trust and privacy was expressed by those living in countries where computers are less commonly used. Only one study compared ACASI against ICVI and found less reporting of sensitive behaviours using ICVI (Langhaug et al. 2007). Results of studies using ICVI or interactive interviewing that did not include ACASI emphasize that any effort to improve privacy when answering sensitive questions increases the reporting of sexual behaviours (Gregson et al. 2002; Hanck et al. 2008; Jaya et al. 2008). ACASI has not been compared against interactive interviewing; such research is needed to better establish their comparative strengths and limitations.

Equally encouraging is the research of Mensch et al. (2008) among adolescents in rural Malawi and of Hewett et al. (2004) in Kenya, which suggests that levels of literacy per se may not limit the use of ACASI, as it was used successfully among rural youth. Potdar and Koenig (2005) used ACASI (and not SAQ) among young people living in Indian slums, and despite differences in reporting compared with their college-educated peers, no mention was made of any inability to use the computers. However, evidence from three studies is not sufficient to suggest that ACASI can always be used in settings with low literacy rates; more feasibility research is required.

While coital diaries also demonstrated increased reporting of sensitive behaviours (Ramjee et al. 1999; Allen et al. 2007), they carry requirements which render them less suitable for large surveys: more logistical support, more time for data entry, and specific training to ensure appropriate completion. Coital diaries have a low completion rate (20% in one study). However, including coital diary data from a sub-sample of a large survey population can complement the data collected by other means.

One of the strengths of this review is that a number of the studies reported both socially censured and socially condoned behaviours (Lau et al. 2003; Sedyaningsih-Mamahit et al. 2003; Potdar & Koenig 2005; Le et al. 2006; Simoes et al. 2006; Minnis et al. 2007; Hanck et al. 2008). Conclusions drawn from these studies are strengthened when users of a mode are not only more likely to report socially censured behaviours but also less likely to report socially acceptable behaviours (Catania et al. 1990). Reports from computer self-administered questionnaires followed this pattern. In India, more college men using ACASI reported behaving violently after drinking than those using FTFI (3.0% versus 1.7%) (Potdar & Koenig 2005). Similarly, in a study in Zimbabwe where hormonal contraceptive use was a prerequisite for participation, women were more likely to report not using hormonal contraceptives in ACASI than in FTFI (Minnis et al. 2007). Equally heartening is the growing comparative literature on questionnaire delivery modes in developing countries and the increased interest in ACASI and its derivatives.

There are some limitations to this review. A number of studies did not show statistically significant differences in the reporting of sexual behaviours between questionnaire delivery modes. This is in part attributable to the small sample size of these studies or, when youth were sampled, to the small number who reported sexual behaviours at all. Studies did not report the same sexual behaviour outcomes, nor did they always disaggregate their data by gender or age, which made it difficult to make comparisons across studies. Only four studies included biological markers of sexual behaviour in their analysis. Biological markers offer complementary evidence that can be used to explore the direction of effects. As mentioned above, for most sexual behaviours that are socially censured, particularly for young people, it is assumed that an increase in reporting indicates improved validity, but the ability to triangulate against objective data improves our understanding of the differences in self-reported sexual behaviours between questionnaire delivery modes. Researchers should incorporate biological markers (or other externally valid outcomes) into evaluations whenever possible to broaden the evidence within comparative studies. We did not examine the effect of interviewer age or gender; research on this has been extensive but inconclusive (Becker & Sosa 1992; Blanc & Croft 1992; Catania et al. 1996; Elam & Fenton 2003; Wellings et al. 1990).

In 2003, a technical meeting on “Measurement of Trends in Sexual Behaviour” called for more rigorous comparative studies before anything more definitive could be concluded (Cleland et al. 2004). Since then, there has been a noteworthy increase in the number of published articles in peer-reviewed journals comparing questionnaire delivery modes. Most articles considered in this review, which focussed exclusively on research performed in developing country settings, were published after 2003 (n=21/28). Their data strongly support the use of computer-assisted methods.

This is important because the principal data collection tool for sexual behaviours in developing countries remains the interviewer-administered questionnaire. While interviewer-administered questionnaires remain an important tool for collecting survey data, this review provides good evidence that self-administered options, especially those using computers, enhance data quality, particularly for socially sensitive data.

Acknowledgements

We express our profound gratitude to Angela Young, the wonderful librarian who assisted in structuring this systematic review.

References

  • Allen CF, Lees SS, Desmond NA, et al. Validity of coital diaries in a feasibility study for the Microbicides Development Programme trial among women at high risk of HIV/AIDS in Mwanza, Tanzania. Sex Transm Infect. 2007;83:490–6. [PMC free article] [PubMed]
  • Becker S, Sosa D. An experiment using a month-by-month calendar in a family planning survey in Cosa Rica. Stud Fam Plann. 1992;23:386–91. [PubMed]
  • Bernabe-Ortiz A, Curioso WH, Gonzales MA, et al. Handheld computers for self-administered sensitive data collection: a comparative study in Peru. BMC Medical Informatics & Decision Making. 2008;8:11. [PMC free article] [PubMed]
  • Blanc AK, Croft TN. The effect of the sex of the interviewer on responses in fertility surveys: the case of Ghana. Annual Meeting of the Population Association of America; Denver. 1992.
  • Brener ND, Billy J, Grady WR. Assessment of factors affecting the validity of self-reported health-risk behaviour among adolescents: evidence from the scientific literature. J Adolesc Health. 2003;33:436–57. [PubMed]
  • Caceres CF, Celentano DD, Coates TJ, et al. The feasibility of audio computer-assisted self-interviewing in international settings. AIDS. 2007;21(Suppl. 2):S49–S58. [PubMed]
  • Catania JA, Gibson DR, Chitwood DD, Coates TJ. Methodological problems in AIDS behavioral research: influences on measurement error and participation bias in studies of sexual behavior. Psychol Bull. 1990;108:339–62. [PubMed]
  • Catania JA, Binson D, Canchola J, Pollack LM, Hauck W. Effects of interviewer gender, interviewer choice, and item wording on responses to questions concerning sexual behavior. Public Opinion Quarterly. 1996;60:345–75.
  • Cleland J, Boerma JT, Carael M, Weir SS. Monitoring sexual behaviour in general populations: a synthesis of lessons of the past decade. Sex Transm Infect. 2004;80(Suppl. 2) [PMC free article] [PubMed]
  • des Jarlais DC, Paone D, Milliken J, et al. Audio-computer interviewing to measure risk behaviour for HIV among injecting drug users: A quasi-randomised trial. Lancet. 1999;353:1657–62. [PubMed]
  • Durant LE, Carey MP. Self-administered questionnaires versus face-to-face interviews in assessing sexual behavior in young women. Arch Sex Behav. 2000;29:309–22. [PubMed]
  • Elam G, Fenton KA. Researching sensitive issues and ethnicity: lessons from sexual health. Ethn Health. 2003;8:15–27. [PubMed]
  • Fenton KA, Johnson AM, McManus S, Erens B. Measuring sexual behaviour: methodological challenges in survey research. Sex Transm Infect. 2001;77:84–92. [PMC free article] [PubMed]
  • Fielding R, Lam TH, Hedley A. Risk-behavior reporting by blood donors with an automated telephone system. Transfusion (Paris) 2006:289–97. Vol? [PubMed]
  • Gregson SA, Zhuwau T, Ndlovu J, Nyamukapa CA. Methods to reduce social desirability bias in sex surveys in low-development settings: Experience in Zimbabwe. Sex Transm Dis. 2002;29:568–75. [PubMed]
  • Gregson SA, Mushati P, White PJ, Mlilo M, Mundandi C, Nyamukapa C. Informal confidential voting interview methods and temporal changes in reported sexual risk behaviour for HIV transmission in sub-Saharan Africa. Sex Transm Infect. 2004;80(Suppl. 2):ii36–ii42. [PubMed]
  • UNAIDS. AIDS epidemic update: December 2007. 2008. UNAIDS/07.27E/JC1322E.
  • Hanck SE, Blankenship KM, Irwin KS, West BS, Kershaw T. Assessment of self-reported sexual behavior and condom use among female sex workers in India using a polling box approach: a preliminary report. Sex Transm Dis. 2008;35:489–94. [PubMed]
  • Hewett PC, Mensch BS, Erulkar AS. Consistency in the reporting of sexual behaviour by adolescent girls in Kenya: A comparison of interviewing methods. Sex Transm Infect. 2004;80(Suppl. 2):ii43–ii48. [PMC free article] [PubMed]
  • Hewett PC, Mensch BS, Ribeiro MCSD, et al. Using sexually transmitted infection biomarkers to validate reporting of sexual behavior within a randomized, experimental evaluation of interviewing methods. Am J Epidemiol. 2008;168:202–11. [PMC free article] [PubMed]
  • International Monetary Fund IMF Emerging and Developing Economies List: World Economic Outlook Database. 2009 www.imf.org/external/pubs/ft/weo/2009/01/wepdata/groupshtm#oem.
  • Jaspan HB, Flisher AJ, Myer L, et al. Brief report: Methods for collecting sexual behaviour information from South African adolescents--a comparison of paper versus personal digital assistant questionnaires. J Adolesc. 2007;30:353–9. [PubMed]
  • Jaya J, Hindin MJ, Ahmed S. Differences in young people’s reports of sexual behaviors according to interview methodology: a randomized trial in India. Am J Public Health. 2008;98:169–74. [PubMed]
  • Jones R. Survey data collection using Audio Computer Assisted Self-Interview. West J Nurs Res. 2003;25:349–58. [PubMed]
  • Kissinger P, Rice J, Farley T, et al. Application of computer-assisted interviews to sexual behavior research. Am J Epidemiol. 1999;149:950–4. [PubMed]
  • Konings E, Bantebya G, Carael M, Bagenda D, Mertens T. Validating population surveys for the measurement of HIV/STD prevention indicators. AIDS. 1995;9:375–82. [PubMed]
  • Kreuter F, Presser S, Tourangeau R. Social desirability bias in CATI, IVR, and web surveys: the effects of mode and question sensitivity. Public Opin Q. 2008;72:847–65.
  • Kurth AE, Martin DP, Golden MR, et al. A comparison between audio computer-assisted self-interviews and clinician interviews for obtaining the sexual history. Sex Transm Dis. 2004;31:719–26. [PubMed]
  • Langhaug LF, Cheung YB, Pascoe SJS, Mavhu W, Chirawu P, Cowan FM. Comparing four questionnaire delivery methods for collection of self-reported sexual behaviour data in rural Zimbabwean youth. 17th ISSTDR. Seattle: 2007.
  • Lara D, Strickler J, Diaz C Olavarrieta, Ellerston C. Measuring induced abortion in Mexico: A comparison of four methodologies. Sociological Methods & Research. 2004;21:529–58.
  • Lau JTF, Tsui HY, Wang QS. Effects of two telephone survey methods on the level of reported risk behaviours. Sex Transm Infect. 2003;79:325–31. [PMC free article] [PubMed]
  • Le LC, Blum RW, Magnani R, Hewett PC, Do HM. A pilot of audio computer-assisted self-interview for youth reproductive health research in Vietnam. J Adolesc Health. 2006;38:740–7. [PubMed]
  • Mavhu W, Langhaug LF, Manyonga B, Power R, Cowan FM. What is ‘sex’ exactly? Using cognitive interviewing to improve validity of sexual behaviour reporting among young people in rural Zimbabwe. Culture, Health, and Sexuality. 2008;10:563–72. [PubMed]
  • Mensch BS, Hewett PC, Erulkar AS. The reporting of sensitive behavior by adolescents: a methodological experiment in Kenya. Demography. 2003;40:247–68. [PubMed]
  • Mensch BS, Hewett PC, Gregory R, Helleringer S. Sexual behavior and STI/HIV status among adolescents in rural Malawi: an evaluation of the effect of interview mode on reporting. Stud Fam Plann. 2008;39:321–34. [PMC free article] [PubMed]
  • Metzger DS, Koblin B, Turner CF, et al. Randomized controlled trial of audio computer-assisted self-interviewing: Utility and acceptability in longitudinal studies. Am J Epidemiol. 2000;152:99–106. [PubMed]
  • Millstein SG, Irwin CE. Acceptability of computer-acquired sexual histories in adolescent girls. J Pediatr. 1983;103:815–9. [PubMed]
  • Minnis AM, Muchini A, Shiboski S, et al. Audio computer-assisted self-interviewing in reproductive health research: reliability assessment among women in Harare, Zimbabwe. Contraception. 2007;75:59–65. [PubMed]
  • Phillips AE, Lowndes CM, Boily MC, et al. Comparison of informal confidential voting and face-to-face interviewing in collecting sexual research data from men who have sex with men (MSM) and transgenders (HIJRA) in Bangalore. 17th ISSTDR. Seattle: 2007. [PubMed]
  • Plummer ML, Ross DA, Wight D, et al. “A bit more truthful”: the validity of adolescent sexual behaviour data collected in rural northern Tanzania using five methods. Sex Transm Infect. 2004a;80(Suppl. 2):ii49–ii56. [PMC free article] [PubMed]
  • Plummer ML, Wight D, Ross DA, et al. Asking semi-literate adolescents about sexual behaviour: the validity of assisted self-completion questionnaire (ASCQ) data in rural Tanzania. Tropical Medicine & International Health. 2004b;9:737–54. [PubMed]
  • Wilson D, Halperin DT. “Know your epidemic, know your response”: a useful approach, if we get it right. Lancet. 2008;372:423–6. [PubMed]
  • Potdar R, Koenig MA. Does Audio-CASI improve reports of risky behavior? Evidence from a randomized field trial among young urban men in India. Stud Fam Plann. 2005;36:107–16. [PubMed]
  • Ramjee G, Weber AE, Morar NS. Recording sexual behavior: Comparison of recall questionnaires with a coital diary. Sex Transm Dis. 1999;26:374–80. [PubMed]
  • Rumakom P, Guest P, Chinvarasopak W, Utarmat W, Sontanakanit J. Obtaining accurate responses to sensitive questions among Thai students: a comparison of two data collection techniques. In: Jejeebhoy S, Shah I, Thapa S, editors. Sex without consent. Zed Books; London: 2005. p?-p?
  • Sedyaningsih-Mamahit E, Gortmaker S. Reproducibility and validity of self-reported condom use in Jakarta. Southeast Asian Journal of Tropical Medicine & Public Health. 2003;34:136–46. [PubMed]
  • Seebregts CJ, Zwarenstein M, Mathews C, et al. Handheld computers for survey and trial data collection in resource-poor settings: Development and evaluation of PDACT, a Palm Pilot interviewing system. International Journal of Medical Informatics. 2008 Vol? p?-p? [PubMed]
  • Simoes AA, Bastos FI, Moreira RI, Lynch KG, Metzger DS. A randomized trial of audio computer and in-person interview to assess HIV risk among drug and alcohol users in Rio De Janeiro, Brazil. J Subst Abuse Treat. 2006;30:237–43. [PubMed]
  • Tourangeau R, Smith TW. The impact of data collection mode, question format, and question context. Public Opin Q. 1996;60:275–304.
  • Tourangeau R, Yan T. Sensitive questions in surveys. Psychol Bull. 2007;133:859–83. [PubMed]
  • Turner CF, Ku L, Rogers SM, Lindberg LD, Pleck JH, Sonenstein FL. Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science. 1998;280:867–73. [PubMed]
  • van de Wijgert J, Padian NS, Shiboski S, Turner CF. Is audio computer assisted interviewing a feasible method of surveying in Zimbabwe? Int J Epidemiol. 2000;29:885–90. [PubMed]
  • van Griensven F, Naorat S, Kilmarx PH, et al. Palmtop-assisted self-interviewing for the collection of sensitive behavioral data: randomized trial with drug use urine testing. Am J Epidemiol. 2006;163:271–8. [PubMed]
  • Weinhardt LS, Forsyth AD, Carey MP, Jaworski BA, Durant LE. Reliability and validity of self report measures of HIV-related sexual behavior: progress since 1990 and recommendations for research and practice. Arch Sex Behav. 1998;27:155–80. [PMC free article] [PubMed]
  • Wellings K, Nanchahal K, Macdowall W, et al. Sexual behaviour in Britain: early heterosexual experience. Lancet. 2001;358:1843–50. [PubMed]