We conducted a survey of family medicine and internal medicine residency programs in the United States to explore their use of educational games. Support and use of educational games were high at 92% and 80%, respectively, with Jeopardy-like games being the most frequently used (78%). While 62% and 47% of programs used games as teaching and review tools, respectively, only 4% used them as evaluation tools.
This study has a number of strengths. To our knowledge, this is the first national survey exploring the use of educational games. The survey questionnaire was rigorously designed based on input from current and former program directors and chief residents. The response rate (52%) was comparable to the mean response rate of surveys of physicians published in medical journals (54%) [16].
One limitation of this study is a possible selection bias if program directors who did not support or use educational games were less likely to respond. In that case, our findings would have overestimated the use of games. In the worst-case scenario (i.e., assuming that all non-respondents neither supported nor used games), the percentages of program directors supporting or using educational games would be 48% and 41%, respectively. Such percentages would still reflect relatively high support and use. Information bias is less likely here because of the factual nature of the questions, particularly the one about using educational games. The question about preference for using educational games is obviously more subjective, and we did not test this question for reliability.
Educational games appear to be equally popular as educational tools among family medicine and internal medicine residency programs. The reasons for the inverse association between the use of educational games and the percentage of international medical graduates in a program are not clear. These analyses were not defined a priori and should be interpreted with caution given the potential study limitations discussed above.
The strong support and high prevalence of use of educational games are intriguing in the face of the limited evidence supporting their effectiveness [4]. This remains concerning even when considering how the potential response bias could have overestimated the percentage of supporters and users of educational games. A possible explanation for these findings is that medical educators do not follow an evidence-based approach in selecting their educational strategies. For example, they might not search for the evidence relating to the educational interventions they employ. Also, they might give more weight to perceived effectiveness (based on personal educational experience) than to systematically evaluated effectiveness. This may particularly apply when the quality of the studies providing the evidence is poor.
The lack of evidence for the effectiveness of educational games is not necessarily evidence of their lack of effectiveness. This absence of evidence might be related to inadequate statistical power of the studies to show an effect and/or to their poor quality. Indeed, each of the three systematic reviews conducted in this area identified a limited number of studies, which were of moderate methodological quality at best and generally poorly reported [4