In looking for associations between program quality indicators, our results confirm the associations between IM-ITE scores, ABIM-CE pass rates, and years of RRC certification, and extend knowledge of associations among other program quality indicators. We found no significant differences in ABIM-CE pass rates relative to hospital type, proportion of IMG graduates, program director workload, program size, faculty size, or acceptance of pharmaceutical funding. Because the range of ABIM-CE pass rates in our surveyed programs was so narrow, our study may have lacked the power to detect meaningful associations between ABIM-CE pass rates and the other indicators. The range of IM-ITE scores (both raw scores and percentile rank scores) in surveyed programs was wider, and we were able to demonstrate associations between IM-ITE scores and other quality indicators. We found that programs with larger percentages of IMG graduates had higher PGY3 IM-ITE rank scores (69th percentile vs. 49th percentile), consistent with the findings of others [6]. IM-ITE scores were higher in programs that accepted pharmaceutical funding. PGY3 IM-ITE scores were also higher in programs at which program directors had lower clinical workloads.
In our univariate model, we found significant associations between length of RRC certification and markers of program size (i.e., faculty size and number of residents). Programs with more residents and programs with more faculty had longer RRC certification than did smaller programs and those with fewer faculty. We did not, however, find an association between hospital type and length of RRC certification, which contrasts with the findings of others [16]. Larger programs of either type (i.e., university vs. non-university) perhaps had more resources with which to comply with ACGME requirements. These resources, however, do not appear to be used to decrease the clinical workload of program directors, which is no less at larger programs than at smaller ones. While some of the demonstrated associations between program characteristics and length of RRC certification are consistent with the most basic goal of residency training (i.e., to train physicians to deliver quality care, as reflected in ABIM-CE pass rates), it is unclear why programs with more residents and faculty have longer RRC certification.
Among the strongest associations demonstrated was the negative association between the proportion of IMG graduates and length of RRC certification. This negative association persisted in our multivariable model, in which we adjusted for number of residents, number of faculty, and IM-ITE scores. We found that programs with higher proportions of IMGs tend to be based at non-university hospitals, to have fewer full-time faculty, and to accept pharmaceutical funding to support training. This may suggest that such programs have fewer resources with which to comply with ACGME certification requirements, even though they demonstrate successful medical knowledge outcomes among trainees (i.e., IM-ITE scores and ABIM-CE pass rates).
Strengths of our study include survey results from a group of internal medicine training programs whose distribution mirrors that of all internal medicine residency training programs, and a satisfactory survey response rate. Nevertheless, several limitations deserve mention. First, the small variation in ABIM-CE pass rates among surveyed programs limited our ability to detect associations between program quality indicators and ABIM-CE pass rates. In addition, the fact that the average ABIM-CE pass rates and IM-ITE scores in responding programs were greater than 50% suggests that, although the distribution of responding programs mirrored that of all programs, responding programs differed in some way from all programs, and our results may not be generalizable to all residency programs. Our survey was limited to those potential indicators suggested by the medical literature and may have missed other program characteristics that are associated with quality but have not yet been identified. Our survey also relied on self-reported ABIM-CE pass rates and IM-ITE performance, which may be less accurate than independently verified data. Some of the quality indicators studied were aggregate results of individual data (e.g., IM-ITE scores, ABIM-CE pass rates), which is a limitation of this and related studies. Finally, we studied associations, not causality. We do not know whether addressing a program characteristic such as program director workload, faculty size, or faculty-to-resident ratio will improve educational outcomes such as IM-ITE scores, ABIM-CE pass rates, or length of RRC certification, but our results may form the basis for future study.