We have described the successful implementation of an online resource used by residents during a one-month ambulatory experience that simulates the clinical decisions that need to be made when seeing patients in a longitudinal clinic. Our use of simulations allowed the residents to practice patient care in a safe environment and provided immediate feedback on their patient-care decisions. The content and format were rated highly, with the majority of residents recommending its ongoing use not only in our rotation but also in others during their residency. Residents accessed COCOS on their own time without the need for direct involvement of faculty. It was not perceived to be labor- or time-intensive for the residents; each case required approximately 20 minutes to complete, consistent with other reports [9]. Completing four to six cases in a four-week period appears to be an acceptable expectation of residents. Although we applied the model to topics in endocrinology, its generic template facilitates writing cases in other medical subspecialties with a strong ambulatory component.
COCOS is unique in its emphasis on the competencies required to provide follow-up or continuing care to patients. To our knowledge, this is the first study to investigate the effect of web-based simulations of continuity of care as an adjunct to a clinical rotation at the postgraduate level. "Stand-alone" internet curricula in ambulatory care have been rated highly by internal medicine residents and have resulted in improved test scores [9], but the lack of control groups in these studies did not allow comparisons with 'traditional' teaching formats. Our use of a control group allowed us to compare COCOS with our 'traditional' clinical rotation and thereby determine whether it is useful as an adjunct rather than a replacement. Other interventions used at the postgraduate level in internal medicine ambulatory settings did not place as much emphasis on continuity of care [12].
We did not measure the number of resident encounters with patients in each category (initial consult or follow-up appointment), but traditionally residents are asked to see new patients (i.e. initial consults) rather than patients returning to clinic. Students' learning in the clinical setting is often opportunistic, so more learning is likely to occur around topics that are common than around those that are seen less often. Even though the use of COCOS resulted in improved test scores for topics relevant to the "initial consult", the improvement was more pronounced for topics relevant to seeing patients during the "follow-up visit". Thus COCOS may be most beneficial for improving learning around the types of patient encounters (i.e. follow-up appointments) that are less frequent during a short clinical rotation. Similarly, COCOS was more beneficial for diseases or conditions less commonly seen during the rotation. The use of COCOS cases for Graves disease and hyperprolactinemia resulted in a significant improvement in test scores, in contrast to a non-statistically significant improvement for questions pertaining to type 1 diabetes. Comments left by the residents included suggestions to create cases for other uncommonly seen endocrine disorders such as adrenal insufficiency and primary hyperparathyroidism. Thus the use of COCOS as an adjunct to a clinical rotation should emphasize "follow-up" or "continuing care" for commonly seen conditions like diabetes, but include content for both "initial consult" and "follow-up" issues for uncommon conditions.
It is noteworthy that the use of COCOS resulted in greater test score improvements for topics for which there were no COCOS cases (thyroid nodules, PCOS). There are two possible explanations for this observation. Residents in the intervention group may have had better exam-taking abilities than those in the control group. Alternatively, the use of COCOS may produce qualitative changes in the learning process that result in more efficient or improved learning, as has been noted in some studies outside of medical education [13]. Ninety-one percent of residents reported that COCOS enhanced self-directed learning, and our group of six full-time faculty members anecdotally reported that residents asked more questions about follow-up care, lending support to this hypothesis. However, for this to occur COCOS would have to be used early in the rotation, and we did not determine when in the rotation it was used by each resident.
Our residents' confidence in managing different types of endocrine conditions is similar to what is reported in the literature [15]. In contrast to the difference in knowledge gains with the use of COCOS, there were no significant differences between groups in the changes in confidence. Residents may overrate themselves at the start of the rotation or underrate themselves at the end of the rotation when compared with ratings by their supervisors [16]. The latter may be especially true for residents who used COCOS if they perceived that they had done poorly during the case simulations. A retrospective self-analysis of how residents thought their confidence changed over the course of the rotation would be insightful.
The use of COCOS during the rotation proved advantageous compared with the rotation alone, which included the provision of printed material. One of the key reasons for conducting this study was the lack of reading material that adequately covers many of the "continuity of care" learning objectives. The reading material and printed guidelines provided to the control group were primarily review articles that only briefly address the COCOS learning objectives. We did not provide the control group with a paper-based version of the COCOS content. We feel that COCOS has many features that provide advantages over a stand-alone, paper-based source of information. Simulated cases provide residents with opportunities to build on knowledge gained from their clinical experience, thus promoting deeper learning. While we did not include multimedia content for the purposes of this pilot study, images, sounds and video clips can be easily incorporated into any case to add realism to the simulations. The option to complete online pre- and post-case self-assessment quizzes captures learners' attention and capitalizes on their motivation [18]. The interactive format, including the ability to record residents' answers to questions, was adopted not only to better engage the learners [19], but also to assist course administrators in identifying gaps in knowledge and in the curriculum as part of a needs analysis.
Manually authoring unique case scenarios, with potentially different sequences, content and variables, is an extremely labor-intensive undertaking that limits their availability and reproducibility [20]. By using a generic template design for COCOS, disease-specific information could easily be added to create new cases. In COCOS, residents are asked to answer questions or make clinical decisions, and regardless of the type of response, the storyline continues on its preset course. A switch to a more probabilistic type of simulation, in which an action could result in a number of possible outcomes, would improve the level of realism.
The text-based format of the cases allows for easy modification in light of changes in clinical evidence, or to adjust the case sequence for diseases in which certain topics may not be applicable. For example, the page devoted to "Medical management during pregnancy" could be omitted or interchanged with other special situations, such as "Preparation for surgery", with relative ease. Opportunity exists for wide application of COCOS beyond the postgraduate level, including both undergraduate and continuing education.
This study has several limitations. It was conducted at a single training program, and the structure and administration of our rotation may not be representative. We focused only on "continuity of care" from the perspective of the CanMEDS Medical Expert role. There are elements of longitudinal clinical practice that are difficult to simulate (e.g. reviewing the chart, refreshing oneself on the patient's problems, retrieving missing information, and understanding patient-doctor relationships, family dynamics and community resources) [3] that require competencies in other CanMEDS roles such as Manager, Communicator and Professional. Only hands-on experience may be adequate to teach our trainees about these elements. We did not randomize residents within each rotation, in order to avoid contamination. We are somewhat reassured that the structure and administration of the rotation did not differ between the two groups, and that the self-reported numbers and types of patients seen by the two groups were similar. Even though residents may have been enrolled at different times, there was no difference in baseline test or confidence scores. Fifty percent of the control group and 31% of the intervention group were in their third year of residency. If clinical experience influenced outcomes, one would anticipate that the group with more experience would perform better on the knowledge assessment. Alternatively, more senior residents may have better-established learning habits and thus may be less likely to benefit from an online intervention than more impressionable junior residents. Regardless, despite being at a relatively earlier stage of training, the intervention group had greater improvements in test scores; if randomization had resulted in an equal distribution by postgraduate year, the differences between groups might have been more pronounced. Similarly, if participants had been randomized and contamination had occurred, the control group's test scores could have been artificially raised, and the results would still be significant if the intervention group demonstrated greater improvements in knowledge. Alternating the control and intervention groups from month to month is a potential way to improve validity and avoid contamination.
When interpreting the results, it is not known whether the improvements in test scores seen in the intervention group can be attributed to a computer-dependent feature of COCOS. Future studies comparing a paper-based with a web-based presentation of COCOS content, or comparing two different versions of COCOS (to measure the impact of changing a single feature while holding others constant), would be useful. The net value of COCOS for learning gains, relative to the resource costs to develop and maintain it, remains unknown. We did not measure the amount of time residents spent completing the online modules, nor did we survey residents on whether they felt the use of COCOS resulted in a more efficient use of time to learn about its specific topics, thereby allowing them to allocate the saved time to learning in other curriculum areas. Thus it is theoretically possible that the time allocated to COCOS took away from other learning opportunities. We also did not measure whether the implementation of COCOS allowed staff physicians to save time and/or observe improvements in their other academic responsibilities. While resident opinions may not be the best measure of worth, the overwhelming response of residents who used COCOS was very positive, and they reported that it allowed them to learn about topics they would not otherwise have been able to cover.
Forty-four percent of our total participants did not complete the post-test, and their pre-test scores were not included in the analysis. We did not systematically measure reasons for non-completion. Participation in the study was explicitly voluntary, and we suspect that these residents gave less priority to completing the post-test in favor of other aspects of the rotation. This could have biased the results in two ways. First, if these residents had a lower baseline level of knowledge, our results would have underestimated the changes in pre-post test scores; conversely, if they had higher pre-test scores, our results would have overestimated the changes. However, when analyzing the pre-test scores of those who did not complete the study, there was no difference between groups, and we feel the dropout rate did not influence our interpretation of the data. Second, not completing the post-test could be an indicator of relatively weaker skills in, or a lower level of interest in, self-directed learning, and thus our final results may not be applicable to those residents. It is unclear why one resident assigned to the intervention group did not end up using the COCOS program; however, there were no reports of technical difficulties with accessing the website.
Our objective was to determine the effect of COCOS as an adjunct learning tool within an existing clinical rotation, and we cannot decipher whether the technology itself or the content (with technology as its vehicle for presentation) contributed most to the benefits. Specific studies are needed to determine whether the presentation elements we included (e.g. popup windows, interactivity) provide enough learning benefit to justify their higher costs. A comparison of case-based and non-case-based content would be an informative comparison of two different instructional methods for presenting the topics in our curriculum. It would also be interesting to compare the effects of using COCOS in a short block rotation with its use in a longitudinal clinical experience. We are currently studying whether the knowledge gained from using COCOS leads to long-term retention and improvements in clinical performance.