We systematically reviewed the literature on skin cancer education for PCPs. Our results demonstrate that a multitude of interventions have been implemented, evaluated, and published, many of which have shown significant improvements in provider knowledge, competence, confidence, diagnostic performance, or systems outcomes. Curricular components were fairly consistent across educational interventions. All interventions provided instruction on skin cancer diagnosis. The majority included instruction on epidemiology and management. Of note, nearly two-thirds (62%) included training on patient counseling. Curricular components were more or less emphasized depending on the perceived role of targeted PCPs in skin cancer detection and care, which can vary by specialty and geographic location. For example, instruction on management options ranged from triage and specialist referral to biopsies and definitive surgical management. The Basic Skin Cancer Triage course, the Skin Watch program, and the Melanoma Education for Primary Care program were notable for including five of six curricular components, presenting thorough descriptions of the educational components, and undergoing repeated evaluation.33–39 The aforementioned interventions each received funding at the national level or from large independent funds.
Dermoscopy is a novel addition to the PCP skin cancer curriculum, included in only two interventions and first appearing as part of an intervention in 2006.34,40 The decision to introduce dermoscopy to non-dermatologists, and the best method of teaching it, have been debated, given that the technique is difficult to learn without formal training.41 Use of dermoscopy as a triage tool for suspicious lesions has been advocated.42 The Three-Point Checklist, a dermoscopic algorithm (described in the Online Appendix), has been shown to improve the ability of PCPs to triage suspicious lesions.40,43–47 Several studies have demonstrated improved diagnostic accuracy when teaching dermoscopy to non-dermatologists, supporting the notion that dermoscopy may be a valuable addition to a skin cancer education curriculum.44,47,48
The delivery format has proven to be very important in medical education, and various techniques have evolved over the years. Techniques range from didactic programs, opinion leaders, and information distribution to interactive education, audit and feedback, and outreach.49 Traditional passive learning based on didactic presentations is generally not effective in changing professional behavior.49,50 Better results have been obtained with multifaceted and interactive interventions incorporating multiple methods, such as interactive workshops or didactic presentations combined with application workshops.49–52 This is consistent with adult learning approaches, which suggest that physicians learn best in response to problems they perceive as relevant.53 As a specific type of interactive technique, personalized feedback has been increasingly incorporated into successful education programs in many fields and has been shown to enhance learning.54,55 Internet-based educational interventions provide the opportunity for interactivity and have grown rapidly in number across all health professions.56 Some studies suggest that interactive Internet-based continuing medical education (CME) can achieve comparable or superior results relative to traditional methods; however, data from a recent meta-analysis are inconclusive as to whether this approach is more efficacious.32,56
We expected to see a time-related trend toward delivery formats incorporating feedback, interactivity, and Internet-based instruction. However, we did not identify any delivery formats that trended with time. Interestingly, our review found no published evaluations of web-based interventions since 2001. It should be noted that web-based programs are being developed for dermatology, but they were not captured in our review because, according to our search methods, they lack published evaluations. Lack of standardized evaluation has been problematic among online education programs.57,58 This may also be the case with recently developed CME courses and other educational programs that incorporate more novel delivery formats but have not yet been formally evaluated. To assess which delivery formats are most effective in skin cancer education programs, interventions that incorporate novel delivery formats, especially those using the Internet, should be formally studied.
Although curriculum components were similar across studies, the specific outcomes studied (knowledge, competence, confidence, diagnostic performance, systems) were less consistent (Table ). The Accreditation Council for Continuing Medical Education recommends that educational activities be linked to changes in competence, performance, or patient outcomes.59 To determine the general efficacy of an intervention for the purposes of our review, we determined the positive or negative result for each outcome studied. In our descriptive analysis, the majority of outcomes studied showed a significant positive effect for at least one measure, and therefore most interventions were determined to be positive. Notably, systems outcomes, which are important to assess but are often considered difficult to change, produced positive results.60,61 While studies evaluating knowledge were most likely to show a significant positive outcome, studies examining diagnostic performance were least likely to show a significant effect, suggesting that improvements in competence, the most commonly studied outcome, may not translate into improvements in practice.50,62,63 No study looked directly at patient outcomes; only intermediaries for patient health outcomes were evaluated. The ultimate determination of an educational program's success is measured by patient outcomes (such as melanoma-associated morbidity and mortality); however, these studies are difficult to design and execute, and they require large sample sizes and long-term follow-up.
It is difficult to compare the effects of different curricular and delivery variables on outcomes across these studies because of their variable study designs. Studies of medical education programs are often subject to biases and confounding factors, resulting in great heterogeneity across studies because of variation in learners, instructional methods, outcome measures, and other aspects of the educational context.64 As shown in our review, a wide and inconsistent variety of outcome measures and evaluation methods was employed when studying the interventions. Therefore, we are unable to draw conclusions about specific aspects of interventions that were more or less likely to result in a positive assessment.
Our search was limited to peer-reviewed health care-related journals, which generally publish studies with significant results, thus generating a biased sample of studies. We recognize that many programs were not captured in our review because they lacked evaluation and reported elements; these include program designs, web-based tutorials, CME programs, and physician workshops. Due to the variability among the reviewed studies, meta-analysis was not possible.64