Early detection of melanoma may provide an opportunity to positively impact melanoma mortality. Numerous skin cancer educational interventions have been developed for primary care physicians (PCPs) to improve diagnostic accuracy. Standardized training is also a prerequisite for formal testing of melanoma screening in the primary care setting.
We conducted a systematic review to determine the extent of evaluated interventions designed to educate PCPs about skin cancer, including melanoma.
Relevant studies in the English language were identified through systematic searches performed in MEDLINE, EMBASE, BIOSIS, and Cochrane through December 2010. Supplementary information was obtained from corresponding authors of the included studies when necessary.
Studies eligible for inclusion formally evaluated skin cancer education interventions and were designed primarily for PCPs. Excluded studies lacked a specified training intervention, used decision-making software, focused solely on risk factor identification, or did not directly educate or assess participants. Twenty studies met the selection criteria. Data were extracted according to intervention content, delivery format, and study outcomes.
All interventions included instructions about skin cancer diagnosis, but otherwise varied in content. Curricula utilized six distinct educational techniques, usually incorporating more than one. Intervention duration varied from 12 min to over 6 h. Eight of the 20 studies were randomized trials. Most studies (18/20, 90%) found a significant improvement in at least one of the following five outcome categories: knowledge, competence, confidence, diagnostic performance, or systems outcomes. Competence was most commonly measured; no study evaluated all categories. Variability in study design, interventions, and outcome measures prevented correlation of outcomes with intervention characteristics.
Despite the development of many isolated educational interventions, few have been tested rigorously or evaluated under sufficient standardized conditions to allow for quantitative comparison. Improved and rigorously tested skin cancer educational interventions for PCPs with outcome measures focusing on changes in performance are needed.
The online version of this article (doi:10.1007/s11606-011-1692-y) contains supplementary material, which is available to authorized users.
The incidence of melanoma, in all thickness categories, has increased over the past several decades, making melanoma a major public health concern1,2. Survival from late stage disease remains poor despite significant research efforts on a variety of treatment options3. Primary prevention strategies, focusing on reducing unnecessary ultraviolet (UV) exposure and other UV protection behaviors, can be difficult to implement4–7. Early detection, when melanoma is thin and confined to the skin, may offer the best chance to positively impact melanoma mortality; it is feasible through simple visual inspection and is paramount to effective secondary prevention strategies4.
Increased public education and awareness efforts are, in part, responsible for the escalating desire for skin cancer screening in the general population. Both dermatologists and primary care physicians (PCPs) must work together to meet these demands since the dermatology workforce shortage precludes dermatologists from adequately achieving this on their own8. PCPs often see patients with skin complaints, and they can serve as an important point of skin cancer diagnosis and triage for Americans, who make an average of 1.7 visits to PCPs each year9,10. Of patients with melanoma, 87% had a regular physician, and 63% had seen their PCP in the year prior to diagnosis, but only 20% had a dermatologist11,12. PCPs are thus well positioned to detect early melanoma and, not surprisingly, the initial presentation of melanoma is often to PCPs, who biopsy 1.4–13% of all melanomas13.
While the skin examination is the most frequently occurring diagnostic or screening service provided in office-based physician visits14, melanoma screening for the general population by PCPs is not formally recommended despite limited but important evidence that it is feasible and efficacious as a means of secondary prevention15–17. Currently, the United States Preventive Services Task Force (USPSTF) concludes that for adults in the general population who are not high risk, “the evidence is insufficient to recommend for or against routine screening for skin cancer using a total-body skin examination”17. The USPSTF made this recommendation based on (1) the lack of quality evidence that links screening to improved health outcomes and (2) limited information about the ability of PCPs to perform adequate examinations in the context of usual care18. With regard to the latter, it has been suggested that PCPs may not be prepared or sufficiently trained to identify early skin cancer18–21. An effective training program is essential prior to conducting a rigorous screening trial designed to ultimately determine the efficacy of clinician skin examination and its impact on melanoma mortality reduction.
Unfortunately, most physicians have limited exposure to skin cancer and dermatology during medical school and residency22–26. Lack of confidence and poor diagnostic skills are barriers to effectively performing skin cancer examinations, and many PCPs remain eager for education that can improve their diagnostic accuracy for skin cancer27. Recent surveys of PCPs reveal increasing interest in dermatology courses and educational activities related to skin cancer28,29. Interest in diagnostic aids for melanoma detection is also increasing among PCPs. In 2009, the American Academy of Family Physicians held the first dermoscopy course at their Annual Scientific Assembly Meeting (www.aafp.org accessed March 2, 2011). (See Online Appendix for a definition of dermoscopy.) The demand for this dermoscopy course has doubled, resulting in multiple dermoscopy sessions being offered at the annual meeting in 2010 (www.aafp.org accessed March 2, 2011 and personal communication, A.A. Marghoob, June 2010).
In light of high demand and concerns about the adequacy of existing education, we undertook a systematic review of the literature to compare components and outcomes of interventions that have been developed and tested for skin cancer education of PCPs. Awareness of previously developed educational interventions has the potential to impact the design and evaluation of future training efforts.
We followed guidelines described by the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement in selecting and assembling publications for this review30. The goal of the PRISMA statement is to help ensure clarity and transparency when reporting systematic reviews30.
Studies eligible for inclusion in this review evaluated skin cancer educational interventions designed primarily for PCPs, including family practice physicians, internal medicine physicians, and general practitioners. Studies were excluded if they aimed to instruct exclusively residents, medical students, dermatologists, dermatology residents, nurses, or lay people about skin cancer. Skin cancer was defined as melanoma, basal cell carcinoma, and squamous cell carcinoma; interventions focusing exclusively on nonmelanoma skin cancers were excluded because melanoma is responsible for most skin cancer deaths31. Studies that involved surveys of prior dermatologic education, lacked a specified training intervention, addressed other types of cancer or dermatologic conditions other than skin cancer, used decision-making software, focused solely on identification of risk factors, or had no direct participant education or assessment with reported results were excluded. Studies whose main text was in a language other than English were excluded. Table 1 provides a summary of study inclusion criteria.
We performed a systematic search of articles published in peer-reviewed health care-related journals between 1966 and December 2010 using MEDLINE, EMBASE, BIOSIS, and Cochrane. Three categories of terms were searched: (1) melanoma, skin cancer, pigmented skin lesions, skin malignancy, or melanocytic; (2) primary care doctor, primary health care, primary care provider, family physicians, family practice, general practitioner, internal medicine, or non-specialist; and (3) continuing medical education, training, instruction, teach, tutorial, or curriculum. In MEDLINE, Medical Subject Headings (MeSH) were used (skin neoplasms, melanoma, family practice, primary health care, family physicians, and continuing medical education). In EMBASE, Emtree terms were exploded (melanoma, skin cancer, family physician, family doctor, general practitioner, general practice, internal medicine, primary care, primary medical care, training, and curriculum). Cited reference searching, using Scopus and the Science Citation Index, on selected articles was also performed. Studies were selected for inclusion by three authors (JG, EQ, and SD), with SD providing the final decision in the event of disagreement (four studies required evaluation by SD and all were excluded).
Two authors (JG and EQ) reviewed all studies and independently collected data. In the event of discrepancy, a collaborative review and agreement occurred. Data were extracted according to criteria most useful for intervention comparison (Table 2), such as components of curriculum (diagnosis, epidemiology, counseling, management, dermoscopy, and detection algorithm) and delivery format (live, literature, multimedia, feedback, interactive, web-based), as well as funding source(s). Data were also extracted according to the evaluation of the intervention, which included study design, number of participants, and outcomes (knowledge, competence, confidence/attitudes, diagnostic performance, and systems outcomes). Complete definitions for the study variables pertaining to curriculum, delivery format, and outcomes can be found in Table 2 and were derived from study variables used in a recent meta-analysis of Internet-based learning in health professions32.
All included studies provided results for the outcomes investigated for the educational intervention, but the details of the actual educational component were not always specified. Information often absent included length of the instructional component and the topics covered in the curriculum. A study assessment query form was developed to obtain the characteristics of each training program, based on the study variables described above. Corresponding authors were identified, and e-mail information was obtained from a number of studies. If the corresponding author did not have updated contact information, was deceased, or could not be reached, a co-author was queried. The query form consisted of 14 multiple-choice or short answer questions and three open-ended questions. The query form concluded with a request to share any educational materials used for the study. Authors were also e-mailed a preliminary compilation of relevant studies and educational interventions in tabular format. This enabled authors to verify the details listed for their study.
We retrieved 1,980 citations, of which 66 (51 from literature review and 15 from reference searching) were reviewed in full; 20 met the inclusion criteria and were included in our review. Information about the studies identified and excluded is summarized in Figure 1. The 20 studies evaluated 13 educational interventions: 7 interventions were evaluated once, 5 were evaluated twice, and 1 was evaluated 3 times. Characteristics of each intervention are summarized in Table 3 and are considered according to the study variables defined in Table 2. The study assessment survey instruments were completed by corresponding authors from 12 of the 20 studies, covering 8 of the 13 interventions; each completed survey provided information or clarification beyond what appeared in the publication.
Each of the 13 interventions included instruction in diagnosis of melanoma and possibly other skin cancers, and 12 (92%) presented additional relevant information. This additional instruction included epidemiology in ten (77%) interventions, management in eight (62%), and counseling in eight (62%). A detection algorithm was used in six (46%) interventions, with ABCD(E) (defined in the Online Appendix) being most commonly taught (n=4). Instruction on dermoscopy was included for two (15%) interventions. Only one (8%) intervention utilized a needs assessment during curriculum development33,34.
Live format was used in nine (69%) interventions, literature in eight (62%), interactive format in six (46%), multimedia in three (23%), feedback in three (23%), and web-based in two (15%). Single-format delivery strategies were used in three (23%) of the interventions, two-format strategies in six (46%), and multifaceted educational strategies (three or more formats) in four (31%). Excluding one intervention that delivered instruction over several weeks to months, interventions for which length was specified averaged 2.3 h in duration; however, there was a wide range (12 min to over 6 h).
We categorized the study outcomes regarding the education interventions as pertaining to knowledge, competence, confidence, diagnostic performance, or systems outcomes. Table 4 summarizes the outcomes of the studies regarding each intervention according to the measure and method of evaluation. Outcome measures found in Table 4 are simplified to facilitate comparison and may not reflect the full extent of outcome measures assessed in the study (a more detailed version of Table 4 can be found online). If a study reported a statistically significant improvement for a measure in an outcome category, it was given a “plus” for that measure. Results denoted by a “minus” did not have a statistically significant positive impact. An intervention was considered positive overall if at least one measure in one study of the intervention earned a “plus.” Of the 13 educational interventions, change in knowledge was assessed after five (38%), all of which were positive. Competence was assessed after nine (69%), of which seven were positive. Confidence was assessed after seven (54%), of which five were positive. Diagnostic performance was assessed after five (38%), of which three were positive. Systems outcomes were assessed after seven (54%), of which six were positive.
We systematically reviewed the literature on skin cancer education for PCPs. Our results demonstrate that a multitude of interventions have been implemented, evaluated, and published, many of which have shown significant improvements in provider knowledge, competence, confidence, diagnostic performance, or systems outcomes. Curricular components were fairly consistent across educational interventions. All interventions provided instruction on skin cancer diagnosis. The majority included instruction on epidemiology and management. Of note, nearly two-thirds (62%) included training on patient counseling. Curricular components were more or less emphasized depending on the perceived role of targeted PCPs in skin cancer detection and care, which can vary by specialty and geographic location. For example, instruction on management options ranged from triage and specialist referral to biopsies and definitive surgical management. The Basic Skin Cancer Triage course, Skin Watch program, and the Melanoma Education for Primary Care program were notable for including five of six curricular components, presenting very thorough descriptions of the educational components, and undergoing repeated evaluation33–39. The aforementioned interventions each received funding on a national level or from large independent funds.
Dermoscopy is a recent and novel addition to a PCP skin cancer curriculum, being a component of the curriculum for only two interventions and first appearing as part of an intervention in 200634,40. The decision to introduce dermoscopy to non-dermatologists and the best method to teach it have been debated, given that it is difficult to learn without formal training41. Use of dermoscopy as a triage tool for suspicious lesions has been advocated42. The Three-Point Checklist, a dermoscopic algorithm (described in the Online Appendix), has been shown to improve the ability of PCPs to triage suspicious lesions40,43–47. Several studies have demonstrated improved diagnostic accuracy when teaching dermoscopy to non-dermatologists, supporting that dermoscopy may be a valuable addition to a skin cancer education curriculum44,47,48.
The delivery format has proven to be very important in medical education, and various techniques have evolved over the years. Techniques range from the use of didactic programs, opinion leaders, and information distribution to interactive education, audit and feedback, and outreach49. Traditional passive learning based on didactic presentations is generally not effective in changing professional behavior49,50. Better results have been obtained with multifaceted and interactive interventions incorporating multiple methods, such as interactive workshops or didactic presentations combined with application workshops49–52. This is consistent with adult learning approaches, which suggest that physicians learn best in response to perceived relevant problems53. As a specific type of interactive technique, personalized feedback has been increasingly incorporated into successful education programs in many fields and has been shown to enhance learning54,55. Internet-based educational interventions provide the opportunity for interactivity and have grown rapidly in number across all health professions56. Some studies suggest interactive Internet-based continuing medical education (CME) can achieve comparable or superior results compared with traditional methods; however, data from a recent meta-analysis are inconclusive as to whether this approach is more efficacious than traditional methods32,56.
We expected to see a time-related trend towards the development of interventions with delivery formats utilizing feedback, interactivity, and Internet-based instruction. However, we did not identify any delivery formats that trended with time. Interestingly, in our review we found no published evaluations of web-based interventions since 2001. It should be noted that web-based programs are being developed for dermatology, but they were not captured in our review because they lack published evaluations according to our search methods. Lack of standardized evaluation has been problematic among online education programs57,58. This may also be the case with recently developed CME courses or other educational programs, which are incorporating more novel delivery formats but have not yet been formally evaluated. To assess which delivery formats are most effective in skin cancer education programs, interventions that incorporate novel delivery formats, especially those using the Internet, should be formally studied.
Although curriculum components were similar across studies, the specific outcomes (knowledge, competence, confidence, diagnostic performance, systems) studied were less consistent (Table 4). The Accreditation Council for Continuing Medical Education recommends that educational activities be linked to changes in competence, performance, or patient outcomes59. In order to determine the general efficacy of an intervention for the purposes of our review, we determined the positive or negative result for each outcome studied. In our descriptive analysis, the majority of outcomes studied showed a significant positive effect for at least one measure, and therefore most interventions were determined to be positive. Notably, systems outcomes, which are important to assess but are often considered difficult to change, produced positive measurements60,61. While studies evaluating knowledge were most likely to have a significant positive outcome, studies looking at diagnostic performance were least likely to have a significant effect, suggesting that improvements in competence, the most commonly studied outcome, may not translate into improvements in practice50,62,63. No study looked directly at patient outcomes; only intermediaries for patient health outcomes were evaluated. The ultimate determination of an educational program’s success is measured by patient outcomes (such as melanoma-associated morbidity and mortality); however these studies are difficult to design and execute, and require large sample sizes and long-term follow-up.
It is difficult to compare the effects of different curricular and delivery variables on outcomes across these studies because of their variable study designs. Studies of medical education programs are often subject to biases and confounding factors resulting in great heterogeneity across studies because of variation in learners, instructional method, outcome measures, and other aspects of the educational context64. As shown in our review, a wide and inconsistent variety of outcome measures and methods of evaluation were employed when studying the interventions. Therefore, we are unable to draw conclusions about specific aspects of interventions that were more or less likely to result in a positive assessment.
Our search was limited to peer-reviewed health care-related journals, which generally publish studies with significant results, thus generating a biased sample of studies. We recognize the existence of many programs that were not captured in our review because of lack of evaluation and reported elements, such as program designs, web-based tutorials, CME programs, and physician workshops. Due to the variability among the reviewed studies, meta-analysis was not possible64.
While it is commendable that many interventions instructing PCPs on skin cancer detection have been created, implemented, and published, we found that lack of uniformity across interventions and outcome assessments precluded direct comparison of efficacy. The absence of similarity among interventions may impede dissemination of optimal interventions. This is particularly problematic when dealing with the detection of potentially fatal cancers. Improved and standardized methods of assessment are important for ultimately studying the effects of educational interventions conducted on a larger scale. Understudied areas include dermoscopy as a curricular component and the use of feedback, interactive components, and web-based strategies. We should move towards effectively and efficiently training large numbers of clinicians using an easily reproducible, generalizable, and accessible format. We suggest that future interventions be designed to measure change in participant diagnostic performance and patient outcomes, so that the efficacy of clinician skin examination and its impact on melanoma mortality reduction can be determined. We believe effective educational interventions are part of the solution to the challenge of early detection; while much has been done in an effort to improve early detection of melanoma and other skin cancers, still much remains to be done.
Funders This study was supported in part by a research grant from the Melanoma Research Alliance. The sponsors had no role in the design and conduct of the study; in the collection, analysis, and interpretation of data; or in the preparation, review, or approval of the manuscript.
Prior presentations None.
Conflict of Interest Jacqueline M. Goulart, BA, Stephen Dusza, DrPH, Gwen Alexander, PhD, Melody J. Eide, MD, Ashfaq A. Marghoob, MD, and Allan C. Halpern, MD, have nothing to disclose. Elizabeth A. Quigley, MD, reports honoraria received from Novartis Pharmaceuticals in 2009 and 2010 for skin screening and employee-health lectures. Maryam M. Asgari, MD, reports a grant received from Novartis Pharmaceuticals (2009). Suzanne W. Fletcher, MD, MSc, reports stock ownership of Berkshire Hathaway and honoraria received from UpToDate (editor, primary care) and Wolters-Kluwer (author, Clinical Epidemiology-The Essentials). Alan C. Geller, MPH, reports honoraria from UpToDate (author, skin cancer). Martin A. Weinstock, MD, PhD, reports employment by University Dermatology Inc.; consultancies for Abbott, Astellas, Conrail, Johnson & Johnson, Nordic Biotech, Roche, Schering; and expert testimony for Playtex.
Below is the link to the electronic supplementary material.
ESM 1 (DOC 15 kb)
Members of the INFORMED Group include: Gwen Alexander, PhD; Maryam M. Asgari, MD; Melody J. Eide, MD; Suzanne W. Fletcher, MD, MSc; Alan C. Geller, MPH; Allan C. Halpern, MD; and Martin A. Weinstock, MD, PhD