Am J Prev Med. Author manuscript; available in PMC 2011 November 1.
PMCID: PMC2994103

Durable Improvements in Prostate Cancer Screening from Online Spaced Education

A Randomized Controlled Trial



Abstract

Background

Prostate cancer screening with prostate-specific antigen (PSA) is frequently performed counter to clinical practice guidelines.


Purpose

It was hypothesized that an e-mail–based intervention termed “spaced education” could reduce clinicians’ inappropriate screening for prostate cancer.


Design

The study was conducted as an RCT.


Setting/participants

The study involved 95 primary care clinicians in eight Veterans Affairs medical centers from January 2007 to February 2009.


Intervention

Participants were randomized into two cohorts: spaced education clinicians received four isomorphic cycles of nine e-mails over 36 weeks (zero to two e-mails per week), whereas control clinicians received no intervention. Each e-mail presented a clinical scenario and asked whether it was appropriate to obtain a PSA test. Participants received immediate feedback after submitting their answers.

Main outcome measures

The primary outcome was the number and percentage of inappropriate PSA screening tests ordered. Inappropriate testing was defined as use of PSA for prostate cancer screening in patients aged >76 or <40 years. Appropriateness of screening was dichotomized based on patient age at time of screening. Patients with PSA testing for non-screening reasons were excluded using a validated protocol. Logistic regression with adjustment for patient clustering by clinician was performed. Analyses were conducted in 2009.


Results

During the intervention period (Weeks 1–36), clinicians receiving spaced education e-mails ordered significantly fewer inappropriate PSA screening tests than control clinicians (10.5% vs 14.2%, p=0.041). Over the 72-week period following the intervention (Weeks 37–108), spaced education clinicians continued to order fewer inappropriate tests compared to controls (7.8% vs 13.1%, respectively, p=0.011), representing a 40% relative reduction in inappropriate screening.


Conclusions

Spaced education durably improves the prostate cancer screening behaviors of clinicians and represents a promising new methodology to improve patient care across healthcare systems.


Introduction

None of the major clinical practice guidelines (CPGs) recommend that prostate cancer screening with prostate-specific antigen (PSA) be routinely performed in asymptomatic men aged <40 years, aged >75 years, or with a less than 10-year life expectancy.1–4 Even so, PSA screening is frequently performed counter to CPGs. A nationwide study of Medicare and Veterans Affairs (VA) records found that more than 50% of men aged ≥75 years underwent PSA screening in 2003.5 In addition, a provider-level analysis demonstrated that 19% of all PSA screening tests ordered by clinicians from 1997 to 2004 were performed counter to guidelines.6

In August 2008, the U.S. Preventive Services Task Force (USPSTF)7 revised its guidelines to recommend specifically against performing PSA screening in men aged ≥75 years. Although this guideline is unambiguous as to when PSA screening should be discontinued in elderly men, it is not clear whether this change in guidelines will have a demonstrable impact on clinicians’ practice patterns.8 Barriers to guideline adherence by providers are manifold and include lack of CPG familiarity, inertia of previous practice, patient preferences, and systems-level obstacles.8,9 Although much recent effort has aimed to reduce barriers at the systems level, effective CPG implementation requires that clinicians have sufficient knowledge to execute the guideline recommendations effectively.10 In many cases, providers’ knowledge of pertinent CPGs is quite limited.8 Current methods of continuing medical education (CME) tend to focus on short-term learning gains rather than on the development of long-term knowledge that can be effectively translated to clinical practice.11,12 Not surprisingly, CME often has little to no impact on clinicians’ practice patterns.12,13

A novel system of online education termed “spaced education” holds promise as a method to improve the long-term efficacy of CME by harnessing the educational merits of both the spacing and testing effects. The “spacing effect” refers to the psychological research finding14–16 that information that is presented and repeated over spaced intervals is learned and retained more efficiently than information presented at a single time-point. The spacing effect appears to have a distinct neurophysiologic basis. A recent study17 demonstrated that spaced learning by rats improves neuronal longevity in the hippocampus and that the strength of the rats’ memories correlates with the number of new cells in this region of their brains. Advances have also been made in elucidating the molecular basis of the spacing effect, including the recent identification18 of a protein tyrosine phosphatase that regulates the spacing effect in Drosophila. The “testing effect” refers to the psychological finding that the process of testing does not merely assess the knowledge levels of individuals. Rather, it alters the learning process itself so that new knowledge is retained more effectively.19,20

Spaced education is currently delivered via periodic e-mails that contain clinical case scenarios and multiple-choice questions. After submitting answers to each question online, clinicians receive immediate feedback and educational material. The questions are then repeated over spaced intervals of time to harness the pedagogic benefits of the spacing effect. In RCTs, it has been shown that the spaced education methodology improves knowledge acquisition,21 boosts learners’ abilities to self-assess their performance,22 and improves knowledge retention.22–24 Spaced education is also well accepted by learners.21,25

This RCT investigated whether spaced education could durably reduce clinicians’ inappropriate screening for prostate cancer.


Methods

Study Participants

Primary care providers (physicians, nurse practitioners, and physician assistants) from the eight VA hospitals in the New England region (Veterans Integrated Service Network 1 [VISN-1]) were eligible to enroll. Participants were recruited via e-mail. Primary care providers (PCPs) who ordered PSA as a screen for prostate cancer fewer than 10 times each year were excluded. IRB approval was obtained from the Veterans Affairs Boston Healthcare System and Harvard Medical School to perform the present study.

Development and Validation of the Spaced Education Content

Each spaced education item consisted of an evaluative component (a multiple-choice question based on a clinical scenario) and an educational component (the answer and explanation). The content of the items was developed from published CPGs.1–4 Nine spaced education items were constructed by a clinical urologist to address whether PSA screening should or should not be performed in a variety of clinical contexts. The items focused primarily on circumstances in which PSA screening was not recommended. Three additional sets of nine items (for Cycles 2–4) were constructed to be identical to the original set, except for minor alterations to the clinical contexts to reduce the sense of repetition for study participants. The 36 spaced education items (four cycles of nine items) were independently content-validated by three physicians (two internists and a urologist).

Development and Validation of the Test Materials

Nineteen multiple-choice questions were developed based on a consensus of CPG recommendations for use of PSA in prostate cancer screening.1–4 The questions presented a clinical scenario and asked whether or not PSA screening should be performed. The questions were independently content-validated by three physicians and then pilot-tested by 33 physicians. Psychometric analysis of the questions was performed using the Integrity test analysis software (Edmonton, Canada). Fourteen questions were selected for inclusion in the test based on item difficulty, point-biserial correlation, and Kuder–Richardson 20 score.

Spaced Education Online Delivery System

The spaced education items were delivered to PCPs at designated time intervals via an automated e-mail delivery system. The e-mail presented the clinical scenario and question (evaluative component). Clicking a hyperlink in the e-mail opened a web page that allowed the PCP to submit an answer to the question. The answer was downloaded to a central server, and PCPs were immediately presented with a web page displaying the correct answer to the question and an explanation of the curricular learning point (the educational component).

Study Design and Organization

This multi-institutional RCT was conducted from January 2007 to February 2009. PCPs were stratified by hospital and block-randomized into two cohorts: the spaced education cohort received four cycles of nine e-mails over a 36-week period (zero to two e-mails per week), whereas those in the control cohort received no intervention (representing the standard of education in the VA system; Figure 1).

Figure 1
Structure of the RCT

The timing of the four spaced education cycles was established to take advantage of the educational benefits of the spacing effect. All of the material presented in Cycle 1 was presented again (with minor variations in clinical context) as cycled reviews at 3 weeks, 6 weeks, and 18–24 weeks (Figure 1). The time intervals between spaced education cycles were established based on psychology research findings to optimize long-term retention of learning.16,26

The 14-item test was administered to participants at enrollment (Test 1) and at Weeks 18 (Test 2) and 36 (Test 3). At Week 36, participants also completed a short survey that asked the amount of time required to answer each spaced education item. Participants received a $75 gift certificate to an online bookstore on submission of each test. Spaced education clinicians were also eligible to receive CME credit.

Clinical Data Sources

Data on PSA testing and patient characteristics were extracted from VISN-1 VISTA (Veterans Health Information Systems and Technology Architecture) databases for the male patients whose PSA levels were tested by participating clinicians during the trial period. Medication data were collected from VISN-1 pharmacy databases. Additional data were obtained from the VA National Patient Care Database Outpatient Clinic Files and the Patient Treatment Files, Austin Automation Center, Texas.

Patient age for determining appropriateness of PSA screening was defined at the time of each individual PSA test. Patients with PSA testing for reasons other than screening were excluded from the database following a previously validated protocol.6 Exclusion criteria included ICD-9 diagnosis codes for prostate cancer, prostate carcinoma in situ, nodular prostate and prostatitis; ICD-9 and CPT procedure codes for radical prostatectomy, external beam radiotherapy, brachytherapy, cryosurgery, and simple orchiectomy; prostate cancer–specific medications including leuprolide, goserelin acetate, abarelix, flutamide, nilutamide, and bicalutamide; and medications such as finasteride and dutasteride for which obtaining a baseline PSA level may be appropriate. If a single exclusion criterion was met, all of that patient’s PSA data were eliminated from the data set.
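The exclusion step described above might be sketched as follows. The field names and the (deliberately incomplete) code and medication sets are hypothetical stand-ins for the full ICD-9/CPT and pharmacy lists of the validated protocol6; the key behavior shown is that one matching record removes all of that patient’s PSA data.

```python
# Illustrative subsets only -- NOT the complete lists from the validated protocol.
PROSTATE_DX_ICD9 = {"185", "233.4", "601.9"}      # e.g., prostate cancer, carcinoma in situ, prostatitis
PROSTATE_PROC_CPT = {"55840", "77427", "55873"}   # e.g., prostatectomy, radiotherapy, cryosurgery
EXCLUDED_MEDS = {"leuprolide", "goserelin", "flutamide",
                 "bicalutamide", "finasteride", "dutasteride"}

def exclude_nonscreening_patients(psa_records):
    """Drop ALL PSA records for any patient who meets one exclusion criterion.

    Each record is a dict with hypothetical keys:
    {"patient_id": ..., "icd9": ..., "cpt": ..., "medication": ...}.
    """
    excluded = {
        r["patient_id"]
        for r in psa_records
        if r["icd9"] in PROSTATE_DX_ICD9
        or r["cpt"] in PROSTATE_PROC_CPT
        or r["medication"] in EXCLUDED_MEDS
    }
    return [r for r in psa_records if r["patient_id"] not in excluded]
```

Note that a patient is excluded wholesale, so even PSA tests predating the disqualifying diagnosis or prescription are removed from the data set.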

Outcome Measures

The primary outcome measure was the difference in the percentage of inappropriate PSA screening performed by PCPs in the spaced education and control cohorts. Based on published clinical guidelines and reports,1–4 inappropriate PSA utilization was defined as the use of PSA for prostate cancer screening in patients aged >76 or <40 years, or with an estimated life expectancy of less than 10 years. For the average U.S. man, an estimated life expectancy of 10 years is reached at age 76 years.27 No adjustments were made for patient comorbidities that might alter life expectancy. Appropriateness of screening was dichotomized based on patient age at time of screening.
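The age-based dichotomization just described reduces to a one-line rule (assuming age in whole years at the time of the test):

```python
def is_inappropriate_screen(age_at_test: int) -> bool:
    """A screening PSA is inappropriate when the patient is aged >76 or <40
    years at the time of the test (age 76 itself remains appropriate, per the
    10-year life-expectancy threshold)."""
    return age_at_test > 76 or age_at_test < 40
```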

Secondary outcome measures included (1) the change in test scores between cohorts measured over time and (2) the change in spaced education performance measured over time. An exploratory analysis was performed comparing the percentage of inappropriate screening in the 26 weeks prior to and after publication of the USPSTF statement in Week 82 of the study.

Statistical Analysis

Power calculations for the primary outcome measure were extrapolated from prior analyses6 of provider-specific PSA screening patterns in the VA VISN-1 hospitals. To detect a 30% relative decrease in inappropriate PSA screening, a total of 1093 PSA screening tests were required in each arm to provide a power of 0.8 (α=0.05).
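A sample-size calculation of this form can be sketched with Cohen’s arcsine effect size (h) for two proportions and standard normal quantiles. The 6.5% baseline rate below is an illustrative assumption; the study’s actual calculation was extrapolated from the provider-specific rates in ref. 6, so this sketch is not expected to reproduce the 1093 figure exactly.

```python
import math
from statistics import NormalDist

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Sample size per arm to detect p1 vs p2 with a two-sided z-test,
    using Cohen's h = 2*(asin(sqrt(p1)) - asin(sqrt(p2)))."""
    h = 2 * (math.asin(math.sqrt(p1)) - math.asin(math.sqrt(p2)))
    z = NormalDist().inv_cdf
    n = ((z(1 - alpha / 2) + z(power)) / h) ** 2
    return math.ceil(n)

# Example: a 30% relative decrease from an assumed 6.5% baseline rate
n_required = n_per_arm(0.065, 0.7 * 0.065)
```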

An intention-to-treat analysis of PSA screening outcomes was performed. The percentages of inappropriate PSA screening tests were calculated by dividing the number of inappropriate PSA screening tests ordered by a cohort by the total number of PSA screening tests ordered by that cohort. For regression modeling of factors associated with inappropriate testing, data were organized as one record per test. Logistic regression with adjustment for clustering by provider was performed using a generalized estimating equation approach, where provider was a repeated factor and a compound symmetry correlation structure within provider was assumed.28 PSA data from baseline (2006) and the intervention period (Weeks 1–36) were analyzed in relation to cohort, time period, and clinicians’ demographic information (age, provider type, gender) to test for differences resulting from the intervention. The 2006 PSA data were included to adjust for any baseline differences in PSA screening behaviors between the cohorts at the start of the trial. In a separate model, PSA data from the follow-up period (Weeks 37–108) were used to test for the persistence of these differences. Comparisons of differences between cohorts over the intervention and follow-up periods were performed using a continuous measure of time. Analyses were conducted in 2009, and statistical calculations were performed using PROC GENMOD in SAS, version 9.1.3.

An intention-to-treat analysis was also performed on the results of the multiple-choice tests administered at baseline and during the trial. Baseline scores were carried forward to impute missing data for clinicians lost to follow-up. Test reliability was estimated with Cronbach’s α, which assesses the systematic variance of a measure administered to a sample.29 Scores for each test and each spaced education cycle were calculated as the number of items answered correctly normalized to a percentage scale. Scores were analyzed with a two-tailed t-test with SPSS for Windows, version 15.0. Intervention effect sizes for learning were measured by means of Cohen’s d.30 Each clinician’s estimate of time required to complete one spaced education item was multiplied by 36 to estimate the time required to complete the entire 36-e-mail spaced education program.
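For reference, the two summary statistics named above can be computed directly. These stdlib-only functions are generic textbook sketches, not the SPSS procedures the authors used.

```python
import math

def _sample_var(xs):
    """Unbiased sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    item_scores is a list of k per-item score lists, each with one entry per
    respondent.
    """
    k = len(item_scores)
    n_resp = len(item_scores[0])
    totals = [sum(item[i] for item in item_scores) for i in range(n_resp)]
    return k / (k - 1) * (1 - sum(_sample_var(i) for i in item_scores)
                          / _sample_var(totals))

def cohens_d(a, b):
    """Standardized mean difference with the pooled SD in the denominator."""
    pooled_sd = math.sqrt(((len(a) - 1) * _sample_var(a) + (len(b) - 1) * _sample_var(b))
                          / (len(a) + len(b) - 2))
    return (sum(a) / len(a) - sum(b) / len(b)) / pooled_sd
```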


Results

Ninety-five of 260 PCPs enrolled in the trial. Participants’ baseline demographic characteristics and attrition over the course of the trial were similar between randomized cohorts (Table 1 and Figure 2).

Figure 2
CONSORT flow chart of the RCT. All 95 participants were included in the intention-to-treat analysis.
Table 1
Baseline demographic characteristics of randomized cohorts

Improvement in Prostate-Specific Antigen Screening Knowledge

In the spaced education cohort, the four cycles of nine e-mails were completed by 84%, 88%, 94%, and 92% of clinicians, respectively. Spaced education clinicians reported spending a median 2.5 minutes (interquartile range [IQR] = 1.5–5.0) to complete each spaced education item. The estimated duration to complete the entire program of 36 e-mails was a median 90 minutes (estimated IQR = 54–180). The percentage of spaced education items answered correctly rose from 72.0% (SD = 20.9) in Cycle 1, to 85.3% (SD = 16.6) in Cycle 2, to 88.3% (SD = 17.2) in Cycle 3 and to 90.1% (SD = 15.5) in Cycle 4 (Figure 3).

Figure 3
Percentage scores on tests and spaced education cycles. Bars represent SE.

All three multiple-choice tests were completed by 92% and 85% of providers in the spaced education and control cohorts, respectively (p=0.23). Clinician loss to test follow-up was not significantly different between cohorts, genders, or provider types. Cronbach’s α reliability (internal consistency) of the test instrument was 0.71. Test-1 scores (Week 0) were similar between cohorts (M = 72%, p=0.89). Mean test scores in the spaced education cohort were 96% (SD=8) and 95% (SD=8) for Tests 2 (Week 18) and 3 (Week 36), respectively. Corresponding test scores in the control cohort were 73% (SD=20) and 76% (SD=17) (p<0.001 for both cross-cohort comparisons; Figure 3). The cross-cohort differences represent Cohen effect sizes of 1.2 and 1.1 for Tests 2 and 3, respectively.30

Improvement in Prostate-Specific Antigen Screening Behavior

During the 36-week intervention period, clinicians in the spaced education cohort ordered fewer inappropriate PSA screening tests compared to control clinicians (762 vs 1145, respectively). Spaced education clinicians also ordered fewer PSA screening tests overall compared to clinicians in the control cohort (7244 vs 8173, respectively). In logistic regression models, the percentage of inappropriate PSA screening was significantly reduced among spaced education clinicians compared to controls (10.5% vs 14.2%, respectively, p=0.041). This between-cohort difference in the percentage of inappropriate PSA screening increased significantly during the intervention period (p=0.019 for interaction between cohort and time; Figure 4). The impact of the intervention was unaffected by clinicians’ age, gender, or provider type.

Figure 4
Impact of spaced education on inappropriate PSA screening. The percentages of inappropriate PSA screening during and after the spaced education intervention are represented by the gray and black triangles, respectively. The USPSTF released its recommendation ...

In the 72-week follow-up period (Weeks 37–108), clinicians in the spaced education cohort continued to order fewer inappropriate PSA screening tests (1028 vs 1906 by control clinicians) and fewer PSA screening tests overall (13,089 vs 14,488 by controls). During this period, the percentage of inappropriate PSA screening remained significantly lower among spaced education clinicians compared to controls (7.8% vs 13.1%, respectively, p=0.011), representing a 40% relative reduction in inappropriate screening. The screening differences between cohorts did not erode significantly during the 72-week follow-up period (p=0.63 for interaction between cohort and time). In an exploratory analysis, no significant change in the percentage of inappropriate screening was identified in the 26 weeks following publication of the USPSTF CPG statement (p=0.38).


Discussion

In the present RCT, spaced education significantly reduced clinicians’ inappropriate screening for prostate cancer and aligned their clinical practice patterns more closely with CPG standards. In addition, the current study is the first to show that online education can produce demonstrable improvements in clinical practice that persist for more than 1 year after the intervention.31 The fact that this modest intervention (36 interactive e-mails over a 36-week period) generated such substantial results suggests that spaced education is a potent methodology for CME. With content tailored to meet specific needs, spaced education is the type of intervention that can be deployed across healthcare systems to improve the quality of patient care.

An intriguing finding from the present study is that all PSA screening declined among clinicians in the intervention cohort. Given the limited data supporting a mortality benefit for PSA screening in men of any age,32,33 it is not clear whether this reduction in overall PSA screening is a benefit or detriment to patient care. Because there is no CPG consensus as to what constitutes “appropriate” PSA screening, one cannot conclude that a reduction in overall screening represents a decline in appropriate screening. Rather, this finding suggests that spaced education clinicians learned to utilize PSA screening more judiciously among all of their male patients regardless of patient age.

The current study raises some important but unanswered questions. First, it is not clear how spaced education produced substantial improvements in PSA screening behaviors despite clinicians’ high level of baseline knowledge of CPGs (mean baseline test score 72%). The fact that spaced education was effective suggests that it does more than improve knowledge; we hypothesize that it may also improve the translation of knowledge into clinical practice and/or reduce providers’ clinical inertia in adhering to screening CPGs. Second, the educational benefit of Cycles 3 and 4 is not clear, given that clinicians’ spaced education performance appeared to plateau after Cycle 2 (Figure 3). However, overlearning (the repeated presentation of content after mastery has been achieved) significantly improves the long-term retention of learning.34 It is also possible that once saturation of learning is reached, the spaced education intervention becomes more of a reminder system than an educational tool. Third, it is not clear how long the improvements in practice patterns will persist beyond the 72-week follow-up period. Still untested is whether a maintenance program of spaced education with substantially increased spacing intervals could prevent regression back to baseline practices. Finally, the optimal frequency, spacing intervals, content length, and duration for spaced education programs have yet to be established. The expanding pattern of spacing intervals used in the present study has been demonstrated35 to improve retention of learning compared with fixed intervals, although recent data36 have challenged these findings. Fatigue from spaced education e-mails is a potential concern, but spaced education programs that deliver more content more frequently (e.g., daily e-mails containing two questions) are well accepted by physicians.25

The present study also adds to growing evidence8 that CPG publication in and of itself may have little to no impact on clinical practice. In the exploratory analysis,7 the much-heralded publication of the USPSTF guidelines in Week 82 of the current study did not demonstrably change clinicians’ screening patterns. By facilitating the translation of CPGs into improved clinical care, spaced education offers a potential solution to this problem.

There are several strengths to the study, including the novelty of the educational intervention, the randomized controlled design, the inclusion of multiple provider types, and the study’s focus on long-term educational and behavioral outcomes. Limitations of the present study include the narrow focus of the spaced education intervention on inappropriate PSA screening and the binary outcomes measure (screened inappropriately or not). Further work is needed to establish spaced education’s efficacy in other clinical domains, for clinical decisions of greater complexity, and in healthcare systems other than VA hospitals.

In summary, this RCT demonstrated that spaced education can durably improve clinicians’ knowledge of and adherence to PSA screening CPGs. Spaced education represents a promising new methodology to deliver online education across healthcare systems to improve the quality of patient care.

Supplementary Material


This study was supported by the Research Career Development Award Program and research grants TEL-02-100 and IIR-04-045 from the Veterans Affairs Health Services Research & Development Service, the U.S. Agency for Healthcare Research and Quality, the American Urological Association Foundation (Linthicum MD), Astellas Pharma U.S., Inc., Wyeth, Inc., and the NIH (K24 DK63214 and R01 HL77234).

Appendix Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.amepre.2010.07.016.


We recognize the invaluable work of Ronald Rouse, Jason Alvarez, and David Bozzi of the Harvard Medical School Center for Educational Technology in developing the spaced education online delivery platform utilized in this trial. The trial is registered at ClinicalTrials.gov under number NCT01168323, “Spaced Education to Optimize Prostate Cancer Screening.”

The views expressed in this article are those of the authors and do not necessarily reflect the position and policy of the U.S. Federal Government or the Department of Veterans Affairs. No official endorsement should be inferred.

Dr. Kerfoot is an equity owner and director of Spaced Education Inc, a start-up company launched by Harvard University to develop the spaced education methodology outside of its firewalls. Although Dr. Kerfoot is the inventor listed on the patent application submitted by Harvard, the University will own the patent if granted.

No other financial disclosures were reported by the authors of this paper.


1. National Comprehensive Cancer Network. Prostate Cancer Early Detection (Version 1. 2005)
2. U.S. Preventive Services Task Force. Screening for prostate cancer: recommendations and rationale. Am Fam Physician. 2003;67(4):787–92. [PubMed]
3. Smith RA, Cokkinides V, Eyre HJ. American Cancer Society guidelines for the early detection of cancer, 2005. CA Cancer J Clin. 2005;55(1):31–44. quiz 55–6. [PubMed]
4. American Urological Association. Prostate-specific antigen (PSA) best practice policy. Oncology (Williston Park) 2000;14(2):267–72. 277–8, 280. passim. [PubMed]
5. Walter LC, Bertenthal D, Lindquist K, Konety BR. PSA screening among elderly men with limited life expectancies. JAMA. 2006;296(19):2336–42. [PubMed]
6. Kerfoot BP, Holmberg EF, Lawler EV, Krupat E, Conlin PR. Practitioner-level determinants of inappropriate prostate-specific antigen screening. Arch Intern Med. 2007;167(13):1367–72. [PubMed]
7. Screening for prostate cancer: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2008;149(3):185–91. [PubMed]
8. Cabana MD, Rand CS, Powe NR, et al. Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282(15):1458–65. [PubMed]
9. IOM. Redesigning continuing education in the health professions. Washington DC: National Academies Press; 2010.
10. Holmboe ES, Lipner R, Greiner A. Assessing quality of care: knowledge matters. JAMA. 2008;299(3):338–40. [PubMed]
11. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep) 2007;(149):1–69. [PubMed]
12. Davis D, O’Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282(9):867–74. [PubMed]
13. Curran VR, Fleet L. A review of evaluation outcomes of web-based continuing medical education. Med Educ. 2005;39(6):561–7. [PubMed]
14. Bjork RA. Retrieval practice and the maintenance of knowledge. In: Gruneberg MM, Morris PE, Sykes RN, editors. Practical aspects of memory: current research and issues. New York: Wiley; 1988. pp. 396–401.
15. Glenberg AM, Lehmann TS. Spacing repetitions over 1 week. Mem Cognit. 1980;8(6):528–38. [PubMed]
16. Pashler H, Rohrer D, Cepeda NJ, Carpenter SK. Enhancing learning and retarding forgetting: choices and consequences. Psychon Bull Rev. 2007;14(2):187–93. [PubMed]
17. Sisti HM, Glass AL, Shors TJ. Neurogenesis and the spacing effect: learning over time enhances memory and the survival of new neurons. Learn Mem. 2007;14(5):368–75. [PubMed]
18. Pagani MR, Oishi K, Gelb BD, Zhong Y. The phosphatase SHP2 regulates the spacing effect for long-term memory induction. Cell. 2009;139(1):186–98. [PMC free article] [PubMed]
19. Roediger HL, Karpicke JD. Test-enhanced learning: taking memory tests improves long-term retention. Psychol Sci. 2006;17(3):249–55. [PubMed]
20. Karpicke JD, Roediger HL., 3rd The critical importance of retrieval for learning. Science. 2008;319(5865):966–8. [PubMed]
21. Kerfoot BP, Armstrong EG, O’Sullivan PN. Interactive spaced-education to teach the physical examination: a randomized controlled trial. J Gen Intern Med. 2008;23(7):973–8. [PMC free article] [PubMed]
22. Kerfoot BP, Brotschi E. Online spaced education to teach urology to medical students: a multi-institutional randomized trial. Am J Surg. 2009;197(1):89–95. [PubMed]
23. Kerfoot BP. Learning benefits of on-line spaced education persist for 2 years. J Urol. 2009;181(6):2671–3. [PubMed]
24. Kerfoot BP, DeWolf WC, Masser BA, Church PA, Federman DD. Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med Educ. 2007;41(1):23–31. [PubMed]
25. Kerfoot BP, Kearney MC, Connelly D, Ritchey ML. Interactive spaced education to assess and improve knowledge of clinical practice guidelines: a randomized controlled trial. Ann Surg. 2009;249(5):744–9. [PubMed]
26. Cepeda NJ, Vul E, Rohrer D, Wixted JT, Pashler H. Spacing effects in learning: a temporal ridgeline of optimal retention. Psychol Sci. 2008;19(11):1095–102. [PubMed]
27. Arias E. U.S. life tables, 2003. Natl Vital Stat Rep. 2006;54(14):1–40. [PubMed]
28. Diggle PJ, Heagerty P, Liang K, Zeger S. Analysis of longitudinal data. New York: Oxford University Press; 2002.
29. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16:297–334.
30. Cohen J. Statistical power analysis for the behavioral sciences. 2. Hillsdale, NJ: Lawrence Erlbaum; 1988.
31. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–96. [PubMed]
32. Andriole GL, Crawford ED, Grubb RL, III, et al. Mortality results from a randomized prostate-cancer screening trial. N Engl J Med. 2009;360(13):1310–9. [PMC free article] [PubMed]
33. Schroder FH, Hugosson J, Roobol MJ, et al. Screening and prostate-cancer mortality in a randomized European study. N Engl J Med. 2009;360(13):1320–8. [PubMed]
34. Driskell JE, Willis RP, Copper C. Effect of overlearning on retention. J Appl Psychol. 1992;77(5):615–22.
35. Landauer TK, Bjork RA. Optimum rehearsal patterns and name learning. In: Gruneberg MM, Morris PE, Sykes RN, editors. Practical aspects of memory. New York: Academic Press; 1978. pp. 625–32.
36. Logan JM, Balota DA. Expanded vs. equal interval spaced retrieval practice: exploring different schedules of spacing and retention interval in younger and older adults. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn. 2008;15(3):257–80. [PubMed]