J Gen Intern Med. Oct 2012; 27(10): 1300–1307.
Published online May 8, 2012. doi:  10.1007/s11606-012-2079-4
PMCID: PMC3445686
Literacy, Cognitive Function, and Health: Results of the LitCog Study
Michael S. Wolf, PhD MPH,1,2 Laura M. Curtis, MS,1 Elizabeth A. H. Wilson, PhD,1 William Revelle, PhD,3 Katherine R. Waite, BA,1,2 Samuel G. Smith, MSc,4 Sandra Weintraub, PhD,5 Beth Borosh, PhD,5 David N. Rapp, PhD,2,3 Denise C. Park, PhD,6 Ian C. Deary, PhD,7 and David W. Baker, MD MPH1
1Health Literacy and Learning Program, Division of General Internal Medicine, Feinberg School of Medicine at Northwestern University, 750 N. Lake Shore Drive, 10th Floor, Chicago, IL 60611 USA
2Department of Learning Sciences, School of Education and Social Policy, Northwestern University, Evanston, IL USA
3Department of Psychology, Northwestern University, Evanston, IL USA
4Health Behaviour Research Centre, Department of Psychology, University College London, London, United Kingdom
5Department of Psychiatry & Behavioral Sciences, Feinberg School of Medicine at Northwestern University, Chicago, IL USA
6Center for Vital Longevity, University of Texas at Dallas, Dallas, USA
7Centre for Cognitive Ageing and Cognitive Epidemiology, Department of Psychology, University of Edinburgh, Edinburgh, United Kingdom
Corresponding author: Michael S. Wolf; Phone: +1-312-5035592; Fax: +1-312-5032777; mswolf@northwestern.edu.
Received January 27, 2012; Revised March 16, 2012; Accepted April 6, 2012.
BACKGROUND
Emerging evidence suggests the relationship between health literacy and health outcomes could be explained by cognitive abilities.
OBJECTIVE
To investigate to what degree cognitive skills explain associations between health literacy, performance on common health tasks, and functional health status.
DESIGN
Two face-to-face, structured interviews spaced a week apart with three health literacy assessments and a comprehensive cognitive battery measuring ‘fluid’ abilities necessary to learn and apply new information, and ‘crystallized’ abilities such as background knowledge.
SETTING
An academic general internal medicine practice and three federally qualified health centers in Chicago, Illinois.
PATIENTS
Eight hundred and eighty-two English-speaking adults ages 55 to 74.
MEASUREMENTS
Health literacy was measured using the Rapid Estimate of Adult Literacy in Medicine (REALM), Test of Functional Health Literacy in Adults (TOFHLA), and Newest Vital Sign (NVS). Performance on common health tasks was globally assessed and categorized as 1) comprehending print information, 2) recalling spoken information, 3) recalling multimedia information, 4) dosing and organizing medication, and 5) healthcare problem-solving.
RESULTS
Health literacy measures were strongly correlated with fluid and crystallized cognitive abilities (range: r = 0.57 to 0.77, all p < 0.001). Lower health literacy and weaker fluid and crystallized abilities were associated with poorer performance on healthcare tasks. In multivariable analyses, the association between health literacy and task performance was substantially reduced once fluid and crystallized cognitive abilities were entered into models (without cognitive abilities: β = −28.9, 95 % Confidence Interval (CI) −31.4 to −26.4; with cognitive abilities: β = −8.5, 95 % CI −10.9 to −6.0).
LIMITATIONS
Cross-sectional analyses, English-speaking, older adults only.
CONCLUSIONS
The most common measures used in health literacy studies detect individual differences in cognitive abilities, which may predict one’s capacity to engage in self-care and achieve desirable health outcomes. Future interventions should respond to all of the cognitive demands patients face in managing health, beyond reading and numeracy.
KEY WORDS: health literacy, cognitive abilities, health tasks, patient-reported outcomes, physical health, mental health
INTRODUCTION
The relationship between adult literacy skills, health knowledge, behaviors, and clinical outcomes has been repeatedly investigated.1–3 More than 500 research publications have demonstrated associations between crude measures of reading and numeracy skills and various health-related outcomes, including risk of hospitalization and mortality.4–6 This has been the foundation for the field now known as ‘health literacy’ research.
Despite more expansive and accepted definitions, the problem of low health literacy has often been characterized as difficulties in reading and math skills. Early studies therefore responded by re-writing health materials at a simpler level or following other design principles to enhance reading comprehension, an approach found to have limited success.7,8 Still lacking a deeper understanding of the problem, recent investigations have tested comprehensive strategies with more promising results.9–11 However, as these were multi-faceted interventions targeting system complexity, it is difficult to isolate the true reason for improvement.
The role of the patient requires more than the ability to read and manipulate numbers when managing health and navigating the healthcare system.12 A global set of cognitive skills is necessary to access health services, process and comprehend text and numerical information, orally express oneself, understand and recall spoken instructions, make inferences, utilize technology, critically weigh options and make decisions, and sustain often complex behaviors. Several studies over the past two decades allude to this, as they have reported strong associations between a broader set of cognitive abilities and the aforementioned health outcomes associated with literacy skills.13–17 In general, poorer cognitive performance has been linked to less health knowledge, poorer medication adherence, worse physical and mental health, and greater mortality risk.18–22
A significant relationship is already known to exist between literacy and cognitive skills.23 In fact, there is a small but growing body of literature documenting associations between measures of literacy, health literacy, and limited sets of cognitive abilities.15,24–27 These studies support the premise that the impact of health literacy on outcomes might be explained by a wide array of cognitive domains. It is imperative to explore these links, as this will expand our thinking on the true nature of the problem and improve our ability to develop more effective strategies for identifying and responding to individuals who will struggle to learn and apply health information.28
Our research team launched a National Institute on Aging study (“Health Literacy and Cognitive Function among Older Adults”; R01 AG030611), herein referred to as ‘LitCog’. We recruited a cohort of older patients in order to investigate the association between health literacy and specific domains of cognitive function. Another objective was to examine to what degree cognitive abilities explained associations between health literacy and performance on health tasks common for self-care. We targeted widely-used assessments of literacy in healthcare: the Rapid Estimate of Adult Literacy in Medicine (REALM),30 the Test of Functional Health Literacy in Adults (TOFHLA),29 and the Newest Vital Sign (NVS).31 The examination of these assessments, coupled with a detailed battery of cognitive tests, afforded the unique opportunity to determine crucial, understudied relationships between literacy skills and general cognitive functioning.
METHODS
Sample
English-speaking adults ages 55 to 74 who received care at an academic general internal medicine ambulatory care clinic or one of four federally qualified health centers in Chicago were recruited from August 2008 through October 2010. A total of 3176 patients were identified through electronic health records as initially eligible by age, notified of the study by mail, and contacted via phone. A total of 1904 eligible patients were invited to participate. Initial screening deemed 244 subjects ineligible due to severe cognitive or hearing impairment, limited English proficiency, or not being connected to a clinic physician (defined as < 2 visits in two years). In addition, 794 refused, 14 were deceased, and 20 were eligible but had scheduling conflicts. The final sample included 832 participants, for a cooperation rate of 51 percent calculated following American Association for Public Opinion Research guidelines.32
Procedure
Subjects completed two structured interviews, 7–10 days apart, each lasting 2.5 hours. A trained research assistant guided patients through a series of assessments that, on Day 1, included basic demographic information, socioeconomic status, comorbidity, the three health literacy measures, and an assessment of performance on everyday health tasks. On Day 2, patients were administered a cognitive battery to measure processing speed, working memory, inductive reasoning, long-term memory, prospective memory, and verbal ability (see Table 1).33–42 Multiple tests were used for each cognitive domain, allowing a latent trait to be extracted. Northwestern University’s Institutional Review Board approved the study.
Table 1. Description of Health Literacy and Cognitive Measures
Measures
Health Literacy
Health literacy was assessed by the TOFHLA, REALM, and NVS.29–31 The TOFHLA uses actual materials patients might encounter in healthcare to test reading fluency.29 A reading comprehension section includes 50 items that use the Cloze procedure; every fifth to seventh word in a passage is omitted and four multiple-choice options are provided. The numeracy section includes 17 items to assess comprehension of labeled prescription vials, an appointment slip, a chart describing eligibility for financial aid, and an example of results from a medical test. Scores are classified as inadequate (0–59), marginal (60–74), or adequate (75–100).
The REALM is a word-recognition test comprising 66 health-related words arranged in order of increasing difficulty.30 Patients are asked to read aloud as many words as they can. Scores are based on the total number of words pronounced correctly (dictionary pronunciation is the scoring standard) and are interpreted as low (0–44), marginal (45–60), or adequate (61–66). The TOFHLA and REALM are the most common measures of literacy used in healthcare research.43 Finally, the NVS is a screening tool used to determine risk for limited health literacy.31 Patients are given a copy of a nutrition label and asked six questions about how they would interpret and act on the information. Scores are classified as a high likelihood (0–1) or possibility (2–3) of limited literacy, or adequate literacy (4–6).
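To make these scoring rules concrete, the short sketch below maps raw scores on each instrument to the categories described above. It is only an illustration of the published cut-points; the function names are ours, not part of any instrument.

def categorize_tofhla(score):
    # TOFHLA total (0-100): inadequate 0-59, marginal 60-74, adequate 75-100.
    if score <= 59:
        return "inadequate"
    return "marginal" if score <= 74 else "adequate"

def categorize_realm(score):
    # REALM total (0-66 words read correctly): low 0-44, marginal 45-60, adequate 61-66.
    if score <= 44:
        return "low"
    return "marginal" if score <= 60 else "adequate"

def categorize_nvs(score):
    # NVS total (0-6 correct): 0-1 high likelihood of limited literacy,
    # 2-3 possibility of limited literacy, 4-6 adequate literacy.
    if score <= 1:
        return "high likelihood of limited literacy"
    return "possibility of limited literacy" if score <= 3 else "adequate"

# Example: a participant scoring 72 on the TOFHLA is classified as marginal.
assert categorize_tofhla(72) == "marginal"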
Cognitive Abilities
A set of 16 cognitive tests was used to assess six cognitive domains and to derive latent traits for them (Table 1). Cognitive abilities were broadly classified as fluid (processing speed, working memory, inductive reasoning, long-term memory, prospective memory) or crystallized (verbal ability). Fluid abilities refer to cognitive traits associated with active information processing in which prior knowledge is of relatively little help, whereas crystallized abilities embody stored information in long-term memory, or general background knowledge. With the exception of verbal ability, we included tests that were independent of reading skills.
Performance on Everyday Health Tasks
The Comprehensive Health Activities Scale (CHAS) was developed for the LitCog study, and its psychometric properties are described elsewhere.44 Ten health scenarios involving print, video, and spoken health communications, as well as common ‘artifacts’ requiring navigation (pill bottles and labels), are presented to patients, followed by a series of questions asking them to demonstrate comprehension and/or use of the material and artifacts. This methodology was adapted from prior studies by the research team, cognitive psychology approaches, and national literacy and health literacy assessments.45–47 In brief, an initial pool of 150 items was developed across the scenarios and 80 items were kept in the final assessment; all loaded on one common latent variable, resulting in 94 % reliable variance (Ω total).48,49 Scores were standardized (0–100), and item subscales were created, including: 1) comprehending print information (9 items), 2) recalling spoken information (11 items), 3) recalling multimedia information (20 items), 4) organizing and dosing medication (18 items), and 5) healthcare problem-solving (19 items). Higher scores indicate better performance on the specified health tasks. Internal consistency was high for all categories (Cronbach’s α = 0.73, 0.63, 0.78, 0.76, 0.76, respectively).
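As a rough illustration of the reliability statistics reported for the CHAS, the sketch below computes Cronbach’s α for an item set and McDonald’s ω total under a one-factor model, using simulated data. It is a generic sketch of the formulas cited,48,49 not the authors’ code.

import numpy as np
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items):
    # items: (n_participants, n_items) matrix of scored responses.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def omega_total_one_factor(items):
    # McDonald's omega under a one-factor model:
    # (sum of loadings)^2 / ((sum of loadings)^2 + sum of unique variances).
    fa = FactorAnalysis(n_components=1).fit(items)
    loadings = np.abs(fa.components_[0])
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + fa.noise_variance_.sum())

# Toy data: eight items driven by one latent trait (illustration only).
rng = np.random.default_rng(0)
trait = rng.normal(size=(500, 1))
items = trait + rng.normal(scale=1.0, size=(500, 8))
print(cronbach_alpha(items), omega_total_one_factor(items))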
Analysis Plan
Descriptive statistics were calculated for each variable. ANOVA was used to compare mean performance on health tasks and functional health status by health literacy categories. Pearson product–moment (TOFHLA, REALM) and Spearman (NVS) correlations were used to examine associations between health literacy measures and cognitive tests.
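A minimal sketch of these analytic choices, using simulated data and hypothetical variable names (tofhla, nvs, processing_speed, task), might look as follows.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
tofhla = rng.uniform(0, 100, n)                 # continuous score: Pearson
nvs = rng.integers(0, 7, n)                     # 0-6 ordinal score: Spearman
processing_speed = 0.4 * tofhla + rng.normal(scale=10, size=n)
task = 0.5 * tofhla + rng.normal(scale=15, size=n)

r, p = stats.pearsonr(tofhla, processing_speed)
rho, p_rho = stats.spearmanr(nvs, processing_speed)

# One-way ANOVA comparing mean task performance across TOFHLA categories.
groups = [task[tofhla <= 59], task[(tofhla > 59) & (tofhla <= 74)], task[tofhla > 74]]
F, p_anova = stats.f_oneway(*groups)
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}, ANOVA F = {F:.2f}")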
Domain-specific and general cognitive ability (crystallized, fluid) scores were created to reduce cognitive skills to one measure per category and to avoid multicollinearity in subsequent regression models. Univariate imputation sampling methods were used to estimate any missing values (n = 98) on cognitive measures by regressing each variable on age and variables from the same cognitive category in a bootstrapped sample of non-missing observations. The category-specific and domain summary scores were then calculated by estimating a single factor score with maximum likelihood estimation.
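A rough sketch of this two-step approach is shown below: each missing cognitive score is predicted from age and other tests in the same domain, fit on a bootstrap sample of complete cases, and a one-factor maximum-likelihood model then yields a single summary score. The column and function names are hypothetical, and the scikit-learn factor model stands in for whatever software the authors used.

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import FactorAnalysis

def impute_from_domain(df, target, predictors, rng):
    # Fill missing values of `target` by regressing it on `predictors`
    # (e.g., age plus same-domain tests), fit on a bootstrap sample of complete cases.
    complete = df.dropna(subset=predictors + [target])
    seed = int(rng.integers(0, 2**31 - 1))
    boot = complete.sample(n=len(complete), replace=True, random_state=seed)
    model = LinearRegression().fit(boot[predictors], boot[target])
    filled = df[target].copy()
    missing = filled.isna() & df[predictors].notna().all(axis=1)
    filled.loc[missing] = model.predict(df.loc[missing, predictors])
    return filled

def single_factor_score(df, tests):
    # One-factor maximum-likelihood score summarizing a set of cognitive tests.
    z = (df[tests] - df[tests].mean()) / df[tests].std(ddof=0)
    return FactorAnalysis(n_components=1).fit_transform(z).ravel()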
Multivariable linear regression models were used to examine the independent associations of health literacy and fluid or crystallized cognitive abilities with overall performance on health tasks. Age, gender, race, and number of comorbid chronic conditions were included in all models as covariates. A final model included all three variables, and the extent to which the effect of health literacy was attenuated by cognitive abilities was then evaluated. The Vuong test, a likelihood-ratio based approach for non-nested models, was used to determine whether the variance explained by models (R²) significantly changed when health literacy, fluid, or crystallized abilities were included or omitted.50 Analyses were performed using Stata 11.2 (StataCorp, College Station, TX).
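The published analysis was run in Stata; the sketch below reproduces the same logic in Python with simulated data and a reduced covariate set: fit the adjusted models, compute the percent attenuation of the health literacy coefficient, and compare non-nested models with a basic Vuong z-statistic.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

def percent_attenuation(beta_without, beta_with):
    # Shrinkage of the health literacy coefficient once cognitive abilities are
    # added, e.g. (28.9 - 8.5) / 28.9 = 70.6 %.
    return 100 * (abs(beta_without) - abs(beta_with)) / abs(beta_without)

def vuong_z(fit_a, fit_b):
    # Vuong (1989) z-statistic for two non-nested OLS models of the same outcome,
    # based on per-observation Gaussian log-likelihood differences.
    def pointwise_loglik(fit):
        resid = np.asarray(fit.resid)
        sigma2 = (resid ** 2).mean()  # maximum-likelihood error variance
        return -0.5 * (np.log(2 * np.pi * sigma2) + resid ** 2 / sigma2)
    m = pointwise_loglik(fit_a) - pointwise_loglik(fit_b)
    z = np.sqrt(len(m)) * m.mean() / m.std(ddof=1)
    return z, 2 * stats.norm.sf(abs(z))  # two-sided p-value

# Simulated stand-in for the analysis dataset (illustration only).
rng = np.random.default_rng(2)
n = 500
cognition = rng.normal(size=n)
df = pd.DataFrame({
    "age": rng.integers(55, 75, n),
    "comorbidity": rng.poisson(2, n),
    "fluid": cognition + rng.normal(scale=0.5, size=n),
    "crystallized": cognition + rng.normal(scale=0.5, size=n),
    "literacy": cognition + rng.normal(scale=0.7, size=n),
})
df["task"] = 10 * cognition + 2 * df["literacy"] + rng.normal(scale=5, size=n)

covars = "age + comorbidity"  # the study also adjusted for gender and race
m_lit = smf.ols(f"task ~ literacy + {covars}", data=df).fit()
m_cog = smf.ols(f"task ~ fluid + crystallized + {covars}", data=df).fit()
m_both = smf.ols(f"task ~ literacy + fluid + crystallized + {covars}", data=df).fit()

print("attenuation (%):",
      percent_attenuation(m_lit.params["literacy"], m_both.params["literacy"]))
print("Vuong z, p (literacy-only vs. cognition-only):", vuong_z(m_lit, m_cog))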
RESULTS
The sample is described in Table 2. Participants were socially and economically diverse by years of schooling, household income, employment, marital status, and living situation. Individuals on average had two chronic conditions (M = 1.9, SD = 1.4) and were taking a mean of 3.6 prescription medications (SD = 3.1). According to the TOFHLA, estimates of marginal and inadequate health literacy were 16.8 % and 12.5 %, respectively. This compared with 15.4 % and 8.9 % as determined by the REALM, and 22.9 % and 28.9 % according to the NVS. Across all measures, lower health literacy was associated with older age, African-American race, less education, and lower household income.
Table 2. Characteristics of Sample (N = 784)
Correlations among the three health literacy measures were 0.76 (TOFHLA–REALM), 0.62 (TOFHLA–NVS), and 0.47 (NVS–REALM); all p < 0.001. Health literacy measures were strongly correlated with all cognitive abilities (Table 3). Fluid abilities were more strongly correlated with the TOFHLA and NVS than with the REALM (0.76 and 0.73 vs. 0.57, respectively), and crystallized abilities correlated similarly with all health literacy measures (TOFHLA = 0.77, REALM = 0.74, NVS = 0.71). Fluid and crystallized abilities were also strongly correlated with one another (r = 0.78).
Table 3. Correlations with Cognitive & Health Literacy Tests
In bivariate analyses, lower health literacy was significantly associated with worse performance on healthcare tasks, with a graded decline in performance across decreasing levels of literacy skill (TOFHLA: r = 0.81; REALM: r = 0.68; NVS: r = 0.74; all p < 0.001; Table 4). Associations between fluid and crystallized cognitive abilities and overall task performance were equally strong (r = 0.84 for both sets of abilities; p < 0.001).
Table 4. Performance on Health-Related Tasks and Functional Health Status by Health Literacy Measures
In multivariable models, inadequate health literacy, whether measured by the TOFHLA, REALM, or NVS, was a significant independent predictor of worse overall task performance (Table 5). Similarly, both fluid and crystallized cognitive abilities were significantly associated with the outcome. When both were entered into models that included health literacy, the relationship between health literacy and task performance was attenuated by 70.6 % for the TOFHLA (without cognitive abilities: β = −28.9, 95 % Confidence Interval (CI) −31.4 to −26.4; with cognitive abilities: β = −8.5, 95 % CI −10.9 to −6.0). This reduction was similar for the REALM (77.7 % attenuation; without cognitive abilities: β = −27.8, 95 % CI −30.8 to −24.7; with cognitive abilities: β = −6.2, 95 % CI −9.0 to −3.4) and NVS (73.4 % attenuation; without cognitive abilities: β = −22.8, 95 % CI −24.9 to −20.7; with cognitive abilities: β = −6.0, 95 % CI −7.9 to −4.1).
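As a quick arithmetic check, the reported attenuation percentages follow (to within rounding) from the rounded coefficients above:

# Reported coefficients without / with cognitive abilities in the model.
betas = {"TOFHLA": (-28.9, -8.5), "REALM": (-27.8, -6.2), "NVS": (-22.8, -6.0)}
for measure, (without, with_cog) in betas.items():
    attenuation = 100 * (abs(without) - abs(with_cog)) / abs(without)
    print(f"{measure}: {attenuation:.1f} % attenuation")
# Prints roughly 70.6 %, 77.7 %, and 73.7 %, against reported values of 70.6 %,
# 77.7 %, and 73.4 %; the small NVS gap presumably reflects coefficient rounding.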
Table 5. Multivariable Models of Health Literacy, Cognitive Abilities, and Health Task Performance
Independent associations between fluid and crystallized abilities and task performance were reduced by approximately half when both were included together in the models, and health literacy by any measure provided little to no further attenuation of these relationships when added to the models. The variance explained by multivariable models was significantly greater with fluid and/or crystallized cognitive abilities than with health literacy as measured by the TOFHLA (R² = 0.73 and 0.74 vs. R² = 0.66, both p < 0.001; combined R² = 0.80, p < 0.001). However, the model including health literacy and both fluid and crystallized abilities (R² = 0.82) had the greatest explanatory power, exceeding models including only health literacy (p < 0.001) or only fluid and crystallized abilities (p = 0.003). This was also true for the REALM and NVS, with health-literacy-only models explaining 60 % and 62 % of the variance, respectively. Including fluid and/or crystallized abilities significantly increased the variance explained by these models (all p < 0.001, Table 5). When health literacy (measured by the REALM or NVS), fluid abilities, and crystallized abilities were all included in models (REALM: R² = 0.81; NVS: R² = 0.81), the variance explained was greater than in models including only health literacy (both p < 0.001) or only fluid and crystallized abilities (p = 0.01).
DISCUSSION
Health literacy has been the subject of multiple reports, and the U.S. Department of Health and Human Services, Institute of Medicine, and World Health Organization promote improving health literacy as a public health goal.1,51,52 Proposals have even been made recently to recognize low health literacy as a risk factor warranting clinical screening.53–55 Despite this, controversy remains as to the definition of health literacy: whether it is an individual risk factor or asset, a reflection of healthcare providers’ skills and health systems’ accessibility, or all of these.56 Clearly, the term has sparked unprecedented interest in simplifying healthcare and helping individuals manage their health.
The intent of the LitCog study was to revisit the measures that have been used almost exclusively in health literacy research and to better understand the latent psychological traits they evaluate. Our findings strongly suggest that the problem of limited health literacy mostly reflects individual differences across a broad set of cognitive skills that include but are not limited to reading and numeracy. Associations between health literacy and performance on common health tasks were substantially explained by 1) fluid abilities necessary to actively learn and apply new information, and 2) crystallized abilities such as background knowledge.
The TOFHLA and NVS were more strongly correlated with one another and aligned with fluid abilities, whereas REALM associations with the various health tasks were explained more by crystallized abilities. In multivariable models, assessments of fluid and crystallized abilities together with a health literacy test best explained performance on everyday health tasks. There is likely no limited set of skills that can be isolated as most important in managing one’s health. The roles individuals assume in healthcare require reading and numeracy, but also health-related knowledge, speed and efficiency of thought, critical thinking, multi-tasking, and memory, among other abilities. It is therefore not surprising that cognitive traits explained such a large degree of the relationship between health literacy and task performance, or that health literacy measures also provided an independent contribution.
Our findings are limited in that our sample was English-speaking only and predominantly female. We also included more assessments of fluid abilities than of crystallized abilities. Since fluid and crystallized abilities were comparable in explaining health literacy associations, our findings might under-estimate the importance of background knowledge. In addition, performance on everyday health tasks was measured using hypothetical scenarios. Participants might have applied greater effort to tasks that were more salient to their current personal health. However, when scores were compared between those with and without experience of the task or condition in each scenario, no differences were found. LitCog participants are now being followed as a cohort, allowing for opportunities to prospectively study relationships between health literacy, cognitive abilities, and outcomes including risk of hospitalization and mortality.
A general critique of our findings might be that the assessment of task performance is similar to health literacy measures. However, we required individuals to demonstrate functional skills across a wide array of health scenarios beyond solely reviewing print materials, which are the basis of existing health literacy tests. This criticism can also be directed at many seminal health literacy studies that have examined associations with the ability to perform common self-management tasks.4,57–59 In fact, assessments of health literacy closely resemble cognitive tests, supporting the primary assertion of the LitCog study. The most notable similarity can be seen between the REALM and the American National Adult Reading Test (AM-NART); both require individuals to correctly pronounce lists of words (r = 0.73). The strength of correlations among health literacy, cognitive tests, and performance on health tasks should be understandable and expected.
These crude literacy assessments have proven to be useful research tools, and it is possible they may eventually demonstrate equivalent clinical utility. All are highly predictive of an individual’s ability to perform routine healthcare tasks; the choice to use one versus another should depend more on test attributes (e.g., availability in Spanish, time to administer) and less on concerns about misclassification. In this case, the NVS might be a logical choice, as it is as brief as the REALM but available in Spanish, whereas the REALM is not.
Brief measures that assess global cognitive function, such as the Mini-Mental State Examination (MMSE), might also serve as proxies for health literacy. Strong associations between health literacy measures, such as the TOFHLA, and the MMSE have previously been established.25,26,60 One advantage of this approach is that many of these cognitive screeners are already administered in clinical settings. In addition, their face validity makes it less likely that a researcher or clinician would fall victim to superficial interpretations of the problem and its solution. Yet these tools would likely need revised scoring thresholds, as individuals could be free of a clinically defined cognitive impairment yet still have limited health literacy.
The proposition that the most common measures of health literacy are actually crude assessments of general cognitive abilities should not distract attention from broader efforts to redesign health materials, improve clinician communication skills, enhance the navigability of health systems, or engage communities to assume public health roles.27 Our findings affirm the need to help patients build appropriate background knowledge and skills, but also to reduce the cognitive demands that health systems impose through unnecessarily complex health tasks. As a start, health literacy interventions should move beyond plain language approaches and deconstruct the tasks required of patients within a particular healthcare context. Depending on the task, steps could be taken that follow cognitive and human factors principles to improve performance. These might include giving individuals more time to accurately process information, limiting and layering new information to reduce cognitive burden, using increasingly available technologies or ‘external aids’ (e.g., pill box organizers) to enhance recall and prompt health behaviors, or eliminating tasks altogether when the health system can assume responsibility instead.
Future evaluations of interventions should always collect sufficient data to determine if a strategy mitigates the impact of ‘low health literacy’ on outcomes.61 A modified perspective of health literacy that includes an expansive view of cognitive skills necessary to manage health could also inform the development of more precise clinical assessments to identify those at risk.12 Despite calls for clinicians to follow universal precautions and assume all patients may have health literacy concerns, remediating inadequate cognitive skills for self-care might require clinical screening. This would then allow a greater allocation of resources, in terms of education and follow-up, to those struggling to learn and apply health information and instructions.
Acknowledgements
We would like to thank Elizabeth Bojarski, Rachel O’Conor, Emily Ross, and Rina Sobel for their determination and effort in recruiting and collecting data for the LitCog Study. This project was supported by the National Institute on Aging (R01 AG030611; PI: Wolf).
Conflict of Interest
The authors declare that they do not have a conflict of interest.
Funding
This project was supported by the National Institute on Aging (R01 AG030611; PI: Wolf).
REFERENCES
1. Institute of Medicine. Health literacy: A prescription to end confusion. Washington, DC: National Academies Press; 2004.
2. DeWalt DA, Berkman ND, Sheridan S, Lohr KN, Pignone MP. Literacy and health outcomes. J Gen Intern Med. 2004;19:1228–1239. doi: 10.1111/j.1525-1497.2004.40153.x.
3. Wolf MS, Gazmararian JA, Baker DW. Health literacy and functional health status among older adults. Arch Intern Med. 2005;165:1946–1952. doi: 10.1001/archinte.165.17.1946.
4. Baker DW, Parker RM, Williams MV, Clark WS. Health literacy and the risk of hospital admission. J Gen Intern Med. 1998;13:791–798. doi: 10.1046/j.1525-1497.1998.00242.x.
5. Baker DW, Wolf MS, Feinglass J, Thompson JA, Gazmararian JA, Huang J. Health literacy and mortality among elderly persons. Arch Intern Med. 2007;167:1503. doi: 10.1001/archinte.167.14.1503.
6. Sudore RL, Yaffe K, Satterfield S, et al. Limited literacy and mortality in the elderly: the health, aging, and body composition study. J Gen Intern Med. 2006;21:806–812. doi: 10.1111/j.1525-1497.2006.00539.x.
7. Gerber BS, Brodsky IG, Lawless KA, Smolin LI, Arozullah AM, Smith EV, Berbaum ML, Heckerling PS, Eiser AR. Implementation and evaluation of a low-literacy diabetes education computer multimedia application. Diabetes Care. 2005;28:1574–1580. doi: 10.2337/diacare.28.7.1574.
8. Davis TC, Fredrickson DD, Arnold C, Murphy PW, Herbst M, Bocchini JA. A polio immunization pamphlet with increased appeal and simplified language does not improve comprehension to an acceptable level. Patient Educ Couns. 1998;35:25–23. doi: 10.1016/S0738-3991(97)00053-0.
9. Rothman RL, DeWalt DA, Malone R, Bryant B, Shintani A, Crigler B, Weinberger M, Pignone M. Influence of patient literacy on the effectiveness of a primary care-based diabetes disease management program. JAMA. 2004;292:1711–1716. doi: 10.1001/jama.292.14.1711.
10. Pignone M, DeWalt DA, Sheridan S, Berkman N, Lohr KN. Interventions to improve health outcomes for patients with low literacy: a systematic review. J Gen Intern Med. 2005;20:185–192. doi: 10.1111/j.1525-1497.2005.40208.x.
11. Clement S, Ibrahim S, Wolf MS, Rowlands G. Health literacy and complex interventions: a systematic literature review. Patient Educ Couns. 2009;75:340–351. doi: 10.1016/j.pec.2009.01.008.
12. Wolf MS, Wilson EA, Rapp DN, Waite KR, Bocchini MV, Davis TC, Rudd RE. Literacy and learning in healthcare. Pediatrics. 2009;124:275–281. doi: 10.1542/peds.2009-1162C.
13. Deary IJ, Batty D, Gottfredson LS. Human hierarchies, health, and IQ. Science. 2005;309(5735):703. doi: 10.1126/science.309.5735.703.
14. Whalley LJ, Deary IJ. Longitudinal cohort study of childhood IQ and survival up to age 76. BMJ. 2001;322(7290):819. doi: 10.1136/bmj.322.7290.819.
15. Deary IJ, Whiteman MC, Starr JM, Whalley LJ, Fox HC. The impact of childhood intelligence on later life: following up the Scottish mental surveys of 1932 and 1947. J Pers Soc Psychol. 2004;86:130–147. doi: 10.1037/0022-3514.86.1.130.
16. Murray C, Johnson W, Wolf MS, Deary IJ. The association between cognitive ability across the lifespan and health literacy in old age: the Lothian Birth Cohort 1936. Intelligence. 2011;39:178–187.
17. Singh-Manoux A, Ferrie JE, Lynch JW, Marmot M. The role of cognitive ability (intelligence) in explaining the association between socioeconomic position and health: evidence from the Whitehall II prospective cohort study. Am J Epidemiol. 2005;161:831–839. doi: 10.1093/aje/kwi109.
18. Insel K, Morrow D, Brewer B, Figueredo A. Executive function, working memory, and medication adherence among older adults. J Gerontol B Psychol Sci Soc Sci. 2006;61:102–107. doi: 10.1093/geronb/61.2.P102.
19. Stilley CS, Sereika S, Muldoon MF, Ryan CM, Dunbar-Jacob J. Psychological and cognitive function: predictors of adherence with cholesterol lowering treatment. Ann Behav Med. 2004;27:117–124. doi: 10.1207/s15324796abm2702_6.
20. Shen BJ, McCreary CP, Myers HF. Independent and mediated contributions of personality, coping, social support, and depressive symptoms to physical functioning outcome among patients in cardiac rehabilitation. J Behav Med. 2004;27:39–62. doi: 10.1023/B:JOBM.0000013643.36767.22.
21. Batty GD, Mortensen EL, Nybo Andersen AM, Osler M. Childhood intelligence in relation to adult coronary heart disease and stroke risk: evidence from a Danish birth cohort study. Paediatr Perinat Epidemiol. 2005;19:452–459. doi: 10.1111/j.1365-3016.2005.00671.x.
22. Shipley BA, Der G, Taylor MD, Deary IJ. Cognition and all-cause mortality across the entire adult age range: health and lifestyle survey. Psychosom Med. 2006;68:17–24. doi: 10.1097/01.psy.0000195867.66643.0f.
23. Shaywitz SE, Shaywitz BA. Dyslexia (specific reading disability). Biol Psychiatry. 2005;57:1301–1309. doi: 10.1016/j.biopsych.2005.01.043.
24. Levinthal BR, Morrow DG, Tu W, Wu J, Murray MD. Cognition and health literacy in patients with hypertension. J Gen Intern Med. 2008;24:1172–1176. doi: 10.1007/s11606-008-0612-2.
25. Baker DW, Wolf MS, Feinglass J, Thompson JA. Health literacy, cognitive abilities, and mortality among elderly persons. J Gen Intern Med. 2008;23:723–726. doi: 10.1007/s11606-008-0566-4.
26. Federman AD, Sano M, Safran DG, Siu AL, Wolf MS, Halm EA. Health literacy and cognitive performance among older adults. J Am Geriatr Soc. 2009;57:1475–1480. doi: 10.1111/j.1532-5415.2009.02347.x.
27. Wolf MS, Wilson EA, Rapp DN, Waite KR, Bocchini MV, Davis TC, Rudd RE. Literacy and learning in healthcare. Pediatrics. 2009;124:275–281. doi: 10.1542/peds.2009-1162C.
28. Wilson EA, Wolf MS. Working memory capacity and the design of patient education materials. Patient Educ Couns. 2009;74:318–322. doi: 10.1016/j.pec.2008.11.005.
29. Parker RM, Baker DW, Williams MV, Nurss JR. The test of functional health literacy in adults: a new instrument for measuring patients' literacy skills. J Gen Intern Med. 1995;10:537–541. doi: 10.1007/BF02640361.
30. Davis TC, Long SW, Jackson RH, et al. Rapid estimate of adult literacy in medicine: a shortened screening instrument. Fam Med. 1993;25:391.
31. Weiss BD, Mays MZ, Martz W, et al. Quick assessment of literacy in primary care: the newest vital sign. Ann Fam Med. 2005;3:514. doi: 10.1370/afm.405.
32. Standard definitions: Final dispositions of case codes and outcome rates for surveys. 3rd ed. Lenexa, Kansas: AAPOR; 2004.
33. Salthouse TA. What do adult age differences in the Digit Symbol Substitution Test reflect? J Gerontol: Psychol Sci. 1992;47:121–128.
34. Salthouse TA, Babcock RL. Decomposing adult age differences in working memory. Dev Psychol. 1991;27:763–776. doi: 10.1037/0012-1649.27.5.763.
35. Smith A. Symbol digit modalities test. Los Angeles: Western Psychological Services; 1991.
36. Robbins TW, James M, Owen AM, Sahakian BJ, McInnes L, Rabbitt PMA. Cambridge Neuropsychological Test Automated Battery (CANTAB): a factor analytic study of a large sample of normal elderly volunteers. Dementia. 1994;5:266–281.
37. Cherry KE, Park DC. Individual difference and contextual variables influence spatial memory in younger and older adults. Psychol Aging. 1993;8:517–517. doi: 10.1037/0882-7974.8.4.517.
38. Ekstrom RB, French JW, Harman HH. ETS Kit of Factor-Referenced Cognitive Tests. Princeton, NJ: Educational Testing Service; 1976.
39. Raven JC. Standard progressive matrices: sets A, B, C, D and E. San Antonio, TX: Harcourt Assessment; 1976.
40. Grober E, Sliwinski M, Korey SR. Development and validation of a model for estimating premorbid verbal intelligence in the elderly. J Clin Exp Neuropsychol. 1991;13(6):933–949. doi: 10.1080/01688639108405109.
41. Zachary RA. Shipley Institute of Living Scale, Revised Manual. Los Angeles, CA: Western Psychological Services; 1986.
42. Kluger A, Ferris SH, Golomb J, Mittelman MS, Reisberg B. Neuropsychological prediction of decline to dementia in nondemented elderly. J Geriatr Psychiatry Neurol. 1999;12:168. doi: 10.1177/089198879901200402.
43. Measures of Health Literacy: Workshop Summary. Washington, DC: National Academies Press; 2009.
44. Curtis LM, Revelle W, Wilson EAH, Waite KR, Park DC, Baker DW, Wolf MS. Assessment of everyday health tasks for older adults. J Health Commun. Under revision; 2010.
45. Willis SL, Jay GM, Diehl M, Marsiske M. Longitudinal change and prediction of everyday task competence in the elderly. Res Aging. 1992;14:68.
46. Kutner M, Greenberg E, Jin Y, Paulsen C. The Health Literacy of America’s Adults: Results from the 2003 National Assessment of Adult Literacy (NCES 2006–483). U.S. Department of Education. Washington, DC: National Center for Education Statistics; 2006.
47. Kirsch I, Jungeblut A, Jenkins L, Kolstad A. Adult Literacy in America: a first look at the findings of the National Adult Literacy Survey. U.S. Department of Education; 1993.
48. McDonald RP. Test theory: A unified treatment. Mahwah, NJ: L. Erlbaum Associates; 1999.
49. Revelle W, Zinbarg RE. Coefficients alpha, beta, omega and the glb: comments on Sijtsma. Psychometrika. 2009;74:145–154. doi: 10.1007/s11336-008-9102-z.
50. Vuong QH. Likelihood ratio tests for model selection and non-nested hypotheses. Econometrica. 1989;57:307–333. doi: 10.2307/1912557.
51. U.S. Department of Health and Human Services. Healthy People 2020. http://www.healthypeople.gov. Accessed April 6, 2012.
52. Kickbusch I, Maag D. Health literacy. In: Heggenhougen K, Quah S, editors. International Encyclopedia of Public Health. San Diego: Academic Press; 2008. pp. 204–211.
53. Paasche-Orlow MK, Wolf MS. Evidence does not support clinical screening of literacy. J Gen Intern Med. 2008;23:100–102. doi: 10.1007/s11606-007-0447-2.
54. Johnson K, Weiss BD. How long does it take to assess literacy skills in clinical practice? J Am Board Fam Med. 2008;21:211–214.
55. Ryan JG, Leguen F, Weiss BD, Albury A, Jennings T, Velez F, Salibi N. Will patients agree to have their literacy skills assessed in clinical practice? Health Educ Res. 2008;23:603–611. doi: 10.1093/her/cym051.
56. Nutbeam D. The evolving concept of health literacy. Soc Sci Med. 2008;67:2072–2078. doi: 10.1016/j.socscimed.2008.09.050.
57. Davis TC, Wolf MS, Bass PF, Tilson H, Neuberger M, Parker RM. Literacy and misunderstanding of prescription drug labels. Ann Intern Med. 2006;145:887–894.
58. Rothman R, Housman T, Weiss H, et al. Patient understanding of food labels: the role of literacy and numeracy. Am J Prev Med. 2006;31:391–398. doi: 10.1016/j.amepre.2006.07.025.
59. Czaja SJ, Sharit J, Nair SN. Usability of the Medicare health web site. JAMA. 2008;300:790–792.
60. Baker DW, Gazmararian JA, Sudano J, Patterson M, Parker RM, Williams MV. Health literacy and performance on the Mini-Mental State Examination. Aging Ment Health. 2002;6:22–29. doi: 10.1080/13607860120101121.
61. Isham G. Opportunity at the Intersection of Quality Improvement, Disparities Reduction, and Health Literacy. In: Toward Health Care Equity and Patient-Centeredness: Institute of Medicine Workshop Summary; 2009.