Racial/ethnic differences with regard to complementary and alternative medicine (CAM) use have been reported in the US. However, specific details of CAM use by African Americans with rheumatoid arthritis (RA) are lacking.
Data were collected from African Americans with RA enrolled in a multicenter registry regarding the use of CAM, including food supplements, topical applications, activities, and alternative care providers. Factors associated with CAM use by sex and disease duration were assessed using t-test, Wilcoxon’s rank sum test, chi-square test, and logistic regression analyses.
Of the 855 participants, 85% were women, and mean age at enrollment was 54 years. Overall, 95% had ever used a CAM treatment, 98% a CAM activity, and 51% an alternative care provider (medians of 3 treatments, 5 activities, and 1 provider). Those with longer disease duration (>2 years) were significantly more likely (odds ratio >2.0, P < 0.05) to use raisins soaked in vodka/gin, to take fish oils, or to drink alcoholic beverages for RA treatment than those with early disease. Compared to men, women were significantly (P < 0.05) more likely to pray/attend church, write in a journal, and use biofeedback, but were less likely to smoke tobacco or topically apply household oils for treatment of RA.
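The odds ratios above come from logistic regression; for a single binary exposure, the unadjusted version reduces to a 2x2-table calculation with a Woolf confidence interval. A minimal sketch with hypothetical counts (the registry's actual cell counts are not given in the abstract):

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table with Woolf 95% CI.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: fish-oil use by longer-duration vs. early disease
or_, lo, hi = odds_ratio(120, 380, 30, 200)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR above 2.0 with a CI excluding 1.0, as reported for several treatments, indicates more than doubled odds of use in the longer-duration group.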
CAM use was highly prevalent in this cohort, even in individuals with early disease. Health care providers need to be aware of CAM use as some treatments may potentially have interactions with conventional medicines. This could be important within this cohort of African Americans, where racial disparities are known to affect access to conventional care.
Lipofuscin contained in the retinal pigment epithelium (RPE) is the main source of fundus autofluorescence (FAF), the target of an imaging method useful for estimating the progression of geographic atrophy (GA) in clinical trials. To establish a cellular basis for hyperfluorescent GA border zones, histologic autofluorescence (HAF) was measured at defined stages of RPE pathologic progression.
Participants and Controls
Ten GA donor eyes (mean age ± standard deviation, 87.1±4.0 years) and 3 age-matched control eyes (mean age ± standard deviation, 84.0±7.2 years) without GA.
Ten-micrometer-thick sections were divided into zones of RPE morphologic features according to an 8-point scale. HAF excited by 488-nm light was imaged by laser confocal microscopy. The HAF intensity summed along vertical lines perpendicular to Bruch's membrane at 0.2-μm intervals served as a surrogate for FAF. Intensity profiles in 151 zones were normalized to grade 0 at a standard reference location in each eye. Cross-sectional area, mean, and sum autofluorescence for individual RPE cells were measured (cellular autofluorescence [CAF]).
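The FAF surrogate described here (summing HAF intensity along vertical lines, then normalizing to a grade-0 reference region) can be sketched as follows; the image array and reference columns are hypothetical stand-ins for the confocal data:

```python
import numpy as np

# Hypothetical 488-nm autofluorescence section image: rows run
# perpendicular to Bruch's membrane, columns are 0.2-um intervals.
rng = np.random.default_rng(0)
haf_image = rng.random((50, 300))

# Surrogate FAF profile: sum intensity down each vertical line.
profile = haf_image.sum(axis=0)

# Normalize to the mean of a grade-0 reference region
# (hypothetical choice of columns for illustration).
reference = profile[:40].mean()
normalized = profile / reference

print(normalized.shape)
```

Normalizing each eye to its own reference location makes profiles comparable across eyes despite differences in staining and acquisition.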
Main Outcome Measures
Statistically significant differences in intensity and localization of HAF and CAF at defined stages of RPE morphologic progression for GA and control eyes.
The RPE morphologic features were most abnormal (cell rounding, sloughing, and layering; grade 2) and HAF intensity profiles were highest and most variable immediately adjacent to atrophic areas. Peaks in HAF intensity frequently were associated with vertically superimposed cells. The HAF value that optimally separated reactive from uninvolved RPE was 0.66 standard deviations more than the mean for uninvolved RPE and was associated with a sensitivity of 75.8% and a specificity of 76.3%. When variable cell area was accounted for, neither mean nor sum CAF differed significantly among the RPE pathologic grades.
Areas with advanced RPE alterations are most likely to exhibit clinically recognizable patterns of elevated FAF around GA, but may not predict cells about to die, because of vertically superimposed cells and cellular fragments. These data do not support a role for lipofuscin-related cell death and call into question the rationale of treatments targeting lipofuscin.
We examined the impact of an eye health education program for older African Americans on attitudes about eye care and utilization, using a randomized trial design in a community setting. Participants were older African Americans attending activities at senior centers. Ten centers were randomized to an eye health education (InCHARGE©) or social-contact control presentation. InCHARGE© addressed the importance of an annual dilated comprehensive examination and strategies for reducing barriers to care. The control presentation was on the importance of physical activity. Outcomes were attitudes about eye care, assessed by questionnaire 6 months post-event, and eye care utilization during the 12 months post-event, assessed through medical record abstraction. At baseline, >80% of participants in both arms said transportation and finding, communicating with, and trusting a doctor were not problematic and agreed that yearly care was important. One-fourth said eye examination cost was problematic; one-half said spectacle cost was problematic. There were no group differences 6 months post-event. During the 12 months pre-event, the dilated exam rate was similar in the two groups (38.3% InCHARGE©, 40.8% control) and was unchanged during the 12 months post-event. Results suggest that fewer than half of older African Americans received annual dilated eye care. Group-administered eye health education did not increase this rate. Even before the program, participants had positive attitudes about care, yet many cited examination and spectacle cost as problematic, which was not mitigated by health education. Evidence-based strategies for increasing the eye care utilization rate in older African Americans in a community setting have yet to be identified. Policy changes may be a more appropriate avenue for addressing cost.
eye care utilization; eye health education; barriers to care
To examine rates of visual impairment of older adults in assisted living facilities (ALFs).
Vision screening events were held at 12 ALFs in Jefferson County, Alabama for residents ≥60 years of age. Visual acuity, cognitive status, and presence of eye conditions were assessed.
A total of 144 residents were screened; 67.8% failed distance screening, 70.9% failed near screening, and 89.3% failed contrast sensitivity screening. Cognitive impairment was present in 40.4% of residents, and 89% had at least one diagnosed eye condition. Visual acuities did not differ significantly between cognitive status groups or with greater numbers of eye conditions.
This study is the first to provide information about vision impairment in the assisted living population. Of those screened, 70% had visual acuity worse than 20/40 for distance or near vision, and 90% had impaired contrast sensitivity. Cognitive impairment accounted for a small percentage of the variance in near vision and contrast sensitivity.
Visual impairment; assisted living facilities; older adults
To compare the diagnostic accuracy of the Moorfields Regression Analysis (MRA), parameters, and Glaucoma Probability Score (GPS) from Heidelberg Retinal Tomograph HRT-3 with MRA and parameters from HRT-II in discriminating glaucomatous and healthy eyes in subjects of African (AA) and European ancestry (EA).
Case-control study, institutional setting.
A total of 78 glaucoma patients (AA=44, EA=34) and 89 age-matched controls (AA=46, EA=33), defined by visual fields and self-reported race, were included. Imaging was obtained with HRT-II, and data were exported to a computer with the HRT-3 software using the same contour line. Area under the receiver operating characteristic (ROC) curve (AUC), sensitivity, and specificity were evaluated for the whole group and for AA and EA separately. Mean disc area was compared between eyes correctly and incorrectly diagnosed by each technique.
Disc, cup, and rim areas from HRT-3 were lower than those from HRT-II (P<0.0001). AUC (sensitivity at 95% specificity) was 0.85 (54%) for vertical cup-to-disc ratio (VCDR) with HRT-3, 0.84 (45%) for VCDR with HRT-II, and 0.81 (44%) for GPS score at the temporal sector. MRA from HRT-3 showed greater sensitivity but lower specificity than that from HRT-II for the whole group, AA, and EA. GPS classification had the lowest specificity. Glaucomatous eyes incorrectly classified by GPS had smaller mean disc area (P=0.0002); control eyes incorrectly classified had greater mean disc area (P=0.015).
VCDR from HRT-3 showed higher sensitivity than HRT-II and GPS for the whole group, and for AA and EA separately. Sensitivity of MRA improved in HRT-3 with some tradeoff in specificity compared to MRA of HRT-II. GPS yielded erroneous classifications associated with optic disc size.
Background. This study aimed to determine whether it is possible to predict the driving safety of individuals with homonymous hemianopia or quadrantanopia based upon a clinical review of neuroimages that are routinely available in clinical practice. Methods. Two experienced neuro-ophthalmologists viewed summary reports of the CT/MRI scans of 16 participants with homonymous hemianopic or quadrantanopic field defects, indicating the site and extent of the lesion, and predicted whether each participant would be safe or unsafe to drive. Driving safety was independently defined at the time of the study using state-recorded motor vehicle crashes (all crashes and at-fault) for the previous 5 years and ratings of driving safety determined through a standardized on-road driving assessment by a certified driving rehabilitation specialist. Results. The ability to predict driving safety was highly variable regardless of the driving safety measure, ranging from 31% to 63% (kappa levels ranged from −0.29 to 0.04). The level of agreement between the neuro-ophthalmologists was only fair (kappa = 0.28). Conclusions. Clinical evaluation of summary reports of currently available neuroimages by neuro-ophthalmologists is not predictive of driving safety. Future research should be directed at identifying and/or developing alternative tests or strategies to better enable clinicians to make these predictions.
To examine the association between visual and hearing impairment and motor vehicle collision (MVC) involvement in older drivers.
Retrospective cohort study.
North central Alabama
Population-based sample of 2,000 licensed drivers, age 70 and older.
Visual acuity was measured using the Electronic Visual Acuity test. Contrast sensitivity was measured using the Pelli-Robson chart. Presence of subjective hearing loss and other health conditions were determined using a general health questionnaire. Information regarding MVCs for all participants spanning the five years prior to study enrollment was obtained from the Alabama Department of Public Safety.
Following adjustment for age, race, gender, number of miles driven, number of medical conditions, general cognitive status, and visual processing speed, older drivers with both visual acuity and hearing impairment (rate ratio [RR] 1.52, 95% confidence interval [CI] 1.01–2.30), contrast sensitivity impairment alone (RR 1.42, 95% CI 1.00–2.02), or both contrast sensitivity and hearing impairment (RR 2.41, 95% CI 1.62–3.57) had elevated MVC rates compared to drivers with no visual or hearing impairments. Drivers with visual acuity loss alone or hearing loss alone did not have significantly different MVC rates compared to the no-impairment group after adjustment for multiple variables.
Older drivers with dual sensory impairment are at greater MVC risk than those with only a visual acuity or a hearing deficit alone. Screening for both hearing and visual impairment may be a useful approach for identifying older drivers at risk for MVC involvement.
driver safety; dual sensory impairment; vision impairment; hearing impairment
To characterize the morphology, prevalence, and topography of subretinal drusenoid deposits (SDD), a candidate histological correlate of reticular pseudodrusen, with reference to basal linear deposit (BlinD), a specific lesion of age-related macular degeneration (AMD); to propose a biogenesis model for both lesions.
Donor eyes with median death-to-preservation of 2:40 hr were post-fixed in osmium tannic acid paraphenylenediamine and prepared for macula-wide high-resolution digital sections. Annotated thicknesses of 21 chorioretinal layers were determined at standard locations in sections through the fovea and the superior perifovea.
In 22 eyes of 20 Caucasian donors (83.1 ± 7.7 years), SDD appeared as isolated or confluent drusenoid dollops punctuated by tufts of RPE apical processes and associated with photoreceptor perturbation. SDD and BlinD were detected in 85.0% and 90.0% of non-neovascular AMD donors, respectively. SDD was thick (median, 9.4 µm) and more abundant in perifovea than fovea (p<0.0001). BlinD was thin (median, 2.1 µm) and more abundant in fovea than perifovea (p<0.0001).
SDD and BlinD prevalence in AMD eyes are both high. SDD's organized morphology, topography, and impact on surrounding photoreceptors imply specific processes of biogenesis. Contrasting topographies of SDD and BlinD suggest relationships with differentiable aspects of rod and cone physiology, respectively. A 2-lesion, 2-compartment biogenesis model incorporating outer retinal lipid homeostasis is presented.
age-related macular degeneration; basal linear deposit; cholesterol; fovea; histopathology; lipoproteins; macula; photoreceptors; reticular drusen; subretinal drusenoid deposit
To examine associations between retinal thickness and rod-mediated dark adaptation in older adults with non-exudative age-related maculopathy (ARM) or normal macular health.
A cross-sectional study was conducted with 74 adults ≥ 50 years old from the comprehensive ophthalmology and retina services of an academic eye center. ARM presence and disease severity in the enrollment eye was defined by the masked grading of stereofundus photos using the Clinical Age-Related Maculopathy (CARMS) grading system. High-definition, spectral-domain optical coherence tomography was used to estimate retinal thickness in a grid of regions in the macula. Rod-mediated dark adaptation, recovery of light sensitivity after a photo-bleach, was measured over a 20-minute period for a 500 nm target presented at 5° on the inferior vertical meridian. Main outcomes of interest were retinal thickness in the macula (μm) and parameters of rod-mediated dark adaptation (second slope, third slope, average sensitivity, final sensitivity).
In non-exudative disease, retinal thickness decreased with greater disease severity; a thinner retina was associated with reductions in average and final rod-mediated sensitivity, even after adjustment for age and visual acuity.
Impairment in rod-mediated dark adaptation in non-exudative ARM is associated with macular thinning.
age-related maculopathy; dark adaptation; rod photoreceptors; optical coherence tomography
To determine and compare the effect of the severity of glaucomatous damage on the repeatability of retinal nerve fiber layer (RNFL) thickness with GDx-VCC (variable corneal compensation) and StratusOCT (optical coherence tomography; both produced by Carl Zeiss Meditec, Inc., Dublin, CA), and optic nerve head (ONH) topography with HRT-II (retinal tomograph; Heidelberg Engineering GmbH, Heidelberg, Germany) and StratusOCT.
With each of these techniques, two measurements were obtained from 41 eyes of 41 control subjects and 98 glaucomatous eyes (37 patients with early, 29 with moderate, and 32 with severe field loss). To evaluate test–retest variability at each stage, limits of agreement (Bland-Altman plots) and repeatability coefficients (RCs) were obtained from pairs of measurements. Comparisons of within-subject variances were used to compare repeatability of GDx-VCC versus StratusOCT for global RNFL and HRT-II versus StratusOCT for global ONH topography. Effects from age, visual acuity, and lens status were also included in the analysis as covariates.
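The limits of agreement and repeatability coefficients described above follow the standard Bland-Altman approach to test-retest data. A minimal sketch, using hypothetical paired measurements rather than the study data:

```python
import math

def repeatability(pairs):
    """Bland-Altman test-retest summary from (measure1, measure2) pairs.

    Returns the mean difference (bias), the 95% limits of agreement,
    and the repeatability coefficient (1.96 * SD of the differences).
    """
    diffs = [a - b for a, b in pairs]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    loa = (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)
    rc = 1.96 * sd_d
    return mean_d, loa, rc

# Hypothetical repeated global RNFL thickness measurements (um)
pairs = [(98, 96), (85, 88), (110, 107), (74, 75), (92, 90)]
mean_d, loa, rc = repeatability(pairs)
print(f"bias={mean_d:.2f}, LoA=({loa[0]:.2f}, {loa[1]:.2f}), RC={rc:.2f}")
```

A smaller repeatability coefficient means a retest must change by less before the change can be attributed to the disease rather than to measurement noise.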
Test–retest variability of RNFL thickness using GDx-VCC and StratusOCT was consistent through all stages of disease severity. Repeatability results of GDx-VCC were better than those of StratusOCT, except in severe cases. Test–retest variability of ONH topography using HRT-II and StratusOCT increased with increasing disease severity for rim area, cup area, and cup-to-disc (C/D) area ratio. In contrast, the vertical C/D ratio from HRT-II and the horizontal C/D ratio from StratusOCT showed stable test–retest variability through all stages. Regardless of disease severity, repeatability results of HRT-II were better than those of StratusOCT.
GDx-VCC and HRT-II showed better repeatability than StratusOCT. Although test–retest variability increased with disease severity for rim area, the variability for vertical C/D ratio (HRT-II) and global RNFL (GDx-VCC) was stable across disease severity. These parameters, rather than rim area, may be more useful in detecting progression in patients with glaucoma who have more advanced field loss.
To compare the diagnostic ability of the confocal scanning laser ophthalmoscope (HRT-II; Heidelberg Engineering, Heidelberg, Germany), scanning laser polarimeter (GDx-VCC; Carl Zeiss Meditec, Inc., Dublin, CA), and optical coherence tomographer (StratusOCT, Carl Zeiss Meditec, Inc.) with subjective assessment of optic nerve head (ONH) stereophotographs in discriminating glaucomatous from nonglaucomatous eyes.
Data from 79 glaucomatous and 149 normal eyes of 228 subjects were included in the analysis. Three independent graders evaluated ONH stereophotographs. Receiver operating characteristic curves were constructed for each technique and sensitivity was estimated at 80% of specificity. Comparisons of areas under these curves (aROC) and agreement (κ) were determined between stereophoto grading and best parameter from each technique.
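The ROC analysis described above (area under the curve, plus sensitivity read off at a fixed 80% specificity) can be illustrated with a small sketch; the scores below are hypothetical, not the study's imaging parameters:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the rank (Mann-Whitney) formulation: the probability
    that a random diseased eye scores higher than a random healthy eye."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

def sensitivity_at_specificity(scores_pos, scores_neg, spec=0.80):
    """Sensitivity at a fixed specificity: pick the cutoff so that
    at least `spec` of the healthy eyes fall at or below it."""
    cut = sorted(scores_neg)[int(spec * len(scores_neg)) - 1]
    return sum(1 for p in scores_pos if p > cut) / len(scores_pos)

# Hypothetical parameter scores (higher = more glaucoma-like)
glaucoma = [0.9, 0.8, 0.75, 0.6, 0.4]
normal = [0.5, 0.35, 0.3, 0.2, 0.1]
print(roc_auc(glaucoma, normal))
print(sensitivity_at_specificity(glaucoma, normal))
```

Fixing specificity before comparing sensitivities, as the study does, keeps the comparison fair across techniques whose scores live on different scales.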
Stereophotograph grading had the largest aROC and sensitivity (0.903, 77.22%) in comparison with the best parameter from each technique: HRT-II global cup-to-disc area ratio (0.861, 75.95%); GDx-VCC Nerve Fiber Indicator (NFI; 0.836, 68.35%); and StratusOCT retinal nerve fiber layer (RNFL) thickness (0.844, 69.62%), ONH vertical integrated rim area (VIRA; 0.854, 73.42%), and macular thickness (0.815, 67.09%). The κ between photograph grading and imaging parameters was 0.71 for StratusOCT-VIRA, 0.57 for HRT-II cup-to-disc area ratio, 0.51 for GDx-VCC NFI, 0.33 for StratusOCT RNFL, and 0.28 for StratusOCT macular thickness.
Similar diagnostic ability was found for all imaging techniques, but none demonstrated superiority to subjective assessment of the ONH. Agreement between disease classification with subjective assessment of ONH and imaging techniques was greater for techniques that evaluate ONH topography than with techniques that evaluate RNFL parameters. A combination of subjective ONH evaluation with RNFL parameters provides additive information, may have clinical impact, and deserves to be considered in the design of future studies comparing objective techniques with subjective evaluation by general eye care providers.
The increased risk of thrombosis in systemic lupus erythematosus (SLE) may be partially explained by interrelated genetic pathways for thrombosis and SLE. In a case-control analysis, we investigated whether 33 established and novel single nucleotide polymorphisms (SNP) in 20 genes involved in hemostasis pathways that have been associated with deep venous thrombosis in the general population were risk factors for SLE development among Asians.
Patients in the discovery cohort were enrolled in one of two North American SLE cohorts. Patients in the replication cohort were enrolled in one of four Asian or two North American cohorts. SLE cases met American College of Rheumatology classification criteria. We first genotyped 263 Asian SLE patients and 357 healthy Asian control individuals for 33 SNPs using Luminex multiplex technology in the discovery phase, and then used TaqMan and Immunochip assays to examine 5 SNPs in up to an additional 1496 cases and 993 controls in the replication phase. SLE patients were compared to healthy controls for association with minor alleles in allelic models. Principal components analysis was used to control for intra-Asian ancestry in an analysis of the replication cohort.
Two genetic variants in the gene VKORC1, rs9934438 and rs9923231, were highly significant in both the discovery and replication cohorts: OR(disc) = 2.45 (p=2×10−9), OR(rep) = 1.53 (p=5×10−6) and OR(disc) = 2.40 (p=6×10−9), OR(rep) = 1.53 (p=5×10−6), respectively. These associations were significant in the replication cohort after adjustment for intra-Asian ancestry: rs9934438 OR(adj) = 1.34 (p=0.0029) and rs9923231 OR(adj) = 1.34 (p=0.0032).
Genetic variants in VKORC1, involved in vitamin K reduction and associated with DVT, are associated with SLE development in Asians. These results suggest intersecting genetic pathways for the development of SLE and thrombosis.
systemic lupus erythematosus; single nucleotide polymorphisms; genetic risk factors
Vision impairment is an important public health concern. Accurate information regarding visual health and eye care utilization is essential to monitor trends and inform health policy interventions aimed at addressing at-need populations. National surveys provide annual prevalence estimates but rely on self-report. The validity of self-reported information regarding eye disease has not been adequately explored.
This cross-sectional study compared self-report of eye care utilization and eye disease with information obtained from medical records. The study population was 2,001 adults aged 70 years and older who completed the Behavioral Risk Factor Surveillance System’s Visual Impairment and Access to Eye Care Module. Cohen’s kappa (κ) was used to assess agreement.
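Cohen's kappa, the agreement statistic used here, corrects the observed agreement for the agreement expected by chance given each rater's marginal frequencies. A minimal sketch with a hypothetical 2x2 table (the actual study counts are not shown in the abstract):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table.

    table[i][j] = count of subjects classified in category i by
    self-report and category j by the medical record.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    observed = sum(table[i][i] for i in range(k)) / n
    row_m = [sum(row) / n for row in table]
    col_m = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    expected = sum(row_m[i] * col_m[i] for i in range(k))
    return (observed - expected) / (1 - expected)

# Hypothetical 2x2 table: glaucoma yes/no, self-report vs. chart
table = [[120, 30],
         [20, 1830]]
print(round(cohens_kappa(table), 2))
```

Because the disease is rare, raw percent agreement would look near-perfect here; kappa discounts the agreement that the dominant "no disease" cell produces by chance.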
Agreement between self-report and medical records was substantial for eye care utilization (κ=0.64) and glaucoma (κ=0.73), moderate for macular degeneration (κ=0.40) and diabetic retinopathy (κ=0.47), and slight for cataracts (κ=0.18). Self-report tended to overestimate the number of subjects who visited an eye care provider in the previous year, and underestimated the prevalence in all but one (glaucoma) of the four eye diseases evaluated.
Though agreement was substantial for self-report of eye care utilization, results of the current study suggest that national estimates based on self-report overestimate eye care utilization.
vision; self-report; epidemiology
Bacterial vaginosis (BV) is a common vaginal disorder in women of reproductive age, especially among women with HIV-1 infection. Several bacterial products including lipopolysaccharides (LPS), lipoteichoic acids (LTA), and peptidoglycans (PGN) are stimulatory ligands for Toll-like receptors (TLRs), and recent evidence indicates the important role of variation in TLR genes for permitting overgrowth of gram negative and BV-type flora. We assessed whether genetic polymorphisms in five TLR genes (TLR1, TLR2, TLR4, TLR6, and TLR9) could be determinants of differential host immune responses to BV in 159 HIV-1-positive African American adolescents enrolled in the Reaching for Excellence in Adolescent Care and Health (REACH) study. BV was assessed biannually and diagnosed either by a Nugent Score of at least 7 of 10, or using the Amsel Criteria. Cox-proportional hazards regression models, adjusted for concurrent Chlamydia and Gonorrhea infections, douching, and absolute CD4 cell count, were used to identify host genetic factors associated with BV. Two SNPs were associated with BV as diagnosed by the Nugent Score and the combined criteria: a minor allele G of rs4986790 (frequency=0.07), which encodes a His to Tyr substitution in TLR4 (HR=1.47, 95% CI 1.15–1.87) and rs187084 (frequency=0.24) on TLR9. The minor allele of rs1898830 (frequency=0.13) was associated with an increased hazard of BV defined by the Amsel criteria (HR=1.86, 95%CI 1.17–2.95). Further studies are warranted to confirm the associations of TLR gene variants and also to understand the underlying pathways and immunogenetic correlates in the context of HIV-1 infection.
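The Cox models above estimate adjusted hazard ratios. As an intuition for the unadjusted quantity, a crude hazard ratio can be approximated as a ratio of incidence rates; the event counts and person-time below are hypothetical, not the REACH data:

```python
def crude_hazard_ratio(events_exp, time_exp, events_ref, time_ref):
    """Crude (unadjusted) hazard-ratio approximation: the ratio of
    incidence rates (events per unit person-time) in the exposed
    (minor-allele) group vs. the reference group."""
    return (events_exp / time_exp) / (events_ref / time_ref)

# Hypothetical BV episodes and person-years of follow-up by genotype
hr = crude_hazard_ratio(events_exp=45, time_exp=60.0,
                        events_ref=55, time_ref=110.0)
print(round(hr, 2))
```

The study's actual estimates additionally adjust for concurrent infections, douching, and CD4 count, which a crude rate ratio cannot do.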
HIV-1; Bacterial Vaginosis; Toll Like Receptors
To compare the on-road driving performance of visually impaired drivers using bioptic telescopes with age-matched controls.
Participants included 23 persons (mean age = 33 ± 12 years) with visual acuity of 20/63 to 20/200 who were legally licensed to drive through a state bioptic driving program, and 23 visually normal age-matched controls (mean age = 33 ± 12 years). On-road driving was assessed in an instrumented dual-brake vehicle along 14.6 miles of city, suburban, and controlled-access highways. Two backseat evaluators independently rated driving performance using a standardized scoring system. Vehicle control was assessed through vehicle instrumentation and video recordings used to evaluate head movements, lane-keeping, pedestrian detection, and frequency of bioptic telescope use.
Ninety-six percent (22/23) of bioptic drivers and 100% (23/23) of controls were rated as safe to drive by the evaluators. There were no group differences for pedestrian detection, or ratings for scanning, speed, gap judgments, braking, indicator use, or obeying signs/signals. Bioptic drivers received worse ratings than controls for lane position and steering steadiness and had lower rates of correct sign and traffic signal recognition. Bioptic drivers made significantly more right head movements, drove more often over the right-hand lane marking, and exhibited more sudden braking than controls.
Drivers with central vision loss who are licensed to drive through a bioptic driving program can display proficient on-road driving skills. This raises questions regarding the validity of denying such drivers a license without the opportunity to train with a bioptic telescope and undergo on-road evaluation.
On-road driving ability of persons with central visual loss, licensed to drive with bioptic telescopes, was compared with a control group. There were some group differences in driving but most aspects of driving were similar between groups; most (96%) bioptic drivers were rated safe to drive.
bioptic telescopes; driving performance; on-road driving assessment; spotting
Background. Pediatric chronic pain is considered to be a multidimensional construct that includes biological, psychological, and social components. Methods. The 99 enrolled study patients (mean age 13.2 years, 71% female, 81% Caucasian) and an accompanying parent completed a series of health-related questionnaires at the time of their initial appointment in a pediatric chronic pain medicine clinic. Results. Significant correlations (r ≥ 0.30, P < 0.05) were observed between pediatric chronic pain intensity and patient anxiety, patient depression, patient pain coping, parent chronic pain intensity, and parent functional disability. Pediatric chronic pain intensity was significantly associated with patient anxiety (P = 0.002). Significant correlations (r ≥ 0.30, P < 0.05) were observed between pediatric functional disability and patient chronic pain intensity, patient anxiety, patient depression, patient pain coping, parent chronic pain intensity, parent functional disability, parent anxiety, parent depression, and parent stress. Pediatric functional disability was significantly associated with patient chronic pain intensity (P = 0.025), patient anxiety (P = 0.021), patient pain coping (P = 0.009), and parent functional disability (P = 0.027). Conclusions. These findings provide empirical support of a multidimensional Biobehavioral Model of Pediatric Pain. However, the practical clinical application of the present findings and much of the similar previously published data may be tenuous.
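The correlations reported above (r ≥ 0.30) are Pearson coefficients; a minimal sketch of the computation, using hypothetical scores rather than the study data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pain-intensity and anxiety scores for five patients
pain = [3, 5, 6, 8, 9]
anxiety = [10, 14, 13, 20, 22]
print(round(pearson_r(pain, anxiety), 2))
```

The study's r ≥ 0.30 threshold marks a moderate positive linear association, well below the near-perfect correlation this toy example produces.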
To evaluate prescribed optical device use in terms of frequency and perceived usefulness among people with age-related macular degeneration (AMD). We also sought to determine the tasks for which they were using their prescribed low vision device.
A total of 199 patients with AMD presenting for the first time to the low vision service were recruited from a university-based clinic. Prior to the low vision evaluation and device prescription, they completed the NEI-VFQ 25, the Center for Epidemiological Studies Depression Scale, the Short Portable Mental Status Questionnaire, and a general health questionnaire. The low vision evaluation included best-corrected ETDRS visual acuity, MNRead testing, microperimetry, and the prescription and dispensing of optical low vision devices. Telephone follow-up interviews about device usage were conducted 1 week, 1 month, and 3 months post-intervention.
Low vision devices were prescribed to 181 participants, 93% of whom completed all 3 follow-up interviews. Intensive users (≥1 hour/day) of devices were similar in demographic and visual characteristics to non-intensive users (<1 hour/day), except for habitual reading acuity and speed as well as contrast sensitivity. Overall, device use increased slightly over 3 months of follow-up. Magnifiers were reported to be moderately to extremely useful by more than 80% of participants at all time points, except the 1-month follow-up for hand magnifiers (75%). High plus spectacles were the least frequently prescribed device and were rated as moderately to extremely useful by 70%, 74%, and 59% at 1 week, 1 month, and 3 months, respectively. Most participants used their devices for leisure reading, followed by managing bills. Very few devices (n=3, <1%) were not used at any time point.
Patients with AMD who are provided with prescribed optical low vision devices do use them and perceive them as useful, especially for leisure reading activities. High rates of usage were maintained over 3 months.
low vision; age-related macular degeneration; magnification; reading; low vision devices
To identify, through focus groups of visually impaired children and their parents, relevant content for a vision-targeted, health-related quality of life questionnaire designed for children ages 6-12.
Six focus groups of children with vision impairment ages 6-12 and six focus groups of their parents were conducted by trained facilitators using a guided script. Sessions were recorded, transcribed, and coded per a standardized protocol for content analysis. Comments were placed in thematic categories and each coded as positive, negative, or neutral.
Twenty-four children (mean age 9.4 years) with vision impairment from a variety of causes and 23 parents participated. The child focus groups generated 1,163 comments, of which 52% (n=599) were negative, 12% (n=138) were neutral and 37% (n=426) were positive. The three most common topical areas among children were: glasses and adaptive equipment (18%), psychosocial (14%) and school (14%). The parent focus groups generated 1,952 comments of which 46% (n=895) were negative, 16% (n=323) were neutral and 38% (n=734) were positive. The three most common topical areas among parents were: school (21%), expectations or frustrations (14%) and psychosocial (13%).
Pediatric vision impairment has significant effects on health related quality of life, as reported by children with vision impairment and their parents in their own words. These findings will provide the content to guide construction of a survey instrument to assess vision-specific, health-related quality of life in children with vision impairment.
low vision; quality of life; congenital nystagmus; albinism; pediatrics
To examine the prevalence and biopsychosocial predictors of sub-optimal virologic response to highly active antiretroviral therapy (HAART) among human immunodeficiency virus (HIV)-infected adolescents.
Population-based cohort study.
Sixteen academic medical centers across thirteen cities in the United States.
One hundred fifty-four HIV-infected adolescents who presented for at least two consecutive visits after initiation of HAART.
Main Outcome Measures
Viral load (plasma concentration of HIV RNA), CD4+ T-lymphocyte count.
Of the 154 adolescents enrolled in the study, 50 (32.5%) demonstrated early and sustained virologic suppression while receiving HAART. The remaining 104 adolescents (67.5%) had a poor virologic response. Adequate adherence (>50%) to HAART—reported by 70.8% of respondents—was associated with a 60% reduced odds of suboptimal virologic suppression in a multivariable logistic regression model (adjusted odds ratio = 0.4; 95% confidence interval: 0.2–1.0). Exposure to sub-optimal antiretroviral therapy (ART) prior to HAART, on the other hand, was associated with more than a two-fold increased odds of sub-optimal virologic response (adjusted odds ratio = 2.6; 95% confidence interval: 1.1–5.7).
Fully two-thirds of HIV-infected adolescents in the current study demonstrated a sub-optimal virologic response to HAART. Non-adherence and prior single or dual ART were associated with subsequent poor virologic responses to HAART. These predictors of HAART failure echo findings in pediatric and adult populations. Given the unique developmental stage of adolescence, age-specific interventions are indicated to address high rates of non-adherence and therapeutic failure.
HIV; Adolescent; Antiretroviral Therapy; Highly Active; Adherence; Viral Load; CD4 Lymphocyte Count
To determine the relationship between refractive error as measured by autorefraction and that measured by trial frame refraction among a sample of adults with vision impairment seen in a university-based low-vision clinic and to determine if autorefraction might be a suitable replacement for trial frame refraction.
A retrospective chart review of all new patients 19 years or older seen over an 18-month period was conducted and the following data collected: age, sex, primary ocular diagnosis, entering distance visual acuity, habitual correction, trial frame refraction, autorefraction, and distance visual acuity measured after trial frame refraction. Trial frame refraction and autorefraction were compared using paired t-tests, intraclass correlations, and Bland-Altman plots.
Final analyses included 440 patients for whom both trial frame refraction and autorefraction data were available for the better eye. Participants were mostly female (59%), with a mean age of 68 years (SD = 20). Age-related macular degeneration was the most common etiology for vision impairment (44%). Values for autorefraction and trial frame refraction were statistically different but highly correlated for spherical equivalent power (r = 0.92), cylinder power (r = 0.80), and overall blurring strength (r = 0.89). Although the values of the cross-cylinder components J0 and J45 were similar, they were poorly correlated (r = 0.08 and r = 0.15, respectively). The range of differences in spherical equivalent power was large (−8.6 to 4.9 D).
Autorefraction is highly correlated with trial frame refraction, but differences are sometimes substantial, making autorefraction an unsuitable substitute for trial frame refraction.
Trial frame refraction and autorefraction are highly correlated for patients with vision impairment attending a low-vision clinic; however, approximately 5% of values differ by 2 diopters or more. Therefore, autorefraction should not be substituted for trial frame refraction in this population.
To examine, in a population-based sample of 2,000 drivers aged 70 years and older, the independent association between higher-order visual processing impairment and motor vehicle collision (MVC) rate during the prior 5 years.
Three higher-order visual processing screening tests were administered because previous research has found associations between impaired performance on these screens and MVC involvement. They included an estimate of visual processing speed under divided-attention conditions (useful field of view [UFOV] subtest 2); Trails B, a paper-and-pencil test of visual processing speed that also involves problem solving, executive function, and working memory; and the visual closure subtest of the Motor-Free Visual Perception Test (MVPT), which examines the ability to recognize objects that are only partially visible. Potentially confounding variables were also assessed, including demographics, general cognitive status, visual acuity, and contrast sensitivity. MVC involvement was determined from accident reports of the Alabama Department of Public Safety, and driving exposure was estimated from the Driving Habits Questionnaire.
MVC rates (for at-fault and all MVCs) were significantly higher for older drivers with impairment on any of the three visual processing screening tests. After adjustment for potentially confounding influences, the association between MVC rate and Trails B remained significant, whereas the associations with the MVPT and UFOV did not.
This population-based study of drivers aged 70 years and older suggests that a paper-and-pencil test assessing higher-order visual processing skills is independently associated with a recent history of MVC involvement.
A population-based study of drivers aged 70 years or older suggests that a paper-and-pencil test assessing higher-order visual processing skills is a marker for increased motor vehicle collision risk.
Ghrelin and glucagon-like peptide-1 (GLP-1) are gut hormones known to induce hunger and satiety, respectively. Current knowledge about the effects of different macronutrients on circulating ghrelin and GLP-1 comes mainly from acute test meals, whereas little is known about the effects of chronic dietary intake on gut hormone secretion. This study was designed to examine whether 8-week habituation to diets with different percentages of carbohydrate and fat would affect serum ghrelin, GLP-1, and subjective hunger in a postabsorptive state and in response to a standard liquid mixed meal.
Sixty-one overweight men and women were provided all food for 8 weeks on either a higher-carbohydrate/lower-fat diet (High-CHO/Low-FAT; 55% CHO, 18% PRO, 27% FAT) or a lower-carbohydrate/higher-fat diet (Low-CHO/High-FAT; 43% CHO, 18% PRO, 39% FAT). After overnight fasts at baseline and week 8, participants consumed a standard liquid meal (7 kcal/kg; 58.6% CHO, 17.4% PRO, 24% FAT). Blood was sampled before the meal and at 15, 60, 90, 120, 180, and 240 minutes to determine total serum ghrelin and active GLP-1. Hunger was assessed with a visual analog scale. Mixed models were used to evaluate whether the temporal patterns of total serum ghrelin and active GLP-1 differed by diet.
Although both diet groups reported greater hunger after 8 weeks (p=0.03), circulating ghrelin and GLP-1 were not affected by habituation to the different macronutrient distributions.
Habituation to different diets does not appear to influence fasting ghrelin, fasting GLP-1, or responses of these gut hormones to a standard meal.
Macronutrients; ghrelin; glucagon-like peptide-1 (GLP-1)
The marked improvement in outcome following induction of hypothermia after cardiac arrest has spurred the search for better methods to induce cooling. A regulated decrease in core temperature mediated by a drug-induced reduction in the set point for thermoregulation may be an ideal means of inducing hypothermia. To this end, the exploratory drug HBN-1 was assessed as a means to induce mild and prolonged hypothermia.
Freely moving rats were infused i.v. for 12 hours with a vehicle at room temperature (normothermia), a vehicle chilled to 4°C (forced hypothermia), or HBN-1 (a mixture of ethanol, lidocaine, and vasopressin) at room temperature. Core (intra-abdominal) temperature (Tc) was measured telemetrically, tail skin temperature (Ttail) by infrared thermography, and metabolic rate (MR) by indirect calorimetry; shivering was scored visually.
HBN-1 elicited a reduction in Tc from 37.5°C to 34°C within 80 minutes after initiation of the infusion; Tc was maintained between 33°C and 34°C for more than 13 hours. HBN-1 infusion was associated with a reduction in MR (p=0.0006), a slight reduction in Ttail, and no evidence of shivering (p<0.001). The forced hypothermia group displayed shivering (p<0.001), a significant increase in MR, and a decrease in Ttail, indicative of peripheral vasoconstriction to reduce heat loss.
HBN-1 infusion induced a mild and prolonged hypothermia in freely moving, unanesthetized rats, characterized by modulation of thermoeffectors to reduce heat gain and increase heat loss. HBN-1 thus appears to elicit regulated hypothermia and may provide a new method for achieving a prolonged state of therapeutic hypothermia.
To determine the clinical manifestations and disease damage associated with discoid rash in a large multiethnic systemic lupus erythematosus (SLE) cohort.
SLE patients (per American College of Rheumatology [ACR] criteria), aged ≥ 16 years, with disease duration ≤ 10 years at enrollment and defined ethnicity (African American, Hispanic, or Caucasian), from a longitudinal cohort were studied. Socioeconomic-demographic features, clinical manifestations, and disease damage [per the Systemic Lupus International Collaborating Clinics Damage Index (SDI)] were determined. The association of discoid lupus erythematosus (DLE) with clinical manifestations and disease damage was examined using multivariable logistic regression.
A total of 2,228 SLE patients were studied. The mean (standard deviation, SD) age at diagnosis was 34.3 (12.8) years, and the mean (SD) disease duration was 7.9 (6.0) years; 91.8% were women. Discoid lupus was observed in 393 (17.6%) of the SLE patients. In the multivariable analysis, patients with discoid lupus were more likely to be smokers and of African-American ethnicity, and to have malar rash, photosensitivity, oral ulcers, leukopenia, and vasculitis. DLE patients were less likely to be of Hispanic (from Texas) ethnicity and to have arthritis, end-stage renal disease (ESRD), and antinuclear, anti-dsDNA, and anti-phospholipid antibodies. Patients with DLE had more damage accrual, particularly chronic seizures, scarring alopecia, scarring of the skin, and skin ulcers.
In this cohort of SLE patients, discoid lupus was associated with several clinical features including serious manifestations such as vasculitis and chronic seizures.
discoid rash; systemic lupus erythematosus; disease damage