1.  Lentiviral Expression of Retinal Guanylate Cyclase-1 (RetGC1) Restores Vision in an Avian Model of Childhood Blindness 
PLoS Medicine  2006;3(6):e201.
Background
Leber congenital amaurosis (LCA) is a genetically heterogeneous group of retinal diseases that cause congenital blindness in infants and children. Mutations in the GUCY2D gene that encodes retinal guanylate cyclase–1 (retGC1) were the first to be linked to this disease group (LCA type 1 [LCA1]) and account for 10%–20% of LCA cases. These mutations disrupt synthesis of cGMP in photoreceptor cells, a key second messenger required for function of these cells. The GUCY1*B chicken, which carries a null mutation in the retGC1 gene, is blind at hatching and serves as an animal model for the study of LCA1 pathology and potential treatments in humans.
Methods and Findings
A lentivirus-based gene transfer vector carrying the GUCY2D gene was developed and injected into early-stage GUCY1*B embryos to determine if photoreceptor function and sight could be restored to these animals. Like human LCA1, the avian disease shows early-onset blindness, but there is a window of opportunity for intervention. In both diseases there is a period of photoreceptor cell dysfunction that precedes retinal degeneration. Of seven treated animals, six exhibited sight as evidenced by robust optokinetic and volitional visual behaviors. Electroretinographic responses, absent in untreated animals, were partially restored in treated animals. Morphological analyses indicated there was slowing of the retinal degeneration.
Conclusions
Blindness associated with loss of function of retGC1 in the GUCY1*B avian model of LCA1 can be reversed using viral vector-mediated gene transfer. Furthermore, this reversal can be achieved by restoring function to a relatively low percentage of retinal photoreceptors. These results represent a first step toward development of gene therapies for one of the more common forms of childhood blindness.
Blindness associated with loss of function of retinal guanylate cyclase-1 in an avian model of Leber congenital amaurosis-1 can be reversed using viral vector-mediated gene transfer.
Editors' Summary
Background.
Leber congenital amaurosis (LCA) is the name of a group of hereditary diseases that cause blindness in infants and children. Changes in any one of a number of different genes can cause the blindness, which affects vision starting at birth or soon after. The condition was first described by a German doctor, Theodor Leber, in the 19th century, hence the first part of the name, and “amaurosis” is another word for blindness. About 20% of children with LCA have the most common type, called LCA1, which is caused by defects in a gene called retinal guanylate cyclase (GUCY2D) that is found on Chromosome 17. It is one of a group of genes that produce proteins that are important in determining how rods and cones—specialized light receptor cells at the back of the eye—respond to light, in particular, how they can return to the resting state after being stimulated by light. Defects in the gene leave the eye unable to respond to light, and so cause blindness. There is also damage to the receptor cells, which in turn causes a cellular breakdown of the retina—the light-sensitive tissue at the back of the eye. Gene therapy works by replacing a defective gene with a normal functional one, usually by packaging the normal gene into a harmless virus and injecting it into the affected tissue, in this case the eye.
Why Was This Study Done?
Gene therapy is a promising therapy for diseases such as LCA, because the gene defect is known and the damaged cells are still alive but just not functioning properly. In addition, there is an animal model for LCA1, a strain of chickens with a mutation in the chicken GUCY2D gene: these chickens develop a disease very similar to that of humans and can be used to test treatments before they are tried out in humans.
What Did the Researchers Do and Find?
They took a virus and put the normal GUCY2D gene into it and linked it to a reporter gene (that is, a gene that can be used to check whether the first gene is present). They then injected this virus into eggs containing chick embryos that had the abnormal gene, and allowed the chicks to hatch normally. Of seven chicks injected, six exhibited sight, in contrast to untreated animals, which were blind. The improvement in the chicks' sight occurred despite the fact that only a relatively small amount of normal protein was made and that normal protein was present in only some of the retinal cells. The breakdown of the retinas in the treated chicks also appeared to be slower than in untreated chicks.
What Do These Findings Mean?
This study shows that in theory it is possible to treat a form of childhood blindness caused by a photoreceptor gene defect by gene therapy. Because this study was done in chickens, many other steps need to be taken before it will be clear whether the treatment could work in humans. These steps include a demonstration that the virus is safe in humans, and experiments that determine what dose of virus would be needed and how long the effects of the treatment would last (the chicks were only studied for 6–7 weeks). Another question is whether it would be necessary (or even possible) to treat affected children before birth, or whether therapy in infants could restore their sight. In addition, the treatment would obviously only work for children who had this specific type of blindness, but many of the principles learned from studying LCA1 should be applicable to other types of LCA and possibly other types of inherited blindness.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0030201.
• The Foundation for Retinal Research has detailed information on LCA
• The Foundation Fighting Blindness funds research into, and provides information on, many types of blindness, including LCA
• Contact a Family, a United Kingdom organization that aims to put families of children with illnesses in touch with each other, has a page of information on LCA
• RetNet—Retinal Information Network—is a comprehensive resource describing current knowledge of all inherited retinal diseases
• Fight for Sight funds student and post-doctoral fellowships as well as grant-in-aid to persons whose research focuses on curing and preventing blindness
• WebVision is a Web site that provides a comprehensive resource for understanding the organization and function of the retina and visual system
doi:10.1371/journal.pmed.0030201
PMCID: PMC1463903  PMID: 16700630
2.  Medical interventions for traumatic hyphema 
Background
Traumatic hyphema is the entry of blood into the anterior chamber (the space between the cornea and iris) following a blow or a projectile striking the eye. Hyphema uncommonly causes permanent loss of vision, but associated trauma (e.g., corneal staining, traumatic cataract, angle recession glaucoma, optic atrophy) may seriously affect vision, and such complications may lead to permanent visual impairment. Patients with sickle cell trait/disease may be particularly susceptible to elevations of intraocular pressure. If rebleeding occurs, the rates and severity of complications increase.
Objectives
The objective of this review was to assess the effectiveness of various medical interventions in the management of traumatic hyphema.
Search methods
We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2010, Issue 6), MEDLINE (January 1950 to June 2010), EMBASE (January 1980 to June 2010), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com) and ClinicalTrials.gov (http://clinicaltrials.gov). We searched the reference lists of identified trial reports to find additional trials. We also searched the ISI Web of Science Social Sciences Citation Index (SSCI) to find studies that cited the identified trials. There were no language or date restrictions in the search for trials. The electronic databases were last searched on 25 June 2010.
Selection criteria
Two authors independently assessed the titles and abstracts of all reports identified by the electronic and manual searches. In this review, we included randomized and quasi-randomized trials that compared various medical interventions to other medical interventions or control groups for the treatment of traumatic hyphema following closed globe trauma. There were no restrictions regarding age, gender, severity of the closed globe trauma or level of visual acuity at the time of enrollment.
Data collection and analysis
Two authors independently extracted the data for the primary and secondary outcomes. We entered and analyzed data using Review Manager (RevMan) 5. We performed meta-analyses using a fixed-effect model and reported dichotomous outcomes as odds ratios and continuous outcomes as mean differences.
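As background to the pooling step, a fixed-effect meta-analysis combines study-level estimates weighted inversely by their variances; RevMan also offers Mantel-Haenszel weighting for dichotomous outcomes, and the abstract does not state which scheme was used, so the formula below is a generic inverse-variance sketch rather than the authors' exact computation:
\[
\hat{\theta}_{\mathrm{pooled}} = \frac{\sum_{i=1}^{k} w_i\,\hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\widehat{\mathrm{Var}}(\hat{\theta}_i)}
\]
Here \(\hat{\theta}_i\) is the log odds ratio (dichotomous outcomes) or the mean difference (continuous outcomes) from study \(i\); a pooled log odds ratio is exponentiated for reporting.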
Main results
Nineteen randomized and seven quasi-randomized studies with 2,560 participants were included in this review. Interventions included antifibrinolytic agents (systemic and topical aminocaproic acid, tranexamic acid, and aminomethylbenzoic acid), corticosteroids (systemic and topical), cycloplegics, miotics, aspirin, conjugated estrogens, monocular versus bilateral patching, elevation of the head, and bed rest. No intervention had a significant effect on visual acuity, whether measured at two weeks or less after the trauma or at longer time periods. The number of days for the primary hyphema to resolve appeared to be longer with the use of aminocaproic acid compared to no use, but was not altered by any other intervention.
Systemic aminocaproic acid reduced the rate of recurrent hemorrhage (odds ratio (OR) 0.25, 95% confidence interval (CI) 0.11 to 0.5), but a sensitivity analysis omitting studies not using an intention-to-treat (ITT) analysis reduced the strength of the evidence (OR 0.41, 95% CI 0.16 to 1.09). We obtained similar results for topical aminocaproic acid (OR 0.42, 95% CI 0.16 to 1.10). We found tranexamic acid had a significant effect in reducing the rate of secondary hemorrhage (OR 0.25, 95% CI 0.13 to 0.49), as did aminomethylbenzoic acid as reported in a single study (OR 0.07, 95% CI 0.01 to 0.32). The evidence to support an associated reduction in the risk of complications from secondary hemorrhage (i.e., corneal blood staining, peripheral anterior synechiae, elevated intraocular pressure, and development of optic atrophy) by antifibrinolytics was limited by the small number of these events. Use of aminocaproic acid was associated with increased nausea, vomiting, and other adverse events compared with placebo. We found no difference in the number of adverse events with the use of systemic versus topical aminocaproic acid or with standard versus lower drug dose.
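For readers interpreting these intervals: confidence limits for an odds ratio are conventionally computed on the log scale (the review does not detail its exact variance formula, so this is shown only as the standard construction), which is why an interval whose upper limit crosses 1, such as 0.16 to 1.09, weakens the evidence of benefit:
\[
95\%\ \mathrm{CI} = \exp\!\big(\ln(\mathrm{OR}) \pm 1.96 \times \mathrm{SE}[\ln(\mathrm{OR})]\big)
\]
An odds ratio below 1 favours the intervention only when the entire interval lies below 1.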
The available evidence on usage of corticosteroids, cycloplegics or aspirin in traumatic hyphema was limited due to the small numbers of participants and events in the trials.
We found no difference in effect between monocular and binocular patching, or between ambulation and complete bed rest, on the risk of secondary hemorrhage or time to rebleed.
Authors’ conclusions
Traumatic hyphema in the absence of other intraocular injuries uncommonly leads to permanent loss of vision. Complications resulting from secondary hemorrhage could lead to permanent impairment of vision, especially in patients with sickle cell trait/disease. We found no evidence of an effect on visual acuity by any of the interventions evaluated in this review. Although the evidence is limited, it appears that patients with traumatic hyphema who receive aminocaproic acid or tranexamic acid are less likely to experience secondary hemorrhage. However, hyphemas in patients taking aminocaproic acid take longer to clear.
Other than the possible benefits of antifibrinolytic usage to reduce the rate of secondary hemorrhage, the decision to use corticosteroids, cycloplegics, or non-drug interventions (such as binocular patching, bed rest, or head elevation) should remain individualized because no solid scientific evidence supports a benefit. As these multiple interventions are rarely used in isolation, further research to assess the additive effect of these interventions might be of value.
doi:10.1002/14651858.CD005431.pub2
PMCID: PMC3437611  PMID: 21249670
3.  Intrastromal Corneal Ring Implants for Corneal Thinning Disorders 
Executive Summary
Objective
The purpose of this project was to determine the role of corneal implants in the management of corneal thinning disease conditions. An evidence-based review was conducted to determine the safety, effectiveness and durability of corneal implants for the management of corneal thinning disorders. The evolving directions of research in this area were also reviewed.
Subject of the Evidence-Based Analysis
The primary treatment objectives for corneal implants are to normalize corneal surface topography, improve contact lens tolerability, and restore visual acuity in order to delay or defer the need for corneal transplant. Implant placement is a minimally invasive procedure that is purported to be safe and effective. The procedure is also claimed to be adjustable and reversible, and to allow both eyes to be treated at the same time. Further, implants do not limit the performance of subsequent surgical approaches or interfere with corneal transplant. The evidence for these claims is the focus of this review.
The specific research questions for the evidence review were as follows:
Safety
Corneal Surface Topographic Effects:
Effects on corneal surface remodelling
Impact of these changes on subsequent interventions, particularly corneal transplantation (penetrating keratoplasty [PKP])
Visual Acuity
Refractive Outcomes
Visual Quality (Symptoms): such as contrast vision or decreased visual symptoms (halos, fluctuating vision)
Contact lens tolerance
Functional visual rehabilitation and quality of life
Patient satisfaction
Disease Process:
Impact on corneal thinning process
Effect on delaying or deferring the need for corneal transplantation
Clinical Need: Target Population and Condition
Corneal ectasia (thinning) comprises a range of disorders involving either primary disease conditions, such as keratoconus and pellucid marginal corneal degeneration, or secondary iatrogenic conditions, such as corneal thinning occurring after LASIK refractive surgery. The condition occurs when the normally round, dome-shaped cornea progressively thins, causing a cone-like bulge or forward protrusion in response to the normal pressure of the eye. Thinning occurs primarily in the stromal layers and is believed to result from a breakdown of the collagen network. This bulging can lead to an irregular shape or astigmatism of the cornea and, because the anterior part of the cornea is largely responsible for focusing light on the retina, results in loss of visual acuity. This can make even simple daily tasks, such as driving, watching television or reading, difficult to perform.
Keratoconus (KC) is the most common form of corneal thinning disorder and is a noninflammatory chronic disease process. Although the specific causes of the biomechanical alterations that occur in KC are unknown, there is a growing body of evidence to suggest that genetic factors may play an important role. KC is a rare condition (<0.05% of the population) and is unique among chronic eye diseases as it has an early age of onset (median age of 25 years). Disease management for this condition follows a step-wise approach depending on disease severity. Contact lenses are the primary treatment of choice when there is irregular astigmatism associated with the disease. When patients can no longer tolerate contact lenses or when lenses no longer provide adequate vision, patients are referred for corneal transplant.
Keratoconus is one of the leading indications for corneal transplants and has been so for the last three decades. Yet, despite high graft survival rates of up to 20 years, there are reasons to defer receiving transplants for as long as possible. Patients with keratoconus are generally young, so long-term graft survival is an important consideration. The surgery itself involves lengthy time off work, and there are potential complications from long-term steroid use following surgery, as well as the risk of developing secondary cataracts, glaucoma, and other conditions. After transplant, recurrent KC is possible, with a need for subsequent intervention. Residual refractive errors and astigmatism can remain challenging after transplantation, and high rates of refractive surgery and re-grafting in KC patients have been reported. Visual rehabilitation or recovery of visual acuity after transplant may be slow and/or unsatisfactory to patients.
Description of Technology/Therapy
INTACS® (Addition Technology Inc., Sunnyvale, CA, formerly KeraVision, Inc.) are the only corneal implants currently licensed in Canada. The implants are micro-thin polymethyl methacrylate crescent-shaped ring segments with a circumference arc length of 150 degrees, an external diameter of 8.10 mm, an inner diameter of 6.77 mm, and a range of different thicknesses. Implants act as passive spacers and, when placed in the cornea, cause local separation of the corneal lamellae, resulting in a shortening of the arc length of the anterior corneal curvature and a flattening of the central cornea. Increasing segment thickness results in greater lamellar separation with increased flattening of the cornea, correcting for myopia by decreasing the optical power of the eye. Corneal implants also improve corneal astigmatism, but the mechanism of action for this is less well understood.
Treatment with corneal implants is considered for patients who are contact lens intolerant, have adequate corneal thickness (particularly around the implant incision site), and have no central corneal scarring. Those with central corneal scarring would not benefit from implants, and those without adequate corneal thickness, particularly in the region where the implants are being inserted, would be at increased risk of corneal perforation. Patients who desire visual rehabilitation that does not include glasses or contact lenses would not be candidates for corneal ring implants.
Placement of the implants is an outpatient procedure performed under topical anesthesia, generally by either corneal specialists or refractive surgeons. It involves creating tunnels in the corneal stroma to hold the implants, using either a diamond knife or a laser calibrated to an approximate depth of 70% of the corneal thickness. Variable approaches have been employed by surgeons in selecting ring segment size, number, and position. Generally, two segments of equal thickness are placed superiorly and inferiorly to manage symmetrical patterns of corneal thinning, whereas one segment may be placed to manage asymmetric thinning patterns.
Following implantation, the major safety concerns are for potential adverse events including corneal perforation, infection, corneal infiltrates, corneal neovascularization, ring migration and extrusion and corneal thinning. Technical results can be unsatisfactory for several reasons. Treatment may result in an over or under-correction of refraction and may induce astigmatism or asymmetry of the cornea.
Progression of the corneal cone with corneal opacities is also invariably an indication for progression to corneal transplant. Other reasons for treatment failure or patient dissatisfaction include foreign body sensation, unsatisfactory visual quality with symptoms such as double vision, fluctuating vision, poor night vision or visual side effects related to ring edge or induced or unresolved astigmatism.
Evidence-Based Analysis Methods
The literature search strategy employed keywords and subject headings to capture the concepts of 1) intrastromal corneal rings and 2) corneal diseases, with a focus on keratoconus, astigmatism, and corneal ectasia. The initial search was run on April 17, 2008, and a final search was run on March 6, 2009 in the following databases: Ovid MEDLINE (1996 to February Week 4 2009), OVID MEDLINE In-Process and Other Non-Indexed Citations, EMBASE (1980 to 2009 Week 10), OVID Cochrane Library, and the Centre for Reviews and Dissemination/International Agency for Health Technology Assessment. Parallel search strategies were developed for the remaining databases. Search results were limited to English-language studies in humans published between January 2000 and April 17, 2008. The resulting citations were downloaded into Reference Manager, v.11 (ISI Researchsoft, Thomson Scientific, U.S.A.), and duplicates were removed. The Web sites of several other health technology agencies were also reviewed, including the Canadian Agency for Drugs and Technologies in Health (CADTH), ECRI, and the United Kingdom National Institute for Clinical Excellence (NICE). The bibliographies of relevant articles were scanned.
Inclusion Criteria
English language reports and human studies
Any corneal thinning disorder
Reports with corneal implants used alone or in conjunction with other interventions
Original reports with defined study methodology
Reports including standardized measurements on outcome events such as technical success, safety, effectiveness, durability, vision quality of life or patient satisfaction
Case reports or case series for complications and adverse events
Exclusion Criteria
Non-systematic reviews, letters, comments and editorials
Reports not involving outcome events such as safety, effectiveness, durability, vision quality or patient satisfaction following an intervention with corneal implants
Reports not involving corneal thinning disorders and an intervention with corneal implants
Summary of Findings
In the MAS evidence review on intrastromal corneal ring implants, 66 reports were identified on the use of implants for management of corneal thinning disorders. Reports varied according to their primary clinical indication, type of corneal implant, and whether or not secondary procedures were used in conjunction with the implants. Implants were reported to manage post-LASIK thinning and/or uncorrected refractive error and were also reported as an adjunctive intervention both during and after corneal transplant to manage recurrent thinning and/or uncorrected refractive error.
Ten pre-post cohort longitudinal follow-up studies were identified examining the safety and effectiveness of INTACS corneal implants in patients with keratoconus. Five additional cohort studies were identified using the Ferrara implant for keratoconus management, but because this corneal implant is not licensed in Canada these studies were not reviewed.
The cohorts implanted with INTACS involved 608 keratoconus patients (754 eyes) followed for 1, 2 or 3 years. Three of the reports involved ≥ 2 years of follow-up, with the longest having 5-year follow-up data for a small number of patients. Four of the INTACS cohort studies involved 50 or more patients; the largest involved 255 patients. Inclusion criteria for the studies were consistent and included patients who were contact lens intolerant, had adequate corneal thickness (particularly around the implant incision site), and had no central corneal scarring. Disease severity, thinning pattern, and corneal cone protrusions all varied and generally required different treatment approaches involving defined segment sizes and locations.
A wide range of outcome measures was reported in the cohort studies. High levels of technical success, or ability to place INTACS segments, were reported. Technically related complications were often delayed and generally reported as segment migration attributable to early experience. Overall, complications were infrequently reported and largely involved minor reversible events without clinical sequelae.
The outcomes reported across studies involved statistically significant and clinically relevant improvements in corneal topography, refraction, and visual acuity, for both uncorrected and best-corrected visual acuity. Patients’ vision was usually restored to within normal functioning levels, and for those not achieving satisfactory correction, insertion of intraocular lenses was reported in case studies to result in additional gains in visual acuity. Vision loss (infrequently reported) was usually reversed by implant exchange or removal. The primary effects of INTACS on corneal surface remodelling were consistent with secondary improvements in refractive error and visual acuity. The improvements in visual acuity and refractive error noted at 6 months were maintained at 1- and 2-year follow-up.
Improvements in visual acuity and refractive error following insertion of INTACS, however, were not noted for all patients. Although improvements were not found to vary across age groups, there were differences across stages of disease. Several reports suggested that improvements in visual acuity and refractive outcomes may not be as large or predictable in more advanced stages of KC. Some studies have suggested that the effects of INTACS were much greater in flattening the corneal surface than in correcting astigmatism. However, these studies involved small numbers of high-risk patients in advanced stages of KC, and conclusions drawn from this group are limited.
INTACS were also used for indications other than primary KC. The results of implant insertion on corneal topography, refraction, and visual acuity in post-LASIK thinning cases were similar to those reported for KC. The evidence for this indication, however, only involved case reports and small case series. INTACS were also successfully used to treat recurrent KC after corneal transplant, but this was based on only a single case report. Corneal implants were compared to corneal transplantation, but these studies were not randomized and were based on small numbers of selected patients.
The foremost limitation of the evidence base is the basic study design: the reports involved longitudinal follow-up only for the treated group, and there were no randomized trials. Follow-up in the studies (although at prescribed intervals) often had incomplete accounting of losses to follow-up, and estimates of change were often not reported or were based only on group differences. Second, although standardized outcome measures were reported, contact lens tolerance (a key treatment objective) was infrequently specified. A third general limitation was the lack of reporting of patients’ satisfaction with their vision quality or functional vision. Outcome measures for vision quality and impact on patient quality of life were available but rarely reported, and this has been noted to be a limitation in the ophthalmological literature in general. Fourth, the longitudinal cohort studies have not followed patients long enough to evaluate the impact of implants on the underlying disease process (follow-up beyond 3 years is limited). Additionally, only a few of these studies directly examined corneal thinning in follow-up. The overall quality of evidence, determined using the GRADE hierarchy of evidence, was moderate.
There is some evidence in these studies to support the claim that corneal implants do not interfere with, or increase the difficulty of, subsequent corneal transplant, at least for those performed shortly after INTACS placement. Although it is uncertain for how long implants can delay the need for a corneal transplant, given that patients with KC are often young (in their twenties and thirties), delaying transplant for any number of years may still be a valuable consideration.
Conclusion
The clinical indications for corneal implants have evolved from management of myopia in normal eyes to the management of corneal thinning disorders such as KC and thinning occurring after refractive surgery. Despite the limited evidence base for corneal implants, which consists solely of longitudinal follow-up studies, they appear to be a valuable clinical tool for improving vision in patients with corneal thinning. For patients unable to achieve functional vision, corneal implants achieved statistically significant and clinically relevant improvements in corneal topography, refraction, and visual acuity, providing a useful alternative to corneal transplant. Implants may also have a rescue function, treating corneal thinning occurring after refractive surgery in normal eyes, or managing refractive errors following corneal transplant. The treatment offers several advantages in that it is an outpatient-based procedure, is associated with minimal risk, and has high technical success rates. Both eyes can be treated at once, and the treatment is adjustable and reversible. The implants can be removed or exchanged to improve vision without limiting subsequent interventions, particularly corneal transplant.
Better reporting on vision quality, functional vision, and patient satisfaction, however, would improve evaluation of the impact of these devices. Information on the durability of the implants’ treatment effects and their effects on the underlying disease process is limited. This information is becoming more important as alternative treatment strategies emerge, such as collagen cross-linking aimed at strengthening the underlying corneal tissue, which might prove to be more effective than the implants or increase their effectiveness, particularly in advanced stages of corneal thinning.
Ontario Health System Considerations
At present there are approximately 70 ophthalmologists in Canada who have had training with corneal implants; 30 of these practice in Ontario. Industry currently sponsors the training, proctoring, and support for the procedure. The cost of the implant device ranges from $950 to $1,200 (CAD), and costs for instrumentation range from $20,000 to $30,000 (CAD) (a one-time capital expenditure). There is no physician services fee code for corneal implants in Ontario, but assuming the fees are no higher than those for a corneal transplant, the estimated surgical cost would be $914.32 (CAD). The estimated average cost per patient for treatment, based on device costs and surgical fees, is $1,964 (CAD) per eye (range $1,814 to $2,114). There have also been no out-of-province treatment requests. In Ontario the treatment is currently being offered in private clinics, and an increasing number of ophthalmologists are being certified in the technique by the manufacturer.
KC is a rare disease, and not all of these patients would be eligible candidates for treatment with corneal implants. Based on published population rates of KC occurrence, it can be expected that there is a prevalent population of approximately 6,545 patients and an incident population of 240 newly diagnosed cases per year. Given this small number of potential cases, the use of corneal implants would not be expected to have much impact on the Ontario healthcare system. The potential impact on the provincial budget for managing the incident population, assuming the most conservative scenario (i.e., all are eligible and all receive bilateral implants), ranges from $923 thousand to $1.1 million (CAD). This estimate would vary based on a variety of criteria including eligibility, unilateral or bilateral interventions, re-interventions, capacity, and uptake.
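As a rough order-of-magnitude check using only the figures quoted above (the Secretariat's actual model inputs are not reported here, so this does not reproduce the exact endpoints of the stated range):
\[
240\ \text{incident cases/year} \times 2\ \text{eyes} \times \$1{,}964\ \text{per eye} \approx \$943{,}000\ \text{(CAD) per year},
\]
which lies within the reported range of $923 thousand to $1.1 million (CAD).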
Keywords
Keratoconus, corneal implants, corneal topography, corneal transplant, visual acuity, refractive error
PMCID: PMC3385416  PMID: 23074513
4.  Age related macular degeneration 
Clinical Evidence  2007;2007:0701.
Introduction
Sight-threatening (late) age-related macular degeneration (AMD) occurs in 2% of people aged over 50 years in industrialised countries, with prevalence increasing with age. Early-stage disease is marked by normal vision, but retinal changes (drusen and pigment changes). Disease progression leads to worsening central vision, but peripheral vision is preserved.
Methods and outcomes
We conducted a systematic review and aimed to answer the following clinical questions: What are the effects of interventions to prevent progression of early- or late-stage age-related macular degeneration; and exudative age-related macular degeneration? We searched: Medline, Embase, The Cochrane Library and other important databases up to March 2006 (BMJ Clinical Evidence reviews are updated periodically, please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA).
Results
We found 45 systematic reviews, RCTs, or observational studies that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions.
Conclusions
In this systematic review we present information relating to the effectiveness and safety of the following interventions: antiangiogenesis (using pegaptanib, ranibizumab, interferon alfa-2a, or anecortave acetate), antioxidant vitamins plus zinc, external beam radiation, laser treatment to drusen, photodynamic therapy with verteporfin, submacular surgery, thermal laser photocoagulation, transpupillary thermotherapy.
Key Points
Sight-threatening (late) age related macular degeneration (AMD) occurs in 2% of people aged over 50 years in industrialised countries, with prevalence increasing with age. Early stage disease is marked by normal vision, but retinal changes (drusen and pigment changes). Disease progression leads to worsening central vision, but peripheral vision is preserved. 85% of cases are atrophic (dry) AMD, but exudative (wet) AMD, marked by choroidal neovascularisation, leads to a more rapid loss of sight. The main risk factor is age. Hypertension, smoking, and a family history of AMD are also risk factors.
High dose antioxidant vitamin and zinc supplementation may reduce progression of moderate AMD, but there is no evidence of benefit in people with no, or mild AMD, or those with established late AMD in both eyes.
CAUTION: Beta-carotene, an antioxidant vitamin used in AMD, has been linked to an increased risk of lung cancer in people at high risk of this disease.
Photodynamic treatment with verteporfin reduces the risk of developing moderate or severe loss of visual acuity and legal blindness in people with vision initially better than 20/100 or 20/200, compared with placebo. Photodynamic treatment is associated with an initial loss of vision and photosensitive reactions in a small proportion of people.
Thermal laser photocoagulation can reduce severe visual loss in people with exudative AMD. It is frequently associated with an immediate and permanent reduction in visual acuity if the lesion involves the central macula, but it remains a proven effective treatment for extrafoveal choroidal neovascularisation. About half of people treated with thermal lasers show recurrent choroidal neovascularisation within 3 years. We don't know whether laser treatment of drusen prevents progression of disease, and it may increase short-term rates of choroidal neovascularisation.
Antiangiogenesis treatment using vascular endothelial growth factor (VEGF) inhibitors such as ranibizumab or pegaptanib reduces the risk of moderate vision loss, and may improve vision at 12 and 24 months. Antiangiogenesis treatment using anecortave acetate may be as effective as photodynamic therapy in reducing vision loss.
Studies investigating external beam radiotherapy have given contradictory results, and have failed to show an overall benefit in AMD.
Subcutaneous interferon alfa-2a and submacular surgery have not been shown to improve vision, and are associated with potentially severe adverse effects.
We found no RCT evidence on the effects of transpupillary thermotherapy.
PMCID: PMC2943806  PMID: 19454069
5.  A Case-Control Study to Assess the Relationship between Poverty and Visual Impairment from Cataract in Kenya, the Philippines, and Bangladesh 
PLoS Medicine  2008;5(12):e244.
Background
The link between poverty and health is central to the Millennium Development Goals (MDGs). Poverty can be both a cause and consequence of poor health, but there are few epidemiological studies exploring this complex relationship. The aim of this study was to examine the association between visual impairment from cataract and poverty in adults in Kenya, Bangladesh, and the Philippines.
Methods and Findings
A population-based case–control study was conducted in three countries during 2005–2006. Cases were persons aged 50 y or older and visually impaired due to cataract (visual acuity < 6/24 in the better eye). Controls were persons with normal vision, age- and sex-matched to the case participants and selected from the same cluster. Household expenditure was assessed through the collection of detailed consumption data, and asset ownership and self-rated wealth were also measured. In total, 596 cases and 535 controls were included in these analyses (Kenya 142 cases, 75 controls; Bangladesh 216 cases, 279 controls; Philippines 238 cases, 180 controls). Case participants were more likely to be in the lowest quartile of per capita expenditure (PCE) compared to controls in Kenya (odds ratio = 2.3, 95% confidence interval 0.9–5.5), Bangladesh (1.9, 1.1–3.2), and the Philippines (3.1, 1.7–5.7), and there was a significant dose–response relationship across quartiles of PCE. These associations persisted after adjustment for self-rated health and social support indicators. A similar pattern was observed for the relationship of cataract visual impairment with asset ownership and self-rated wealth. There was no consistent pattern of association between PCE and level of visual impairment due to cataract, sex, or age among the three countries.
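For readers unfamiliar with the measure, the crude odds ratio comparing the lowest expenditure quartile with the highest (reference) quartile comes from a 2×2 table; the matched and adjusted estimates reported above would typically be obtained from (conditional) logistic regression, which the simple formula below does not capture:
\[
\mathrm{OR} = \frac{a\,d}{b\,c},
\]
where \(a\) and \(b\) are the numbers of cases and controls in the lowest PCE quartile, and \(c\) and \(d\) are the numbers of cases and controls in the highest (reference) quartile.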
Conclusions
Our data show that people with visual impairment due to cataract were poorer than those with normal sight in all three low-income countries studied. The MDGs are committed to the eradication of extreme poverty and provision of health care to poor people, and this study highlights the need for increased provision of cataract surgery to poor people, as they are particularly vulnerable to visual impairment from cataract.
Hannah Kuper and colleagues report a population-based case-control study conducted in three countries that found an association between poverty and visual impairment from cataract.
Editors' Summary
Background.
Globally, about 45 million people are blind. As with many other conditions, avoidable blindness (preventable or curable blindness) is a particular problem for people in developing countries—90% of blind people live in poor regions of the world. Although various infections and disorders can cause blindness, cataract is the most common cause. In cataract, which is responsible for half of all cases of blindness in the world, the lens of the eye gradually becomes cloudy. Because the lens focuses light to produce clear, sharp images, as cataract develops, vision becomes increasingly foggy or fuzzy, colors become less intense, and the ability to see shapes against a background declines. Eventually, vision may be lost completely. Cataract can be treated with an inexpensive, simple operation in which the cloudy lens is surgically removed and an artificial lens is inserted into the eye to restore vision. In developed countries, this operation is common and easily accessible but many poor countries lack the resources to provide the operation to everyone who needs it. In addition, blind people often cannot afford to travel to the hospitals where the operation, which also may come with a fee, is done.
Why Was This Study Done?
Because blindness may reduce earning potential, many experts believe that poverty and blindness (and, more generally, poor health) are inextricably linked. People become ill more often in poor countries than in wealthy countries because they have insufficient food, live in substandard housing, and have limited access to health care, education, water, and sanitation. Once they are ill, their ability to earn money may be reduced, which increases their personal poverty and slows the economic development of the whole country. Because of this potential link between health and poverty, improvements in health are at the heart of the United Nations Millennium Development Goals, a set of eight goals established in 2000 with the primary aim of reducing world poverty. However, few studies have actually investigated the complex relationship between poverty and health. Here, the researchers investigate the association between visual impairment from cataract and poverty among adults living in three low-income countries.
What Did the Researchers Do and Find?
The researchers identified nearly 600 people aged 50 y or more with severe cataract-induced visual impairment (“cases”) primarily through a survey of the population in Kenya, Bangladesh, and the Philippines. They matched each case to a normally sighted (“control”) person of similar age and sex living nearby. They then assessed a proxy for the income level, measured as “per capita expenditure” (PCE), of all the study participants (people with cataracts and controls) by collecting information about what their households consumed. The participants' housing conditions and other assets and their self-rated wealth were also measured. In all three countries, cases were more likely to be in the lowest quarter (quartile) of the range of PCEs for that country than controls. In the Philippines, for example, people with cataract-affected vision were three times more likely than normally sighted controls to have a PCE in the lowest quartile than in the highest quartile. The risk of cataract-related visual impairment increased as PCE decreased in all three countries. Similarly, severe cataract-induced visual impairment was more common in those who owned fewer assets and those with lower self-rated wealth. However, there was no consistent association between PCE and the level of cataract-induced visual impairment.
What Do These Findings Mean?
These findings show that there is an association between visual impairment caused by cataract and poverty in Kenya, Bangladesh, and the Philippines. However, because the financial circumstances of the people in this study were assessed after cataracts had impaired their sight, this study does not prove that poverty is a cause of visual impairment. A causal connection between poverty and cataract can only be shown by determining the PCEs of normally sighted people and following them for several years to see who develops cataract. Nevertheless, by confirming an association between poverty and blindness, these findings highlight the need for increased provision of cataract surgery to poor people, particularly since cataract surgery has the potential to improve the quality of life for many people in developing countries at a relatively low cost.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050244.
This study is further discussed in a PLoS Medicine Perspective by Susan Lewallen
The MedlinePlus encyclopedia contains a page on cataract, and MedlinePlus also provides a list of links to further information about cataract (in English and Spanish)
VISION 2020, a global initiative for the elimination of avoidable blindness launched by the World Health Organization and the International Agency for the Prevention of Blindness, provides information in several languages about many causes of blindness, including cataract. It also has an article available for download on blindness, poverty, and development
Information is available from the World Health Organization on health and the Millennium Development Goals (in English, French, and Spanish)
The International Centre for Eye Health carries out research and education activities to improve eye health and eliminate avoidable blindness with a focus on populations with low incomes
doi:10.1371/journal.pmed.0050244
PMCID: PMC2602716  PMID: 19090614
6.  Canine and Human Visual Cortex Intact and Responsive Despite Early Retinal Blindness from RPE65 Mutation 
PLoS Medicine  2007;4(6):e230.
Background
RPE65 is an essential molecule in the retinoid-visual cycle, and RPE65 gene mutations cause the congenital human blindness known as Leber congenital amaurosis (LCA). Somatic gene therapy delivered to the retina of blind dogs with an RPE65 mutation dramatically restores retinal physiology and has sparked international interest in human treatment trials for this incurable disease. An unanswered question is how the visual cortex responds after prolonged sensory deprivation from retinal dysfunction. We therefore studied the cortex of RPE65-mutant dogs before and after retinal gene therapy. Then, we inquired whether there is visual pathway integrity and responsivity in adult humans with LCA due to RPE65 mutations (RPE65-LCA).
Methods and Findings
RPE65-mutant dogs were studied with fMRI. Prior to therapy, retinal and subcortical responses to light were markedly diminished, and there were minimal cortical responses within the primary visual areas of the lateral gyrus (activation amplitude mean ± standard deviation [SD] = 0.07% ± 0.06% and volume = 1.3 ± 0.6 cm3). Following therapy, retinal and subcortical response restoration was accompanied by increased amplitude (0.18% ± 0.06%) and volume (8.2 ± 0.8 cm3) of activation within the lateral gyrus (p < 0.005 for both). Cortical recovery occurred rapidly (within a month of treatment) and was persistent (as long as 2.5 y after treatment). Recovery was present even when treatment was provided as late as 1–4 y of age. Human RPE65-LCA patients (ages 18–23 y) were studied with structural magnetic resonance imaging. Optic nerve diameter (3.2 ± 0.5 mm) was within the normal range (3.2 ± 0.3 mm), and occipital cortical white matter density as judged by voxel-based morphometry was slightly but significantly altered (1.3 SD below control average, p = 0.005). Functional magnetic resonance imaging in human RPE65-LCA patients revealed cortical responses with a markedly diminished activation volume (8.8 ± 1.2 cm3) compared to controls (29.7 ± 8.3 cm3, p < 0.001) when stimulated with lower intensity light. Unexpectedly, cortical response volume (41.2 ± 11.1 cm3) was comparable to normal (48.8 ± 3.1 cm3, p = 0.2) with higher intensity light stimulation.
Conclusions
Visual cortical responses dramatically improve after retinal gene therapy in the canine model of RPE65-LCA. Human RPE65-LCA patients have preserved visual pathway anatomy and detectable cortical activation despite limited visual experience. Taken together, the results support the potential for human visual benefit from retinal therapies currently being aimed at restoring vision to the congenitally blind with genetic retinal disease.
The study by Samuel Jacobson and colleagues suggests that retinal gene therapy can improve retinal, visual pathway, and visual cortex responses to light stimulation, even after prolonged periods of blindness and in congenitally blind patients.
Editors' Summary
Background.
The eye captures light but the brain is where vision is experienced. Treatments for childhood blindness at the eye level are ready, but it is unknown whether the brain will be receptive to an improved neural message. Normal vision begins as photoreceptor cells in the retina (the light-sensitive tissue lining the inside of the eye) convert visual images into electrical impulses. These impulses are sent along the optic nerve to the visual cortex, the brain region where they are interpreted. The conversion of light into electrical impulses requires the activation of a molecule called retinal, which is subsequently recycled by retinal pigment epithelium (RPE) cells neighboring the retina. One of the key enzymes of the recycling reactions is encoded by a gene called RPE65. Genetic changes (mutations) in RPE65 cause an inherited form of blindness called Leber congenital amaurosis (LCA). In this disease, retinal is not recycled and as a result, the photoreceptor cells cannot work properly and affected individuals have poor or nonexistent vision from birth. Previous studies in dog and mouse models of the human disease have demonstrated that the introduction of a functional copy of RPE65 into the RPE cells using a harmless virus (gene therapy) dramatically restores retinal activity. Very recently, a pioneering gene therapy operation took place in London (UK) where surgeons injected a functional copy of RPE65 into the retina of a man with LCA. Whether this operation results in improved vision is not known at this time.
Why Was This Study Done?
Gene therapy corrects the retinal defects in animal models of LCA but whether the visual pathway from the retina to the visual cortex of the brain can respond normally to the signals sent by the restored retina is not known. Early visual experience is thought to be necessary for the development of a functional visual cortex, so replacing the defective RPE65 gene might not improve the vision of people with LCA. In this study, the researchers have studied the visual cortex of RPE65-deficient dogs before and after gene therapy to see whether the therapy affects the activity of the visual cortex. They have also investigated visual pathway integrity and responsiveness in adults with LCA caused by RPE65 mutations. If the visual pathway is disrupted in these patients, they reasoned, gene therapy might not restore their vision.
What Did the Researchers Do and Find?
The researchers used a technique called functional magnetic resonance imaging (fMRI) to measure light-induced brain activity in RPE65-deficient dogs before and after gene therapy. They also examined the reactions of the dogs' pupils to light (in LCA, the pupils do not contract normally in response to light because there is reduced signal transmission along the visual pathway). Finally, they measured the electrical activity of the dogs' retinas in response to light flashes—the retinas of patients with LCA do not react to light. Gene therapy corrected the defective retinal and visual pathway responses to light in the RPE65-deficient dogs and, whereas before treatment there was no response in the visual cortex to light stimulation in these dogs, after treatment, its activity approached that seen in normal dogs. The recovery of cortical responses was permanent and occurred soon after treatment, even in animals that were 4 years old when treated. Next, using structural MRI, the researchers studied human patients with LCA and found that the optic nerve diameter in young adults was within the normal range and that the structure of the visual cortex was very similar to that of normal individuals. Finally, using fMRI, they found that, although the visual cortex of patients with LCA did not respond to dim light, its reaction to bright light was comparable to that of normal individuals.
What Do These Findings Mean?
The findings from the dog study indicate that retinal gene therapy rapidly improves retinal, visual pathway, and visual cortex responses to light stimulation, even in animals that have been blind for years. In other words, in the dog model of LCA at least, all the components of the visual system remain receptive to visual inputs even after long periods of visual deprivation. The findings from the human study also indicate that the visual pathway remains anatomically intact despite years of disuse and that the visual cortex can be activated in patients with LCA even though these people have very limited visual experience. Taken together, these findings suggest that successful gene therapy of the retina might restore some functional vision to people with LCA but proof will have to await the outcomes of several clinical trials ongoing or being planned in Europe and the USA.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0040230.
General information on gene therapy is available from the Oak Ridge National Laboratory
Information is provided by the BBC about gene therapy for Leber congenital amaurosis (includes an audio clip from a doctor about the operation)
The National Institutes of Health/National Eye Institute (US) provides information about an ongoing gene therapy trial of RPE65-Leber congenital amaurosis
ClinicalTrials.gov gives details on treatment trials for Leber congenital amaurosis
The Foundation Fighting Blindness has a fact sheet on Leber congenital amaurosis (site includes Microsoft Webspeak links that read some content aloud)
The Foundation for Retinal Research has a fact sheet on Leber congenital amaurosis
Find more detailed information on Leber congenital amaurosis and the gene mutations that cause it from GeneReviews
WonderBaby, information for parents of babies with Leber congenital amaurosis
doi:10.1371/journal.pmed.0040230
PMCID: PMC1896221  PMID: 17594175
7.  Fresnel prisms and their effects on visual acuity and binocularity. 
1. The visual acuity with the Fresnel membrane prism is significantly less than that with the conventional prism of the same power for all prism powers from 12 delta through 30 delta at distance and from 15 delta through 30 delta at near.
2. The difference in visual acuity between base up and base down, and between base in and base out, is not significant for either the Fresnel membrane prism or the conventional prism.
3. For both the Fresnel membrane prism and the conventional prism, the visual acuity when looking straight ahead.
4. Using Fresnel membrane prisms of the same power from different lots, the visual acuity varied significantly. The 30 delta prism caused the widest range in visual acuity.
5. When normal subjects are fitted with the higher powers of the Fresnel membrane prism, fusion and stereopsis are disrupted to such an extent that the use of this device to restore or to improve binocular vision in cases with large-angle deviations is seriously questioned.
6. Moreover, the disruption of fusion and stereopsis is abrupt and severe and does not parallel the decrease in visual acuity. The severely reduced ability to maintain fusion may be related to the optical aberrations, which, in turn, may be due to the molding process and the polyvinyl chloride molding material.
7. Though the flexibility of the membrane prism is a definite advantage, because of its proclivity to reduce visual acuity and increase aberrations its prescription for adults often must be limited to only one eye.
8. For the same reasons, in the young child with binocular vision problems the membrane prism presently available should be prescribed over both eyes only in powers less than 20 delta. When the membrane prism is to be used as a partial occluder (over one eye only), any power can be used.
9. The new Fresnel "hard" prism reduces visual acuity minimally and rarely disrupts binocularity, thus increasing the potential for prismotherapy to establish binocularity. This prism is currently available only as a trial set. Since the cosmetic appearance of the Fresnel "hard" prism is similar to that of the Fresnel membrane prism and it is easier to maintain, it would be the prism of choice (over all other types) for bilateral prescriptions in the young patient with emmetropia. The manufacturer is urged to make these prisms available to fit a special round adjustable frame, such as that developed in Europe for use with the wafer prism.
PMCID: PMC1311638  PMID: 754384
8.  Routine Eye Examinations for Persons 20-64 Years of Age 
Executive Summary
Objective
The objective of this analysis was to determine the strength of association between age, gender, ethnicity, family history of disease, and refractive error and the risk of developing glaucoma or age-related maculopathy (ARM).
Clinical Need
A routine eye exam serves a primary, secondary, and tertiary care role. In a primary care role, it allows contact with a doctor who can provide advice about eye care, which may reduce the incidence of eye disease and injury. In a secondary care role, it can, via a case-finding approach, diagnose persons with degenerative eye diseases such as glaucoma and/or AMD, and lead to earlier treatment to slow the progression of the disease. Finally, in a tertiary care role, it provides ongoing monitoring and treatment to those with diseases associated with vision loss.
Glaucoma is a progressive degenerative disease of the optic nerve, which causes gradual loss of peripheral (side) vision and, in advanced disease states, loss of central vision. Blindness may result if glaucoma is not diagnosed and managed. The prevalence of primary open angle glaucoma (POAG) ranges from 1.1% to 3.0% in Western populations, and from 4.2% to 8.8% in populations of African descent. It is estimated that up to 50% of people with glaucoma are unaware that they have the disease. In Canada, glaucoma is the second leading cause of blindness in people aged 50 years and older. Tonometry, inspection of the optic disc, and perimetry are used concurrently by physicians and optometrists to make the diagnosis of glaucoma. In general, the evidence shows that treating people with increased IOP only, with increased IOP and clinical signs of early glaucoma, or with normal-tension glaucoma can reduce the progression of disease.
Age-related maculopathy (ARM) is a degenerative disease of the macula, which is a part of the retina. Damage to the macula causes loss of central vision, affecting the ability to read, recognize faces, and move about freely. ARM can be divided into an early stage (early ARM) and a late stage (AMD). AMD is the leading cause of blindness in developed countries. The prevalence of AMD increases with increasing age. It is estimated that 1% of people 55 years of age, 5% of those aged 75 to 84 years, and 15% of those 80 years of age and older have AMD. ARM can be diagnosed during fundoscopy (ophthalmoscopy), which is a visual inspection of the retina by a physician or optometrist, or from a photograph of the retina. There is no cure or prevention for ARM. Likewise, there is currently no treatment to restore vision lost due to AMD. However, there are treatments to delay the progression of the disease and further loss of vision.
The Technology
A periodic oculo-visual assessment is defined “as an examination of the eye and vision system rendered primarily to determine if a patient has a simple refractive error (visual acuity assessment) including myopia, hypermetropia, presbyopia, anisometropia or astigmatism.” This service includes a history of the presenting complaint, past medical history, visual acuity examination, ocular motility examination, slit lamp examination of the anterior segment, ophthalmoscopy, and tonometry (measurement of IOP), and is completed by either a physician or an optometrist.
Review Strategy
The Medical Advisory Secretariat conducted a computerized search of the literature in the following databases: OVID MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, EMBASE, INAHTA, and the Cochrane Library. The search was limited to English-language articles with human subjects, published from January 2000 to March 2006. In addition, a search was conducted for published guidelines, health technology assessments, and policy decisions. Bibliographies of relevant papers were searched for additional references that may have been missed in the computerized database search. Studies were included in the review if they enrolled participants 20 years of age and older; were population-based prospective cohort studies, or population-based cross-sectional studies when prospective cohort studies were unavailable or insufficient; and determined and reported the strength of association or risk-specific prevalence or incidence rates for age, gender, ethnicity, refractive error, or family history of disease and the risk of developing glaucoma or AMD. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) system was used to summarize the overall quality of the body of evidence.
Summary of Findings
A total of 498 citations for the period January 2000 through February 2006 were retrieved, and an additional 313 were identified when the search was expanded to include articles published between 1990 and 1999. An additional 6 articles were obtained from bibliographies of relevant articles. Of these, 36 articles were retrieved for further evaluation. Upon review, 1 meta-analysis and 15 population-based epidemiological studies were accepted for this review.
Primary Open Angle Glaucoma
Age
Six cross-sectional studies and 1 prospective cohort study contributed data on the association between age and POAG. From these data it can be concluded that the prevalence and 4-year incidence of POAG increase with increasing age. The odds of having POAG are statistically significantly greater for people 50 years of age and older relative to those 40 to 49 years of age. There is an estimated 7% per year incremental odds of having POAG in persons 40 years of age and older, and 10% per year in persons 49 years of age and older. POAG is undiagnosed in up to 50% of the population. The quality of the evidence is moderate.
Gender
Five cross-sectional studies evaluated the association between gender and POAG. Consistency in estimates is lacking among studies and because of this the association between gender and prevalent POAG is inconclusive. The quality of the evidence is very low.
Ethnicity
Only 1 cross-sectional study compared the prevalence rates of POAG between black and white participants. These data suggest that prevalent glaucoma is statistically significantly greater in a black population 50 years of age and older compared with a white population of similar age. There is an overall 4-fold increase in prevalent POAG in a black population compared with a white population. This increase may be due to a confounding variable not accounted for in the analysis. The quality of the evidence is low.
Refractive Error
Four cross-sectional studies assessed the association of myopia and POAG. These data suggest an association between myopia defined as a spherical equivalent of -1.00D or worse and prevalent POAG. However, results are inconsistent regarding the statistical significance of the association when myopia is defined as a spherical equivalent of -0.5D. The quality of the evidence is very low.
Family History of POAG
Three cross-sectional studies investigated the association between family history of glaucoma and prevalent POAG. These data suggest a 2.5- to 3.0-fold increase in the odds of having POAG in persons with a family history (any first-degree relative) of POAG. The quality of the evidence is moderate.
Age-Related Maculopathy
Age
Four cohort studies evaluated the association between age and early ARM and AMD. After 55 years of age, the incidence of both early ARM and AMD increases with increasing age. Progression to AMD occurs in up to 12% of persons with early ARM. The quality of the evidence is low.
Gender
Four cohort studies evaluated the association between gender and early ARM and AMD. Gender differences in incident early ARM and incident AMD are not supported by these data. The quality of the evidence is low.
Ethnicity
One meta-analysis and 2 cross-sectional studies reported the ethnic-specific prevalence rates of ARM. The data suggest that the prevalence of early ARM is higher in a white population compared with a black population. Ethnic-specific differences in the prevalence of AMD remain inconclusive.
Refractive Error
Two cohort studies investigated the association between refractive error and the development of incident early ARM and AMD. The quality of the evidence is very low.
Family History
Two cross-sectional studies evaluated the association of family history with early ARM and AMD. Data from one study support an association between a positive family history of AMD and having AMD. The results of that study indicate an almost 4-fold increase in the odds of any AMD in a person with a family history of AMD. The quality of the evidence, as based on the GRADE criteria, is moderate.
Economic Analysis
The prevalence of glaucoma is estimated at 1 to 3% for a Caucasian population and 4.2 to 8.8% for a black population. The incidence of glaucoma is estimated at 0.5 to 2.5% per year in the literature. The percentage of people who go blind per year as a result of glaucoma is approximately 0.55%.
The total population of Ontarians aged 50 to 64 years is estimated at 2.6 million, based on the April 2006 Ontario Ministry of Finance population estimates. The range of utilization for a major eye examination in 2006/07 for this age group is estimated at 567,690 to 669,125, if coverage for major eye exams were extended to this age group. This would represent a net increase in utilization of approximately 440,116 to 541,551.
The percentage of the Ontario population categorized as black and/or having a family history of glaucoma is approximately 20%. Therefore, the estimated range of utilization for a major eye examination in 2006/07 for this sub-population is 113,538 to 138,727 (20% of the estimated range of utilization in the total population of 50- to 64-year-olds in Ontario), if coverage for major eye exams were extended to this sub-group. This would represent a net increase in utilization of approximately 88,023 to 108,310 within this sub-group.
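The utilization arithmetic above can be cross-checked. The short sketch below is illustrative only and is not part of the report; it back-calculates the baseline utilization implied by the quoted ranges, reading the net increase as projected minus current utilization.

```python
# Illustrative arithmetic only (not from the report): back-calculating the
# baseline utilization implied by the figures quoted above, where
#   net increase = projected utilization - baseline utilization.
projected = (567_690, 669_125)      # projected 2006/07 exams, ages 50-64
net_increase = (440_116, 541_551)   # quoted net increase, ages 50-64
baseline = tuple(p - n for p, n in zip(projected, net_increase))
print(baseline)                     # (127574, 127574): same implied baseline at both bounds

# The quoted sub-group net increase (88,023 to 108,310) is 20% of the total:
print(tuple(round(0.20 * n) for n in net_increase))   # (88023, 108310)
```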
Costs
The total cost of a major eye examination by a physician is $42.15, as per the 2006 Schedule of Benefits for Physician Services.(1) The total difference in cost between treating early-stage versus late-stage glaucoma was estimated at $167. The total cost per recipient was estimated at $891/person.
Current Ontario Policy
As of November 1, 2004, persons between 20 and 64 years of age are eligible for an insured eye examination once every year if they have any of the following medical conditions: diabetes mellitus type 1 or 2, glaucoma, cataract(s), retinal disease, amblyopia, visual field defects, corneal disease, or strabismus. Persons between 20 and 64 years of age who do not have any of these conditions may be eligible for an annual eye examination if they have a valid “request for major eye examination” form completed by a physician (other than the one who completed the eye exam) or a nurse practitioner working in a collaborative practice. Persons 20 to 64 years of age who are in receipt of social assistance and who do not have one of the 8 medical conditions listed above are eligible to receive an eye exam once every 2 years as a non-OHIP government-funded service. Persons 19 years of age or younger and 65 years of age or older may receive an insured eye exam once every year.
Considerations for Policy Development
As of July 17, 2006, there were 1,402 practicing optometrists in Ontario. As of December 31, 2005, there were 404 practicing ophthalmologists in Ontario. It is unknown how many third-party payers currently cover routine eye exams for persons between 20 and 64 years of age in Ontario.
PMCID: PMC3379534  PMID: 23074485
9.  Digital Enhancement of Television Signals for People with Visual Impairments: Evaluation of a Consumer Product 
Technology to improve the clarity of video for home theater viewers is available using a low-cost enhancement chip (DigiVision DV1000). The impact of such a device on the preference for enhanced video was tested in people with impaired vision and in normally sighted viewers. Viewers with impaired vision preferred the enhancement effects more than normally sighted viewers did. Preference for enhancement was correlated with loss in contrast sensitivity and visual acuity. Preference increased with increased enhancement settings (designed for those with normal vision) in the group with vision impairments. This suggests that higher enhancement levels may be of even greater benefit, and that a similar product could be designed to meet the needs of the large, growing population of elderly television viewers with impaired vision.
doi:10.1889/1.2896328
PMCID: PMC2410034  PMID: 19255610
10.  Visual acuity in unilateral cataract. 
BACKGROUND: Patching the fellow eye in infancy is a well recognised therapy to encourage visual development in the lensectomised eye in cases of unilateral congenital cataract. The possibility of iatrogenic deficits of the fellow eye was investigated by comparing the vision of these patients with untreated unilateral patients and binocularly normal controls. METHODS: Sweep visual evoked potentials (VEPs) offer a rapid and objective method for estimating grating acuity. Sweep VEPs were used to estimate acuity in 12 children aged between 4 and 16 years who had had a congenital cataract removed in the first 13 weeks of life. The acuities of aphakic and fellow phakic eye were compared with the monocular acuities of similarly aged children who have good binocular vision, and with children with severe untreated uniocular visual impairment. Recognition linear acuities were measured with a linear Bailey-Lovie logMAR chart and compared with the sweep VEP estimates. RESULTS: A significant difference was found between Bailey-Lovie acuity of the fellow eye of the patient group and the right eye of binocular controls, and the good eye of uniocular impaired patients (one way ANOVA, p < 0.01). However, this was not evident for a similar comparison with sweep VEP estimates. There was no significant difference between the right and left eye acuities in binocular controls measured by the two techniques (paired t test). CONCLUSION: A loss of recognition acuity in the fellow phakic eye of patients treated for unilateral congenital cataract has been demonstrated with a logMAR chart. This loss was not apparent in children who have severe untreated uniocular visual impairment and may therefore be an iatrogenic effect of occlusion. An acuity loss was not apparent in the patient group using the sweep VEP method. Sweep VEP techniques have a place for objectively studying acuity in infants and in those whose communication difficulties preclude other forms of behavioural test. The mean sweep VEP acuity for the control groups is 20 cpd--that is, about 6/9. When acuities higher than this are under investigation--for example, in older children, slower transient VEP recording may be more appropriate, because higher spatial frequency patterns are not as visible at higher temporal rates (for example, 8 Hz used in sweep VEP recordings).
PMCID: PMC505614  PMID: 8942375
11.  Brief Daily Periods of Unrestricted Vision Preserve Stereopsis in Strabismus 
In this study, stereopsis was preserved in optically strabismic infant monkeys by brief daily periods of normal binocular vision. This result indicates that the temporal integration properties of mechanisms responsible for vision development ensure that binocular vision development proceeds normally despite episodes of abnormal vision, but also implies that stereopsis may be preserved in human strabismic infants by providing brief daily periods of fusion prior to alignment surgery.
Purpose.
This study examines whether brief periods of binocular vision could preserve stereopsis in monkeys reared with optical strabismus.
Methods.
Starting at 4 weeks of age, six infant monkeys were reared with a total of 30 prism diopters base-in split between the eyes. Two of the six monkeys wore prisms continuously, one for 4 weeks and one for 6 weeks. Four of the six monkeys wore prisms but had 2 hours of binocular vision daily, one for 4, one for 6, and two for 16 weeks. Five normally reared monkeys provided control data. Behavioral methods were used to measure spatial contrast sensitivity, eye alignment, and stereopsis with Gabor and random dot targets.
Results.
The same pattern of results was evident for both local and global stereopsis. For monkeys treated for 4 weeks, daily periods of binocular vision rescued stereopsis from the 10-fold reduction observed with continuous optical strabismus. Six weeks of continuous strabismus resulted in stereo blindness, whereas daily periods of binocular vision limited the reduction to a twofold loss from normal. Daily periods of binocular vision preserved stereopsis over 16 weeks of optical strabismus for one of the two monkeys.
Conclusions.
Two hours of daily binocular vision largely preserves local and global stereopsis in monkeys reared with optical strabismus. During early development, the effects of normal vision are weighed more heavily than those of abnormal vision. The manner in which the effects of visual experience are integrated over time reduces the likelihood that brief episodes of abnormal vision will cause abnormal binocular vision development.
doi:10.1167/iovs.10-6891
PMCID: PMC3175955  PMID: 21398285
12.  Barriers to sight impairment certification in the UK: the example of a population with diabetes in East London 
BMC Ophthalmology  2014;14:99.
Background
This study assessed the barriers to sight impairment certification in the East London Borough of Tower Hamlets amongst patients attending the Diabetic Retinopathy Screening Service (DRSS).
Methods
All patients who attended the DRSS between 1st April 2009 and 31st March 2010 and whose recorded best corrected visual acuity (BCVA) at DRSS fulfilled the requirements for sight impairment in the UK were included. An additional 24 patients whose general practitioners (GPs) reported them to be certified blind due to no perception of light (NPL) vision were re-examined to ascertain the reason for certification and their potential social and visual aids needs.
Results
78 patients were identified with certifiable vision and were reviewed: 10 had died in the preceding 12 months, and 60 were not known to be certified. Of these 60, 57 attended further assessment: 27 were found to have non-certifiable vision, 9 were referred for further interventions, 9 were certified, and 9 were found to be eligible but declined certification. Five patients were registered due to diabetic eye disease.
Of the 24 patients reported by their GPs to have NPL vision, only 4 had true NPL; the rest had usable vision. Only two of them were certified blind due to diabetes.
Conclusions
Our data show that sight impairment certification in patients with diabetes might be underestimated and that these patients often have non-diabetes-related visual loss. We propose that data on certifiable visual impairment could serve, along with existing certification databases, as a resource for assessing quality-of-care standards and service provision for patients with diabetes.
doi:10.1186/1471-2415-14-99
PMCID: PMC4148678  PMID: 25128412
Diabetic retinopathy; Visual acuity; Sight impairment certification
13.  Intraocular Lenses for the Treatment of Age-Related Cataracts 
Executive Summary
Objective
The objective of the report is to examine the comparative effectiveness and cost-effectiveness of various intraocular lenses (IOLs) for the treatment of age-related cataracts.
Clinical Need: Target Population and Condition
A cataract is a hardening and clouding of the normally transparent crystalline lens that may result in a progressive loss of vision depending on its size, location, and density. The condition is typically bilateral, seriously compromises visual acuity and contrast sensitivity, and increases glare. Cataracts can affect people at any age; however, they usually occur as part of the natural aging process. The occurrence of cataracts increases with age, from about 12% at age 50 years to 60% at age 70. In general, approximately 50% of people 65 years of age or older have cataracts. Mild cataracts can be treated with a change in prescription glasses, while more serious symptoms are treated by surgical removal of the cataract and implantation of an IOL.
In Ontario, the estimated prevalence of cataracts increased from 697,000 in 1992 to 947,000 in 2004 (35.9% increase, 2.4% annual increase). The number of cataract surgeries per 1,000 individuals at risk of cataract increased from 64.6 in 1992 to 104.4 in 1997 (61.9% increase, 10.1% annual increase) and continued to increase steadily to 115.7 in 2004 (10.7% increase, 5.2% increase per year).
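The growth figures above can be checked with simple percent-change arithmetic; the 1997 surgery rate is given here as 104.4 per 1,000, the value consistent with the stated 61.9% rise and 10.1% annual growth from 64.6. The sketch below is illustrative only, and small differences from the report's rounded percentages are expected.

```python
# Illustrative check of the growth arithmetic quoted above; not from the report.
def pct_increase(start, end):
    return (end / start - 1) * 100

def annualized_pct(start, end, years):
    return ((end / start) ** (1 / years) - 1) * 100

print(round(pct_increase(697_000, 947_000), 1))   # ~35.9% rise in prevalence, 1992-2004
print(round(pct_increase(64.6, 104.4), 1))        # ~61.6% rise in surgeries per 1,000, 1992-1997
print(round(annualized_pct(64.6, 104.4, 5), 1))   # ~10.1% per year over that period
print(round(pct_increase(104.4, 115.7), 1))       # ~10.8% further rise, 1997-2004
```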
Description of Technology/Therapy
IOLs are classified as either monofocal, multifocal, or accommodative. Traditionally, monofocal (i.e., fixed focusing power) IOLs are available as replacement lenses, but their implantation can cause a loss of the eye’s accommodative capability (which allows variable focusing). Patients thus usually require eyeglasses after surgery for reading and near-vision tasks. Multifocal IOLs aim to improve near and distant vision and obviate the need for glasses. Potential disadvantages include reduced contrast sensitivity, halos around lights, and glare. Accommodative IOLs are designed to move with ciliary body contraction during accommodation and, therefore, offer a continuous range of vision (i.e., near, intermediate, and distant vision) without the need for glasses. Purported advantages over multifocal IOLs include the avoidance of halos and no reduction in contrast sensitivity.
Polymethyl methacrylate (PMMA) was the first material used in the fabrication of IOLs and has inherent ultraviolet-blocking abilities. PMMA IOLs are inflexible, however, and require a larger incision for implantation compared with newer foldable silicone (hydrophobic) and acrylic (hydrophobic or hydrophilic) lenses. IOLs can be further sub-classified as either aspheric or spheric, blue/violet-filtered or non-filtered, and 1-piece or 3-piece.
Methods of Evidence-Based Analysis
A literature search was conducted from January 2003 to January 2009 that included OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), The Cochrane Library, and the International Agency for Health Technology Assessment/Centre for Review and Dissemination.
Inclusion Criteria
adult patients with age-related cataracts
systematic reviews, randomized controlled trials (RCTs)
primary outcomes: distance visual acuity (best corrected distance visual acuity), near visual acuity (best distance corrected near visual acuity)
secondary outcomes: contrast sensitivity, depth of field, glare, quality of life, visual function, spectacle dependence, posterior capsule opacification.
Exclusion Criteria
studies with fewer than 20 eyes
IOLs for non-age-related cataracts
IOLs for presbyopia
studies with a mean follow-up < 6 months
studies reporting insufficient data for analysis
Comparisons of Interest
The primary comparison of interest was accommodative vs. multifocal vs. monofocal lenses.
Secondary comparisons of interest included:
tinted vs. non-tinted lenses
aspheric vs. spheric lenses
multipiece vs. single piece lenses
biomaterial A (e.g. acrylic) vs. biomaterial B (e.g. silicone) lenses
sharp vs. round edged lenses
The quality of the studies was examined according to the GRADE Working Group criteria for grading quality of evidence for interventional procedures.
Summary of Findings
The conclusions of the systematic review of IOLs for age-related cataracts are summarized in Executive Summary Table 1.
Considerations for the Ontario Health System
Procedures for crystalline lens removal and IOL insertion are insured and listed in the Ontario Schedule of Benefits.
If a particular lens is determined to be medically necessary for a patient, the cost of the lens is covered by the hospital budget. If the patient chooses a lens that has enhanced features, then the hospital may choose to charge an additional amount above the cost of the usual lens offered.
An IOL manufacturer stated that monofocal lenses comprise approximately 95% of IOL sales in Ontario and that premium lenses (e.g., multifocal/accommodative) account for about 5% of IOL sales.
A medical consultant stated that all types of lenses are currently being used in Ontario (e.g., multifocal, monofocal, accommodative, tinted, nontinted, spheric, and aspheric). Nonfoldable lenses, rarely used in routine cases, are primarily used for complicated cataract implantation situations.
Conclusions for the Systematic Review of IOLs for Age-Related Cataracts
BCDVA refers to best corrected distance visual acuity; BDCUNVA, best distance corrected unaided near visual acuity; HRQL, health related quality of life; PCO, posterior capsule opacification; VA, visual acuity.
PMCID: PMC3377510  PMID: 23074519
14.  Dominant optic atrophy 
Definition of the disease
Dominant Optic Atrophy (DOA) is a neuro-ophthalmic condition characterized by a bilateral degeneration of the optic nerves, causing insidious visual loss that typically starts during the first decade of life. The disease primarily affects the retinal ganglion cells (RGC) and their axons forming the optic nerve, which transfer the visual information from the photoreceptors to the lateral geniculate nucleus in the brain.
Epidemiology
The prevalence of the disease varies from 1/10000 in Denmark due to a founder effect, to 1/30000 in the rest of the world.
Clinical description
DOA patients usually suffer from moderate visual loss, associated with central or paracentral visual field deficits and color vision defects. The severity of the disease is highly variable, with visual acuity ranging from normal to legal blindness. On fundoscopy, the ophthalmic examination discloses isolated optic disc pallor or atrophy, related to the RGC death. About 20% of DOA patients harbour extraocular multi-systemic features, including neurosensory hearing loss or, less commonly, chronic progressive external ophthalmoplegia, myopathy, peripheral neuropathy, multiple sclerosis-like illness, spastic paraplegia, or cataracts.
Aetiology
Two genes (OPA1, OPA3) encoding inner mitochondrial membrane proteins and three loci (OPA4, OPA5, OPA8) are currently known for DOA. Additional loci and genes (OPA2, OPA6 and OPA7) are responsible for X-linked or recessive optic atrophy. All OPA genes identified so far encode mitochondrial proteins embedded in the inner membrane and ubiquitously expressed, as are the proteins mutated in Leber hereditary optic neuropathy. OPA1 mutations affect mitochondrial fusion, energy metabolism, control of apoptosis, calcium clearance, and maintenance of mitochondrial genome integrity. OPA3 mutations only affect energy metabolism and the control of apoptosis.
Diagnosis
Patients are usually diagnosed during early childhood because of bilateral, mild, otherwise unexplained visual loss related to optic disc pallor or atrophy, typically occurring in the context of a family history of DOA. Optical coherence tomography further discloses non-specific thinning of the retinal nerve fiber layer but a normal morphology of the photoreceptor layers. Abnormal visual evoked potentials and pattern ERG may also reflect the dysfunction of the RGCs and their axons. Molecular diagnosis is provided by the identification of a mutation in the OPA1 gene (75% of DOA patients) or in the OPA3 gene (1% of patients).
Prognosis
Visual loss in DOA may progress during puberty until adulthood, with very slow subsequent chronic progression in most cases. In contrast, in DOA patients with associated extra-ocular features, the visual loss may become more severe over time.
Management
To date, there is no preventative or curative treatment in DOA; severely visually impaired patients may benefit from low vision aids. Genetic counseling is commonly offered and patients are advised to avoid alcohol and tobacco consumption, as well as the use of medications that may interfere with mitochondrial metabolism. Gene and pharmacological therapies for DOA are currently under investigation.
doi:10.1186/1750-1172-7-46
PMCID: PMC3526509  PMID: 22776096
15.  Collagen Cross-Linking Using Riboflavin and Ultraviolet-A for Corneal Thinning Disorders 
Executive Summary
Objective
The main objectives for this evidence-based analysis were to determine the safety and effectiveness of photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A radiation, referred to as CXL, for the management of corneal thinning disease conditions. The comparative safety and effectiveness of corneal cross-linking with other minimally invasive treatments such as intrastromal corneal rings was also reviewed. The Medical Advisory Secretariat (MAS) evidence-based analysis was performed to support public financing decisions.
Subject of the Evidence-Based Analysis
The primary treatment objective for corneal cross-linking is to increase the strength of the corneal stroma, thereby stabilizing the underlying disease process. At the present time, it is the only procedure that treats the underlying disease condition. The proposed advantages for corneal cross-linking are that the procedure is minimally invasive, safe and effective, and it can potentially delay or defer the need for a corneal transplant. In addition, corneal cross-linking does not adversely affect subsequent surgical approaches, if they are necessary, or interfere with corneal transplants. The evidence for these claims for corneal cross-linking in the management of corneal thinning disorders such as keratoconus will be the focus of this review.
The specific research questions for the evidence review were as follows:
Technical: How technically demanding is corneal cross-linking and what are the operative risks?
Safety: What is known about the broader safety profile of corneal cross-linking?
Effectiveness - Corneal Surface Topographic Effects:
What are the corneal surface remodeling effects of corneal cross-linking?
Do these changes interfere with subsequent interventions, particularly corneal transplant known as penetrating keratoplasty (PKP)?
Effectiveness -Visual Acuity:
What impacts does the remodeling have on visual acuity?
Are these impacts predictable, stable, adjustable and durable?
Effectiveness - Refractive Outcomes: What impact does remodeling have on refractive outcomes?
Effectiveness - Visual Quality (Symptoms): What impact does corneal cross-linking have on vision quality such as contrast vision, and decreased visual symptoms (halos, fluctuating vision)?
Effectiveness - Contact lens tolerance: To what extent does contact lens intolerance improve after corneal cross-linking?
Vision-Related QOL: What is the impact of corneal cross-linking on functional visual rehabilitation and quality of life?
Patient satisfaction: Are patients satisfied with their vision following the procedure?
Disease Process:
What impact does corneal cross-linking have on the underlying corneal thinning disease process?
Does corneal cross-linking delay or defer the need for a corneal transplant?
What is the comparative safety and effectiveness of corneal cross-linking compared with other minimally invasive treatments for corneal ectasia such as intrastromal corneal rings?
Clinical Need: Target Population and Condition
Corneal ectasia (thinning) disorders represent a range of disorders involving either primary disease conditions, such as keratoconus (KC) and pellucid marginal corneal degeneration, or secondary iatrogenic conditions, such as corneal thinning occurring after laser in situ keratomileusis (LASIK) refractive surgery.
Corneal thinning is a disease that occurs when the normally round dome-shaped cornea progressively thins causing a cone-like bulge or forward protrusion in response to the normal pressure of the eye. The thinning occurs primarily in the stroma layers and is believed to be a breakdown in the collagen process. This bulging can lead to irregular astigmatism or shape of the cornea. Because the anterior part of the cornea is responsible for most of the focusing of the light on the retina, this can then result in loss of visual acuity. The reduced visual acuity can make even simple daily tasks, such as driving, watching television or reading, difficult to perform.
Keratoconus is the most common form of corneal thinning disorder and involves a noninflammatory chronic disease process of progressive corneal thinning. Although the specific cause for the biomechanical alterations in the corneal stroma is unknown, there is a growing body of evidence suggesting that genetic factors may play an important role. Keratoconus is a rare disease (< 0.05% of the population) and is unique among chronic eye diseases because it has an early onset, with a median age of 25 years. Disease management for this condition follows a step-wise approach depending on disease severity. Contact lenses are the primary treatment of choice when there is irregular astigmatism associated with the disease. Patients are referred for corneal transplants as a last option when they can no longer tolerate contact lenses or when lenses no longer provide adequate vision.
Keratoconus is one of the leading indications for corneal transplants and has been so for the last 3 decades. Despite the high success rate of corneal transplants (up to 20 years), there are reasons to defer them as long as possible. Patients with keratoconus are generally young, and a longer-term graft survival of at least 30 or 40 years may be necessary. The surgery itself involves lengthy time off work, and potential postsurgical complications include long-term steroid use, secondary cataracts, and glaucoma. After a corneal transplant, keratoconus may recur, resulting in a need for subsequent interventions. Residual refractive errors and astigmatism can remain challenges after transplantation, and high refractive surgery and regraft rates in KC patients have been reported. Visual rehabilitation or recovery of visual acuity after transplant may be slow and/or unsatisfactory to patients.
Description of Technology/Therapy
Corneal cross-linking involves the use of riboflavin (vitamin B2) and ultraviolet-A (UVA) radiation. A UVA irradiation device known as the CXL® device (license number 77989) by ACCUTECH Medical Technologies Inc. has been licensed by Health Canada as a Class II device since September 19, 2008. An illumination device that emits homogeneous UVA, in combination with any generic form of riboflavin, is licensed by Health Canada for the indication to slow or stop the progression of corneal thinning caused by progressive keratectasia, iatrogenic keratectasia after laser-assisted in situ keratomileusis (LASIK) and pellucid marginal degeneration. The same device is named the UV-X® device by IROCMedical, with approvals in Argentina, the European Union and Australia.
UVA devices all use light-emitting diodes to generate UVA at a wavelength of 360-380 nm but vary in the number of diodes (5 to 25), focusing systems, working distance, beam diameter, beam uniformity, and the extent to which the operator can vary the parameters. In Ontario, CXL is currently offered at over 15 private eye clinics by refractive surgeons and ophthalmologists.
The treatment is an outpatient procedure generally performed with topical anesthesia. It consists of several well-defined steps. The epithelial cell layer is first removed, often using a blunt spatula, over a 9.0 mm diameter under sterile conditions. This step is followed by the application of topical 0.1% riboflavin (vitamin B2) solution every 3 to 5 minutes for 25 minutes to ensure that the corneal stroma is fully penetrated. A solid-state UVA light source with a wavelength of 370 nm (the maximum absorption of riboflavin) and an irradiance of 3 mW/cm2 is used to irradiate the central cornea. Following treatment, a soft bandage lens is applied and prescriptions are given for oral pain medications, preservative-free tears, anti-inflammatory drops (preferably not nonsteroidal anti-inflammatory drugs, or NSAIDs), and antibiotic eye drops. Patients are recalled 1 week after the procedure to evaluate re-epithelialization and are followed up subsequently.
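For orientation, the total UVA dose delivered follows from irradiance multiplied by exposure time. The sketch below is a minimal illustration: the 3 mW/cm2 irradiance comes from the protocol above, while the 30-minute exposure time is an assumption made here for the example, not a value stated in this summary.

```python
# Minimal sketch of the UVA dose arithmetic implied by the protocol above.
# Irradiance (3 mW/cm^2) is from the text; the 30-minute exposure is assumed
# for illustration only and is not stated in the summary.
irradiance_mW_cm2 = 3.0
exposure_min = 30                                   # assumed exposure time
dose_J_cm2 = irradiance_mW_cm2 / 1000 * exposure_min * 60
print(f"{dose_J_cm2:.1f} J/cm^2")                   # 5.4 J/cm^2 under these assumptions
```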
Evidence-Based Analysis Methods
A literature search was conducted on photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A for the management of corneal thinning disorders, using a search strategy with appropriate keywords and subject headings for CXL, for literature published up until April 17, 2011. The literature search for this Health Technology Assessment (HTA) review was performed using the Cochrane Library, the Emergency Care Research Institute (ECRI), and the Centre for Reviews and Dissemination. The websites of several other health technology agencies were also reviewed, including the Canadian Agency for Drugs and Technologies in Health (CADTH) and the United Kingdom’s National Institute for Clinical Excellence (NICE). The databases searched included OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, and EMBASE.
As the evidence review included an intervention for a rare condition, case series and case reports, particularly for complications and adverse events, were reviewed. A total of 316 citations were identified and all abstracts were reviewed by a single reviewer for eligibility. For those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search.
Inclusion Criteria
English-language reports and human studies
patients with any corneal thinning disorder
reports with CXL procedures used alone or in conjunction with other interventions
original reports with defined study methodology
reports including standardized measurements of outcome events such as technical success, safety, effectiveness, durability, vision quality of life, or patient satisfaction
systematic reviews, meta-analyses, randomized controlled trials, observational studies, retrospective analyses, case series, or case reports for complications and adverse events
Exclusion Criteria
nonsystematic reviews, letters, comments and editorials
reports not involving outcome events such as safety, effectiveness, durability, vision quality or patient satisfaction following an intervention with corneal implants
reports not involving corneal thinning disorders and an intervention involving CXL
Summary of Evidence Findings
In the Medical Advisory Secretariat evidence review on corneal cross-linking, 65 reports (16 case reports) involving 1403 patients were identified on the use of CXL for managing corneal thinning disorders. The reports were summarized according to their primary clinical indication, whether or not secondary interventions were used in conjunction with CXL (referred to as CXL-Plus) and whether or not it was a safety-related report.
The safety review was based on information from the cohort studies evaluating effectiveness, clinical studies evaluating safety, treatment response or recovery, and published case reports of complications. Complications, such as infection and noninfectious keratitis (inflammatory response), reported in case reports, generally occurred in the first week and were successfully treated with topical antibiotics and steroids. Other complications, such as the cytotoxic effects on the targeted corneal stroma, occurred as side effects of the photo-oxidative process generated by riboflavin and ultraviolet-A and were usually reversible.
The reports on treatment effectiveness involved 15 pre-post longitudinal cohort follow-up studies, ranging from follow-up of patients’ treated eye only, to follow-up of both the treated eye and the untreated fellow eye, to follow-up of the treated eye alongside a control group not receiving treatment. One study was a 3-arm randomized controlled trial (RCT) involving 2 comparators: one comparator was a sham treatment in which one eye was treated with riboflavin only, and the other comparator was the untreated fellow eye. The outcomes reported across the studies involved statistically significant and clinically relevant improvements in corneal topography and refraction after CXL. In addition, improvements in treated eyes were accompanied by worsening outcomes in the untreated fellow eyes. Improvements in corneal topography reported at 6 months were maintained at 1- and 2-year follow-up. Visual acuity, although not always improved, was infrequently reported as vision loss. Additional procedures such as the use of intrastromal corneal ring segments, intraocular lenses, and refractive surgical practices were reported to result in additional improvements in topography and visual acuity after CXL.
Considerations for Ontario Health System
The total costs of providing CXL therapy to keratoconus patients in Ontario were calculated based on estimated physician, clinic, and medication costs. The total cost per patient was approximately $1,036 for the treatment of one eye and $1,751 for the treatment of both eyes. The prevalence of keratoconus was estimated at 4,047 patients in FY2011, with an anticipated annual incidence (new cases) of about 148. After distributing the costs of CXL therapy for the FY2011 prevalent keratoconus population over the next 3 years, the estimated average annual cost was approximately $2.1 million, of which about $1.3 million would be physician costs specifically.
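Because the split between one-eye and both-eye treatments is not reported here, the quoted $2.1 million cannot be reproduced exactly; the rough sketch below only brackets it, using the per-patient costs, prevalence, and incidence quoted above and the added assumption that incident cases are treated in the year they arise.

```python
# Rough, illustrative bracketing of the annual CXL cost estimate quoted above.
# Per-patient costs, prevalence, and incidence are from the text; the one-eye
# vs. both-eye mix is unknown, so only upper and lower bounds are shown, and
# treating incident cases each year is an assumption of this sketch.
cost_one_eye, cost_both_eyes = 1_036, 1_751
prevalent, incident_per_year = 4_047, 148

def annual_cost(cost_per_patient):
    # Prevalent pool spread over 3 years, plus one year's incident cases.
    return prevalent * cost_per_patient / 3 + incident_per_year * cost_per_patient

low = annual_cost(cost_one_eye)     # ~$1.55M if every patient had one eye treated
high = annual_cost(cost_both_eyes)  # ~$2.62M if every patient had both eyes treated
print(f"${low/1e6:.2f}M to ${high/1e6:.2f}M per year")  # brackets the ~$2.1M estimate
```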
Conclusion
Corneal cross-linking effectively stabilizes the underlying disease, and in some cases reverses disease progression as measured by key corneal topographic measures. The effects of CXL on visual acuity are less predictable, and adjunct interventions with CXL, such as intrastromal corneal ring segments, refractive surgery, and intraocular lens implants, are increasingly employed both to stabilize disease and to restore visual acuity. Although the use of adjunct interventions has been shown to result in additional clinical benefit, the order, timing, and risks of performing adjunctive interventions have not been well established.
Although there is potential for serious adverse events with corneal UVA irradiation and photochemical reactions, there have been few reported complications. Those that have occurred tended to be related to side effects of the induced photochemical reactions and were generally reversible. However, to ensure that there are minimal complications with the use of CXL and irradiation, strict adherence to defined CXL procedural protocols is essential.
Keywords
Keratoconus, corneal cross-linking, corneal topography, corneal transplant, visual acuity, refractive error.
PMCID: PMC3377552  PMID: 23074417
16.  Health Related Quality of Life after Surgical Removal of An Eye 
Purpose
This study compared the general health related quality of life (HRQOL) and the vision specific HRQOL in patients following the surgical removal of one eye who had good vision in the remaining eye to a group of binocular patients with good vision in both eyes.
Methods
The Medical Outcomes Study Short Form 12 (SF-12) and the National Eye Institute Visual Function Questionnaire (NEI VFQ) health related quality of life (HRQOL) surveys were administered to 29 patients who had surgical removal of an eye who attended an ocular prosthetics clinic and to 25 binocular persons who accompanied a patient. All subjects in each group had best corrected visual acuity of 20/40 or better. Overall statistical significance was tested using Cramer's V followed by individual t-tests for independent groups for each of the scales on the two questionnaires to determine if the means between the two groups differed statistically.
Results
The patient group had a mean age of 50.98 years (range 19 to 76). The control group had a mean age of 49.46 years (range 18 to 76). The mean time after loss of vision was 28.03 years (range 1-71 years) and the mean time from surgical removal of the eye was 23.6 years (range 0.5 to 59.5). There was an overall significant difference between the two groups on the 15 derived subscales of the two forms (Cramer's V, p = 0.0025). Three general HRQOL subscales (SF-12- mental component summary (MCS), SF-12 physical component summary (PCS), NEI VFQ-General Health) showed no differences between the two groups (p = 0.48, p = 0.81, p = 0.78 respectively). Three of the twelve vision specific NEI VFQ subscales demonstrated statistically significant differences between the patient and control groups: peripheral vision (p = 0.0006), role difficulties (p = 0.015) and the composite score (p = 0.014). Additionally, two monocular patients had given up driving compared to no binocular subjects (p = 0.056).
Conclusions
This population of monocular patients had general physical and mental HRQOL equivalent to the normal binocular group despite the surgical removal of one eye. However, the reduced vision specific HRQOL of monocular patients on the NEI VFQ indicates that there are substantial residual visual deficits even after prolonged monocular status.
doi:10.1097/IOP.0b013e318275b754
PMCID: PMC3541504  PMID: 23299809
Quality of Life; monocular blindness; enucleation; evisceration; exenteration; disability; visual field loss
17.  Visual problems in the elderly population and implications for services. 
BMJ : British Medical Journal  1992;304(6836):1226-1229.
OBJECTIVE--To determine the prevalence of visual disability and common eye disease among elderly people in inner London. DESIGN--Cross sectional random sample survey. SETTING--Inner London health centre. SUBJECTS--Random sample of people aged 65 and over taken from the practice's computerised age-sex register. MAIN OUTCOME MEASURES--Presenting binocular Snellen 6 m distance acuity and best monocular 3 m Sonksen-Silver acuity to classify prevalence of blindness by World Health Organisation criteria (less than 3/60 in better eye) and American criteria for legal blindness (better eye equal to 6/60 or less) and of low vision by WHO criteria (best acuity less than 6/18) and visual impairment by American criteria (less than 6/12 or 20/40 but greater than 6/60 or 20/200 in better eye). Principal cause of visual loss by diagnosis, referral indication by cause to hospital eye service, and proportion of cases known to primary care. RESULTS--207 of 288 (72%) eligible people were examined. 17 (8%) housebound subjects were examined at home. The prevalence of blindness was 1% by WHO criteria and 3.9% by American criteria. The prevalence of low vision (WHO criteria) was 7.7%. The prevalence of visual impairment (American criteria) was 10.6%. Cataract accounted for 75% of cases of low vision. Only eight out of 16 patients with low vision were known by their general practitioner to have an eye problem. 56 subjects (27%) would probably have benefited from refraction. Comparisons with studies in the United States and Finland suggested higher rates in this sample, mainly due to the prevalence of disabling cataract. CONCLUSION--There seems to be a considerable amount of undetected ocular disease in elderly people in the community.
PMCID: PMC1881785  PMID: 1515797
18.  Fear of falling in age-related macular degeneration 
BMC Ophthalmology  2014;14:10.
Background
Prior studies have shown age-related macular degeneration (AMD) to be associated with falls. The purpose of this study is to determine if AMD and AMD-related vision loss are associated with fear of falling, an important and distinct outcome.
Methods
Sixty-five persons with AMD with evidence of vision loss in one or both eyes and 60 glaucoma suspects with normal vision completed the University of Illinois at Chicago Fear of Falling questionnaire. Responses were Rasch analyzed. Scores were expressed in logit units, with lower scores demonstrating lesser ability and greater fear of falling.
Results
Compared to glaucoma suspect controls, AMD subjects had worse visual acuity (VA) (median better-eye VA = 20/48 vs. 20/24, p < 0.001) and worse contrast sensitivity (CS) (binocular CS = 1.9 vs. 1.5 log units, p < 0.001). AMD subjects were also older, more likely to be Caucasian, and less likely to be employed (p < 0.05 for all), but were similar with regards to other demographic and health measures. In multivariable models controlling for age, gender, body habitus, strength, and comorbid illnesses, AMD subjects reported greater fear of falling as compared to controls (β = -0.77 logits, 95% CI = -1.5 to -0.002, p = 0.045). In separate multivariable models, fear of falling increased with worse VA (β = -0.15 logits/1 line decrement, 95% CI = -0.28 to -0.03, p = 0.02) and CS (β = -0.20 logits/0.1 log unit decrement, 95% CI = -0.31 to -0.09, p = 0.001). Greater fear of falling was also associated with higher BMI, weaker grip, and more comorbid illnesses (p < 0.05 for all).
Conclusions
AMD and AMD-related vision loss are associated with greater fear of falling in the elderly. Development, validation, and implementation of methods to address falls and fear of falling for individuals with vision loss from AMD are important goals for future work.
doi:10.1186/1471-2415-14-10
PMCID: PMC3922687  PMID: 24472499
Fear of falling; Falls; Age-related macular degeneration; Disability; Older adults; Visual acuity; vision loss; Physical function; Safety
19.  Medical interventions for traumatic hyphema 
Background
Traumatic hyphema is the entry of blood into the anterior chamber (the space between the cornea and iris) subsequent to a blow or a projectile striking the eye. Hyphema uncommonly causes permanent loss of vision. Associated trauma (e.g. corneal staining, traumatic cataract, angle recession glaucoma, optic atrophy, etc.) may seriously affect vision. Such complications may lead to permanent impairment of vision. Patients with sickle cell trait/disease may be particularly susceptible to elevations of intraocular pressure. If rebleeding occurs, the rates and severity of complications increase.
Objectives
To assess the effectiveness of various medical interventions in the management of traumatic hyphema.
Search methods
We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2013, Issue 8), Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid MEDLINE Daily, Ovid OLDMEDLINE (January 1946 to August 2013), EMBASE (January 1980 to August 2013), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov) and the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). We did not use any date or language restrictions in the electronic searches for trials. We last searched the electronic databases on 30 August 2013.
Selection criteria
Two authors independently assessed the titles and abstracts of all reports identified by the electronic and manual searches. In this review, we included randomized and quasi-randomized trials that compared various medical interventions versus other medical interventions or control groups for the treatment of traumatic hyphema following closed globe trauma. We applied no restrictions regarding age, gender, severity of the closed globe trauma, or level of visual acuity at the time of enrolment.
Data collection and analysis
Two authors independently extracted the data for the primary and secondary outcomes. We entered and analyzed data using Review Manager 5. We performed meta-analyses using a fixed-effect model and reported dichotomous outcomes as odds ratios and continuous outcomes as mean differences.
Main results
We included 20 randomized and seven quasi-randomized studies with 2643 participants in this review. Interventions included antifibrinolytic agents (systemic and topical aminocaproic acid, tranexamic acid, and aminomethylbenzoic acid), corticosteroids (systemic and topical), cycloplegics, miotics, aspirin, conjugated estrogens, traditional Chinese medicine, monocular versus bilateral patching, elevation of the head, and bed rest. No intervention had a significant effect on visual acuity, whether measured at two weeks or less after the trauma or at longer time periods. The number of days for the primary hyphema to resolve appeared to be longer with the use of aminocaproic acid compared with no use, but was not altered by any other intervention.
Systemic aminocaproic acid reduced the rate of recurrent hemorrhage (odds ratio (OR) 0.25, 95% confidence interval (CI) 0.11 to 0.57), but a sensitivity analysis omitting studies not using an intention-to-treat (ITT) analysis reduced the strength of the evidence (OR 0.41, 95% CI 0.16 to 1.09). We obtained similar results for topical aminocaproic acid (OR 0.42, 95% CI 0.16 to 1.10). We found tranexamic acid had a significant effect in reducing the rate of secondary hemorrhage (OR 0.25, 95% CI 0.13 to 0.49), as did aminomethylbenzoic acid as reported in one study (OR 0.07, 95% CI 0.01 to 0.32). The evidence to support an associated reduction in the risk of complications from secondary hemorrhage (i.e. corneal blood staining, peripheral anterior synechiae, elevated intraocular pressure, and development of optic atrophy) by antifibrinolytics was limited by the small number of these events. Use of aminocaproic acid was associated with increased nausea, vomiting, and other adverse events compared with placebo. We found no difference in the number of adverse events with the use of systemic versus topical aminocaproic acid or with standard versus lower drug dose.
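For readers unfamiliar with the statistic, the sketch below shows how an odds ratio and 95% confidence interval of the kind quoted above are derived from a 2x2 table; the counts used are hypothetical and are not taken from the review.

```python
# Hedged sketch of odds-ratio arithmetic; the counts are hypothetical, not from the review.
import math

def odds_ratio_ci(a, b, c, d):
    """a/b: events/non-events in the treated arm; c/d: events/non-events in the control arm."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lo, hi

print(odds_ratio_ci(5, 95, 18, 82))  # hypothetical rebleed counts -> OR ~0.24 with its 95% CI
```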
The available evidence on usage of corticosteroids, cycloplegics, or aspirin in traumatic hyphema was limited due to the small numbers of participants and events in the trials.
We found no difference in effect between a single versus binocular patch or ambulation versus complete bed rest on the risk of secondary hemorrhage or time to rebleed.
Authors’ conclusions
Traumatic hyphema in the absence of other intraocular injuries uncommonly leads to permanent loss of vision. Complications resulting from secondary hemorrhage could lead to permanent impairment of vision, especially in patients with sickle cell trait/disease. We found no evidence of an effect on visual acuity for any of the interventions evaluated in this review. Although evidence was limited, it appears that patients with traumatic hyphema who receive aminocaproic acid or tranexamic acid are less likely to experience secondary hemorrhage. However, hyphemas in patients treated with aminocaproic acid take longer to clear.
Other than the possible benefits of antifibrinolytic usage to reduce the rate of secondary hemorrhage, the decision to use corticosteroids, cycloplegics, or nondrug interventions (such as binocular patching, bed rest, or head elevation) should remain individualized because no solid scientific evidence supports a benefit. As these multiple interventions are rarely used in isolation, further research to assess the additive effect of these interventions might be of value.
doi:10.1002/14651858.CD005431.pub3
PMCID: PMC4268787  PMID: 24302299
6-Aminocaproic Acid [therapeutic use]; Adrenal Cortex Hormones [therapeutic use]; Antifibrinolytic Agents [therapeutic use]; Aspirin [therapeutic use]; Bandages; Bed Rest; Estrogens, Conjugated (USP) [therapeutic use]; Hyphema [etiology, *therapy]; Mydriatics [therapeutic use]; Patient Positioning [methods]; Platelet Aggregation Inhibitors [therapeutic use]; Randomized Controlled Trials as Topic; Wounds, Nonpenetrating [*complications]; Humans
20.  International vision requirements for driver licensing and disability pensions: using a milestone approach in characterization of progressive eye disease 
Objective
Low vision that causes forfeiture of driver’s licenses and collection of disability pension benefits can lead to negative psychosocial and economic consequences. The purpose of this study was to review the requirements for holding a driver’s license and rules for obtaining a disability pension due to low vision. Results highlight the possibility of using a milestone approach to describe progressive eye disease.
Methods
Government and research reports, websites, and journal articles were evaluated to review rules and requirements in Germany, Spain, Italy, France, the UK, and the US.
Results
Visual acuity limits are present in all driver’s license regulations. In most countries, the visual acuity limit is 0.5. Visual field limits are included in some driver’s license regulations. In Europe, binocular visual field requirements typically follow the European Union standard of ≥120°. In the US, the visual field requirements are typically between 110° and 140°. Some countries distinguish between being partially sighted and blind in the definition of legal blindness, and in others there is only one limit.
Conclusions
Loss of driving privileges could be used as a milestone to monitor progressive eye disease. Forfeiture could be standardized as a best-corrected visual acuity of <0.5 or visual field of <120°, which is consistent in most countries. However, requirements to receive disability pensions were too variable to standardize as milestones in progressive eye disease. Implementation of the World Health Organization criteria for low vision and blindness would help to establish better comparability between countries.
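As a minimal illustration of the proposed milestone, the hypothetical helper below flags likely license forfeiture using the thresholds discussed above (best-corrected decimal acuity below 0.5 or binocular visual field below 120 degrees); the function name and interface are assumptions of this sketch, not part of any licensing regulation.

```python
# Illustrative check of the milestone proposed above; thresholds come from the
# text, while the function itself is a hypothetical helper for this sketch.
def meets_driving_standard(decimal_acuity: float, binocular_field_deg: float) -> bool:
    """Return True if vision still meets the common licensing limits discussed above."""
    return decimal_acuity >= 0.5 and binocular_field_deg >= 120

print(meets_driving_standard(0.6, 130))  # True: above both limits
print(meets_driving_standard(0.4, 130))  # False: acuity below 0.5, the milestone is reached
```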
doi:10.2147/OPTH.S15359
PMCID: PMC2999549  PMID: 21179219
driver’s license requirements; glaucoma; health outcomes; progressive eye disease
21.  Improvements in clinical and functional vision and perceived visual disability after first and second eye cataract surgery 
AIMS—To determine the improvements in clinical and functional vision and perceived visual disability after first and second eye cataract surgery.
METHODS—Clinical vision (monocular and binocular high and low contrast visual acuity, contrast sensitivity, and disability glare), functional vision (face identity and expression recognition, reading speed, word acuity, and mobility orientation), and perceived visual disability (Activities of Daily Vision Scale) were measured in 25 subjects before and after uncomplicated cataract surgery (10 first eye surgery and 15 second eye surgery) and in 10 age matched controls.
RESULTS—Significant improvements were found after surgery in clinical and functional vision and perceived visual disability. Greater improvements were found after first eye surgery than after second eye surgery. However, first eye surgery did not return all scores to age matched normal levels. There were significant improvements in several of the tests measured after second eye surgery, and all postoperative values were similar to those from age matched normals.
CONCLUSIONS—Significant improvements in clinical, functional, and perceived vision are obtained by cataract surgery. The improvements in objective measures of functional vision found in this study support previous findings of improvements in patients' perceived functional vision. In addition, these data provide support to the necessity of second eye surgery in some patients to improve certain aspects of visual function to age matched normal levels.
PMCID: PMC1722018  PMID: 9486032
22.  Reliability of Snellen charts for testing visual acuity for driving: prospective study and postal questionnaire 
BMJ : British Medical Journal  2000;321(7267):990-992.
Objectives
To assess the ability of patients with binocular 6/9 or 6/12 vision on the Snellen chart (Snellen acuity) to read a number plate at 20.5 m (the required standard for driving) and to determine how health professionals advise such patients about driving.
Design
Prospective study of patients and postal questionnaire to healthcare professionals.
Subjects
50 patients with 6/9 vision and 50 with 6/12 vision and 100 general practitioners, 100 optometrists or opticians, and 100 ophthalmologists.
Setting
Ophthalmology outpatient clinics in Sheffield.
Main outcome measures
Ability to read a number plate at 20.5 m and health professionals' advice about driving on the basis of visual acuity.
Results
26% of patients with 6/9 vision failed the number plate test, and 34% with 6/12 vision passed it. Of the general practitioners advising patients with 6/9 vision, 76% said the patients could drive, 13% said they should not drive, and 11% were unsure. Of the general practitioners advising patients with 6/12 vision, 21% said the patients could drive, 54% said they should not drive, and 25% were unsure. The level of acuity at which optometrists, opticians, and ophthalmologists would advise drivers against driving ranged from 6/9−2 (ability to read all except two letters on the 6/9 line of the Snellen chart) to less than 6/18.
Conclusions
Snellen acuity is a poor predictor of an individual's ability to meet the required visual standard for driving. Patients with 6/9 vision or less should be warned that they may fail to meet this standard, but those with 6/12 vision should not be assumed to be below the standard.
PMCID: PMC27506  PMID: 11039964
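Entry 22 reports acuity in Snellen notation (6/9, 6/12), while entry 20 uses decimal limits (0.5). The relationship is simply the Snellen fraction evaluated as a division: 6/9 ≈ 0.67 and 6/12 = 0.5, which is exactly the common 0.5 licensing limit. A minimal, illustrative converter is sketched below; note that it does not handle qualified scores such as the 6/9−2 mentioned in the Results.

    def snellen_to_decimal(notation):
        """Convert a plain Snellen fraction such as '6/9' or '20/40' to decimal acuity."""
        numerator, denominator = notation.split("/")
        return float(numerator) / float(denominator)

    for line in ("6/6", "6/9", "6/12", "6/18"):
        print(line, round(snellen_to_decimal(line), 2))
    # 6/6 1.0, 6/9 0.67, 6/12 0.5, 6/18 0.33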
23.  Visually Impaired Drivers Who Use Bioptic Telescopes: Self-Assessed Driving Skills and Agreement With On-Road Driving Evaluation 
Purpose.
To compare self-assessed driving habits and skills of licensed drivers with central visual loss who use bioptic telescopes to those of age-matched normally sighted drivers, and to examine the association between bioptic drivers' impressions of the quality of their driving and ratings by a “backseat” evaluator.
Methods.
Participants were licensed bioptic drivers (n = 23) and age-matched normally sighted drivers (n = 23). A questionnaire was administered addressing driving difficulty, space, quality, exposure, and, for bioptic drivers, whether the telescope was helpful in on-road situations. Visual acuity and contrast sensitivity were assessed. Information on ocular diagnosis, telescope characteristics, and bioptic driving experience was collected from the medical record or by interview. On-road driving performance in regular traffic conditions was rated independently by two evaluators.
Results.
Like normally sighted drivers, bioptic drivers reported no or little difficulty in many driving situations (e.g., left turns, rush hour), but reported more difficulty under poor visibility conditions and in unfamiliar areas (P < 0.05). Driving exposure was reduced in bioptic drivers (driving 250 miles per week on average vs. 410 miles per week for normally sighted drivers, P = 0.02), but driving space was similar to that of normally sighted drivers (P = 0.29). All but one bioptic driver used the telescope in at least one driving task, and 56% used the telescope in three or more tasks. Bioptic drivers' judgments about the quality of their driving were very similar to backseat evaluators' ratings.
Conclusions.
Bioptic drivers show insight into the overall quality of their driving and areas in which they experience driving difficulty. They report using the bioptic telescope while driving, contrary to previous claims that it is primarily used to pass the vision screening test at licensure.
Drivers with central vision impairment who use bioptic telescopes have insight into the overall quality of their driving and driving skills with which they have difficulty.
doi:10.1167/iovs.13-13520
PMCID: PMC3894796  PMID: 24370830
bioptic telescope; driving; low vision
24.  Prevalence and Causes of Blindness and Low Vision in Southern Sudan  
PLoS Medicine  2006;3(12):e477.
Background
Blindness and low vision are thought to be common in southern Sudan. However, the magnitude and geographical distribution are largely unknown. We aimed to estimate the prevalence of blindness and low vision, identify the main causes of blindness and low vision, and estimate targets for blindness prevention programs in Mankien payam (district), southern Sudan.
Methods and Findings
A cross-sectional survey of the population aged 5 y and above was conducted in May 2005 using two-stage cluster random sampling with probability proportional to size. The Snellen E chart was used to test visual acuity, and participants also underwent a basic eye examination. Vision status was defined using World Health Organization categories of visual impairment based on presenting visual acuity (VA). A total of 2,954 persons were enumerated and 2,499 (84.6%) examined. Prevalence of blindness (presenting VA of less than 3/60 in the better eye) was 4.1% (95% confidence interval [CI], 3.4–4.8); prevalence of low vision (presenting VA of at least 3/60 but less than 6/18 in the better eye) was 7.7% (95% CI, 6.7–8.7); whereas prevalence of monocular visual impairment (presenting VA of at least 6/18 in the better eye and VA of less than 6/18 in the other eye) was 4.4% (95% CI, 3.6–5.3). The main causes of blindness were considered to be cataract (41.2%) and trachoma (35.3%), whereas low vision was mainly caused by trachoma (58.1%) and cataract (29.3%). It is estimated that in Mankien payam 1,154 persons aged 5 y and above (lower and upper bounds = 782–1,799) are blind, and 2,291 persons (lower and upper bounds = 1,820–2,898) have low vision.
Conclusions
Blindness is a serious public health problem in Mankien, and there is an urgent need to implement comprehensive blindness prevention programs. Further surveys are essential to confirm these findings and to estimate the prevalence of blindness and low vision across the entire region of southern Sudan in order to facilitate planning of VISION 2020 objectives.
A cross-sectional survey using two-stage cluster random sampling was conducted in Mankien district, southern Sudan. The prevalence of blindness (4.1%) and of low vision (7.7%) were much higher than expected.
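The case estimates in the abstract follow from applying the survey prevalences to the size of the district population aged 5 years and above. A naive version of that extrapolation is sketched below; the population figure is an assumption chosen for illustration, and the reported bounds were derived with survey-specific methods, so this simple multiplication will not reproduce them exactly.

    def estimated_cases(prevalence, ci_low, ci_high, population):
        """Naive extrapolation: point estimate and bounds = proportion x population."""
        return (round(prevalence * population),
                round(ci_low * population),
                round(ci_high * population))

    # Assumed (hypothetical) population aged 5 y and above in Mankien payam.
    POP_5_PLUS = 28_000

    print("blind:", estimated_cases(0.041, 0.034, 0.048, POP_5_PLUS))
    print("low vision:", estimated_cases(0.077, 0.067, 0.087, POP_5_PLUS))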
Editors' Summary
Background.
Blindness is very common. The World Health Organization says that around 161 million people have at least some degree of “visual impairment,” of whom 37 million are blind. There are many causes of blindness, including infections, malnutrition, injury, and aging. Around 90% of blind people live in developing countries. It is estimated that 75% of the cases of blindness in these countries could have been prevented but, in situations where people are poor and live in remote locations, both prevention and treatment efforts are extremely difficult. In times of war and civil conflict, the problems become even more severe. In these situations, it is very hard even to get an idea of the number of people who are blind. Surveys to find this out are important as a first step toward providing prevention and treatment services. Surveys play an essential part in international efforts to fight blindness.
Why Was This Study Done?
Sudan is the largest country in Africa and one of the poorest in the world. Southern Sudan has spent most of the last five decades in a state of civil war and is a very remote region. The last information collected on the scale of the blindness problem was in the early 1980s. The researchers decided to conduct a survey in Mankien—a district of Sudan with a total population that is estimated to be around 50,000. Their aim was to estimate how many people were blind or had “low vision” and to find out the main causes of blindness. This would be useful in planning a blindness prevention programme for the district. It would also give some idea of the situation in the southern Sudan as a whole.
What Did the Researchers Do and Find?
Working under very difficult conditions, the researchers selected villages to be visited at random. A house in each selected village was chosen by spinning a pen in the middle of the village. The people in this house were examined, and then other houses were chosen, also at random. In total, 2,499 people were examined. Children under five years of age were not included in the survey.
A very high rate of blindness was found—4%. This is more than twice the level that would be expected, given what is known about the prevalence of blindness in other parts of rural Africa. The two most common causes of blindness and low vision were cataract and trachoma, each accounting for over one-third of cases. Cataract is mainly a disease of older people; the lens of the eye becomes opaque. Trachoma is caused by an infection; it is the subject of another article by the same researchers in this issue of PLoS Medicine. Trachoma was responsible for a greater proportion of cases of blindness than has been found in studies in other parts of rural Africa.
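The first sampling stage mentioned in the Methods, selection of villages with probability proportional to size, is commonly implemented as systematic PPS sampling over cumulative population totals. The sketch below shows that general technique under the assumption that village population figures are available; the village names and sizes are invented for illustration, and larger villages can legitimately be drawn more than once.

    import random

    def systematic_pps(cluster_sizes, n_clusters, seed=None):
        """Systematic probability-proportional-to-size selection of clusters.

        Step along the cumulative population total in increments of
        total / n_clusters from a random start; a cluster is selected each
        time a step lands inside its cumulative interval."""
        rng = random.Random(seed)
        total = sum(cluster_sizes.values())
        interval = total / n_clusters
        start = rng.uniform(0, interval)
        targets = [start + k * interval for k in range(n_clusters)]

        selected, cumulative, t = [], 0, 0
        for name, size in cluster_sizes.items():
            cumulative += size
            while t < n_clusters and targets[t] <= cumulative:
                selected.append(name)   # a repeated name means that cluster is drawn twice
                t += 1
        return selected

    # Invented village sizes, purely illustrative.
    villages = {"A": 1200, "B": 400, "C": 2500, "D": 800, "E": 1600}
    print(systematic_pps(villages, n_clusters=3, seed=1))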
What Do These Findings Mean?
Judging from this survey, which used a random walk technique to select households, the prevalence of blindness in this district, and possibly in the rest of southern Sudan, is extremely serious. The number of cases caused by trachoma is especially worrying. This information will help efforts to improve the situation. The implications of the study, and a discussion of the methods the researchers used, can be found in two “Perspective” articles in this issue of PLoS Medicine (by Buchan and by Kuper and Gilbert).
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0030477.
General information about blindness is available on Wikipedia, an internet encyclopedia that anyone can edit
Vision 2020 is a major international initiative to reduce blindness, in which many organizations collaborate
The World Health Organization has a Web page on blindness
Many charities provide help to blind people in developing countries, for example: Sight Savers, Lions Clubs International Foundation, Dark and Light Blind Care
A profile of Sudan will be found on the website of the BBC
doi:10.1371/journal.pmed.0030477
PMCID: PMC1702554  PMID: 17177596
25.  Association Between Depression and Functional Vision Loss in Persons 20 Years of Age or Older in the United States, NHANES 2005–2008 
JAMA ophthalmology  2013;131(5):573-581.
Importance
This study provides further evidence from a national sample to generalize the relationship between depression and vision loss to adults across the age spectrum. Better recognition of depression among people reporting reduced ability to perform routine activities of daily living due to vision loss is warranted.
Objectives
To estimate, in a national survey of US adults 20 years of age or older, the prevalence of depression among adults reporting visual function loss and among those with visual acuity impairment. The relationship between depression and vision loss has not been reported in a nationally representative sample of US adults. Previous studies have been limited to specific cohorts and predominantly focused on the older population.
Design
The National Health and Nutrition Examination Survey (NHANES) 2005–2008.
Setting
A cross-sectional, nationally representative sample of adults, with prevalence estimates weighted to represent the civilian, noninstitutionalized US population.
Participants
A total of 10 480 US adults 20 years of age or older.
Main Outcome Measures
Depression, as measured by the 9-item Patient Health Questionnaire depression scale, and vision loss, as measured by visual function using a questionnaire and by visual acuity at examination.
Results
In 2005–2008, the estimated crude prevalence of depression (9-item Patient Health Questionnaire score of ≥10) was 11.3% (95% CI, 9.7%–13.2%) among adults with self-reported visual function loss and 4.8% (95% CI, 4.0%–5.7%) among adults without. The estimated prevalence of depression was 10.7% (95% CI, 8.0%–14.3%) among adults with presenting visual acuity impairment (visual acuity worse than 20/40 in the better-seeing eye) compared with 6.8% (95% CI, 5.8%–7.8%) among adults with normal visual acuity. After controlling for age, sex, race/ethnicity, marital status, living alone or not, education, income, employment status, health insurance, body mass index, smoking, binge drinking, general health status, eyesight worry, and major chronic conditions, self-reported visual function loss remained significantly associated with depression (overall odds ratio, 1.9 [95% CI, 1.6–2.3]), whereas the association between presenting visual acuity impairment and depression was no longer statistically significant.
Conclusions and Relevance
Self-reported visual function loss, rather than loss of visual acuity, is significantly associated with depression. Health professionals should be aware of the risk of depression among persons reporting visual function loss.
doi:10.1001/jamaophthalmol.2013.2597
PMCID: PMC3772677  PMID: 23471505
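To relate the crude prevalences in the Results (11.3% depression with self-reported visual function loss vs. 4.8% without) to the adjusted odds ratio of 1.9, the back-of-the-envelope sketch below computes the unadjusted odds ratio implied by those two proportions; it comes out near 2.5 because it ignores the covariates and survey weighting used in the study's regression.

    def crude_odds_ratio(p_exposed, p_unexposed):
        """Unadjusted odds ratio from two prevalences given as proportions."""
        return (p_exposed / (1 - p_exposed)) / (p_unexposed / (1 - p_unexposed))

    # Depression prevalence with and without self-reported visual function loss.
    print(round(crude_odds_ratio(0.113, 0.048), 2))  # about 2.53, vs. adjusted OR of 1.9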