1.  First-in-Human Trial of a Novel Suprachoroidal Retinal Prosthesis 
PLoS ONE  2014;9(12):e115239.
Retinal visual prostheses (“bionic eyes”) have the potential to restore vision to blind or profoundly vision-impaired patients. The medical bionic technology used to design, manufacture and implant such prostheses is still in its relative infancy, with various technologies and surgical approaches being evaluated. We hypothesised that a suprachoroidal implant location (between the sclera and choroid of the eye) would provide significant surgical and safety benefits for patients, allowing them to maintain preoperative residual vision as well as gaining prosthetic vision input from the device. This report details the first-in-human Phase 1 trial to investigate the use of retinal implants in the suprachoroidal space in three human subjects with end-stage retinitis pigmentosa. The success of the suprachoroidal surgical approach and its associated safety benefits, coupled with twelve-month post-operative efficacy data, holds promise for the field of vision restoration.
Trial Registration
Clinicaltrials.gov NCT01603576
doi:10.1371/journal.pone.0115239
PMCID: PMC4270734  PMID: 25521292
2.  Collagen Cross-Linking Using Riboflavin and Ultraviolet-A for Corneal Thinning Disorders 
Executive Summary
Objective
The main objectives for this evidence-based analysis were to determine the safety and effectiveness of photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A radiation, referred to as CXL, for the management of corneal thinning disease conditions. The comparative safety and effectiveness of corneal cross-linking with other minimally invasive treatments such as intrastromal corneal rings was also reviewed. The Medical Advisory Secretariat (MAS) evidence-based analysis was performed to support public financing decisions.
Subject of the Evidence-Based Analysis
The primary treatment objective for corneal cross-linking is to increase the strength of the corneal stroma, thereby stabilizing the underlying disease process. At the present time, it is the only procedure that treats the underlying disease condition. The proposed advantages for corneal cross-linking are that the procedure is minimally invasive, safe and effective, and it can potentially delay or defer the need for a corneal transplant. In addition, corneal cross-linking does not adversely affect subsequent surgical approaches, if they are necessary, or interfere with corneal transplants. The evidence for these claims for corneal cross-linking in the management of corneal thinning disorders such as keratoconus will be the focus of this review.
The specific research questions for the evidence review were as follows:
Technical: How technically demanding is corneal cross-linking and what are the operative risks?
Safety: What is known about the broader safety profile of corneal cross-linking?
Effectiveness - Corneal Surface Topographic Effects:
What are the corneal surface remodeling effects of corneal cross-linking?
Do these changes interfere with subsequent interventions, particularly corneal transplant known as penetrating keratoplasty (PKP)?
Effectiveness - Visual Acuity:
What impacts does the remodeling have on visual acuity?
Are these impacts predictable, stable, adjustable and durable?
Effectiveness - Refractive Outcomes: What impact does remodeling have on refractive outcomes?
Effectiveness - Visual Quality (Symptoms): What impact does corneal cross-linking have on vision quality such as contrast vision, and decreased visual symptoms (halos, fluctuating vision)?
Effectiveness - Contact lens tolerance: To what extent does contact lens intolerance improve after corneal cross-linking?
Vision-Related QOL: What is the impact of corneal cross-linking on functional visual rehabilitation and quality of life?
Patient satisfaction: Are patients satisfied with their vision following the procedure?
Disease Process:
What impact does corneal cross-linking have on the underlying corneal thinning disease process?
Does corneal cross-linking delay or defer the need for a corneal transplant?
What is the comparative safety and effectiveness of corneal cross-linking compared with other minimally invasive treatments for corneal ectasia such as intrastromal corneal rings?
Clinical Need: Target Population and Condition
Corneal ectasia (thinning) disorders represent a range of disorders involving either primary disease conditions, such as keratoconus (KC) and pellucid marginal corneal degeneration, or secondary iatrogenic conditions, such as corneal thinning occurring after laser in situ keratomileusis (LASIK) refractive surgery.
Corneal thinning is a disease that occurs when the normally round, dome-shaped cornea progressively thins, causing a cone-like bulge or forward protrusion in response to the normal pressure of the eye. The thinning occurs primarily in the stromal layers and is believed to reflect a breakdown of the collagen network. This bulging can lead to an irregular corneal shape and irregular astigmatism. Because the anterior part of the cornea is responsible for most of the focusing of light on the retina, this can then result in loss of visual acuity. The reduced visual acuity can make even simple daily tasks, such as driving, watching television or reading, difficult to perform.
Keratoconus is the most common form of corneal thinning disorder and involves a noninflammatory chronic disease process of progressive corneal thinning. Although the specific cause for the biomechanical alterations in the corneal stroma is unknown, there is a growing body of evidence suggesting that genetic factors may play an important role. Keratoconus is a rare disease (< 0.05% of the population) and is unique among chronic eye diseases because it has an early onset, with a median age of 25 years. Disease management for this condition follows a step-wise approach depending on disease severity. Contact lenses are the primary treatment of choice when there is irregular astigmatism associated with the disease. Patients are referred for corneal transplants as a last option when they can no longer tolerate contact lenses or when lenses no longer provide adequate vision.
Keratoconus is one of the leading indications for corneal transplants and has been so for the last 3 decades. Despite the high success rate of corneal transplants (graft survival of up to 20 years), there are reasons to defer transplantation as long as possible. Patients with keratoconus are generally young, and longer-term graft survival of at least 30 or 40 years may be necessary. The surgery itself involves lengthy time off work, and potential postsurgical complications include the need for long-term steroid use, secondary cataracts, and glaucoma. After a corneal transplant, keratoconus may recur, resulting in a need for subsequent interventions. Residual refractive errors and astigmatism can remain challenges after transplantation, and high refractive surgery and regraft rates in KC patients have been reported. Visual rehabilitation or recovery of visual acuity after transplant may be slow and/or unsatisfactory to patients.
Description of Technology/Therapy
Corneal cross-linking involves the use of riboflavin (vitamin B2) and ultraviolet-A (UVA) radiation. A UVA irradiation device known as the CXL® device (license number 77989) by ACCUTECH Medical Technologies Inc. has been licensed by Health Canada as a Class II device since September 19, 2008. An illumination device that emits homogeneous UVA, in combination with any generic form of riboflavin, is licensed by Health Canada for the indication to slow or stop the progression of corneal thinning caused by progressive keratectasia, iatrogenic keratectasia after laser-assisted in situ keratomileusis (LASIK) and pellucid marginal degeneration. The same device is named the UV-X® device by IROC Medical, with approvals in Argentina, the European Union and Australia.
UVA devices all use light-emitting diodes to generate UVA at a wavelength of 360 to 380 nm but vary in the number of diodes (5 to 25), focusing systems, working distance, beam diameter, beam uniformity and the extent to which the operator can vary the parameters. In Ontario, CXL is currently offered at over 15 private eye clinics by refractive surgeons and ophthalmologists.
The treatment is an outpatient procedure generally performed with topical anesthesia. The treatment consists of several well-defined steps. The epithelial cell layer is first removed, often using a blunt spatula over a 9.0 mm diameter zone under sterile conditions. This step is followed by the application of topical 0.1% riboflavin (vitamin B2) solution every 3 to 5 minutes for 25 minutes to ensure that the corneal stroma is fully penetrated. A solid-state UVA light source with a wavelength of 370 nm (maximum absorption of riboflavin) and an irradiance of 3 mW/cm2 is used to irradiate the central cornea. Following treatment, a soft bandage lens is applied and prescriptions are given for oral pain medications, preservative-free tears, anti-inflammatory drops (preferably not nonsteroidal anti-inflammatory drugs, or NSAIDs) and antibiotic eye drops. Patients are recalled 1 week following the procedure to evaluate re-epithelialization and are followed up at subsequent intervals.
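For orientation, the total radiant exposure delivered to the cornea is simply the irradiance multiplied by the exposure time. The short Python sketch below works through that arithmetic; the 30-minute irradiation time is an assumption used only for illustration, since the paragraph above gives the irradiance but not the exposure duration.

# Radiant exposure (dose) = irradiance x exposure time.
# The irradiance (3 mW/cm2) is taken from the protocol described above;
# the 30-minute exposure time is an assumption for illustration only,
# as the duration is not stated in this summary.
irradiance_mw_per_cm2 = 3.0
exposure_minutes = 30.0  # assumed

dose_j_per_cm2 = (irradiance_mw_per_cm2 / 1000.0) * (exposure_minutes * 60.0)
print(f"Radiant exposure: {dose_j_per_cm2:.1f} J/cm2")  # 5.4 J/cm2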
Evidence-Based Analysis Methods
A literature search was conducted on photochemical corneal collagen cross-linking with riboflavin (vitamin B2) and ultraviolet-A for the management of corneal thinning disorders using a search strategy with appropriate keywords and subject headings for CXL for literature published up until April 17, 2011. The literature search for this Health Technology Assessment (HTA) review was performed using the Cochrane Library, the Emergency Care Research Institute (ECRI) and the Centre for Reviews and Dissemination. The websites of several other health technology agencies were also reviewed, including the Canadian Agency for Drugs and Technologies in Health (CADTH) and the United Kingdom’s National Institute for Clinical Excellence (NICE). The databases searched included OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, and EMBASE.
As the evidence review included an intervention for a rare condition, case series and case reports, particularly for complications and adverse events, were reviewed. A total of 316 citations were identified and all abstracts were reviewed by a single reviewer for eligibility. For those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search.
Inclusion Criteria
English-language reports and human studies
patients with any corneal thinning disorder
reports with CXL procedures used alone or in conjunction with other interventions
original reports with defined study methodology
reports including standardized measurements on outcome events such as technical success, safety, effectiveness, durability, vision quality of life or patient satisfaction
systematic reviews, meta-analyses, randomized controlled trials, observational studies, retrospective analyses, case series, or case reports for complications and adverse events
Exclusion Criteria
nonsystematic reviews, letters, comments and editorials
reports not involving outcome events such as safety, effectiveness, durability, vision quality or patient satisfaction following an intervention with CXL
reports not involving corneal thinning disorders and an intervention involving CXL
Summary of Evidence Findings
In the Medical Advisory Secretariat evidence review on corneal cross-linking, 65 reports (16 case reports) involving 1403 patients were identified on the use of CXL for managing corneal thinning disorders. The reports were summarized according to their primary clinical indication, whether or not secondary interventions were used in conjunction with CXL (referred to as CXL-Plus) and whether or not it was a safety-related report.
The safety review was based on information from the cohort studies evaluating effectiveness, clinical studies evaluating safety, treatment response or recovery, and published case reports of complications. Complications, such as infection and noninfectious keratitis (inflammatory response), reported in case reports, generally occurred in the first week and were successfully treated with topical antibiotics and steroids. Other complications, such as the cytotoxic effects on the targeted corneal stroma, occurred as side effects of the photo-oxidative process generated by riboflavin and ultraviolet-A and were usually reversible.
The reports on treatment effectiveness involved 15 pre-post longitudinal cohort follow-up studies, with designs ranging from follow-up of the treated eye only, to follow-up of both the treated eye and the untreated fellow eye, to follow-up of the treated eye alongside a control group not receiving treatment. One study was a 3-arm randomized controlled trial (RCT) involving 2 comparators: one comparator was a sham treatment in which one eye was treated with riboflavin only; the other comparator was the untreated fellow eye. The outcomes reported across the studies involved statistically significant and clinically relevant improvements in corneal topography and refraction after CXL. In addition, improvements in treated eyes were accompanied by worsening outcomes in the untreated fellow eyes. Improvements in corneal topography reported at 6 months were maintained at 1- and 2-year follow-up. Although visual acuity was not always improved, vision loss was infrequently reported. Additional procedures such as the use of intrastromal corneal ring segments, intraocular lenses and refractive surgical practices were reported to result in additional improvements in topography and visual acuity after CXL.
Considerations for Ontario Health System
The total cost of providing CXL therapy to keratoconus patients in Ontario was calculated based on estimated physician, clinic, and medication costs. The total cost per patient was approximately $1,036 for the treatment of one eye, and $1,751 for the treatment of both eyes. The prevalence of keratoconus was estimated at 4,047 patients in FY2011, with an anticipated annual incidence (new cases) of about 148 cases. After distributing the costs of CXL therapy for the FY2011 prevalent keratoconus population over the next 3 years, the estimated average annual cost was approximately $2.1 million, of which about $1.3 million would be physician costs specifically.
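Because the report quotes both the per-patient costs and the size of the prevalent population, the annual figure can be bracketed with a short calculation. The Python sketch below uses only the numbers given above; the split between unilateral and bilateral treatment is not specified, so the sketch simply shows the two bounding scenarios.

# All figures are taken from the paragraph above.
cost_one_eye = 1_036        # CAD per patient, one eye treated
cost_both_eyes = 1_751      # CAD per patient, both eyes treated
prevalent_patients = 4_047  # FY2011 prevalent keratoconus population
years_to_spread = 3         # costs distributed over 3 years

low = prevalent_patients * cost_one_eye / years_to_spread
high = prevalent_patients * cost_both_eyes / years_to_spread
print(f"If all patients had one eye treated:   ${low:,.0f} per year")   # ~ $1.4 million
print(f"If all patients had both eyes treated: ${high:,.0f} per year")  # ~ $2.4 million
# The reported estimate of ~$2.1 million per year falls between these bounds.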
Conclusion
Corneal cross-linking effectively stabilizes the underlying disease, and in some cases reverses disease progression as measured by key corneal topographic measures. The effects of CXL on visual acuity are less predictable, and adjunct interventions with CXL, such as intrastromal corneal ring segments, refractive surgery, and intraocular lens implants, are increasingly employed both to stabilize disease and to restore visual acuity. Although the use of adjunct interventions has been shown to result in additional clinical benefit, the order, timing, and risks of performing adjunctive interventions have not been well established.
Although there is potential for serious adverse events with corneal UVA irradiation and photochemical reactions, there have been few reported complications. Those that have occurred tended to be related to side effects of the induced photochemical reactions and were generally reversible. However, to ensure that there are minimal complications with the use of CXL and irradiation, strict adherence to defined CXL procedural protocols is essential.
Keywords
Keratoconus, corneal cross-linking, corneal topography, corneal transplant, visual acuity, refractive error.
PMCID: PMC3377552  PMID: 23074417
3.  Intrastromal Corneal Ring Implants for Corneal Thinning Disorders 
Executive Summary
Objective
The purpose of this project was to determine the role of corneal implants in the management of corneal thinning disease conditions. An evidence-based review was conducted to determine the safety, effectiveness and durability of corneal implants for the management of corneal thinning disorders. The evolving directions of research in this area were also reviewed.
Subject of the Evidence-Based Analysis
The primary treatment objectives for corneal implants are to normalize corneal surface topography, improve contact lens tolerability, and restore visual acuity in order to delay or defer the need for corneal transplant. Implant placement is a minimally invasive procedure that is purported to be safe and effective. The procedure is also claimed to be adjustable, reversible, and both eyes can be treated at the same time. Further, implants do not limit the performance of subsequent surgical approaches or interfere with corneal transplant. The evidence for these claims is the focus of this review.
The specific research questions for the evidence review were as follows:
Safety
Corneal Surface Topographic Effects:
Effects on corneal surface remodelling
Impact of these changes on subsequent interventions, particularly corneal transplantation (penetrating keratoplasty [PKP])
Visual Acuity
Refractive Outcomes
Visual Quality (Symptoms): such as contrast vision or decreased visual symptoms (halos, fluctuating vision)
Contact lens tolerance
Functional visual rehabilitation and quality of life
Patient satisfaction
Disease Process:
Impact on corneal thinning process
Effect on delaying or deferring the need for corneal transplantation
Clinical Need: Target Population and Condition
Corneal ectasia (thinning) comprises a range of disorders involving either primary disease conditions such as keratoconus and pellucid marginal corneal degeneration or secondary iatrogenic conditions such as corneal thinning occurring after LASIK refractive surgery. The condition occurs when the normally round dome-shaped cornea progressively thins causing a cone-like bulge or forward protrusion in response to the normal pressure of the eye. Thinning occurs primarily in the stromal layers and is believed to be a breakdown in the collagen network. This bulging can lead to an irregular shape or astigmatism of the cornea and, because the anterior part of the cornea is largely responsible for the focusing of light on the retina, results in loss of visual acuity. This can make even simple daily tasks, such as driving, watching television or reading, difficult to perform.
Keratoconus (KC) is the most common form of corneal thinning disorder and is a noninflammatory chronic disease process. Although the specific causes of the biomechanical alterations that occur in KC are unknown, there is a growing body of evidence to suggest that genetic factors may play an important role. KC is a rare condition (<0.05% of the population) and is unique among chronic eye diseases as it has an early age of onset (median age of 25 years). Disease management for this condition follows a step-wise approach depending on disease severity. Contact lenses are the primary treatment of choice when there is irregular astigmatism associated with the disease. When patients can no longer tolerate contact lenses or when lenses no longer provide adequate vision, patients are referred for corneal transplant.
Keratoconus is one of the leading indications for corneal transplants and has been so for the last three decades. Yet, despite high graft survival rates of up to 20 years, there are reasons to defer receiving transplants for as long as possible. Patients with keratoconus are generally young, and lifelong graft survival would be an important consideration. The surgery itself involves lengthy time off work, and there are potential complications from long-term steroid use following surgery, as well as the risk of developing secondary cataracts and glaucoma. After transplant, recurrent KC is possible, with a need for subsequent intervention. Residual refractive errors and astigmatism can remain challenging after transplantation, and high refractive surgery rates and re-graft rates in KC patients have been reported. Visual rehabilitation or recovery of visual acuity after transplant may be slow and/or unsatisfactory to patients.
Description of Technology/Therapy
INTACS® (Addition Technology Inc., Sunnyvale, CA, formerly KeraVision, Inc.) are the only corneal implants currently licensed in Canada. The implants are micro-thin, crescent-shaped polymethyl methacrylate ring segments with a circumference arc length of 150 degrees, an external diameter of 8.10 mm, an inner diameter of 6.77 mm, and a range of different thicknesses. Implants act as passive spacers and, when placed in the cornea, cause local separation of the corneal lamellae, resulting in a shortening of the arc length of the anterior corneal curvature and a flattening of the central cornea. Increasing segment thickness results in greater lamellar separation and increased flattening of the cornea, correcting for myopia by decreasing the optical power of the eye. Corneal implants also improve corneal astigmatism, but the mechanism of action for this is less well understood.
Treatment with corneal implants is considered for patients who are contact lens intolerant, have adequate corneal thickness (particularly around the implant incision site), and have no central corneal scarring. Those with central corneal scarring would not benefit from implants, and those without adequate corneal thickness, particularly in the region where the implants are being inserted, would be at increased risk for corneal perforation. Patients whose goal is visual rehabilitation without glasses or contact lenses would not be candidates for corneal ring implants.
Placement of the implants is an outpatient procedure performed with topical anesthesia, generally by either corneal specialists or refractive surgeons. It involves creating tunnels in the corneal stroma to hold the implants, using either a diamond knife or a laser calibrated to an approximate depth of 70% of the corneal thickness. Variable approaches have been employed by surgeons in selecting ring segment size, number and position. Generally, two segments of equal thickness are placed superiorly and inferiorly to manage symmetrical patterns of corneal thinning, whereas one segment may be placed to manage asymmetric thinning patterns.
Following implantation, the major safety concerns are potential adverse events including corneal perforation, infection, corneal infiltrates, corneal neovascularization, ring migration and extrusion, and corneal thinning. Technical results can be unsatisfactory for several reasons. Treatment may result in an over- or under-correction of refraction and may induce astigmatism or asymmetry of the cornea.
Progression of the corneal cone with corneal opacities is also invariably an indication for progression to corneal transplant. Other reasons for treatment failure or patient dissatisfaction include foreign body sensation, unsatisfactory visual quality with symptoms such as double vision, fluctuating vision, poor night vision or visual side effects related to ring edge or induced or unresolved astigmatism.
Evidence-Based Analysis Methods
The literature search strategy employed keywords and subject headings to capture the concepts of 1) intrastromal corneal rings and 2) corneal diseases, with a focus on keratoconus, astigmatism, and corneal ectasia. The initial search was run on April 17, 2008, and a final search was run on March 6, 2009 in the following databases: Ovid MEDLINE (1996 to February Week 4 2009), OVID MEDLINE In-Process and Other Non-Indexed Citations, EMBASE (1980 to 2009 Week 10), OVID Cochrane Library, and the Centre for Reviews and Dissemination/International Agency for Health Technology Assessment. Parallel search strategies were developed for the remaining databases. Search results were limited to English-language studies in humans published between January 2000 and April 17, 2008. The resulting citations were downloaded into Reference Manager, v.11 (ISI Researchsoft, Thomson Scientific, U.S.A.), and duplicates were removed. The Web sites of several other health technology agencies were also reviewed including the Canadian Agency for Drugs and Technologies in Health (CADTH), ECRI, and the United Kingdom National Institute for Clinical Excellence (NICE). The bibliographies of relevant articles were scanned.
Inclusion Criteria
English language reports and human studies
Any corneal thinning disorder
Reports with corneal implants used alone or in conjunction with other interventions
Original reports with defined study methodology
Reports including standardized measurements on outcome events such as technical success, safety, effectiveness, durability, vision quality of life or patient satisfaction
Case reports or case series for complications and adverse events
Exclusion Criteria
Non-systematic reviews, letters, comments and editorials
Reports not involving outcome events such as safety, effectiveness, durability, vision quality or patient satisfaction following an intervention with corneal implants
Reports not involving corneal thinning disorders and an intervention with corneal implants
Summary of Findings
In the MAS evidence review on intrastromal corneal ring implants, 66 reports were identified on the use of implants for management of corneal thinning disorders. Reports varied according to their primary clinical indication, type of corneal implant, and whether or not secondary procedures were used in conjunction with the implants. Implants were reported to manage post LASIK thinning and/or uncorrected refractive error and were also reported as an adjunctive intervention both during and after corneal transplant to manage recurrent thinning and/or uncorrected refractive error.
Ten pre-post cohort longitudinal follow-up studies were identified examining the safety and effectiveness of INTACS corneal implants in patients with keratoconus. Five additional cohort studies were identified using the Ferrara implant for keratoconus management, but because this corneal implant is not licensed in Canada, these studies were not reviewed.
The cohorts implanted with INTACS involved 608 keratoconus patients (754 eyes) followed for 1, 2 or 3 years. Three of the reports involved ≥ 2 years of follow-up, with the longest having 5-year follow-up data for a small number of patients. Four of the INTACS cohort studies involved 50 or more patients; the largest involved 255 patients. Inclusion criteria for the studies were consistent and included patients who were contact lens intolerant, had adequate corneal thickness (particularly around the area of the implant incision site), and had no central corneal scarring. Disease severity, thinning pattern, and corneal cone protrusions all varied and generally required different treatment approaches involving defined segment sizes and locations.
A wide range of outcome measures were reported in the cohort studies. High levels of technical success in placing INTACS segments were reported. Technically related complications were often delayed and were generally reported as segment migration attributable to early experience. Overall, complications were infrequently reported and largely involved minor, reversible events without clinical sequelae.
The outcomes reported across studies involved statistically significant and clinically relevant improvements in corneal topography, refraction and visual acuity, for both uncorrected and best-corrected visual acuity. Patients’ vision was usually restored to within normal functioning levels, and for those not achieving satisfactory correction, insertion of intraocular lenses was reported in case studies to result in additional gains in visual acuity. Vision loss (infrequently reported) was usually reversed by implant exchange or removal. The primary effects of INTACS on corneal surface remodelling were consistent with secondary improvements in refractive error and visual acuity. The improvements in visual acuity and refractive error noted at 6 months were maintained at 1- and 2-year follow-up.
Improvements in visual acuity and refractive error following insertion of INTACS, however, were not noted for all patients. Although improvements were not found to vary across age groups, there were differences across stages of disease. Several reports suggested that improvements in visual acuity and refractive outcomes may not be as large or predictable in more advanced stages of KC. Some studies have suggested that the effects of INTACS were much greater in flattening the corneal surface than in correcting astigmatism. However, these studies involved small numbers of high-risk patients in advanced stages of KC, and conclusions drawn from this group are limited.
INTACS were also used for indications other than primary KC. The results of implant insertion on corneal topography, refraction, and visual acuity in post-LASIK thinning cases were similar to those reported for KC. The evidence for this indication, however, involved only case reports and small case series. INTACS were also successfully used to treat recurrent KC after corneal transplant, but this was based on only a single case report. Corneal implants were compared with corneal transplantation, but these studies were not randomized and were based on small numbers of selected patients.
The foremost limitation of the evidence base is its basic study design: the reports involved longitudinal follow-up of the treated group only, and there were no randomized trials. Follow-up in the trials (although at prescribed intervals) often had incomplete accounts of losses to follow-up, and estimates of change were often not reported or were based on group differences. Second, although standardized outcome measures were reported, contact lens tolerance (a key treatment objective) was infrequently specified. A third general limitation was the lack of reporting of patients’ satisfaction with their vision quality or functional vision. Outcome measures for vision quality and impact on patient quality of life were available but rarely reported, and this has been noted to be a limitation in the ophthalmological literature in general. Fourth, the longitudinal cohort studies have not followed patients long enough to evaluate the impact of implants on the underlying disease process (follow-up beyond 3 years is limited). Additionally, only a few of these studies directly examined corneal thinning in follow-up. The overall quality of evidence, determined using the GRADE hierarchy of evidence, was moderate.
There is some evidence in these studies to support the claim that corneal implants do not interfere with, or increase the difficulty of, subsequent corneal transplant, at least for those performed shortly after INTACS placement. Although it is uncertain for how long implants can delay the need for a corneal transplant, given that patients with KC are often young (in their twenties and thirties), delaying transplant for any number of years may still be a valuable consideration.
Conclusion
The clinical indications for corneal implants have evolved from the management of myopia in normal eyes to the management of corneal thinning disorders such as KC and thinning occurring after refractive surgery. Despite the limited evidence base for corneal implants, which consists solely of longitudinal follow-up studies, they appear to be a valuable clinical tool for improving vision in patients with corneal thinning. For patients unable to achieve functional vision, corneal implants achieved statistically significant and clinically relevant improvements in corneal topography, refraction, and visual acuity, providing a useful alternative to corneal transplant. Implants may also have a rescue function, treating corneal thinning occurring after refractive surgery in normal eyes, or managing refractive errors following corneal transplant. The treatment offers several advantages: it is an outpatient procedure, is associated with minimal risk, and has high technical success rates. Both eyes can be treated at once, and the treatment is adjustable and reversible. The implants can be removed or exchanged to improve vision without limiting subsequent interventions, particularly corneal transplant.
Better reporting on vision quality, functional vision and patient satisfaction, however, would improve evaluation of the impact of these devices. Information on the durability of the implants’ treatment effects and their effects on underlying disease processes is limited. This information is becoming more important as alternative treatment strategies emerge, such as collagen cross-linking aimed at strengthening the underlying corneal tissue, which might prove to be more effective than implants or might increase the effectiveness of the implants, particularly in advanced stages of corneal thinning.
Ontario Health System Considerations
At present there are approximately 70 ophthalmologists in Canada who have had training with corneal implants; 30 of these practice in Ontario. Industry currently sponsors the training, proctoring and support for the procedure. The cost of the implant device ranges from $950 to $1,200 (CAD) and costs for instrumentation range from $20,000 to $30,000 (CAD) (a one-time capital expenditure). There is no physician services fee code for corneal implants in Ontario, but assuming the fees would be no higher than those for a corneal transplant, the estimated surgical cost would be $914.32 (CAD). An estimated average cost per patient, based on device costs and surgical fees, is $1,964 (CAD) (range $1,814 to $2,114) per eye. There have been no out-of-province treatment requests. In Ontario the treatment is currently being offered in private clinics, and an increasing number of ophthalmologists are being certified in the technique by the manufacturer.
KC is a rare disease and not all of these patients would be eligible candidates for treatment with corneal implants. Based on published population rates of KC occurrence, it can be expected that there is a prevalent population of approximately 6,545 patients and an incident population of 240 newly diagnosed cases per year. Given this small number of potential cases, the use of corneal implants would not be expected to have much impact on the Ontario healthcare system. The potential impact on the provincial budget for managing the incident population, assuming the most conservative scenario (i.e., all are eligible and all receive bilateral implants), ranges from $923 thousand to $1.1 million (CAD). This estimate would vary based on a variety of criteria including eligibility, unilateral or bilateral interventions, re-interventions, capacity and uptake.
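As a rough cross-check on the budget figure, the bilateral-implant scenario can be reconstructed from the per-eye cost estimate quoted earlier. The Python sketch below uses only the figures from this section; the published range presumably also reflects assumptions about eligibility and uptake that are not enumerated here.

# Figures taken from this section of the report.
incident_cases_per_year = 240       # newly diagnosed KC cases per year
eyes_per_patient = 2                # most conservative scenario: all bilateral
cost_per_eye_mid = 1_964            # CAD, estimated average cost per eye
cost_per_eye_low, cost_per_eye_high = 1_814, 2_114  # reported range per eye

mid = incident_cases_per_year * eyes_per_patient * cost_per_eye_mid
low = incident_cases_per_year * eyes_per_patient * cost_per_eye_low
high = incident_cases_per_year * eyes_per_patient * cost_per_eye_high
print(f"Annual budget impact: ~${mid:,} (range ${low:,} to ${high:,})")
# ~$942,720 (range $870,720 to $1,014,720), broadly consistent with the
# reported $923 thousand to $1.1 million (CAD).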
Keywords
Keratoconus, corneal implants, corneal topography, corneal transplant, visual acuity, refractive error
PMCID: PMC3385416  PMID: 23074513
4.  Reading aids for adults with low vision 
Background
The purpose of low-vision rehabilitation is to allow people to resume or to continue to perform daily living tasks, with reading being one of the most important. This is achieved by providing appropriate optical devices and special training in the use of residual-vision and low-vision aids, which range from simple optical magnifiers to high-magnification video magnifiers.
Objectives
To assess the effects of reading aids for adults with low vision.
Search methods
We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2013, Issue 1), Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid MEDLINE Daily, Ovid OLDMEDLINE (January 1950 to January 2013), EMBASE (January 1980 to January 2013), Latin American and Caribbean Literature on Health Sciences (LILACS) (January 1982 to January 2013), OpenGrey (System for Information on Grey Literature in Europe) (www.opengrey.eu/), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov/) and the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). We did not use any date or language restrictions in the electronic searches for trials. We last searched the electronic databases on 31 January 2013. We searched the reference lists of relevant articles and used the Science Citation Index to find articles that cited the included studies and contacted investigators and manufacturers of low-vision aids. We handsearched the British Journal of Visual Impairment from 1983 to 1999 and the Journal of Visual Impairment and Blindness from 1976 to 1991.
Selection criteria
This review includes randomised and quasi-randomised trials in which any device or aid used for reading had been compared to another device or aid in people aged 16 or over with low vision as defined by the study investigators.
Data collection and analysis
At least two authors independently assessed trial quality and extracted data.
Main results
We included nine small studies with a cross-over-like design (181 people overall) and one study with three parallel arms (243 participants) in the review. All studies reported the primary outcome, reading speed.
Two studies including 92 participants found moderate- or low-quality evidence suggesting that reading speed is higher with stand-mounted electronic devices or electronic devices with the camera mounted in a ‘mouse’ than with optical magnifiers, which in these trials were generally stand-mounted or, less frequently, hand-held magnifiers or microscopic lenses. In another study of 20 participants there was moderate-quality evidence that optical devices are better than head-mounted electronic devices (four types).
There was low-quality evidence from three studies (93 participants) that reading using head-mounted electronic devices is slower than with stand-based electronic devices. The technology of electronic devices may have changed and improved since these studies were conducted.
One study suggested no difference between a diffractive spectacle-mounted magnifier and either refractive (15 participants) or aplanatic (15 participants) magnifiers.
One study of 10 people suggested that several overlay coloured filters were no better and possibly worse than a clear filter.
A parallel-arm study including 243 participants with age-related macular degeneration found that custom or standard prism spectacles were no different from conventional reading spectacles, although the data did not allow precise estimates of performance to be made.
Authors' conclusions
There is insufficient evidence on the effect of different types of low-vision aids on reading performance. It would be necessary to investigate which patient characteristics predict performance with different devices, including costly electronic devices. Better-quality research should also focus on assessing sustained long-term use of each device. Authors of studies testing several devices on the same person should consider design and reporting issues related to their sequential presentation and to the cross-over-like study design.
doi:10.1002/14651858.CD003303.pub3
PMCID: PMC4288929  PMID: 24154864
*Reading; *Sensory Aids; Eyeglasses; Lenses; Macular Degeneration [complications]; Optical Devices [*standards]; Randomized Controlled Trials as Topic; Vision, Low [*rehabilitation]; Visual Acuity; Visually Impaired Persons [*rehabilitation]; Adult; Humans
5.  Canine and Human Visual Cortex Intact and Responsive Despite Early Retinal Blindness from RPE65 Mutation 
PLoS Medicine  2007;4(6):e230.
Background
RPE65 is an essential molecule in the retinoid-visual cycle, and RPE65 gene mutations cause the congenital human blindness known as Leber congenital amaurosis (LCA). Somatic gene therapy delivered to the retina of blind dogs with an RPE65 mutation dramatically restores retinal physiology and has sparked international interest in human treatment trials for this incurable disease. An unanswered question is how the visual cortex responds after prolonged sensory deprivation from retinal dysfunction. We therefore studied the cortex of RPE65-mutant dogs before and after retinal gene therapy. Then, we inquired whether there is visual pathway integrity and responsivity in adult humans with LCA due to RPE65 mutations (RPE65-LCA).
Methods and Findings
RPE65-mutant dogs were studied with fMRI. Prior to therapy, retinal and subcortical responses to light were markedly diminished, and there were minimal cortical responses within the primary visual areas of the lateral gyrus (activation amplitude mean ± standard deviation [SD] = 0.07% ± 0.06% and volume = 1.3 ± 0.6 cm3). Following therapy, retinal and subcortical response restoration was accompanied by increased amplitude (0.18% ± 0.06%) and volume (8.2 ± 0.8 cm3) of activation within the lateral gyrus (p < 0.005 for both). Cortical recovery occurred rapidly (within a month of treatment) and was persistent (as long as 2.5 y after treatment). Recovery was present even when treatment was provided as late as 1–4 y of age. Human RPE65-LCA patients (ages 18–23 y) were studied with structural magnetic resonance imaging. Optic nerve diameter (3.2 ± 0.5 mm) was within the normal range (3.2 ± 0.3 mm), and occipital cortical white matter density as judged by voxel-based morphometry was slightly but significantly altered (1.3 SD below control average, p = 0.005). Functional magnetic resonance imaging in human RPE65-LCA patients revealed cortical responses with a markedly diminished activation volume (8.8 ± 1.2 cm3) compared to controls (29.7 ± 8.3 cm3, p < 0.001) when stimulated with lower intensity light. Unexpectedly, cortical response volume (41.2 ± 11.1 cm3) was comparable to normal (48.8 ± 3.1 cm3, p = 0.2) with higher intensity light stimulation.
Conclusions
Visual cortical responses dramatically improve after retinal gene therapy in the canine model of RPE65-LCA. Human RPE65-LCA patients have preserved visual pathway anatomy and detectable cortical activation despite limited visual experience. Taken together, the results support the potential for human visual benefit from retinal therapies currently being aimed at restoring vision to the congenitally blind with genetic retinal disease.
The study by Samuel Jacobson and colleagues suggests that retinal gene therapy can improve retinal, visual pathway, and visual cortex responses to light stimulation, even after prolonged periods of blindness and in congenitally blind patients.
Editors' Summary
Background.
The eye captures light but the brain is where vision is experienced. Treatments for childhood blindness at the eye level are ready, but it is unknown whether the brain will be receptive to an improved neural message. Normal vision begins as photoreceptor cells in the retina (the light-sensitive tissue lining the inside of the eye) convert visual images into electrical impulses. These impulses are sent along the optic nerve to the visual cortex, the brain region where they are interpreted. The conversion of light into electrical impulses requires the activation of a molecule called retinal, which is subsequently recycled by retinal pigment epithelium (RPE) cells neighboring the retina. One of the key enzymes of the recycling reactions is encoded by a gene called RPE65. Genetic changes (mutations) in RPE65 cause an inherited form of blindness called Leber congenital amaurosis (LCA). In this disease, retinal is not recycled and as a result, the photoreceptor cells cannot work properly and affected individuals have poor or nonexistent vision from birth. Previous studies in dog and mouse models of the human disease have demonstrated that the introduction of a functional copy of RPE65 into the RPE cells using a harmless virus (gene therapy) dramatically restores retinal activity. Very recently, a pioneering gene therapy operation took place in London (UK) where surgeons injected a functional copy of RPE65 into the retina of a man with LCA. Whether this operation results in improved vision is not known at this time.
Why Was This Study Done?
Gene therapy corrects the retinal defects in animal models of LCA but whether the visual pathway from the retina to the visual cortex of the brain can respond normally to the signals sent by the restored retina is not known. Early visual experience is thought to be necessary for the development of a functional visual cortex, so replacing the defective RPE65 gene might not improve the vision of people with LCA. In this study, the researchers have studied the visual cortex of RPE65-deficient dogs before and after gene therapy to see whether the therapy affects the activity of the visual cortex. They have also investigated visual pathway integrity and responsiveness in adults with LCA caused by RPE65 mutations. If the visual pathway is disrupted in these patients, they reasoned, gene therapy might not restore their vision.
What Did the Researchers Do and Find?
The researchers used a technique called functional magnetic resonance imaging (fMRI) to measure light-induced brain activity in RPE65-deficient dogs before and after gene therapy. They also examined the reactions of the dogs' pupils to light (in LCA, the pupils do not contract normally in response to light because there is reduced signal transmission along the visual pathway). Finally, they measured the electrical activity of the dogs' retinas in response to light flashes—the retinas of patients with LCA do not react to light. Gene therapy corrected the defective retinal and visual pathway responses to light in the RPE65-deficient dogs and, whereas before treatment there was no response in the visual cortex to light stimulation in these dogs, after treatment, its activity approached that seen in normal dogs. The recovery of cortical responses was permanent and occurred soon after treatment, even in animals that were 4 years old when treated. Next, using structural MRI, the researchers studied human patients with LCA and found that the optic nerve diameter in young adults was within the normal range and that the structure of the visual cortex was very similar to that of normal individuals. Finally, using fMRI, they found that, although the visual cortex of patients with LCA did not respond to dim light, its reaction to bright light was comparable to that of normal individuals.
What Do These Findings Mean?
The findings from the dog study indicate that retinal gene therapy rapidly improves retinal, visual pathway, and visual cortex responses to light stimulation, even in animals that have been blind for years. In other words, in the dog model of LCA at least, all the components of the visual system remain receptive to visual inputs even after long periods of visual deprivation. The findings from the human study also indicate that the visual pathway remains anatomically intact despite years of disuse and that the visual cortex can be activated in patients with LCA even though these people have very limited visual experience. Taken together, these findings suggest that successful gene therapy of the retina might restore some functional vision to people with LCA but proof will have to await the outcomes of several clinical trials ongoing or being planned in Europe and the USA.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0040230.
General information on gene therapy is available from the Oak Ridge National Laboratory
Information is provided by the BBC about gene therapy for Leber congenital amaurosis (includes an audio clip from a doctor about the operation)
The National Institutes of Health/National Eye Institute (US) provides information about an ongoing gene therapy trial of RPE65-Leber congenital amaurosis
ClinicalTrials.gov gives details on treatment trials for Leber congenital amaurosis
The Foundation Fighting Blindness has a fact sheet on Leber congenital amaurosis (site includes Microsoft Webspeak links that read some content aloud)
The Foundation for Retinal Research has a fact sheet on Leber congenital amaurosis
Find more detailed information on Leber congenital amaurosis and the gene mutations that cause it from GeneReviews
WonderBaby, information for parents of babies with Leber congenital amaurosis
doi:10.1371/journal.pmed.0040230
PMCID: PMC1896221  PMID: 17594175
6.  Toward the development of a cortically based visual neuroprosthesis 
Journal of neural engineering  2009;6(3):035001.
Motivated by the success of cochlear implants for deaf patients, we are now facing the goal of creating a visual neuroprosthesis designed to interface with the occipital cortex as a means through which a limited but useful sense of vision could be restored in profoundly blind patients. We review the most important challenges regarding this neuroprosthetic approach and emphasize the need for basic human psychophysical research on the best way of presenting complex stimulating patterns through multiple microelectrodes. Continued research will hopefully lead to the development of and design specifications for the first generation of a cortically based visual prosthesis system.
doi:10.1088/1741-2560/6/3/035001
PMCID: PMC2941645  PMID: 19458403
7.  Drug Delivery Interfaces in the 21st Century: From Science Fiction Ideas to Viable Technologies 
Molecular pharmaceutics  2013;10(10):10.1021/mp4003283.
Early science fiction envisioned the future of drug delivery as targeted micron-scale submarines and ‘Cyborg’ body parts. Here we describe the progression of the field toward technologies that are now beginning to capture aspects of this early vision. Specifically, we focus on the two most prominent types of systems in drug delivery – the intravascular micro/nano drug carriers for delivery to the site of pathology and drug-loaded implantable devices that facilitate release with the pre-defined kinetics or in response to a specific cue. We discuss the unmet clinical needs that inspire these designs, the physiological factors that pose difficult challenges for their realization, and viable technologies that promise robust solutions. We also offer a perspective on where drug delivery may be in the next 50 years based on expected advances in material engineering and in the context of future diagnostics.
doi:10.1021/mp4003283
PMCID: PMC3818793  PMID: 23915375
drug delivery; drug carriers; nanotechnology; controlled release implants; physiological barriers; pharmacokinetics; translational medicine
8.  Humans ignore motion and stereo cues in favour of a fictional stable world 
Current biology : CB  2006;16(4):428-432.
Summary
As a human observer moves through the world, their eyes acquire a changing sequence of images. The information from this sequence is sufficient to determine the structure of a 3-D scene, up to a scale factor determined by the distance that the eyes have moved [1, 2]. There is good evidence that the human visual system accounts for the distance the observer has walked [3, 4] and the separation of the eyes [5-8] when judging the scale, shape and distance of objects. However, using an immersive virtual reality environment we created a scene that provided consistent information about scale from both distance walked and binocular vision and yet observers failed to notice when this scene expanded or contracted. This failure led to large errors in judging the size of objects. The pattern of errors cannot be explained by assuming a visual reconstruction of the scene with an incorrect estimate of interocular separation or distance walked. Instead, it is consistent with a Bayesian model of cue integration in which the efficacy of motion and disparity cues is greater at near viewing distances. Our results imply that observers are more willing to adjust their estimate of interocular separation or distance walked than to accept that the scene has changed in size.
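To make the cue-integration argument concrete, the Python sketch below implements a generic reliability-weighted combination of a stable-world prior with motion and disparity cues; all numerical values are illustrative assumptions and are not taken from the study. Each estimate is weighted by the inverse of its variance, so when the motion and disparity cues become noisy at far viewing distances, the combined estimate stays close to the prior that the scene has not changed size.

# Illustrative Bayesian cue combination (values invented for illustration,
# not taken from the study). Each cue supplies an estimate of scene scale
# and a sigma; weights are inverse variances.

def combine(cues):
    """Reliability-weighted average of (estimate, sigma) pairs."""
    weights = [1.0 / sigma ** 2 for _, sigma in cues]
    return sum(w * est for w, (est, _) in zip(weights, cues)) / sum(weights)

# Scene scale: 1.0 = unchanged, 2.0 = the virtual room has doubled in size.
prior_stable = (1.0, 0.5)  # prior belief that the world keeps its size

# Near viewing: motion and disparity cues are precise (small sigma).
near = combine([prior_stable, (2.0, 0.2), (2.0, 0.2)])
# Far viewing: the same cues are much noisier (large sigma).
far = combine([prior_stable, (2.0, 1.0), (2.0, 1.0)])

print(f"Perceived scale, near viewing: {near:.2f}")  # ~1.93: the change is noticed
print(f"Perceived scale, far viewing:  {far:.2f}")   # ~1.33: the stable-world prior dominates

With these illustrative numbers, an expansion that would be obvious up close barely shifts the estimate at greater viewing distances, matching the qualitative pattern the authors describe.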
doi:10.1016/j.cub.2006.01.019
PMCID: PMC2833396  PMID: 16488879
9.  Phakic Intraocular Lenses for the Treatment of Refractive Errors 
Executive Summary
Objective
The objective of this analysis is to review the effectiveness, safety, and cost-effectiveness of phakic intraocular lenses (pIOLs) for the treatment of myopia, hyperopia, and astigmatism.
Clinical Need: Condition and Target Population
Refractive Errors
Refractive errors occur when the eye cannot focus light properly. In myopia (near- or short-sightedness), distant objects appear blurry because the axis of the eye is too long or the cornea is too steep, so light becomes focused in front of the retina. Hyperopia (far sightedness) occurs when light is focused behind the retina causing nearby objects to appear blurry. In astigmatism, blurred or distorted vision occurs when light is focused at two points rather than one due to an irregularly shaped cornea or lens.
Refractive errors are common worldwide, but high refractive errors are less common. In the United States, the prevalence of high myopia (≤ −5 D) in people aged 20 to 39, 40 to 59, and 60 years and older is 7.4% (95% confidence interval [CI], 6.5% – 8.3%), 7.8% (95% CI, 6.4% – 8.6%), and 3.1% (95% CI, 2.2% – 3.9%), respectively. The prevalence of high hyperopia (≥ 3 D) is 1.0% (95% CI, 0.6% – 1.4%), 2.4% (95% CI, 1.7% – 3.0%), and 10.0% (95% CI, 9.1% – 10.9%) for the same age groupings. Finally, the prevalence of astigmatism (≥ 1 D cylinder) is 23.1% (95% CI, 21.6% – 24.5%), 27.6% (95% CI, 25.8% – 29.3%), and 50.1% (95% CI, 48.2% – 52.0%).
Low Vision
According to the Ontario Schedule of Benefits, low visual acuity is defined by a best spectacle corrected visual acuity (BSCVA) of 20/50 (6/15) or less in the better eye and not amenable to further medical and/or surgical treatment. Similarly, the Ontario Assistive Devices Program defines low vision as BSCVA in the better eye in the range of 20/70 or less that cannot be corrected medically, surgically, or with ordinary eyeglasses or contact lenses.
Estimates of the prevalence of low vision vary. Using the criteria of BSCVA ranging from 20/70 to 20/160, one study estimated that 35.6 per 10,000 people in Canada have low vision. The 2001 Participation and Activity Limitation Survey (PALS) found that 594,350 (2.5%) Canadians had “difficulty seeing ordinary newsprint or clearly seeing the face of someone from 4 m,” and the Canadian National Institute for the Blind (CNIB) registry classified 105,000 (0.35%) Canadians as visually disabled.
Phakic Intraocular Lenses (pIOL)
A phakic intraocular lens (pIOL) is a supplementary lens that is inserted into the anterior or posterior chamber of the eye to correct refractive errors (myopia, hyperopia, and astigmatism). Unlike in cataract surgery, the eye’s natural crystalline lens is not removed when the pIOL is inserted, so the eye retains its accommodative ability. In Canada and the United States, iris-fixated (anterior chamber lenses that are anchored to the iris with a claw) and posterior chamber lenses are the only types of pIOLs that are licensed by Health Canada and the Food and Drug Administration, respectively.
Evidence-Based Analysis Method
Research Questions & Methodology
What are the effectiveness, cost-effectiveness, and safety of pIOLs for the treatment of myopia, hyperopia, and astigmatism?
Do certain subgroups (e.g. high myopia and low vision) benefit more from pIOLs?
How do pIOLs compare with alternative surgical treatment options (LASIK, PRK, and CLE)?
Using appropriate keywords, a literature search was conducted up to January 2009. Systematic reviews, meta-analyses, randomized controlled trials, and observational studies with more than 20 eyes receiving pIOLs were eligible for inclusion. The primary outcomes of interest were uncorrected visual acuity (UCVA), predictability of manifest refraction spherical equivalent (MRSE), and adverse events. The GRADE approach was used to systematically and explicitly evaluate the quality of evidence.
Summary of Findings
The search identified 1,131 citations published between January 1, 2003, and January 16, 2009. Including a health technology assessment (HTA) identified in the bibliography review, 30 studies met the inclusion criteria: two HTAs; one systematic review; 20 pre-post observational studies; and seven comparative studies (five pIOL vs. LASIK, one pIOL vs. PRK, and one pIOL vs. CLE).
Both HTAs concluded that there was good evidence of the short-term efficacy and safety of pIOLs; however, their conclusions regarding long-term safety differed. The 2006 HTA found convincing evidence of long-term safety, while the 2009 HTA found no long-term evidence about the risks of complications including cataract development, corneal damage, and retinal detachment.
The systematic review of adverse events found that cataract development (incidence rate of 9.6% of eyes) is a substantial risk following posterior chamber pIOL implantation, while chronic endothelial cell loss is a safety concern after iris-fixated pIOL implantation. Adverse event rates varied by lens type, but they were more common in eyes that received posterior chamber pIOLs.
The evidence of pIOL effectiveness is based on pre-post case series. These studies reported a variety of outcomes and different follow-up time points. It was difficult to combine the data into meaningful summary measures as many time points are based on a single study with a very small sample size. Overall, the efficacy evidence is low to very low quality based on the GRADE Working Group Criteria.
For all refractive errors (low to high), most eyes experienced a substantial increase in uncorrected visual acuity (UCVA), with more than 75% of eyes achieving UCVA of 20/40 or better at all postoperative time points. The proportion of eyes that achieved postoperative UCVA of 20/20 or better varied substantially according to the type of lens used and the type of refractive error being corrected, ranging from about 30% of eyes that received iris-fixated lenses for myopia to more than 78% of eyes that received posterior chamber toric lenses for myopic astigmatism.
Predictability of manifest refraction spherical equivalent (MRSE) within ± 2.0 D was very high (≥ 90%) for all types of lenses and refractive error. At most time points, more than 50% of eyes achieved an MRSE within ± 0.5 D of emmetropia and at least 85% within ± 1.0 D. Predictability was lower for eyes with more severe preoperative refractive errors. The mean postoperative MRSE was less than 1.0 D in all but two studies.
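For readers less familiar with how these predictability figures are tabulated, the short sketch below illustrates the calculation using the standard spherical-equivalent formula (MRSE = sphere + cylinder/2); the refraction values in it are hypothetical and are not drawn from the included studies.

```python
# Illustrative sketch (hypothetical data): how predictability of the manifest
# refraction spherical equivalent (MRSE) is typically tabulated.
# MRSE = sphere + cylinder / 2 (standard spherical-equivalent formula).

def mrse(sphere_d: float, cylinder_d: float) -> float:
    """Manifest refraction spherical equivalent in dioptres."""
    return sphere_d + cylinder_d / 2.0

# Hypothetical postoperative refractions (sphere, cylinder) in dioptres.
postop_refractions = [(-0.25, -0.50), (0.00, -0.75), (0.50, -0.25), (-1.25, -0.50)]
mrse_values = [mrse(s, c) for s, c in postop_refractions]

def proportion_within(values, tolerance_d):
    """Proportion of eyes whose MRSE lies within ±tolerance of emmetropia (0 D)."""
    return sum(abs(v) <= tolerance_d for v in values) / len(values)

for tol in (0.5, 1.0, 2.0):
    print(f"within ±{tol} D: {proportion_within(mrse_values, tol):.0%}")
```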
Safety, defined as a loss of two or more Snellen lines of best spectacle corrected visual acuity (BSCVA), was high for all refractive errors and lens types. Losses of two or more lines of BSCVA were uncommon, occurring in fewer than 2% of eyes that received posterior chamber pIOLs for myopia and fewer than 1% of eyes that received iris-fixated lens implantation for myopia. Most eyes did not experience a clinically significant change in BSCVA (i.e., loss of one line, no change, or gain of one line), but 10% to 20% of eyes gained two or more lines of BSCVA.
The pIOL outcomes for UCVA, predictability, BSCVA, and adverse events were compared with FDA targets and safety values for refractive surgery and found to meet or exceed these targets at most follow-up time points. The results were then stratified to examine the efficacy of pIOLs for high refractive errors. There were limited data for many outcomes and time points, but overall the results were similar to those for all levels of refractive error severity.
The studies that compared pIOLs with LASIK, PRK, and CLE for patients with moderate to high myopia and myopic astigmatism showed that pIOLs performed better than these alternative surgical options for the outcomes of:
UCVA,
predictability and stability of MRSE,
postoperative MRSE,
safety (measured as clinically significant loss of BSCVA), and
gains in BSCVA.
Correction of refractive cylinder (astigmatism) was the only outcome that favoured refractive surgery over pIOLs. This was observed for both toric and non-toric pIOLs (toric pIOLs correct for astigmatism, non-toric pIOLs do not).
Common adverse events in the LASIK groups were diffuse lamellar keratitis and striae in the corneal flap. In the pIOL groups, lens repositioning and lens opacities (both asymptomatic and visually significant cataracts) were the most commonly observed adverse events. These studies were determined to be of low to very low evidence quality based on the GRADE Working Group Criteria.
Keywords
Eye, myopia, hyperopia, astigmatism, phakic intraocular lens, LASIK, PRK, uncorrected visual acuity, best corrected visual acuity, refractive errors, clear lens extraction
PMCID: PMC3377525  PMID: 23074518
10.  Integration and binding in rehabilitative sensory substitution: Increasing resolution using a new Zooming-in approach 
Purpose:
To visually perceive our surroundings, we constantly move our eyes and focus on particular details, then integrate them into a combined whole. Current visual rehabilitation methods, both invasive (such as bionic eyes) and non-invasive (such as Sensory Substitution Devices, SSDs), down-sample visual stimuli into low-resolution images. Zooming in to sub-parts of the scene could potentially improve detail perception. Can congenitally blind individuals integrate a ‘visual’ scene when offered this information via a different sensory modality, such as audition? Can they integrate visual information, perceived in parts, into larger percepts despite never having had any visual experience?
Methods:
We explored these questions using a zooming-in functionality embedded in the EyeMusic visual-to-auditory SSD. Eight blind participants were tasked with identifying cartoon faces by integrating their individual components recognized via the EyeMusic’s zooming mechanism.
Results:
After specialized training of just 6–10 hours, blind participants successfully and actively integrated facial features into cartooned identities in 79 ± 18% of the trials, a highly significant result (chance level 10%; rank-sum P < 1.55E-04).
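As a rough illustration of how per-participant performance can be compared against a 10% chance level, the sketch below applies a one-sample Wilcoxon signed-rank test in Python; the accuracy values are invented for the example and the study’s exact rank-sum procedure is not reproduced here.

```python
# Hypothetical sketch only: per-participant identification accuracies tested
# against the 10% chance level. The study's exact statistical procedure is
# not reproduced here.
import numpy as np
from scipy import stats

chance_level = 0.10
# Invented accuracies for eight participants (mean 0.79, echoing the 79% reported).
accuracies = np.array([0.95, 0.85, 0.55, 0.90, 0.70, 0.80, 0.65, 0.92])

# One-sample Wilcoxon signed-rank test of accuracies against the chance level.
stat, p_value = stats.wilcoxon(accuracies - chance_level)
print(f"mean accuracy = {accuracies.mean():.2f}, p = {p_value:.4g}")
```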
Conclusions:
These findings show that even users who lacked any previous visual experience whatsoever can indeed integrate this visual information with increased resolution. This potentially has important practical visual rehabilitation implications for both invasive and non-invasive methods.
doi:10.3233/RNN-150592
PMCID: PMC4927841  PMID: 26518671
Sensory substitution; vision rehabilitation; action-perception; motor control; active sensing
11.  Gastric Electrical Stimulation 
Executive Summary
Objective
The objective of this analysis was to assess the effectiveness, safety and cost-effectiveness of gastric electrical stimulation (GES) for the treatment of chronic, symptomatic refractory gastroparesis and morbid obesity.
Background
Gastroparesis - Epidemiology
Gastroparesis (GP) broadly refers to impaired gastric emptying in the absence of obstruction. Clinically, this can range from the incidental detection of delayed gastric emptying in an asymptomatic person to patients with severe nausea, vomiting and malnutrition. Symptoms of GP are nonspecific and may mimic structural disorders such as ulcer disease, partial gastric or small bowel obstruction, gastric cancer, and pancreaticobiliary disorders.
Gastroparesis may occur in association with diabetes, gastric surgery (consequence of peptic ulcer surgery and vagotomy) or for unknown reasons (idiopathic gastroparesis). Symptoms include early satiety, nausea, vomiting, abdominal pain and weight loss. The majority of patients with GP are women.
The relationship between upper gastrointestinal symptoms and the rate of gastric emptying is considered to be weak. Some patients with markedly delayed gastric emptying are asymptomatic and sometimes, severe symptoms may remit spontaneously.
Idiopathic GP may represent the most common form of GP. In one tertiary referral retrospective series, the etiologies in 146 GP patients were 36% idiopathic, 29% diabetic, 13% postgastric surgery, 7.5% Parkinson’s disease, 4.8% collagen vascular disorders, 4.1% intestinal pseudoobstruction and 6% miscellaneous causes.
The true prevalence of digestive symptoms in patients with diabetes and the relationship of these symptoms to delayed gastric emptying are unknown. Delayed gastric emptying is present in 27% to 58% of patients with type 1 diabetes and 30% of those with type 2 diabetes. However, highly variable rates of gastric emptying have been reported in type 1 and 2 diabetes, suggesting that development of GP in patients with diabetes is neither universal nor inevitable. In a review of studies examining gastric emptying in patients with diabetes compared to control patients, investigators noted that in many cases the magnitude of the delay in gastric emptying is modest.
GP may occur as a complication of a number of different surgical procedures. For example, vagal nerve injury may occur in 4% to 40% of patients who undergo laparoscopic fundoplication for gastroesophageal reflux disease.
The prevalence of severe, refractory GP is sparsely reported in the literature. Using data from a past study, it has been estimated that the prevalence of severe, symptomatic and refractory GP in the United States population is 0.017%. Assuming an Ontario population of 13 million, this would correspond to approximately 2,000 people in Ontario having severe, symptomatic, refractory GP.
The incidence of severe refractory GP estimated by the United States Food and Drug Administration (FDA) is approximately 4,000 per year in the United States. This corresponds to about 150 patients in Ontario. Using expert opinion and FDA data, the incidence of severe refractory GP in Ontario is estimated to be about 20 to 150 per year.
Treatment for Gastroparesis
To date, there have been no long-term studies confirming the beneficial effects of maintaining euglycemia on GP symptoms. However, it has been suggested that the consistent findings of physiologic studies in healthy volunteers and patients with diabetes provide an argument for striving for near-normal blood glucose levels in affected patients.
Dietary measures (e.g., low fibre, low fat food), prokinetic drugs (e.g., domperidone, metoclopramide and erythromycin) and antiemetic or antinausea drugs (e.g., phenothiazines, diphenhydramine) are generally effective for symptomatic relief in the majority of patients with GP.
For patients with chronic, symptomatic GP who are refractory to drug treatment, surgical options may include a jejunostomy tube for feeding, a gastrostomy tube for stomach decompression, and pyloroplasty for gastric emptying.
A few small studies have examined the use of botulinum toxin injections into the pyloric sphincter. However, the contribution of excessive pyloric contraction to GP has not been sufficiently defined, and there have been no controlled studies of this therapy.
Treatment with GES is reversible and may be a less invasive option compared to stomach surgery for the treatment of patients with chronic, drug-refractory nausea and vomiting secondary to GP. In theory, GES represents an intermediate step between treatment directed at the underlying pathophysiology, and the treatment of symptoms. It is based on studies of gastric electrical patterns in GP that have identified the presence of a variety of gastric arrhythmias. Similar to a cardiac pacemaker, it was hypothesized that GES could override the abnormal rhythms, stimulate gastric emptying and eliminate symptoms.
Morbid Obesity Epidemiology
Obesity is defined as a body mass index (BMI) of at least 30 kg/m2. Morbid obesity is defined as a BMI of at least 40 kg/m2, or at least 35 kg/m2 with comorbid conditions. Comorbid conditions associated with obesity include diabetes, hypertension, dyslipidemias, obstructive sleep apnea, weight-related arthropathies, and stress urinary incontinence.
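A minimal sketch of the BMI thresholds just described, assuming only the cut-offs stated above (30 kg/m2 for obesity; 40 kg/m2, or 35 kg/m2 with a comorbid condition, for morbid obesity):

```python
# Illustrative classification using only the BMI thresholds stated above.
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def obesity_category(bmi_value: float, has_comorbidity: bool) -> str:
    if bmi_value >= 40 or (bmi_value >= 35 and has_comorbidity):
        return "morbid obesity"
    if bmi_value >= 30:
        return "obesity"
    return "below obesity threshold"

# Example: 120 kg at 1.70 m gives a BMI of about 41.5 kg/m^2.
print(obesity_category(bmi(120, 1.70), has_comorbidity=False))  # -> morbid obesity
```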
In the United States, the age-adjusted prevalence of extreme obesity (BMI ≥ 40 kg/m2) for adults aged 20 years and older has increased significantly in the population, from 2.9% (1988–1994) to 4.7% (1999–2000). An expert estimated that about 160,000 to 180,000 people are morbidly obese in Ontario.
Treatment for Morbid Obesity
Diet, exercise, and behavioural therapy are used to help people lose weight.
Bariatric surgery for morbid obesity is considered an intervention of last resort for patients who have attempted first-line forms of medical management.
Gastric stimulation has been investigated for the treatment of morbid obesity; the intention being to reduce appetite and induce early satiety possibly due to inhibitory effects on gastric motility and effects on the central nervous system (CNS) and hormones related to satiety and/or appetite.
Possible advantages of GES for the treatment of morbid obesity include the reversibility of the procedure, less invasiveness than some bariatric procedures (e.g., gastric bypass), and fewer side effects (e.g., dumping syndrome).
The Device
Electrical stimulation is delivered via an implanted system that consists of a neurostimulator and 2 leads. The surgical procedure can be performed via either an open or laparoscopic approach. An external programmer used by the physician can deliver instructions to the GES, i.e., adjust the rate and amplitude of stimulation (Figure 1). GES may be turned off by the physician at any time or may be removed. The battery life is approximately 4 to 5 years.
For the treatment of GP, the GES leads are secured in the muscle of the lower stomach, 10 cm proximal to the pylorus (the opening from the stomach to the intestine) and 1 cm apart, and connected to an implantable battery-powered neurostimulator, which is placed in a small pocket in the abdominal wall.
For treatment of morbid obesity, GES leads are implanted along the lesser curvature of the stomach where the vagal nerve branches spread, approximately 8 cm proximal to the pylorus. However, the implant positioning of the leads has been variably reported in the literature.
Regulatory Status
The Enterra Therapy System and the Transcend II Implantable Gastric Stimulation System (Medtronic Inc.) are both licensed as class 3 devices by Health Canada (license numbers 60264 and 66948 respectively). The Health Canada indications for use are:
Enterra Therapy System
“For use in the treatment of chronic intractable (drug-refractory) nausea and vomiting.”
Transcend II Implantable Gastric Stimulation System
“For use in weight reduction for obese adults with a body mass index greater than 35.”
The GES device that is licensed by Health Canada for the treatment of GP produces high-frequency GES. Most clinical studies examining GES for GP have used high-frequency (4 times the intrinsic slow wave frequency, i.e., 12 cycles per minute), low-energy, short-duration pulses. This type of stimulation does not alter gastric muscular contraction and has no effect on slow wave dysrhythmias. The mechanism of action is unclear, but it is hypothesized that high-frequency GES may act on sensory fibers directed to the CNS.
The GES device licensed by Health Canada for treatment of morbid obesity produces low-frequency GES, which is close to or just above the normal/native gastric slow wave cycle (approximately 3 cycles/min.). This pacing uses low-frequency, high-energy, long-duration pulses to induce propagated slow waves that replace the spontaneous ones. Low-frequency pacing does not invoke muscular contractions.
Most studies examining the use of GES for the treatment of morbid obesity use low-frequency GES. Under normal circumstances, the gastric slow wave propagates distally and determines the frequency and propagation direction of gastric peristalsis. Low-frequency GES aims to produce abnormal gastric slow waves that can induce gastric dysrhythmia, disrupt regular propagation of slow waves, cause hypomotility of the stomach, delay gastric emptying, reduce food intake, prolong satiety, and produce weight loss.
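As a small illustration of the frequency relationship described above (high-frequency GES at roughly 4 times the native slow-wave rate of about 3 cycles/min, low-frequency GES close to the native rate), the following sketch is illustrative only and is not a device specification:

```python
# Relation between the native gastric slow-wave rate and the two GES modes
# described above; only the multiplier stated in the text is used.
NATIVE_SLOW_WAVE_CPM = 3.0  # approximate native gastric slow-wave rate (cycles/min)

def stimulation_rate(multiple_of_native: float) -> float:
    """Stimulation rate in cycles per minute for a given multiple of the native rate."""
    return NATIVE_SLOW_WAVE_CPM * multiple_of_native

high_freq_gp = stimulation_rate(4)       # high-frequency GES for GP: ~12 cycles/min
low_freq_obesity = stimulation_rate(1)   # low-frequency GES for obesity: near the native ~3 cycles/min

print(f"High-frequency GES (GP): {high_freq_gp:.0f} cycles/min")
print(f"Low-frequency GES (obesity): ~{low_freq_obesity:.0f} cycles/min")
```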
In the United States, the Enterra Therapy System is a Humanitarian Use Device (HUD), meaning it is a medical device designated by the FDA for use in the treatment of medical conditions that affect fewer than 4,000 individuals per year. The Enterra Therapy System is indicated for “the treatment of chronic, drug-refractory nausea and vomiting secondary to GP of diabetes or idiopathic etiology” (not postsurgical etiologies).
GES for morbid obesity has not been approved by the FDA and is for investigational use only in the United States.
Review Strategy
The Medical Advisory Secretariat systematically reviewed the literature to assess the effectiveness, safety, and cost-effectiveness of GES to treat patients who have: a) chronic refractory symptomatic GP; or b) morbid obesity.
The Medical Advisory Secretariat used its standard search strategy to retrieve international health technology assessments and English-language journal articles from selected databases.
The GRADE approach was used to systematically and explicitly make judgments about the quality of evidence and strength of recommendations.
Findings
As stated by the GRADE Working Group, the following definitions were used in grading the quality of the evidence in Tables 1 and 2.
GRADE Quality of Studies – Gastroparesis
Confounders related to diabetes.
Possible Type 2 error for subgroup analyses.
Subjective self-reported end point.
Posthoc change in primary end point analysis.
No sample size justification.
Concomitant prokinetic/antiemetic therapy.
Only 1 RCT (with different results for FDA and publication).
GES originally hypothesized to correct gastric rhythms, stimulate gastric emptying and therefore eliminate symptoms.
Now hypothesized to directly act on neurons to the CNS to control symptoms.
Weak correlation between symptoms and gastric emptying.
Unclear whether gastric emptying is still considered an end point to investigate.
GRADE Quality of Studies – Morbid Obesity
No sample size calculation.
Small sample size.
No ITT analysis.
Lack of detail regarding dropouts.
Possible Type 2 error.
Sparse details about randomization/blinding.
Full, final results not published.
Only 1 RCT (technically grey literature).
Economic Analysis
No formal economic analysis was identified in the literature search.
The Alberta Heritage Foundation for Medical Research reported that the cost of implanting a GES in the United States for the treatment of GP is estimated to be $30,000 US. In Canada, the device costs approximately $10,700 Cdn; this does not include costs associated with the physician’s training, the implantation procedure, or device programming and maintenance.
Ontario Context
There is no Schedule of Benefits code for GES.
There is no Canadian Classification of Health Interventions Index (CCI) procedure code for GES.
Since the ICD-10 diagnosis code for gastroparesis falls under K31.8 “Other specified diseases of the stomach and duodenum”, it is impossible to determine how many patients in Ontario had discharge abstracts because of gastroparesis.
In 2005, there were fewer than 5 out-of-country requests for GES (for either consultation only or for surgery).
Gastroparesis
The prevalence of severe, refractory GP is variably reported in the literature.
The Alberta Heritage Foundation for Medical Research estimated that the prevalence of severe, symptomatic and medically refractory GP in the United States population was 0.017%. Assuming a total Ontario population of 13 million, this would correspond to a budget impact of approximately $23.6M Cdn ($10,700 Cdn x 2,210 patients) for the device cost alone.
The incidence of severe refractory GP estimated by the FDA is approximately 4,000 per year in the United States. This corresponds to about 150 patients in Ontario. Using expert opinion and FDA data, the incidence of severe refractory GP in Ontario is estimated to be about 20 to 150 per year. This corresponds to a budget impact of approximately $107,000 Cdn to $1.6M Cdn per year for the device cost alone.
Morbid Obesity
An expert in the field estimated that there are 160,000 to 180,000 people in Ontario who are morbidly obese. This would correspond to a budget impact of approximately $1.7B Cdn to $1.9B Cdn for the device cost alone (assuming 100% uptake). However, the true uptake of GES for morbid obesity is unknown in relation to other types of bariatric surgery (which are more effective).
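As a quick arithmetic check of the device-cost figures quoted in the gastroparesis and morbid obesity subsections above (unit device cost of approximately $10,700 Cdn; procedure, training, programming, and maintenance costs excluded):

```python
# Worked check of the device-cost-only budget figures quoted above.
DEVICE_COST_CDN = 10_700  # approximate Canadian device cost stated in the text

# Gastroparesis: prevalence-based estimate (0.017% of ~13 million Ontarians).
gp_prevalent_patients = round(0.00017 * 13_000_000)             # ~2,210 patients
gp_prevalence_budget = gp_prevalent_patients * DEVICE_COST_CDN  # ~$23.6M Cdn

# Gastroparesis: upper end of the incidence-based estimate (~150 patients/year).
gp_incidence_budget_high = 150 * DEVICE_COST_CDN                # ~$1.6M Cdn per year

# Morbid obesity: 160,000 to 180,000 morbidly obese Ontarians, assuming 100% uptake.
obesity_budget_low = 160_000 * DEVICE_COST_CDN                  # ~$1.7B Cdn
obesity_budget_high = 180_000 * DEVICE_COST_CDN                 # ~$1.9B Cdn

print(gp_prevalent_patients, gp_prevalence_budget)
print(gp_incidence_budget_high, obesity_budget_low, obesity_budget_high)
```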
Conclusion
As per the GRADE Working Group, overall recommendations consider 4 main factors.
The tradeoffs, taking into account the estimated size of the effect for the main outcome, the confidence limits around those estimates and the relative value placed on the outcome.
The quality of the evidence.
Translation of the evidence into practice in a specific setting, taking into consideration important factors that could be expected to modify the size of the expected effects such as proximity to a hospital or availability of necessary expertise.
Uncertainty about the baseline risk for the population of interest.
The GRADE Working Group also recommends that incremental costs of healthcare alternatives should be considered explicitly alongside the expected health benefits and harms. Recommendations rely on judgments about the value of the incremental health benefits in relation to the incremental costs. The last column in Table 3 shows the overall trade-off between benefits and harms and incorporates any risk/uncertainty.
For GP, the overall GRADE and strength of the recommendation is “weak” – the quality of the evidence is “low” (uncertainties due to methodological limitations in the study design in terms of study quality, consistency and directness), and the corresponding risk/uncertainty is increased due to a budget impact of approximately $107,000 Cdn to $1.6M Cdn for the device cost alone, while the cost-effectiveness of GES is unknown and difficult to estimate considering that there are no high-quality studies of effectiveness. Further evidence of effectiveness should be available in the future since there is an RCT underway that is examining the use of GES in patients with severe refractory GP associated with diabetes and idiopathic etiologies (ClinicalTrials.gov identifier NCT00157755).
For morbid obesity, the overall GRADE and strength of the recommendation is “weak” – the quality of the evidence is “low” (uncertainties due to methodological limitations in the study design in terms of study quality and consistency), and the corresponding risk/uncertainty is increased due to a budget impact of approximately $1.7B Cdn to $1.9B Cdn for the device cost alone (assuming 100% uptake) while the cost-effectiveness of GES is unknown and difficult to estimate considering that there are no high quality studies of effectiveness. However, the true uptake of GES for morbid obesity is unknown in relation to other types of bariatric surgery (which are more effective).
Overall GRADE and Strength of Recommendation (Including Uncertainty)
PMCID: PMC3413096  PMID: 23074486
12.  Stem cell therapy: facts and fiction 
Facts, Views & Vision in ObGyn  2012;4(3):195-197.
This opinion paper is a brief overview of the current state of the translation of stem cell therapy from the bench to the clinic. The hype generated by the great medical potential of stem cells has led to hundreds of clinics worldwide claiming to have a cure for every imaginable condition. This fraudulent practice is far removed from the reality of scientists and bona fide companies. Much effort is being put into addressing the hurdles encountered in the safe use of stem cells in therapy. By now, a significant number of clinical trials are reporting very exciting progress, opening a realistic path to the use of these amazing cells in regenerative medicine.
PMCID: PMC3991398  PMID: 24753907
Embryonic stem cells; induced pluripotent stem cells; stem cell therapy; clinical trial; regenerative medicine; safety
13.  Effectiveness of conventional versus virtual reality based vestibular rehabilitation in the treatment of dizziness, gait and balance impairment in adults with unilateral peripheral vestibular loss: a randomised controlled trial 
Background
Unilateral peripheral vestibular loss results in gait and balance impairment, dizziness and oscillopsia. Vestibular rehabilitation benefits patients, but the optimal treatment remains unknown. Virtual reality is an emerging tool in rehabilitation and provides opportunities to improve both outcomes and patient satisfaction with treatment. The Nintendo Wii Fit Plus® (NWFP) is a low-cost virtual reality system that challenges balance and provides visual and auditory feedback. It may augment the motor learning that is required to improve balance and gait, but no trials to date have investigated its efficacy.
Methods/Design
In a single (assessor) blind, two centre randomised controlled superiority trial, 80 patients with unilateral peripheral vestibular loss will be randomised to either conventional or virtual reality based (NWFP) vestibular rehabilitation for 6 weeks. The primary outcome measure is gait speed (measured with three dimensional gait analysis). Secondary outcomes include computerised posturography, dynamic visual acuity, and validated questionnaires on dizziness, confidence and anxiety/depression. Outcome will be assessed post treatment (8 weeks) and at 6 months.
Discussion
Advances in the gaming industry have allowed mass production of highly sophisticated low cost virtual reality systems that incorporate technology previously not accessible to most therapists and patients. Importantly, they are not confined to rehabilitation departments, can be used at home and provide an accurate record of adherence to exercise. The benefits of providing augmented feedback, increasing intensity of exercise and accurately measuring adherence may improve conventional vestibular rehabilitation but efficacy must first be demonstrated.
Trial registration
ClinicalTrials.gov identifier: NCT01442623
doi:10.1186/1472-6815-12-3
PMCID: PMC3394213  PMID: 22449224
Rehabilitation; Vestibular diseases; Nintendo Wii Fit Plus®; Virtual reality; Postural balance; Dizziness; Vertigo; Gait; Visual acuity; Feedback sensory
14.  Telerehabilitation for people with low vision 
Background
Low vision affects over 300 million people worldwide and can compromise both activities of daily living and quality of life. Rehabilitative training and vision assistive equipment (VAE) may help, but some visually impaired people have limited resources to attend in-person visits at rehabilitation clinics. These people may be able to overcome barriers to care through remote, Internet-based consultation (i.e., telerehabilitation).
Objectives
To compare the effects of telerehabilitation with face-to-face (e.g., in-office or inpatient) vision rehabilitation services for improving vision-related quality of life and reading speed in people with visual function loss due to any ocular condition. Secondary objectives are to evaluate compliance with scheduled rehabilitation sessions, abandonment rates for vision assistive equipment, and patient satisfaction ratings.
Search methods
We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (2015 Issue 5), Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid MEDLINE Daily, Ovid OLDMEDLINE (January 1980 to June 2015), EMBASE (January 1980 to June 2015), PubMed (1980 to June 2015), ClinicalTrials.gov (www.clinicaltrials.gov) and the World Health Organization (WHO) International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). We did not use any language restriction or study design filter in the electronic searches; however, we restricted the searches from 1980 onwards because the Internet was not introduced to the public until 1982. We last searched the electronic databases on 15 June 2015.
Selection criteria
We planned to include randomized controlled trials (RCTs) or controlled clinical trials (CCTs) in which participants were diagnosed with low vision and were undergoing low vision rehabilitation using an Internet, web-based technology compared with an approach based on in-person consultations.
Data collection and analysis
Two authors independently screened titles and abstracts, and then full-text articles against the eligibility criteria. We planned to have two authors independently abstract data from included studies. We resolved discrepancies by discussion.
Main results
We did not find any study that met the inclusion criteria for this review and, hence, we did not conduct a quantitative analysis. As a part of the background, we discussed review articles on telemedicine for facilitating communication with elderly individuals or for providing remote ophthalmological care.
Authors’ conclusions
We did not find any evidence on whether the use of telerehabilitation is feasible or a potentially viable means to remotely deliver rehabilitation services to individuals with low vision. Given the disease burden and the growing interest in telemedicine, there is a need for future pilot studies and subsequent clinical trials to explore the potential for telerehabilitation as a platform for providing services to people with low vision.
doi:10.1002/14651858.CD011019.pub2
PMCID: PMC4730549  PMID: 26329308
15.  Precision Medicine and Non-Colorectal Cancer Liver Metastases: Fiction or Reality? 
Viszeralmedizin  2015;31(6):434-439.
Summary
Background
Non-colorectal liver metastases (nCRLM) constitute a variety of heterogeneous diseases and a considerable therapeutic challenge. Management is based on the primary tumor and the clinical course. In the era of precision medicine (PM) we know that cancer is heterogeneous within the tumor and across different sites.
Methods
We give an overview of the path to PM through ‘omics’ beyond genomics. We refer to the experience gained to date from models such as colorectal cancer and we discuss the opportunity offered by PM for the management of nCRLM.
Results
In order to best characterize and track tumor biological behaviors, as well as to understand mechanisms of response to therapy and survival, we suggest the application of novel clinical trial designs: a dynamic approach with serial monitoring involving evaluation of both primary and metastatic sites. Quality and standardization of tissue acquisition and biobanking is a precondition for the reliability of this approach.
Conclusion
The application of PM is increasingly becoming a reality. Elucidating the mysteries of tumors in complex settings can only be achieved with the approach PM offers. nCRLM may serve as a model for the application of PM principles and techniques in understanding individual diseases and also cancer as an entity and therapeutic challenge.
doi:10.1159/000442485
PMCID: PMC4748797  PMID: 26889147
Liver metastases; Non-colorectal cancer; Precision medicine; Immune oncology; Clinical trials
16.  Probing the functional impact of sub-retinal prosthesis 
eLife 5:e12687.
Retinal prostheses are promising tools for recovering visual function in blind patients but, unfortunately, the gains in visual acuity they provide are still poor. Improving their resolution is thus a key challenge that warrants understanding its origin through appropriate animal models. Here, we provide a systematic comparison between visual and prosthetic activations of the rat primary visual cortex (V1). We established a precise V1 mapping as a functional benchmark to demonstrate that sub-retinal implants activate V1 at the appropriate position, scalable to a wide range of visual luminance, but with an aspect ratio and an extent much larger than expected. Such a distorted activation profile can be accounted for by the existence of two sources of diffusion: passive diffusion and activation of ganglion cells’ axons en passant. Reverse-engineering the electrical pulses based on impedance spectroscopy was the only solution we tested that decreased the extent and aspect ratio, providing a promising solution for clinical applications.
DOI: http://dx.doi.org/10.7554/eLife.12687.001
eLife digest
One of the most common causes of blindness is a disorder called retinitis pigmentosa. In a healthy eye, the surface at the back of the eye – called the retina – contains cells called photoreceptors that detect light and convert it into electrical signals for the brain to process. In people with retinitis pigmentosa, these photoreceptor cells die off gradually, which leads to loss of vision.
The only treatment available for retinitis pigmentosa is to have an artificial retina implanted into the eye. The artificial retina consists of an array of tiny electrodes, which take over from the damaged photoreceptors and generate electrical signals. The person with the implant perceives these electrical signals as bright flashes called “phosphenes”. However, the phosphenes are too large and imprecise to provide the person with vision that is good enough for tasks such as walking unaided or reading.
To find out why artificial retinas produce such poor resolution, Roux et al. compared how a rat’s brain responds to either natural visual stimuli or activation of an implanted array of micro-electrodes. Both the micro-electrodes and the natural stimuli activated the same areas of the brain. However, the micro-electrodes produced larger and more elongated patterns of activation. This is because the electrical currents generated by the micro-electrodes diffused throughout the retinal tissue and activated other neurons besides those intended. To overcome this problem, Roux et al. tested different ways of stimulating the micro-electrodes in order to identify those that induce the desired patterns of brain activity. This approach – known as reverse engineering – did indeed improve the performance of the micro-electrode array.
The next step is to extend these findings, which were obtained in healthy rats, to non-human primates or animal models of retinitis pigmentosa to better understand the condition in humans. In addition, combining the current approach with other existing techniques should further improve the vision that can be achieved with artificial retinas.
DOI: http://dx.doi.org/10.7554/eLife.12687.002
doi:10.7554/eLife.12687
PMCID: PMC4995098  PMID: 27549126
primary visual cortex; retinal implants; optical imaging; impedance spectroscopy; artificial visual acuity; Rat
17.  The FiCTION dental trial protocol – filling children’s teeth: indicated or not? 
BMC Oral Health  2013;13:25.
Background
There is a lack of evidence for effective management of dental caries (decay) in children’s primary (baby) teeth and an apparent failure of conventional dental restorations (fillings) to prevent dental pain and infection for UK children in Primary Care. UK dental schools’ teaching has been based on British Society of Paediatric Dentistry guidance which recommends that caries in primary teeth should be removed and a restoration placed. However, the evidence base for this is limited in volume and quality, and comes from studies conducted in either secondary care or specialist practices. Restorations provided in specialist environments can be effective but the generalisability of this evidence to Primary Care has been questioned.
The FiCTION trial addresses the Health Technology Assessment (HTA) Programme’s commissioning brief and research question “What is the clinical and cost effectiveness of restoration of caries in primary teeth, compared to no treatment?” It compares conventional restorations with an intermediate treatment strategy based on the biological (sealing-in) management of caries and with no restorations.
Methods/Design
This is a Primary Care-based multi-centre, three-arm, parallel-group, patient-randomised controlled trial. Practitioners are recruiting 1461 children (3–7 years) with at least one primary molar tooth where caries extends into dentine. Children are randomised and treated according to one of three treatment approaches: conventional caries management with best practice prevention, biological management of caries with best practice prevention, or best practice prevention alone.
Baseline measures and outcome data (at review/treatment during three year follow-up) are assessed through direct reporting, clinical examination including blinded radiograph assessment, and child/parent questionnaires.
The primary outcome measure is the incidence of either pain or infection related to dental caries.
Secondary outcomes are: incidence of caries in primary and permanent teeth, patient quality of life, cost-effectiveness, acceptability of treatment strategies to patients and parents and their experiences, and dentists’ preferences.
Discussion
FiCTION will provide evidence for the most clinically-effective and cost-effective approach to managing caries in children’s primary teeth in Primary Care. This will support general dental practitioners in treatment decision making for child patients to minimize pain and infection in primary teeth. The trial is currently recruiting patients.
Trial registration
Protocol ID: NCTU: ISRCTN77044005
doi:10.1186/1472-6831-13-25
PMCID: PMC3698078  PMID: 23725316
Dental caries; Caries prevention; Primary teeth; Prevention; Paediatric Dentistry; Restoration; Fillings; RCT; Primary care
18.  PD-L1 biomarker testing for non-small cell lung cancer: truth or fiction? 
Research in cancer immunology is currently accelerating following a series of cancer immunotherapy breakthroughs during the last 5 years. Various monoclonal antibodies which block the interaction between checkpoint molecules PD-1 on immune cells and PD-L1 on cancer cells have been used to successfully treat non-small cell lung cancer (NSCLC), including some durable responses lasting years. Two drugs, nivolumab and pembrolizumab, are now FDA approved for use in certain patients who have failed or progressed on platinum-based or targeted therapies while agents targeting PD-L1, atezolizumab and durvalumab, are approaching the final stages of clinical testing. Despite impressive treatment outcomes in a subset of patients who receive these immune therapies, many patients with NSCLC fail to respond to anti-PD-1/PD-L1 and the identification of a biomarker to select these patients remains highly sought after. In this review, we discuss the recent clinical trial results of pembrolizumab, nivolumab, and atezolizumab for NSCLC, and the significance of companion diagnostic testing for tumor PD-L1 expression.
doi:10.1186/s40425-016-0153-x
PMCID: PMC4986262  PMID: 27532023
NSCLC; PD-1; PD-L1; Immunotherapy; Nivolumab; Pembrolizumab; Immune checkpoint inhibitor; Biomarker; Lung cancer
19.  Improved content aware scene retargeting for retinitis pigmentosa patients 
Background
In this paper we present a novel scene retargeting technique to reduce the visual scene while maintaining the size of the key features. The algorithm is scalable to implementation on portable devices and thus has potential for augmented reality systems to provide visual support for those with tunnel vision. We therefore test the efficacy of our algorithm at shrinking the visual scene into the remaining field of view for these patients.
Methods
Simple spatial compression of visual scenes makes objects appear further away. We have therefore developed an algorithm which removes low-importance information while maintaining the size of the significant features. Previous approaches in this field have included seam carving, which removes low-importance seams from the scene, and shrinkability, which dynamically shrinks the scene according to a generated importance map. The former method causes significant artifacts and the latter is inefficient. In this work we have developed a new algorithm combining the best aspects of these two previous methods. In particular, our approach is to generate a shrinkability importance map using a seam-based approach. We then use it to dynamically shrink the scene in a similar fashion to the shrinkability method. Importantly, we have implemented it so that it can be used in real time without prior knowledge of future frames.
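The following is a minimal, hypothetical sketch of the general idea described in this section: derive a per-column importance map from image gradients (a simple stand-in for the seam-based importance map) and compress low-importance regions more than salient ones, frame by frame. It is not the authors’ implementation.

```python
# Illustrative sketch of content-aware horizontal shrinking in the spirit
# described above. The energy measure and resampling are deliberately simple
# and do not reproduce the authors' seam-based importance map.
import numpy as np

def column_importance(gray: np.ndarray) -> np.ndarray:
    """Per-column importance: mean absolute image gradient in each column."""
    gy, gx = np.gradient(gray.astype(float))
    energy = np.abs(gx) + np.abs(gy)
    return energy.mean(axis=0)

def shrink_width(gray: np.ndarray, target_width: int) -> np.ndarray:
    """Shrink image width to target_width; important columns keep more space."""
    height, _ = gray.shape
    importance = column_importance(gray)
    # Cumulative importance assigns each source column a new x-position,
    # so low-importance regions are compressed more than salient ones.
    new_x = np.cumsum(importance) / importance.sum() * (target_width - 1)
    target_x = np.arange(target_width)
    out = np.empty((height, target_width))
    for row in range(height):
        out[row] = np.interp(target_x, new_x, gray[row])
    return out

# Example: shrink a random test "frame" to 50% of its original width.
frame = np.random.rand(120, 160)
small = shrink_width(frame, 80)
print(frame.shape, "->", small.shape)  # (120, 160) -> (120, 80)
```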
Results
We have evaluated and compared our algorithm to the seam carving and image shrinkability approaches from a content preservation perspective and a compression quality perspective. Our technique has also been evaluated in a trial that included 20 participants with simulated tunnel vision. Results show the robustness of our method at reducing scenes by up to 50% with minimal distortion. We also demonstrate its efficacy for those with simulated tunnel vision of 22 degrees of field of view or less.
Conclusions
Our approach allows us to perform content-aware video resizing in real time using only information from previous frames to avoid jitter. Our method also offers a clear benefit over ordinary resizing and even over other image retargeting methods. We show that the benefit derived from this algorithm is significant for patients with fields of view of 20° or less.
doi:10.1186/1475-925X-9-52
PMCID: PMC2949883  PMID: 20846440
20.  Clinical and Laboratory Evaluation of Peripheral Prism Glasses for Hemianopia 
Purpose
Homonymous hemianopia (the loss of vision on the same side in each eye) impairs the ability to navigate and walk safely. We evaluated peripheral prism glasses as a low vision optical device for hemianopia in an extended wearing trial.
Methods
Twenty-three patients with complete hemianopia (13 right-sided) and neither visual neglect nor cognitive deficit enrolled in the 5-visit study. To expand the horizontal visual field, patients’ spectacles were fitted with Press-On™ Fresnel prism segments (each 40 prism diopters) across the upper and lower portions of the lens on the hemianopic (“blind”) side. Patients were asked to wear these spectacles as much as possible for the duration of the study, which averaged 9 (range: 5 to 13) weeks. Clinical success (continued wear, indicating perceived overall benefit), visual field expansion, perceived direction, and perceived quality of life were measured.
Results
Clinical Success: 14 of 21 (67%) patients chose to continue to wear the peripheral prism glasses at the end of the study (2 patients did not complete the study for non-vision reasons). At long-term follow-up (8 to 51 months), 5 of 12 (42%) patients reported still wearing the device. Visual Field Expansion: Expansion of about 22 degrees in both the upper and lower quadrants was demonstrated for all patients (binocular perimetry, Goldmann V4e). Perceived Direction: Two patients demonstrated a transient adaptation to the change in visual direction produced by the peripheral prism glasses. Quality of Life: At study end, reduced difficulty noticing obstacles on the hemianopic side was reported.
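A quick check, using only the definition of the prism dioptre, shows that 40 prism-dioptre segments are consistent with the roughly 22-degree expansion measured above:

```python
# A prism of P prism dioptres deviates light by P cm at 1 m, i.e. by arctan(P / 100).
import math

prism_dioptres = 40
deviation_deg = math.degrees(math.atan(prism_dioptres / 100))
print(f"{prism_dioptres} prism dioptres ≈ {deviation_deg:.1f} degrees")  # ≈ 21.8°, close to the ~22° measured
```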
Conclusions
The peripheral prism glasses provided reported benefits (usually in obstacle avoidance) to 2/3 of the patients completing the study, a very good success rate for a vision rehabilitation device. Possible reasons for long-term discontinuation and limited adaptation of perceived direction are discussed.
doi:10.1097/OPX.0b013e31819f9e4d
PMCID: PMC2680467  PMID: 19357552
low vision; prism adaptation; rehabilitation; prism treatment; visual impairment; traumatic brain injury; stroke
21.  Internet-Based Device-Assisted Remote Monitoring of Cardiovascular Implantable Electronic Devices 
Executive Summary
Objective
The objective of this Medical Advisory Secretariat (MAS) report was to conduct a systematic review of the available published evidence on the safety, effectiveness, and cost-effectiveness of Internet-based device-assisted remote monitoring systems (RMSs) for therapeutic cardiac implantable electronic devices (CIEDs) such as pacemakers (PMs), implantable cardioverter-defibrillators (ICDs), and cardiac resynchronization therapy (CRT) devices. The MAS evidence-based review was performed to support public financing decisions.
Clinical Need: Condition and Target Population
Sudden cardiac death (SCD) is a major cause of fatalities in developed countries. In the United States almost half a million people die of SCD annually, resulting in more deaths than stroke, lung cancer, breast cancer, and AIDS combined. In Canada each year more than 40,000 people die from a cardiovascular related cause; approximately half of these deaths are attributable to SCD.
Most cases of SCD occur in the general population typically in those without a known history of heart disease. Most SCDs are caused by cardiac arrhythmia, an abnormal heart rhythm caused by malfunctions of the heart’s electrical system. Up to half of patients with significant heart failure (HF) also have advanced conduction abnormalities.
Cardiac arrhythmias are managed by a variety of drugs, ablative procedures, and therapeutic CIEDs. The range of CIEDs includes pacemakers (PMs), implantable cardioverter-defibrillators (ICDs), and cardiac resynchronization therapy (CRT) devices. Bradycardia is the main indication for PMs and individuals at high risk for SCD are often treated by ICDs.
Heart failure (HF) is also a significant health problem and is the most frequent cause of hospitalization in those over 65 years of age. Patients with moderate to severe HF may also have cardiac arrhythmias, although the cause may be related more to heart pump or haemodynamic failure. The presence of HF, however, increases the risk of SCD five-fold, regardless of aetiology. Patients with HF who remain highly symptomatic despite optimal drug therapy are sometimes also treated with CRT devices.
With an increasing prevalence of age-related conditions such as chronic HF and the expanding indications for ICD therapy, the rate of ICD placement has been dramatically increasing. The appropriate indications for ICD placement, as well as the rate of ICD placement, are increasingly an issue. In the United States, after the introduction of expanded coverage of ICDs, a national ICD registry was created in 2005 to track these devices. A recent survey based on this national ICD registry reported that 22.5% (25,145) of patients had received a non-evidence based ICD and that these patients experienced significantly higher in-hospital mortality and post-procedural complications.
In addition to the increased ICD device placement and the upfront device costs, there is the need for lifelong follow-up or surveillance, placing a significant burden on patients and device clinics. In 2007, over 1.6 million CIEDs were implanted in Europe and the United States, which translates to over 5.5 million patient encounters per year if the recommended follow-up practices are considered. A safe and effective RMS could potentially improve the efficiency of long-term follow-up of patients and their CIEDs.
Technology
In addition to being therapeutic devices, CIEDs have extensive diagnostic abilities. All CIEDs can be interrogated and reprogrammed during an in-clinic visit using an inductive programming wand. Remote monitoring would allow patients to transmit information recorded in their devices from the comfort of their own homes. Currently, most ICD devices also have the potential to be remotely monitored. Remote monitoring (RM) can be used to check system integrity, to alert on arrhythmic episodes, and to potentially replace in-clinic follow-ups and manage disease remotely. These devices cannot currently be reprogrammed remotely, although this feature is being tested in pilot settings.
Every RMS is specifically designed by a manufacturer for their cardiac implant devices. For Internet-based device-assisted RMSs, this customization includes details such as web application, multiplatform sensors, custom algorithms, programming information, and types and methods of alerting patients and/or physicians. The addition of peripherals for monitoring weight and pressure or communicating with patients through the onsite communicators also varies by manufacturer. Internet-based device-assisted RMSs for CIEDs are intended to function as a surveillance system rather than an emergency system.
Health care providers therefore need to learn each application, and as more than one application may be used at one site, multiple applications may need to be reviewed for alarms. All RMSs deliver system integrity alerting; however, some systems seem to be better geared to fast arrhythmic alerting, whereas other systems appear to be more intended for remote follow-up or supplemental remote disease management. The different RMSs may therefore have different impacts on workflow organization because of their varying frequency of interrogation and methods of alerts. The integration of these proprietary RM web-based registry systems with hospital-based electronic health record systems has so far not been commonly implemented.
Currently there are 2 general types of RMSs: those that transmit device diagnostic information automatically and without patient assistance to secure Internet-based registry systems, and those that require patient assistance to transmit information. Both systems employ the use of preprogrammed alerts that are either transmitted automatically or at regular scheduled intervals to patients and/or physicians.
The current web applications, programming, and registry systems differ greatly between the manufacturers of transmitting cardiac devices. In Canada there are currently 4 manufacturers—Medtronic Inc., Biotronik, Boston Scientific Corp., and St Jude Medical Inc.—which have regulatory approval for remote transmitting CIEDs. Remote monitoring systems are proprietary to the manufacturer of the implant device. An RMS for one device will not work with another device, and the RMS may not work with all versions of the manufacturer’s devices.
All Internet-based device-assisted RMSs have common components. The implanted device is equipped with a micro-antenna that communicates with a small external device (at bedside or wearable) commonly known as the transmitter. Transmitters are able to interrogate programmed parameters and diagnostic data stored in the patients’ implant device. The information transfer to the communicator can occur at preset time intervals with the participation of the patient (waving a wand over the device) or it can be sent automatically (wirelessly) without their participation. The encrypted data are then uploaded to an Internet-based database on a secure central server. The data processing facilities at the central database, depending on the clinical urgency, can trigger an alert for the physician(s) that can be sent via email, fax, text message, or phone. The details are also posted on the secure website for viewing by the physician (or their delegate) at their convenience.
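A schematic sketch of the data flow just described (implanted device to transmitter to central server to alert routing) is given below; all class names, fields, and the triage rule are hypothetical and do not represent any vendor’s actual system.

```python
# Illustrative sketch of the remote-monitoring data flow described above:
# interrogation data from an implanted device are uploaded to a central
# server, which decides whether to raise an alert for the physician.
# Every name and threshold here is hypothetical; no vendor API is shown.
from dataclasses import dataclass
from enum import Enum

class Urgency(Enum):
    ROUTINE = "routine"  # posted to the secure website for later review
    ALERT = "alert"      # pushed to the physician (e.g., email, fax, SMS, phone)

@dataclass
class Transmission:
    device_id: str
    battery_ok: bool
    lead_impedance_ohms: float
    arrhythmia_detected: bool

def classify(tx: Transmission) -> Urgency:
    """Toy triage rule standing in for the server-side alert logic."""
    lead_problem = not (200 <= tx.lead_impedance_ohms <= 2000)  # assumed nominal range
    if tx.arrhythmia_detected or lead_problem or not tx.battery_ok:
        return Urgency.ALERT
    return Urgency.ROUTINE

# Example: a scheduled transmission handled without patient assistance.
tx = Transmission("PM-001", battery_ok=True, lead_impedance_ohms=650.0, arrhythmia_detected=False)
print(tx.device_id, classify(tx).value)  # -> PM-001 routine
```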
Research Questions
The research directions and specific research questions for this evidence review were as follows:
To identify the Internet-based device-assisted RMSs available for follow-up of patients with therapeutic CIEDs such as PMs, ICDs, and CRT devices.
To identify the potential risks, operational issues, or organizational issues related to Internet-based device-assisted RM for CIEDs.
To evaluate the safety, acceptability, and effectiveness of Internet-based device-assisted RMSs for CIEDs such as PMs, ICDs, and CRT devices.
To evaluate the safety, effectiveness, and cost-effectiveness of Internet-based device-assisted RMSs for CIEDs compared to usual outpatient in-office monitoring strategies.
To evaluate the resource implications or budget impact of RMSs for CIEDs in Ontario, Canada.
Research Methods
Literature Search
The review included a systematic review of published scientific literature and consultations with experts and manufacturers of all 4 approved RMSs for CIEDs in Canada. Information on CIED cardiac implant clinics was also obtained from Provincial Programs, a division within the Ministry of Health and Long-Term Care with a mandate for cardiac implant specialty care. Various administrative databases and registries were used to outline the current clinical follow-up burden of CIEDs in Ontario. The provincial population-based ICD database developed and maintained by the Institute for Clinical Evaluative Sciences (ICES) was used to review the current follow-up practices with Ontario patients implanted with ICD devices.
Search Strategy
A literature search was performed on September 21, 2010 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from 1950 to September 2010. Search alerts were generated and reviewed for additional relevant literature until December 31, 2010. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search.
Inclusion Criteria
published between 1950 and September 2010;
English language full-reports and human studies;
original reports including clinical evaluations of Internet-based device-assisted RMSs for CIEDs in clinical settings;
reports including standardized measurements on outcome events such as technical success, safety, effectiveness, cost, measures of health care utilization, morbidity, mortality, quality of life or patient satisfaction;
randomized controlled trials (RCTs), systematic reviews and meta-analyses, cohort and controlled clinical studies.
Exclusion Criteria
non-systematic reviews, letters, comments and editorials;
reports not involving standardized outcome events;
clinical reports not involving Internet-based device assisted RM systems for CIEDs in clinical settings;
reports involving studies testing or validating algorithms without RM;
studies with small samples (<10 subjects).
Outcomes of Interest
The outcomes of interest included: technical outcomes, emergency department visits, complications, major adverse events, symptoms, hospital admissions, clinic visits (scheduled and/or unscheduled), survival, morbidity (disease progression, stroke, etc.), patient satisfaction, and quality of life.
Summary of Findings
The MAS evidence review was performed to review available evidence on Internet-based device-assisted RMSs for CIEDs published until September 2010. The search identified 6 systematic reviews, 7 randomized controlled trials, and 19 reports for 16 cohort studies—3 of these being registry-based and 4 being multi-centered. The evidence is summarized in the 3 sections that follow.
1. Effectiveness of Remote Monitoring Systems of CIEDs for Cardiac Arrhythmia and Device Functioning
In total, 15 reports on 13 cohort studies involving investigations with 4 different RMSs for CIEDs in cardiology implant clinic groups were identified in the review. The 4 RMSs were: Care Link Network® (Medtronic Inc., Minneapolis, MN, USA); Home Monitoring® (Biotronik, Berlin, Germany); House Call II® (St Jude Medical Inc., St Paul, MN, USA); and a manufacturer-independent RMS. Eight of these reports were with the Home Monitoring® RMS (12,949 patients), 3 were with the Care Link® RMS (167 patients), 1 was with the House Call II® RMS (124 patients), and 1 was with a manufacturer-independent RMS (44 patients). All of the studies, except for 2 in the United States (1 with Home Monitoring® and 1 with House Call II®), were performed in European countries.
The RMSs in the studies were evaluated with different cardiac implant device populations: ICDs only (6 studies), ICD and CRT devices (3 studies), PM and ICD and CRT devices (4 studies), and PMs only (2 studies). The patient populations were predominantly male (range, 52%–87%) in all studies, with mean ages ranging from 58 to 76 years. One study population was unique in that RMSs were evaluated for ICDs implanted solely for primary prevention in young patients (mean age, 44 years) with Brugada syndrome, which carries an inherited increased risk for sudden cardiac death in young adults.
Most of the cohort studies reported on the feasibility of RMSs in clinical settings with limited follow-up. In the short follow-up periods of the studies, the majority of the events were related to detection of medical events rather than system configuration or device abnormalities. The results of the studies are summarized below:
The interrogation of devices on the web platform, both for continuous and scheduled transmissions, was significantly quicker with remote follow-up, both for nurses and physicians.
In a case-control study focusing on a Brugada population–based registry with patients followed-up remotely, there were significantly fewer outpatient visits and greater detection of inappropriate shocks. One death occurred in the control group not followed remotely and post-mortem analysis indicated early signs of lead failure prior to the event.
Two studies examined the role of RMSs in following ICD leads under regulatory advisory in a European clinical setting and noted:
– Fewer inappropriate shocks were administered in the RM group.
– Urgent in-office interrogations and surgical revisions were performed within 12 days of remote alerts.
– No signs of lead fracture were detected at in-office follow-up; all were detected at remote follow-up.
Only 1 study reported evaluating quality of life in patients followed up remotely at 3 and 6 months; no values were reported.
Patient satisfaction was evaluated in 5 cohort studies, all with short-term follow-up: 1 for the Home Monitoring® RMS, 3 for the Care Link® RMS, and 1 for the House Call II® RMS.
– Patients reported receiving a sense of security from the transmitter, a good relationship with nurses and physicians, positive implications for their health, and satisfaction with RM and organization of services.
– Although patients reported that the system was easy to implement and required less than 10 minutes to transmit information, a variable proportion of patients (range, 9% to 39%) reported that they needed the assistance of a caregiver for their transmission.
– The majority of patients would recommend RM to other ICD patients.
– Patients with hearing or other physical or mental conditions hindering the use of the system were excluded from studies, but the frequency of this was not reported.
Physician satisfaction was evaluated in 3 studies, all with the Care Link® RMS:
– Physicians reported ease of use and high satisfaction with generally short-term use of the RMS.
– Physicians reported being able to address the problems in unscheduled patient transmissions or physician-initiated transmissions remotely, and were able to handle the majority of the troubleshooting calls remotely.
– Both nurses and physicians reported a high level of satisfaction with the web registry system.
2. Effectiveness of Remote Monitoring Systems in Heart Failure Patients for Cardiac Arrhythmia and Heart Failure Episodes
Remote follow-up of HF patients implanted with ICD or CRT devices, generally managed in specialized HF clinics, was evaluated in 3 cohort studies: 1 involved the Home Monitoring® RMS and 2 involved the Care Link® RMS. In these RMSs, in addition to the standard diagnostic features, the cardiac devices continuously assess other variables such as patient activity, mean heart rate, and heart rate variability. Intra-thoracic impedance, a proxy measure for lung fluid overload, was also measured in the Care Link® studies. The overall diagnostic performance of these measures cannot be evaluated, as the information was not reported for patients who did not experience intra-thoracic impedance threshold crossings or did not undergo interventions. The study results involved descriptive information on transmissions and alerts in patients experiencing high morbidity and hospitalization in the short study periods.
3. Comparative Effectiveness of Remote Monitoring Systems for CIEDs
Seven RCTs were identified evaluating RMSs for CIEDs: 2 were for PMs (1276 patients) and 5 were for ICD/CRT devices (3733 patients). Studies performed in the clinical setting in the United States involved both the Care Link® RMS and the Home Monitoring® RMS, whereas all studies performed in European countries involved only the Home Monitoring® RMS.
3A. Randomized Controlled Trials of Remote Monitoring Systems for Pacemakers
Two trials, both multicenter RCTs, were conducted in different countries with different RMSs and study objectives. The PREFER trial was a large trial (897 patients) performed in the United States examining the ability of Care Link®, an Internet-based remote PM interrogation system, to detect clinically actionable events (CAEs) sooner than the current in-office follow-up supplemented with transtelephonic monitoring transmissions, a limited form of remote device interrogation. The trial results are summarized below:
In the 375-day mean follow-up, 382 patients were identified with at least 1 CAE—111 patients in the control arm and 271 in the remote arm.
The event rate detected per patient for every type of CAE, except for loss of atrial capture, was higher in the remote arm than the control arm.
The median time to first detection of CAEs (4.9 vs. 6.3 months) was significantly shorter in the RMS group compared to the control group (P < 0.0001).
Additionally, only 2% (3/190) of the CAEs in the control arm were detected during a transtelephonic monitoring transmission (the rest were detected at in-office follow-ups), whereas 66% (446/676) of the CAEs were detected during remote interrogation.
The second study, the OEDIPE trial, was a smaller trial (379 patients) performed in France evaluating whether the Home Monitoring® RMS could shorten PM post-operative hospitalization while maintaining the safety of conventional management with longer hospital stays.
Implementation and operationalization of the RMS was reported to be successful in 91% (346/379) of the patients and represented 8144 transmissions.
In the RM group, 6.5% of patients failed to send messages (10 due to improper use of the transmitter, 2 with unmanageable stress). Of the 172 patients transmitting, 108 patients sent a total of 167 warnings during the trial, with a greater proportion of warnings attributed to medical rather than technical causes.
Forty percent of patients transmitted no warning messages; among these, 6 patients experienced a major adverse event and 1 patient experienced a non-major adverse event. Of the 6 patients with a major adverse event, 5 contacted their physician.
The mean medical reaction time was faster in the RM group (6.5 ± 7.6 days vs. 11.4 ± 11.6 days).
The mean duration of hospitalization was significantly shorter (P < 0.001) for the RM group than the control group (3.2 ± 3.2 days vs. 4.8 ± 3.7 days).
Quality of life estimates by the SF-36 questionnaire were similar for the 2 groups at 1-month follow-up.
3B. Randomized Controlled Trials Evaluating Remote Monitoring Systems for ICD or CRT Devices
The 5 studies evaluating the impact of RMSs with ICD/CRT devices were conducted in the United States and in European countries and involved 2 RMSs—Care Link® and Home Monitoring®. The objectives of the trials varied, and 3 of the trials were smaller pilot investigations.
The first of the smaller studies (151 patients) evaluated patient satisfaction, achievement of patient outcomes, and the cost-effectiveness of the Care Link® RMS compared to quarterly in-office device interrogations with 1-year follow-up.
Individual outcomes such as hospitalizations, emergency department visits, and unscheduled clinic visits were not significantly different between the study groups.
Except for a significantly higher detection of atrial fibrillation in the RM group, data on ICD detection and therapy were similar in the study groups.
Health-related quality of life evaluated by the EuroQoL at 6-month or 12-month follow-up was not different between study groups.
Patients were more satisfied with their ICD care in the clinic follow-up group than in the remote follow-up group at 6-month follow-up, but were equally satisfied at 12-month follow-up.
The second small pilot trial (20 patients) examined the impact of RM follow-up with the House Call 11® system on work schedules and cost savings in patients randomized to 2 study arms varying in the degree of remote follow-up.
The total time including device interrogation, transmission time, data analysis, and physician time required was significantly shorter for the RM follow-up group.
The in-clinic waiting time was eliminated for patients in the RM follow-up group.
The physician talk time was significantly reduced in the RM follow-up group (P < 0.05).
The time for the actual device interrogation did not differ in the study groups.
The third small trial (115 patients) examined the impact of RM with the Home Monitoring® system compared to scheduled trimonthly in-clinic visits on the number of unplanned visits, total costs, health-related quality of life (SF-36), and overall mortality.
There was a 63.2% reduction in in-office visits in the RM group.
Neither hospitalizations nor overall mortality (values not stated) differed significantly between the study groups.
Patient-induced visits were higher in the RM group than the in-clinic follow-up group.
The TRUST Trial
The TRUST trial was a large multicenter RCT conducted at 102 centers in the United States involving the Home Monitoring® RMS for ICD devices for 1450 patients. The primary objectives of the trial were to determine if remote follow-up could be safely substituted for in-office clinic follow-up (3 in-office visits replaced) and still enable earlier physician detection of clinically actionable events.
Adherence to the protocol follow-up schedule was significantly higher in the RM group than the in-office follow-up group (93.5% vs. 88.7%, P < 0.001).
Actionability of trimonthly scheduled checks was low (6.6%) in both study groups. Overall, actionable causes were reprogramming (76.2%), medication changes (24.8%), and lead/system revisions (4%), and these were not different between the 2 study groups.
The overall mean number of in-clinic and hospital visits was significantly lower in the RM group than the in-office follow-up group (2.1 per patient-year vs. 3.8 per patient-year, P < 0.001), representing a 45% visit reduction at 12 months (this reduction is recomputed in the sketch following this list).
The median time from onset of first arrhythmia to physician evaluation was significantly shorter (P < 0.001) in the RM group than in the in-office follow-up group for all arrhythmias (1 day vs. 35.5 days).
The median time to detect clinically asymptomatic arrhythmia events—atrial fibrillation (AF), ventricular fibrillation (VF), ventricular tachycardia (VT), and supra-ventricular tachycardia (SVT)—was also significantly shorter (P < 0.001) in the RM group compared to the in-office follow-up group (1 day vs. 41.5 days) and was significantly quicker for each of the clinical arrhythmia events—AF (5.5 days vs. 40 days), VT (1 day vs. 28 days), VF (1 day vs. 36 days), and SVT (2 days vs. 39 days).
System-related problems occurred infrequently in both groups—in 1.5% of patients (14/908) in the RM group and in 0.7% of patients (3/432) in the in-office follow-up group.
The overall adverse event rate over 12 months was not significantly different between the 2 groups and individual adverse events were also not significantly different between the RM group and the in-office follow-up group: death (3.4% vs. 4.9%), stroke (0.3% vs. 1.2%), and surgical intervention (6.6% vs. 4.9%), respectively.
The 12-month cumulative survival was 96.4% (95% confidence interval [CI], 95.5%–97.6%) in the RM group and 94.2% (95% CI, 91.8%–96.6%) in the in-office follow-up group, and was not significantly different between the 2 groups (P = 0.174).
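To illustrate the visit-reduction arithmetic reported for the TRUST trial above, the following Python sketch recomputes the relative reduction from the published per-patient-year rates. The helper for deriving a rate from raw visit counts and follow-up time is a hypothetical illustration; the trial reports only the aggregate rates used here.

# Minimal sketch (Python): relative reduction in follow-up visit rates,
# using the per-patient-year figures reported for the TRUST trial.
# The rate_per_patient_year helper and its inputs are hypothetical;
# the publication reports only the aggregate rates used below.

def rate_per_patient_year(total_visits: float, total_follow_up_years: float) -> float:
    """Visits per patient-year from raw counts (hypothetical inputs)."""
    return total_visits / total_follow_up_years

def relative_reduction(control_rate: float, remote_rate: float) -> float:
    """Fractional reduction in the remote arm relative to the control arm."""
    return (control_rate - remote_rate) / control_rate

if __name__ == "__main__":
    in_office_rate = 3.8   # visits per patient-year, in-office follow-up arm
    remote_rate = 2.1      # visits per patient-year, remote monitoring arm
    print(f"Relative reduction: {relative_reduction(in_office_rate, remote_rate):.0%}")
    # Prints "Relative reduction: 45%", matching the reported 45% visit reduction.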
The CONNECT Trial
The CONNECT trial, another major multicenter RCT, involved the Care Link® RMS for ICD/CRT devices in a 15-month follow-up study of 1,997 patients at 133 sites in the United States. The primary objective of the trial was to determine whether automatically transmitted physician alerts decreased the time from the occurrence of clinically relevant events to medical decisions. The trial results are summarized below:
Of the 575 clinical alerts in the study, 246 did not trigger an automatic physician notification. Transmission failures were related to technical issues, such as the alert not being programmed or not being reset, and/or patient factors, such as the patient not being at home or the monitor not being plugged in or set up.
The overall mean time from the clinically relevant event to the clinical decision was significantly shorter (P < 0.001) by 17.4 days in the remote follow-up group (4.6 days for 172 patients) than the in-office follow-up group (22 days for 145 patients).
– The median time to a clinical decision was shorter in the remote follow-up group than in the in-office follow-up group for an AT/AF burden greater than or equal to 12 hours (3 days vs. 24 days) and a fast VF rate greater than or equal to 120 beats per minute (4 days vs. 23 days).
Events involving low battery and VF detection/therapy being turned off were infrequent and occurred in similarly low numbers in both groups. More alerts, however, were noted for out-of-range lead impedance in the RM group (18 vs. 6 patients), and the time to detect these critical events was significantly shorter in the RM group (same day vs. 17 days).
Total in-office clinic visits were reduced by 38% from 6.27 visits per patient-year in the in-office follow-up group to 3.29 visits per patient-year in the remote follow-up group.
Health care utilization visits (N = 6,227) that included cardiovascular-related hospitalization, emergency department visits, and unscheduled clinic visits were not significantly higher in the remote follow-up group.
The overall mean length of hospitalization was significantly shorter (P = 0.002) for those in the remote follow-up group (3.3 days vs. 4.0 days) and was shorter both for patients with ICD (3.0 days vs. 3.6 days) and CRT (3.8 days vs. 4.7 days) implants.
The mortality rate was not significantly different between the follow-up groups for either the ICDs (P = 0.31) or the CRT devices with defibrillator (P = 0.46).
Conclusions
There is limited clinical trial information on the effectiveness of RMSs for PMs. For RMSs for ICD devices, however, multiple cohort studies and 2 large multicenter RCTs demonstrated feasibility and significant reductions in in-office clinic follow-ups with RMSs in the first year post implantation. The detection rates of clinically significant events (and asymptomatic events) were higher, and the time to a clinical decision for these events was significantly shorter, in the remote follow-up groups than in the in-office follow-up groups. The earlier detection of clinical events in the remote follow-up groups, however, was not associated with lower morbidity or mortality rates in the 1-year follow-up. The substitution of almost all first-year in-office clinic follow-ups with RM was also not associated with increased health care utilization, such as emergency department visits or hospitalizations.
The follow-up in the trials was generally short-term, up to 1 year, and thus provided only a limited assessment of potential longer-term device/lead integrity complications. None of the studies compared the different RMSs, particularly RMSs involving patient-scheduled transmissions versus automatic transmissions. Patients’ acceptance of and satisfaction with RM were reported to be high, but the impact of RM on patients’ health-related quality of life, particularly its psychological aspects, was not evaluated thoroughly. Patients who are not technologically adept, or who have hearing or other physical/mental impairments, were identified as potentially disadvantaged by remote surveillance. Cohort studies consistently identified subgroups of patients who preferred in-office follow-up. Costs and workflow impacts on the health care system were evaluated only in a limited way, in European or American clinical settings.
Internet-based device-assisted RMSs represent a new approach to monitoring patients, their disease progression, and their CIEDs. Remote monitoring also has the potential to improve current postmarket surveillance systems for evolving CIEDs and their ongoing hardware and software modifications. At this point, however, there is insufficient information to evaluate the overall impact on the health care system, although the time savings and convenience to patients and physicians from substituting in-office follow-up with RM are more certain. The broader issues surrounding infrastructure, impacts on existing clinical care systems, and regulatory concerns need to be considered before implementing Internet-based RMSs in jurisdictions with different clinical practices.
PMCID: PMC3377571  PMID: 23074419
22.  Revision of visual impairment definitions in the International Statistical Classification of Diseases 
BMC Medicine  2006;4:7.
Background
The existing definitions of visual impairment in the International Statistical Classification of Diseases are based on recommendations made over 30 years ago. New data and knowledge related to visual impairment that have accumulated over this period suggest that these definitions need to be revised.
Discussion
Three major issues need to be addressed in the revision of these definitions. First, the existing definitions are based on best-corrected visual acuity, which excludes uncorrected refractive error as a cause of visual impairment and leads to substantial underestimation of the total visual impairment burden, by about 38%. Second, the cut-off level used to define blindness in the International Statistical Classification of Diseases is visual acuity less than 3/60 in the better eye; with increasing human development, visual acuity requirements are also increasing, suggesting that a level of less than 6/60 be used to define blindness. Third, the International Statistical Classification of Diseases uses the term 'low vision' for visual impairment less severe than blindness, which causes confusion with the common use of this term for uncorrectable vision requiring aids or rehabilitation; alternative terms such as moderate and mild visual impairment would be more appropriate for visual impairment less severe than blindness. We propose a revision of the definitions of visual impairment in the International Statistical Classification of Diseases that addresses these three issues. According to these revised definitions, the number of blind persons in the world, defined as presenting visual acuity less than 6/60 in the better eye, would be about 57 million, compared with the World Health Organization estimate of 37 million using the existing International Statistical Classification of Diseases definition of best-corrected visual acuity less than 3/60 in the better eye. The number of persons in the world with moderate visual impairment, defined as presenting visual acuity less than 6/18 to 6/60 in the better eye, would be about 202 million, compared with the World Health Organization estimate of 124 million persons with low vision defined as best-corrected visual acuity less than 6/18 to 3/60 in the better eye.
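The proposed revision reduces, in effect, to a threshold rule on presenting visual acuity in the better eye. The Python sketch below is one illustrative encoding of that rule, assuming Snellen fractions are compared as decimal acuities (6/60 = 0.1); the function names and the residual category for acuities of 6/18 or better are assumptions for illustration, not part of the ICD proposal.

# Illustrative encoding (Python) of the proposed revised definitions: categories
# are based on presenting (not best-corrected) visual acuity in the better eye.
# Snellen fractions such as 6/60 are compared as decimal acuities (6/60 = 0.1).
# This is a sketch of the rule described above, not an official ICD algorithm.

def snellen_to_decimal(snellen: str) -> float:
    """Convert a Snellen fraction like '6/18' to a decimal acuity."""
    numerator, denominator = snellen.split("/")
    return float(numerator) / float(denominator)

def classify_presenting_acuity(better_eye_acuity: float) -> str:
    """Classify visual impairment from presenting acuity in the better eye."""
    if better_eye_acuity < snellen_to_decimal("6/60"):
        return "blindness (proposed: presenting acuity < 6/60)"
    if better_eye_acuity < snellen_to_decimal("6/18"):
        return "moderate visual impairment (< 6/18 to 6/60)"
    return "better than the moderate-impairment threshold (not subdivided here)"

if __name__ == "__main__":
    for snellen in ("3/60", "6/60", "6/24", "6/6"):
        print(f"{snellen}: {classify_presenting_acuity(snellen_to_decimal(snellen))}")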
Conclusion
Our suggested revision of the visual impairment definitions in the International Statistical Classification of Diseases takes into account advances in the understanding of visual impairment. This revised classification seems more appropriate for estimating and tracking visual impairment in the countries and regions of the world than the existing classification in the International Statistical Classification of Diseases.
doi:10.1186/1741-7015-4-7
PMCID: PMC1435919  PMID: 16539739
23.  Other ways of seeing: From behavior to neural mechanisms in the online “visual” control of action with sensory substitution 
Vision is the dominant sense for perception-for-action in humans and other higher primates. Advances in sight restoration now use sensory substitution, conveying information that is normally sensed visually through the other intact senses, to replace missing visual input. Sensory substitution devices translate visual information from a sensor, such as a camera or ultrasound device, into a format that the auditory or tactile systems can detect and process, so that visually impaired users can see through hearing or touch. Online control of action is essential for many daily tasks such as pointing, grasping, and navigating, and adapting successfully to a sensory substitution device requires extensive learning. Here we review the research on sensory substitution for vision restoration in the context of providing the means of online control of action in blind or blindfolded individuals. Use of sensory substitution devices appears to engage the neural visual system, suggesting the hypothesis that sensory substitution draws on the same underlying mechanisms as unimpaired visual control of action. We review the current state of the art of sensory substitution approaches to object recognition, localization, and navigation, and the potential these approaches have for revealing a metamodal behavioral and neural basis for the online control of action.
doi:10.3233/RNN-150541
PMCID: PMC4927905  PMID: 26599473
Sensory substitution; blindness; prosthetics; object recognition; perception for action; active sensing
24.  Are Supramodality and Cross-Modal Plasticity the Yin and Yang of Brain Development? From Blindness to Rehabilitation 
Research in blind individuals has long focused on the plastic reorganization that occurs in early visual areas of the brain. Only more recently have scientists developed innovative strategies to understand to what extent vision is truly a mandatory prerequisite for the brain’s fine morphological architecture to develop and function. As a whole, the studies conducted to date in sighted and congenitally blind individuals have provided ample evidence that several “visual” cortical areas develop independently of visual experience and process information content regardless of the sensory modality through which a particular stimulus is conveyed: a property named supramodality. At the same time, lack of vision leads to structural and functional reorganization within “visual” brain areas, a phenomenon known as cross-modal plasticity. Cross-modal recruitment of the occipital cortex in visually deprived individuals represents an adaptive compensatory mechanism that mediates processing of non-visual inputs. Supramodality and cross-modal plasticity appear to be the “yin and yang” of brain development: supramodal is what takes place despite the lack of vision, whereas cross-modal is what happens because of the lack of vision. Here we provide a critical overview of the research in this field and discuss the implications of these novel findings for the development of educational/rehabilitation approaches and sensory substitution devices (SSDs) in sensory-impaired individuals.
doi:10.3389/fnsys.2016.00089
PMCID: PMC5099160  PMID: 27877116
rehabilitation; blindness; supramodal; crossmodal; sensory substitution; fMRI; MRI
25.  Pulmonary Rehabilitation for Patients With Chronic Pulmonary Disease (COPD) 
Executive Summary
In July 2010, the Medical Advisory Secretariat (MAS) began work on a Chronic Obstructive Pulmonary Disease (COPD) evidentiary framework, an evidence-based review of the literature surrounding treatment strategies for patients with COPD. This project emerged from a request by the Health System Strategy Division of the Ministry of Health and Long-Term Care that MAS provide them with an evidentiary platform on the effectiveness and cost-effectiveness of COPD interventions.
After an initial review of health technology assessments and systematic reviews of COPD literature, and consultation with experts, MAS identified the following topics for analysis: vaccinations (influenza and pneumococcal), smoking cessation, multidisciplinary care, pulmonary rehabilitation, long-term oxygen therapy, noninvasive positive pressure ventilation for acute and chronic respiratory failure, hospital-at-home for acute exacerbations of COPD, and telehealth (including telemonitoring and telephone support). Evidence-based analyses were prepared for each of these topics. For each technology, an economic analysis was also completed where appropriate. In addition, a review of the qualitative literature on patient, caregiver, and provider perspectives on living and dying with COPD was conducted, as were reviews of the qualitative literature on each of the technologies included in these analyses.
The Chronic Obstructive Pulmonary Disease Mega-Analysis series is made up of the following reports, which can be publicly accessed at the MAS website at: http://www.hqontario.ca/en/mas/mas_ohtas_mn.html.
Chronic Obstructive Pulmonary Disease (COPD) Evidentiary Framework
Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Smoking Cessation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Pulmonary Rehabilitation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Long-term Oxygen Therapy for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Acute Respiratory Failure Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Noninvasive Positive Pressure Ventilation for Chronic Respiratory Failure Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Hospital-at-Home Programs for Patients With Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Home Telehealth for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease Using an Ontario Policy Model
Experiences of Living and Dying With COPD: A Systematic Review and Synthesis of the Qualitative Empirical Literature
For more information on the qualitative review, please contact Mita Giacomini at: http://fhs.mcmaster.ca/ceb/faculty member_giacomini.htm.
For more information on the economic analysis, please visit the PATH website: http://www.path-hta.ca/About-Us/Contact-Us.aspx.
The Toronto Health Economics and Technology Assessment (THETA) collaborative has produced an associated report on patient preference for mechanical ventilation. For more information, please visit the THETA website: http://theta.utoronto.ca/static/contact.
Objective
The objective of this evidence-based review was to determine the effectiveness and cost-effectiveness of pulmonary rehabilitation in the management of chronic obstructive pulmonary disease (COPD).
Technology
Pulmonary rehabilitation refers to a multidisciplinary program of care for patients with chronic respiratory impairment that is individually tailored and designed to optimize physical and social performance and autonomy. Exercise training is the cornerstone of pulmonary rehabilitation programs, though they may also include components such as patient education and psychological support. Pulmonary rehabilitation is recommended as the standard of care in the treatment and rehabilitation of patients with COPD who remain symptomatic despite treatment with bronchodilators.
For the purpose of this review, the Medical Advisory Secretariat focused on pulmonary rehabilitation programs as defined by the Cochrane Collaboration—that is, any inpatient, outpatient, or home-based rehabilitation program lasting at least 4 weeks that includes exercise therapy with or without any form of education and/or psychological support delivered to patients with exercise limitations attributable to COPD.
Research Questions
What is the effectiveness and cost-effectiveness of pulmonary rehabilitation compared with usual care (UC) for patients with stable COPD?
Does early pulmonary rehabilitation (within 1 month of hospital discharge) in patients who had an acute exacerbation of COPD improve outcomes compared with UC (or no rehabilitation)?
Do maintenance or postrehabilitation programs for patients with COPD who have completed a pulmonary rehabilitation program improve outcomes compared with UC?
Research Methods
Literature Search
Search Strategy
For Research Questions 1 and 2, a literature search was performed on August 10, 2010 for studies published from January 1, 2004 to July 31, 2010. For Research Question 3, a literature search was performed on February 3, 2011 for studies published from January 1, 2000 to February 3, 2011. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists and health technology assessment websites were also examined for any additional relevant studies not identified through the systematic search.
Inclusion Criteria
Research questions 1 and 2:
published between January 1, 2004 and July 31, 2010
randomized controlled trials, systematic reviews, and meta-analyses
COPD study population
studies comparing pulmonary rehabilitation with UC (no pulmonary rehabilitation)
duration of pulmonary rehabilitation program ≥ 6 weeks
pulmonary rehabilitation program had to include at minimum exercise training
Research question 3:
published between January 1, 2000 and February 3, 2011
randomized controlled trials, systematic reviews, and meta-analyses
COPD study population
studies comparing a maintenance or postrehabilitation program with UC (standard follow-up)
duration of pulmonary rehabilitation program ≥ 6 weeks
initial pulmonary rehabilitation program had to include at minimum exercise training
Exclusion Criteria
Research questions 1, 2, and 3:
grey literature
duplicate publications
non-English language publications
study population ≤ 18 years of age
studies conducted in a palliative population
studies that did not report primary outcome of interest
Additional exclusion criteria for research question 3:
studies with ≤ 2 sessions/visits per month
Outcomes of Interest
The primary outcomes of interest for the stable COPD population were exercise capacity and health-related quality of life (HRQOL). For the COPD population following an exacerbation, the primary outcomes of interest were hospital readmissions and HRQOL. The primary outcomes of interest for the COPD population undertaking maintenance programs were functional exercise capacity and HRQOL.
Quality of Evidence
The quality of each included study was assessed taking into consideration allocation concealment, randomization, blinding, power/sample size, withdrawals/dropouts, and intention-to-treat analyses.
The quality of the body of evidence was assessed as high, moderate, low, or very low according to the GRADE Working Group criteria, and standard GRADE definitions were used in grading the quality of the evidence.
Summary of Findings
Research Question 1: Effect of Pulmonary Rehabilitation on Outcomes in Stable COPD
Seventeen randomized controlled trials met the inclusion criteria and were included in this review.
The following conclusions are based on moderate quality of evidence.
Pulmonary rehabilitation including at least 4 weeks of exercise training leads to clinically and statistically significant improvements in HRQOL in patients with COPD.
Pulmonary rehabilitation also leads to a clinically and statistically significant improvement in functional exercise capacity (weighted mean difference, 54.83 m; 95% confidence interval, 35.63–74.03; P < 0.001).
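The pooled figure above is a weighted mean difference with a 95% confidence interval, the kind of estimate produced by a fixed-effect, inverse-variance meta-analysis. The Python sketch below shows that computation on hypothetical per-trial mean differences and standard errors; the inputs are invented for illustration and do not reproduce the review's pooled estimate.

# Minimal sketch (Python): fixed-effect inverse-variance pooled (weighted) mean
# difference with a 95% CI, the kind of statistic quoted above.
# The per-trial inputs are hypothetical illustrations, not the review's data.
from math import sqrt

def pooled_mean_difference(mean_diffs, standard_errors):
    """Inverse-variance weighted mean difference and its 95% CI."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * md for w, md in zip(weights, mean_diffs)) / sum(weights)
    pooled_se = sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

if __name__ == "__main__":
    # Hypothetical walking-distance mean differences (metres) and standard errors
    mean_diffs = [60.0, 45.0, 52.0]
    standard_errors = [15.0, 20.0, 12.0]
    estimate, (lower, upper) = pooled_mean_difference(mean_diffs, standard_errors)
    print(f"Pooled WMD: {estimate:.1f} m (95% CI {lower:.1f} to {upper:.1f})")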
Research Question 2: Effect of Pulmonary Rehabilitation on Outcomes Following an Acute Exacerbation of COPD
Five randomized controlled trials met the inclusion criteria and are included in this review. The following conclusion is based on moderate quality of evidence.
Pulmonary rehabilitation (within 1 month of hospital discharge) after acute exacerbation significantly reduces hospital readmissions (relative risk, 0.50; 95% confidence interval, 0.33–0.77; P = 0.001) and leads to a statistically and clinically significant improvement in HRQOL.
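The readmission result above is a pooled relative risk with a 95% confidence interval. As a reminder of how such an estimate is formed at the single-study level, the Python sketch below computes a relative risk and its log-scale 95% CI from a 2×2 table; the event counts are hypothetical and are not taken from the included trials.

# Minimal sketch (Python): relative risk and its 95% CI from a 2x2 table,
# computed on the log scale as in standard meta-analysis practice.
# The counts below are hypothetical, not data from the included trials.
from math import exp, log, sqrt

def relative_risk(events_treated, n_treated, events_control, n_control):
    """Relative risk with a 95% CI using the log-normal approximation."""
    risk_treated = events_treated / n_treated
    risk_control = events_control / n_control
    rr = risk_treated / risk_control
    se_log_rr = sqrt(1 / events_treated - 1 / n_treated
                     + 1 / events_control - 1 / n_control)
    return rr, exp(log(rr) - 1.96 * se_log_rr), exp(log(rr) + 1.96 * se_log_rr)

if __name__ == "__main__":
    # Hypothetical readmission counts: rehabilitation arm vs. usual-care arm
    rr, lower, upper = relative_risk(events_treated=20, n_treated=110,
                                     events_control=35, n_control=105)
    print(f"RR = {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")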
Research Question 3: Effect of Pulmonary Rehabilitation Maintenance Programs on COPD Outcomes
Three randomized controlled trials met the inclusion criteria and are included in this review. The conclusions are based on a low quality of evidence and must therefore be considered with caution.
Maintenance programs have a nonsignificant effect on HRQOL and hospitalizations.
Maintenance programs have a statistically but not clinically significant effect on exercise capacity (P = 0.01). When subgrouped by intensity and quality of study, maintenance programs have a statistically and marginally clinically significant effect on exercise capacity.
PMCID: PMC3384375  PMID: 23074434
