Ocular trauma remains an important cause of visual morbidity worldwide. A previous population-based study in Scotland reported a 1-year cumulative incidence of 8.14 per 100 000 population. The purpose of this study was to identify any change in the incidence and pattern of serious ocular trauma in Scotland.
This study was a 1-year prospective observational study using the British Ophthalmological Surveillance Unit reporting scheme among Scottish ophthalmologists. Serious ocular trauma was defined as requiring hospital admission. Data were collected using two questionnaires for each patient 1 year apart.
The response rate from ophthalmologists was 77.1%. A total of 102 patients with complete data were reported, giving an incidence of 1.96 per 100 000 population, approximately a quarter of the 1992 figure. In patients younger than 65 years, the age-adjusted incidence ratio (males/females) indicated a ninefold higher risk of trauma in males. In 25 patients (27.2%), the injured eye was blind (final visual acuity (FVA) <6/60), with 24 of these cases attributable to the eye injury. Standardised morbidity ratios suggested a threefold decrease in the risk of a poor visual outcome in 2009 compared with 1992.
The incidence of serious ocular trauma has fallen; this study has shown hospital admission for serious eye injury in Scotland has decreased fourfold in 17 years. Young adult males continue to be at highest risk, which needs to be specifically addressed in future health-prevention strategies. This study also observed a reduction in visual loss from serious ocular injuries, although the reasons for this require further exploration.
ocular trauma; epidemiology; prevention; penetrating injury
Testicular cell culture of the crab Scylla serrata (Forskal) was used to study the effects of white spot syndrome virus (WSSV). We demonstrate the susceptibility of cultured crab cells to WSSV. Proliferating testicular cell cultures were maintained for more than 4 months in a medium prepared from L-15 and crab saline supplemented with epidermal growth factor. Cell cultures inoculated with different concentrations of the virus showed distinct cytopathic effects, including changes in cell appearance, shrinkage, and cell lysis. WSSV infection of the cultured cells was confirmed by nested PCR. The incorporation of viral DNA into cultured cells was shown by RAPD profiles generated using 10-mer primers. Control cultures not exposed to WSSV showed no cytopathic effects. This work demonstrates the usefulness of proliferating testicular cell culture for studying WSSV infection using molecular tools, and thus opens new avenues for WSSV diagnostics and drug development.
White spot syndrome virus; Scylla serrata (Forskal); Crab; Cell culture; Nested PCR; Pathogenesis
To analyze whether an association exists between keratometric and pachymetric changes in the cornea, and whether it can be used to create pachymetric cutoff criteria secondary to keratometric criteria.
In this cross-sectional study, 1000 candidates presenting to the refractive surgery services of a tertiary care hospital underwent bilateral Orbscan IIz (Bausch and Lomb) assessment along with other ophthalmic evaluation.
Stepwise regression models showed that simulated keratometry (simK) astigmatism was significantly predicted by the minimum corneal thickness (MCT) and the difference between central corneal thickness and MCT (δCT), mean simK by the MCT and δCT, and maximum keratometry in the central 10-mm zone by the MCT and δCT (P<0.001). The mean MCT values were 542.5±39.6, 539.9±39.2, 524.2±49.5, and 449.3±73.7 μm for flatter normal (<44 D), steeper normal (≥44 D), keratoconus suspect, and keratoconic eyes, respectively (P<0.001). The mean δCT values were 12.2±7.1, 12.4±7.4, 14.4±8.9, and 23.2±10.1 μm for the flatter normal, steeper normal, keratoconus suspect, and keratoconic eyes, respectively (P<0.001). Cutoffs set at the mean ± 2SD suggested that a cornea with MCT <461 μm or δCT >27 μm has only a 2.5% chance of being normal rather than keratoconus suspect or worse.
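The mean ± 2SD cutoffs above can be reproduced from the reported normal-group statistics. The following is an illustrative sketch only; which normal subgroup the authors based the cutoffs on is an assumption here (the steeper-normal values reproduce the published figures most closely):

```python
# Hedged sketch: derive pachymetric cutoffs as mean -/+ 2*SD from the
# normal-group statistics reported in the abstract (assumed to be the
# steeper-normal group, since those values best match the cutoffs).
mct_mean, mct_sd = 539.9, 39.2  # steeper-normal MCT (micrometres)
dct_mean, dct_sd = 12.4, 7.4    # steeper-normal deltaCT (micrometres)

# About 2.5% of a normal distribution lies beyond each one-sided 2SD bound,
# hence the "2.5% chance of being normal" interpretation in the abstract.
mct_cutoff = mct_mean - 2 * mct_sd  # thinner than this -> suspect or worse
dct_cutoff = dct_mean + 2 * dct_sd  # larger thinning gradient -> suspect

print(int(mct_cutoff))  # prints 461, matching the published MCT cutoff
print(int(dct_cutoff))  # prints 27, matching the published deltaCT cutoff
```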
Pachymetric diagnostic cutoffs can be used as adjuncts to the existing topographic criteria to screen keratoconus suspect and keratoconic eyes.
keratoconus; keratoconus suspect; cutoffs; pachymetry; keratometry
Cronobacter (previously known as Enterobacter sakazakii) is a diverse bacterial genus consisting of seven species: C. sakazakii, C. malonaticus, C. turicensis, C. universalis, C. muytjensii, C. dublinensis, and C. condimenti. In this study, we have used a multilocus sequence typing (MLST) approach employing the alleles of 7 genes (atpD, fusA, glnS, gltB, gyrB, infB, and ppsA; total length, 3,036 bp) to investigate the phylogenetic relationship of 325 Cronobacter species isolates. Strains were chosen on the basis of their species, geographic and temporal distribution, source, and clinical outcome. The earliest strain was isolated from milk powder in 1950, and the earliest clinical strain was isolated in 1953. The existence of seven species was supported by MLST. Intraspecific variation ranged from low diversity in C. sakazakii to extensive diversity within some species, such as C. muytjensii and C. dublinensis, including evidence of gene conversion between species. The predominant species from clinical sources was found to be C. sakazakii. C. sakazakii sequence type 4 (ST4) was the predominant sequence type of cerebrospinal fluid isolates from cases of meningitis.
Newborns are colonized with an intestinal microbiota shortly after birth but the factors governing the retention and abundance of specific microbial lineages are unknown. Nursing infants consume human milk oligosaccharides (HMOs) that pass undigested to the distal gut where they may be digested by microbes. We determined that the prominent neonate gut residents, Bacteroides thetaiotaomicron and Bacteroides fragilis, induce the same genes during HMO consumption that are used to harvest host mucus glycans, which are structurally similar to HMOs. Lacto-N-neotetraose, a specific HMO component, selects for HMO-adapted species such as Bifidobacterium infantis, which cannot use mucus, and provides a selective advantage to B. infantis in vivo when bi-associated with B. thetaiotaomicron in the gnotobiotic mouse gut. This indicates that the complex oligosaccharide mixture within HMOs attracts both mutualistic mucus-adapted species and HMO-adapted bifidobacteria to the infant intestine that likely facilitate both milk and future solid food digestion.
This is the first report on the development of a finite cell line from testicular tissues of the crab Scylla serrata. Both explant and segregated testicular tissues yielded cells that could proliferate and grow. These cells ranged in size from 10 to 38 μm, with distinct nuclei of varying shapes. The testicular cells survived and proliferated best in L-15-crab saline medium supplemented with epidermal growth factor (20 ng/mL) and glucose (1 mg/mL). The cell proliferation rate was assessed by methyl tetrazolium (MTT) assay in terms of change in optical density, which indicated a clear increase in cell density. The testicular cells were subcultured at intervals of 4–6 days. These subcultured cells remained healthy and proliferated for 5 months over a minimum of ten passages. The finite cell line was characterized in terms of morphology, growth rate, lactate dehydrogenase release (for assessing health status), and 18S rRNA sequencing. This cell line could be a very useful tool for studying infection and replication of crustacean viruses. The present work provides a technique that could be extended to the development of other crustacean cell lines.
Cell culture; Cell line; Scylla serrata; Crab; 18S rRNA sequence; MTT
To evaluate the rate of seropositivity to hepatitis B and C and Human Immunodeficiency Virus (HIV) infections among children with β-thalassemia major receiving multiple transfusions in Ahmedabad, India, compared with healthy controls.
Materials and Methods:
The study was performed from January 2007 to January 2009 on multi-transfused children suffering from β-thalassemia major registered at the Prathama Blood Centre, Ahmedabad; Jeevandeep Hospital, Ahmedabad; and the Red Cross Blood Centre, Ahmedabad, who were investigated for the prevalence and development of transfusion-transmitted infections. Hepatitis B surface antigen (HBsAg), anti-hepatitis C virus (HCV) antibodies (Ab), and HIV Ab were screened for using a fourth-generation enzyme-linked immunosorbent assay (ELISA). Positive tests were confirmed by Western blot. Healthy blood donors were used as the control group.
Hepatitis B surface antigen, anti-HCV Ab, and HIV Ab were positive in one of 96 (1.04%; 95% Confidence Interval (CI) = 0.17–1.3), 24 of 96 (25%; 95% CI = 11.4–14.2), and one of 96 (1.04%; 95% CI = 0.12–1.3) children, respectively. The rate of anti-HCV Ab positivity was significantly higher in multi-transfused children suffering from β-thalassemia major. In thalassemia patients, the rate of positive anti-HCV Ab was significantly higher than that of positive HBsAg (P<0.001) and HIV Ab (P<0.001).
It is concluded that HCV infection is currently the major problem in multi-transfused children with thalassemia major, and that more careful pretransfusion screening of blood for anti-HCV must be introduced in blood centers.
Hepatitis B; hepatitis C; Human immunodeficiency virus; β-thalassemia major; seroprevalence
This paper reports, for the first time, primary cell culture of the hepatopancreas from the edible crab Scylla serrata using crab saline, L-15 (Leibovitz), 1 × L-15 + crab saline, 2 × L-15 + crab saline, 3 × L-15, and citrate buffer, all without serum. We could isolate and maintain E (Embryonalzellen), F (Fibrenzellen), B (Blasenzellen), R (Restzellen), and G (granular) cells. Upon seeding, the hepatopancreatic E, F, B, and R cells showed a different survival pattern over time from that of granular cells. A modified L-15 (3×) medium supported the best survival of hepatopancreatic E, F, B, and R cells in vitro. However, granular cells could be maintained for 184 days in L-15 (1×) + crab saline. Fetal bovine serum was not an effective additive and hampered cell viability in the present study.
Hepatopancreas; Scylla serrata; Primary cultures; E (Embryonalzellen); F (Fibrenzellen); B (Blasenzellen); R (Restzellen) and G (Granular cells)
Although possession of the ε4 allele of the apolipoprotein E gene appears to be an important biological marker for Alzheimer's disease (AD) susceptibility, strong evidence indicates that at least one additional risk gene exists on chromosome 12.
Here, we describe an association of the 3'-UTR +1073 C/T polymorphism of the OLR1 (oxidised LDL receptor 1) on chromosome 12 with AD in French sporadic (589 cases and 663 controls) and American familial (230 affected sibs and 143 unaffected sibs) populations. The age and sex adjusted odds ratio between the CC+CT genotypes versus the TT genotypes was 1.56 (p=0.001) in the French sample and 1.92 (p=0.02) in the American sample. Furthermore, we have discovered a new T/A polymorphism two bases upstream of the +1073 C/T polymorphism. This +1071 T/A polymorphism was not associated with the disease, although it may weakly modulate the impact of the +1073 C/T polymorphism.
Using 3'-UTR sequence probes, we have observed specific DNA protein binding with nuclear proteins from lymphocyte, astrocytoma, and neuroblastoma cell lines, but not from the microglia cell line. This binding was modified by both the +1071 T/A and +1073 C/T polymorphisms. In addition, a trend was observed between the presence or absence of the +1073 C allele and the level of astrocytic activation in the brain of AD cases. However, Aβ40, Aβ42, and total Aβ loads, Tau loads, and the level of microglial cell activation were not modulated by the 3'-UTR OLR1 polymorphisms. Finally, we assessed the impact of these polymorphisms on the level of OLR1 expression in lymphocytes from AD cases compared with controls. The OLR1 expression was significantly lower in AD cases bearing the CC and CT genotypes compared with controls with the same genotypes. In conclusion, our data suggest that genetic variation in the OLR1 gene may modify the risk of AD.
Background/aims: In diabetics, cataract is associated with higher risk of death. In non-diabetics the data are conflicting, but some indicate an association between one type of cataract (nuclear) and increased mortality. The aim of this study was to estimate and compare age and sex specific mortality for elderly people with and without cataract in a population based cohort.
Methods: A random sample drawn from a defined population of elderly people (age 65 and older) registered with 17 general practice groups in north London formed the study cohort and were followed up for 4 years. The age and sex specific mortality from various causes was estimated and compared in those with and without cataract.
Results: In non-diabetics (n=1318), cataract (lens opacity at baseline) was significantly associated with higher mortality in women. The age standardised death rate per 1000 was 39.8 and 24.8 in women with and without cataract, respectively (age adjusted hazard ratio 1.7, confidence limits 1.1 to 2.7, p=0.032). This was not the case in non-diabetic men (hazard ratio 0.9, confidence limits 0.6 to 1.5, p=0.782). The excess mortality in women with cataract was consistent for cardiovascular, respiratory, and other non-cancer causes of death. There was no association between cataract and mortality from cancer.
Conclusions: This study has shown, for the first time, that cataract is associated with higher mortality in women but not in men, among the non-diabetic population. This sex effect suggests that women may be exposed to risk factors that increase both the risk of cataract and mortality, and that men may have little or no exposure to these "sex specific" factors. Possible risk factors warranting further investigation may be those associated with pregnancy and childbearing experience.
mortality; women; cataract; London
BACKGROUND—Cataract extraction constitutes the largest surgical workload in ophthalmic units throughout the world. Extracapsular cataract extraction (ECCE), through a large incision, with insertion of an intraocular lens has been the most widely used method from 1982 until recently. Technological advances have led to the increasing use of phacoemulsification (Phako) to emulsify and remove the lens. The technique requires a smaller incision but substantial capital investment in theatre equipment. In this randomised trial we assessed the clinical outcomes and carried out an economic evaluation of the two procedures.
METHODS—In this two centre randomised trial, 232 patients with age related cataract received ECCE, and 244 received small incision surgery by Phako. The main comparative outcomes were visual acuity, refraction, and complication rates. Resource use was monitored in the two trial centres and in an independent comparator centre. Costs calculated included average cost per procedure, at each stage of follow up.
RESULTS—Phako was found to be clinically superior. Surgical complications and capsule opacity within 1 year after surgery were significantly less frequent, and a higher proportion of patients achieved an unaided visual acuity of 6/9 or better (<0.2 logMAR) in the Phako group. Postoperative astigmatism was also more stable in the Phako group. The average cost of a cataract operation and postoperative care within the trial was similar for the two procedures. With the input of additional spectacles for corrected vision at 6 months after surgery, the average cost per procedure was £359.89 for Phako and £367.57 for ECCE.
CONCLUSION—Phako is clinically superior to ECCE and is cost effective.
Objectives—To determine the recent incidence of eye injury due to sport in Scotland, identify any trend, and establish which sports are responsible for most injuries. The type of injury and final visual outcome were also evaluated.
Methods—A prospective observational study of ocular injuries sustained during sport was performed over a one year period. Only patients requiring hospital admission were included. Data were collected on a standardised proforma and entered into a central database. Patients were followed up for at least three months.
Results—Of 416 patients admitted because of ocular injury, 52 (12.5%) resulted from playing a sport. Although all racquet sports together accounted for 47.5% of these injuries, football was the single most common sport associated with ocular trauma, being responsible for 32.5% of cases. The most common clinical finding was macroscopic hyphaema occurring in 87.5% of patients. Overall the final visual acuity was 6/6 in 92.5% of patients.
Conclusions—The proportion of eye injuries due to sport (12.5%) is lower than previously reported, suggesting a change in the pattern of ocular trauma. Football is the single most common cause of ocular injury from sport in Scotland, but the wearing of protective headgear would be difficult to instigate. The frequency of hyphaema in sport related ocular trauma (87.5%) is almost double that in all ocular injury (47.8%), so the potential for serious visual loss as the result of a sports injury should not be underrated. Ophthalmologists have a role in protecting this young population at risk by actively encouraging the design and use of protective eyewear.
Key Words: eye injury; ocular trauma; hyphaema; protective eyewear
BACKGROUND—The pool of old cases of cataract, the expected new cases, and the shortfall in cataract surgery and consequently the numbers dying with poor vision without the benefit of cataract surgery are regarded as escalating problems worldwide. Successive governments and the professional ophthalmic bodies have not had the wherewithal to estimate the magnitude or interaction of these elements in the population of the UK. This study has collected and applied the best available epidemiological data on cataract prevalence, incidence and service utilisation, and demography to address the problem of control of the cataract pool in the population of England and Wales.
METHODS—Data from recent surveys undertaken by the authors, both on prevalence of vision impairing cataract and on patterns of cataract surgery, were used together with demographic and service utilisation information obtained from government departments. These were integrated within a holistic model, which was run under varied assumed levels and patterns of service provision.
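The holistic model itself is not specified in the abstract. As an illustrative sketch only, the dynamics of a pool of unoperated vision-impairing cataract can be expressed as a simple annual balance of incident cases, operations, and deaths; all numbers below are hypothetical placeholders, not the authors' data:

```python
# Illustrative sketch of a cataract-pool projection (hypothetical numbers,
# not the authors' model or data): each year the pool of unoperated
# vision-impairing cataract gains new incident cases and loses both
# operated cases and deaths among those still unoperated.
def project_pool(pool, incidence, operations, death_rate, years):
    history = [pool]
    for _ in range(years):
        deaths = death_rate * pool  # deaths among unoperated cases
        pool = pool + incidence - operations - deaths
        history.append(pool)
    return history

# Hypothetical inputs: 2.0 million starting pool, 250 000 new cases/yr,
# 170 000 operations/yr, 5% annual mortality in this elderly group.
trajectory = project_pool(2_000_000, 250_000, 170_000, 0.05, 5)
print([round(p) for p in trajectory])
```

Whether such a pool grows or shrinks depends entirely on the balance of incidence against operations and mortality, which is why the abstract's conclusions turn on the assumed level and pattern of service provision.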
RESULTS—The study shows that there is a serious pool of unoperated vision impairing cataract in the population aged 65 and older, reflecting a shortfall in cataract surgery. Continuing with the present level and pattern of service provision, the pool will increase to over 2.5 million by the year 2001. In addition, more than 700 000 will die with unoperated impaired vision.
CONCLUSIONS—Targeting of existing or new additional operations to those below the visual acuity of 6/12 will have relatively little effect on numbers dying without surgery, but should have a substantial controlling effect on the pool of vision impairing cataract in the population.
AIMS—A national survey of over 100 hospitals in the UK was carried out to collect routine clinical information on the outcomes of cataract surgery. The clinical outcomes of interest were: visual acuity at time of discharge from postoperative hospital follow up, visual acuity at time of final refraction; complications related to surgery occurring during the operation, within 48 hours of surgery, and within 3 months of surgery. In addition, information on age and comorbidity was obtained. This article reports on the findings of the experience of approximately 18 000 patients who had cataract surgery in the hospital eye service of the NHS.
RESULTS—Of those with no ocular comorbidity, 85% achieved a visual acuity of 6/12 or better on discharge from postoperative hospital follow up, while 65% of patients with a serious co-existing eye disease achieved this level of acuity at this time. At final refraction, 92% of patients without ocular comorbidity and 77% of patients with ocular comorbidity achieved 6/12 or better visual acuity. The following main risk indicators were associated with visual outcomes and complications related to surgery: age, other eye diseases, diabetes and stroke, type of surgical procedure, and grade of surgeon.
CONCLUSIONS—The acceptability of these findings could fruitfully be the subject of discussion within the ophthalmic community and hopefully issues arising out of the study can lead to research, especially in-depth studies of the outcomes of cataract surgery in those patients with co-existing serious eye conditions.
AIMS—To investigate the current causes and outcomes of paediatric ocular trauma.
METHODS—A prospective observational study of all children admitted to hospital with ocular trauma in Scotland over a 1 year period.
RESULTS—The commonest mechanism of injury was blunt trauma, accounting for 65% of the total. 60% of the patients were admitted with a hyphaema. Injuries necessitating admission occurred most frequently at home (51%). Sporting activities were the commonest cause of injury in the 5-14 age group. There were no injuries caused by road traffic accidents or fireworks. Patients were admitted to hospital for a mean of 4.2 days (range 1-25 days). One (1%) child had an acuity in the "visually impaired" range (6/18-6/60) and one (1%) was "blind" (<6/60) in the affected eye. No child was bilaterally blinded by injury, and none required blind or partial sight registration.
CONCLUSION—This study has shown that the incidence of eye injuries affecting children has fallen. The outcome of ocular trauma has improved significantly, and for the first time paediatric injuries appear to have a better prognosis than injuries affecting adults.
AIMS/METHODS—A national data collection exercise was carried out in more than 100 hospital eye service units within the UK to provide clinical and administrative information on patients undergoing cataract surgery. This included patient clinical data such as visual acuity at the time of wait listing and at the time of admission for surgery, presence of other eye disorders, other serious medical disorders, and data on waiting time and type of admission.
RESULTS—The profiles of the 18 454 patients aged 50 years or older are reported. Findings of particular note were as follows. At the time of wait listing for cataract surgery, 31% had visual acuity of 6/12 or better, 54% had visual acuity between 6/18 and 6/60, and 15% had less than 6/60 vision. Of those who had visual acuity of 6/12 or better at the time of wait listing, by the time of admission for surgery vision had deteriorated to 6/18-6/60 in 33%, and in a further 3% to below 6/60. In patients with moderately poor visual acuity (worse than 6/12 but 6/60 or better) at the time of wait listing, 13% had less than 6/60 vision by the time of admission for surgery.
CONCLUSION—This type of data collection and reporting exercise provides new material that can be used in the planning and provision of cataract surgery services in the UK.
AIMS: To provide epidemiological data on the current burden of serious eye injuries utilising the hospital eye service, to inform the planning and provision of eye health care, and health and safety strategies for the prevention of ocular injuries.
METHODS: A prospective observational study was carried out of all patients with ocular trauma admitted to hospitals in Scotland, under the care of a consultant ophthalmologist, during a 1 year period. The population of Scotland represented the population at risk of injury. Visual outcome (Snellen visual acuity in the injured eye) was measured at the time of final discharge from ophthalmic care and at follow up.
RESULTS: All ophthalmic departments in Scotland participated and a total of 415 residents of Scotland were admitted. The 1 year cumulative incidence of ocular trauma necessitating admission to hospital is estimated to be 8.14 per 100 000 population (95% CI 7.38 to 8.97). Some 13.2% (n = 26/197) of patients discharged from follow up had a poor visual outcome, with a visual acuity less than 6/12 in the injured eye. Some 10.7% (n = 21/197) of patients at this time had a blinding outcome in the injured eye (visual acuity less than 6/60). No patient was registered blind or partially sighted during the study period. The home was the single most frequent place for blinding injuries to occur (52%, n = 11/21), followed by the workplace (24%, n = 5/21). The 1 year cumulative incidence of blinding outcome from serious ocular trauma is estimated to be 0.41 per 100 000 population per year (95% CI 0.26 to 0.64).
CONCLUSION: The current burden of serious ocular trauma presenting to the hospital eye service has been quantified from this population based study, and for the first time, a direct estimate of the incidence of the subsequent blinding outcome from these injuries has been provided.
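The headline incidence figure can be reproduced approximately from the reported case count. This is a sketch only: the population of Scotland used below is an assumption, and a normal approximation to the Poisson interval is used, so the limits differ slightly from the paper's (7.38 to 8.97):

```python
import math

# Hedged sketch: 1-year cumulative incidence per 100 000 population with a
# normal-approximation 95% CI for the Poisson case count.
cases = 415                 # admissions reported in the abstract
population = 5_100_000      # approximate population of Scotland (assumption)

incidence = cases / population * 100_000
se = math.sqrt(cases)       # Poisson standard error of the count
lo = (cases - 1.96 * se) / population * 100_000
hi = (cases + 1.96 * se) / population * 100_000

print(f"{incidence:.2f} per 100 000 (95% CI {lo:.2f} to {hi:.2f})")
```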
Ocular trauma remains an important cause of avoidable and, predominantly, monocular visual morbidity (visual impairment and blindness), with over half of the blinding injuries now occurring in the home. Health education and safety strategies should now consider targeting the home for the prevention of the serious eye injuries in addition to the traditional work, sports, and leisure environments and their related activities.