1.  Regional comparison of dietary intakes and health related behaviors among residents in Asan 
Nutrition Research and Practice  2007;1(2):143-149.
Inadequate dietary intakes and poor health behaviors are of concern among rural residents in Korea. This study was conducted to compare dietary intakes, dietary diversity score (DDS), mean nutrient adequacy ratio (MAR) and health related behaviors across rural, factory and urban areas in Asan. A total of 930 adults (351 men and 579 women) were interviewed to assess socioeconomic status (SES), health related behaviors and food intakes by a 24-hour recall method. Mean age was 61.5 years, with men being older (64.8 years) than women (59.3 years, p<0.001). Men in the factory area were older than rural or urban men, while urban women were the youngest. Education and income of urban residents were higher than those of residents in other areas. There were more current drinkers in the urban area, while smoking status did not differ by region. Physical activity was significantly higher in the rural and factory areas, whereas urban residents exercised more often (p<0.05). Rural and factory area residents considered themselves less healthy than others, while their perceived stress was lower than that of urban residents. Energy intakes were higher in urban residents and in men; however, after SES was controlled, energy intake showed no differences. Energy-adjusted nutrient intakes were significantly higher in the urban area (p<0.05) for most nutrients except carbohydrate, niacin, folic acid, vitamin B6, iron and fiber. Sodium intake was higher in the factory area than in other areas after SES was controlled. The DDS of rural men and the MAR of both men and women in the rural area were significantly lower when SES was controlled. In conclusion, dietary intake, diversity, adequacy and perceived health were poor in the rural area, although other health behaviors such as drinking and perceived stress were better than in the urban area. To improve the perceived health of rural residents, good nutrition and exercise education programs are recommended.
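The mean nutrient adequacy ratio (MAR) used above can be sketched as follows. This is an illustrative computation, not the study's code; the nutrient names, recommended-intake values, and the one-day intake are invented for the example. The defining idea is standard: each nutrient adequacy ratio (NAR) is intake divided by the recommended intake, capped at 1 so that a surplus of one nutrient cannot offset a shortfall in another, and MAR is the mean of the NARs.

```python
# Recommended daily intakes (illustrative values, not an official table)
RDA = {"protein_g": 55, "calcium_mg": 700, "iron_mg": 12, "vit_c_mg": 100}

def nutrient_adequacy_ratio(intake, rda):
    """NAR for one nutrient: intake / recommendation, capped at 1.0."""
    return min(intake / rda, 1.0)

def mean_adequacy_ratio(intakes, rdas=RDA):
    """MAR: the mean of the capped per-nutrient adequacy ratios."""
    nars = [nutrient_adequacy_ratio(intakes[n], rdas[n]) for n in rdas]
    return sum(nars) / len(nars)

# One day's recalled intake for a hypothetical respondent
day = {"protein_g": 60, "calcium_mg": 350, "iron_mg": 9, "vit_c_mg": 120}
mar = mean_adequacy_ratio(day)  # (1.0 + 0.5 + 0.75 + 1.0) / 4 = 0.8125
```

A MAR near 1 indicates a diet close to adequacy across all tracked nutrients; the capping explains why MAR differences in the study persist after energy adjustment.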
PMCID: PMC2882589  PMID: 20535400
Dietary intake; health behavior; regional comparison; rural area
2.  High dietary diversity is associated with obesity in Sri Lankan adults: an evaluation of three dietary scores 
BMC Public Health  2013;13:314.
Dietary diversity is recognized as a key element of a high quality diet. However, diets that offer a greater variety of energy-dense foods could increase food intake and body weight. The aim of this study was to explore the association of dietary diversity with obesity in Sri Lankan adults.
Six hundred adults aged over 18 years were randomly selected using a multi-stage stratified sampling technique. Dietary intake was assessed by a 24-hour dietary recall. Three dietary scores, the Dietary Diversity Score (DDS), Dietary Diversity Score with Portions (DDSP) and Food Variety Score (FVS), were calculated. A body mass index (BMI) ≥ 25 kg.m-2 was defined as obese, and Asian waist circumference cut-offs were used to diagnose abdominal obesity.
Mean DDS for men and women was 6.23 and 6.50 (p=0.06), while DDSP was 3.26 and 3.17, respectively (p=0.24). FVS values were significantly different between men and women (9.55 vs. 10.24, p=0.002). Dietary diversity among Sri Lankan adults was significantly associated with gender, residency, ethnicity and education level, but not with diabetes status. As dietary scores increased, percentage consumption increased in most food groups except starches. Obese and abdominally obese adults had the highest DDS compared to non-obese groups (p<0.05). With increased dietary diversity, BMI, waist circumference and energy consumption increased significantly in this population.
Our data suggest that dietary diversity is positively associated with several socio-demographic characteristics and with obesity among Sri Lankan adults. Although high dietary diversity is widely recommended, public health messages should emphasize improving dietary diversity through selected food items.
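A minimal sketch of the scoring and classification described above, under stated assumptions: the food-group list is invented (the paper's own grouping is not given here), and the waist cut-offs of 90 cm for men and 80 cm for women are the commonly cited Asian values, taken as a stand-in for the study's cut-offs.

```python
# Assumed food groups for the diversity score (illustrative, not the study's)
FOOD_GROUPS = ["starches", "vegetables", "fruits", "meat_fish",
               "dairy", "pulses", "fats_oils", "sweets"]

def dietary_diversity_score(consumed):
    """DDS: number of distinct food groups with any reported intake."""
    return sum(1 for g in FOOD_GROUPS if g in consumed)

def is_obese(bmi):
    # BMI >= 25 kg/m^2 defined as obese, per the cut-off stated above
    return bmi >= 25.0

def is_abdominally_obese(waist_cm, sex):
    # Assumed Asian cut-offs: 90 cm for men, 80 cm for women
    return waist_cm >= (90.0 if sex == "M" else 80.0)

# Food groups reported in one hypothetical 24-hour recall
recall = {"starches", "vegetables", "meat_fish", "dairy", "fats_oils"}
dds = dietary_diversity_score(recall)  # 5 of the 8 groups
```

The DDSP variant mentioned in the abstract additionally requires a minimum portion size before a group counts; the logic is the same with a threshold on quantity rather than mere presence.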
PMCID: PMC3626879  PMID: 23566236
Diet diversity; Dietary variety; DDS; Sri Lanka; Obesity; Adults
3.  High carbohydrate diet and physical inactivity associated with central obesity among premenopausal housewives in Sri Lanka 
BMC Research Notes  2014;7(1):564.
The prevalence of obesity and overweight is rising rapidly in developing countries, including Sri Lanka, due to dietary and lifestyle changes. This study aimed to assess the association between a high carbohydrate diet, physical inactivity and central obesity among premenopausal housewives in Sri Lanka.
This cross-sectional study randomly selected a sample of 100 premenopausal women with home duties, aged 20 to 45 years, from two divisional secretariats (DS) representing the urban and rural sectors of Sri Lanka. Data on basic characteristics, anthropometric measurements, dietary assessment and physical activity were collected. We defined central obesity as a waist circumference ≥80 cm, the WHO-recommended cut-off. The independent-sample t test was used to compare mean values. Linear and binary logistic regression analyses were performed to assess the relationship and the magnitude of association between central obesity and the percentage of energy contributed by carbohydrate and physical activity level, respectively.
The women reported an average of 18 starch portions per day, well above the national recommendations. Seventy percent of dietary energy came from carbohydrate. The mean BMI and waist circumference of the total sample were 25.4 kgm-2 and 78.5 cm, respectively. The prevalence of overweight, obesity and central obesity in the total sample was 38%, 34% and 45%, respectively. A significant positive correlation was observed between a high carbohydrate diet and waist circumference (r = 0.628, p < 0.0001), and a significant negative correlation between energy expenditure from physical activity and waist circumference (r = -0.742, p < 0.0001). A high carbohydrate diet and physical inactivity were significantly associated with central obesity (OR = 6.26, p = 0.001, 95% CI 2.11–18.57; OR = 3.32, p = 0.028, 95% CI 1.14–9.68).
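The odds ratios above come from logistic regression; a minimal stand-in that conveys the same quantity is the odds ratio from a 2×2 exposure-outcome table with a Woolf (log-OR) 95% confidence interval. The counts below are invented for illustration and are not the study's data.

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio with Woolf 95% CI.

    a, b: exposed with / without the outcome;
    c, d: unexposed with / without the outcome.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# e.g. central obesity by high-carbohydrate diet (hypothetical counts)
or_, (lo, hi) = odds_ratio(a=30, b=20, c=15, d=35)  # OR = 3.5
```

A CI that excludes 1 (as both intervals in the abstract do) indicates a statistically significant association at the 5% level.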
High carbohydrate diet and physical inactivity are possible explanations for the high prevalence of central obesity. There is an urgent need to implement effective specific public health interventions at household level to reverse this trend among the housewives in Sri Lanka.
PMCID: PMC4148929  PMID: 25150690
Central obesity; Premenopausal; Housewives; High carbohydrate diet
4.  Acute Human Lethal Toxicity of Agricultural Pesticides: A Prospective Cohort Study 
PLoS Medicine  2010;7(10):e1000357.
In a prospective cohort study of patients presenting with pesticide self-poisoning, Andrew Dawson and colleagues investigate the relative human toxicity of agricultural pesticides and contrast it with WHO toxicity classifications, which are based on toxicity in rats.
Agricultural pesticide poisoning is a major public health problem in the developing world, killing at least 250,000–370,000 people each year. Targeted pesticide restrictions in Sri Lanka over the last 20 years have reduced pesticide deaths by 50% without decreasing agricultural output. However, regulatory decisions have thus far not been based on the human toxicity of formulated agricultural pesticides but on the surrogate of rat toxicity using pure unformulated pesticides. We aimed to determine the relative human toxicity of formulated agricultural pesticides to improve the effectiveness of regulatory policy.
Methods and Findings
We examined the case fatality of different agricultural pesticides in a prospective cohort of patients presenting with pesticide self-poisoning to two clinical trial centers from April 2002 to November 2008. Identification of the pesticide ingested was based on history or positive identification of the container. A single pesticide was ingested by 9,302 patients. A specific pesticide was identified in 7,461 patients; 1,841 ingested an unknown pesticide. In a subset of 808 patients, the history of ingestion was confirmed by laboratory analysis in 95% of patients. There was a large variation in case fatality between pesticides—from 0% to 42%. This marked variation in lethality was observed for compounds within the same chemical and/or WHO toxicity classification of pesticides and for those used for similar agricultural indications.
The human data provided toxicity rankings for some pesticides that contrasted strongly with the WHO toxicity classification based on rat toxicity. Basing regulation on human toxicity will make pesticide poisoning less hazardous, preventing hundreds of thousands of deaths globally without compromising agricultural needs. Ongoing monitoring of patterns of use and clinical toxicity for new pesticides is needed to identify highly toxic pesticides in a timely manner.
Please see later in the article for the Editors' Summary
Editors' Summary
Suicide is a preventable global public health problem. About 1 million people die each year from suicide and many more harm themselves but survive. Although many people who commit suicide have a mental illness, stressful events (economic hardship or relationship difficulties, for example) can sometimes make life seem too painful to bear. Suicide attempts are frequently impulsive and use methods that are conveniently accessible. Strategies to reduce suicide rates include better treatment of mental illness and programs that help people at high risk of suicide deal with stress. Suicide rates can also be reduced by limiting access to common suicide methods. The single most important means of suicide worldwide is agricultural pesticide poisoning. Every year, between 250,000 and 370,000 people die from deliberate ingestion of pesticides (chemicals that kill animal pests or unwanted plants). Most of these suicides occur in rural areas of the developing world where high levels of pesticide use in agriculture combined with pesticide storage at home facilitate this particular method of suicide.
Why Was This Study Done?
To help reduce suicides through the ingestion of agricultural pesticides, the Food and Agriculture Organization of the United Nations recommends the withdrawal of the most toxic pesticides—World Health Organization (WHO) class I pesticides—from agricultural use. This strategy has proven successful in Sri Lanka where a ban on class I pesticides in 1995 and on the class II pesticide endosulfan in 1998 has reduced pesticide deaths by 50% over the past 20 years without decreasing agricultural output. Further reductions in suicides from pesticide ingestion could be achieved if regulatory restrictions on the sale and distribution of the most toxic class II pesticides were imposed. But such restrictions must balance agricultural needs against the impact of pesticides on public health. Unfortunately, the current WHO pesticide classification is based on toxicity in rats. Because rats handle pesticides differently from people, there is no guarantee that a pesticide with low toxicity in rodents is safe in people. Here, the researchers try to determine the relative human toxicity of agricultural pesticides in a prospective cohort study (a study in which people who share a characteristic—in this case, deliberate pesticide ingestion—are enrolled and followed to see how they fare).
What Did the Researchers Do and Find?
The researchers examined the case fatality (the proportion of patients dying after hospital admission) of different agricultural pesticides among patients who presented with pesticide self-poisoning at two Sri Lankan referral hospitals. Between April 2002 and November 2008, 9,302 people were admitted to the hospitals after swallowing a single pesticide. The researchers identified the pesticide ingested in 7,461 cases by asking the patient what he/she had taken or by identifying the container brought in by the patient or relatives. 10% of the patients died, but there was a large variation in case fatality between pesticides. The herbicide paraquat was the most lethal pesticide, killing 42% of patients; several other pesticides killed no one. Compounds in the same chemical class and/or the same WHO toxicity class sometimes had very different toxicities. For example, dimethoate and malathion, both class II organophosphate insecticides, had case fatalities of 20.6% and 1.9%, respectively. Similarly, pesticides used for similar agricultural purposes sometimes had very different case fatalities.
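Case fatality as used above is simply deaths divided by admissions for each pesticide, and comparing pesticides sensibly requires an uncertainty interval. A minimal sketch with a Wilson score interval follows; the counts are illustrative (chosen to reproduce a 42% proportion like paraquat's), not the study's actual numerators and denominators.

```python
import math

def case_fatality(deaths, admissions, z=1.96):
    """Case fatality proportion with a Wilson score 95% interval."""
    n = admissions
    p = deaths / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, (centre - half, centre + half)

# Hypothetical counts giving a 42% case fatality
p, (lo, hi) = case_fatality(deaths=126, admissions=300)
```

The Wilson interval behaves well even when deaths are zero, which matters here because several pesticides in the cohort killed no one; a naive Wald interval would collapse to width zero in that case.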
What Do These Findings Mean?
These findings provide a toxicity ranking for pesticides that deviates markedly from the WHO toxicity classification based on rat toxicity. Although the findings are based on a study undertaken at just two Sri Lankan hospitals, they are likely to be generalizable to other hospitals and to other parts of rural Asia. However, because the study only included patients who were admitted to hospital after ingesting pesticides, the actual case fatalities for some pesticides may be somewhat different. Nevertheless, these findings have several important public health implications. For example, they suggest that the decision taken in January 2008 to withdraw paraquat, dimethoate, and fenthion from the Sri Lankan market should reduce deaths from pesticide poisoning in Sri Lanka by a further 33%–65% (equivalent to about 1,000 fewer suicides per year). More generally, they suggest that basing the regulation of pesticides on human toxicity has the potential to prevent hundreds of thousands of intentional and accidental deaths globally without compromising agricultural needs.
Additional Information
Please access these Web sites via the online version of this summary at
This study is further discussed in a PLoS Medicine Perspective by Matt Miller and Kavi Bhalla
The World Health Organization provides information on the global burden of suicide and on suicide prevention (in several languages) and on its classification of pesticides
The US Environmental Protection Agency provides information about all aspects of pesticides (in English and Spanish)
Toxtown, an interactive site from the US National Library of Medicine, provides information on environmental health concerns including exposure to pesticides (in English and Spanish)
The nonprofit organization Pesticide Action Network UK provides information about all aspects of pesticides
The US National Pesticide Information Center provides objective, science-based information about pesticides (in several languages)
The Food and Agriculture Organization of the United Nations leads international efforts to reduce hunger; as part of this effort, it has introduced pesticide policy reforms (in several languages)
MedlinePlus provides links to further resources about suicide and about pesticides (in English and Spanish)
PMCID: PMC2964340  PMID: 21048990
5.  Validation of the Geriatric Depression Scale for an elderly Sri Lankan clinic population 
Indian Journal of Psychiatry  2010;52(3):254-256.
The Geriatric Depression Scale (GDS) has not been validated for the elderly population of Sri Lanka.
To translate, validate, and examine the effectiveness of GDS and to suggest the optimal cut-off scores for elderly Sri Lankans attending a psychogeriatric clinic.
Materials and Methods:
The Sinhalese translation of the GDS (GDS-S) was administered to people aged 55 years and above attending a psychogeriatric outpatient clinic. The diagnostic performance of the instrument was compared against the ICD 10 diagnosis of a consultant psychiatrist, which was considered the ‘gold standard’. Receiver operating characteristic (ROC) analysis was carried out to assess the diagnostic performance of the GDS-S. Optimal cut-off scores for depression and the sensitivity and specificity of the instrument were determined.
A total of 60 subjects formed the final sample (male/female=16/44), of which 30 were depressed, while 30 were age- and sex-matched controls. The optimal cut-off score for the GDS-S was 8 for differentiating non-depressed from mildly depressed, while the cut-off score for moderate depression was 10. Sensitivity and specificity of the GDS-S were 73.3% for differentiating depressed from non-depressed.
The GDS is culturally acceptable, easy to use, sensitive, and a valid instrument for diagnosing depression and for differentiating mild from moderate depression in an elderly Sri Lankan clinic population.
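The way an optimal cut-off like the GDS-S score of 8 is typically chosen in a ROC analysis can be sketched as follows: compute sensitivity and specificity at each candidate cut-off and pick the one maximising Youden's J = sensitivity + specificity − 1. The scores below are invented toy data (constructed so the chosen cut-off happens to be 8), not the study's sample.

```python
def sens_spec(scores_depressed, scores_controls, cutoff):
    """Sensitivity and specificity treating score >= cutoff as 'depressed'."""
    sens = sum(s >= cutoff for s in scores_depressed) / len(scores_depressed)
    spec = sum(s < cutoff for s in scores_controls) / len(scores_controls)
    return sens, spec

def best_cutoff(dep, ctrl, candidates):
    """Cut-off maximising Youden's J = sensitivity + specificity - 1."""
    return max(candidates,
               key=lambda c: sum(sens_spec(dep, ctrl, c)) - 1)

dep = [8, 9, 10, 12, 8, 11, 13, 9]   # hypothetical depressed patients
ctrl = [3, 5, 6, 4, 7, 2, 7, 5]      # hypothetical matched controls
cut = best_cutoff(dep, ctrl, range(0, 15))
```

In practice, the cut-off may also be chosen to favour sensitivity over specificity (or vice versa) depending on whether the instrument is used for screening or confirmation.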
PMCID: PMC2990826  PMID: 21180411
GDS; depression; Sri Lanka
6.  Dietary Quality Indices and Biochemical Parameters Among Patients With Non Alcoholic Fatty Liver Disease (NAFLD) 
Hepatitis Monthly  2013;13(7):e10943.
Dietary intake might play an important role in non-alcoholic fatty liver disease (NAFLD). Although there are some reports on dietary intake and anthropometric measurements, few studies have focused on markers that assess the whole diet, such as dietary quality indices.
Therefore, our aim was to determine diet quality indices and biochemical parameters among patients with NAFLD and healthy individuals.
Patients and Methods
This case-control study was performed on 100 patients with NAFLD and 100 healthy subjects attending the Gastrointestinal Research Center, Baqiyatallah University, Tehran, Iran, in recent years. Usual dietary intake was assessed by three dietary records (one weekend day and two weekdays). The healthy eating index (HEI), dietary diversity score (DDS), dietary energy density (DED), and mean adequacy ratio of nutrients (MAR) were assessed according to standard methods.
Patients with NAFLD had higher body mass index, weight and waist circumference than the healthy group (P < 0.05). Serum levels of liver enzymes, triglyceride, LDL, BUN, and uric acid were higher in patients with NAFLD (P < 0.05). Although patients with NAFLD had higher energy, carbohydrate and fat intakes, their intakes of antioxidant vitamins, calcium and vitamin D were lower than those of healthy subjects (P < 0.05). HEI and MAR were higher, and DED lower, in the healthy group. Nutrient adequacy ratios for calcium, vitamin D, and antioxidant micronutrients were lower in patients with NAFLD (P < 0.05).
It seems that dietary quality indices may be associated with NAFLD. Based on this case-control study, intakes of calcium, vitamin D, and antioxidant micronutrients may be lower among patients with NAFLD. Further prospective studies should be conducted in this regard.
PMCID: PMC3776150  PMID: 24065998
Diet Therapy; Healthy People Programs; Biochemical Processes; Non-alcoholic Fatty Liver Disease
7.  Evaluation of nutrient intake and diet quality of gastric cancer patients in Korea 
Nutrition Research and Practice  2012;6(3):213-220.
This study was conducted to identify dietary factors that may affect the occurrence of gastric cancer in Koreans. Preoperative daily nutrition intake and diet quality of patients diagnosed with gastric cancer were evaluated, and the collected data were comparatively analyzed by gender. The results were then used to prepare basic materials to aid in the creation of a desirable postoperative nutrition management program. The subjects of this study were 812 patients (562 men and 250 women) who were diagnosed with gastric cancer and admitted for surgery at Soonchunhyang University Hospital between January 2003 and December 2010. Nutrition intake and diet quality were evaluated by the 24-hr recall method, the nutrient adequacy ratio, mean adequacy ratio (MAR), nutrient density (ND), index of nutritional quality (INQ), dietary variety score (DVS), and dietary diversity score (DDS). The rates of skipping meals, eating fast, alcohol consumption, and smoking were significantly higher in males than in females. Intakes of energy, protein, fat, carbohydrate, phosphorus, sodium, potassium, vitamin B1, vitamin B2, niacin, and cholesterol were significantly higher in males, whereas intakes of fiber, zinc, vitamin A, retinol, carotene, and folic acid were significantly higher in females. MAR was significantly higher in males (0.83) than in females (0.79). INQ values for zinc, vitamin A, vitamin B2, vitamin B6, and folic acid were higher in females than in males. The average DVS was 17.63 for females and 13.19 for males. The average DDS was 3.68 overall: 3.44 for males and 3.92 for females. In conclusion, males had more dietary habit problems and poorer nutritional balance than females. Our findings suggest that proper nutritional management and adequate dietary education for the primary prevention of gastric cancer should be emphasized in men.
PMCID: PMC3395786  PMID: 22808345
Gastric cancer; dietary habit; nutrient intake; diet quality
8.  Development of a food frequency questionnaire for Sri Lankan adults 
Nutrition Journal  2012;11:63.
Food Frequency Questionnaires (FFQs) are commonly used in epidemiologic studies to assess long-term nutritional exposure. Because of wide variations in dietary habits between countries, an FFQ must be developed to suit the specific population. Sri Lanka is undergoing a nutritional transition, and diet-related chronic diseases are emerging as an important health problem. Currently, no FFQ has been developed for Sri Lankan adults. In this study, we developed an FFQ to assess the regular dietary intake of Sri Lankan adults.
A nationally representative sample of 600 adults was selected by a multi-stage random cluster sampling technique, and dietary intake was assessed by random 24-h dietary recall. Nutrient analysis of the FFQ required the selection of foods, development of recipes, and application of these to cooked foods to develop a nutrient database. We constructed a comprehensive food list with units of measurement. A stepwise regression method was used to identify foods contributing a cumulative 90% of the variance in total energy and macronutrients. In addition, a series of photographs was included.
We obtained dietary data from 482 participants and 312 different food items were recorded. Nutritionists grouped similar food items which resulted in a total of 178 items. After performing step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected.
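The stepwise step above (reducing 178 foods to the 93 that explain 90% of energy variance) can be sketched as a greedy forward selection: repeatedly add the food item whose intake most increases the explained variance (R²) of total energy until the 90% threshold is reached. The data below are simulated, and this greedy procedure is a simplified stand-in for the study's stepwise multiple regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_foods = 200, 10
intakes = rng.gamma(2.0, 50.0, size=(n_people, n_foods))  # grams/day
kcal_per_g = rng.uniform(0.5, 4.0, size=n_foods)          # energy density
energy = intakes @ kcal_per_g                             # total energy/day

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    tss = ((y - y.mean()) ** 2).sum()
    return 1.0 - (resid ** 2).sum() / tss

selected, remaining = [], list(range(n_foods))
while remaining:
    # add the food whose inclusion yields the largest R^2
    best = max(remaining,
               key=lambda j: r_squared(intakes[:, selected + [j]], energy))
    selected.append(best)
    remaining.remove(best)
    if r_squared(intakes[:, selected], energy) >= 0.90:
        break
```

The retained `selected` list plays the role of the 93-item food list: the smallest greedy set whose intakes explain at least 90% of the variance in total energy.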
We developed a FFQ and the related nutrient composition database for Sri Lankan adults. Culturally specific dietary tools are central to capturing the role of diet in risk for chronic disease in Sri Lanka. The next step will involve the verification of FFQ reproducibility and validity.
PMCID: PMC3496639  PMID: 22937734
Food frequency questionnaire; Development; FFQ; Sri Lanka; Adults
9.  Validation of a measure to assess Post-Traumatic Stress Disorder: a Sinhalese version of Impact of Event Scale 
There is a paucity of measures for conducting epidemiological studies related to disasters in Sri Lanka. This study validates a Sinhalese translation of the Impact of Event Scale, 8-item version (IES-8), for use in Sri Lanka.
This cross-sectional validation study was conducted in the densely populated rural area of Tangalle in the Southern province of Sri Lanka. After translation of the English version into Sinhalese, the IES-8 was administered by trained raters to a community sample of 30 tsunami survivors aged 13 years and above. Diagnostic accuracy, reproducibility and validity of the translated IES were assessed in terms of sensitivity, specificity, predictive values, likelihood ratios, diagnostic odds ratio, inter-rater reliability, internal consistency, criterion validity and construct validity.
A cut-off score of 15 gave fair sensitivity (77%) for screening, along with the other components of diagnostic accuracy. Inter-rater reliability was high (0.89). Internal consistency for the whole scale was high (0.78), with high face and content validity. Criterion validity was high (0.83), and construct validity demonstrated the two-factor structure documented in the literature.
This study demonstrates that this Sinhalese version of the Impact of Event Scale has sound diagnostic accuracy and psychometric properties, making it an ideal measure for epidemiological studies related to natural and man-made disasters in Sri Lanka.
PMCID: PMC1803770  PMID: 17306023
10.  Twelve Years of Rabies Surveillance in Sri Lanka, 1999–2010 
Rabies is endemic in Sri Lanka, but little is known about the temporal and spatial trends of rabies in this country. Knowing these trends may provide insight into past control efforts and serve as the basis for future control measures. In this study, we analyzed distribution of rabies in humans and animals over a period of 12 years in Sri Lanka.
Accumulated data from 1999 through 2010 compiled by the Department of Rabies Diagnosis and Research, Medical Research Institute (MRI), Colombo, were used in this study.
The yearly mean percentage of rabies-positive samples was 62.4% (range 47.6–75.9%). Three-fourths of the rabies-positive samples were from the Colombo, Gampaha, and Kalutara districts in Western province, followed by Galle in Southern province. Most rabies samples were from dogs (85.2%), followed by cats (7.9%), humans (3.8%), wild animals (2.0%), and livestock (1.1%). Among wild animals, mongooses were the main victims, followed by civets. The number of suspect human rabies cases decreased gradually in Sri Lanka, although the number of human samples submitted for laboratory confirmation increased.
The number of rabid dogs has remained relatively unchanged, but the number of suspect human rabies cases is decreasing gradually in Sri Lanka. These findings indicate successful use of postexposure prophylaxis (PEP) by animal bite victims and increased rabies awareness. PEP is free of charge and is supplied through government hospitals by the Ministry of Health, Sri Lanka. Our survey shows that most positive samples were received from Western and Southern provinces, possibly because of the ease of transporting samples to the laboratory. Submissions of wild animal and livestock samples should be increased by creating more awareness among the public. Better rabies surveillance will require the introduction of molecular detection methods and the establishment of more regional rabies diagnostic laboratories.
Author Summary
Rabies is a public health concern in Sri Lanka. The incidence of dog rabies remains unchanged, but the incidence of suspect human rabies cases is decreasing gradually in Sri Lanka. This finding indicates the effects of improved access to postexposure prophylaxis by animal bite victims and increased rabies awareness. As in other rabies-endemic countries, human rabies in Sri Lanka is transmitted mainly by dogs, although domestic and wild animals have been diagnosed rabid and can pose a risk of exposure to humans. In this study, we analyzed 12 years of data accumulated in the national reference laboratory of Sri Lanka to identify the trends of rabies in this country. This study showed that rabies has been recorded mainly in the Western and Southern Provinces of Sri Lanka, possibly because of the ease of communication with rabies diagnostic laboratories from these areas. Regional rabies diagnosis laboratories should be established to improve surveillance of rabies in Sri Lanka. There were few submitted animal samples from livestock and wild animals, indicating that greater awareness is needed among the public regarding the need to submit suspect rabid animals for diagnostic evaluation. These data could help policy makers improve rabies prevention and control in Sri Lanka.
PMCID: PMC4191952  PMID: 25299511
11.  Homegardens as a Multi-functional Land-Use Strategy in Sri Lanka with Focus on Carbon Sequestration 
Ambio  2013;42(7):892-902.
This paper explores the concept of homegardens and their potential functions as strategic elements in land-use planning and in adaptation and mitigation of climate change in Sri Lanka. The ancient and locally adapted agroforestry system of homegardens is presently estimated to occupy nearly 15% of the land area of Sri Lanka and is described in the scientific literature as offering several ecosystem services to its users, such as climate regulation, protection against natural hazards, enhanced land productivity and biological diversity, and increased crop diversity and food security for the rural poor, and hence reduced vulnerability to climate change. Our results, based on a limited sample size, indicate that homegardens also store significant amounts of carbon: above-ground biomass carbon stocks in dry zone homegardens (n = 8) ranged from 10 to 55 megagrams of carbon per hectare (Mg C ha−1) with a mean of 35 Mg C ha−1, whereas carbon stocks in wet zone homegardens (n = 4) ranged from 48 to 145 Mg C ha−1 with a mean of 87 Mg C ha−1. This implies that homegardens may contain a significant fraction of the total above-ground biomass carbon stock in the terrestrial system in Sri Lanka; by our estimates, their share has increased from almost one-sixth in 1992 to nearly one-fifth in 2010. In light of current discussions on reducing emissions from deforestation and forest degradation (REDD+), the concept of homegardens in Sri Lanka provides interesting aspects for the debate and for future research in terms of forest definitions, setting reference levels, and general sustainability.
PMCID: PMC3790136  PMID: 23456780
Land rehabilitation; Carbon sequestration and offsets; Land-use expansion and intensification; REDD+ implications
12.  Improvement in Survival after Paraquat Ingestion Following Introduction of a New Formulation in Sri Lanka 
PLoS Medicine  2008;5(2):e49.
Pesticide ingestion is a common method of self-harm in the rural developing world. In an attempt to reduce the high case fatality seen with the herbicide paraquat, a novel formulation (INTEON) has been developed containing an increased emetic concentration, a purgative, and an alginate that forms a gel under the acid conditions of the stomach, potentially slowing the absorption of paraquat and giving the emetic more time to be effective. We compared the outcome of paraquat self-poisoning with the standard formulation against the new INTEON formulation following its introduction into Sri Lanka.
Methods and Findings
Clinical data were prospectively collected on 586 patients with paraquat ingestion presenting to nine large hospitals across Sri Lanka, with survival to 3 mo as the primary outcome. The identity of the formulation ingested after October 2004 was confirmed by assay of blood or urine samples for a marker compound present in INTEON. The proportion of known survivors increased from 76/297 with the standard formulation to 103/289 with INTEON ingestion, and estimated 3-mo survival improved from 27.1% to 36.7% (difference 9.5%; 95% confidence interval [CI] 2.0%–17.1%; p = 0.002, log rank test). Cox proportional hazards regression analyses showed an approximately 2-fold reduction in toxicity for INTEON compared to the standard formulation. A higher proportion of patients ingesting INTEON vomited within 15 min (38% with the original formulation vs. 55% with INTEON, p < 0.001). Median survival time increased from 2.3 d (95% CI 1.2–3.4 d) with the standard formulation to 6.9 d (95% CI 3.3–10.7 d) with INTEON ingestion (p = 0.002, log rank test); however, in patients who did not survive there was a comparatively smaller increase in median time to death, from 0.9 d (interquartile range [IQR] 0.5–3.4) to 1.5 d (IQR 0.5–5.5); p = 0.02.
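The headline comparison can be illustrated with a two-proportion difference and a Wald 95% confidence interval, using the raw survivor counts reported above (76/297 standard vs. 103/289 INTEON). Note this simple sketch uses raw proportions, so it differs slightly from the paper's Kaplan-Meier estimates (27.1% and 36.7%), which account for censoring; it is a stand-in for, not a reproduction of, the log-rank analysis.

```python
import math

def survival_diff(s1, n1, s2, n2, z=1.96):
    """Difference in survival proportions (group 2 - group 1), Wald 95% CI."""
    p1, p2 = s1 / n1, s2 / n2
    d = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Raw counts from the study: standard formulation vs. INTEON
d, (lo, hi) = survival_diff(76, 297, 103, 289)
```

As with the paper's interval (2.0%–17.1%), a lower bound above zero indicates a survival advantage for the new formulation that is unlikely to be due to chance alone.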
The survey has shown that INTEON technology significantly reduces the mortality of patients following paraquat ingestion and increases survival time, most likely by reducing absorption.
Martin Wilks and colleagues compared the outcome of paraquat self-poisoning with the standard formulation against a new formulation following its introduction into Sri Lanka.
Editors' Summary
Paraquat is a non-selective herbicide used in many countries on a variety of crops including potatoes, rice, maize, tea, cotton, and bananas. It is fast-acting, rainfast, and facilitates “no-till” farming, but it has attracted controversy because of the potential for misuse, particularly in developing countries. Better training of workers has been shown to reduce the number of accidents, and additions to the liquid formulation have contributed to a reduction in cases where paraquat was drunk by mistake: a blue color and a stench agent make it less attractive to drink, and an emetic induces vomiting to reduce the time the poison is retained in the body.
Why Was This Study Done?
Despite the changes made to the formulation, paraquat is still taken deliberately as a poison by agricultural workers in parts of the developing world. Although other pesticides cause more deaths overall, paraquat poisoning is more frequently fatal than other common pesticides. Syngenta, a commercial producer of paraquat, has developed a new paraquat formulation designed to reduce its toxicity. Syngenta introduced the new formulation in Sri Lanka, a country well known for its high level of suicides with pesticides, in 2004. This new formulation includes three components designed to reduce paraquat absorption from the stomach and intestines: a gelling agent to thicken the formulation in the acidic environment of the stomach and slow its passage into the small intestine; an increase in the amount of emetic to induce more vomiting more quickly; and a purgative to speed its exit from the small intestine, the main site of its absorption. The researchers wished to know whether the new formulation could contribute to improved survival in instances where paraquat had been ingested.
What Did the Researchers Do and Find?
The researchers gathered information on the time and circumstances of paraquat ingestion, the amount taken, the timing and details of any vomiting, and the treatment and outcomes for cases of attempted suicide by paraquat poisoning at nine large hospitals in agricultural regions of Sri Lanka from December 2003 to January 2006. In total, 774 patients were tracked in this time. Syngenta introduced the new formulation in Sri Lanka on 1 October 2004. The researchers identified the formulation involved in subsequent cases by either interview or analysis of samples. After excluding some unusual or less certain cases, they analyzed data on 586 patients, of whom 297 had deliberately taken the standard formulation and 289 the new formulation.
Although the new formulation was still toxic, the data showed an increase in the proportion of cases surviving for at least three months—from 27% (standard formulation) to 37% (new formulation), an effect that was unlikely to be due to chance. More patients vomited within 15 minutes of taking the new formulation of paraquat. Patients who died generally survived longer if they had taken the new rather than the standard formulation. The researchers estimated that the new formulation is just over half as toxic as the standard formulation, meaning that a patient was likely to suffer the same level of ill effects after taking twice as much of the new formulation compared to the standard formulation.
What Do these Findings Mean?
This study was designed, funded, and led by Syngenta, the manufacturer of the standard and new formulations of paraquat, but the study team included a number of independent Sri Lankan and international scientists. Because the researchers observed the effects of the introduction of the new formulation across the entire country at the same time, they could not completely rule out other possible explanations for the differences in outcomes between the two formulations, such as differences in treatment.
Despite this inherent drawback, the researchers estimate that during the study the new formulation saved about 30 lives. They conclude that the new formulation does reduce the amount of paraquat absorbed by the body, although the study does not answer whether this was due to the gelling agent, the increased emetic, or a combination of factors. The researchers suggest that the new formulation, by keeping patients alive longer, may allow doctors more time to treat patients. As no effective treatment exists at present, this benefit relies on a treatment being developed in the future.
The researchers note that the most important factor in predicting the outcome when paraquat has been taken deliberately is the dose. As a result, they suggest that the new formulation can only be one part of a wider strategy to reduce deaths by deliberate self-poisoning using paraquat. They suggest that such an integrated approach might include generic measures to reduce incidents of self-harm, reduced access to paraquat, reduced formulation strength, and improvements in treatment.
Additional Information
Please access these Web sites via the online version of this summary at
The US Environmental Protection Agency has published its Reregistration Eligibility Decision for paraquat
The Department of Health and Human Services of the US Centers for Disease Control and Prevention provides a fact sheet on how to handle paraquat and suspected cases of exposure
The World Health Organisation has recently finished consulting on a draft Poisons Information Monograph for paraquat
The International Programme on Chemical Safety (IPCS) has published a review of paraquat in its Environmental Health Criteria Series
MedlinePlus provides links to information on health effects of paraquat
PMCID: PMC2253611  PMID: 18303942
13.  Index for Measuring the Quality of Complementary Feeding Practices in Rural India 
This community-based cross-sectional study was undertaken to develop a complementary feeding index (CFI) to assess the adequacy of complementary feeding (CF) practices and determine its association with growth of infants, aged 6–12 months, in a rural Indian population. The study was conducted in six villages of Ghaziabad district, Uttar Pradesh, India. A structured interview schedule was used for eliciting information from 151 mothers of infants, aged 6–12 months, on CF practices. Data on CF practices were scored using the CFI developed. Measurements of weight and length were taken. Bivariate and multivariate analyses were done using the SPSS software (version 13). The results revealed that the CF practices were suboptimal in the sample. The mean±standard deviation (SD) CFI scores ranged from a low value of 7.09±3.21 in 6–8 months old infants to a comparatively-higher value of 9.69±2.94 in 9–12 months old infants. Using the CFI, it was found that infants (n=151) had poor dietary diversity, with only 31% and 18% of the infants reportedly being fed the recommended number of food-groups during 6–8 and 9–12 months respectively. The food-frequency scores of the CFI showed that cereals and diluted animal milk were the major food-groups fed to the infants in this setting. Analysis of nutritional status revealed that 24.5% of the infants were stunted (length-for-age [LAZ] <-2SD), 25% were underweight (weight-for-age [WAZ] <-2SD), and 17% were wasted (weight-for-length [WLZ] <-2SD). Significant associations (p<0.05) were observed between the meal-frequency and the dietary diversity of the CFs of infants aged 6–8 months and 9–12 months and the WAZ and LAZ indices of their nutritional status. On multivariate analysis of factors affecting the LAZ, WAZ and WLZ scores, the CFI was significantly associated (p<0.05) with LAZ whereas maternal education and breastfeeding frequency were significantly (p<0.01) associated with WAZ and WLZ. 
Per-capita income, parity, and birth-order were the significant (p<0.05) determinants of the CFI. The CFI developed is an exploratory attempt to summarize and quantify the key CF practices into a composite index, which would reflect the CF practices holistically. This index can be used as an easy tool by programme planners for identifying, targeting, and monitoring the deficient CF practices and also advocating the importance of the CF at policy level.
PMCID: PMC2928111  PMID: 20099760
Community-based studies; Complementary feeding index; Complementary feeding practices; Cross-sectional studies; Infant nutritional status; India
14.  The effect of race and predictors of socioeconomic status on diet quality in the Healthy Aging in Neighborhoods of Diversity across the Life Span (HANDLS) study sample 
To examine effects of race and predictors of socioeconomic status (SES) on nutrient-based diet quality and their contribution to health disparities in an urban population of low SES.
Data were analyzed from a sample of Healthy Aging in Neighborhoods of Diversity across the Life Span (HANDLS) Study participants to examine the effects of age, sex, race, income, poverty income ratio (PIR), education, employment, and smoking status on nutrient-based diet quality, as measured by a micronutrient composite index of nutrient adequacy ratios (NAR) and a mean adequacy ratio (MAR). Regression models were used to examine associations, and t-tests were used to assess racial differences.
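The NAR/MAR approach used here follows a standard definition: each nutrient's adequacy ratio is its intake divided by the recommended intake, truncated at 1, and the MAR is the mean of the NARs. A minimal sketch of that calculation, with illustrative reference values rather than the HANDLS study's actual standards:

```python
# Illustrative NAR/MAR calculation; nutrient names and recommended
# intakes below are example values, not the study's reference data.

def nutrient_adequacy_ratio(intake, recommended):
    """NAR: intake as a fraction of the recommendation, capped at 1."""
    return min(intake / recommended, 1.0)

def mean_adequacy_ratio(intakes, recommendations):
    """MAR: mean of the NARs across nutrients, scaled to 0-100."""
    nars = [nutrient_adequacy_ratio(intakes[n], recommendations[n])
            for n in recommendations]
    return 100 * sum(nars) / len(nars)

recommended = {"vitamin_c_mg": 75, "calcium_mg": 1000, "iron_mg": 8}
intakes = {"vitamin_c_mg": 90, "calcium_mg": 600, "iron_mg": 10}
print(round(mean_adequacy_ratio(intakes, recommended), 1))  # 86.7
```

Capping each NAR at 1 means an excess of one nutrient cannot compensate for a shortfall in another, which is why MAR is read as an overall adequacy score.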
African American and white adults ages 30-64 residing in 12 predefined census tracts in Baltimore City, Maryland.
Sex, age, education, PIR, and income were statistically significant predictors of diet quality for African Americans, while sex, education, and smoking status were statistically significant for whites. African Americans had lower MAR scores than whites (76.4 vs. 79.1). Whites had significantly higher NAR scores for thiamin, riboflavin, folate, B12, vitamins A and E, magnesium, copper, zinc, and calcium, while African Americans had higher vitamin C scores.
Education significantly impacted diet quality in the HANDLS sample, but race cannot be discounted. Whether the racial differences in diet quality are indicative of cultural differences in food preferences, selection, preparation, and availability or disparities in socioeconomic status remains unclear.
PMCID: PMC2987681  PMID: 21053707
Race; Socioeconomic Status; Education; Diet Quality
15.  Development and validation of a food-based diet quality index for New Zealand adolescents 
BMC Public Health  2013;13:562.
As there is no population-specific, simple food-based diet index suitable for examination of diet quality in New Zealand (NZ) adolescents, there is a need to develop such a tool. Therefore, this study aimed to develop an adolescent-specific diet quality index based on dietary information sourced from a Food Questionnaire (FQ) and examine its validity relative to a four-day estimated food record (4DFR) obtained from a group of adolescents aged 14 to 18 years.
A diet quality index for NZ adolescents (NZDQI-A) was developed based on ‘Adequacy’ and ‘Variety’ of five food groups reflecting the New Zealand Food and Nutrition Guidelines for Healthy Adolescents. The NZDQI-A was scored from zero to 100, with a higher score reflecting a better diet quality. Forty-one adolescents (16 males, 25 females, aged 14–18 years) each completed the FQ and a 4DFR. The test-retest reliability of the FQ-derived NZDQI-A scores over a two-week period and the relative validity of the scores compared to the 4DFR were estimated using Pearson’s correlations. Construct validity was examined by comparing NZDQI-A scores against nutrient intakes obtained from the 4DFR.
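Both the test-retest reliability and the relative validity reported here are Pearson correlations between two sets of scores. A minimal sketch of that calculation, using hypothetical score pairs rather than study data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical NZDQI-A scores for five adolescents, two weeks apart.
week1 = [62, 48, 75, 55, 80]
week2 = [60, 50, 72, 58, 78]
print(round(pearson_r(week1, week2), 2))  # 0.99
```

A high r on repeat administration indicates the questionnaire ranks the same participants consistently, which is what the reported r = 0.65 reliability figure summarizes.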
The NZDQI-A derived from the FQ showed good reliability (r = 0.65) and reasonable agreement with 4DFR in ranking participants by scores (r = 0.39). More than half of the participants were classified into the same thirds of scores while 10% were misclassified into the opposite thirds by the two methods. Higher NZDQI-A scores were also associated with lower total fat and saturated fat intakes and higher iron intakes.
Higher NZDQI-A scores were associated with more desirable fat and iron intakes. The scores derived from either FQ or 4DFR were comparable and reproducible when repeated within two weeks. The NZDQI-A is relatively valid and reliable in ranking diet quality in adolescents at a group level even in a small sample size. Further studies are required to test the predictive validity of this food-based diet index in larger samples.
PMCID: PMC3706237  PMID: 23759064
Diet quality index; Dietary patterns; Validity; Adolescents; New Zealand
16.  Consuming cassava as a staple food places children 2-5 years old at risk for inadequate protein intake, an observational study in Kenya and Nigeria 
Nutrition Journal  2010;9:9.
Inadequate protein intake is known to be deleterious in animals. Using WHO consensus documents for human nutrient requirements, the protein:energy ratio (P:E) of an adequate diet is > 5%. Cassava has a very low protein content. This study tested the hypothesis that Nigerian and Kenyan children consuming cassava as their staple food are at greater risk for inadequate dietary protein intake than those children who consume less cassava.
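The P:E screen described above reduces to a simple calculation: protein supplies roughly 4 kcal per gram (the standard Atwater factor), so the ratio is protein energy divided by total energy. A sketch under that assumption, with a made-up intake for illustration:

```python
# Protein:energy ratio using the standard Atwater factor of
# 4 kcal per gram of protein; the 5% threshold is the WHO-based
# cutoff cited in the abstract.
PROTEIN_KCAL_PER_G = 4

def protein_energy_ratio(protein_g, total_kcal):
    """Percentage of dietary energy supplied by protein."""
    return 100 * (protein_g * PROTEIN_KCAL_PER_G) / total_kcal

def protein_intake_adequate(protein_g, total_kcal, threshold_pct=5.0):
    return protein_energy_ratio(protein_g, total_kcal) > threshold_pct

# A mostly-cassava day: 1,200 kcal with only 12 g protein.
print(protein_energy_ratio(12, 1200))     # 4.0
print(protein_intake_adequate(12, 1200))  # False
```

This makes the study's hypothesis concrete: because cassava is energy-dense but protein-poor, the more of the day's energy it supplies, the harder it is for P:E to clear 5%.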
A 24 hour dietary recall was used to determine the food and nutrient intake of 656 Nigerian and 449 Kenyan children aged 2-5 years residing in areas where cassava is a staple food. Anthropometric measurements were conducted. Diets were scored for diversity using a 12 point score. Pearson's Correlation Coefficients were calculated to relate the fraction of dietary energy obtained from cassava with protein intake, P:E, and dietary diversity.
The fraction of dietary energy obtained from cassava was > 25% in 35% of Nigerian children and 89% of Kenyan children. The mean dietary diversity score was 4.0 in Nigerian children and 4.5 in Kenyan children, although the mean number of different foods consumed on the survey day in Nigeria was greater than Kenya, 7.0 compared to 4.6. 13% of Nigerian and 53% of Kenyan children surveyed had inadequate protein intake. The fraction of dietary energy derived from cassava was negatively correlated with protein intake, P:E, and dietary diversity. Height-for age z score was directly associated with protein intake and negatively associated with cassava consumption using regression modeling that controlled for energy and zinc intake.
Inadequate protein intake was found in the diets of Nigerian and Kenyan children consuming cassava as a staple food. Inadequate dietary protein intake is associated with stunting in this population. Interventions to increase protein intake in this vulnerable population should be the focus of future work.
PMCID: PMC2837613  PMID: 20187960
17.  Dietary quality in a sample of adults with type 2 diabetes mellitus in Ireland; a cross-sectional case control study 
Nutrition Journal  2013;12:110.
A number of dietary quality indices (DQIs) have been developed to assess the quality of dietary intake. Analysis of the intake of individual nutrients does not reflect the complexity of dietary behaviours and their association with health and disease. The aim of this study was to determine the dietary quality of individuals with type 2 diabetes mellitus (T2DM) using a variety of validated DQIs.
In this cross-sectional analysis of 111 Caucasian adults, 65 cases with T2DM were recruited from the Diabetes Day Care Services of St. Columcille’s and St. Vincent’s Hospitals, Dublin, Ireland. Forty-six controls did not have T2DM and were recruited from the general population. Data from 3-day estimated diet diaries were used to calculate 4 DQIs.
Participants with T2DM had a significantly lower score for consumption of a Mediterranean dietary pattern compared to the control group, measured using the Mediterranean Diet Score (Range 0–9) and the Alternate Mediterranean Diet Score (Range 0–9) (mean ± SD) (3.4 ± 1.3 vs 4.8 ± 1.8, P < 0.001 and 3.3 ± 1.5 vs 4.2 ± 1.8, P = 0.02 respectively). Participants with T2DM also had lower dietary quality than the control population as assessed by the Healthy Diet Indicator (Range 0–9) (T2DM; 2.6 ± 2.3, control; 3.3 ± 1.1, P = 0.001). No differences between the two groups were found when dietary quality was assessed using the Alternate Healthy Eating Index. Micronutrient intake was assessed using the Micronutrient Adequacy Score (Range 0–8) and participants with T2DM had a significantly lower score than the control group (T2DM; 1.6 ± 1.4, control; 2.3 ± 1.4, P = 0.009). When individual nutrient intakes were assessed, no significant differences were observed in macronutrient intake.
Overall, these findings demonstrate that T2DM was associated with a lower score when dietary quality was assessed using a number of validated indices.
PMCID: PMC3750542  PMID: 23915093
Dietary quality; Nutrient intake; Type 2 diabetes mellitus
18.  Low-Dose Adrenaline, Promethazine, and Hydrocortisone in the Prevention of Acute Adverse Reactions to Antivenom following Snakebite: A Randomised, Double-Blind, Placebo-Controlled Trial 
PLoS Medicine  2011;8(5):e1000435.
In a factorial randomized trial conducted in Sri Lanka, de Silva and colleagues evaluate the safety and efficacy of pretreatments intended to reduce the risk of serious reactions to antivenom following snakebite.
Envenoming from snakebites is most effectively treated by antivenom. However, the antivenom available in South Asian countries commonly causes acute allergic reactions, anaphylactic reactions being particularly serious. We investigated whether adrenaline, promethazine, and hydrocortisone prevent such reactions in secondary referral hospitals in Sri Lanka by conducting a randomised, double-blind placebo-controlled trial.
Methods and Findings
In total, 1,007 patients were randomized, using a 2×2×2 factorial design, in a double-blind, placebo-controlled trial of adrenaline (0.25 ml of a 1∶1,000 solution subcutaneously), promethazine (25 mg intravenously), and hydrocortisone (200 mg intravenously), each alone and in all possible combinations. The interventions, or matching placebo, were given immediately before infusion of antivenom. Patients were monitored for mild, moderate, or severe adverse reactions for at least 96 h. The prespecified primary end point was the effect of the interventions on the incidence of severe reactions up to and including 48 h after antivenom administration. In total, 752 (75%) patients had acute reactions to antivenom: 9% mild, 48% moderate, and 43% severe; 89% of the reactions occurred within 1 h; and 40% of all patients were given rescue medication (adrenaline, promethazine, and hydrocortisone) during the first hour. Compared with placebo, adrenaline significantly reduced severe reactions to antivenom by 43% (95% CI 25–67) at 1 h and by 38% (95% CI 26–49) up to and including 48 h after antivenom administration; hydrocortisone and promethazine did not. Adding hydrocortisone negated the benefit of adrenaline.
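A 2×2×2 factorial design assigns each patient to one of eight arms, every combination of the three drugs versus their matching placebos. A hedged sketch of such an allocation using simple randomization (the trial's actual randomization procedure may have been more elaborate, e.g. blocked or stratified):

```python
import itertools
import random

# The eight arms of a 2x2x2 factorial design: each drug is either
# given or replaced by its matching placebo. Labels are illustrative.
ARMS = list(itertools.product(
    ("adrenaline", "placebo-A"),
    ("promethazine", "placebo-P"),
    ("hydrocortisone", "placebo-H"),
))

def randomize(patient_ids, seed=0):
    """Simple (unblocked) random allocation of patients to arms."""
    rng = random.Random(seed)
    return {pid: rng.choice(ARMS) for pid in patient_ids}

assignments = randomize(range(1007))
print(len(ARMS))  # 8 possible treatment combinations
```

The efficiency of the factorial layout is that each drug's effect is estimated by comparing all patients who received it (four arms) against all who received its placebo, so one trial answers three questions at once.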
Pretreatment with low-dose adrenaline was safe and reduced the risk of acute severe reactions to snake antivenom. This may be of particular importance in countries where adverse reactions to antivenom are common, although the need to improve the quality of available antivenom cannot be overemphasized.
Trial registration NCT00270777
Please see later in the article for the Editors' Summary
Editors' Summary
Of the 3,000 or so snake species in the world, about 600 are venomous. Venomous snakes, which are particularly common in equatorial and tropical regions, immobilize their prey by injecting modified saliva (venom) into their prey's tissues through their fangs—specialized hollow teeth. Snakes also use their venoms for self-defense and will bite people who threaten, startle, or provoke them. A bite from a highly venomous snake such as a pit viper or cobra can cause widespread bleeding, muscle paralysis, irreversible kidney damage, and tissue destruction (necrosis) around the bite site. All these effects of snakebite are potentially fatal; necrosis can also result in amputation and permanent disability. It is hard to get accurate estimates of the number of people affected by snakebite, but there may be about 2 million envenomings (injections of venom) and 100,000 deaths every year, many of them in rural areas of South Asia, Southeast Asia, and sub-Saharan Africa.
Why Was This Study Done?
The best treatment for snakebite is to give antivenom (a mixture of antibodies that neutralize the venom) as soon as possible. Unfortunately, in countries where snakebites are common (for example, Sri Lanka), antivenoms are often of dubious quality, and acute allergic reactions to them frequently occur. Although some of these reactions are mild (for example, rashes), in up to 40% of cases, anaphylaxis—a potentially fatal, whole-body allergic reaction—develops. The major symptoms of anaphylaxis—a sudden drop in blood pressure and breathing difficulties caused by swelling of the airways—can be treated with adrenaline. Injections of antihistamines (for example, promethazine) and hydrocortisone can also help. In an effort to prevent anaphylaxis, these drugs are also widely given before antivenom, but there is little evidence that such “prophylactic” treatment is effective or safe. In this randomized double-blind controlled trial (RCT), the researchers test whether low-dose adrenaline, promethazine, and/or hydrocortisone can prevent acute adverse reactions to antivenom. In an RCT, the effects of various interventions are compared to a placebo (dummy) in groups of randomly chosen patients; neither the patients nor the people caring for them know who is receiving which treatment until the trial is completed.
What Did the Researchers Do and Find?
The researchers randomized 1,007 patients who had been admitted to secondary referral hospitals in Sri Lanka after snakebite to receive low-dose adrenaline, promethazine, hydrocortisone, or placebo alone and in all possible combinations immediately before treatment with antivenom. The patients were monitored for at least 96 hours for adverse reactions to the antivenom; patients who reacted badly were given adrenaline, promethazine, and hydrocortisone as “rescue medication.” Three-quarters of the patients had acute reactions—mostly moderate or severe—to the antivenom. Most of the acute reactions occurred within an hour of receiving the antivenom, and nearly half of all the patients were given rescue medication during the first hour. Compared with placebo, pretreatment with adrenaline reduced severe reactions to the antivenom by 43% at one hour and by 38% over 48 hours. By contrast, neither hydrocortisone nor promethazine given alone reduced the rate of adverse reactions to the antivenom. Moreover, adding hydrocortisone negated the beneficial effect of adrenaline.
What Do These Findings Mean?
These findings show that pretreatment with low-dose adrenaline is safe and reduces the risk of acute severe reactions to snake antivenom, particularly during the first hour after infusion. They do not provide support for pretreatment with promethazine or hydrocortisone, however. Indeed, the findings suggest that the addition of hydrocortisone could negate the benefits of adrenaline, although this finding needs to be treated with caution because of the design of the trial, as does the observed increased risk of death associated with pretreatment with hydrocortisone. More generally, the high rate of acute adverse reactions to antivenom in this trial highlights the importance of improving the quality of antivenoms available in Sri Lanka and other parts of South Asia. The researchers note that the recent World Health Organization guidelines on production, control, and regulation of antivenom should help in this regard but stress that, for now, it is imperative that physicians carefully monitor patients who have been given antivenom and provide prompt treatment of acute reactions when they occur.
Additional Information
Please access these Web sites via the online version of this summary at
The MedlinePlus Encyclopedia has pages on snakebite and on anaphylaxis (in English and Spanish)
The UK National Health Service Choices website also has pages on snakebite and on anaphylaxis
The World Health Organization has information on snakebite and on snake antivenoms (in several languages); its Guidelines for the Production, Control and Regulation of Snake Antivenom Immunoglobulins are also available
The Global Snakebite Initiative has information on snakebite
A PLoS Medicine Research Article by Anuradhani Kasturiratne and colleagues provides data on the global burden of snakebite
A PLoS Medicine Neglected Diseases Article by José María Gutiérrez and colleagues discusses the neglected problem of snakebite envenoming
PMCID: PMC3091849  PMID: 21572992
19.  Zinc status and dietary quality of type 2 diabetic patients: implication of physical activity level 
The purpose of this study was to analyze the relationships among zinc status, diet quality, glycemic control and self-rated physical activity level of type 2 diabetic patients. Dietary intakes for two non-consecutive days were measured by 24-hour recall method for seventy-six diabetic patients. Fasting blood glucose and HbA1c were measured for the assessment of glycemic control. We evaluated the extent of dietary adequacy by the percentage of subjects with a dietary intake of a nutrient less than the estimated average requirement (EAR), the dietary diversity score (DDS) and the dietary variety score (DVS). Zinc status was assessed from serum levels and urinary excretion. Dietary inadequacy was serious for five nutrients: riboflavin, calcium, thiamin, zinc and vitamin C. Dietary intakes from the meat, fish, and egg food groups and the milk food group were below the recommended level. We found that subjects with high levels of physical activities had significantly higher DVS and serum zinc levels compared to others (p<0.05). Fasting blood glucose levels and HbA1c were not significantly different across self-reported physical activity levels. Therefore, we suggest that maintaining physical activity at or above a moderate level is beneficial to improving dietary quality and zinc status.
PMCID: PMC2815304  PMID: 20126364
Diabetes; zinc status; diet quality; physical activity level; glycemic control
20.  Changes in dietary habits after migration and consequences for health: a focus on South Asians in Europe 
Food & Nutrition Research  2012;56:10.3402/fnr.v56i0.18891.
Immigrants from low-income countries comprise an increasing proportion of the population in Europe. A higher prevalence of obesity and nutrition-related diseases, such as type 2 diabetes (T2D) and cardiovascular disease (CVD), is found in some immigrant groups, especially South Asians.
To review dietary changes after migration and discuss the implication for health and prevention among immigrants from low-income countries to Europe, with a special focus on South Asians.
Systematic searches in PubMed were performed to identify relevant high quality review articles and primary research papers. The searches were limited to major immigrant groups in Europe, including those from South Asia (India, Pakistan, Bangladesh, Sri Lanka). Articles in English from 1990 and onwards from Europe were included. For health implications, recent review articles and studies of particular relevance to dietary changes among South Asian migrants in Europe were chosen.
Most studies report on dietary changes and health consequences in South Asians. The picture of dietary change is complex, depending on a variety of factors related to country of origin, urban/rural residence, socio-economic and cultural factors and situation in host country. However, the main dietary trend after migration is a substantial increase in energy and fat intake, a reduction in carbohydrates and a switch from whole grains and pulses to more refined sources of carbohydrates, resulting in a low intake of fiber. The data also indicate an increase in intake of meat and dairy foods. Some groups have also reduced their vegetable intake. The findings suggest that these dietary changes may all have contributed to higher risk of obesity, T2D and CVD.
Implications for prevention
A first priority in prevention should be adoption of a low-energy-density, high-fiber diet, rich in whole grains and grain products, as well as fruits, vegetables and pulses. Furthermore, avoidance of energy-dense, highly processed foods is an important preventive measure.
PMCID: PMC3492807  PMID: 23139649
dietary change; food habits; diabetes; cardiovascular disease; immigrants; South Asia; acculturation
21.  Validation of a functional screening instrument for dementia in an elderly Sri Lankan population: comparison of modified Bristol and Blessed Activities of Daily Living scales 
BMC Research Notes  2010;3:268.
Cognitive tests have been used in population surveys as first-stage screens for dementia but are biased by education. However, functional ability scales are less biased by education than cognitive scales and can therefore be used in screening for dementia.
To validate an Activities of Daily Living (ADL) scale appropriate for use in assessing the presence of dementia in an elderly population living in care homes in Sri Lanka.
The Sinhalese versions of the modified Bristol and Blessed scales were administered to subjects aged 55 years and above residing in 14 randomly selected elders' homes. Receiver Operating Characteristic (ROC) analysis was used to determine the cut-off scores of both scales.
Based on the ROC analysis, the optimal cut-off score of the modified Bristol scale was 20, with a sensitivity of 100%, a specificity of 74.2% and an area under the curve of 0.933 (95% CI: 0.871-0.995), while the optimal cut-off score of the modified Blessed scale was 10.5, with a sensitivity of 100%, a specificity of 71% and an area under the curve of 0.892 (95% CI: 0.816-0.967).
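A common way to select such an optimal cut-off from ROC data is Youden's index (sensitivity + specificity - 1); the abstract does not state which criterion was used here, so the following is an illustrative sketch with made-up scores and dementia labels:

```python
# Choose the screening cut-off maximizing Youden's J = sens + spec - 1.
# Scores and labels below are hypothetical, not study data.

def roc_optimal_cutoff(scores, labels):
    """Return (cutoff, J); a score >= cutoff counts as screen-positive."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cut and y)
        fn = sum(1 for s, y in zip(scores, labels) if s < cut and y)
        tn = sum(1 for s, y in zip(scores, labels) if s < cut and not y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= cut and not y)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

scores = [5, 8, 12, 20, 22, 25]
has_dementia = [False, False, False, True, True, True]
cut, j = roc_optimal_cutoff(scores, has_dementia)
print(cut, round(j, 2))  # 20 1.0
```

In practice a screening instrument often weights sensitivity over specificity (as the 100% sensitivity cut-offs above suggest), so the chosen threshold need not be the Youden optimum.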
The findings confirm that both the scales can be used in screening for dementia in the elderly living in care homes in Sri Lanka.
PMCID: PMC2987868  PMID: 20974013
22.  Comparison of Nutrient Intake and Diet Quality Between Hyperuricemia Subjects and Controls in Korea 
Clinical Nutrition Research  2014;3(1):56-63.
Hyperuricemia is associated with metabolic syndrome as well as gout, and the prevalence of hyperuricemia is increasing in Korea. This study aimed to compare the nutrient intake and diet quality between hyperuricemia subjects and controls. Of the 28,589 people who participated in a health examination between 2008 and 2011, 9,010 subjects were selected whose 3-day food records were available. Clinical and laboratory data were collected from electronic medical records. Diet quality was evaluated using the food habit score (FHS), nutrient adequacy ratio (NAR), and mean adequacy ratio (MAR). The prevalence of hyperuricemia was 13.8% (27.1%, men; 5.2%, women). Body mass index, waist circumference, triglycerides, total cholesterol, and low-density lipoprotein cholesterol were significantly higher (p < 0.0001), while high-density lipoprotein cholesterol (p < 0.001) was significantly lower in the hyperuricemia subjects than in the controls. The hyperuricemia subjects had a lower intake of vitamin A (p < 0.004), vitamin C, folate, fiber, and calcium than the controls (p < 0.0001). Intake of vegetables and dairy products was significantly lower, whereas alcohol intake was significantly higher in the hyperuricemia subjects than in the controls (p < 0.0001). The FHS (p < 0.0001), MAR (p < 0.0001), and NARs for vitamin A (p = 0.01), vitamin B2, vitamin C, folate, and calcium (p < 0.0001) were significantly lower in the hyperuricemia subjects than in the controls. In conclusion, the hyperuricemia subjects reported poorer diet quality than the controls, including higher alcohol intake and lower vegetable and dairy product intake.
PMCID: PMC3921296  PMID: 24527421
Hyperuricemia; Uric acid; Diet records; Nutritive Value; Food Habits
23.  The Child Behaviour Assessment Instrument: development and validation of a measure to screen for externalising child behavioural problems in community setting 
In Sri Lanka, behavioural problems have grown to epidemic proportions, accounting for the second highest category of mental health problems among children. Early identification of behavioural problems in children is an important pre-requisite for the implementation of interventions to prevent long term psychiatric outcomes. The objectives of the study were to develop and validate a screening instrument for use in the community setting to identify behavioural problems in children aged 4-6 years.
An initial 54-item questionnaire was developed following an extensive review of the literature. A three-round Delphi process involving a panel of experts from six relevant fields was then undertaken to refine the nature and number of items, producing the 15-item community screening instrument, the Child Behaviour Assessment Instrument (CBAI). The validation study was conducted in the Medical Officer of Health area of Kaduwela, Sri Lanka, where a community sample of 332 children aged 4-6 years was recruited by a two-stage randomization process. The behaviour status of the participants was assessed concurrently by an interviewer using the CBAI and by a clinical psychologist following clinical assessment. Criterion validity was appraised by assessing the sensitivity, specificity, and predictive values at the optimum screening cut-off value. Construct validity of the instrument was quantified by testing whether the validation-study data fit a hypothetical model. Face and content validity of the CBAI were qualitatively assessed by a panel of experts. The reliability of the instrument was assessed by internal consistency analysis and test-retest methods in a 15% subset of the community sample.
Using Receiver Operating Characteristic (ROC) analysis, a CBAI score of >16 was identified as the cut-off point that optimally differentiated children having behavioural problems, with a sensitivity of 0.88 (95% CI = 0.80-0.96) and a specificity of 0.81 (95% CI = 0.75-0.87). Cronbach's alpha exceeded Nunnally's criterion of 0.7 for items related to inattention, aggression, and impaired social interaction.
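The abstract does not state which criterion was used to select the >16 cut-off; a common choice in ROC analysis is the threshold maximizing Youden's J (sensitivity + specificity − 1). A minimal sketch of that approach, with hypothetical scores and clinician labels rather than the study's data:

```python
def youden_cutoff(scores, labels):
    """Pick the threshold t maximizing Youden's J = sensitivity + specificity - 1.
    A subject screens positive when score > t; labels: 1 = case, 0 = non-case."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s > t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s <= t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s > t and y == 0)
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical screening scores and concurrent clinical labels
scores = [10, 12, 14, 18, 20, 22]
labels = [0, 0, 0, 1, 1, 1]
print(youden_cutoff(scores, labels))  # → (14, 1.0)
```

Maximizing J weights sensitivity and specificity equally; a screening instrument intended for early community identification might instead favour sensitivity, which is one reason the choice of criterion matters.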
Preliminary data obtained from the study indicate that the Child Behaviour Assessment Instrument is a valid and reliable screening instrument for early identification of young children at risk of behavioural problems in the community setting.
PMCID: PMC2897774  PMID: 20529304
24.  Assessment of the nutritional status of urban homeless adults. 
Public Health Reports  1989;104(5):451-457.
Homeless people eat foods at municipal and charity-run shelters, at fast-food restaurants and delicatessens, and from garbage bins. Data on the adequacy of the diets and the nutritional status of homeless persons are sparse. Therefore, nutritional indicators of 55 urban homeless subjects were assessed, and a high prevalence of risk factors was identified. Although 93 percent of subjects reported that they obtained enough to eat, a low dietary adequacy score of 10.1 (norm = 16) indicated that the quality of the diet was inadequate. Diet records showed a high intake of sodium, saturated fat, and cholesterol. Serum cholesterol levels above the desirable limit of 200 mg per dl were prevalent. Anthropometric measurements were significantly different from percentile distributions of the U.S. population (p < 0.001). Triceps skinfold measurement was above the 95th percentile in 25 percent of subjects. Upper arm muscle area, which reflects lean body mass, was below the 5th percentile in 23.3 percent of the women and 44 percent of the men. These decreased levels of lean body mass and increased levels of body fat, together with the elevated serum cholesterol levels and the shortages of essential nutrients in the diet, may place the homeless at risk of developing nutrition-related disorders.
PMCID: PMC1579959  PMID: 2508173
25.  Ethnobotanical study of wild edible plants of Kara and Kwego semi-pastoralist people in Lower Omo River Valley, Debub Omo Zone, SNNPR, Ethiopia 
The rural populations in Ethiopia have a rich knowledge of wild edible plants, and consumption of wild edible plants is still an integral part of the different cultures in the country. In the southern part of the country, wild edible plants are used as dietary supplements and as a means of survival during times of food shortage. Therefore, the aim of this study is to document the wild edible plants gathered and consumed by the Kara and Kwego people, and to analyze patterns of use between the two groups.
A cross-sectional ethnobotanical study of wild edible plant species was conducted from January 2005 to March 2007. About 10% of each group, 150 Kara and 56 Kwego, were randomly selected to serve as informants. Data were collected using a semi-structured questionnaire and group discussions. Analysis of variance (α = 0.05) was used to test the similarity of species richness of wild edible plants reported by the Kara and Kwego people; Pearson's Chi-square test (α = 0.05) was used to test the similarity of growth forms and plant parts of wild edible plants used between the two groups.
Thirty-eight wild plant species were reported as food sources that were gathered and consumed both at times of plenty and of scarcity; three were unique to the Kara, five to the Kwego, and 14 had similar local names. The plant species were distributed among 23 families and 33 genera. Species richness at the family, genus, and species levels was not significantly different between the Kara and Kwego (p > 0.05). Nineteen (50%) of the reported wild edible plants were trees, 11 (29%) were shrubs, six (16%) were herbs, and two (5%) were climbers. Forty plant parts were indicated as edible: 23 (58.97%) fruits, 13 (33.33%) leaves, 3 (7.69%) roots, and one (2.56%) seed. There was no difference between the Kara and Kwego people in the growth forms of the wild edible plants reported (Pearson's Chi-square test, d.f. = 3, 0.872) or in the plant parts used (Pearson's Chi-square test, d.f. = 3, 0.994). The majority of wild edible plants were gathered and consumed from 'Duka' (March) to 'Halet' (May) and from 'Meko' (August) to 'Tejo' (November). Sixteen (41%) of the plant parts were used as a substitute for cultivated vegetables during times of scarcity. The vegetables were chopped and boiled to make 'Belesha' (sauce) or eaten as a relish with 'Adano' (porridge). The ripe fruits were gathered and consumed fresh, and some were made into juices. The seeds and underground parts were only consumed in times of famine. Thirty-seven percent of the wild edible plants were used as medicine and 23.6% were used for other functions.
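The Pearson Chi-square comparison used above (a 2 × 4 table of category counts, d.f. = 3) is straightforward to compute by hand, although in practice a library routine such as scipy.stats.chi2_contingency would normally be used. A minimal sketch; the counts in the example are hypothetical, not the study's data:

```python
def chi_square(observed_a, observed_b):
    """Pearson chi-square statistic for a 2 x k table of category
    counts (e.g. growth forms reported by two groups)."""
    total_a, total_b = sum(observed_a), sum(observed_b)
    grand = total_a + total_b
    stat = 0.0
    for oa, ob in zip(observed_a, observed_b):
        col = oa + ob
        ea = col * total_a / grand  # expected count, group A
        eb = col * total_b / grand  # expected count, group B
        stat += (oa - ea) ** 2 / ea + (ob - eb) ** 2 / eb
    return stat

# Hypothetical growth-form counts (trees, shrubs, herbs, climbers)
kara = [19, 11, 6, 2]
kwego = [17, 10, 5, 2]
# d.f. = (2 - 1) * (4 - 1) = 3; chi-square critical value at alpha = 0.05 is 7.815
print(chi_square(kara, kwego) < 7.815)  # → True (similar distributions)
```

A statistic below the critical value means the two groups' category distributions are consistent with a common underlying distribution, which is the "no difference" conclusion the abstract reports.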
The wild edible plants were used as supplements to the cultivated crops and as famine foods between harvesting seasons. However, information on the nutritional values and possible toxic effects of most of the wild edible plants reported by the Kara and Kwego, and of others in different parts of Ethiopia, is not available. Therefore, the documented information on the wild edible plants may serve as baseline data for future studies on nutritional values and possible side effects, and help identify plants that may improve nutrition and increase dietary diversity. Some of these wild edible plants may have the potential to be valuable food sources (if cultivated) and could be part of a strategy for tackling food insecurity.
PMCID: PMC2933608  PMID: 20712910
