Although podoconiosis is one of the major causes of tropical lymphoedema and is endemic in Ethiopia, its epidemiology and risk factors are poorly understood. Individual-level data for 129,959 individuals from 1,315 communities in 659 woredas (districts) were collected for a nationwide integrated survey of lymphatic filariasis and podoconiosis. Blood samples were tested for circulating Wuchereria bancrofti antigen using immunochromatographic card tests. A clinical algorithm was used to reach a diagnosis of podoconiosis by excluding other potential causes of lymphoedema of the lower limb. Bayesian multilevel models were used to identify individual and environmental risk factors. Overall, 8,110 of 129,959 (6.2%, 95% confidence interval [CI] 6.1–6.4%) surveyed individuals were identified with lymphoedema of the lower limb, of whom 5,253 (4.0%, 95% CI 3.9–4.1%) were confirmed to be podoconiosis cases. In multivariable analysis, being female, older, unmarried, washing the feet less frequently than daily, and being semi-skilled or unemployed were significantly associated with increased risk of podoconiosis. Attending formal education and living in a house with a covered floor were associated with decreased risk of podoconiosis. Podoconiosis exhibits marked geographical variation across Ethiopia, with variation in risk associated with variation in rainfall, enhanced vegetation index and altitude.
podoconiosis; mapping; non-filarial elephantiasis; lymphoedema; Ethiopia
The present study describes the distribution of selected micronutrients and anaemia among school-aged children living in Libo Kemkem and Fogera (Amhara State, Ethiopia), assessing differences by socio-demographic characteristics, health status and dietary habits.
A cross-sectional survey was carried out during May–December 2009. Socio-demographic characteristics, health status and dietary habits were collected. Biomarkers were determined for 764 children. Bivariate and multivariable statistical methods were employed to assess micronutrient deficiencies (MD), anaemia, and their association with different factors.
A large majority (79.5%) of the school-aged children had at least one MD and 40.5% had two or more coexisting micronutrient deficiencies. The most prevalent deficiencies were of zinc (12.5%), folate (13.9%), vitamin A (29.3%) and vitamin D (49%). Anaemia occurred in 30.9% of the children. Children living in rural areas were more likely to have vitamin D insufficiency [OR: 5.9 (3.7–9.5)] but less likely to have folate deficiency [OR: 0.2 (0.1–0.4)] or anaemia [OR: 0.58 (0.35–0.97)]. Splenomegaly was positively associated with folate deficiency and anaemia [OR: 2.77 (1.19–6.48) and 4.91 (2.47–9.75), respectively]. Meat and fish consumption were inversely associated with zinc and ferritin deficiencies [OR: 0.2 (0.1–0.8) and 0.2 (0.1–0.9)], while oil consumption showed a negative association with anaemia and with deficiencies of folate and vitamin A [OR: 0.58 (0.3–0.9), 0.5 (0.3–0.9) and 0.6 (0.4–0.9)]. Serum ferritin levels were inversely correlated with the presence of anaemia (p<0.005).
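The adjusted odds ratios quoted above come from multivariable models; as a reminder of the quantity being reported, the sketch below computes a crude OR with a 95% Wald confidence interval from a 2×2 table. The counts and function name are illustrative, not from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% Wald CI from a 2x2 table:
    a, b = outcome present/absent among exposed;
    c, d = outcome present/absent among unexposed."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = or_ * math.exp(-z * se_log_or)
    upper = or_ * math.exp(z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 30/70 exposed and 10/90 unexposed with the outcome.
print(odds_ratio_ci(30, 70, 10, 90))
```

An OR below 1 whose entire CI lies below 1, as for folate deficiency in rural children [OR: 0.2 (0.1–0.4)], indicates significantly lower odds of the outcome.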
There is a high prevalence of vitamin A deficiency and vitamin D insufficiency and a moderate prevalence of zinc and folate deficiencies in school-aged children in this area. The inverse association between anaemia and serum ferritin levels may be due to the presence of infectious diseases in the area. To effectively tackle malnutrition, strategies should target not only isolated micronutrient supplementation but also diet diversification.
A rapid, sensitive and accurate laboratory diagnosis is of prime importance in suspected extrapulmonary tuberculosis (EPTB) cases. However, traditional techniques for the detection of acid-fast bacilli have limitations. The aim of the study was to evaluate the diagnostic value of immunocytochemical staining for detection of Mycobacterium tuberculosis complex specific antigen, MPT64, in aspirates from pleural effusions and lymph nodes, the most common presentations of EPTB.
A cross-sectional study was conducted by including patients at Tikur Anbessa Specialized Hospital and the United Vision Medical Services from December 2011 to June 2012. Lymph node aspirates and pleural fluid samples were collected and analyzed from a total of 51 cases (26 tuberculous (TB) pleuritis and 25 TB lymphadenitis) and 67 non-TB controls. Each specimen was subjected to Ziehl-Neelsen (ZN) staining, culture on Lowenstein–Jensen (LJ) medium, cytological examination, Polymerase Chain Reaction (PCR) using primers targeting the IS1081 gene sequence, and immunocytochemistry (ICC) with polyclonal anti-MPT64 antibody. All patients were screened for HIV.
ICC was positive in 38 of 51 cases and in 7 of 67 controls, giving an overall sensitivity and specificity of 74.5% and 89.5%, respectively. Using IS1081-PCR as the reference method, the sensitivity, specificity, and positive and negative predictive values of ICC were 88.1%, 89.5%, 82.2% and 93.2%, respectively. The case detection rate increased from 13.7% by ZN stain to 19.6% by LJ culture, 66.7% by cytology and 74.5% by ICC.
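The headline accuracy figures follow directly from the reported counts (38/51 ICC-positive cases; 7/67 ICC-positive controls). A minimal sketch of the calculation, with the counts taken from the abstract and the function names ours:

```python
# Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# From the results: 38 of 51 cases and 7 of 67 controls were ICC-positive.
tp, fn = 38, 51 - 38
fp, tn = 7, 67 - 7

print(round(100 * sensitivity(tp, fn), 1))  # 74.5
print(round(100 * specificity(tn, fp), 1))  # 89.6 (reported as 89.5%)
```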
Immunocytochemistry with anti-MPT64 antibody improved detection of TB in pleural effusion and lymph node aspirates. Further studies using monoclonal antibodies on samples from other sites of EPTB are recommended to validate this relatively simple diagnostic method for EPTB.
Tuberculous lymphadenitis; Tuberculous pleural effusion; M. tuberculosis; ICC; MPT64 antigen
Little information is available on malnutrition-related factors among school-aged children ≥5 years in Ethiopia. This study describes the prevalence of stunting and thinness and their related factors in Libo Kemkem and Fogera, Amhara Regional State and assesses differences between urban and rural areas.
In this cross-sectional study, anthropometrics and individual and household characteristics data were collected from 886 children. Height-for-age z-score for stunting and body-mass-index-for-age z-score for thinness were computed. Dietary data were collected through a 24-hour recall. Bivariate and backward stepwise multivariable statistical methods were employed to assess malnutrition-associated factors in rural and urban communities.
The prevalence of stunting among school-aged children was 42.7% in rural areas and 29.2% in urban areas, while the corresponding figures for thinness were 21.6% and 20.8%. Age differences were significant in both strata. In the rural setting, fever in the previous 2 weeks (OR: 1.62; 95% CI: 1.23–2.32), consumption of food from animal sources (OR: 0.51; 95% CI: 0.29–0.91) and consumption of the family's own cattle products (OR: 0.50; 95% CI: 0.27–0.93), among other factors, were significantly associated with stunting, while in the urban setting, only age (OR: 4.62; 95% CI: 2.09–10.21) and years of schooling of the person in charge of food preparation (OR: 0.88; 95% CI: 0.79–0.97) were significant. Thinness was statistically associated with the number of children living in the house (OR: 1.28; 95% CI: 1.03–1.60) and family rice cultivation (OR: 0.64; 95% CI: 0.41–0.99) in the rural setting, and with consumption of food from animal sources (OR: 0.26; 95% CI: 0.10–0.67) and literacy of the head of household (OR: 0.24; 95% CI: 0.09–0.65) in the urban setting.
The prevalence of stunting was significantly higher in rural areas, whereas no significant differences were observed for thinness. Various factors were associated with one or both types of malnutrition, and varied by type of setting. To effectively tackle malnutrition, nutritional programs should be oriented to local needs.
Food-borne infections cause huge economic and human life losses worldwide. The most common contaminants of foods include Listeria monocytogenes, Salmonellae and Staphylococcus aureus. L. monocytogenes is the most notorious owing to its tolerance of common food preservation methods and the risks it poses, including higher fatality rates. Safer, more efficacious control methods are thus needed. Along with food-borne pathogens, lactic acid bacteria (LAB) can also be found in foods. Some LAB isolates inhibit pathogenic bacteria by various mechanisms, including the production of antimicrobial metabolites.
Cell-free culture supernatants (CFS) derived from broth cultures of selected local LAB and yeast isolates, some of which were subjected to various treatments, were tested for inhibition of L. monocytogenes, Salmonella spp. and S. aureus in vitro. Various proportions of the CFSs were incorporated into the growth medium either concurrently with inoculation of the pathogens (co-cultures) or after limited proliferation following inoculation (delayed cultures), and the effects of the CFSs on various growth parameters were assessed.
CFS from the LAB isolates were strongly inhibitory when co-cultured. The inhibitory activities were stable following heat or protease treatment of the CFSs, and depended primarily on active substance(s) secreted into the supernatant. In all co-cultures, a CFS proportion-dependent progressive decrease in colony counts was observed; growth rates and numbers of generations were significantly reduced, and generation times significantly increased, compared with those of controls. Transfer from co-cultures to fresh broth showed that inhibited cultures contained bacteria able to re-grow, indicating the presence of viable bacteria undetectable by culture. Growth rates in CFS-treated delayed cultures were also reduced to varying degrees, with colony counts in some cultures significantly below the corresponding control values. CFSs were active against both Gram-positive and Gram-negative bacteria.
Active metabolites produced and secreted by LAB into the growth medium were effective in inhibiting the tested pathogens. Early addition of the CFSs was necessary for significant inhibition to occur. Further studies will help make these findings applicable to food safety.
Lactic acid bacteria; Listeria monocytogenes; Salmonella; Staphylococcus aureus; Inhibition; Cell-free supernatant
The World Health Organization (WHO), international donors and partners have emphasized the importance of integrated control of neglected tropical diseases (NTDs). Integrated mapping of NTDs is a first step for integrated planning of programmes, proper resource allocation and monitoring progress of control. Integrated mapping has several advantages over disease specific mapping by reducing costs and enabling co-endemic areas to be more precisely identified. We designed and conducted integrated mapping of lymphatic filariasis (LF) and podoconiosis in Ethiopia; here we present the methods, challenges and lessons learnt.
Integrated mapping of 1315 communities across Ethiopia was accomplished within three months. Within these communities, 129,959 individuals provided blood samples that were tested for circulating Wuchereria bancrofti antigen using immunochromatographic card tests (ICT). Wb123 antibody tests were used to further establish exposure to LF in areas where at least one ICT positive individual was detected. A clinical algorithm was used to reliably diagnose podoconiosis by excluding other potential causes of lymphoedema of the lower limb.
A total of 8110 individuals with leg swelling were interviewed and underwent physical examination. Smartphones linked to a central database were used to collect data, which facilitated real-time data entry and reduced costs compared to traditional paper-based data collection approaches; their inbuilt Global Positioning System (GPS) function enabled simultaneous capture of geographical coordinates. The integrated approach led to efficient use of resources and rapid mapping of an enormous geographical area, and was well received by survey staff and collaborators. Mobile-based technology can be used for such large-scale studies in resource-constrained settings such as Ethiopia, with minimal challenges.
This was the first integrated mapping of podoconiosis and LF globally. Integrated mapping of podoconiosis and LF is feasible and, if properly planned, can be quickly achieved at nationwide scale.
Electronic supplementary material
The online version of this article (doi:10.1186/1756-3305-7-397) contains supplementary material, which is available to authorized users.
Integrated; Mapping; Lymphedema; Elephantiasis; Lymphatic filariasis; Podoconiosis; Ethiopia
Early detection of drug resistance is one of the priorities of tuberculosis (TB) control programs as drug resistance is increasing. New molecular assays are accessible for only a minority of the second-line drugs, and their availability in high-endemic settings is also hampered by high cost and logistic challenges. Therefore, we evaluated a previously developed method for drug susceptibility testing (DST) covering both first- and second-line anti-TB drugs for use in high-endemic areas.
Baseline mycobacterial isolates from 78 consecutive pulmonary TB patients from Addis Ababa, Ethiopia who were culture positive for Mycobacterium tuberculosis at the end of a two-month directly observed treatment short course (DOTS) were included. The isolates were simultaneously tested for isoniazid, rifampicin, ethambutol, streptomycin, amikacin, kanamycin, capreomycin, ofloxacin, moxifloxacin, ethionamide and para-aminosalicylic acid susceptibility using the indirect proportion method adapted to 24-well agar plates containing Middlebrook 7H10 medium. Applying the 24-well plate assay, 43 (55.1%) isolates were resistant to one or more of the first-line drugs tested (isoniazid, rifampicin and ethambutol). MDR-TB was identified in 20.5% of this selected group, and there was perfect correlation for rifampicin resistance with the results from the genotype MTBDRplus assay. All isolates were susceptible to aminoglycosides and fluoroquinolones, in agreement with the genotype MTBDRsl assay. The only second-line drug tested that was associated with resistance was ethionamide (14.1% resistant). The method was reproducible, with stable results for internal controls (one multi-drug resistant (MDR) strain and one pan-susceptible strain (H37Rv)), and DST results could be reported at two weeks.
The 24-well plate method for simultaneous DST of first- and second-line drugs was found to be reproducible and correlated well with molecular drug susceptibility tests. It is likely to be useful in high-endemic areas for surveillance as well as for the detection of second-line drug resistance in targeted groups, such as those who fail empirical MDR treatment.
Susceptibility testing; Epidemiological cut off value (ECOFF); Multi drug resistant (MDR) tuberculosis; Ethiopia
Increased resistance in Plasmodium falciparum parasites led to the withdrawal of the antimalarial drugs chloroquine (CQ) and sulphadoxine-pyrimethamine in Ethiopia. Since 2004, artemether-lumefantrine has served to treat uncomplicated P. falciparum malaria. However, increasing reports of delayed parasite clearance with artemisinin open up a new challenge in anti-malarial therapy. With the complete withdrawal of CQ for the treatment of P. falciparum malaria, this study assessed the evolution of CQ resistance by investigating the prevalence of mutant alleles in the pfmdr1 and pfcrt genes of P. falciparum and the pvmdr1 gene of Plasmodium vivax in Southern and Eastern Ethiopia.
Of the 1,416 febrile patients attending primary health facilities in Southern Ethiopia, 329 positive for P. falciparum or P. vivax were recruited. Similarly, of the 1,304 febrile patients from Eastern Ethiopia, 81 positive for P. falciparum or P. vivax were included in the study. Direct sequencing was used to investigate the prevalence of mutations in pfcrt and pfmdr1 in the 410 finger-prick blood samples collected from malaria patients. This included determining the pfmdr1 gene copy number in 195 P. falciparum clinical isolates and mutations at the pvmdr1 locus in 215 P. vivax clinical isolates.
The pfcrt K76 CQ-sensitive allele was observed in 84.1% of the investigated P. falciparum clinical isolates. The pfcrt double mutations (K76T and C72S) were observed in less than 3% of isolates. The pfcrt SVMNT haplotype was also found to be present in clinical isolates from Ethiopia. The pfcrt CVMNK-sensitive haplotype was frequently observed (95.9%). The pfmdr1 mutation N86Y was observed in only 14.9% of clinical isolates, compared with 85.1% that carried the sensitive allele. The sensitive pfmdr1 Y184 allele was likewise more common, present in 94.9% of clinical isolates. None of the investigated P. falciparum clinical isolates carried the S1034C, N1042D or D1246Y pfmdr1 polymorphisms. All investigated P. falciparum clinical isolates from Southern and Eastern Ethiopia carried only a single copy of the pfmdr1 gene.
The study reports for the first time the return of chloroquine-sensitive P. falciparum in Ethiopia. These findings support the rationale for the use of CQ-based combination drugs as a possible future alternative.
Malaria; Plasmodium falciparum; Plasmodium vivax; Ethiopia; pfcrt; pfmdr1; pvmdr1; pfmdr1 gene copy number
Genetic factors are involved in susceptibility or protection to tuberculosis (TB). Apart from gene polymorphisms and mutations, changes in levels of gene expression, induced by non-genetic factors, may also determine whether individuals progress to active TB.
We analysed the expression level of 45 genes in a total of 47 individuals (23 healthy household contacts and 24 new smear-positive pulmonary TB patients) in Addis Ababa using a dual colour multiplex ligation-dependent probe amplification (dcRT-MLPA) technique to assess gene expression profiles that may be used to distinguish TB cases and their contacts and also latently infected (LTBI) and uninfected household contacts.
The gene expression level of BLR1, Bcl2, IL4d2, IL7R, FCGR1A, MARCO, MMP9, CCL19, and LTF had significant discriminatory power between sputum smear-positive TB cases and household contacts, with AUCs of 0.84, 0.81, 0.79, 0.79, 0.78, 0.76, 0.75, 0.75 and 0.68 respectively. The combination of Bcl2, BLR1, FCGR1A, IL4d2 and MARCO identified 91.66% of active TB cases and 95.65% of household contacts without active TB. The expression of CCL19, TGFB1, and Foxp3 showed significant difference between LTBI and uninfected contacts, with AUCs of 0.85, 0.82, and 0.75, respectively, whereas the combination of BPI, CCL19, FoxP3, FPR1 and TGFB1 identified 90.9% of QFT- and 91.6% of QFT+ household contacts.
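The AUC values above summarize how well each gene's expression level separates the two groups (1.0 is perfect separation; 0.5 is chance). The rank-based (Mann-Whitney) form of the AUC can be sketched as below; the expression values are made up for illustration and the function name is ours.

```python
def auc(cases, controls):
    """Rank-based AUC: the fraction of case/control pairs in which the
    case has the higher value (ties count as half a win)."""
    wins = sum((x > y) + 0.5 * (x == y) for x in cases for y in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical expression levels for one gene in cases vs. controls.
print(auc([2.1, 3.5, 2.8, 4.0], [1.0, 2.0, 1.5, 2.2]))  # 0.9375
```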
Expression of single and especially combinations of host genes can accurately differentiate between active TB cases and healthy individuals as well as between LTBI and uninfected contacts.
Field-applicable tests detecting asymptomatic Mycobacterium leprae (M. leprae) infection or predicting progression to leprosy are urgently required. Since the outcome of M. leprae infection is determined by cellular and humoral immunity, we aim to develop diagnostic tests detecting pro-/anti-inflammatory and regulatory cytokines as well as antibodies against M. leprae. Previously, we developed lateral flow assays (LFA) for detection of cytokines and anti-PGL-I antibodies. Here we evaluate progress on newly developed LFAs for application in resource-poor settings.
The combined diagnostic value of IP-10, IL-10 and anti-PGL-I antibodies was tested using M. leprae-stimulated blood of leprosy patients and endemic controls (EC). For reduction of the overall test-to-result time the minimal whole blood assay time required to detect distinctive responses was investigated. To accommodate LFAs for field settings, dry-format LFAs for IP-10 and anti-PGL-I antibodies were developed allowing storage and shipment at ambient temperatures. Additionally, a multiplex LFA-format was applied for simultaneous detection of anti-PGL-I antibodies and IP-10. For improved sensitivity and quantitation upconverting phosphor (UCP) reporter technology was applied in all LFAs.
Single and multiplex UCP-LFAs correlated well with ELISAs. The performance of dry reagent assays and portable, lightweight UCP-LF strip readers indicated excellent field-robustness. Notably, detection of IP-10 levels in stimulated samples allowed a reduction of the whole blood assay time from 24 h to 6 h. Moreover, IP-10/IL-10 ratios in unstimulated plasma differed significantly between patients and EC, indicating the feasibility to identify M. leprae infection in endemic areas.
Dry-format UCP-LFAs are low-tech, robust assays allowing detection of relevant cytokines and antibodies in response to M. leprae in the field. The high levels of IP-10 and the required shorter whole blood assay time, render this cytokine useful to discriminate between leprosy patients and EC.
Leprosy is one of the six diseases considered by WHO as a major threat in developing countries and often results in severe, life-long disabilities and deformities due to delayed diagnosis. Early detection of Mycobacterium leprae (M. leprae) infection, followed by effective interventions, is considered vital to interrupt transmission. Thus, field-friendly tests that detect asymptomatic M. leprae infection are urgently required. The clinical outcome after M. leprae infection is determined by the balance of pro- and anti-inflammatory cytokines and antibodies in response to M. leprae. In this study, we developed lateral flow assays (LFA) for detection of pro-inflammatory (IP-10) vs. anti-inflammatory/regulatory (IL-10) cellular immunity as well as antibodies against M. leprae and evaluated these in a field setting in Ethiopia using lightweight, portable readers. We show that detection of IP-10 allowed a significant reduction of the overall test-to-result time from 24 h to 6 h. Moreover, IP-10/IL-10 ratios in unstimulated plasma differed significantly between patients and EC, which can provide means to identify M. leprae infection. Thus, the LFAs are low-tech, robust assays that can be applied in resource-poor settings measuring immunity to M. leprae and can be used as tools for early diagnosis of leprosy leading to timely treatment and reduced transmission.
In northwest Ethiopia, in the South Gondar region, there was a visceral leishmaniasis (VL) outbreak in 2005, and the disease has been a public health concern for the regional health authorities ever since. Knowledge of how the population perceives the disease is essential in order to propose successful control strategies.
Two surveys on VL knowledge, attitudes and practices were conducted at the beginning (May 2009) and at the end (February 2011) of a VL longitudinal study carried out in rural communities of Libo Kemkem and Fogera, two districts of the Amhara Regional State. Results showed that VL global knowledge was very low in the area, and that it improved substantially in the period studied. Specifically, from 2009 to 2011, the frequency of proper knowledge regarding VL signs and symptoms increased from 47% to 71% (p<0.0001), knowledge of VL causes increased from 8% to 25% (p<0.0001), and knowledge of VL protection measures from 16% to 55% (p<0.0001). Moreover, the improvement observed in VL knowledge was more marked among families with no previous history of a VL case. Finally, in 2011 more than 90% of the households owned at least one impregnated bed net and had been sprayed, and attitudes towards these and other protective measures were very positive (over 94% acceptance for all of them).
In 2009 the level of knowledge regarding VL was very low among the rural population of this area, although it improved substantially in the study period, probably due to the contribution of many actors in the area. VL patients and relatives should be appropriately informed and trained as they may act as successful health community agents. VL risk behavioural patterns are subject to change as attitudes towards protective measures were very positive overall.
Visceral leishmaniasis (VL) is a vector borne disease that can be fatal if left untreated. In northern Ethiopia there was a VL outbreak in 2005, making the disease a public health challenge ever since. In order to promote the participation of communities in the control of the disease, it is essential to know how they perceive the disease and its management. There is a paucity of studies dealing with the knowledge, attitudes and practices (KAP) towards VL in the world in general and in rural Ethiopia in particular. We conducted two KAP studies at the beginning and at the end of a VL longitudinal study carried out between 2009 and 2011. The project included VL community talks and sensitization, and there were other interventions implemented by different actors in this period. Our results showed that, among the rural communities surveyed, the knowledge regarding signs and symptoms, causes, and protective measures of the disease was very low. However, it improved substantially in the period studied, suggesting that knowledge was subject to change by community interventions. It also showed that VL patients and relatives can act as successful health agents and that the population had positive attitudes towards the implementation of preventive actions.
The study aimed to determine the prevalence and drug resistance patterns of Mycobacterium tuberculosis among new smear-positive pulmonary tuberculosis patients visiting TB diagnosis and treatment facilities at selected health facilities in eastern Ethiopia. A cross-sectional study was conducted between October 2011 and May 2013. A total of 408 new adult pulmonary TB patients (≥ 18 years) were enrolled. Three consecutive sputum samples (spot, morning, and spot) were collected from each patient and transported to the Armauer Hansen Research Institute TB laboratory in Addis Ababa for culture on Lowenstein–Jensen slant media. DST was performed on 357 (87.5%) of the patient samples for isoniazid (H), rifampicin (R), ethambutol (E), and streptomycin (S) using the standard proportion method. The rate of resistance to any one drug was 23%. Any resistance to H, S, R, and E was 14%, 11.5%, 2.8%, and 0.3%, respectively. The highest proportion of monoresistance was observed against H (9.5%). MDR-TB was detected in 1.1% of the patients. Any drug resistance was associated with HIV infection (COR = 3.7, 95% CI 1.905–7.222; P < 0.001). Although the prevalence of MDR-TB is relatively low in the study area, the high prevalence of H resistance is a serious concern demanding close monitoring. Expanding diagnostic capacity for mycobacterial culture and DST is a vital step in this regard.
Regulatory T (Treg) cells are known for their role in maintaining self-tolerance and balancing immune reactions in autoimmune diseases and chronic infections. However, regulatory mechanisms can also lead to prolonged survival of pathogens in chronic infections like leprosy and tuberculosis (TB). Despite high humoral responses against Mycobacterium leprae (M. leprae), lepromatous leprosy (LL) patients have the characteristic inability to generate T helper 1 (Th1) responses against the bacterium. In this study, we investigated the unresponsiveness to M. leprae in peripheral blood mononuclear cells (PBMC) of LL patients by analysis of IFN-γ responses to M. leprae before and after depletion of CD25+ cells, by cell subset analysis of PBMC and by immunohistochemistry of patients' skin lesions. Depletion of CD25+ cells from total PBMC identified two groups of LL patients: 7/18 (38.8%) gained in vitro responsiveness towards M. leprae after depletion of CD25+ cells, which was reversed to M. leprae-specific T-cell unresponsiveness by addition of autologous CD25+ cells. In contrast, 11/18 (61.1%) remained anergic in the absence of CD25+ T-cells. For both groups, however, mitogen-induced IFN-γ responses were not affected by depletion of CD25+ cells. In M. leprae-responsive healthy controls, treated lepromatous leprosy (LL) and borderline tuberculoid leprosy (BT) patients, depletion of CD25+ cells only slightly increased the IFN-γ response. Furthermore, cell subset analysis showed significantly higher (p = 0.02) numbers of FoxP3+ CD8+CD25+ T-cells in LL compared to BT patients, whereas confocal microscopy of skin biopsies revealed increased numbers of CD68+CD163+ as well as FoxP3+ cells in lesions of LL compared to tuberculoid and borderline tuberculoid leprosy (TT/BT) lesions. Thus, these data show that CD25+ Treg cells play a role in M. leprae-Th1 unresponsiveness in LL.
Leprosy is a curable infectious disease caused by Mycobacterium leprae (M. leprae) that affects the skin and peripheral nerves. It is manifested in different forms ranging from self-healing, tuberculoid leprosy (TT) with low bacillary load and high cellular immunity against M. leprae, to lepromatous leprosy (LL) with high bacillary load and high antibody titers to M. leprae antigens. However, LL patients have poor cell mediated response against M. leprae leading to delayed clearance of the bacilli. A possible explanation for this bacterial persistence could lie in the presence of more regulatory cells at infection sites and in peripheral blood. This study shows the recovery of the cell mediated response by depletion of CD25+ cells in a subset of LL patients, while another patient subset was not affected similarly. Moreover, an increased frequency of FoxP3+ T cells together with anti-inflammatory macrophages was observed in LL patients' skin biopsies. Thus, these data show that CD25+ Treg cells play a role in M. leprae-unresponsiveness in leprosy patients.
The ongoing scale-up of antiretroviral therapy (ART) in sub-Saharan Africa has prompted interest in surveillance of transmitted and acquired HIV drug resistance. Resistance data on virological failure and mutations in HIV-infected populations initiating treatment in sub-Saharan Africa are sparse.
HIV viral load (VL) and resistance mutations pre-ART and after 6 months were determined in a prospective cohort study of ART-naïve HIV patients initiating first-line therapy in Jimma, Ethiopia. VL measurements were done at baseline and after 3 and 6 months. Genotypic HIV drug resistance (HIVDR) was performed on patients exhibiting virological failure (>1000 copies/mL at 6 months) or slow virological response (>5000 copies/mL at 3 months and <1000 copies/mL at 6 months).
A total of 265 patients had VL data available at baseline and at 6 months. Virological failure was observed in 14 of the 265 patients (5.3%). Twelve samples were genotyped, and six had HIV drug resistance (HIVDR) mutations at baseline. Among virological failures, 9/11 (81.8%) harbored one or more HIVDR mutations at 6 months. The most frequent mutations were K103N and M184VI.
Our data confirm that the currently recommended first-line ART regimen is efficient in the vast majority of individuals initiating therapy in Jimma, Ethiopia eight years after the introduction of ART. However, the documented occurrence of transmitted resistance and accumulation of acquired HIVDR mutations among failing patients justify increased vigilance by improving the availability and systematic use of VL testing to monitor ART response, and underlines the need for rapid, inexpensive tests to identify the most common drug resistance mutations.
Tuberculosis caused 20% of all human deaths in the Western world between the 17th and 19th centuries, and remains a cause of high mortality in developing countries. In analogy to other crowd diseases, the origin of human tuberculosis has been associated with the Neolithic Demographic Transition, but recent studies point to a much earlier origin. Here we used 259 whole-genome sequences to reconstruct the evolutionary history of the Mycobacterium tuberculosis complex (MTBC). Coalescent analyses indicate that the MTBC emerged about 70,000 years ago, accompanied migrations of anatomically modern humans out of Africa, and expanded as a consequence of increases in human population density during the Neolithic period. This long co-evolutionary history is consistent with the MTBC displaying characteristics indicative of adaptation to both low and high host densities.
As a result of extensive chloroquine resistance (CQR) in Plasmodium falciparum in the late 1990s, Ethiopia replaced CQ with sulphadoxine-pyrimethamine (SP) as the first-line drug, which was in turn replaced by artemisinin combination therapy in 2004. Plasmodium falciparum resistance to CQ is determined by the K76T mutation of the P. falciparum chloroquine resistance transporter (pfcrt) gene. Understanding diversity in the P. falciparum genome is crucial, since it has the potential to influence important phenotypes of the parasite such as drug resistance. Limited data are available regarding the pfcrt mutant allelic types, the effect of CQ withdrawal and the diversity of the parasite population in south-central Oromia, Ethiopia.
Finger-prick blood samples spotted on Whatman 3MM filter paper were collected from falciparum malaria patients. Parasite DNA was extracted from the individual blood spots. The presence of the K76T mutation was determined by nested PCR for all isolates. Complete sequencing of pfcrt codons 72-76 was done for a set of randomly selected resistant isolates. Four microsatellite (MS) markers were analysed to determine heterozygosity.
Although CQ had been withdrawn for more than a decade, 100% of the parasites still carried the pfcrt K76T mutation. Based on combinations of MS markers, seven different Ethiopian CQR variants (E1-E7) were identified. Heterozygosity (He) for MS flanking the pfcrt chloroquine resistance allele ranged from 0.00 (mscrt -29, -29.268 kb) to 0.21 (mscrt -2, -2.814 kb), and from 0.00 (msint 3, 0 kb) to 0.19 (msint 2, 0 kb) for MS within the pfcrt gene. Both intronic MS and MS flanking the pfcrt gene showed low levels of diversity.
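The He values reported here are conventionally computed as unbiased expected heterozygosity from allele frequencies at each marker. A minimal sketch, using hypothetical allele counts (not study data):

```python
def expected_heterozygosity(allele_counts):
    """Unbiased expected heterozygosity for one locus:
    He = n/(n-1) * (1 - sum(p_i^2)), where p_i are allele frequencies
    and n is the number of sampled alleles (Nei's estimator)."""
    n = sum(allele_counts)
    p_sq = sum((c / n) ** 2 for c in allele_counts)
    return (n / (n - 1)) * (1 - p_sq)

# Hypothetical marker: 9 of 10 isolates share one allele, 1 carries another
print(round(expected_heterozygosity([9, 1]), 2))  # 0.2

# A monomorphic marker (all isolates identical) gives He = 0.0,
# matching the 0.00 values reported for mscrt -29 and msint 3
print(expected_heterozygosity([20]))  # 0.0
```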
The pfcrt CQR allele appears to be fixed in the study area. Of the different haplotypes associated with CQR, only the CVIET genotype was identified. No reversal to the wild type has occurred in Ethiopia, unlike in many African countries where CQR parasites declined after cessation of CQ use. Decreased diversity surrounding pfcrt in CQR isolates suggests CQ selection and homogenization of the CQR parasite population. While the mutations in msint 3 and mscrt -29 of the mutant pfcrt allele appear fixed, the mutations in msint 2 and mscrt -2 are still evolving and may indicate the start of re-diversification of the population from a fixed K76T background.
pfcrt; Wild-type; Drug resistance; Chloroquine; Plasmodium falciparum; Mutations; Heterozygosity; Microsatellite; Ethiopia
The immunologic environment during HIV/M. tuberculosis co-infection is characterized by cytokine and chemokine irregularities that have been shown to increase immune activation, viral replication, and T cell dysfunction.
We analysed ex vivo plasma samples from 17 HIV-negative and 16 HIV-co-infected pulmonary tuberculosis cases using a Luminex assay to assess the impact of HIV co-infection on plasma levels of cytokines and chemokines in pulmonary tuberculosis patients before and after anti-tuberculosis treatment.
The median plasma levels of IFN-γ, IL-4, MCP-3, MIP-1β and IP-10 were significantly different (P < 0.05) before and after treatment in HIV-negative TB patients but not in HIV-positive TB patients. There was no significant difference between HIV-positive and HIV-negative TB patients (P > 0.05) in the plasma level of any of the cytokines or chemokines before treatment, and anti-TB treatment did not change the level of any of the measured cytokines in HIV-positive TB patients. The ratios of IFN-γ/IL-10 and IFN-γ/IL-4 increased significantly after treatment in HIV-negative TB cases but not in HIV-positive TB cases, which might indicate prolonged impairment of the immune response to TB in HIV-positive patients as compared with HIV-negative patients.
HIV-positive and HIV-negative tuberculosis patients display similar plasma cytokine and chemokine patterns. However, anti-TB treatment significantly improves Th1 cytokine and chemokine levels in HIV-negative patients but does not restore the immune response in HIV-positive individuals.
Pulmonary tuberculosis; HIV; Cytokines and chemokines
With 75% of the Ethiopian population at risk of malaria, accurate diagnosis is crucial for malaria treatment in endemic areas where Plasmodium falciparum and Plasmodium vivax co-exist. The present study evaluated the performance of regular microscopy in accurate identification of Plasmodium spp. in febrile patients visiting health facilities in southern Ethiopia.
A cross-sectional study design was employed to recruit study subjects who were microscopically positive for malaria parasites and attending health facilities in southern Ethiopia between August and December 2011. Of the 1,416 febrile patients attending primary health facilities, 314 febrile patients, whose slides were positive for P. falciparum, P. vivax or mixed infections using microscopy, were re-evaluated for their infection status by PCR. Finger-prick blood samples were used for parasite genomic DNA extraction. Phylogenetic analyses were performed to reconstruct the distribution of different Plasmodium spp. across the three geographical areas.
Of the 314 patients with a positive thick blood smear, seven (2%) were negative for any Plasmodium spp. by nested PCR. Among 180 microscopically diagnosed P. falciparum cases, PCR confirmed 111 (61.7%) as P. falciparum, 44 (24.4%) as P. vivax, 18 (10%) as mixed P. falciparum/P. vivax infections and two (1.1%) as mixed P. falciparum/P. malariae infections, while five (2.8%) were negative for any Plasmodium spp. Of 131 microscopically diagnosed P. vivax cases, 110 (84%) were confirmed as P. vivax, 14 (10.7%) as P. falciparum, two (1.5%) as P. malariae and three (2.3%) as mixed P. falciparum/P. vivax infections, while two (1.5%) were negative for any Plasmodium spp. Plasmodium falciparum and P. vivax mixed infections were observed, and P. malariae was detected as mono- and mixed infections in four individuals.
False positivity, under-reporting of mixed infections and a substantial number of species mismatches need attention, and diagnostic practice should be improved accordingly. The identification of these false-positive and misclassified results by molecular methods can provide an accurate picture of the Plasmodium species circulating in the region, with important repercussions for understanding malaria epidemiology and for subsequent control.
Malaria; Plasmodium; Nested PCR; Microscopy; Ethiopia
There are limited data on clinical outcomes of ART-experienced patients with cryptococcal antigenemia. We assessed clinical outcomes of a predominantly asymptomatic, ART-experienced cohort of HIV+ patients previously found to have a high (8.4%) prevalence of cryptococcal antigenemia.
The study took place at All Africa Leprosy, Tuberculosis and Rehabilitative Training Centre and Black Lion Hospital HIV Clinics in Addis Ababa, Ethiopia. A retrospective study design was used to perform 12-month follow-up of 367 mostly asymptomatic HIV-infected patients (CD4<200 cells/µl) with high levels of antiretroviral therapy use (74%) who were previously screened for cryptococcal antigenemia. Medical chart abstraction was performed approximately one year after initial screening to obtain data on clinic visit history, ART use, CD4 count, opportunistic infections, and patient outcome. We evaluated the association of cryptococcal antigenemia and a composite poor outcome of death and loss to follow-up using logistic regression.
Overall, 323 (88%) patients were alive, 8 (2%) had died and 36 (10%) were lost to follow-up. Among the 31 patients with a positive cryptococcal antigen test (titers ≥1:8) at baseline, 28 were alive (all titers ≤1:512), 1 had died and 2 were lost to follow-up (titers ≥1:1024). In multivariate analysis, cryptococcal antigenemia was not predictive of a poor outcome (aOR = 1.3, 95% CI 0.3–4.8). A baseline CD4 count <100 cells/µl was associated with an increased risk of a poor outcome (aOR 3.0, 95% CI 1.4–6.7), while an increasing CD4 count (aOR 0.1, 95% CI 0.1–0.3) and receiving antiretroviral therapy at the last follow-up visit (aOR 0.1, 95% CI 0.02–0.2) were associated with a reduced risk of a poor outcome.
Unlike prior ART-naïve cohorts, we found that among persons receiving ART and with CD4 counts <200 cells/µl, asymptomatic cryptococcal antigenemia was not predictive of a poor outcome.
To assess the seroprevalence of Brucella and Coxiella burnetii in pastoral livestock in southeast Ethiopia, a cross-sectional study was carried out in three livestock species (cattle, camels and goats). The study was conducted from July 2008 to August 2010 and included eight pastoral associations (PAs) from the selected districts. Sera from a total of 1,830 animals, comprising 862 cattle, 458 camels and 510 goats, were screened initially with the Rose Bengal plate test (RBPT) for Brucella. All RBPT-positive sera and 25% of randomly selected negative sera, a total of 460 animals (211 cattle, 102 camels and 147 goats), were further tested by ELISA. Of the 1,830 animals, 20% were randomly selected (180 cattle, 90 camels and 98 goats) and tested for C. burnetii using ELISA. The seroprevalence of Brucella was 1.4% (95% confidence interval (CI), 0.8-2.6), 0.9% (95% CI, 0.3-2.7) and 9.6% (95% CI, 5.2-17.1) in cattle, camels and goats, respectively. Goats (OR=7.3, 95% CI, 2.8-19.1) and older animals (OR=1.7, 95% CI, 0.9-2.9) were at higher risk of infection. Of 98 RBPT-negative camel sera, 12.0% were positive by ELISA. The seroprevalences of C. burnetii were 31.6% (95% CI, 24.7-39.5), 90.0% (95% CI, 81.8-94.7) and 54.2% (95% CI, 46.1-62.1) in cattle, camels and goats, respectively. Animals positive for C. burnetii were found in all tested PAs for all species. Being a camel (OR=19.0, 95% CI, 8.9-41.2) and being older (OR=3.6, 95% CI, 2.0-6.6) were risk factors for infection. The high seropositivity of C. burnetii in all livestock species tested and the higher seropositivity for Brucella in goats imply risks of human infection with both diseases and merit further study of both diseases in animals and humans in the area.
Brucellosis; Q-fever; Seroprevalence; Pastoral livestock; Southeast Ethiopia
Transmission of the Mycobacterium tuberculosis (M. tuberculosis) complex may occur between farmers and their cattle in Ethiopia.
A study was conducted in a mixed-type, multi-purpose cattle-raising region of Ethiopia on 287 households (146 households with a case of pulmonary tuberculosis (TB) and 141 free of TB) and on 287 herds comprising 2,033 cattle belonging to these households, to evaluate transmission of TB between cattle and farmers. Interviews, bacteriological examination and molecular typing were used for human subjects, while the comparative intradermal tuberculin (CIDT) test, post-mortem and bacteriological examinations, and molecular typing were used for the animal studies. Herd prevalence of CIDT reactors was 9.4% and was higher (p<0.01) in herds owned by households with a TB case than in herds owned by TB-free households. Animal prevalence was 1.8% and was likewise higher (p<0.01) in cattle owned by households with a TB case. All 141 mycobacterial isolates from farmers were M. tuberculosis, whereas only five of the 16 isolates from cattle were members of the M. tuberculosis complex (MTC); the remaining 11 were non-tuberculosis mycobacteria (NTM). Further speciation of the five MTC isolates showed that three were M. bovis (strain SB1176) and two were M. tuberculosis strains (SIT149 and SIT53). Applying the pathology scoring method described by Vordermeier et al. (2002), the average severity of pathology was 5.5 in the two cattle infected with M. bovis, 2.1 in the 11 infected with NTM and 0.5 in the two infected with M. tuberculosis.
The results showed that airborne transmission of TB from farmers to cattle sensitizes the cattle but rarely leads to TB disease. Similarly, transmission of M. bovis between farmers and their cattle was low, suggesting that such transmission requires ingestion of contaminated milk from cows with tuberculous mastitis.
Prompt and effective malaria diagnosis not only alleviates individual suffering, but also decreases malaria transmission at the community level. The commonly used diagnostic methods, microscopy and rapid diagnostic tests, are usually insensitive at very low-density parasitaemia. Molecular techniques, on the other hand, allow the detection of low-level, sub-microscopic parasitaemia. This study aimed to explore the presence of sub-microscopic Plasmodium falciparum infections using polymerase chain reaction (PCR). The PCR-based parasite prevalence was compared against microscopy and rapid diagnostic test (RDT).
This study used 1,453 blood samples collected from clinical patients and sub-clinical subjects to determine the prevalence of sub-microscopic P. falciparum carriage. Subsets of RDT- and microscopy-negative blood samples were tested by PCR, while all RDT- and microscopically confirmed P. falciparum-infected samples were subjected to PCR. Finger-prick blood samples spotted on filter paper were used for parasite genomic DNA extraction.
The prevalence of sub-microscopic P. falciparum carriage was 19.2% (77/400) (95% CI = 15.4–23.1). Microscopy-based prevalence of P. falciparum infection was 3.7% (54/1,453), while the prevalence using RDT alone was 6.9% (100/1,453). Combining microscopy and PCR, the estimated parasite prevalence would have been 20.6% had PCR been performed on all 1,453 blood samples; combining RDT and PCR, the estimate was 22.7%. Of 54 microscopically confirmed P. falciparum-infected subjects, PCR detected 90.7% (49/54); of 100 RDT-confirmed P. falciparum infections, PCR detected 80.0% (80/100). The sensitivity of PCR relative to microscopy and RDT was therefore 90.7% and 80.0%, respectively, while the sensitivity of microscopy and RDT relative to PCR was 16.5% (49/299) and 24.2% (80/330), respectively. The overall PCR-based prevalence of P. falciparum infection was 5.6- and 3.3-fold higher than that determined by microscopy and RDT, respectively. None of the sub-microscopic subjects had severe anaemia, though 29.4% had mild anaemia (haemoglobin 10–11.9 g/dl).
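The relative-sensitivity figures above are simple proportions of reference-positive samples detected by the other test; the reported arithmetic can be checked directly:

```python
def sensitivity(detected, reference_positive):
    """Sensitivity of one test relative to a reference test,
    expressed as a percentage of reference positives detected."""
    return 100 * detected / reference_positive

# Figures reported in the study: PCR detected 49 of 54 microscopy-confirmed
# and 80 of 100 RDT-confirmed P. falciparum infections
print(round(sensitivity(49, 54), 1))   # 90.7  (PCR vs microscopy)
print(round(sensitivity(80, 100), 1))  # 80.0  (PCR vs RDT)
print(round(sensitivity(49, 299), 1))  # 16.4  (microscopy vs PCR)
```

Note that 49/299 evaluates to 16.4%, marginally different from the 16.5% quoted in the text.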
Asymptomatic, low-density malaria infection was common in the study area, and PCR may be a better tool for measuring Plasmodium prevalence than microscopy and RDT. The inadequate sensitivity of these diagnostic methods to detect a substantial number of sub-microscopic parasitaemias would undoubtedly affect malaria control efforts, making reduction of transmission more difficult. RDT- and microscopy-based prevalence studies, and subsequent reports of reductions in malaria incidence, underestimate the true picture of P. falciparum infection in the community. PCR, on the other hand, seems to have reasonable sensitivity to detect a higher number of infected subjects with low and sub-microscopic parasite densities than RDT or microscopy.
Sub-microscopic carriage; Asymptomatic malaria; Microscopy; RDT; PCR; Ethiopia
Population-based prevalence surveys are an important epidemiological tool to measure the burden of tuberculosis (TB) disease and to monitor progress towards TB control in high-burden countries like Ethiopia. This study aimed to estimate the prevalence of bacteriologically confirmed pulmonary tuberculosis (PTB) in the Tigray region of Ethiopia.
Sixteen rural and urban villages were randomly selected in a stratified multistage cluster sampling. Individuals aged 15 years and older were screened by symptom inquiry for PTB. Those individuals who were symptomatic of PTB provided two sputum samples for smear microscopy, culture and molecular typing.
The study covered 4,765 households and screened a total of 12,175 individuals aged 15 years and above. The overall weighted prevalence of bacteriologically confirmed PTB in the Tigray region of Ethiopia was 216/100,000 (95% CI: 202.08, 230.76), while the weighted prevalence of smear-positive PTB was 169/100,000 (95% CI: 155.53, 181.60). The prevalence of bacteriologically confirmed TB was higher among males (352/100,000; 95% CI: 339.05, 364.52) than females (162/100,000; 95% CI: 153.60, 171.17), and among rural (222/100,000; 95% CI: 212.77, 231.53) as compared with urban residents (193/100,000; 95% CI: 183.39, 203.59).
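For intuition only, a crude prevalence per 100,000 with a normal-approximation 95% CI can be sketched as below. The study's figures are design-weighted for the stratified multistage cluster sample, so this unweighted calculation will not reproduce them, and the case count used here is hypothetical:

```python
import math

def prevalence_ci(cases, n, z=1.96):
    """Crude prevalence per 100,000 with a normal-approximation CI.
    Unweighted sketch; ignores the survey's cluster design weights."""
    p = cases / n
    half = z * math.sqrt(p * (1 - p) / n)
    low, high = max(p - half, 0.0), p + half
    return p * 1e5, low * 1e5, high * 1e5

# Hypothetical: 26 confirmed cases among the 12,175 individuals screened
# (chosen to give a crude rate near the reported 216/100,000)
rate, low, high = prevalence_ci(26, 12175)
print(round(rate), round(low), round(high))
```

The crude CI here is much wider than the reported one, illustrating how design weighting and stratification change interval estimates in cluster surveys.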
This study found a relatively higher prevalence of smear-positive PTB in the region than in a nationwide survey conducted over the same period, and identified a significant number of previously undetected PTB cases. The urgency of improved TB case detection and intensified community awareness is emphasized.
Bacteriologically confirmed; Cross-sectional; Pulmonary tuberculosis; Tigray region; Ethiopia
Paediatric tuberculosis (TB) is poorly addressed in Ethiopia, and information about its magnitude and the genotype distribution of the causative Mycobacterium tuberculosis strains responsible for its spread is scanty.
Gastric lavage or sputum samples were collected from consecutively enrolled TB-suspect children visiting Jimma University Hospital in 2011 and cultured on Middlebrook 7H11 and Löwenstein-Jensen media. Acid-fast bacterial (AFB) isolates were subjected to molecular typing targeting regions of difference (RDs), the 16S rDNA gene and the direct repeat (DR) region using multiplex polymerase chain reaction (mPCR), gene sequencing and spoligotyping, respectively. Molecular drug susceptibility testing of M. tuberculosis isolates was performed by Genotype®MTBDRplus line probe assay (LPA) (Hain Life Sciences, Germany).
Gastric lavage (n = 43) or sputum (n = 58) samples were collected from 101 children, and 31.7% (32/101) of the samples were positive for AFB by microscopy, culture and/or PCR. Of 25 AFB isolates, 60% (15/25) were identified as M. tuberculosis by PCR, and 40% (10/25) were confirmed to be non-tuberculous mycobacteria (NTM) by genus typing and 16S rDNA gene sequencing. Lineage classification assigned the M. tuberculosis strains to the Euro-American (EUA; 66.7%, 10/15), East-African-Indian (EAI; 2/15), East-Asian (EA; 1/15) and Indo-Oceanic (IO; 1/15) lineages. Seven M. tuberculosis strains were new to the SpolDB4 database. All of the M. tuberculosis isolates were susceptible to isoniazid (INH) and rifampicin (RIF), except for one strain (of spoligotype SIT-149 or T3_ETH family) with a mutation at the inhA locus, which often confers low-level resistance to INH and resistance to ethionamide.
Analysis of the genetic population structure of paediatric M. tuberculosis strains suggested similarity with that of adults, indicating ongoing active transmission of M. tuberculosis from adults to children in Ethiopia. No multidrug-resistant TB (MDR-TB) strains were found among the isolates.
Paediatric TB; Mycobacterium; Paediatric spoligotype; Gastric lavage