Microbubbles have shown potential as intralymphatic ultrasound contrast agents, while nanoparticle-loaded microbubbles are increasingly investigated for ultrasound-triggered drug and gene delivery. To explore whether mRNA-nanoparticle loaded microbubbles could serve as theranostics for detection of, and mRNA transfer to, the lymph nodes, we investigated the behavior of unloaded and mRNA-loaded microbubbles using contrast-enhanced ultrasound imaging after subcutaneous injection in dogs. Our results indicate that both types of microbubbles are equally capable of rapidly entering the lymph vessels and nodes upon injection, and that novel, valuable and detailed information on the lymphatic structure of the animals could be obtained. Furthermore, additional observations were made regarding the dynamics of microbubble lymph node uptake. Importantly, neither the microbubble migration distance within the lymphatics nor the observed contrast signal intensity was influenced by mRNA loading. Although further optimization of acoustic parameters will be needed, this could represent a first step towards ultrasound-guided, ultrasound-triggered intranodal mRNA delivery using these theranostic microbubbles.
Contrast-enhanced ultrasound imaging; mRNA; microbubbles; mRNA-loaded microbubbles; dogs; lymph nodes.
The presence of poor-quality medicines on the market is a global threat to public health, especially in developing countries. Therefore, we assessed the quality of two commonly used anthelminthic drugs [mebendazole (MEB) and albendazole (ALB)] and one antiprotozoal drug [tinidazole (TNZ)] in Ethiopia.
A multilevel stratified random sampling, with as strata the different levels of the supply chain system in Ethiopia, geographic areas and government/privately owned medicines outlets, was used to collect the drug samples using mystery shoppers. The three drugs (106 samples) were collected from 38 drug outlets (government/privately owned) in 7 major cities in Ethiopia between January and March 2012. All samples underwent visual and physical inspection of labeling and packaging before physico-chemical quality testing, and were evaluated based on the individual monographs in pharmacopoeias for identification, assay/content, dosage uniformity, dissolution, disintegration and friability. In addition, quality risk was analyzed using failure mode effect analysis (FMEA) and a risk priority number (RPN) was assigned to each quality attribute. A clinically rationalized desirability function was applied to quantify the overall quality of each medicine. Overall, 45.3% (48/106) of the tested samples were substandard, i.e. not meeting the pharmacopoeial quality specifications claimed by their manufacturers. Assay was the quality attribute most often out-of-specification, with 29.2% (31/106) of the total samples failing. The highest failure rate was observed for MEB (19/42, 45.2%), followed by TNZ (10/39, 25.6%) and ALB (2/25, 8.0%). The risk analysis showed that assay (RPN = 512) is the most critical quality attribute, followed by dissolution (RPN = 336). Based on Derringer's desirability function, samples were classified into excellent (14/106, 13%), good (24/106, 23%), acceptable (38/106, 36%), low (29/106, 27%) and bad (1/106, 1%) quality.
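For context, the RPN in FMEA is conventionally the product of severity, occurrence and detectability scores, and Derringer's overall desirability is the geometric mean of the individual desirabilities. A minimal sketch of both conventions (the exact scoring scales and desirability transformations used in the study are assumptions here):

```python
import math

def rpn(severity, occurrence, detectability):
    """FMEA risk priority number: the product of the three scores
    (commonly each rated on a 1-10 scale)."""
    return severity * occurrence * detectability

def overall_desirability(d_values):
    """Derringer's overall desirability D: the geometric mean of the
    individual desirabilities d_i, each scaled to [0, 1]."""
    return math.prod(d_values) ** (1.0 / len(d_values))

# e.g. an attribute scored 8 for severity, occurrence and detectability
# yields RPN = 512, the value reported for assay in the study
print(rpn(8, 8, 8))  # 512
```

A single low individual desirability pulls the geometric mean down sharply, which is why this aggregation is preferred over an arithmetic mean for quality classification.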
This study evidenced a relatively high prevalence of poor-quality MEB, ALB and TNZ in Ethiopia: up to 45% if pharmacopoeial acceptance criteria are used in the traditional, dichotomous approach, and 28% if the new risk-based desirability approach is applied. The study identified assay as the most critical quality attribute. The country of origin was the most significant factor determining the poor quality status of the investigated medicines in Ethiopia.
Access to medicines of good quality improves the chances of successful treatment for individual patients and promotes better outcomes for public health in general. At present, the prevailing strategy for improving access to medicines for neglected tropical diseases (NTDs) is drug donation programs. However, the presence of poor-quality medicines on the market is a global threat to public health, especially in developing countries, critically undermining efforts to treat and control diseases in general and the NTDs in particular. Conventionally, medicine quality has been ignored in NTDs, though scattered reports show that serious problems exist. Therefore, we assessed the quality of two commonly used anthelminthic drugs (MEB and ALB) and one antiprotozoal drug (TNZ) in Ethiopia. The analytical results were converted into conclusions using two systems: the traditional dichotomous pharmacopoeial specification-compliance approach and the risk-based Taguchi quantitative desirability approach. Overall, the results showed a high prevalence of poor quality for the three medicines, mainly determined by the country of origin. We conclude that risk-based regulatory quality control procedures should be based on identification of the most critical quality attribute and should apply desirability functions to quantify and classify the quality of medicines.
To support public health policy, information on the burden of disease is essential. In recent years, the Disability-Adjusted Life Year (DALY) has emerged as the most important summary measure of public health. DALYs quantify the number of healthy life years lost due to morbidity and mortality, and thereby facilitate the comparison of the relative impact of diseases and risk factors and the monitoring of public health over time.
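The DALY is computed as the sum of years of life lost to premature mortality (YLL) and years lived with disability (YLD). A minimal sketch of the standard undiscounted, unweighted formulas; the numbers in the example are purely hypothetical:

```python
def yll(deaths, residual_life_expectancy):
    """Years of life lost: deaths x standard life expectancy at age of death."""
    return deaths * residual_life_expectancy

def yld(cases, disability_weight, duration_years):
    """Years lived with disability: cases x disability weight x duration."""
    return cases * disability_weight * duration_years

def daly(yll_total, yld_total):
    """DALY = YLL + YLD (no discounting or age weighting applied)."""
    return yll_total + yld_total

# hypothetical: 100 deaths losing 70 life-years each, plus 1000 cases
# with disability weight 0.2 lasting 5 years
print(daly(yll(100, 70), yld(1000, 0.2, 5)))  # 8000.0
```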
Evidence on the disease burden in Belgium, expressed as DALYs, is available from international and national efforts. Non-communicable diseases and injuries dominate the overall disease burden, while dietary risks, tobacco smoking, and high body-mass index are the major risk factors for ill health. Notwithstanding these efforts, if DALYs are to be used for guiding health policy, a more systematic approach is required. By integrating DALYs into the current data-generating systems, comparable estimates, rooted in recent local data, can be produced. This may, however, be hampered by several restrictions, such as the limited harmonization, timeliness, inclusiveness and accessibility of current databases.
Routine quantification of disease burden in terms of DALYs would provide a significant added value to evidence-based public health policy in Belgium, although some hurdles need to be cleared.
Belgium; Disease burden; Disability-adjusted life years; Health policy
Gastrointestinal nematodes are an important cause of reduced performance in cattle. Previous studies in Europe showed that after anthelmintic treatment an average gain in milk production of around 1 kg per cow per day can be expected. However, (1) these studies have mainly evaluated group-based anthelmintic treatments during the grazing season or at housing and (2) little is known about parameters affecting variations in the treatment response amongst cows. A better knowledge of such parameters could help to select the animals that benefit most from treatment and thus lead to a more rational use of anthelmintics. Therefore, a randomized, non-blinded, controlled clinical trial was performed on 11 commercial dairy farms (477 animals) in Belgium, aiming (1) to study the effect of eprinomectin treatment at calving on milk production and (2) to investigate whether the milk yield response was related to non-invasive animal parameters, such that these could be used to inform targeted selective treatment decisions.
Analyses showed that eprinomectin treatment around calving resulted in an average (± standard error) increase of 0.97 (± 0.41) kg in daily milk yield over an average follow-up of 274 days. Milk yield responses were higher in multiparous than in primiparous cows and in cows with a high (4th quartile) anti-O. ostertagi antibody level in a milk sample from the previous lactation. Nonetheless, high responses were also seen in animals with a low (1st quartile) anti-O. ostertagi antibody level. In addition, positive treatment responses were associated with higher faecal egg counts and a moderate body condition score at calving (2nd quartile).
In conclusion, this study provides novel insights into the production response after anthelmintic treatment at calving and factors which influence this. The data could be used to support the development of evidence-based targeted selective anthelmintic treatment strategies in dairy cattle.
Dairy cattle; Gastrointestinal nematodes; Targeted selective treatment; Anti-O. ostertagi antibody level; Faecal egg counts; Eprinomectin
To gain insights into the working mechanism of morphine, regional cerebral blood flow (rCBF) patterns after morphine administration were assessed in dogs. In a randomized cross-over experimental study, rCBF was estimated with 99mTc-Ethylcysteinate Dimer single photon emission computed tomography in 8 dogs at baseline, at 30 minutes and at 120 minutes after a single bolus of morphine. Perfusion indices (PI) in the frontal, parietal, temporal and occipital cortex and in the subcortical and cerebellar region were calculated. PI was significantly decreased 30 min after morphine compared to baseline in the right frontal cortex. The left parietal cortex and subcortical region showed a significantly increased PI 30 min after morphine compared to baseline. No significant differences were noted for the other regions or at other time points. In conclusion, a single bolus of morphine produced region- and time-dependent changes in rCBF.
The mammary gland is an organ with a remarkable regenerative capacity that can undergo multiple cycles of proliferation, lactation, and involution. Growing evidence suggests that these changes are driven by the coordinated division and differentiation of mammary stem cell populations (MaSC). Whereas information regarding MaSC and their role in comparative mammary gland physiology is readily available for humans and mice, such information remains scarce for most veterinary mammal species, such as cows, horses, sheep, goats, pigs, and dogs. We believe that a better knowledge of MaSC in these species will not only help to gain more insight into mammary gland (patho)physiology in veterinary medicine, but will also be of value for human medicine. Therefore, this review summarizes the current knowledge on stem cell isolation and characterization in different mammals of veterinary importance.
Tsetse flies are obligate blood-feeding insects that transmit African trypanosomes responsible for human sleeping sickness and nagana in livestock. The tsetse salivary proteome contains a highly immunogenic family of the endonuclease-like Tsal proteins. In this study, a recombinant version of Tsal1 (rTsal1) was evaluated in an indirect ELISA to quantify the contact with total Glossina morsitans morsitans saliva, and thus the tsetse fly bite exposure.
Mice and pigs were experimentally subjected to different G. m. morsitans exposure regimens, followed by long-term monitoring of the specific antibody responses against total tsetse fly saliva and rTsal1. In mice, a single tsetse fly bite was sufficient to induce detectable IgG antibody responses with an estimated half-life of 36–40 days. Specific antibody responses could be detected for more than a year after initial exposure, and a single bite was sufficient to boost anti-saliva immunity. Also, plasmas collected from tsetse-exposed pigs displayed increased anti-rTsal1 and anti-saliva IgG levels that correlated with the exposure intensity. A strong correlation between the detection of anti-rTsal1 and anti-saliva responses was recorded. The ELISA test performance and intra-laboratory repeatability were adequate in the two tested animal models. Cross-reactivity of the mouse IgGs induced by exposure to different Glossina species (G. m. morsitans, G. pallidipes, G. palpalis gambiensis and G. fuscipes) and other hematophagous insects (Stomoxys calcitrans and Tabanus yao) was also evaluated.
This study illustrates the potential use of rTsal1 from G. m. morsitans as a sensitive biomarker of exposure to a broad range of Glossina species. We propose that the detection of anti-rTsal1 IgGs could be a promising serological indicator of tsetse fly presence that will be a valuable tool to monitor the impact of tsetse control efforts on the African continent.
Salivary proteins of hematophagous disease vectors represent potential biomarkers of exposure and could be used in serological assays that are complementary to entomological surveys. We illustrate that a recombinant version of the highly immunogenic Tsal1 protein of the savannah tsetse fly (Glossina morsitans morsitans) is a sensitive immunological probe to detect contact with tsetse flies. Experimental exposure of mice and pigs to different regimens of tsetse fly bites combined with serological testing revealed that rTsal1 is a sensitive indicator that can differentiate the various degrees of exposure of animals. Tsetse-induced antibodies persisted relatively long, and an efficient boosting of immunity was observed upon re-exposure. Recombinant Tsal1 is a promising candidate to detect contact with various tsetse species, which would enable screening of populations or herds for exposure to tsetse flies in various areas on the African continent. This exposure indicator could be a valuable tool to monitor the impact of vector control programs and to detect re-invasion of cleared areas by tsetse flies.
Indoor Residual Spraying (IRS) and Long-Lasting Insecticidal Nets (LLINs) are major malaria vector control tools in Ethiopia. However, recent reports from different parts of the country showed that populations of Anopheles arabiensis, the principal malaria vector, have developed resistance to most families of insecticides recommended for public health use, which may compromise the efficacy of both of these key vector control interventions. Thus, this study evaluated the efficacy of DDT IRS and LLINs against resistant populations of An. arabiensis using experimental huts in the Asendabo area, southwestern Ethiopia.
The susceptibility status of populations of An. arabiensis to DDT, deltamethrin, malathion, lambda-cyhalothrin, fenitrothion and bendiocarb was assessed using WHO test kits. The efficacy of the LLIN (PermaNet® 2.0) was evaluated using the WHO cone bioassay. Moreover, the effects of the observed resistance on both malaria vector control interventions (DDT IRS and LLINs) were assessed using experimental huts.
The findings of this study revealed that populations of An. arabiensis were resistant to DDT, deltamethrin, lambda-cyhalothrin and malathion, with mortality rates of 1.3%, 18.8%, 36.3% and 72.5%, respectively, but susceptible to fenitrothion and bendiocarb, with mortality rates of 98.81% and 97.5%, respectively. The bio-efficacy test of the LLIN (PermaNet® 2.0) against An. arabiensis revealed only moderate knockdown (64%) and mortality (78%). Moreover, mosquito mortalities in DDT-sprayed huts and in huts with LLINs were not significantly different (p > 0.05) from their respective controls.
The evaluation of the efficacy of DDT IRS and LLINs using experimental huts showed that both vector control tools had only low to moderate efficacy against An. arabiensis populations from Ethiopia. Despite DDT being replaced by carbamates for IRS, the low efficacy of LLINs against the resistant population of An. arabiensis is still a problem. Thus, there is a need for alternative vector control tools and implementation of appropriate insecticide resistance management strategies as part of integrated vector management by the national malaria control program.
Ethiopia; An. arabiensis; Insecticide resistance; Experimental huts; Long-lasting insecticide treated nets
The amount of trace elements present in edible bovine tissues is of importance for both animal health and human nutrition. This study presents data on trace element concentrations in semitendinosus and cardiac muscles, livers and kidneys of 60 zebu (Bos indicus) bulls, sampled at Jimma, Ethiopia. From 28 of these bulls, blood samples were also obtained. Deficient levels of copper were found in plasma, livers, kidneys and semitendinosus muscles. Suboptimal selenium concentrations were found in plasma and semitendinosus muscles. Semitendinosus muscles contained high iron concentrations. Trace elements were mainly stored in the liver, except for iron and selenium. Cardiac muscles generally contained higher concentrations of trace elements than semitendinosus muscles, except for zinc. A strong association was found between liver and kidney concentrations of copper, iron, cobalt and molybdenum. Liver storage correlated well with storage in semitendinosus muscle for selenium and with cardiac muscle for cobalt and selenium. Plasma concentrations of copper, selenium and cobalt were well related to their respective liver concentrations and, for cobalt and selenium, also to cardiac muscle concentrations. The data suggest multiple trace element deficiencies in zebu cattle in South-West Ethiopia, with lowered tissue concentrations as a consequence. Based on the comparison of our data with other literature, trace element concentrations in selected edible tissues of Bos indicus seem quite similar to those in Bos taurus. However, tissue threshold values for deficiency in Bos taurus cattle need to be refined and their applicability to Bos indicus cattle needs to be evaluated.
Degenerative joint disease (DJD) is a major cause of reduced athletic function and retirement in equine performers. For this reason, regenerative therapies for DJD have gained increasing interest. Platelet-rich plasma (PRP) and mesenchymal stem cells (MSCs) were isolated from a 6-year-old donor horse. MSCs were either used in their native state or after chondrogenic induction. In an initial study, 20 horses with naturally occurring DJD in the fetlock joint were divided into 4 groups and injected with the following: 1) PRP; 2) MSCs; 3) MSCs and PRP; or 4) chondrogenic induced MSCs and PRP. The horses were then evaluated by means of a clinical scoring system at 6 weeks (T1), 12 weeks (T2), 6 months (T3) and 12 months (T4) post injection. In a second study, 30 horses with the same medical background were randomly assigned to one of the two combination therapies and evaluated at T1. The protein expression profile of native MSCs was found to be negative for major histocompatibility complex (MHC) II and p63, low for MHC I and positive for Ki67, collagen type II (Col II) and vimentin. Chondrogenic induction resulted in increased mRNA expression of aggrecan, Col II and cartilage oligomeric matrix protein (COMP), as well as in increased protein expression of p63 and glycosaminoglycan, but in decreased protein expression of Ki67. The combined use of PRP and MSCs significantly improved the functionality and sustainability of damaged joints from 6 weeks until 12 months after treatment, compared to PRP treatment alone. The highest short-term clinical evolution scores were obtained with chondrogenic induced MSCs and PRP. This study reports successful in vitro chondrogenic induction of equine MSCs. In vivo application of (induced) MSCs together with PRP in horses suffering from DJD in the fetlock joint resulted in a significant clinical improvement until 12 months after treatment.
Parasitic zoonoses (PZs) pose a significant but often neglected threat to public health, especially in developing countries. In order to obtain a better understanding of their health impact, summary measures of population health may be calculated, such as the Disability-Adjusted Life Year (DALY). However, the data required to calculate such measures are often not readily available for these diseases, which may lead to a vicious circle of under-recognition and under-funding.
We examined the burden of PZs in Nepal through a systematic review of online and offline data sources. PZs were classified qualitatively according to endemicity, and where possible a quantitative burden assessment was conducted in terms of the annual number of incident cases, deaths and DALYs.
Between 2000 and 2012, the highest annual burden was imposed by neurocysticercosis and congenital toxoplasmosis (14,268 DALYs [95% Credibility Interval (CrI): 5450–27,694] and 9255 DALYs [95% CrI: 6135–13,292], respectively), followed by cystic echinococcosis (251 DALYs [95% CrI: 105–458]). Nepal is probably endemic for trichinellosis, toxocarosis, diphyllobothriosis, foodborne trematodosis, taeniosis, and zoonotic intestinal helminthic and protozoal infections, but insufficient data were available to quantify their health impact. Sporadic cases of alveolar echinococcosis, angiostrongylosis, capillariosis, dirofilariosis, gnathostomosis, sparganosis and cutaneous leishmaniosis may occur.
In settings with limited surveillance capacity, it is possible to quantify the health impact of PZs and other neglected diseases, thereby interrupting the vicious circle of neglect. In Nepal, we found that several PZs are endemic and are imposing a significant burden on public health, higher than that of malaria and comparable to that of HIV/AIDS. However, several critical data gaps remain. Enhanced surveillance for the endemic PZs identified in this study would enable additional burden estimates and a more complete picture of the impact of these diseases.
Various parasites that infect humans require animals in some stage of their life cycle. Infection with these so-called zoonotic parasites may vary from asymptomatic carriership to long-term morbidity and even death. Although data are still scarce, it is clear that parasitic zoonoses (PZs) present a significant burden for public health, particularly in poor and marginalized communities. So far, however, there has been relatively little attention to this group of diseases, causing various PZs to be labeled neglected tropical diseases. In this study, the authors reviewed a large variety of data sources to study the relevance and importance of PZs in Nepal. It was found that a large number of PZs are present in Nepal and are imposing an impact higher than that of malaria and comparable to that of HIV/AIDS. These results therefore suggest that PZs deserve greater attention and more intensive surveillance. Furthermore, this study has shown that even in settings with limited surveillance capacity, it is possible to quantify the impact of neglected diseases and, consequently, to break the vicious circle of neglect.
A fundamental understanding of the spatial distribution and ecology of mosquito larvae is essential for effective vector control intervention strategies. In this study, data-driven decision tree models, generalized linear models and ordination analysis were used to identify the most important biotic and abiotic factors that affect the occurrence and abundance of mosquito larvae in Southwest Ethiopia.
In total, 220 samples were taken at 180 sampling locations during the years 2010 and 2012. Sampling sites were characterized based on physical, chemical and biological attributes. The predictive performance of decision tree models was evaluated based on correctly classified instances (CCI), Cohen’s kappa statistic (κ) and the determination coefficient (R2). A conditional analysis was performed on the regression tree models to test the relation between key environmental and biological parameters and the abundance of mosquito larvae.
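The two reported performance measures are standard: CCI is simply the fraction of correct predictions, and Cohen's κ corrects that fraction for the agreement expected by chance. A minimal sketch with illustrative presence/absence labels (not the study's data):

```python
def cci(y_true, y_pred):
    """Correctly classified instances, as a fraction of all instances."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    labels = set(y_true) | set(y_pred)
    n = len(y_true)
    p_obs = cci(y_true, y_pred)
    # chance agreement from the marginal class frequencies
    p_exp = sum((y_true.count(c) / n) * (y_pred.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# illustrative presence (1) / absence (0) predictions
observed  = [1, 1, 1, 0, 0, 0]
predicted = [1, 1, 0, 0, 0, 0]
print(round(cci(observed, predicted), 3))           # 0.833
print(round(cohens_kappa(observed, predicted), 3))  # 0.667
```

Reporting κ alongside CCI guards against models that look accurate only because one class dominates the samples.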
The decision tree model developed for anopheline larvae showed a good model performance (CCI = 84 ± 2%, and κ = 0.66 ± 0.04), indicating that the genus has clear habitat requirements. Anopheline mosquito larvae showed a widespread distribution and especially occurred in small human-made aquatic habitats. Water temperature, canopy cover, emergent vegetation cover, and presence of predators and competitors were found to be the main variables determining the abundance and distribution of anopheline larvae. In contrast, anopheline mosquito larvae were found to be less prominently present in permanent larval habitats. This could be attributed to the high abundance and diversity of natural predators and competitors suppressing the mosquito population densities.
The findings of this study suggest that targeting smaller human-made aquatic habitats could result in effective larval control of anopheline mosquitoes in the study area. Controlling the occurrence of mosquito larvae via drainage of permanent wetlands may not be a good management strategy as it negatively affects the occurrence and abundance of mosquito predators and competitors and promotes an increase in anopheline population densities.
Decision trees; Generalized linear model; Macroinvertebrate predators; Mosquito control; Mosquito larvae
Diagnosing canine immune-mediated haemolytic anaemia (IMHA) is often challenging because all currently available tests have their limitations. Dogs with IMHA often have an increased erythrocyte osmotic fragility (OF), a characteristic that is sometimes used in the diagnosis of IMHA. Since the classic osmotic fragility test (COFT) is time-consuming and requires specialized equipment, an easy and less labour-intensive rapid osmotic fragility test (ROFT) has been used in some countries, but its diagnostic value has not yet been investigated.
This study aimed to evaluate erythrocyte osmotic fragility in dogs with and without IMHA, to compare results of the classic (COFT) and rapid (ROFT) tests, and to assess the value of the ROFT as a diagnostic test for canine IMHA.
Nineteen dogs with IMHA (group 1a), 21 anaemic dogs without IMHA (group 1b), 8 dogs with microcytosis (group 2), 13 hyperlipemic dogs (group 3), 10 dogs with lymphoma (group 4), 8 dogs with an infection (group 5) and 13 healthy dogs (group 6) were included.
In all dogs, blood smear examination, in-saline auto-agglutination test, Coombs’ test, COFT and ROFT were performed. In the COFT, OF5, OF50 and OF90 were defined as the NaCl concentrations at which respectively 5, 50 and 90% of erythrocytes were haemolysed.
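These OF values can be read off a measured hemolysis curve by linear interpolation between the bracketing NaCl concentrations. A minimal sketch with a hypothetical curve (the study's actual readout procedure is not detailed here):

```python
def of_point(nacl_pct, hemolysis_pct, target):
    """Linearly interpolate the NaCl concentration (%) at which `target` %
    of erythrocytes are haemolysed, from a measured hemolysis curve."""
    pairs = sorted(zip(nacl_pct, hemolysis_pct))  # ascending NaCl
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if min(y0, y1) <= target <= max(y0, y1) and y0 != y1:
            return x0 + (target - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("target hemolysis not bracketed by the curve")

# hypothetical hemolysis curve (NaCl % vs % haemolysed erythrocytes)
nacl = [0.1, 0.3, 0.45, 0.6, 0.9]
hemolysis = [100, 95, 50, 5, 0]
of50 = of_point(nacl, hemolysis, 50)  # ~0.45
of5 = of_point(nacl, hemolysis, 5)    # ~0.6
```

Note that OF5 corresponds to a higher NaCl concentration than OF50, since hemolysis begins as the medium first becomes hypotonic; an increased osmotic fragility shifts all three values upward.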
Compared with healthy dogs, OF5 and OF50 were significantly higher in group 1a (P < 0.001) and OF5 was significantly higher in group 3 (P = 0.0266). The ROFT was positive in 17 dogs with IMHA, 10 hyperlipemic dogs, one anaemic dog without IMHA and one healthy dog.
Osmotic fragility was increased in the majority of dogs with IMHA and in dogs with hyperlipidemia, but not in dogs with microcytosis, lymphoma or an infection. Although the COFT provided more detailed information about osmotic fragility, the COFT and ROFT gave similar results. The ROFT does not require specialized equipment, is rapid and easy to perform, and can easily be used in daily practice. Although the ROFT cannot replace other diagnostic tests, it may be a valuable additional tool to diagnose canine IMHA.
Osmotic fragility; Canine immune-mediated haemolytic anaemia; Hyperlipidemia
Artemisinin-based fixed dose combination (FDC) products are recommended by the World Health Organization (WHO) as a first-line malaria treatment. However, current artemisinin FDC products, such as β-artemether and lumefantrine, are inherently unstable and require controlled distribution and storage conditions, which are not always available in resource-limited settings. Moreover, quality control is hampered by a lack of suitable analytical methods. Thus, there is a need for a rapid and simple, yet stability-indicating, method for the simultaneous assay of β-artemether and lumefantrine in FDC products.
Three reversed-phase fused-core HPLC columns (Halo RP-Amide, Halo C18 and Halo Phenyl-hexyl), all thermostated at 30°C, were evaluated. β-artemether and lumefantrine (unstressed and stressed), and reference-related impurities were injected and chromatographic parameters were assessed. Optimal chromatographic performance was obtained using the Halo RP-Amide column and an isocratic mobile phase composed of acetonitrile and 1 mM phosphate buffer pH 3.0 (52:48, V/V) at a flow rate of 1.0 ml/min with a 3 μl injection volume. Quantification was performed at 210 nm for β-artemether and at 335 nm for lumefantrine. In-silico toxicological evaluation of the related impurities was made using Derek Nexus v2.0®.
Both β-artemether and lumefantrine were separated from each other as well as from the specified and unspecified related impurities, including degradants. A complete chromatographic run took only four minutes. Evaluation of the method, including a Plackett-Burman robustness verification within analytical QbD principles, and real-life samples showed the method is suitable for quantitative assay purposes of both active pharmaceutical ingredients, with a mean recovery (± relative standard deviation, RSD) of 99.7% (± 0.7%) for β-artemether and 99.7% (± 0.6%) for lumefantrine. All identified β-artemether-related impurities were predicted in Derek Nexus v2.0® to have toxicity risks similar to the β-artemether active pharmaceutical ingredient (API) itself.
A rapid, robust, precise and accurate stability-indicating, quantitative fused-core isocratic HPLC method was developed for simultaneous assay of β-artemether and lumefantrine. This method can be applied in the routine regulatory quality control of FDC products. The in-silico toxicological investigation using Derek Nexus® indicated that the overall toxicity risk for β-artemether-related impurities is comparable to that of β-artemether API.
Anti-malaria; β-artemether; Lumefantrine; Stability-indicating assay; HPLC-UV; Fused-core; Finished pharmaceutical product; Quality-by-design (QbD)
Little is known about the effects of common calf diseases on mortality and carcass traits in the white veal industry (special-fed veal), a highly integrated production system currently criticized for the intensive pro- and metaphylactic use of antimicrobials. The objective of the present study was to determine the impact of bovine respiratory disease (BRD), diarrhea, arthritis and otitis on the economically important parameters of mortality, hot carcass weight (HCW), carcass quality, fat cover and meat color. For this purpose, a prospective study on 3519 white veal calves, housed in 10 commercial herds, was conducted. Case definitions were based on clinical observation by the producers and written treatment records were used.
Calves received oral antimicrobial group treatments in the milk during 25.2% of the production time on average. With an increasing percentage of the production cycle spent on oral antimicrobials, HCW decreased, whereas the odds of insufficient fat cover or an undesirable red meat color both decreased. Of the calves, 14.8%, 5.3%, 1.5% and 1.6% were individually diagnosed and treated for BRD, diarrhea, arthritis and otitis, respectively. Overall, 5.7% of the calves died, and the mortality risk was higher in the first weeks after arrival. Calves that experienced one BRD episode showed an 8.2 kg reduction in HCW, a lower fat cover and an increased mortality risk (hazard ratio (HR) = 5.5), compared with calves that were not individually diagnosed and treated for BRD. With an increasing number of BRD episodes, these losses increased dramatically. Additionally, calves that experienced multiple BRD episodes were more likely to have poor carcass quality and an undesirable red meat color at slaughter. Arthritis increased the mortality risk (HR = 3.9) and reduced HCW only when associated with BRD. Otitis only increased the mortality risk (HR = 7.0). Diarrhea severely increased the mortality risk (HR = 11.0), reduced HCW by 9.2 kg on average and decreased carcass quality.
Despite the massive use of group and individual treatments to alleviate the most prevalent health issues at the fattening period, the effects of BRD, diarrhea, otitis and arthritis on survival and performance are still considerable, especially in cases of chronic pneumonia with or without arthritis. Controlling calf health by effective preventive and therapeutic strategies and in particular the prevention of chronic BRD is key for the profitability of veal operations.
Bovine respiratory disease; Carcass weight; Diarrhea; Economics; Mortality; Veal calves
Reservoirs created by damming rivers are often believed to increase the risk of malaria incidence and/or extend the period of malaria transmission. In this paper, we report the effects of a mega hydropower dam on Plasmodium falciparum malaria incidence in Ethiopia.
A longitudinal cohort study was conducted over a period of 2 years to determine Plasmodium falciparum malaria incidence among children less than 10 years of age living near a mega hydropower dam in Ethiopia. A total of 2080 children from 16 villages located at different distances from the hydropower dam were followed up from 2008 to 2010 using active case detection based on weekly house-to-house visits. Of this cohort, 951 (48.09%) children were female and 1059 (51.91%) were male, with a median age of 5 years. Malaria vectors were simultaneously surveyed in all 16 study villages. Frailty models were used to explore associations between time-to-malaria and potential risk factors, whereas mixed-effects Poisson regression models were used to assess the effect of different covariates on anopheline abundance.
Overall, 548 (26.86%) children experienced at least one clinical malaria episode during the follow-up period, with a mean incidence rate of 14.26 cases/1000 child-months at risk (95% CI: 12.16–16.36). P. falciparum malaria incidence showed no statistically significant association with distance from the dam reservoir (p = 0.32). However, P. falciparum incidence varied significantly between seasons (p < 0.01). The malaria vector, Anopheles arabiensis, was nevertheless more abundant in villages nearer to the dam reservoir.
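As an aside, a crude incidence rate and its normal-approximation confidence interval can be reproduced from a case count and total person-time. The sketch below is illustrative only: the child-month denominator is a back-calculated assumption, not the study's raw data, and the published interval is wider, presumably because it accounts for village clustering and repeated episodes.

```python
import math

def incidence_rate_ci(cases, person_time, per=1000, z=1.96):
    """Crude incidence rate per `per` units of person-time, with a
    normal-approximation 95% CI (Poisson variance = case count)."""
    rate = cases / person_time * per
    se = math.sqrt(cases) / person_time * per  # standard error of the rate
    return rate, rate - z * se, rate + z * se

# Hypothetical denominator for illustration (NOT the study's raw data):
# 548 cases over ~38,430 child-months reproduce a rate close to the
# reported 14.26 cases/1000 child-months at risk.
rate, lo, hi = incidence_rate_ci(548, 38430)
```

Under this assumed denominator, the crude interval comes out narrower than the published one, which is expected for an unadjusted Poisson approximation.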
P. falciparum malaria incidence dynamics were influenced more by seasonal drivers than by the dam reservoir itself. The findings could have implications for the optimal timing of malaria control interventions and for developing an early warning system in Ethiopia.
Malaria incidence; P. falciparum; Mosquito; Dam; Season; Ethiopia
In humans, recombinant human thyrotropin (rhTSH) enhances radioactive iodine uptake (RAIU) in patients with differentiated thyroid cancer. No studies have been performed in veterinary medicine to optimize radioiodine treatment of thyroid cancer. The aim of this study was to evaluate the effect of rhTSH on the uptake of radioiodine-123 (123I) in dogs with thyroid tumors. Nine dogs with thyroid neoplasia were included in this prospective cross-over study. The dogs were divided into 2 groups. In one group, 123I was administered for a baseline RAIU determination in week 1. In week 2 (after a washout period of 2 weeks), these dogs received rhTSH (100 μg IV) 24 h before 123I injection. In the other group, the order of the protocol was reversed. For each scan, the dogs received 37 MBq (1 mCi) of 123I intravenously (IV), and planar scintigraphy was performed after 8 and 24 h for tumor RAIU calculation. Overall, rhTSH administration caused no statistically significant change in thyroid tumor RAIU at 8 h (p = 0.89) or at 24 h (p = 0.98). A significant positive correlation was found between the effect of rhTSH on tumor 8 h RAIU and rhTSH serum concentrations at 6 h (τ = 0.68; p = 0.03), 12 h (τ = 0.68; p = 0.03) and 24 h (τ = 0.78; p = 0.02) after rhTSH injection. This study suggests that IV administration of 100 μg rhTSH 24 h before 123I has an inconsistent effect on thyroid tumor RAIU. Further studies are necessary to determine the best rhTSH administration protocol to optimize thyroid tumor RAIU.
The liver fluke Fasciola hepatica is a parasite of ruminants with a worldwide distribution and an apparent increasing incidence in EU member states. Effective control in dairy cattle is hampered by the lack of flukicides with a zero-withdrawal time for milk, leaving the dry period as the only time that preventive treatment can be applied. Here, we present the results of a blinded, randomized and placebo-controlled trial on 11 dairy herds (402 animals) exposed to F. hepatica to 1) assess the effect of closantel treatment at dry-off (or 80–42 days before calving in first-calving heifers) on milk production parameters and 2) evaluate whether a number of easy-to-use animal parameters are related to the milk production response after treatment. Closantel treatment resulted in a noticeable decrease of anti-F. hepatica antibody levels from 3–6 months after treatment onwards, a higher peak production (1.06 kg) and a slightly higher persistence (9%) of the lactation, resulting in a 305-day milk production increase of 303 kg. No effects of anthelmintic treatment were found on the average protein and fat content of the milk. Milk production responses after treatment were poor in meagre animals, and clinically relevant higher milk production responses were observed in first-lactation animals and in cows with a high (0.3–0.5 optical density ratio (ODR)), but not a very high (≥0.5 ODR), F. hepatica ELISA result on a milk sample from the previous lactation. We conclude that in dairy herds exposed to F. hepatica, flukicide treatment at dry-off is a useful strategy to reduce levels of exposure and increase milk production in the subsequent lactation. Moreover, the results suggest that treatment approaches that only target selected animals within a herd can be developed based on easy-to-use parameters.
The emergence and spread of insecticide resistance in the major African malaria vectors Anopheles gambiae s.s. and Anopheles arabiensis may compromise control initiatives based on insecticide-treated nets (ITNs) or indoor residual spraying (IRS), and thus threaten the global malaria elimination strategy.
We investigated pyrethroid resistance in four populations of An. arabiensis from south-western Ethiopia and then assessed the bio-efficacy of six World Health Organization recommended long lasting insecticidal nets (LLINs) using these populations.
For all four populations of An. arabiensis, bottle bioassays indicated low to moderate susceptibility to deltamethrin (mortality at 30 minutes ranged between 43 and 80%) and permethrin (mortality ranged between 16 and 76%). Pre-exposure to the synergist piperonyl butoxide (PBO) significantly increased the susceptibility of all four populations to both deltamethrin (mortality increased by between 15.3 and 56.8%) and permethrin (mortality increased by between 11.6 and 58.1%), indicating the possible involvement of metabolic resistance in addition to the previously identified kdr mutations. Susceptibility of all four An. arabiensis populations to the five standard LLINs tested was reduced (maximum mortality 81.1%; minimum mortality 13.9%). Bio-efficacy against the four populations varied by net type, with the largest margin of difference observed with the Jimma population (67.2% difference). Moreover, the bio-efficacy of each individual standard LLIN differed across the four mosquito populations; for example, mortality with Yorkool differed by 40% between two populations. Results from standard LLINs indicated reduced susceptibility to new, unused nets, likely due to the observed pyrethroid resistance. The roof of the combination LLIN performed optimally (100% mortality) against all four populations of An. arabiensis, indicating that the observed reductions in susceptibility could be ameliorated by combining PBO with deltamethrin, as in PermaNet® 3.0.
Our results suggest that bio-efficacy evaluations using local mosquito populations should be conducted where possible to make evidence-based decisions on the most suitable control products, and that those combining multiple chemicals such as PBO and deltamethrin should be considered for maintaining a high level of efficacy in vector control programmes.
Bio-efficacy; Long-lasting insecticidal nets; Insecticide resistance; Anopheles arabiensis; Ethiopia
Disk-associated cervical spondylomyelopathy (DA-CSM) is a multifactorial neurological disorder in which progressive caudal cervical spinal cord compression is mainly caused by one or more intervertebral disk protrusions. The Doberman pinscher breed seems predisposed to this condition. The underlying cause and pathophysiology of DA-CSM are currently unknown. Recently, wider intervertebral disks have been put forward as a risk factor for the development of clinically relevant DA-CSM. However, little is known about other factors affecting intervertebral disk width. Therefore, the aim of this study was to assess the association between intervertebral disk width, measured on magnetic resonance imaging (MRI), and clinical status, age, gender and intervertebral disk location in dogs with and without clinical signs of DA-CSM.
Doberman pinschers with clinical signs of DA-CSM (N = 17), clinically normal Doberman pinschers (N = 20) and clinically normal English Foxhounds (N = 17) underwent MRI of the cervical vertebral column. On sagittal T2-weighted images, intervertebral disk width was measured from C2-C3 to C6-C7. Intra- and interobserver agreement was assessed on a subset of 20 of the 54 imaging studies.
Intervertebral disk width did not differ significantly between Doberman pinschers with clinical signs of DA-CSM, clinically normal Doberman pinschers and clinically normal English Foxhounds (p = 0.43). Intervertebral disk width was positively associated with age (p = 0.029): each additional month of age increased disk width by 0.0057 mm. Intervertebral disk width was not significantly affected by gender (p = 0.056), but was significantly influenced by intervertebral disk location (p < 0.0001). The measurements showed good intra- and interobserver agreement.
The present study does not provide evidence that wider intervertebral disks are associated with clinical status in dogs with and without DA-CSM. Instead, cervical intervertebral disk width in dogs appears to increase with age.
Cervical spondylomyelopathy; Wobbler syndrome; Magnetic resonance imaging; Morphometry; Intervertebral disk
The objective of this study was to assess whether lipoteichoic acid (LTA), produced by Staphylococcus aureus, exacerbates respiratory disease in porcine respiratory coronavirus (PRCV)-infected pigs, as has previously been shown for lipopolysaccharide. Piglets were inoculated with PRCV and, 24 h later, with S. aureus LTA. Clinical signs, lung virus titres, and inflammatory cells and cytokines in bronchoalveolar lavage fluid (BALF) were compared with those of animals in PRCV- and LTA-inoculated control groups.
All but one of the PRCV-LTA-inoculated pigs developed severe respiratory disease, whereas clinical signs in the control groups were minimal or absent. Virus titres and grossly visible pulmonary lesions were similar in the PRCV-LTA- and PRCV-inoculated groups and were not detected in the LTA group. Neutrophil percentages in BALF were higher in the PRCV-LTA group than in the PRCV group. There was no significant difference in interferon (IFN)-γ, interleukin (IL)-1, IL-6, IL-12/IL-23 and tumour necrosis factor (TNF)-α concentrations in BALF between the PRCV-LTA and PRCV groups, but levels of IL-6, IL-12/IL-23 and IFN-γ were higher in the PRCV-LTA-inoculated pigs than in the LTA-inoculated controls.
The findings suggest that the experimentally-induced respiratory disease was not mediated by cytokine over-production, but rather reflected the concerted action of particular cytokine interactions and/or as yet unidentified mediators. This is the first in vivo study to report the synergistic interaction between a virus and LTA in enhancing the severity of respiratory disease in the pig. Given that Gram-positive bacteria, capable of producing LTA, are commonly found in pig accommodation, the role of this compound in the development of the porcine respiratory disease complex requires further investigation.
Porcine respiratory coronavirus; Lipoteichoic acid; Respiratory disease; Pigs; Cytokines
Age at menarche reflects girls' cumulative pre-adolescent exposure either to an adverse environment, such as food insecurity, or to affluent living conditions. Food insecurity can result in inadequate nutrient intake and stress, both of which are hypothesized to have opposing effects on the timing of menarche through divergent pathways. It is not known whether food-insecure girls have delayed or early menarche compared with their food-secure peers. In this study, we test these competing hypotheses on the relationship between food insecurity and age at menarche among adolescent girls in Southwest Ethiopia.
We report on 900 girls who were investigated in the first two rounds of a five-year longitudinal survey. A semi-parametric frailty model was fitted to determine the effect of adolescent food insecurity on time to menarche after adjusting for socio-demographic and economic variables.
Food-insecure girls reached menarche one year later than their food-secure peers (median age of 15 years vs 14 years). The hazard of menarche declined significantly (P = 0.019) as the severity of food insecurity increased; the hazard ratios (HR) for mild and for moderate/severe food insecurity were 0.936 and 0.496, respectively, compared with food-secure girls. Stunted girls reached menarche nearly one year later than their non-stunted peers (HR = 0.551, P < 0.001).
Food insecurity is associated with a one-year delay in age at menarche among girls in the study area, and stunted girls reached menarche one year later than their non-stunted peers. Age at menarche reflects girls' development, including the timing of sexual maturation, nutritional status and the growth trajectory during the pre-pubertal period. These findings reflect the consequences of chronic food insecurity for the development and well-being of girls in the study area.
Despite epidemiological data linking necrotizing skin infections with the production of Panton-Valentine leukocidin (PVL), the contribution of this toxin to the virulence of S. aureus has been much debated because of inconclusive in vivo studies. The majority of these results, however, originate from experiments in mice, a species whose neutrophils (the major target cells of PVL) are highly insensitive to the action of this leukocidin. In contrast, rabbit neutrophils have been shown to be as sensitive to PVL as human cells, making the rabbit a better experimental animal for exploring the role of PVL. In this study, we examined whether PVL contributes to S. aureus pathogenicity using a rabbit skin infection model. Rabbits were injected intradermally with 10^8 cfu of either a PVL-positive community-associated methicillin-resistant S. aureus isolate, its isogenic PVL knockout or a PVL-complemented knockout strain, and the development of skin lesions was observed. While all strains induced skin infection, the wild-type strain produced larger lesions and a higher degree of skin necrosis than the PVL knockout strain in the first week after infection. PVL expression in the rabbits was indirectly confirmed by a rise in the serum titer of anti-LukS-PV antibodies, observed only in rabbits infected with PVL-positive strains. These results indicate that the rabbit model is more suitable than other animal models for studying the role of PVL in staphylococcal disease. Furthermore, they support the epidemiological link between PVL-producing S. aureus strains and necrotizing skin infections.
The status of the knockdown resistance (kdr) mutation was investigated in the major malaria vector Anopheles arabiensis Patton (Diptera: Culicidae) from Ethiopia. Among 240 mosquito samples from 15 villages of southwestern Ethiopia that were screened by allele-specific polymerase chain reaction for kdr mutations, the West African kdr mutation (L1014F) was detected in almost all specimens (98.5%), whereas the East African kdr mutation (L1014S) was absent. Moreover, in bioassays, the mortality of An. gambiae s.l. exposed to diagnostic dosages of 4% DDT, 0.75% permethrin and 0.05% deltamethrin was 1.0%, 18.1% and 82.2%, respectively. We report the highest kdr allele frequency observed to date in An. arabiensis and discuss its implications for malaria vector control in Ethiopia.
Determinants of active tuberculosis among People Living with HIV/AIDS (PLHA) are not well elucidated in countries with limited resources. The objective of this study was to assess distal and proximate determinants of active tuberculosis among people living with HIV/AIDS in southwest Ethiopia.
A case-control study was conducted from January to March 2009 in southwest Ethiopia. The study population consisted of 162 cases and 647 controls. Cases were adults living with HIV/AIDS who developed active pulmonary tuberculosis; controls were people living with HIV/AIDS without active tuberculosis. An interviewer-administered structured questionnaire was used to collect information on potential risk factors.
After adjustment for potential confounders, male gender (OR = 1.7; 95% CI: 1.1, 2.7), a low level of education (OR = 2.8; 95% CI: 1.1, 7.1), a body mass index below 18.5 kg/m2 (OR = 4.1; 95% CI: 2.3, 7.4), a hemoglobin level below 10.0 g/dl (OR = 2.8; 95% CI: 1.5, 5.2), a CD4 lymphocyte count below 200 cells/µL (OR = 9.8; 95% CI: 5.5, 17.5), WHO clinical stage IV (OR = 4.3; 95% CI: 2.6, 6.8), not taking antiretroviral treatment (OR = 3.1; 95% CI: 1.9, 4.9), helminth infection (OR = 2.2; 95% CI: 1.4, 3.4), a history of contact with a tuberculosis patient in the family (OR = 2.0; 95% CI: 1.2, 3.3) and living in a house with mud walls (OR = 3.7; 95% CI: 1.5, 7.5) were independently associated with the development of active tuberculosis in people living with HIV/AIDS.
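For readers unfamiliar with how such case-control estimates are formed, a crude (unadjusted) odds ratio with its Woolf 95% confidence interval can be computed from a 2×2 exposure table as sketched below. The counts are hypothetical, chosen only to illustrate the calculation; the odds ratios reported above are adjusted estimates from a multivariable model, not crude ratios like this one.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 case-control table, with a Woolf
    (log-based) 95% confidence interval.
      a = exposed cases      b = unexposed cases
      c = exposed controls   d = unexposed controls
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts (NOT the study's data), exposure = some risk factor:
crude_or, lo, hi = odds_ratio_ci(a=100, b=62, c=250, d=397)
```

The Woolf method works on the log scale because ln(OR) is approximately normally distributed; exponentiating the log-scale limits yields the asymmetric interval around the odds ratio.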
All people living with HIV/AIDS should be screened for tuberculosis; in the presence of the risk factors identified above, intensified screening is recommended.
Active TB; HIV; risk factors; case control study; Southwest Ethiopia