Purpose: Little is known about mental health disorders (MHDs) and their associated health care expenditures for dual eligible elders across long-term care (LTC) settings. We estimated the 12-month diagnosed prevalence of MHDs among dual eligible older adults in LTC and non-LTC settings and calculated the average incremental effect of MHDs on medical care, LTC, and prescription drug expenditures across LTC settings. Methods: Participants were fee-for-service dual eligible elderly beneficiaries from 7 states. We obtained their 2005 Medicare and Medicaid claims data and LTC program participation data from federal and state governments. We grouped beneficiaries into non-LTC, community LTC, and institutional LTC groups and identified enrollees with any of 5 MHDs (anxiety, bipolar disorder, major depression, mild depression, and schizophrenia) using International Classification of Diseases, Ninth Revision codes associated with Medicare and Medicaid claims. We obtained medical care, LTC, and prescription drug expenditures from the related claims. Results: Thirteen percent of all dual eligible elderly beneficiaries had at least 1 MHD diagnosis in 2005. Beneficiaries in the non-LTC group had the lowest 12-month prevalence rates but the highest percentage increase in health care expenditures associated with MHDs. Institutional LTC residents had the highest prevalence rates but the lowest percentage increase in expenditures. LTC expenditures were less affected by MHDs than medical and prescription drug expenditures. Implications: MHDs are prevalent among dual eligible older persons and are costly to the health care system. Policy makers need to focus on better diagnosis of MHDs among community-living elders and on better understanding the treatment of MHDs in LTC settings.
Monocular occlusion has been posited to reduce activation of the contralateral hemisphere (the “Sprague effect”), thus inducing a contralateral spatial bias (toward the viewing eye). Healthy right-handed participants bisected horizontal lines during monocular viewing. Although subjects tended to deviate away from the viewing eye, only left eye viewing produced deviation significantly right of midline. These results suggest that eye patching may induce an attentional compensation similar to that seen in hemianopic patients. Alternatively, increased activation of higher cortical regions mediating spatial attention in contralateral hemispace may be an adaptive response to decreased activation of the ipsilateral superior colliculus induced by contralateral eye patching.
monocular occlusion; eye patching; constrained monocular viewing; Sprague effect; attention; intention; line bisection; spatial neglect; hemianopia; rehabilitation
The human landing catch (HLC) has long been the gold standard for estimating malaria transmission by mosquitoes, but has come under scrutiny because of ethical concerns about exposing collectors to infectious bites. We estimated the incidence of Plasmodium falciparum malaria infection in a cohort of 152 persons conducting HLCs and compared it with that of 147 non-collectors in western Kenya. Participants were presumptively cleared of malaria with Coartem™ (artemether-lumefantrine) and tested for malaria every 2 weeks for 12 weeks. HLC collections were conducted four nights per week for six weeks. Collectors were provided chemoprophylaxis with Malarone™ (atovaquone-proguanil) during the six weeks of HLC activities and for one week after HLC activities were completed. The incidence of malaria was 96.6% lower in collectors than in non-collectors (hazard ratio = 0.034, P < 0.0001). Therefore, with proper prophylaxis, concern about increased risk of malaria among collectors should not be an impediment to conducting HLC studies.
In Kenya, HIV-1 viral load monitoring is commonly performed with the Cobas Amplicor using plasma specimens. Interest is growing in transitioning to real-time PCR (RT-PCR), such as the Cobas Ampliprep/Cobas TaqMan (CAP/CTM), using dried blood spots (DBS). Before implementation, direct evaluation of the two assays using DBS field specimens is required. This study compares the sensitivity, specificity, negative and positive predictive values (NPV and PPV, respectively), concordance, and agreement between HIV-1 viral load measurements using plasma and DBS specimens obtained from 512 HIV-1-infected pregnant females enrolled in the Kisumu Breastfeeding Study and tested with the Cobas Amplicor and CAP/CTM assays. The sensitivity and NPV of viral load detection in DBS specimens were higher with the CAP/CTM (sensitivity, 100%; 95% confidence interval [CI], 99.1 to 100.0%; NPV, 100%; 95% CI, 59.0 to 100.0%) than with the Cobas Amplicor (sensitivity, 96.6%; 95% CI, 94.3 to 98.1%; NPV, 58.8%; 95% CI, 40.7 to 75.4%). The PPVs were comparable between the two assays when using DBS. The specificity of viral load detection in DBS specimens was lower with the CAP/CTM (77.8%; 95% CI, 40.0 to 97.2%) than with the Cobas Amplicor (95.2%; 95% CI, 76.2 to 99.9%). Good concordance and agreement were observed when paired plasma and DBS specimens were tested with both assays. The lower specificity with the CAP/CTM is likely due to amplification of proviral HIV-1 DNA and the lower detection limits of RT-PCR. However, the CAP/CTM has better sensitivity and higher throughput than the Cobas Amplicor. These findings suggest that DBS may be a suitable alternative to plasma when using RT-PCR, which could increase access to viral load monitoring in resource-limited settings.
♦ Background: Prophylactic gentamicin 0.1% cream has demonstrated efficacy in preventing both exit-site infection (ESI) and peritonitis attributable to gram-positive and gram-negative organisms; however, the effect of this practice on the gentamicin susceptibility patterns of bacterial pathogens isolated from such infections is unknown. We therefore examined the effect of a change in our prophylactic topical antibiotic exit-site protocol (from mupirocin 2% cream to gentamicin 0.1% cream) on infection rates and susceptibility patterns.
♦ Methods: This retrospective observational cohort study examined two time periods: before and after the change in exit-site protocol. Each period was 30 months in duration, with a 2-month implementation period between them, during which patient data were excluded. Demographic, clinical, and microbiology data were collected for each patient and episode of infection.
♦ Results: Overall, 377 patients were evaluated. In the mupirocin period, 145 infections occurred in 79 patients, and in the gentamicin period, 145 infections occurred in 93 patients. No significant difference was found in overall episodes of infection (0.53 per year) or in episodes of peritonitis (0.429 vs 0.375 per year), but episodes of ESI increased significantly (0.098 vs 0.153 per year; p = 0.024; odds ratio: 1.55; 95% confidence interval: 1.05 to 2.28). Episodes of Staphylococcus aureus peritonitis increased by 38% (0.018 vs 0.025 per year), and episodes of S. aureus ESI increased significantly, by 150% (0.022 vs 0.055 per year; p = 0.03; hazard ratio: 3.00; 95% confidence interval: 1.09 to 8.26). Episodes of pseudomonal peritonitis declined by 68% (0.022 vs 0.007 per year), and episodes of pseudomonal ESI increased by 150% (0.007 vs 0.018 per year). Gentamicin susceptibility of gram-positive isolates showed no significant change; however, gentamicin susceptibility decreased by 12% for Enterobacteriaceae and by 14% for Pseudomonas.
♦ Conclusions: The significant increase in episodes of ESI and the decrease in susceptibility for both Enterobacteriaceae and Pseudomonas isolates represent a concerning trend. Centers should examine trends in infection rates and in bacterial susceptibilities to determine the most appropriate agent for ESI prophylaxis.
Antibiotic cream; pathogen susceptibilities; peritonitis; exit-site infection
Artemether-lumefantrine (AL) was adopted as first-line treatment for uncomplicated malaria in Kenya in 2006. Monitoring drug efficacy at regular intervals is essential to prevent unnecessary morbidity and mortality. The efficacy of AL and of dihydroartemisinin-piperaquine (DP) was evaluated for the treatment of uncomplicated malaria in children aged six to 59 months in western Kenya.
From October 2010 to August 2011, children with fever or history of fever with uncomplicated Plasmodium falciparum mono-infection were enrolled in an in vivo efficacy trial in accordance with World Health Organization (WHO) guidelines. The children were randomized to treatment with a three-day course of AL or DP and efficacy outcomes were measured at 28 and 42 days after treatment initiation.
A total of 137 children were enrolled in each treatment arm. There were no early treatment failures and all children except one had cleared parasites by day 3. Polymerase chain reaction (PCR)-uncorrected adequate clinical and parasitological response rate (ACPR) was 61% in the AL arm and 83% in the DP arm at day 28 (p = 0.001). PCR-corrected ACPR at day 28 was 97% in the AL group and 99% in the DP group, and it was 96% in both arms at day 42.
AL and DP remain efficacious for the treatment of uncomplicated malaria among children in western Kenya. The longer half-life of piperaquine relative to lumefantrine may provide a prophylactic effect, accounting for the lower rate of re-infection in the first 28 days after treatment in the DP arm.
We interviewed caretakers of 1,043 children < 5 years old in a baseline cross-sectional survey (April to May 2007) and > 20,000 children on five separate subsequent occasions (May 2009 to December 31, 2010) to assess healthcare seeking patterns for diarrhea. Diarrhea prevalence during the preceding 2 weeks ranged from 26% at baseline to 4–11% during 2009–2010. Caretakers were less likely to seek healthcare outside the home for infants (versus older children) with diarrhea (adjusted odds ratio [aOR] = 0.33, confidence interval [CI] = 0.12–0.87). Caretakers were more likely to seek care outside the home for children with reduced food intake (aOR = 3.42, CI = 1.37–8.53) or sunken eyes (aOR = 4.76, CI = 1.13–8.89) during the diarrheal episode. Caretakers with formal education were more likely to provide oral rehydration solution (aOR = 3.01, CI = 1.41–6.42) and visit a healthcare facility (aOR = 3.32, CI = 1.56–7.07). Studies calculating diarrheal incidence and healthcare seeking should account for seasonal trends. Improving caretakers' knowledge of home management could prevent severe diarrhea.
Worldwide, Shigella causes an estimated 160 million infections and >1 million deaths annually. However, limited incidence data are available from African urban slums. We investigated the epidemiology of shigellosis and drug susceptibility patterns within a densely populated urban settlement in Nairobi, Kenya through population-based surveillance.
Surveillance participants were interviewed in their homes every 2 weeks by community interviewers. Participants also had free access to a designated study clinic in the surveillance area where stool specimens were collected from patients with diarrhea (≥3 loose stools within 24 hours) or dysentery (≥1 stool with visible blood during previous 24 hours). We adjusted crude incidence rates for participants meeting stool collection criteria at household visits who reported visiting another clinic.
Shigella species were isolated from 224 (23%) of 976 stool specimens. The overall adjusted incidence rate was 408/100,000 person years of observation (PYO) with highest rates among adults 34–49 years old (1,575/100,000 PYO). Isolates were: Shigella flexneri (64%), S. dysenteriae (11%), S. sonnei (9%), and S. boydii (5%). Over 90% of all Shigella isolates were resistant to trimethoprim-sulfamethoxazole and sulfisoxazole. Additional resistance included nalidixic acid (3%), ciprofloxacin (1%) and ceftriaxone (1%).
More than 1 of every 200 persons experiences shigellosis each year in this Kenyan urban slum, yielding rates similar to those in some Asian countries. Provision of safe drinking water, improved sanitation, and hygiene in urban slums is needed to reduce disease burden, in addition to development of effective Shigella vaccines.
Immunoglobulin (Ig) GM and KM allotypes, genetic markers of γ and κ chains, are associated with humoral immune responsiveness. Previous studies have shown relationships between GM6-carrying haplotypes and susceptibility to malaria infection in children and adults; however, the role of these genetic markers in placental malaria (PM) infection, and in PM with HIV co-infection during pregnancy, has not been investigated. We examined the relationship between gene polymorphisms of the Ig GM6 and KM allotypes and the risk of PM infection in pregnant women with known HIV status. DNA samples from 728 pregnant women were genotyped for GM6 and KM alleles using the polymerase chain reaction-restriction fragment length polymorphism method. Individual GM6 and KM genotypes and combined GM6-KM genotypes were assessed in relation to PM in HIV-1 negative and HIV-1 positive women. There was no significant effect of individual GM6 or KM genotypes on the risk of PM infection in either HIV-1 negative or HIV-1 positive women. However, the combination of homozygosity for GM6(+) and KM3 was associated with decreased risk of PM (adjusted OR, 0.25; 95% CI, 0.08–0.8; P = 0.019) in HIV-1 negative women, while in HIV-1 positive women the combination of GM6(+/−) with either KM1-3 or KM1 was associated with increased risk of PM infection (adjusted OR, 2.10; 95% CI, 1.18–3.73; P = 0.011). Hardy-Weinberg equilibrium (HWE) tests further showed an overall significant positive F(is) (indicating a deficit of heterozygotes) for GM6, whereas KM genotype frequencies did not deviate from HWE in the same population. These findings suggest that the combination of homozygous GM6(+) and KM3 may protect against PM in HIV-1 negative women, while HIV-1 positive women with heterozygous GM6(+/−) combined with KM1-3 or KM1 may be more susceptible to PM infection. The deficit of heterozygotes for GM6 further suggests that GM6 may be under selection, likely by malaria infection.
Vanishing bile duct syndrome (VBDS) is a group of rare disorders characterized by ductopenia, the progressive destruction and disappearance of intrahepatic bile ducts leading to cholestasis. Although VBDS has been described in association with medications, autoimmune disorders, cancer, transplantation, and infections, the specific mechanisms of disease are not known. To date, only 4 cases of VBDS have been reported in human immunodeficiency virus (HIV)-infected patients. We report 2 additional cases of HIV-associated VBDS and review the features common to the HIV-associated cases. Presentation includes hyperbilirubinemia, normal liver imaging, and negative viral and autoimmune hepatitis studies. In HIV-infected subjects, VBDS occurred at a range of CD4+ T-cell counts, in some cases following initiation of or change in antiretroviral therapy. Lymphoma was associated with 2 cases; nevirapine, antibiotics, and viral co-infection were suggested as etiologies in the other cases. In HIV-positive patients with progressive cholestasis, early identification of VBDS and referral for transplantation may improve outcomes.
Human immunodeficiency virus; Antiretroviral therapy; Vanishing bile duct syndrome; Ductopenia; Liver biopsy
Patients with traumatic brain injuries (TBI) often develop post traumatic stress disorder (PTSD). This syndrome, defined and diagnosed by psychological and behavioral features, is associated with symptoms such as anxiety and anger with increased arousal and vigilance, as well as flashbacks and nightmares. Many of the features and symptoms observed in PTSD may be in part the result of altered autonomic nervous system (ANS) activity in response to psychological and physical challenges. Brain imaging has documented that TBI often induces white matter damage to pathways associated with the anterior limb of the internal capsule and uncinate fasciculus. Since these white matter structures link neocortical networks with subcortical and limbic structures that regulate autonomic control centers, injury to these pathways may induce a loss of inhibitory control of the ANS. In this review, the autonomic features associated with PTSD are discussed in the context of traumatic brain injury. We posit that TBI-induced damage to networks that regulate the ANS increases vulnerability to PTSD. The means by which this vulnerability can be measured and tested are also discussed.
emotion; autonomic nervous system; traumatic brain injury; post traumatic stress disorder; white matter; TBI; PTSD
Both insecticide-treated bed nets (ITNs) and indoor residual spraying (IRS) reduce malaria in high malaria transmission areas.1–3 The combined effect of these interventions is unknown. We conducted a non-randomized prospective cohort study to determine the protective efficacy of IRS with ITNs (ITN + IRS) compared with ITNs alone (ITN only) in preventing Plasmodium falciparum parasitemia. At baseline, participants provided blood samples for malaria smears, were presumptively treated for malaria, and received ITNs. Blood smears were made monthly and at sick visits. In total, 1,804 participants were enrolled. Incidence of P. falciparum parasitemia in the ITN + IRS and ITN only groups was 18 and 44 infections per 100 person-years at risk, respectively (unadjusted rate ratio = 0.41; 95% confidence interval [CI] = 0.31–0.56). Adjusted protective efficacy of ITN + IRS compared with ITN only was 62% (95% CI = 50–72%). The combination of IRS and ITN might be a feasible strategy to further reduce malaria transmission in areas of persistent perennial malaria transmission.
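As a note on the arithmetic above, protective efficacy is conventionally computed as one minus the incidence rate ratio. A minimal sketch using the unadjusted rates reported in this abstract (the 62% figure comes from a covariate-adjusted model not reproduced here):

```python
# Protective efficacy from an incidence rate ratio: PE = 1 - RR.
# Figures are the unadjusted rates reported in the abstract
# (infections per 100 person-years at risk).
rate_itn_irs = 18.0
rate_itn_only = 44.0

rate_ratio = rate_itn_irs / rate_itn_only   # ~0.41, matching the abstract
protective_efficacy = 1.0 - rate_ratio      # ~0.59 unadjusted

print(f"rate ratio = {rate_ratio:.2f}, unadjusted PE = {protective_efficacy:.0%}")
```

The adjusted estimate (62%) differs from this crude figure because it accounts for covariates in the regression model.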
We report and explore changes in child mortality in a rural area of Kenya during 2003–2009, when major public health interventions were scaled up. Mortality ratios and rates were calculated by using the Kenya Medical Research Institute/Centers for Disease Control and Prevention Demographic Surveillance System. Inpatient and outpatient morbidity and mortality data, and verbal autopsy data, were analyzed. Mortality ratios for children less than five years of age decreased from 241 deaths/1,000 live births in 2003 to 137 in 2007. In 2008, they increased to 212 deaths/1,000 live births. Mortality remained elevated during the first 8 months of 2009 compared with 2006 and 2007. Malaria and/or anemia accounted for the greatest increases in child mortality. Stock-outs of essential antimalarial drugs during a time of increased malaria transmission, and disruption of services during civil unrest, may have contributed to increased mortality in 2008–2009. To maintain gains in child survival, implementation of good policies and effective interventions must be complemented by a reliable supply of essential drugs and access to clinical services.
Patients often demonstrate attentional and action-intentional biases in both the transverse and coronal planes. In addition, when making forelimb movements in the transverse plane, normal participants also show spatial and magnitude asymmetries, but forelimb spatial asymmetries have not been studied in coronal space. Thus, to learn whether normal people making vertical movements show right–left spatial and magnitude biases, seventeen healthy, blindfolded volunteers had their hands (holding pens) placed vertically in the midsagittal plane, 10 inches apart, on pieces of paper positioned above, below, and at eye level. Participants were asked to move their hands together vertically and meet in the middle. Participants demonstrated less angular deviation in the below-eye condition than in the other spatial conditions, when moving down than up, and with their right than left hand. Movements toward eye level from upper or lower space were also more accurate than movements in the other directions. Independent of hand, lines were longer with downward than upward movements, and the right hand moved a greater distance than the left. These attentional-intentional asymmetries may be related to gravitational force, hand-hemispheric dominance, and spatial “where” asymmetries; however, the mechanisms accounting for these asymmetries must be ascertained by future research.
Intention; Attention; Hand asymmetry; Coronal plane; Hemispheric specialization; Motor control
We analyzed HIV testing rates, prevalence of undiagnosed HIV, and predictors of testing in the Kenya AIDS Indicator Survey (KAIS) 2007.
KAIS was a nationally representative sero-survey that included demographic and behavioral indicators and testing for HIV, HSV-2, syphilis, and CD4 cell counts in the population aged 15–64 years. We used gender-specific multivariable regression models to identify factors independently associated with HIV testing in sexually active persons.
Of 19,840 eligible persons, 80% consented to interviews and blood specimen collection. National HIV prevalence was 7.1% (95% CI 6.5–7.7). Among ever sexually active persons, 27.4% (95% CI 25.6–29.2) of men and 44.2% (95% CI 42.5–46.0) of women reported previous HIV testing. Among HIV-infected persons, 83.6% (95% CI 76.2–91.0) were unaware of their HIV infection. Among sexually active women aged 15–49 years, 48.7% (95% CI 46.8–50.6) had their last HIV test during antenatal care (ANC). In multivariable analyses, the adjusted odds ratio (AOR) for ever HIV testing in women ≥35 versus 15–19 years was 0.2 (95% CI: 0.1–0.3; p<0.0001). Other independent associations with ever HIV testing included urban residence (AOR 1.6, 95% CI: 1.2–2.0; p = 0.0005, women only), highest wealth index versus the four lower quintiles combined (AOR 1.8, 95% CI: 1.3–2.5; p = 0.0006, men only), and an increasing testing trend with higher levels of education. Missed opportunities for testing were identified during general or pregnancy-specific contacts with health facilities; 89% of adults said they would participate in home-based HIV testing.
The vast majority of HIV-infected persons in Kenya are unaware of their HIV status, posing a major barrier to HIV prevention, care and treatment efforts. New approaches to HIV testing provision and education, including home-based testing, may increase coverage. Targeted interventions should involve sexually active men, sexually active women without access to ANC, and rural and disadvantaged populations.
High rates of typhoid fever in children in urban settings in Asia have led to a focus on childhood immunization in Asian cities, but not in Africa, where data, mostly from rural areas, have shown low disease incidence. We set out to compare the incidence of typhoid fever in a densely populated urban slum and a rural community in Kenya, hypothesizing higher rates in the urban area, given crowding and suboptimal access to safe water, sanitation, and hygiene.
During 2007–2009, we conducted population-based surveillance in Kibera, an urban informal settlement in Nairobi, and in Lwak, a rural area in western Kenya. Participants had free access to study clinics; field workers visited their homes biweekly to collect information about acute illnesses. At the clinics, blood cultures were processed from patients with fever or pneumonia. Crude and adjusted incidence rates were calculated.
In the urban site, the overall crude incidence of Salmonella enterica serovar Typhi (S. Typhi) bacteremia was 247 cases per 100,000 person-years of observation (pyo), with the highest rates in children 5–9 years old (596 per 100,000 pyo) and 2–4 years old (521 per 100,000 pyo). Crude overall incidence in Lwak was 29 cases per 100,000 pyo, with low rates in children 2–4 and 5–9 years old (28 and 18 cases per 100,000 pyo, respectively). Adjusted incidence rates were highest in 2–4-year-old urban children (2,243 per 100,000 pyo), >15-fold higher than rates in the rural site for the same age group. Nearly 75% of S. Typhi isolates were multi-drug resistant.
This systematic urban slum and rural comparison showed dramatically higher typhoid incidence among urban children <10 years old with rates similar to those from Asian urban slums. The findings have potential policy implications for use of typhoid vaccines in increasingly urban Africa.
Few studies have examined the neuropsychological profiles of potential lung transplant candidates. Neuropsychological data are presented for 134 patients with end-stage pulmonary disease who were being evaluated as potential candidates for lung transplantation. Neuropsychological test results indicated that a significantly greater proportion of the patients exhibited impaired performances on a number of Selective Reminding Test (SRT) tasks compared with the expected population frequency distributions for these measures. The highest frequencies of impairment were observed on the SRT’s Immediate Free Recall (46.43%), Long-term Retrieval (41.67%), and Consistent Long-term Retrieval (51.19%) variables. On the Minnesota Multiphasic Personality Inventory-2 (MMPI-2)/Minnesota Multiphasic Personality Inventory-Adolescent (MMPI-A), patients’ mean clinical profile revealed elevations on Scales 1 (Hypochondriasis) and 3 (Conversion Hysteria). This profile indicated that they were experiencing an array of symptomatology ranging from somatic complaints to lethargy and fatigue, and that they may have been functioning at a reduced level of efficiency. Findings are discussed in light of patients’ end-stage pulmonary disease and factors possibly contributing to their neuropsychological test performances. Implications for clinical practice and future research are also provided.
Neuropsychology; Neurocognitive; Pulmonary disease; End-stage; Lung transplant
There has been much debate about the appropriate statistical methodology for the evaluation of malaria field studies and the challenges in interpreting data arising from these trials.
The present paper describes, for a pivotal phase III efficacy trial of the RTS,S/AS01 malaria vaccine, the methods of the statistical analysis and the rationale for their selection. The methods used to estimate the efficacy of the primary course of vaccination, and of a booster dose, in preventing clinical episodes of uncomplicated and severe malaria, and to determine the duration of protection, are described. The interpretation of various measures of efficacy in terms of the potential public health impact of the vaccine is discussed.
The methodology selected to analyse the clinical trial must be scientifically sound, acceptable to regulatory authorities and meaningful to those responsible for malaria control and public health policy.
There is a need for locally derived, age-specific clinical laboratory reference ranges for healthy Africans in sub-Saharan Africa. Reference values from North American and European populations are being used for African subjects despite previous studies showing significant differences. Our aim was to establish clinical laboratory reference values for African adolescents and young adults that can be used in clinical trials and for patient management.
Methods and Findings
A panel of 298 HIV-seronegative individuals aged 13–34 years was randomly selected from participants in two population-based cross-sectional surveys assessing HIV prevalence and other sexually transmitted infections in western Kenya. The adolescent (<18 years)-to-adult (≥18 years) ratio and the male-to-female ratio were both 1:1. Medians and 95% reference ranges were calculated for immunohematological and biochemistry values. Compared with U.S.-derived reference ranges, we detected lower hemoglobin (HB), hematocrit (HCT), red blood cell (RBC), mean corpuscular volume (MCV), neutrophil, glucose, and blood urea nitrogen values but elevated eosinophil and total bilirubin values. Significant gender variation was observed in hematological parameters, as well as in total bilirubin and creatinine indices in all age groups, in AST among the younger age group, and in neutrophil, platelet, and CD4 indices among the older age group. Age variation was also observed, mainly in hematological parameters among males. Applying U.S. NIH Division of AIDS (DAIDS) toxicity grading to our results, 40% of otherwise healthy study participants were classified as having an abnormal laboratory parameter (grade 1–4), which would exclude them from participating in clinical trials.
Hematological and biochemistry reference values from an African population differ from those derived from a North American population, demonstrating the need to develop region-specific reference values. Our data also show variations in hematological indices between adolescent and adult males, which should be considered when developing reference ranges. This study provides the first locally derived clinical laboratory reference ranges for adolescents and young adults in western Kenya.
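For illustration, a 95% reference range of the kind described above is conventionally the central 95% of values in the healthy reference sample, i.e., the interval from the 2.5th to the 97.5th percentile. A minimal sketch using simulated hemoglobin values (illustrative only, not the study's data):

```python
import numpy as np

# Illustrative only: simulated hemoglobin values (g/dL), not the study's data.
rng = np.random.default_rng(0)
hb = rng.normal(loc=13.5, scale=1.2, size=500)

median = np.median(hb)
# Central 95% reference range: 2.5th to 97.5th percentiles of the sample.
lower, upper = np.percentile(hb, [2.5, 97.5])
print(f"median {median:.1f} g/dL, 95% reference range {lower:.1f}-{upper:.1f} g/dL")
```

Nonparametric percentile estimation of this sort is the standard approach when the reference sample is large enough; smaller samples typically require transformation or robust methods.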
Understanding shedding patterns of 2009 pandemic influenza A (H1N1) (pH1N1) can inform recommendations about infection control measures. We evaluated the duration of pH1N1 virus shedding in patients in Nairobi, Kenya.
Nasopharyngeal (NP) and oropharyngeal (OP) specimens were collected from consenting laboratory-confirmed pH1N1 cases every 2 days during October 14–November 25, 2009, and tested at the Centers for Disease Control and Prevention-Kenya by real-time reverse transcriptase polymerase chain reaction (rRT-PCR). A subset of rRT-PCR-positive samples was cultured.
Of 285 NP/OP specimens from patients with acute respiratory illness, 140 (49%) tested positive for pH1N1 by rRT-PCR; 106 (76%) patients consented and were enrolled. The median age was 6 years (range: 4 months–41 years); only two patients, both asthmatic, received oseltamivir. The median duration of pH1N1 detection after illness onset was 8 days (95% CI: 7–10 days) by rRT-PCR and 3 days (range: 0–13 days) by viral isolation. Viable pH1N1 virus was isolated from 132/162 (81%) of rRT-PCR-positive specimens, including 118/125 (94%) of rRT-PCR-positive specimens collected on days 0–7 after symptom onset. Viral RNA was detectable in 18 (17%) specimens collected from patients after all their symptoms had resolved, and virus was isolated from 7/18 (39%) of these.
In this cohort, pH1N1 was detected by rRT-PCR for a median of 8 days. There was a strong correlation between rRT-PCR results and virus isolation in the first week of illness. In some patients, pH1N1 virus was detectable after all their symptoms had resolved.
Characterizing infectious disease burden in Africa is important for prioritizing and targeting limited resources for curative and preventive services and monitoring the impact of interventions.
From June 1, 2006 to May 31, 2008, we estimated rates of acute lower respiratory tract illness (ALRI), diarrhea and acute febrile illness (AFI) among >50,000 persons participating in population-based surveillance in impoverished, rural western Kenya (Asembo) and an informal settlement in Nairobi, Kenya (Kibera). Field workers visited households every two weeks, collecting recent illness information and performing limited exams. Participants could access free high-quality care in a designated referral clinic in each site. Incidence and longitudinal prevalence were calculated and compared using Poisson regression.
Incidence rates of illness episodes resulting in clinic visits were as follows: ALRI — 0.36 and 0.51 episodes per year for children <5 years and 0.067 and 0.026 for persons ≥5 years in Asembo and Kibera, respectively; diarrhea — 0.40 and 0.71 episodes per year for children <5 years and 0.09 and 0.062 for persons ≥5 years in Asembo and Kibera, respectively; AFI — 0.17 and 0.09 episodes per year for children <5 years and 0.03 and 0.015 for persons ≥5 years in Asembo and Kibera, respectively. Annually, based on household visits, children <5 years in Asembo and Kibera had 60 and 27 cough days, 10 and 8 diarrhea days, and 37 and 11 fever days, respectively. Household-based rates were higher than clinic rates for diarrhea and AFI, and this difference was several-fold greater in the rural site than in the urban site.
Individuals in poor Kenyan communities still suffer from a high burden of infectious diseases, which likely hampers their development. Urban slum and rural disease incidence and clinic utilization are sufficiently disparate in Africa to warrant data from both settings for estimating burden and focusing interventions.
Assistive devices for persons with limited motor control translate or amplify remaining functions to allow otherwise impossible actions. These devices usually rely on a single type of input signal, which can be derived from residual muscle functions or another kind of biosignal. When only one signal is used, the functionality of the assistive device can degrade as soon as the quality of that signal is impaired; depending on the type of signal, quality can decrease with fatigue, lack of concentration, high noise, spasms, or tremors. To overcome this dependency on a single input signal, combining multiple inputs should be feasible. This work presents a hybrid Brain-Computer Interface (hBCI) approach in which two different input signals (joystick and BCI) were monitored and only one of them was used as the control signal at a time. Users could move a car in a game-like feedback application to collect coins and avoid obstacles via either joystick or BCI control. Both control types were continuously monitored with four different long-term quality measures to evaluate the current state of the signals. As soon as the quality of the active signal dropped below a certain threshold, the monitoring system switched to the other control mode, and vice versa. Additionally, short-term quality measures were applied to detect strong artifacts that could render voluntary control impossible; these measures were used to block actions carried out during periods when highly uncertain signals were recorded. The switching possibility gave users more functionality: moving the car remained possible even after one control mode stopped working. The proposed system serves as a basis demonstrating how a BCI can be used as an assistive device, especially in combination with other assistive technology.
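The quality-based switching logic described above can be sketched as follows. This is an illustrative assumption of how such a monitor might behave; the actual quality measures, scales, and thresholds used in the study are not reproduced here.

```python
def select_control(qualities, current_mode, threshold=0.5):
    """Return the control mode to use, switching away from the current
    mode only when its long-term quality drops below the threshold and
    the alternative signal is currently usable.

    qualities: dict mapping mode name ('joystick' or 'bci') to a
    quality score in [0, 1] (hypothetical scale)."""
    if qualities[current_mode] >= threshold:
        return current_mode
    other = "bci" if current_mode == "joystick" else "joystick"
    return other if qualities[other] >= threshold else current_mode

def allow_action(short_term_quality, artifact_threshold=0.2):
    """Short-term check: block actions during strong artifacts."""
    return short_term_quality >= artifact_threshold
```

Keeping the current mode when both signals are degraded avoids rapid oscillation between two unusable inputs, which is one plausible design choice for such a monitor.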
brain-computer interface; BCI; hybrid BCI; assistive technology; electroencephalography; EEG
Insecticide-treated bed nets (ITNs) reduce malaria transmission and are an important prevention tool. However, there are still information gaps on how the reduction in malaria transmission by ITNs affects parasite population genetic structure. This study examined the relationship between transmission reduction from ITN use and the population genetic diversity of Plasmodium falciparum in an area of high ITN coverage in western Kenya.
Parasite genetic diversity was assessed by scoring eight single-copy neutral multilocus microsatellite (MS) markers in samples collected from P. falciparum-infected children (<5 years) before the introduction of ITNs (1996, baseline, n = 69) and five years after the intervention (2001, follow-up, n = 74).
There were no significant changes in the overall high level of mixed infections or in unbiased expected heterozygosity between the baseline (%MA = 94%, He = 0.75) and follow-up (%MA = 95%, He = 0.79) years. However, locus-specific analysis detected significant differences at some individual loci between the two time points. The Pfg377 locus, a gametocyte-specific MS marker, showed a significant increase in mixed infections and He in the follow-up survey (%MA = 53%, He = 0.57) compared with baseline (%MA = 30%, He = 0.29). An opposite trend was observed at the erythrocyte binding protein (EBP) MS marker. There was moderate genetic differentiation at the Pfg377 and TAA60 loci (FST = 0.117 and 0.137, respectively) between the baseline and post-ITN parasite populations. Further analysis revealed linkage disequilibrium (LD) among the microsatellites at baseline (14 significant pair-wise tests, ISA = 0.016) that was broken in the follow-up parasite population (6 significant pairs, ISA = 0.0003). The locus-specific changes in He, the moderate population differentiation, and the break in LD between the baseline and follow-up years suggest an underlying change in population sub-structure despite the stability of overall genetic diversity and multiple-infection levels.
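For reference, the unbiased expected heterozygosity reported above is conventionally computed per locus from sample allele frequencies as He = n/(n−1) · (1 − Σ p_i²); a minimal sketch, with hypothetical allele counts:

```python
def unbiased_he(allele_counts):
    """Nei's unbiased expected heterozygosity for one locus:
    He = n/(n-1) * (1 - sum(p_i**2)), where p_i are allele frequencies
    estimated from n sampled alleles."""
    n = sum(allele_counts)
    freqs = [c / n for c in allele_counts]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))
```

For example, a locus with two alleles counted 5 and 5 in 10 sampled alleles gives He = (10/9) · (1 − 0.5) ≈ 0.556, while a monomorphic locus gives He = 0.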
The results from this study suggest that although the P. falciparum population maintained overall stability in genetic diversity after five years of high ITN coverage, there were significant locus-specific changes associated with gametocytes, marking these loci for further investigation.