Since community viral load (CVL) measurements are associated with incidence of new HIV-1 infections in a population, we hypothesized that similarly measured community drug resistance (CDR) could predict prevalence of transmitted drug resistance (TDR).
Between 2001 and 2011, we determined the prevalence of HIV-1 drug resistance in San Diego both for patients with established infection receiving HIV care (i.e. CDR) and for recently infected patients (TDR). At each position in HIV-1 reverse transcriptase (RT) and protease (pro), drug resistance was evaluated both as the overall prevalence of resistance-associated mutations and as a measure weighted by each patient's concurrent viral load as a proportion of the total viral load of the clinic (CVL); that is, the weighted measure was the proportion of the CVL contributed by patients with resistance at each residue. Spearman rank correlation coefficients were used to assess associations between CDR and TDR.
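The viral-load weighting described above can be sketched as follows. The patient records are hypothetical, and the function names are ours for illustration, not the authors' code.

```python
# Sketch of the CDR measures described above (hypothetical data).
# Each clinic patient has a viral load (VL) and a set of resistance positions.
patients = [
    {"vl": 50_000, "resistant_at": {103}},
    {"vl": 10_000, "resistant_at": {103, 184}},
    {"vl": 200_000, "resistant_at": set()},
    {"vl": 5_000, "resistant_at": {184}},
]

cvl = sum(p["vl"] for p in patients)  # community viral load

def cdr_prevalence(pos):
    """Unweighted CDR: fraction of patients resistant at a position."""
    return sum(1 for p in patients if pos in p["resistant_at"]) / len(patients)

def cdr_weighted(pos):
    """VL-weighted CDR: share of the CVL carried by patients resistant at pos."""
    return sum(p["vl"] for p in patients if pos in p["resistant_at"]) / cvl

print(cdr_prevalence(103))  # 0.5
print(cdr_weighted(103))    # 60000 / 265000 ≈ 0.2264
```

In the study design, series of such per-position values over time would then be compared with TDR prevalence via Spearman rank correlation.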
We analyzed 1,088 resistance tests from 971 clinic patients and baseline resistance tests for 542 recently infected patients. CDR at positions 30, 46, and 88 in pro was associated with TDR between 2001 and 2011. When CDR was weighted by viral load of patients, CDR was associated with TDR at position 103 in RT. Each of these associations was corroborated at least once using shorter measurement intervals.
Despite evaluation of a limited percentage of chronically infected patients in San Diego, CDR correlated with TDR at key resistance positions and therefore may be a useful tool to predict the prevalence of TDR.
HIV; Drug resistance, viral; Mutation; Antiretroviral therapy, highly active; Genotype; Viral load
Because the molecular effects of low-level radiation exposure are insufficiently understood, there is great uncertainty regarding its health risks. We report here that treatment of normal human cells with low-dose radiation induces a metabolic shift from oxidative phosphorylation to aerobic glycolysis, resulting in increased radiation resistance. This metabolic change is highlighted by upregulation of genes encoding glucose transporters and enzymes of glycolysis and the oxidative pentose phosphate pathway, concomitant with downregulation of mitochondrial genes and with corresponding changes in metabolic flux through these pathways. Mechanistically, the metabolic reprogramming depends on HIF1α, which is induced specifically by low-dose irradiation, linking the metabolic pathway to the cellular radiation dose response. Increased glucose flux and radiation resistance after low-dose irradiation are also observed systemically in mice. This highly sensitive metabolic response to low-dose radiation has important implications for understanding and assessing the health risks of radiation exposure.
We present a detailed analysis of sexual HIV transmission from one source partner to two recipients. The HLA haplotypes of the source partner and one recipient were very similar, with 7 of 8 HLA alleles from four loci (HLA A, B, C and DRB) shared, while the other recipient shared only one allele. The immunologic outcomes for the two recipients differed dramatically, despite the absence of apparent virologic differences in their inocula. We suggest that non-viral factors, which might be related to differences in the HLA profile, played a role in determining the different CD4+ T-cell dynamics of these two recipients.
human immunodeficiency virus; cellular immunity; HLA; evolution; transmission; disease progression; concordance
Adolescent idiopathic scoliosis (AIS) is a deformity of the spine, which may require surgical correction by attaching a rod to the patient's spine using screws implanted in the vertebral bodies. Surgeons achieve an intra-operative reduction of the deformity by applying compressive forces across the intervertebral disc spaces while they secure the rod to the vertebrae. We were interested in understanding how the deformity correction is influenced by increasing magnitudes of surgical corrective force, and what tissue-level stresses are predicted at the vertebral endplates as a result of the surgical correction.
Patient-specific finite element models of the osseoligamentous spine and ribcage of eight AIS patients who underwent single-rod anterior scoliosis surgery were created using pre-operative computed tomography (CT) scans. The surgically altered spine, including the titanium rod and vertebral screws, was simulated. The models were analysed using intra-operatively measured compressive force data; three load profiles, representing the mean and the upper and lower standard deviations of these data, were analysed. The clinically observed deformity correction (Cobb angle) was compared with the model-predicted correction, and the model results were investigated to better understand the influence of increased compressive forces on the biomechanics of the instrumented joints.
The predicted corrected Cobb angle for seven of the eight FE models was within the 5° clinical Cobb measurement variability for at least one of the force profiles. The largest portion of the overall correction was predicted at or near the apical intervertebral disc for all load profiles. Model predictions for four of the eight patients showed endplate-to-endplate contact occurring on adjacent endplates of one or more intervertebral disc spaces in the instrumented curve following the surgical loading steps.
This study demonstrated a direct relationship between intra-operative joint compressive forces and the degree of deformity correction achieved. The majority of the deformity correction occurs at, or at the levels adjacent to, the apex of the deformity. This study highlighted the importance of the intervertebral disc space anatomy in governing coronal plane deformity correction; the limit of this correction is reached when bone-to-bone contact of the opposing vertebral endplates occurs.
Scoliosis; Finite element; Surgical forces; Correction
The Methods of Optimal Depression Detection in Parkinson's Disease (MOOD-PD) study compared the psychometric properties of 9 depression scales to provide guidance on scale selection in Parkinson disease (PD).
Patients with PD (n = 229) from community-based neurology practices completed 6 self-report scales (Beck Depression Inventory [BDI]–II, Center for Epidemiologic Studies Depression Rating Scale–Revised [CESD-R], 30-item Geriatric Depression Scale [GDS-30], Inventory of Depressive Symptoms–Patient [IDS-SR], Patient Health Questionnaire-9 [PHQ-9], and Unified Parkinson's Disease Rating Scale [UPDRS]–Part I) and were administered 3 clinician-rated scales (17-item Hamilton Depression Rating Scale [HAM-D-17], Inventory of Depressive Symptoms–Clinician [IDS-C], and Montgomery-Åsberg Depression Rating Scale [MADRS]) and a psychiatric interview. DSM-IV-TR diagnoses were established by an expert panel blinded to the self-reported rating scale data. Receiver operating characteristic curves were used to estimate the area under the curve (AUC) of each scale.
All scales performed better than chance (AUC 0.75–0.85). Sensitivity ranged from 0.66 to 0.85 and specificity ranged from 0.60 to 0.88. The UPDRS Depression item had a smaller AUC than the BDI-II, HAM-D-17, IDS-C, and MADRS. The CESD-R also had a smaller AUC than the MADRS. The remaining AUCs were statistically similar.
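The AUC values reported above can be understood via the Mann-Whitney formulation of the area under the ROC curve: the probability that a randomly chosen case scores higher than a randomly chosen non-case. A minimal sketch, with made-up scale scores rather than study data:

```python
# AUC via the Mann-Whitney formulation (hypothetical scores, not study data).
def auc(scores_pos, scores_neg):
    """AUC = P(case score > non-case score) + 0.5 * P(tie)."""
    wins = ties = 0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1
            elif sp == sn:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# e.g. depression-scale scores for depressed vs non-depressed patients
print(auc([14, 18, 11, 22], [7, 9, 12, 5]))  # 0.9375
```

An AUC of 0.5 corresponds to chance; the 0.75–0.85 range reported above indicates all scales discriminate better than chance.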
The GDS-30 may be the most efficient depression screening scale to use in PD because of its brevity, favorable psychometric properties, and lack of copyright protection. However, all scales studied, except the UPDRS Depression item, are valid screening tools when PD-specific cutoff scores are used.
Huperzine A is a natural cholinesterase inhibitor derived from the Chinese herb Huperzia serrata that may compare favorably in symptomatic efficacy to cholinesterase inhibitors currently in use for Alzheimer disease (AD).
We assessed the safety, tolerability, and efficacy of huperzine A in mild to moderate AD in a multicenter trial in which 210 individuals were randomized to receive placebo (n = 70) or huperzine A (200 μg BID [n = 70] or 400 μg BID [n = 70]), for at least 16 weeks, with 177 subjects completing the treatment phase. The primary analysis assessed the cognitive effects of huperzine A 200 μg BID (change in Alzheimer's Disease Assessment Scale–cognitive subscale [ADAS-Cog] at week 16 at 200 μg BID compared to placebo). Secondary analyses assessed the effect of huperzine A 400 μg BID, as well as effect on other outcomes including Mini-Mental State Examination, Alzheimer's Disease Cooperative Study–Clinical Global Impression of Change scale, Alzheimer's Disease Cooperative Study Activities of Daily Living scale, and Neuropsychiatric Inventory (NPI).
Huperzine A 200 μg BID did not influence change in ADAS-Cog at 16 weeks. In secondary analyses, huperzine A 400 μg BID showed a 2.27-point improvement in ADAS-Cog at 11 weeks vs 0.29-point decline in the placebo group (p = 0.001), and a 1.92-point improvement vs 0.34-point improvement in the placebo arm (p = 0.07) at week 16. Changes in clinical global impression of change, NPI, and activities of daily living were not significant at either dose.
The primary efficacy analysis did not show cognitive benefit with huperzine A 200 μg BID.
Classification of evidence: This study provides Class III evidence that huperzine A 200 μg BID has no demonstrable cognitive effect in patients with mild to moderate AD.
The debate continues regarding the best management for women with low-grade abnormal cervical cytology attending colposcopy. We compared psychosocial outcomes of alternative management policies in these women.
In all, 989 women, aged 20–59 years, with low-grade abnormal cytology, were randomised to immediate large loop excision (LLETZ) or to two to four targeted punch biopsies taken immediately, with recall for LLETZ if these showed cervical intra-epithelial neoplasia 2/3. At 6 weeks after the last procedure, women completed the Hospital Anxiety and Depression Scale (HADS) and the Impact of Event Scale (IES). At 12, 18, 24 and 30 months post recruitment, women completed the HADS and a process outcome specific measure (POSM). The prevalence of significant depression (≥8), significant anxiety (≥11) and distress (≥9) and median POSM scores were compared between arms. Multivariate odds ratios (ORs) for immediate LLETZ vs biopsy and recall were computed.
Over the entire follow-up, there was no significant difference between arms in cumulative prevalence or risk of significant depression (OR=0.78, 95% CI 0.52–1.17) or significant anxiety (OR=0.83, 95% CI 0.57–1.19). At 6 weeks post procedure, distress did not differ significantly between arms. At later time points, 8–11% had significant depression and 14–16% had significant anxiety but with no differences between arms. The POSM scores did not differ between the arms.
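Odds ratios with 95% confidence intervals of the kind reported above are conventionally computed from a 2×2 table via the Woolf (logit) method. A minimal sketch; the counts below are hypothetical, not trial data:

```python
import math

# Odds ratio with Woolf (logit) 95% CI from a 2x2 table (hypothetical counts).
def odds_ratio_ci(a, b, c, d):
    """a, b = outcome yes/no in arm 1; c, d = outcome yes/no in arm 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Equal event rates in both arms give OR = 1 (no difference)
print(odds_ratio_ci(10, 90, 10, 90))
```

A CI spanning 1, as in both ORs reported above, indicates no statistically significant difference between arms.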
There is no difference in long- or short-term psychosocial outcomes of immediate LLETZ and punch biopsies with selective recall.
anxiety; colposcopy; depression; distress; LLETZ; punch biopsies
Worldwide, over 1 million cases of colorectal cancer (CRC) were reported in 2002, with a 50% mortality rate, making CRC the second most common cancer in adults. Certain racial/ethnic populations continue to experience a disproportionate burden of CRC. A common polymorphism in the 5,10-methylenetetrahydrofolate reductase (MTHFR) gene has been associated with a lower risk of CRC. The authors performed both a meta-analysis (29 studies; 11,936 cases, 18,714 controls) and a pooled analysis (14 studies; 5,068 cases, 7,876 controls) of the C677T MTHFR polymorphism and CRC, with stratification by racial/ethnic population and behavioral risk factors. There were few studies on different racial/ethnic populations. The overall meta-analysis odds ratio for CRC for persons with the TT genotype was 0.83 (95% confidence interval (CI): 0.77, 0.90). An inverse association was observed in whites (odds ratio = 0.83, 95% CI: 0.74, 0.94) and Asians (odds ratio = 0.80, 95% CI: 0.67, 0.96) but not in Latinos or blacks. Similar results were observed for Asians, Latinos, and blacks in the pooled analysis. The inverse association between the MTHFR 677TT polymorphism and CRC was not significantly modified by smoking status or body mass index; however, it was present in regular alcohol users only. The MTHFR 677TT polymorphism seems to be associated with a reduced risk of CRC, but this may not hold true for all populations.
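A common way to combine study-level odds ratios into a pooled estimate, as in the meta-analysis above, is inverse-variance (fixed-effect) pooling on the log scale. A sketch under that assumption (the abstract does not state the authors' exact pooling model); the two inputs reuse the subgroup estimates from the text purely as illustrative numbers:

```python
import math

# Inverse-variance (fixed-effect) pooling of odds ratios on the log scale.
def pool_odds_ratios(ors_and_cis):
    """Each entry: (OR, lower 95% CI, upper 95% CI). The SE of log(OR)
    is recovered from the CI width: half-width on the log scale = 1.96*SE."""
    num = den = 0.0
    for or_, lo, hi in ors_and_cis:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se**2                      # inverse-variance weight
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1 / den)
    ci = (math.exp(log_pooled - 1.96 * se_pooled),
          math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci

# Illustrative inputs: the white and Asian subgroup estimates from the text
pooled, ci = pool_odds_ratios([(0.83, 0.74, 0.94), (0.80, 0.67, 0.96)])
print(pooled, ci)
```

The pooled estimate lands between the two inputs, weighted toward the more precise (narrower-CI) study.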
colorectal neoplasms; epidemiologic methods; epidemiology; folic acid; genetics; genetic variation; methylenetetrahydrofolate reductase (NADPH2); MTHFR C677T
HIV enters the brain soon after infection causing neuronal damage and microglial/astrocyte dysfunction leading to neuropsychological impairment. We examined the impact of HIV on resting cerebral blood flow (rCBF) within the lenticular nuclei (LN) and visual cortex (VC).
This cross-sectional study used arterial spin labeling MRI (ASL-MRI) to measure rCBF in 33 HIV+ and 26 HIV− subjects. A nonparametric Wilcoxon rank sum test assessed rCBF differences by HIV serostatus. Classification and regression tree (CART) analysis determined optimal rCBF cutoffs for differentiating HIV serostatus. The effects of neuropsychological impairment and infection duration on rCBF were evaluated.
rCBF within the LN and VC was significantly reduced for HIV+ compared to HIV− subjects. A 2-tiered CART approach using either LN rCBF ≤50.09 mL/100 mL/min, or LN rCBF >50.09 mL/100 mL/min but VC rCBF ≤37.05 mL/100 mL/min, yielded an 88% (29/33) sensitivity and an 88% (23/26) specificity for differentiating HIV serostatus. HIV+ subjects, including the neuropsychologically unimpaired, had reduced rCBF within the LN (p = 0.02) and VC (p = 0.001) compared to HIV− controls. A temporal progression of brain involvement was observed, with LN rCBF significantly reduced for both acute/early (<1 year after seroconversion) and chronic HIV-infected subjects, whereas rCBF in the VC was diminished only for chronic HIV-infected subjects.
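The 2-tiered CART result reads as a simple decision rule. A sketch, with the thresholds taken from the abstract and the function name ours:

```python
# Two-tier CART rule from the abstract (thresholds in mL/100 mL/min).
def hiv_serostatus_rule(ln_rcbf, vc_rcbf):
    """Classify as HIV+ if lenticular-nuclei rCBF is low, or if it is
    high but visual-cortex rCBF is low; otherwise classify as HIV-."""
    if ln_rcbf <= 50.09:
        return "HIV+"
    if vc_rcbf <= 37.05:
        return "HIV+"
    return "HIV-"

print(hiv_serostatus_rule(45.0, 40.0))  # HIV+
print(hiv_serostatus_rule(55.0, 40.0))  # HIV-
```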
Resting cerebral blood flow (rCBF) using arterial spin labeling MRI has the potential to be a noninvasive neuroimaging biomarker for assessing HIV in the brain. rCBF reductions that occur soon after seroconversion possibly reflect neuronal or vascular injury among HIV+ individuals not yet expressing neuropsychological impairment.
Abbreviations: ANOVA = analysis of variance; ASL-MRI = arterial spin labeling MRI; CART = classification and regression tree; CBF = cerebral blood flow; FOV = field of view; GDS = global deficit score; HAART = highly active antiretroviral therapy; HAND = HIV-associated neurocognitive disorders; LN = lenticular nuclei; rCBF = resting cerebral blood flow; TE = echo time; TI = inversion time; TR = repetition time; VC = visual cortex.
Human immunodeficiency virus type 1 (HIV-1) can evolve rapidly owing to selection pressures exerted by HIV-specific immune responses and antiviral agents, and by the need to establish infection in different compartments of the body. Statistical models applied to HIV-1 sequence data can help to elucidate the nature of these selection pressures through comparisons of non-synonymous (amino acid changing) and synonymous (amino acid preserving) substitution rates. These models also need to take into account the non-independence of sequences due to their shared evolutionary history. We review how we have developed these methods and applied them to characterize the evolution of HIV-1 in vivo. To illustrate our methods, we present an analysis of compartment-specific evolution of HIV-1 env in blood and cerebrospinal fluid, and of site-to-site variation in the gag gene of subtype C HIV-1.
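A minimal illustration of the non-synonymous vs synonymous distinction these models rest on: classify codon differences between two aligned sequences using the standard genetic code. The sequences are toy examples; real dN/dS estimation additionally normalizes by the number of synonymous and non-synonymous sites and accounts for shared phylogeny, as the text notes.

```python
# Build the standard genetic code table, then classify codon differences
# between two aligned coding sequences (toy data, not study sequences).
BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON_TABLE = {a + b + c: AMINO[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

def classify_differences(seq1, seq2):
    """Count differing codons as synonymous (same amino acid) or
    non-synonymous (amino acid change)."""
    syn = nonsyn = 0
    for i in range(0, len(seq1), 3):
        c1, c2 = seq1[i:i + 3], seq2[i:i + 3]
        if c1 == c2:
            continue
        if CODON_TABLE[c1] == CODON_TABLE[c2]:
            syn += 1
        else:
            nonsyn += 1
    return syn, nonsyn

print(classify_differences("TTTCTTGCT", "TTCCTAGAT"))  # (2, 1)
```

An excess of non-synonymous over synonymous changes at a site is the signal such models interpret as positive (diversifying) selection.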
The world age-standardised prevalence of high-risk HPV (hrHPV) infection among 5038 UK women aged 20–59 years, with a low-grade smear during 1999–2002, assessed for eligibility for TOMBOLA (Trial Of Management of Borderline and Other Low-grade Abnormal smears) was 34.2%. High-risk HPV prevalence decreased with increasing age, from 61% at ages 20–24 years to 14–15% in those over 50 years. The age-standardised prevalence was 15.1, 30.7 and 52.7%, respectively, in women with a current normal, borderline nuclear abnormalities (BNA) and mild smear. In overall multivariate analyses, tertiary education, previous pregnancy and childbirth were associated with reduced hrHPV infection risk. Risk of infection was increased in non-white women, women not married/cohabiting, hormonal contraceptive users and current smokers. In stratified analyses, current smear status and age remained associated with hrHPV infection. Data of this type are relevant to the debate on human papillomavirus (HPV) testing in screening and development of HPV vaccination programmes.
HPV infection; lifestyle factors; cervical cancer
High levels of fatty acid synthase (FAS) expression have been observed in several cancers, including breast, prostate, colon and lung carcinoma, compared with their respective normal tissue. We present data that show high levels of FAS protein in human and rat glioma cell lines and human glioma tissue samples, as compared to normal rat astrocytes and normal human brain. Incubating glioma cells with the FAS inhibitor cerulenin decreased endogenous fatty acid synthesis by approximately 50%. Cell cycle analysis demonstrated a time- and dose-dependent increase in S-phase cell arrest following cerulenin treatment for 24 h. Further, treatment with cerulenin resulted in time- and dose-dependent decreases in glioma cell viability, as well as reduced clonogenic survival. Increased apoptotic cell death and PARP cleavage were observed in U251 and SNB-19 cells treated with cerulenin, which was independent of the death receptor pathway. Overexpressing Bcl-2 inhibited cerulenin-mediated cell death. In contrast, primary rat astrocytes appeared unaffected. Finally, RNAi-mediated knockdown of FAS leading to reduced FAS enzymatic activity was associated with decreased glioma cell viability. These findings suggest that FAS might be a novel target for antiglioma therapy.
fatty acid synthase; glioma cells; cerulenin; cell cycle arrest; apoptosis; Bcl-2
Viruses encounter changing selective pressures during transmission between hosts, including host-specific immune responses and potentially varying functional demands on specific proteins. The human immunodeficiency virus type 1 Nef protein performs several functions potentially important for successful infection, including immune escape via down-regulation of class I major histocompatibility complex (MHC-I) and direct enhancement of viral infectivity and replication. Nef is also a major target of the host cytotoxic T-lymphocyte (CTL) response. To examine the impact of changing selective pressures on Nef functions following sexual transmission, we analyzed genetic and functional changes in nef clones from six transmission events. Phylogenetic analyses indicated that the diversity of nef was similar in both sources and acutely infected recipients, the patterns of selection across transmission were variable, and regions of Nef associated with distinct functions evolved similarly in sources and recipients. These results weighed against the selection of specific Nef functions by transmission or during acute infection. Measurement of Nef function provided no evidence that the down-regulation of either CD4 or MHC-I was optimized by transmission or during acute infection, although rare nef clones from sources that were impaired in these activities were not detected in recipients. Nef-specific CTL activity was detected as early as 3 weeks after infection and appeared to be an evolutionary force driving the diversification of nef. Despite the change in selective pressure between the source and recipient immune systems and concomitant genetic diversity, the majority of Nef proteins maintained robust abilities to down-regulate MHC-I and CD4. These data suggest that both functions are important for the successful establishment of infection in a new host.
Receipt of an abnormal cervical smear result often generates fear and confusion and can have a negative impact on a woman's well-being. Most previous studies have focussed on high-grade abnormal smears. This study describes the psychological and psychosocial effects, on women, of having received a low-grade abnormal smear result. Over 3500 women recruited to TOMBOLA (Trial Of Management of Borderline and Other Low-grade Abnormal smears) participated in this study. Anxiety was assessed using the Hospital Anxiety and Depression Scale (HADS) at recruitment. Socio-demographic and lifestyle factors, locus of control and factors associated with the psychosocial impact of the abnormal smear result were also assessed. Women reported anxiety levels consistent with those found in previous studies of women with high-grade smear results. Women at highest risk of anxiety were younger, had children, were current smokers, or had the highest levels of physical activity. Interventions that focus particularly on women's understanding of smear results and pre-cancer, and/or directly address their fears about cancer, treatment and fertility might provide the greatest opportunity to reduce the adverse psychosocial impact of receiving a low-grade abnormal cervical smear result.
cervical intraepithelial neoplasia; mass screening; psychological factors; anxiety; questionnaires
Objective: To investigate the frequency of neonatal and later childhood morbidity in children exposed to antiepileptic drugs in utero.
Design: Retrospective population based study.
Setting: Population of the Grampian region of Scotland.
Participants: Mothers taking antiepileptic drugs in pregnancy between 1976 and 2000 were ascertained from hospital obstetric records and 149 (58% of those eligible) took part. They had 293 children whose health and neurodevelopment were assessed.
Main outcome measures: Frequencies of neonatal withdrawal, congenital malformations, childhood onset medical problems, developmental delay, and behaviour disorders.
Results: Neonatal withdrawal was seen in 20% of those exposed to antiepileptic drugs. Congenital malformations occurred in 14% of exposed pregnancies compared with 5% of non-exposed sibs, and developmental delay in 24% of exposed children compared with 11% of non-exposed sibs. After excluding cases with a family history of developmental delay, 19% of exposed children and 3% of non-exposed sibs had developmental delay, and 31% of exposed children had either major malformations or developmental delay. Facial dysmorphism was present in 52% of exposed children compared with 25% of those not exposed; 31% of exposed children had childhood medical problems (13% of non-exposed sibs) and 20% had behaviour disorders (5% of non-exposed).
Conclusion: Prenatal antiepileptic drug exposure in the setting of maternal epilepsy is associated with developmental delay and later childhood morbidity in addition to congenital malformation.
For common cancers, survival is poorer for deprived and outlying, rural patients. This study investigated whether there were differences in the treatment of colorectal and lung cancer in these groups. Case notes of 1314 patients in north and northeast Scotland who were diagnosed with lung or colorectal cancer in 1995 or 1996 were reviewed. On univariate analysis, the proportions of patients receiving surgery, chemotherapy and radiotherapy appeared similar in all socio-economic and rural categories. Adjusting for disease stage, age and other factors, there was less chemotherapy among deprived patients with lung cancer (odds ratio 0.39; 95% confidence interval 0.16 to 0.96) and less radiotherapy among outlying patients with colorectal cancer (0.39; 0.19 to 0.82). The time between first referral and treatment also appeared similar in all socio-economic and rural groups. Adjusting for disease stage and other variables, times to lung cancer treatment remained similar, but colorectal cancer treatment was quicker for outlying patients (adjusted hazard ratio 1.30; 95% confidence interval 1.03 to 1.64). These findings suggest that socio-economic status and rurality may have a minor impact on the modalities of treatment for colorectal and lung cancer, but do not lead to delays between referral and treatment.
British Journal of Cancer (2002) 21, 585–590. doi:10.1038/sj.bjc.6600515 www.bjcancer.com
© 2002 Cancer Research UK
Lung neoplasms; colorectal neoplasms; delivery of healthcare; socioeconomic factors; rural population; urban population
Disease patterns in nature may be determined by genetic variation for resistance or by factors, genetic or environmental, which influence the host-parasite encounter rate. Elucidating the cause of natural infection patterns has been a major pursuit of parasitologists, but it also matters for evolutionary biologists because host resistance genes must influence the expression of disease if parasite-mediated selection is to occur. We used a model system in order to disentangle the strict genetic component from other causes of infection in the wild. Using the crustacean Daphnia magna and its sterilizing bacterial parasite Pasteuria ramosa, we tested whether genetic variation for resistance, as determined under controlled conditions, accounted for the distribution of infections within natural populations. Specifically, we compared whether the clonally produced great-granddaughters of those individuals that were infected in field samples (but were subsequently 'cured' with antibiotics) were more susceptible than were the great-granddaughters of those individuals that were healthy in field samples. High doses of parasite spores led to increased infection in all four study populations, indicating the importance of encounter rate. Host genetics appeared to be irrelevant to natural infection patterns in one population. However, in three other populations hosts that were healthy in the field had greater genetic-based resistance than hosts that were infected in the field, unambiguously showing the effect of host genetic factors on the expression of disease in the wild.
There is evidence that patients living in outlying areas have poorer survival from cancer. This study set out to investigate whether they have more advanced disease at diagnosis. Case notes of 1323 patients in north and northeast Scotland who were diagnosed with lung or colorectal cancer in 1995 or 1996 were reviewed. Of patients with lung cancer, 42% (69/164) living 58 km or more from a cancer centre had disseminated disease at diagnosis compared to 33% (71/215) living within 5 km. For colorectal cancer the respective figures were 24% (38/161) and 16% (31/193). For both cancers combined, the adjusted odds ratio for disseminated disease at diagnosis in the furthest group compared to the closest group was 1.59 (P = 0.037). Of 198 patients with non-small-cell lung cancer in the closest group, 56 (28%) had limited disease (stage I or II) at diagnosis compared to 23 of 165 (14%) in the furthest group (P = 0.002). The respective figures for Dukes A and B colorectal cancer were 101 of 196 (52%) and 67 of 172 (39%) (P = 0.025). These findings suggest that patients who live remote from cities and the associated cancer centres have poorer chances of survival from lung or colorectal cancer because of more advanced disease at diagnosis. This needs to be taken into account when planning investigation and treatment services. © 2001 Cancer Research Campaign http://www.bjcancer.com
colorectal cancer; lung cancer; epidemiology; rural; urban; staging
In this survival study 63 976 patients diagnosed with one of six common cancers in Scotland were followed up. Increasing distance from a cancer centre was associated with less chance of diagnosis before death for stomach, breast and colorectal cancers and poorer survival after diagnosis for prostate and lung cancers. © 2000 Cancer Research Campaign
survival; rural; urban; cancer registry
In an attempt to ensure high quality cancer treatment for all patients in the UK, care is being centralized in specialist centres and units. For patients in outlying areas, however, access problems may adversely affect treatment. In an attempt to assess alternative methods of delivering cancer care, this paper reviews published evidence about programmes that have set out to provide oncology services in remote and rural areas in order to identify evidence of effectiveness and problems. Keyword and textword searches of on-line databases (MEDLINE, EMBASE, HEALTHSTAR and CINAHL) from 1978 to 1997 and manual searches of references were conducted. Fifteen papers reported evaluations of oncology outreach programmes, tele-oncology programmes and rural hospital initiatives. All studies were small and only two were controlled, so evidence was suggestive rather than conclusive. There were some indications that shared outreach care was safe and could make specialist care more accessible to outlying patients. Tele-oncology, by which some consultations are conducted using televideo, may be an acceptable adjunct. Larger and more methodologically robust studies are justified and should be conducted. © 1999 Cancer Research Campaign
cancer treatment; rural areas; patterns of care; systematic review
Species of Bosmina from the temperate regions of North America and Europe are diploid and reproduce by cyclical parthenogenesis. By contrast, this study provides evidence that the dominant bosminid taxon in High Arctic lakes reproduces by obligate parthenogenesis and is a polyploid derived from interspecific hybridization. Sinobosmina liederi, a species common in temperate North America, is likely to have been one parent of these hybrids, but the other parent is unknown. As neither parent was detected in the Arctic, it seems unlikely that the hybrid clones that now occupy arctic lakes were synthesized locally. Most habitats contained only one or two clones, despite a total of 38 clones in the region, suggesting that priority effects have been important in restricting diversity within single lakes. The high regional diversity of arctic bosminids could reflect either repeated hybridization between the parent taxa or the genetic instability of newly formed polyploid lineages. These processes would produce hybrid polyploids that are considerably more diverse than their sexual parent taxa, and this difference in genetic diversity may confer an advantage to the polyploid biotype. As many zooplankton taxa from the arctic possess genetic characteristics similar to those of bosminids, these processes may provide a general explanation for the widespread occurrence of polyploids in the Arctic.
Hybridization; Zooplankton; Cladocera; Allozymes; Bosmina; Parthenogenesis
BACKGROUND: Demand for consultations in primary care has risen recently, necessitating a change in working practices. As part of this process, the possible contribution of practice nurses in the telephone assessment of home visit requests merits attention. AIMS: To survey the views of our patients encountering our nurse triage system for home visit requests, set up in June 1995, and to plot its effect on the routine visiting workload of our doctors and thus their availability at the surgery. METHOD: The outcome of each request was categorized as: doctor to visit (DV), surgery consultation with doctor (SC), nurse advice given and accepted (NA), or call passed to doctor for advice (DA). Frequency data from September 1995 to December 1996 were recovered. Questionnaires for self-completion were sent to all those requesting a routine weekday house call during two four-week periods in 1995 and 1996. RESULTS: Analysable activity data revealed 1764 house call requests, with 41% DV, 18% SC, 24% NA, and 8% DA. In the first survey, 121 questionnaires were sent out and 84 returned (69% response rate) and, in the second, the corresponding figures were 113, 85, and 75%. About 80% of responders reported that they were satisfied with the help received from the nurse. CONCLUSIONS: Nurse triage of house call requests has led to more efficient care for our patients, as we have increased the availability of surgery consultations by reducing the number of house calls made by our general practitioners.
An international case-control study of primary pediatric brain tumors included interviews with mothers of cases diagnosed from 1976 to 1994 and mothers of population controls. Data are available on maternal vitamin use during pregnancy for 1051 cases and 1919 controls from eight geographic areas in North America, Europe, and Israel. Although risk estimates varied by study center, combined results suggest that maternal supplementation for two trimesters may decrease risk of brain tumor (odds ratio [OR] 0.7, 95% confidence interval [CI] 0.5-0.9), with a trend of less risk with longer duration of use (p trend = 0.0007). The greatest risk reduction was among children diagnosed under 5 years of age whose mothers used supplements during all three trimesters (OR 0.5, CI 0.3-0.8). This effect did not vary by histology and was seen for supplementation during pregnancy rather than during the month before pregnancy or while breast feeding. These findings are largely driven by data from the United States, where most mothers took vitamins. The proportion of control mothers who took vitamins during pregnancy varied tremendously: from 3% in Israel and France, 21% in Italy, 33% in Canada, 52% in Spain and 86 to 92% at the three U.S. centers. The composition of the various multivitamin compounds taken also varied: the daily dose of vitamin C ranged from 0 to 600 mg, vitamin E ranged from 0 to 70 mg, vitamin A ranged from 0 to 30,000 IU, and folate ranged from 0 to 2000 micrograms. Mothers also took individual micronutrient supplements (e.g., vitamin C tablets), but most mothers who took these also took multivitamins, making it impossible to determine potential independent effects of these micronutrients.