While numerous studies support the efficacy of methadone and buprenorphine for the stabilization and maintenance of patients with opioid dependence, clinically significant opioid withdrawal symptoms occur upon tapering and cessation of these medications.
We present a case study of a 35-year-old Caucasian female (Krissie) who was prescribed increasing dosages of prescription opioids after carpal tunnel surgery, secondary to chronic pain from reflex sympathetic dystrophy and fibromyalgia. Over the next 5 years, daily dosage requirements increased to over 80 mg of methadone and 300 µg/hr fentanyl transdermal patches, along with combinations of 12–14 Actiq (1,600 µg fentanyl) lollipops and oral morphine 100 mg and oxycodone 30 mg, 1–2 tabs q4–6hr PRN for breakthrough pain. Total monthly prescription costs, including supplemental benzodiazepines, hypnotics, and stimulants, exceeded $50,000. The patient was subsequently transferred to Suboxone® in 2008, and the dosage was gradually tapered until her admission for inpatient detoxification with KB220Z, a natural dopaminergic agonist. We carefully documented her withdrawal symptoms when she precipitously stopped taking buprenorphine/naloxone and during follow-up while taking KB220Z daily. We also genotyped the patient using a reward gene panel (9 genes, 18 alleles): DRD2, DRD3, DRD4; MAO-A; COMT; DAT1; 5-HTTLPR; OPRM1; and GABRA3.
At 432 days post-Suboxone® withdrawal, the patient is being maintained on KB220Z, has been urine tested, and is opioid free. Genotyping data revealed a moderate genetic risk for addiction, showing a hypodopaminergic trait. These preliminary case data suggest that the daily use of KB220Z could provide a cost-effective alternative substitution adjunctive modality for Suboxone®. We encourage double-blind, randomized, placebo-controlled studies to test the proposition that KB220Z may act as a putative natural opioid substitution maintenance adjunct.
Buprenorphine/naloxone; Withdrawal; Natural dopaminergic agonist
It is well established that inherited human aldehyde dehydrogenase 2 (ALDH-2) deficiency reduces the risk for alcoholism. Kudzu plants and extracts have been used for 1,000 years in traditional Chinese medicine to treat alcoholism. Kudzu contains daidzin, which inhibits ALDH-2 and suppresses heavy drinking in rodents. Decreased drinking due to ALDH-2 inhibition is attributed to the aversive properties of acetaldehyde accumulated during alcohol consumption. However, not all of the anti-alcohol properties of daidzin are due to inhibition of ALDH-2. This is in agreement with our earlier work showing significant interaction effects of both pyrazole (ALDH-2 inhibitor) and methyl-pyrazole (non-inhibitor) on ethanol’s depressant effects. Moreover, it has been suggested that selective ALDH-2 inhibitors reduce craving for alcohol by increasing dopamine in the nucleus accumbens (NAc). In addition, there is significant evidence for the role of the genetics of bitter receptors (TAS2R) and their stimulation as an aversive mechanism against alcohol intake. The inclusion of bitters such as Gentian and Tangerine Peel in Declinol provides stimulation of gut TAS2R receptors, which is potentially synergistic with the effects of Kudzu. Finally, the addition of Radix Bupleuri to the Declinol formula may have protective benefits not only against ethanol-induced liver toxicity but also through neurochemical actions involving endorphins, dopamine, and epinephrine. With this information as a rationale, we report herein that this combination significantly reduced Alcohol Use Disorders Identification Test (AUDIT) scores administered to ten heavy drinkers (M=8, F=2; 43.2 ± 14.6 years) attending a recovery program. Specifically, the pre-post comparison of AUDIT scores found that every participant’s score decreased after the intervention, with decreases ranging from 1 to 31.
The decrease in scores was statistically significant, with a p-value of 0.00298 (two-sided paired test; p-value = 0.00149 for the one-sided test). Although this was a small pilot, we are encouraged by these significant results, and we caution against any interpretation until larger controlled studies are executed.
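The pre/post comparison described above can be sketched with a simple paired t statistic. The scores below are hypothetical stand-ins (the actual patient data are not reproduced here), chosen only to mirror the reported pattern of a uniform decrease in every participant:

```python
import math

def paired_t(pre, post):
    """Return the paired t statistic and degrees of freedom for pre/post scores."""
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error of the mean difference
    return mean / se, n - 1

# Hypothetical AUDIT scores for 10 drinkers; every post score is lower,
# with decreases spanning roughly the reported 1-to-31 range.
pre  = [28, 20, 35, 15, 22, 30, 18, 25, 33, 12]
post = [10,  8, 12, 14, 10,  5, 15,  9,  2, 11]
t, df = paired_t(pre, post)
print(f"t = {t:.2f} on {df} df")
```

Note that, as in the reported results, the one-sided p-value for such a test is exactly half the two-sided value (0.00149 vs. 0.00298).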
Declinol; Kudzu; Daidzin; ALDH-2 inhibitors; Dopamine; Gentian and tangerine peel; Radix bupleuri; Alcoholism and reward deficiency
Genetically mediated physiological processes that rely on both pharmacological and nutritional principles hold great promise for the successful therapeutic targeting of reduced carbohydrate craving, body-friendly fat loss, healthy body recomposition, and overall wellness. By integrating an assembly of scientific knowledge on heritable characteristics and environmental mediators of gene expression, we review the relationship of genes, hormones, neurotransmitters, and nutrients as they relate to correcting unwanted weight gain coupled with unhappiness. In contrast to a simple one-locus, one-mechanism focus on pharmaceuticals alone, we hypothesize that nutrigenomic treatment targeting multi-physiological neurological, immunological, and metabolic pathways will enable clinicians to intercede in the process of lipogenesis by promoting lipolysis while attenuating aberrant glucose cravings. In turn, this approach will enhance wellness in a safe and predictable manner through the use of a Genetic Positioning System (GPS) Map. The GPS Map, while presently incomplete, ultimately will serve not only as a blueprint for personalized medicine in the treatment of obesity, but also for the development of strategies for reducing many harmful addictive behaviors and promoting optimal health by using substances compatible with the body’s immune system.
In accord with the new definition of addiction published by the American Society of Addiction Medicine (ASAM), it is well known that individuals who present to a treatment center for chemical dependency or other documented reward-dependence behaviors have impaired brain reward circuitry. They have hypodopaminergic function due to genetic and/or environmental negative pressures on the reward neurocircuitry. This impairment leads to aberrant craving and other behaviors, such as Substance Use Disorder (SUD). Neurogenetic research in both animals and humans has revealed a well-defined cascade in the reward site of the brain that leads to normal dopamine release. This cascade has been termed the “Brain Reward Cascade” (BRC). Any impairment of this cascade, due to either genetics or environmental influences, will result in a reduced amount of dopamine release in the brain reward site. Manipulation of the BRC has been successfully achieved with neuro-nutrient therapy utilizing nutrigenomic principles. After over four decades of development, neuro-nutrient therapy has provided important clinical benefits when appropriately utilized. This is a review, with some illustrative case histories from a number of addiction professionals, of certain molecular neurobiological mechanisms which, if ignored, may lead to clinical complications.
Neuro-nutrient therapy; Neuroadaptagen Amino-Acid Therapy™ (NAAT); Brain reward circuitry; Reward Deficiency Syndrome (RDS); Neurogenetics; Nutrigenomics; Dopamine; Reward Genes
Executive functions are processes that act in harmony to control behaviors necessary for maintaining focus and achieving outcomes. Executive dysfunction in neuropsychiatric disorders is attributed to structural or functional pathology of brain networks involving the prefrontal cortex (PFC) and its connections with other brain regions. The PFC receives innervation from different neurons associated with a number of neurotransmitters, especially dopamine (DA). Here we review findings on the contribution of PFC DA to higher-order cognitive and emotional behaviors. We suggest that examination of the multifactorial interactions of an individual’s genetic history, along with environmental risk factors, can assist in the characterization of executive functioning for that individual. Based upon the results of genetic studies, we also propose genetic mapping as a probable diagnostic tool serving as a therapeutic adjunct for augmenting executive functioning capabilities. We conclude that preservation of the neurological underpinnings of executive functions requires the integrity of complex neural systems, including the influence of specific genes and associated polymorphisms, to provide adequate neurotransmission.
Executive functions; dopamine; prefrontal cortex; genetics; Reward Deficiency Syndrome (RDS)
Background and Hypothesis
It is well known that after prolonged abstinence, individuals who imbibe or use their drug of choice experience a powerful euphoria that precipitates serious relapse. While a biological explanation for this conundrum has remained elusive, we hypothesize that this clinically observed “super sensitivity” might be tied to genetic dopaminergic polymorphisms. Another therapeutic conundrum relates to the paradoxical finding that the dopaminergic agonist bromocriptine induces stronger activation of brain reward circuitry in individuals who carry the DRD2 A1 allele compared to DRD2 A2 allele carriers. Based upon the fact that carriers of the A1 allele relative to the A2 allele of the DRD2 gene have significantly lower D2 receptor density, a reduced sensitivity to dopamine agonist activity would be expected in the former. Thus, it is perplexing that with low D2 density there is an increase in reward sensitivity with the dopamine agonist bromocriptine. Moreover, under chronic or long-term therapy, the potential proliferation of D2 receptors with bromocriptine has been shown in vitro. This seems to lead to a positive outcome and significantly better treatment compliance only in A1 carriers.
Proposal and Conclusion
We propose that low D2 receptor density and polymorphisms of the D2 gene are associated with risk for relapse of substance abuse, including alcohol dependence, heroin craving, cocaine dependence, methamphetamine abuse, nicotine sensitization, and glucose craving. With this in mind, we suggest a putative physiological mechanism that may help to explain the enhanced sensitivity following intense acute dopaminergic D2 receptor activation: “denervation supersensitivity.” Thus, the administration of dopamine D2 agonists would target D2 sensitization and attenuate relapse, especially in D2 receptor A1 allele carriers. This hypothesized mechanism is supported by clinical trials utilizing amino-acid neurotransmitter precursors and enkephalinase and catechol-O-methyl-transferase (COMT) enzyme inhibition, which have resulted in attenuated relapse rates in Reward Deficiency Syndrome (RDS) probands. Future translational research demonstrating prevention of, or reduction in, relapse in RDS will ultimately support the proposed concept, which we term “Deprivation-Amplification Relapse Therapy (DART).”
Fluorodeoxyglucose (FDG) Positron Emission Tomography (PET) brain hypometabolism (HM) correlates with diminished cognitive capacity and risk of developing dementia. However, because the clinical utility of PET is limited by cost, we sought to determine whether a less costly electrophysiological measure, the P300 evoked potential, in combination with neuropsychological test performance, would validate PET HM in neuropsychiatric patients. We found that patients with amnestic and non-amnestic cognitive impairment and HM (n = 43) evidenced significantly reduced P300 amplitudes, delayed latencies, and neuropsychological deficits compared to patients with normal brain metabolism (NM; n = 187). Data from patients with missing cognitive test scores (n = 57) were removed from the final sample, and logistic regression modeling was performed on the modified sample (n = 173, p = .000004). The logistic regression model, based on P300 and neuropsychological measures, was used to validate membership in the HM vs. NM groups. It correctly classified 13/25 HM subjects (52.0%) and 125/148 NM subjects (84.5%), for a total classification accuracy of 79.8%. Thus, abnormal P300 evoked potentials coupled with cognitive test impairment validate brain hypometabolism and mild/moderate cognitive impairment (MCI). To this end, we cautiously propose incorporating electrophysiological and neuropsychological assessments as cost-effective indicators of brain metabolism and MCI in primary care. Final interpretation of these results must await additional studies confirming these interesting findings.
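As a quick arithmetic check, the classification figures quoted above can be re-derived from the stated counts (13/25 HM and 125/148 NM subjects correctly classified, 173 total):

```python
# Counts as reported in the text.
hm_correct, hm_total = 13, 25     # hypometabolism group
nm_correct, nm_total = 125, 148   # normal-metabolism group

hm_acc = hm_correct / hm_total    # 52.0%
nm_acc = nm_correct / nm_total    # ~84.5%
total_acc = (hm_correct + nm_correct) / (hm_total + nm_total)  # ~79.8%

print(f"HM: {hm_acc:.1%}, NM: {nm_acc:.1%}, overall: {total_acc:.1%}")
```

The overall accuracy is thus the count-weighted combination of the two group accuracies, not their simple average, which is why it sits much closer to the larger NM group's figure.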
Mindful of the new evolutionary ideas related to an emerging scientific focus known as omics, we propose that spiritual, social, and political behaviors may be tied in part to heritable reward gene polymorphisms, as has been demonstrated for the addictions. If so, analyses of gene polymorphisms may assist in predicting liberalism or conservatism in partisan attachments. For example, both drinking (alcohol) and obesity seem to cluster in large social networks and are influenced by friends having the same genotype, in particular the DRD2 A1 allele. Likewise, voting, voter turnout, and attachment to a particular political ideology are differentially related to various reward genes (e.g., 5HTT, MAOA, DRD2, and DRD4), possibly predicting liberalism or conservatism. Moreover, voters’ genetic information may predict presidential outcomes better than the actual issues at hand or the presidential candidates themselves. Thus, political discussions on TV, radio, or other media may be shaped by one’s reward gene polymorphisms and, as such, may explain the prevalence of generations of die-hard Republicans and equally entrenched Democratic legacies. Indeed, even in politics, birds of a feather (homophily) flock together. We caution that our proposal should be viewed mindfully, awaiting additional research, before definitive statements or conclusions can be derived from the studies to date, and we encourage large-scale studies to confirm these earlier reports.
Liberalism; Conservatism; Politics; Friendships; Happiness; Reward Gene Polymorphisms
Obesity is a serious disease associated with an increased risk of diabetes, hypertension, heart disease, stroke, and cancer, among other diseases. The United States Centers for Disease Control and Prevention (CDC) estimates a 20% obesity rate across the 50 states, with 12 states having rates of over 30%. Currently, the body mass index (BMI) is most commonly used to determine adiposity. However, BMI is an inaccurate obesity classification method that underestimates the epidemic and contributes to failed treatment. In this study, we examine the effectiveness of precise biomarkers and dual-energy x-ray absorptiometry (DXA) to help diagnose and treat obesity.
In a cross-sectional study of adults, BMI, DXA, fasting leptin, and insulin results were measured from 1998 to 2009. Of the participants, 63% were female, 37% were male, and 75% were white, with a mean age of 51.4 (SD = 14.2). Mean BMI was 27.3 (SD = 5.9) and mean percent body fat was 31.3% (SD = 9.3). BMI characterized 26% of the subjects as obese, while DXA indicated that 64% of them were obese; 39% of the subjects were classified as non-obese by BMI but were found to be obese by DXA. BMI misclassified 25% of men and 48% of women. Meanwhile, a strong relationship was demonstrated between increased leptin and increased body fat.
Our results demonstrate the prevalence of false-negative BMIs, increased misclassifications in women of advancing age, and the reliability of gender-specific revised BMI cutoffs. BMI underestimates obesity prevalence, especially in women with high leptin levels (>30 ng/mL). Clinicians can use leptin-revised levels to enhance the accuracy of BMI estimates of percentage body fat when DXA is unavailable.
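For readers unfamiliar with the metric, BMI is weight in kilograms divided by height in meters squared, with ≥ 30 the conventional obesity cutoff. The sketch below illustrates the false-negative pattern the study reports, a subject non-obese by BMI but obese by DXA body-fat percentage; the 35% body-fat threshold used here for women is a commonly cited convention, not the study's published cutoff:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def obese_by_bmi(weight_kg, height_m, cutoff=30.0):
    """Conventional BMI obesity classification (cutoff >= 30)."""
    return bmi(weight_kg, height_m) >= cutoff

# Illustrative subject: 75 kg, 1.65 m -> BMI ~27.5, "overweight" but not obese.
subject_bmi = bmi(75.0, 1.65)
dxa_body_fat = 36.0  # hypothetical DXA body-fat %; > 35% taken as obese for women
print(subject_bmi, obese_by_bmi(75.0, 1.65), dxa_body_fat > 35.0)
```

This is exactly the discordance the study quantifies: BMI calls the subject non-obese while DXA body-fat percentage calls her obese.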
Background and Hypothesis
Although the biological underpinnings of immediate and protracted trauma-related responses are extremely complex, 40 years of research on humans and other mammals have demonstrated that trauma (particularly trauma early in the life cycle) has long-term effects on neurochemical responses to stressful events. These effects include the magnitude of the catecholamine response and the duration and extent of the cortisol response. In addition, a number of other biological systems are involved, including mesolimbic brain structures and various neurotransmitters. An understanding of the many genetic and environmental interactions contributing to stress-related responses will provide a diagnostic and treatment map, which will illuminate the vulnerability and resilience of individuals to Posttraumatic Stress Disorder (PTSD).
Proposal and Conclusions
We propose that successful treatment of PTSD will involve preliminary genetic testing for specific polymorphisms. Early detection is especially important, because early treatment can improve outcomes. When genetic testing reveals deficiencies, vulnerable individuals can be recommended for treatment with “body friendly” pharmacologic substances and/or nutrients. Results of our research suggest the following genes should be tested: serotonergic, dopaminergic (DRD2, DAT, DBH), glucocorticoid, GABAergic (GABRB), apolipoprotein systems (APOE2), brain-derived neurotrophic factor, monoamine oxidase B, CNR1, Myo6, CRF-1 and CRF-2 receptors, and neuropeptide Y (NPY). Treatments should be developed, in part, to up-regulate the expression of these genes in order to bring about a feeling of well-being as well as a reduction in the frequency and intensity of PTSD symptoms.
Post-traumatic Stress Disorder (PTSD); genes and environment; neurotransmitters; Reward Deficiency Syndrome (RDS).
The goal of this study was to determine whether impairments detected by the Test of Variables of Attention (TOVA) can accurately predict early attention complaints and memory impairments in a clinical setting. We performed a statistical analysis of outcomes in a patient population screened for attention deficit hyperactivity disorder or attention complaints, processing errors as measured by the TOVA, and Wechsler Memory Scale (WMS-III) results. Attention deficit disorder (ADD) checklists, constructed using Diagnostic and Statistical Manual of Mental Disorders, 4th Edition criteria and completed by patients at PATH Medical, revealed that 72.8% of the patients had more than one attention complaint out of a total of 16 complaints, and 41.5% had more than five complaints. For the 128 males with a significant number of ADD complaints, individuals whose scores were significantly deviant or borderline (SDB) on the TOVA had a significantly greater number of attention complaints compared with normals for omissions (P < 0.02), response time (P < 0.015), and variability (P < 0.005), but not commissions (P > 0.50). For males, the mean auditory, visual, immediate, and working memory scores as measured by the WMS-III were significantly greater for normals versus SDBs on the TOVA subtests of omission (P < 0.01) and response time (P < 0.05), but not variability or commissions. The mean auditory, visual, and immediate memory scores were significantly greater for normals versus SDBs for variability (P < 0.045) only. In females, the mean visual and working memory scores were significantly greater for normals versus SDBs for omissions (P < 0.025). The number of SDB TOVA quarters was a significant predictor of “impaired” or “normal” group membership for visual memory (P < 0.015), but not for the other three WMS-III components.
For males, the partial correlation between the number of attention complaints and the number of SDB TOVA quarters was also significant (r = 0.251, P < 0.005). For the 152 females with a significant number of attention complaints, no significant differences between SDBs and normals were observed (P > 0.15). This is the first report, to our knowledge, that provides evidence that the TOVA is an accurate predictor of early attention complaints and memory impairments in a clinical setting. This finding is more robust for males than for females between the ages of 40 and 90 years.
attention complaints; dementia; TOVA; Wechsler Memory Scale
Adult Growth Hormone Deficiency is a well-known condition affecting both males and females. It is marked by a number of neuropsychiatric, cognitive, cardiac, metabolic, muscular, and bone symptoms and clinical features. There is no standardized, widely accepted therapeutic modality for this condition. A recent meta-analysis found that after 16 years of Growth Hormone replacement therapy, a large proportion of patients still had Growth Hormone-associated symptoms, especially related to executive functioning. A major therapeutic goal is to increase plasma levels of both insulin-like growth factor-1 (IGF-1) and insulin-like growth factor binding protein 3 (IGFBP-3).
We report the case of a 45-year-old Caucasian woman with early ovarian failure for 2 years and amenorrhea since the age of 43, who presented with Adult Growth Hormone Deficiency and an insulin-like growth factor-1 (IGF-1) level of 126 ng/mL. Since her IGF-1 was lowest at 81 ng/mL, she was started on IGF-1 (Increlex) at 0.2 mg at bedtime, which raised her IGF-1 level to 130 ng/mL within 1 month, and to 193 ng/mL, 249 ng/mL, and 357 ng/mL after 3, 4, and 5 months, respectively. Her insulin-like growth factor binding protein 3 (IGFBP-3), however, continued to decrease. Only when we added back Growth Hormone and increased her Increlex dosage to 1.3–1.5 mg did her IGFBP-3 begin to increase.
It appears that in some patients with Adult Growth Hormone Deficiency, insulin-like growth factor-1 (IGF-1) elevation is resistant to direct Growth Hormone treatment. Furthermore, the binding protein may not rise with IGF-1. However, a combination of Growth Hormone and IGF-1 treatment may be a solution.
Approximately 15% (more than 2 million individuals, based on these estimates) of all people with diabetes will develop a lower-extremity ulcer during the course of the disease. Ultimately, between 14% and 20% of patients with lower-extremity diabetic ulcers will require amputation of the affected limb. Analysis of the 1995 Medicare claims revealed that lower-extremity ulcer care accounted for $1.45 billion in Medicare costs. Therapies that promote rapid and complete healing and reduce the need for expensive surgical procedures would impact these costs substantially. One such example is the electrotherapeutic modality utilizing the H-Wave® device therapy and program.
Recent acute animal experiments have shown that H-Wave® device stimulation induces a nitric oxide-dependent increase in microcirculation of the rat cremaster skeletal muscle. Moreover, chronic H-Wave® device stimulation of rat hind limbs not only increases blood flow but also induces measurable angiogenesis. Together, these findings suggest that H-Wave® device stimulation could promote rapid and complete healing without the need for expensive surgical procedures.
We decided to do a preliminary evaluation of the H-Wave® device therapy and program in three seriously afflicted diabetic patients. Patient 1 had chronic venous stasis for 6 years. Patient 2 had chronic recurrent leg ulcerations. Patient 3 had a chronic venous stasis ulcer for 2 years. All were dispensed a home H-Wave® unit. Patient 1 had no other treatment, patient 2 had H-Wave® therapy along with traditional compressive therapy, and patient 3 had no other therapy.
Patient 1’s ulcer healed completely after 3 months of treatment with the H-Wave® device and program. Patient 2 achieved complete ulcer closure by one month. Patient 3’s ulcer healed completely after 9 months.
While most diabetic ulcers can be treated successfully on an outpatient basis, a significant proportion will persist and become infected. In this preliminary case series, all three patients prescribed H-Wave® home treatment demonstrated accelerated healing with excellent results. While these results are encouraging, additional large-scale investigation is warranted before any interpretation is given to these interesting outcomes.
Although other prospective randomized controlled clinical trials of H-Wave Device Stimulation (HWDS) exist, this is the first randomized, double-blind, placebo-controlled prospective study to assess the effects of HWDS on range of motion and strength testing in patients who underwent rotator cuff reconstruction.
Twenty-two patients were randomly assigned to one of two groups: 1) H-Wave device stimulation (HWDS); 2) sham placebo device (PLACEBO). Both groups received the same postoperative dressing and the same device treatment instructions. Group I was given the HWDS device, to be used for one hour twice a day for 90 days postoperatively. Group II was given the same instructions with a placebo device. Range of motion was assessed using one-way ANOVA with a Duncan Multiple Range Test for differences between the groups preoperatively, 45 days postoperatively, and 90 days postoperatively, using an active/passive scale for five basic ranges of motion: Forward Elevation, External Rotation (arm at side), External Rotation (arm at 90 degrees abduction), Internal Rotation (arm at side), and Internal Rotation (arm at 90 degrees abduction). The study also evaluated postoperative changes in strength using Medical Research Council (MRC) grade strength testing.
Patients who received HWDS, compared to PLACEBO, demonstrated on average significantly improved range of motion. Results confirmed a significant difference for external rotation at 45 and 90 days postoperatively (active range: p = 0.007 at 45 days, p = 0.007 at 90 days). Internal rotation also demonstrated significant improvement compared to PLACEBO at 45 and 90 days postoperatively (active range: p = 0.007 at 45 days, p = 0.006 at 90 days). There was no significant difference between the two groups in strength testing.
HWDS, compared to PLACEBO, induced a significant increase in range of motion in the postoperative management of rotator cuff reconstruction, supporting previous research on HWDS and improvement in function. While this preliminary investigation suggests significant increases in range of motion after rotator cuff reconstruction, it warrants confirmation in a larger double-blinded, sham-controlled randomized study.
Numerous studies have reported that age-induced increased parathyroid hormone plasma levels are associated with cognitive decline and dementia. Little is known about the correlation that may exist between neurological processing speed, cognition and bone density in cases of hyperparathyroidism. Thus, we decided to determine if parathyroid hormone levels correlate to processing speed and/or bone density.
The recruited subjects who met the inclusion criteria (n = 92, age-matched, ages 18–90 years, mean = 58.85, SD = 15.47) were evaluated for plasma parathyroid hormone levels, which were statistically correlated with event-related P300 potentials. Groups were compared for age, bone density, and P300 latency. One-tailed tests were used to ascertain the statistical significance of the correlations. The study groups were categorized and analyzed for differences in parathyroid hormone (PTH) levels: PTH < 30 (n = 30, mean = 22.7 ± 5.6 SD) and PTH > 30 (n = 62, mean = 62.4 ± 28.3 SD, p ≤ .02).
Patients with parathyroid hormone levels <30 showed significantly shorter P300 latency (P300 = 332.7 ± 4.8 SE) relative to those with parathyroid hormone levels >30, who demonstrated longer P300 latency (P300 = 345.7 ± 3.6 SE, p = .02). Participants with parathyroid hormone values <30 (n = 26) were found to have significantly higher bone density (M = -1.25 ± .31 SE) than those with parathyroid hormone values >30 (n = 48, M = -1.85 ± .19 SE, p = .04).
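From the summary statistics quoted above (group means ± SE), an approximate two-sample statistic can be recovered; this is only a rough consistency check on the reported latency difference, not the study's actual analysis:

```python
import math

def approx_t(mean1, se1, mean2, se2):
    """Welch-style statistic computed directly from group means and standard errors."""
    return (mean1 - mean2) / math.sqrt(se1 ** 2 + se2 ** 2)

# P300 latency (ms): PTH > 30 group (345.7 +/- 3.6 SE) vs. PTH < 30 group (332.7 +/- 4.8 SE)
t = approx_t(345.7, 3.6, 332.7, 4.8)
print(f"approx t = {t:.2f}")
```

The 13 ms mean difference divided by the pooled standard error of 6 ms gives a statistic a little above 2, consistent with the reported one-tailed p = .02.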
Our findings of significantly lower bone density and prolonged P300 latency in patients with high parathyroid hormone levels suggest that increased parathyroid hormone levels coupled with prolonged P300 latency may become putative biological markers of both dementia and osteoporosis, and they warrant intensive investigation.
Aging is marked by declines in levels of many sex hormones and growth factors, as well as in cognitive function. The P300 event-related potential has been established as a predictor of cognitive decline. We decided to determine if this measure, as well as 2 standard tests of memory and attention, may be correlated with serum levels of sex hormones and growth factors, and if there are any generalizations that could be made based on these parameters and the aging process.
In this large, clinically based preliminary study, several sex-stratified associations between hormone levels and cognition were observed, including (1) for males aged 30 to 49, both IGF-1 and IGFBP-3 were significantly negatively associated with prolonged P300 latency; (2) for males aged 30 to 49, the Spearman correlation between prolonged P300 latency and low free testosterone was significant; (3) for males aged 60 to 69, there was a significant negative correlation between P300 latency and DHEA levels; (4) for females aged 50 to 59, IGFBP-3 was significantly negatively associated with prolonged P300 latency; (5) for females at all age periods, estrogen and progesterone were uncorrelated with P300 latency; and (6) for females aged 40 to 69, there was a significant negative correlation between DHEA levels and P300 latency. Moreover, there were no statistically significant correlations between any hormone and the Wechsler Memory Scale-III (WMS-III). However, in females, there was a significant positive correlation between estrogen levels and the number of Attention Deficit Disorder (ADD) complaints.
Given certain caveats, including confounding factors involving psychiatric and other chronic diseases as well as medications, the results may still have important value. If these results could be confirmed in a more rigorously controlled investigation, they may have important value in the diagnosis, prevention, and treatment of cognitive impairment and decline.
Molecular genetic studies have identified several genes that may mediate susceptibility to attention deficit hyperactivity disorder (ADHD). A consensus of the literature suggests that when there is a dysfunction in the “brain reward cascade,” especially in the dopamine system, causing a low or hypodopaminergic trait, the brain may require dopamine activation for individuals to avoid unpleasant feelings. This high-risk genetic trait leads to multiple drug-seeking behaviors, because the drugs activate the release of dopamine, which can diminish abnormal cravings. Moreover, this genetic trait is due in part to a form of a gene (the DRD2 A1 allele) that reduces the normal expression of dopamine receptors in brain reward sites. This gene, and others involved in the neurophysiological processing of specific neurotransmitters, have been associated with deficient function and predispose individuals to a high risk for addictive, impulsive, and compulsive behavioral propensities. It has been proposed that genetic variants of dopaminergic genes and other “reward genes” are important common determinants of reward deficiency syndrome (RDS), which we hypothesize includes ADHD as a behavioral subtype. We further hypothesize that early diagnosis through genetic polymorphic identification, in combination with DNA-based customized nutraceutical administration to young children, may attenuate behavioral symptoms associated with ADHD. Moreover, we conclude that dopamine and serotonin releasers might be useful therapeutic adjuncts for the treatment of other RDS behavioral subtypes, including addictions.
attention deficit hyperactivity disorder (ADHD); genes; reward dependence; reward deficiency syndrome; treatment; neuropsychological deficits
Background and hypothesis
Based on neurochemical and genetic evidence, we suggest that both prevention and treatment of multiple addictions, such as dependence on alcohol, nicotine, and glucose, should involve a biphasic approach. Thus, acute treatment should consist of preferential blocking of postsynaptic nucleus accumbens (NAc) dopamine receptors (D1-D5), whereas long-term activation of the mesolimbic dopaminergic system should involve activation and/or release of dopamine (DA) at the NAc site. Failure to do so may result in abnormal mood, abnormal behavior, and potential suicidal ideation. Individuals possessing a paucity of serotonergic and/or dopaminergic receptors, and an increased rate of synaptic DA catabolism due to a high-activity catabolic genotype of the COMT gene, are predisposed to self-medicating with any substance or behavior that activates DA release, including alcohol, opiates, psychostimulants, nicotine, gambling, sex, and even excessive internet gaming. Acute utilization of these substances and/or stimulatory behaviors induces a feeling of well-being. Unfortunately, sustained and prolonged abuse leads to a toxic "pseudo-feeling" of well-being, resulting in tolerance and disease or discomfort. Thus, a reduced number of DA receptors, due to carrying the DRD2 A1 allelic genotype, results in excessive craving behavior, whereas a normal or sufficient number of DA receptors results in low craving behavior. In terms of preventing substance abuse, one goal would be to induce a proliferation of DA D2 receptors in genetically prone individuals. While in vivo experiments using a typical D2 receptor agonist induce down-regulation, in vitro experiments have shown that constant stimulation of the DA receptor system via a known D2 agonist results in significant proliferation of D2 receptors in spite of genetic antecedents. In essence, D2 receptor stimulation signals negative feedback mechanisms in the mesolimbic system to induce mRNA expression, causing proliferation of D2 receptors.
Proposal and conclusion
The authors propose that D2 receptor stimulation can be accomplished via the use of Synaptamine™, a natural but therapeutic nutraceutical formulation that potentially induces DA release, causing the same induction of D2-directed mRNA and thus proliferation of D2 receptors in humans. This proliferation of D2 receptors will, in turn, attenuate craving behavior. In fact, as mentioned earlier, this model has been supported by research showing that DNA-directed compensatory overexpression (a form of gene therapy) of the DRD2 receptor results in a significant reduction of alcohol-craving behavior in alcohol-preferring rodents. Utilizing natural dopaminergic repletion therapy to promote long-term dopaminergic activation may ultimately provide a common, safe, and effective modality to treat Reward Deficiency Syndrome (RDS) behaviors, including Substance Use Disorders (SUD), Attention Deficit Hyperactivity Disorder (ADHD), obesity, and other reward-deficient aberrant behaviors. This concept is further supported by a more comprehensive understanding of the role of dopamine in the NAc as a "wanting" messenger in the mesolimbic DA system.
A review of the literature in both animals and humans reveals that changes in sex hormone levels have often been associated with changes in behavioral and mental abilities. Previously published research from our laboratory, and others, provides strong evidence that the P300 (latency) event-related potential (ERP), a marker of neuronal processing speed, is an accurate predictor of early memory impairment in both males and females across a wide age range. Given the vast literature on the subject, we hypothesize that growth hormone markers (insulin-like growth factor-1 (IGF-1) and insulin-like growth factor binding protein 3 (IGF-BP3)), the P300 event-related potential, and the Test of Variables of Attention (TOVA), taken together, are important neuroendocrinological predictors of early cognitive decline in a clinical setting. To support this hypothesis, we utilized structural equation modeling (SEM) parameter estimates to determine the relationship between aging and memory, as mediated by growth hormone (GH) levels (indirectly measured through the insulin-like growth factor system), P300 latency, and TOVA, the putative neurocognitive predictors tested in this study. An SEM was developed hypothesizing a causal directed path leading from age to memory, mediated by IGF-1 and IGF-BP3, P300 latency (speed), and TOVA decrements. An increase in age was accompanied by decreases in IGF-1 and IGF-BP3, an increase in P300 latency, a prolongation of TOVA response time, and a decrease in memory functioning. Moreover, independent of age, decreases in IGF-1 and IGF-BP3 were accompanied by increases in P300 latency and in TOVA response time. Finally, increases in P300 latency were accompanied by decreased memory function, both directly and indirectly through the mediation of TOVA response time.
In summary, this is the first report utilizing SEM to show that aging affects memory function negatively through the mediation of decreased IGF-1 and IGF-BP3 and increased P300 latency (delayed attention and processing speed).
Structural equation modeling (SEM); P300 latency; TOVA; IGF-1; IGF-BP3; Age and memory
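The hypothesized causal chain (age → lower IGF-1/IGF-BP3 → longer P300 latency → poorer memory) can be illustrated with a toy path-style calculation. This is not the authors' SEM: it is a minimal sketch on synthetic data, with all effect sizes and noise levels assumed, showing how simple per-link OLS slopes combine into a mediated (indirect) age-to-memory effect whose sign matches the reported direction.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Synthetic data mirroring the hypothesized causal chain (illustrative only):
# age -> (lower) IGF-1 -> (longer) P300 latency -> (worse) memory score.
age = rng.uniform(30, 80, n)
igf1 = 350 - 2.5 * age + rng.normal(0, 20, n)
p300 = 300 - 0.1 * igf1 + rng.normal(0, 8, n)
memory = 120 - 0.15 * p300 + rng.normal(0, 4, n)

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

a = slope(age, igf1)    # age -> IGF-1 (expected negative)
b = slope(igf1, p300)   # IGF-1 -> P300 latency (expected negative)
c = slope(p300, memory) # P300 latency -> memory (expected negative)

# The mediated (indirect) age-to-memory effect is the product of the
# path coefficients: two negatives and one negative give a net negative,
# i.e., aging lowers memory through this chain.
indirect = a * b * c
print(a, b, c, indirect)
```

A full SEM additionally estimates all paths simultaneously and tests model fit, but the product-of-paths logic for the indirect effect is the same.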
Opiate addiction is associated with many adverse health and social harms, including fatal overdose, infectious disease transmission, elevated health-care costs, public disorder, and crime. Although community-based addiction treatment programs continue to reduce the harms of opiate addiction with narcotic substitution therapy such as methadone maintenance, there remains a need for a substance that not only blocks opiate-type receptors (mu, delta, etc.) but also provides agonistic activity; hence the impetus for the development of combined narcotic antagonist and mu-receptor agonist therapy. After three decades of extensive research, the federal Drug Addiction Treatment Act of 2000 (DATA 2000) opened a window of opportunity for patients with addiction disorders by providing increased access to treatment options. DATA 2000 allows physicians who complete a brief specialty-training course to become certified to prescribe buprenorphine and buprenorphine/naloxone (Subutex, Suboxone) for the treatment of patients with opioid dependence. Clinical studies indicate that buprenorphine maintenance is as effective as methadone maintenance in retaining patients in substance abuse treatment and in reducing illicit opioid use. With that stated, we must consider the long-term benefits and potential toxicity attributed to Subutex or Suboxone. We describe a mechanism whereby chronic blockade of opiate receptors, in spite of only partial opiate agonist action, may ultimately block dopaminergic activity, causing anti-reward effects and relapse potential. While a direct comparison is not yet available, reports of buprenorphine toxicity can be found in the scientific literature. In offering this cautionary note, we are cognizant that to date this is what we have available, and until such time as the real magic bullet is discovered, we will have to endure.
However, more than anything else, this commentary should at least encourage the development of thoughtful new strategies to target the specific brain regions responsible for relapse prevention.
Vibrio cholerae is the causal organism of cholera epidemics, which are most prevalent in developing and underdeveloped countries; however, the incidence of cholera in developed countries is also alarming. Although several generic drugs and vaccines have been developed over time, the emergence of new drug-resistant strains means that Vibrio infections remain a global health problem that calls for the development of novel drugs and vaccines against the pathogen. Here, applying comparative proteomic and reverse vaccinology approaches to the exoproteome and secretome of the pathogen, we have identified three candidate targets (ompU, uppP and yajC) common to most pathogenic Vibrio strains. Two targets (uppP and yajC) are novel to Vibrio, and two (uppP and ompU) can be used to develop both drugs and vaccines (dual targets) against broad-spectrum Vibrio serotypes. Using our novel computational approach, we identified, from these two dual targets, three peptide vaccine candidates with high potential to induce both B- and T-cell-mediated immune responses. The two dual targets were then modeled and subjected to virtual screening against natural compounds derived from Piper betel. Seven compounds, identified for the first time from Piper betel, proved highly effective at disrupting the function of these targets, marking them as potential emerging drugs against Vibrio. Our preliminary validation suggests that the identified peptide vaccines and betel compounds are highly effective against Vibrio cholerae. We are currently conducting exhaustive validation of these targets, candidate peptide vaccines, and betel-derived lead compounds against a number of Vibrio species.