After many years of successful bariatric (weight-loss) surgeries directed at the obesity epidemic, clinicians are reporting that some patients replace compulsive overeating with newly acquired compulsive disorders such as alcoholism, gambling, and drug use, as well as other addictions such as compulsive shopping and exercise. This review explores evidence from psychiatric, genetic, animal, and human studies linking compulsive overeating to other compulsive disorders, in order to explain the phenomenon of addiction transfer. Possibly because of neurochemical similarities, overeating and obesity may act as protective factors that reduce drug reward and addictive behaviors. In animal models of addiction, withdrawal from sugar induces imbalances in the neurotransmitters acetylcholine and dopamine, similar to opiate withdrawal. Many human neuroimaging studies have supported the concept of linking food craving to drug-craving behavior. Our laboratory previously coined the term Reward Deficiency Syndrome (RDS) for common genetic determinants that predict addictive disorders and reported that the predictive value for future RDS behaviors in subjects carrying the DRD2 Taq A1 allele was 74%. While multiple genes play a role in RDS, we have also inferred that disruptions in dopamine function may predispose certain individuals to addictive behaviors and obesity. It is now known that a family history of alcoholism is a significant risk factor for obesity. We therefore hypothesize that RDS is the root cause of substituting food addiction for other dependencies, and that it potentially explains this recently described phenomenon of addiction transfer common after bariatric surgery.
Bariatric surgery; Addiction transfer; Cross tolerance; Reward Deficiency Syndrome; Dopamine; Reward genes
Vibrio cholerae is the causal organism of cholera epidemics, which are most prevalent in developing and underdeveloped countries; however, the incidence of cholera in developed countries is also alarming. Although several generic drugs and vaccines have been developed over time, the emergence of new drug-resistant strains means that Vibrio infections remain a global health problem that calls for the development of novel drugs and vaccines against the pathogen. Here, applying comparative proteomic and reverse vaccinology approaches to the exoproteome and secretome of the pathogen, we have identified three candidate targets (ompU, uppP and yajC) for most of the pathogenic Vibrio strains. Two targets (uppP and yajC) are novel to Vibrio, and two targets (uppP and ompU) can be used to develop both drugs and vaccines (dual targets) against broad-spectrum Vibrio serotypes. Using our novel computational approach, we have identified from these two dual targets three peptide vaccine candidates with high potential to induce both B- and T-cell-mediated immune responses. The two dual targets were modeled and subjected to virtual screening against natural compounds derived from Piper betel. Seven compounds from Piper betel were identified, for the first time, as highly effective inhibitors of these targets, marking them as emerging potential drugs against Vibrio. Our preliminary validation suggests that the identified peptide vaccines and betel compounds are highly effective against Vibrio cholerae. We are currently conducting exhaustive validation of these targets, candidate peptide vaccines, and betel-derived lead compounds against a number of Vibrio species.
Mindful of the new evolutionary ideas related to an emerging scientific focus known as omics, we propose that spiritual, social, and political behaviors may be tied in part to heritable reward gene polymorphisms, as has been demonstrated for the addictions. If so, analyses of gene polymorphisms may assist in predicting liberalism or conservatism in partisan attachments. For example, both drinking (alcohol) and obesity seem to cluster in large social networks and are influenced by friends having the same genotype, in particular the DRD2 A1 allele. Likewise, voting, voter turnout, and attachment to a particular political ideology are differentially related to various reward genes (e.g., 5HTT, MAOA, DRD2, and DRD4), possibly predicting liberalism or conservatism. Moreover, voters’ genetic information may predict presidential outcomes better than the actual issues at hand or the presidential candidates themselves. Thus, political discussions on TV, radio, or other media may be colored by one’s reward gene polymorphisms and, as such, may explain the prevalence of generations of die-hard Republican and equally entrenched Democratic legacies. Indeed, even in politics, birds of a feather (homophily) flock together. We caution that our proposal should be viewed mindfully, awaiting additional research, before definitive statements or conclusions can be derived from the studies to date, and we encourage large-scale studies to confirm these earlier reports.
Liberalism; Conservatism; Politics; Friendships; Happiness; Reward Gene Polymorphisms
There is a need for understanding and treating post-traumatic stress disorder (PTSD) in soldiers returning to the United States of America after combat. Likewise, it would be beneficial to find a way to reduce violence committed by soldiers, here and abroad, who are suspected of having PTSD. We hypothesize that even before combat, soldiers with a childhood background of violence (or with a familial susceptibility risk) would benefit from being genotyped for high-risk alleles. Such a process could help to identify candidates who would be less suited for combat than those without high-risk alleles. Of secondary importance is finding safe methods to treat individuals already exposed to combat and known to have PTSD. Since hypodopaminergic function in the brain’s reward circuitry due to gene polymorphisms is known to increase the risk of substance use disorders in individuals with PTSD, it might be parsimonious to administer dopaminergic agonists that affect gene expression (mRNA) to overcome this deficiency.
Post-Traumatic Stress Disorder (PTSD); Reward Deficiency Syndrome (RDS); Gene testing; Dopamine; KB220
Adverse life conditions, particularly during early life stages and infancy, can lead to epigenetic regulation of genes involved in stress response, behavioral disinhibition, and cognitive-emotional systems. Over time, the final outcome can be expressed through behaviors bedeviled by problems with impulse control, such as eating disorders, alcoholism, and indiscriminate social behavior. While many reward gene polymorphisms are involved in impulsive behaviors, a polymorphism by itself may not translate into the development of a particular behavioral disorder unless it is impacted by epigenetic effects. Brain-derived neurotrophic factor (BDNF) affects the development and integrity of the noradrenergic, dopaminergic, serotonergic, glutamatergic, and cholinergic neurotransmitter systems, and plasma levels of the neurotrophin are associated with both cognitive and aggressive impulsiveness. Epigenetic mechanisms associated with a multitude of environmental factors, including premature birth, low birth weight, prenatal tobacco exposure, non-intact family, young maternal age at birth of the target child, paternal history of antisocial behavior, and maternal depression, alter the developmental trajectories of several neuropsychiatric disorders. These mechanisms affect brain development and integrity at several levels that determine structure and function and shape the final behavioral expressions.
Epigenetics; Disinhibition; Eating disorder; Alcoholism; BDNF
Heterogeneity in attention-deficit/hyperactivity disorder (ADHD), with complex interactive operations of genetic and environmental factors, is expressed in a variety of disorder manifestations: severity, co-morbidities of symptoms, and the effects of genes on phenotypes. Neurodevelopmental influences of genomic imprinting have set the stage for the structural-physiological variations that modulate the cognitive, affective, and pathophysiological domains of ADHD. The relative contributions of genetic and environmental factors provide rapidly proliferating insights into the developmental trajectory of the condition, both structurally and functionally. Parent-of-origin effects seem to support the notion that genetic risks for the onset of the disease process often interact with the social environment, i.e., the parental environment in infants and young children. The notion of endophenotypes, markers of an underlying liability to the disorder, may facilitate detection of genetic risks relative to a complex clinical disorder. Simple genetic association has proven insufficient to explain the spectrum of ADHD. At a primary level of analysis, the epigenetic regulation of brain signalling mechanisms involving dopamine, serotonin, and noradrenaline is examined. Neurotrophic factors that participate in the neurogenesis, survival, and functional maintenance of brain systems are involved in neuroplasticity alterations underlying brain disorders and are implicated in the genetic predisposition to ADHD, though not obviously, nor in a simple or straightforward fashion. In the context of intervention, genetic linkage studies of pharmacological intervention in ADHD have demonstrated that associations fit the “drug response phenotype” rather than the disorder diagnosis.
Despite conflicting evidence for the existence of genetic associations between disorder diagnosis and the genes regulating the structure and function of neurotransmitters and brain-derived neurotrophic factor (BDNF), associations between symptom-profile endophenotypes and single nucleotide polymorphisms appear reassuring.
Epigenetic; Regulation; Domain; Parent-of-origin; Dopamine; Serotonin; Noradrenaline; Brain-derived neurotrophic factor; Intervention; Endophenotypes
Abnormal behaviors involving dopaminergic gene polymorphisms often reflect an insufficiency of usual feelings of satisfaction, or Reward Deficiency Syndrome (RDS). RDS results from a dysfunction in the “brain reward cascade,” a complex interaction among neurotransmitters (primarily dopaminergic and opioidergic). Individuals with a family history of alcoholism or other addictions may be born with a deficiency in the ability to produce or use these neurotransmitters. Exposure to prolonged periods of stress and to alcohol or other substances can also lead to a corruption of brain reward cascade function. We evaluated the potential association of four variants of dopaminergic candidate genes with RDS (dopamine D1 receptor gene [DRD1]; dopamine D2 receptor gene [DRD2]; dopamine transporter gene [DAT1]; dopamine beta-hydroxylase gene [DBH]). Methodology: We genotyped an experimental group of 55 subjects derived from up to five generations of two independent multiple-affected families and compared them with rigorously screened control subjects (e.g., N = 30 super controls for DRD2 gene polymorphisms). Data related to RDS behaviors were collected on these subjects plus 13 deceased family members. Results: Among the genotyped family members, the DRD2 Taq1 A1 and the DAT1 10/10 alleles were found significantly (p < 0.015 at minimum) more often in the RDS families than in controls. The Taq1 A1 allele occurred in 100% of Family A individuals (N = 32) and 47.8% of Family B subjects (11 of 23). No significant differences were found between the experimental and control positive rates for the other variants. Conclusions: Although our sample size was limited and linkage analysis is necessary, the results support the putative role of dopaminergic polymorphisms in RDS behaviors. This study shows the importance of a nonspecific RDS phenotype and informs an understanding of how evaluating single subset behaviors of RDS may lead to spurious results.
Utilization of a nonspecific “reward” phenotype may be a paradigm shift in future association and linkage studies involving dopaminergic polymorphisms and other neurotransmitter gene candidates.
dopamine; gene polymorphisms; generational association studies; phenotype; “super normal” controls; Reward Deficiency Syndrome (RDS)
Background and Hypothesis:
Although the biological underpinnings of immediate and protracted trauma-related responses are extremely complex, 40 years of research on humans and other mammals have demonstrated that trauma (particularly trauma early in the life cycle) has long-term effects on neurochemical responses to stressful events. These effects include the magnitude of the catecholamine response and the duration and extent of the cortisol response. In addition, a number of other biological systems are involved, including mesolimbic brain structures and various neurotransmitters. An understanding of the many genetic and environmental interactions contributing to stress-related responses will provide a diagnostic and treatment map, which will illuminate the vulnerability and resilience of individuals to Posttraumatic Stress Disorder (PTSD).
Proposal and Conclusions:
We propose that successful treatment of PTSD will involve preliminary genetic testing for specific polymorphisms. Early detection is especially important, because early treatment can improve outcomes. When genetic testing reveals deficiencies, vulnerable individuals can be recommended for treatment with “body friendly” pharmacologic substances and/or nutrients. Results of our research suggest the following systems and genes should be tested: serotonergic, dopaminergic (DRD2, DAT, DBH), glucocorticoid, GABAergic (GABRB), apolipoprotein (APOE2), brain-derived neurotrophic factor, monoamine oxidase B, CNR1, Myo6, CRF-1 and CRF-2 receptors, and neuropeptide Y (NPY). Treatments should in part be developed to up-regulate the expression of these genes, bringing about a feeling of well-being as well as a reduction in the frequency and intensity of PTSD symptoms.
Post-traumatic Stress Disorder (PTSD); genes and environment; neurotransmitters; Reward Deficiency Syndrome (RDS).
The goal of this study was to determine whether impairments detected by the test of variables of attention (TOVA) can be used to predict early attention complaints and memory impairments accurately in a clinical setting. We performed a statistical analysis of outcomes in a patient population screened for attention deficit hyperactivity disorder or attention complaints, comparing processing errors as measured by TOVA with Wechsler Memory Scale (WMS-III) results. Attention deficit disorder (ADD) checklists, constructed using Diagnostic and Statistical Manual of Mental Disorders, 4th Edition criteria and completed by patients at PATH Medical, revealed that 72.8% of the patients had more than one attention complaint out of a total of 16 complaints, and 41.5% had more than five complaints. For the 128 males with a significant number of ADD complaints, individuals whose scores were significantly deviant or borderline (SDB) on TOVA had a significantly greater number of attention complaints compared with normals for omissions (P < 0.02), response time (P < 0.015), and variability (P < 0.005), but not commissions (P > 0.50). For males, the mean auditory, visual, immediate, and working memory scores as measured by the WMS-III were significantly greater for normals versus SDBs on the TOVA subtests of omission (P < 0.01) and response time (P < 0.05), but not variability or commissions. The mean auditory, visual, and immediate memory scores were significantly greater for normals versus SDBs for variability (P < 0.045) only. In females, the mean visual and working memory scores were significantly greater for normals versus SDBs for omissions (P < 0.025). The number of SDB TOVA quarters was a significant predictor of “impaired” or “normal” group membership for visual memory (P < 0.015), but not for the other three WMS-III components.
For males, the partial correlation between the number of attention complaints and the number of SDB TOVA quarters was also significant (r = 0.251, P < 0.005). For the 152 females with a significant number of attention complaints, no significant differences between SDBs and normals were observed (P > 0.15). This is the first report, to our knowledge, which provides evidence that TOVA is an accurate predictor of early attention complaints and memory impairments in a clinical setting. This finding is more robust for males than for females between the ages of 40 and 90 years.
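For readers unfamiliar with partial correlation, the statistic reported above (complaints vs. SDB TOVA quarters, controlling for a third variable) follows a standard first-order formula. The sketch below is our illustration only, not the authors' analysis code; the control variable (age) and all data values are invented for demonstration.

```python
# Hedged sketch: first-order partial correlation r_xy.z via the standard
# formula r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2)).
# Data below are hypothetical, chosen only to exercise the computation.
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def partial_corr(x, y, z):
    """Partial correlation of x and y, controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

# Hypothetical subjects: attention complaints, SDB TOVA quarters, age
complaints = [3, 7, 5, 9, 2, 8]
sdb_quarters = [1, 3, 2, 4, 0, 3]
age = [45, 60, 52, 71, 41, 66]
print(round(partial_corr(complaints, sdb_quarters, age), 3))
```

Controlling for a covariate in this way removes its linear contribution from both variables before correlating, which is why the partial coefficient can differ markedly from the raw Pearson correlation.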
attention complaints; dementia; TOVA; Wechsler Memory Scale
Adult Growth Hormone Deficiency is a well-known condition affecting both males and females. It is marked by a number of neuropsychiatric, cognitive, cardiac, metabolic, muscular, and bone symptoms and clinical features. There is no standardized, generally accepted therapeutic modality for this condition. A recent meta-analysis found that after 16 years of Growth Hormone replacement therapy, a large proportion of patients still had Growth Hormone-associated symptoms, especially those related to executive functioning. A major therapeutic goal is to increase plasma levels of both insulin-like growth factor-1 (IGF-1) and insulin-like growth factor binding protein 3 (IGFBP-3).
We report the case of a 45-year-old Caucasian woman with a 2-year history of premature ovarian failure and amenorrhea since the age of 43, who presented with Adult Growth Hormone Deficiency and an IGF-1 of 126 ng/mL. Since her IGF-1 fell to a low of 81 ng/mL, she was started on IGF-1 (Increlex) at 0.2 mg at bedtime, which raised her IGF-1 level to 130 ng/mL within 1 month, and to 193 ng/mL, 249 ng/mL, and 357 ng/mL after 3, 4, and 5 months, respectively. Her IGFBP-3, however, continued to decrease. Only when we added back Growth Hormone and increased her Increlex dosage to 1.3 - 1.5 mg did her IGFBP-3 begin to increase.
It appears that in some patients with Adult Growth Hormone Deficiency, IGF-1 elevation is resistant to direct Growth Hormone treatment. Furthermore, the binding protein may not rise with IGF-1. A combination of Growth Hormone and IGF-1 treatment, however, may be a solution.
Approximately 15% (more than 2 million individuals, based on these estimates) of all people with diabetes will develop a lower-extremity ulcer during the course of the disease. Ultimately, between 14% and 20% of patients with lower-extremity diabetic ulcers will require amputation of the affected limb. Analysis of the 1995 Medicare claims revealed that lower-extremity ulcer care accounted for $1.45 billion in Medicare costs. Therapies that promote rapid and complete healing and reduce the need for expensive surgical procedures would impact these costs substantially. One such example is the electrotherapeutic modality utilizing the H-Wave® device therapy and program.
It has recently been shown in acute animal experiments that H-Wave® device stimulation induces a nitric oxide-dependent increase in microcirculation of the rat cremaster skeletal muscle. Moreover, chronic H-Wave® device stimulation of rat hind limbs not only increases blood flow but also induces measurable angiogenesis. Together, these findings suggest that H-Wave® device stimulation may promote rapid and complete healing without the need for expensive surgical procedures.
We conducted a preliminary evaluation of the H-Wave® device therapy and program in three seriously afflicted diabetic patients. Patient 1 had chronic venous stasis for 6 years. Patient 2 had chronic recurrent leg ulcerations. Patient 3 had a chronic venous stasis ulcer for 2 years. All were dispensed a home H-Wave® unit. Patient 1 had no other treatment, patient 2 had H-Wave® therapy along with traditional compressive therapy, and patient 3 had no other therapy.
In patient 1, the ulcer healed completely after 3 months of the H-Wave® device and program. In patient 2, complete ulcer closure occurred within one month. In patient 3, the ulcer healed completely after 9 months.
While most diabetic ulcers can be treated successfully on an outpatient basis, a significant proportion will persist and become infected. In this preliminary case series, the three patients prescribed H-Wave® home treatment demonstrated accelerated healing with excellent results. While these results are encouraging, additional large-scale investigation is warranted before any interpretation is given to these interesting outcomes.
Although other prospective randomized controlled clinical trials of H-Wave Device Stimulation (HWDS) exist, this is the first randomized, double-blind, placebo-controlled prospective study to assess the effects of HWDS on range of motion and strength in patients who underwent rotator cuff reconstruction.
Twenty-two patients were randomly assigned to one of two groups: 1) H-Wave device stimulation (HWDS); 2) sham placebo device (PLACEBO). Both groups received the same postoperative dressing and the same device treatment instructions. Group I was given the HWDS device, which they were instructed to use for one hour twice a day for 90 days postoperatively. Group II was given the same instructions with a placebo device (PLACEBO). Range of motion was assessed preoperatively, 45 days postoperatively, and 90 days postoperatively using one-way ANOVA with Duncan’s multiple range test for differences between the groups, with an active/passive scale for five basic ranges of motion: forward elevation, external rotation (arm at side), external rotation (arm at 90 degrees abduction), internal rotation (arm at side), and internal rotation (arm at 90 degrees abduction). The study also evaluated postoperative changes in strength using Medical Research Council (MRC) grading.
Patients who received HWDS demonstrated, on average, significantly improved range of motion compared with PLACEBO. Results confirmed a significant difference for external rotation: active range at 45 days postoperatively (p = 0.007) and at 90 days postoperatively (p = 0.007). Internal rotation also demonstrated significant improvement compared with PLACEBO: active range at 45 days postoperatively (p = 0.007) and at 90 days postoperatively (p = 0.006). There was no significant difference between the two groups in strength testing.
Compared with PLACEBO, HWDS induced a significant increase in range of motion in the postoperative management of rotator cuff reconstruction, supporting previous research on HWDS and improvement in function. While this preliminary investigation suggests significant increases in range of motion after rotator cuff reconstruction, the findings warrant further confirmation in a larger double-blinded, sham-controlled randomized study.
Numerous studies have reported that age-related increases in parathyroid hormone plasma levels are associated with cognitive decline and dementia. Little is known about the correlation that may exist between neurological processing speed, cognition, and bone density in cases of hyperparathyroidism. We therefore sought to determine whether parathyroid hormone levels correlate with processing speed and/or bone density.
The recruited subjects who met the inclusion criteria (n = 92, age-matched, age 18-90 years, mean = 58.85, SD = 15.47) were evaluated for plasma parathyroid hormone levels, and these levels were statistically correlated with event-related P300 potentials. Groups were compared for age, bone density, and P300 latency. One-tailed tests were used to ascertain the statistical significance of the correlations. The study groups were categorized and analyzed for differences in parathyroid hormone levels: parathyroid hormone levels <30 (n = 30, mean = 22.7 ± 5.6 SD) and PTH levels >30 (n = 62, mean = 62.4 ± 28.3 SD, p ≤ .02).
Patients with parathyroid hormone levels <30 showed significantly shorter P300 latency (P300 = 332.7 ± 4.8 SE) than those with parathyroid hormone levels >30, who demonstrated greater P300 latency (P300 = 345.7 ± 3.6 SE, p = .02). Participants with parathyroid hormone values <30 (n = 26) were found to have significantly higher bone density (M = -1.25 ± .31 SE) than those with parathyroid hormone values >30 (n = 48, M = -1.85 ± .19 SE, p = .04).
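The reported latency difference can be sanity-checked from the summary statistics alone. The sketch below is our illustration, not the authors' analysis code: it reconstructs a Welch-style t statistic for the P300 comparison directly from the published group means and standard errors.

```python
# Hedged sketch: two-sample t statistic from summary statistics.
# With per-group standard errors SE1 and SE2, the Welch-style statistic is
# t = (mean2 - mean1) / sqrt(SE1^2 + SE2^2).
import math

def t_from_summary(mean1, se1, mean2, se2):
    """Welch-style t statistic from group means and standard errors."""
    return (mean2 - mean1) / math.sqrt(se1**2 + se2**2)

# P300 latency: PTH < 30 group (332.7 +/- 4.8 SE) vs PTH > 30 (345.7 +/- 3.6 SE)
t_p300 = t_from_summary(332.7, 4.8, 345.7, 3.6)
print(round(t_p300, 2))  # ≈ 2.17: longer latency in the high-PTH group
```

A positive t here simply reflects the direction reported in the abstract (greater P300 latency in the high-PTH group); converting it to a one-tailed p value would additionally require the Welch degrees of freedom and a t distribution.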
Our findings of significantly lower bone density and prolonged P300 latency in patients with high parathyroid hormone levels suggest that increased parathyroid hormone levels coupled with prolonged P300 latency may become putative biological markers of both dementia and osteoporosis, and they warrant intensive investigation.
Aging is marked by declines in levels of many sex hormones and growth factors, as well as in cognitive function. The P300 event-related potential has been established as a predictor of cognitive decline. We decided to determine if this measure, as well as 2 standard tests of memory and attention, may be correlated with serum levels of sex hormones and growth factors, and if there are any generalizations that could be made based on these parameters and the aging process.
In this large clinically based preliminary study, several sex-stratified associations between hormone levels and cognition were observed, including: (1) for males aged 30 to 49, both IGF-1 and IGFBP-3 were significantly negatively associated with prolonged P300 latency; (2) for males aged 30 to 49, the Spearman correlation between prolonged P300 latency and low free testosterone was significant; (3) for males aged 60 to 69, there was a significant negative correlation between P300 latency and DHEA levels; (4) for females aged 50 to 59, IGFBP-3 was significantly negatively associated with prolonged P300 latency; (5) for females at all age periods, estrogen and progesterone were uncorrelated with P300 latency; and (6) for females aged 40 to 69, there was a significant negative correlation between DHEA levels and P300 latency. Moreover, there were no statistically significant correlations between any hormone and the Wechsler Memory Scale-III (WMS-III). However, in females, there was a significant positive correlation between estrogen levels and the number of Attention Deficit Disorder (ADD) complaints.
Given certain caveats, including confounding factors involving psychiatric and other chronic diseases as well as medications, the results may still be valuable. If confirmed in a more rigorously controlled investigation, they may have important value in the diagnosis, prevention, and treatment of cognitive impairment and decline.
Molecular genetic studies have identified several genes that may mediate susceptibility to attention deficit hyperactivity disorder (ADHD). A consensus of the literature suggests that when there is a dysfunction in the “brain reward cascade,” especially in the dopamine system, causing a low or hypodopaminergic trait, the brain may require dopamine release for individuals to avoid unpleasant feelings. This high-risk genetic trait leads to multiple drug-seeking behaviors, because the drugs activate the release of dopamine, which can diminish abnormal cravings. Moreover, this genetic trait is due in part to a form of a gene (the DRD2 A1 allele) that prevents the normal laying down of dopamine receptors in brain reward sites. This gene, and others involved in the neurophysiological processing of specific neurotransmitters, have been associated with deficient function and predispose individuals to a high risk for addictive, impulsive, and compulsive behavioral propensities. It has been proposed that genetic variants of dopaminergic genes and other “reward genes” are important common determinants of reward deficiency syndrome (RDS), which we hypothesize includes ADHD as a behavioral subtype. We further hypothesize that early diagnosis through genetic polymorphic identification, in combination with DNA-based customized nutraceutical administration to young children, may attenuate behavioral symptoms associated with ADHD. Moreover, we conclude that dopamine and serotonin releasers might be useful therapeutic adjuncts for the treatment of other RDS behavioral subtypes, including addictions.
attention deficit hyperactivity disorder (ADHD); genes; reward dependence; reward deficiency syndrome; treatment; neuropsychological deficits
Endeavors to manage obesity have relied heavily on controlling the equilibrium between energy intake and expenditure, but have failed to curtail the overweight and obesity epidemic. This dynamic equilibrium is more complex than originally postulated and is influenced by lifestyle, calorie and nutrient intake, reward cravings and satiation, energy metabolism, stress-response capabilities, immune metabolism, and genetics. Fat metabolism is an important indicator of how efficiently and to what extent these factors are integrated. We investigated whether an Irvingia gabonensis seed extract (IGOB131) would provide a more beneficial, comprehensive approach influencing multiple mechanisms, specifically the expression of the PPAR gamma, leptin, and adiponectin genes, which are important in anti-obesity strategies.
Using murine 3T3-L1 adipocytes as a model for adipose cell biology research, the effects of IGOB131 on PPAR gamma, adiponectin, and leptin were investigated. The adipocytes were harvested 8 days after the initiation of differentiation and treated with 0 to 250 µM of IGOB131 for 12 and 24 h at 37°C in a humidified 5% CO2 incubator. The relative expression of PPAR gamma, adiponectin, and leptin in 3T3-L1 adipocytes was quantified densitometrically using LabWorks 4.5 software and normalized to beta-actin reference bands.
IGOB131 significantly inhibited adipogenesis in adipocytes. The effect appears to be mediated through down-regulated expression of the adipogenic transcription factor PPAR gamma (p < 0.05) and the adipocyte-specific protein leptin (p < 0.05), and through up-regulated expression of adiponectin (p < 0.05).
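The densitometric normalization described above reduces to simple arithmetic: each target band is divided by its beta-actin reference band, and treated samples are then expressed as fold change relative to the untreated control. The sketch below is our illustration only; the band intensities are invented values chosen to mimic the reported direction of effects (leptin down, adiponectin up).

```python
# Hedged sketch of densitometric normalization to a beta-actin reference.
# All intensity values are hypothetical, in arbitrary densitometry units.

def relative_expression(band, actin):
    """Normalize a target band intensity to its beta-actin reference band."""
    return band / actin

def fold_change(treated, treated_actin, control, control_actin):
    """Fold change of normalized expression, treated vs. untreated control."""
    return (relative_expression(treated, treated_actin)
            / relative_expression(control, control_actin))

# Illustrative values mimicking the reported directions of effect
leptin_fc = fold_change(treated=40.0, treated_actin=100.0,
                        control=80.0, control_actin=100.0)
adipo_fc = fold_change(treated=90.0, treated_actin=100.0,
                       control=60.0, control_actin=100.0)
print(leptin_fc, adipo_fc)  # < 1 means down-regulated, > 1 up-regulated
```

Normalizing to a housekeeping protein such as beta-actin corrects for unequal protein loading across lanes before treatment groups are compared.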
IGOB131 may play an important multifaceted role in the control of adipogenesis and may have further implications for in vivo anti-obesity effects by targeting the PPAR gamma gene, a known contributory factor to obesity in humans.
Background and hypothesis
Based on neurochemical and genetic evidence, we suggest that both prevention and treatment of multiple addictions, such as dependence on alcohol, nicotine, and glucose, should involve a biphasic approach. Acute treatment should consist of preferential blocking of postsynaptic Nucleus Accumbens (NAc) dopamine receptors (D1-D5), whereas long-term activation of the mesolimbic dopaminergic system should involve activation and/or release of Dopamine (DA) at the NAc site. Failure to do so will result in abnormal mood, abnormal behavior, and potential suicidal ideation. Individuals possessing a paucity of serotonergic and/or dopaminergic receptors and an increased rate of synaptic DA catabolism, due to a high-catabolic genotype of the COMT gene, are predisposed to self-medicating with any substance or behavior that will activate DA release, including alcohol, opiates, psychostimulants, nicotine, gambling, sex, and even excessive internet gaming. Acute utilization of these substances and/or stimulatory behaviors induces a feeling of well-being. Unfortunately, sustained and prolonged abuse leads to a toxic “pseudo-feeling” of well-being, resulting in tolerance and disease or discomfort. Thus, a reduced number of DA receptors, due to carrying the DRD2 A1 allelic genotype, results in excessive craving behavior, whereas a normal or sufficient number of DA receptors results in low craving behavior. In terms of preventing substance abuse, one goal would be to induce a proliferation of DA D2 receptors in genetically prone individuals. While in vivo experiments using a typical D2 receptor agonist induce down-regulation, in vitro experiments have shown that constant stimulation of the DA receptor system via a known D2 agonist results in significant proliferation of D2 receptors in spite of genetic antecedents. In essence, D2 receptor stimulation signals negative feedback mechanisms in the mesolimbic system to induce mRNA expression, causing proliferation of D2 receptors.
Proposal and conclusion
The authors propose that D2 receptor stimulation can be accomplished via the use of Synaptamine™, a natural but therapeutic nutraceutical formulation that potentially induces DA release, triggering the same induction of D2-directed mRNA and thus proliferation of D2 receptors in humans. This proliferation of D2 receptors will, in turn, attenuate craving behavior. In fact, as mentioned earlier, this model is supported by research showing that DNA-directed compensatory overexpression (a form of gene therapy) of DRD2 receptors results in a significant reduction in alcohol-craving behavior in alcohol-preferring rodents. Utilizing natural dopaminergic repletion therapy to promote long-term dopaminergic activation could ultimately provide a common, safe and effective modality to treat Reward Deficiency Syndrome (RDS) behaviors, including Substance Use Disorders (SUD), Attention Deficit Hyperactivity Disorder (ADHD), obesity and other reward-deficient aberrant behaviors. This concept is further supported by a more comprehensive understanding of the role of dopamine in the NAc as a "wanting" messenger in the mesolimbic DA system.
A review of the literature in both animals and humans reveals that changes in sex hormones have often been associated with changes in behavioral and mental abilities. Previously published research from our laboratory and others provides strong evidence that P300 latency, an event-related potential (ERP) marker of neuronal processing speed, is an accurate predictor of early memory impairment in both males and females across a wide age range. Given the vast literature on the subject, our hypothesis is that growth hormone markers (insulin-like growth factor-1 [IGF-1] and insulin-like growth factor binding protein 3 [IGF-BP3]), the P300 event-related potential and the Test of Variables of Attention (TOVA), taken together, are important neuroendocrinological predictors of early cognitive decline in a clinical setting. To support this hypothesis, we utilized structural equation modeling (SEM) parameter estimates to determine the relationship between aging and memory, as mediated by growth hormone (GH) levels (indirectly measured through the insulin-like growth factor system), P300 latency and TOVA, the putative neurocognitive predictors tested in this study. An SEM was developed hypothesizing a causal directed path leading from age to memory, mediated by IGF-1 and IGF-BP3, P300 latency (speed), and TOVA decrements. An increase in age was accompanied by a decrease in IGF-1 and IGF-BP3, an increase in P300 latency, a prolongation of TOVA response time, and a decrease in memory functioning. Moreover, independent of age, decreases in IGF-1 and IGF-BP3 were accompanied by increases in P300 latency and in TOVA response time. Finally, increases in P300 latency were accompanied by decreased memory function, both directly and indirectly through mediation of TOVA response time.
In summary, this is the first report utilizing SEM to show that aging negatively affects memory function through the mediation of decreased IGF-1 and IGF-BP3 and increased P300 latency (delayed attention and processing speed).
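The mediated path described above (age → IGF-1 decline → P300 latency increase → memory decline) can be sketched with simulated data. This is a minimal illustration, not the study's analysis: the coefficients and noise levels are invented assumptions, and simple ordinary-least-squares slopes stand in for full SEM estimation of the path coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Illustrative (assumed) data-generating process mirroring the hypothesized path
age = rng.uniform(20, 80, n)
igf1 = 300 - 2.0 * age + rng.normal(0, 20, n)      # IGF-1 declines with age
p300 = 280 - 0.15 * igf1 + rng.normal(0, 10, n)    # latency rises as IGF-1 falls
memory = 120 - 0.25 * p300 + rng.normal(0, 5, n)   # memory falls as latency rises

def slope(x, y):
    """OLS slope of y regressed on x."""
    return np.polyfit(x, y, 1)[0]

# Estimated path coefficients along the chain
b_age_igf = slope(age, igf1)     # expected negative: age lowers IGF-1
b_igf_p300 = slope(igf1, p300)   # expected negative: lower IGF-1, longer latency
b_p300_mem = slope(p300, memory) # expected negative: longer latency, worse memory

# Indirect (mediated) effect of age on memory is the product of the path slopes
indirect = b_age_igf * b_igf_p300 * b_p300_mem
print(b_age_igf, b_igf_p300, b_p300_mem, indirect)
```

The product of the three slopes recovers the sign of the mediated effect: two negative links (age → IGF-1, IGF-1 → P300) and one negative link (P300 → memory) compose to a net negative effect of age on memory, consistent with the abstract's finding.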
Structural equation modeling (SEM); P300 latency; TOVA; IGF-1; IGF-BP3; Age and memory
Attention Deficit Hyperactivity Disorder, commonly referred to as ADHD, is a common, complex, predominantly genetic but highly treatable disorder which, in its more severe form, has such a profound effect on brain function that every aspect of the life of an affected individual may be permanently compromised. Despite the broad base of scientific investigation over the past 50 years supporting this statement, there are still many misconceptions about ADHD. These include beliefs that the disorder does not exist, that all children have symptoms of ADHD, that if it does exist it is grossly over-diagnosed and over-treated, and that the treatment is dangerous and leads to a propensity to drug addiction. Since most misconceptions contain elements of truth, where does the reality lie?
We have reviewed the literature to evaluate some of the claims and counter-claims. The evidence suggests that ADHD is primarily a polygenic disorder involving at least 50 genes, including those encoding enzymes of neurotransmitter metabolism, neurotransmitter transporters and receptors. Because of its polygenic nature, ADHD is often accompanied by other behavioral abnormalities. It is present in adults as well as children, but in itself it does not necessarily impair function in adult life; associated disorders, however, may do so. A range of treatment options is reviewed and the mechanisms responsible for the efficacy of standard drug treatments are considered.
The genes so far implicated in ADHD account for only part of the total picture. Identification of the remaining genes and characterization of their interactions is likely to establish ADHD firmly as a biological disorder and to lead to better methods of diagnosis and treatment.
ADHD; attention; hyperactivity; inattention; genetics; aberrant behavioral co-morbidity; treatment; genomics