Epileptiform abnormalities often occur at specific times of day or night, possibly attributable to state of consciousness (sleep vs. wake) and/or influences from the endogenous circadian pacemaker. In this pilot study we tested for the existence of circadian variation in interictal epileptiform discharges (IED), independent of changes in state, environment, or behavior. Five patients with generalized epilepsy underwent a protocol whereby their sleep/wake schedule was evenly distributed across the circadian cycle while undergoing full-montage electroencephalography and hourly plasma melatonin measurements. Light was <8 lux to prevent circadian entrainment. All patients completed the protocol, demonstrating its feasibility. All patients had normal circadian rhythmicity of plasma melatonin relative to their habitual sleep times. In the three patients with sufficient IED to assess variability, most IED occurred during NREM sleep (NREM:wake ratio = 14:1; P<0.001). In both patients who had NREM sleep at all circadian phases, there was apparent circadian variation in IED, but with different phases relative to peak melatonin.
circadian; epilepsy; idiopathic generalized epilepsy; sleep; nocturnal; interictal discharges; sleep/wake distribution
We assessed whether home blood pressure monitoring improved the prediction of progression of albuminuria when added to office measurements, and compared it with ambulatory blood pressure monitoring, in a multiethnic cohort of older people (n=392) with diabetes mellitus, without macroalbuminuria, participating in the telemedicine arm of the Informatics for Diabetes Education and Telemedicine (IDEATel) study. Albuminuria was assessed by measuring the spot urine albumin-to-creatinine ratio at baseline and annually for three years. The ambulatory sleep/wake systolic blood pressure ratio was categorized as dipping (ratio ≤0.9), non-dipping (0.9< ratio ≤1), and nocturnal rise (ratio >1). In a repeated-measures mixed linear model, after adjustment that included office pulse pressure, home pulse pressure was independently associated with higher follow-up albumin-to-creatinine ratio (p=0.001). That association persisted (p=0.01) after adjusting for 24-hour pulse pressure and nocturnal rise, which were also independent predictors (p=0.02 and p=0.03, respectively). Cox proportional hazards models examined progression of albuminuria (n=74) as defined by cutoff values used by clinicians. After adjustment for office pulse pressure, the hazard ratio (95% CI) per 10 mmHg increment of home pulse pressure was 1.34 (1.1–1.7), p=0.01. Home pulse pressure was not an independent predictor in the model including ambulatory monitoring data—a nocturnal rise was the only independent predictor (p=0.035). However, Cox models built separately for home pulse pressure and ambulatory monitoring exhibited similar calibration and discrimination. In conclusion, home blood pressure adds to office measurements and may substitute for ambulatory monitoring to predict worsening of albuminuria in elderly people with diabetes.
Albuminuria; Diabetes mellitus; Home Blood Pressure; Ambulatory Blood Pressure
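The dipping categories used in the abstract above reduce to a simple threshold rule on the sleep/wake systolic blood pressure ratio. A minimal sketch of that rule (the function name and the example values are illustrative, not taken from the study):

```python
def classify_dipping(sleep_sbp, wake_sbp):
    """Categorize the ambulatory sleep/wake systolic BP ratio:
    dipping (ratio <= 0.9), non-dipping (0.9 < ratio <= 1),
    or nocturnal rise (ratio > 1)."""
    ratio = sleep_sbp / wake_sbp
    if ratio <= 0.9:
        return "dipping"
    elif ratio <= 1:
        return "non-dipping"
    return "nocturnal rise"

# A mean sleep SBP of 108 mmHg against a mean wake SBP of 130 mmHg
# gives a ratio of ~0.83, i.e., a normal dipping pattern.
print(classify_dipping(108, 130))  # dipping
```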
Background and Purpose
Small vessel disease contributes to the pathophysiology of stroke, and retinal microvascular signs have been linked to risk of stroke. We examined the relationship of retinal signs with incident stroke in a multi-ethnic cohort.
The Multi-Ethnic Study of Atherosclerosis (MESA) is a prospective cohort study that enrolled participants without clinical cardiovascular disease from six United States communities in 2000–02. Of the participants, 4,849 (71.2%) had fundus photography performed in 2002–04. Retinopathy and retinal vessel caliber were assessed from retinal images. Stroke risk factors, including high-sensitivity C-reactive protein (hsCRP), carotid artery intima-media thickness (IMT), and coronary artery calcium (CAC), were measured using standardized protocols. Incident stroke was confirmed by medical record review and death certificates.
After 6 years of follow-up, there were 62 incident strokes. Narrower retinal arteriolar caliber was associated with increased risk of stroke after adjusting for conventional cardiovascular risk factors (adjusted incidence rate ratio [IRR] 2.83, 95% confidence interval [CI] 1.34–5.95, p=0.006; adjusted hazard ratio [HR] 3.01, 95% CI 1.29–6.99, p=0.011). Retinopathy in persons without diabetes was associated with increased risk of stroke (adjusted IRR 2.96, 95% CI 1.50–5.84, p=0.002; adjusted HR 3.07, 95%CI 1.17–8.09, p=0.023). These associations remained significant after adjusting for hsCRP, carotid IMT or CAC.
Narrower retinal arteriolar caliber and retinopathy in non-diabetic persons were associated with increased risk of stroke in this relatively healthy multi-ethnic cohort independent of traditional risk factors and measures of atherosclerosis. The association between narrower retinal arteriolar caliber and stroke warrants further investigation.
Stroke; Retinal microvascular signs; Retinopathy; Retinal vessel caliber
Suboptimal bowel preparation, present in over 20% of colonoscopies, can severely compromise the effectiveness of the colonoscopy procedure. We surveyed 93 primarily urban minority men and women who underwent asymptomatic ‘screening’ colonoscopy regarding their precolonoscopy bowel-preparation experience.
Print materials alone (39.8%) and in-person verbal instructions alone (35.5%) were reportedly the most common modes of instruction from the gastroenterologists. Liquid-containing laxative (70.6%) was the most common laxative agent; a clear liquid diet (69.6%) the most common dietary restriction. Almost half of the participants mentioned ‘getting the laxative down’ as one of the hardest parts of the preparation; 40.9% mentioned dietary restrictions. The 24.7% who mentioned ‘understanding the instructions’ as one of the hardest parts were more likely to be non-US born and to have lower education and income. There was no relationship between difficulty in understanding instructions and mode of instruction or preparation protocol. One quarter suggested that a smaller volume and/or more palatable liquid would have made the preparation easier. Three quarters agreed that it would have been helpful to have someone to guide them through the preparation process.
These findings suggest a variety of opportunities for both physician- and patient-directed educational interventions to promote higher rates of optimal colonoscopy bowel preparation.
Colorectal cancer screening; colonoscopy; bowel preparation
Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer from substantial artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although this approach is effective, it may no longer correctly identify and reject corrupted data when multiple data points are corrupted. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the artifact-detection performance of algorithms using cISID with that of other existing methods. The experimental results show that integrating cISID into fitting-based methods significantly improves retrospective detection performance in post-processing analysis. The performance of the cISID criterion used alone was inferior to that of the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies.
To assess the association between serum adiponectin level and all-cause mortality in people with type 2 diabetes. Because of the insulin-sensitizing, anti-inflammatory, and antiatherogenic effects of adiponectin, we hypothesized that higher adiponectin level would be associated with lower all-cause mortality.
RESEARCH DESIGN AND METHODS
A total of 609 men and women aged 72 ± 6.3 years with type 2 diabetes and information on total and high molecular weight adiponectin were followed for a median of 5 years. The longitudinal association between adiponectin and all-cause mortality was analyzed with Cox proportional hazards models with time from adiponectin measurement to death as the time-to-event variable. Analyses were adjusted for demographic variables and significant diabetes parameters, significant cardiovascular parameters, and significant diabetes medications.
Total and high molecular weight adiponectin were highly correlated. The highest adiponectin quartile was strongly associated with higher all-cause mortality compared with the lowest quartile (hazard ratio = 4.0 [95% CI: 1.7–9.2]) in the fully adjusted model. These results did not change in analyses stratified by sex and thiazolidinedione use, after exclusion of people who died within one year of adiponectin measurement, or when change in weight before adiponectin measurement was considered.
Contrary to our hypothesis, higher adiponectin level was related to higher all-cause mortality. This association was not explained by confounding by other characteristics, including medications or preceding weight loss.
Despite the extended overnight fast, paradoxically, people are typically not ravenous in the morning and breakfast is typically the smallest meal of the day. Here we assessed whether this paradox could be explained by an endogenous circadian influence on appetite with a morning trough, while controlling for sleep/wake and fasting/feeding effects.
Design and Methods
We studied 12 healthy non-obese adults (6 male; ages 20–42 years) throughout a 13-day laboratory protocol that balanced all behaviors, including eucaloric meals and sleep periods, evenly across the endogenous circadian cycle. Participants rated their appetite and food preferences on visual analog scales.
There was a large endogenous circadian rhythm in hunger, with the trough in the biological morning (8 AM) and peak in the biological evening (8 PM; peak-to-trough amplitude=17%; P=0.004). Similarly phased significant endogenous circadian rhythms were present in appetite for sweets, salty and starchy foods, fruits, meats/poultry, food overall, and for estimates of how much food participants could eat (amplitudes 14–25%; all P < 0.05).
In people who sleep at night, the intrinsic circadian evening peak in appetite may promote larger meals before the fasting period necessitated by sleep. Furthermore, the circadian decline in hunger across the night would theoretically counteract the fasting-induced hunger increase that could otherwise disrupt sleep.
Appetite; Circadian; Energy Balance; Hunger; Metabolism
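Peak-to-trough amplitudes and peak phases like those reported above are commonly estimated by cosinor analysis, i.e., least-squares fitting of a 24-h cosine to phase-binned data. A minimal sketch with numpy (the function and the synthetic ratings are illustrative, not the study's actual analysis code):

```python
import numpy as np

def cosinor(phase_hours, values, period=24.0):
    """Fit values = mesor + a*cos(wt) + b*sin(wt) by least squares.
    Returns (mesor, amplitude, acrophase in hours);
    the peak-to-trough range is 2 * amplitude."""
    w = 2 * np.pi * np.asarray(phase_hours, float) / period
    X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
    mesor, a, b = np.linalg.lstsq(X, np.asarray(values, float), rcond=None)[0]
    amplitude = float(np.hypot(a, b))
    acrophase = float(np.arctan2(b, a) * period / (2 * np.pi)) % period
    return float(mesor), amplitude, acrophase

# Synthetic hunger ratings (0-100 scale) peaking at biological hour 20 (~8 PM)
t = np.arange(0, 24, 2)
y = 50 + 8.5 * np.cos(2 * np.pi * (t - 20) / 24)
mesor, amp, acro = cosinor(t, y)  # ~50, ~8.5, ~20
```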
Human motor activity has a robust, intrinsic fractal structure with similar patterns from minutes to hours. The fractal activity patterns appear to be physiologically important because the patterns persist under different environmental conditions but are significantly altered/reduced with aging and Alzheimer's disease (AD). Here, we report that dementia patients, known to have disrupted circadian rhythmicity, also have disrupted fractal activity patterns and that the disruption is more pronounced in patients with more amyloid plaques (a marker of AD severity). Moreover, the degree of fractal activity disruption is strongly associated with vasopressinergic and neurotensinergic neurons (two major circadian neurotransmitters) in postmortem suprachiasmatic nucleus (SCN), and can better predict changes of the two neurotransmitters than traditional circadian measures. These findings suggest that the SCN impacts human activity regulation at multiple time scales and that disrupted fractal activity may serve as a non-invasive biomarker of SCN neurodegeneration in dementia.
Debate surrounds the nature of gender differences in rates of posttraumatic stress disorder (PTSD).
The goal of this study was to quantify and explore the reasons for gender differences in rates of PTSD in low income, primary care patients after the World Trade Center (WTC) attack of September 11, 2001.
A survey was conducted at a large primary care practice in New York City 7 to 16 months after the WTC attack. The study involved a systematic sample of primary care patients aged 18 to 70 years. The main outcome measures were the Life Events Checklist, the Posttraumatic Stress Disorder Checklist–Civilian Version, and the Primary Care Evaluation of Mental Disorders Patient Health Questionnaire, all administered by a bilingual research staff.
A total of 3807 patients were approached at the primary care clinic. Of the 1347 who met eligibility criteria, 1157 (85.9%) consented to participate. After the addition of the WTC/PTSD supplement to the study, the total number of patients was 992, of whom 982 (99.0%) completed the survey. Both sexes had high rates of direct exposure to the WTC attack and high rates of lifetime exposure to stressful life events. Overall, females had lower rates of exposure to the attack compared with males (P < 0.05). Hispanic females had the highest rate of PTSD in the full sample. Gender differences in rates of PTSD were largely accounted for by differences in marital status and education. The rate of current major depressive disorder (MDD) was higher in females than in males (P < 0.001), and the reverse was true for substance abuse (P < 0.001). Gender differences for MDD and substance abuse persisted even after adjustments for demographic differences between the sexes.
The increased rate of PTSD in women attending a primary care clinic was mediated by their social and economic circumstances, such as living alone without a permanent relationship and with little education or income. The increased rate of MDD in women appeared to be less dependent on these circumstances. These findings have implications for the treatment of women with PTSD in primary care and for research on gender differences in rates of psychiatric disorders.
posttraumatic stress disorder; gender; World Trade Center; primary care
Epidemiological studies link short sleep and circadian disruption with risk of metabolic syndrome and diabetes. We tested the hypothesis that prolonged sleep restriction with concurrent circadian disruption, as can occur with shift work, impairs glucose regulation and metabolism. Healthy adults spent >5 weeks in controlled laboratory conditions including: sleep extension (baseline), 3-week sleep restriction (5.6 h sleep/24 h) combined with circadian disruption (recurring 28-h ‘days’), and 9-day recovery sleep with circadian re-entrainment. Prolonged sleep restriction with concurrent circadian disruption significantly decreased resting metabolic rate and increased postprandial plasma glucose via inadequate pancreatic beta cell responsivity; these normalized with 9 days of recovery sleep and stable circadian re-entrainment. Thus, in humans, prolonged sleep restriction with concurrent circadian disruption alters metabolism and could increase the risk of obesity and diabetes.
Screening for psychiatric disorders has gained acceptance in some general medical settings, but critics argue about its value. The purpose of this study was to determine the clinical utility of screening by conducting a long-term follow-up of patients who screened positive for psychiatric disorders but who were initially not in treatment.
A cohort of 519 low-income, adult primary care patients was screened for major depression and bipolar, anxiety, and substance use disorders and reassessed with the Structured Clinical Interview for DSM-IV after a mean of 3.7 years by a clinician blind to the initial screen. Data on treatment utilization were obtained through hospital records. The sample consisted of 348 patients who had not received psychiatric care in the year before screening.
Among 39 patients who screened positive for major depression, 62% (95% confidence interval=45.5%–77.6%) met criteria for current major depressive disorder at follow-up. Those who screened positive reported significantly poorer mental and social functioning and worse general health at follow-up than the screen-negative patients and were more likely to have visited the emergency department for psychiatric reasons (12.1% vs. 3.0%; odds ratio [OR]=6.4) and to have major depression (OR=7.6). Generally similar results were observed for patients who screened positive for other disorders.
Commonly used screening methods identified patients with psychiatric disorders; about four years later, those not initially in treatment were likely to have enduring symptoms and to use emergency psychiatric services. Screening should be followed up by clinical diagnostic assessment in the context of available mental health treatment.
This study examines the long-term psychiatric consequences, pain interference in daily activities, work loss, and functional impairment associated with 9/11-related loss among low-income, minority primary care patients in New York City. A systematic sample of 929 adult patients completed a survey that included a sociodemographic questionnaire, the PTSD Checklist, the PRIME-MD Patient Health Questionnaire, and the Medical Outcomes Study Short Form-12 (SF-12).
Approximately one-quarter of the sample reported knowing someone who was killed in the attacks of 9/11, and these patients were sociodemographically similar to the rest of the sample. Compared to patients who had not experienced 9/11-related loss, patients who experienced loss were roughly twice as likely (OR = 1.97, 95% CI = 1.40, 2.77) to screen positive for at least one mental disorder, including major depressive disorder (MDD; 29.2%), generalized anxiety disorder (GAD; 19.4%), and posttraumatic stress disorder (PTSD; 17.1%). After controlling for pre-9/11 trauma, 9/11-related loss was significantly related to extreme pain interference, work loss, and functional impairment. The results suggest that disaster-related mental health care in this clinical population should emphasize evidence-based treatments for mood and anxiety disorders.
To examine relationships between exposure to trauma, bipolar spectrum disorder (BD) and posttraumatic stress disorder (PTSD) in a sample of primary care patients.
A systematic sample (n = 977) of adult primary care patients from an urban general medicine practice was interviewed with measures including the Mood Disorders Questionnaire, the PTSD Checklist–Civilian Version, and the Medical Outcomes Study 12-Item Short Form Health Survey.
Compared with patients who screened negative for BD (n = 881), those who screened positive (n = 96) were 2.6 times [95% confidence interval (CI): 1.6–4.2] as likely to report physical or sexual assault, and 2.9 times (95% CI: 1.6–5.1) as likely to screen positive for current PTSD. Among those screening positive for BD, comorbid PTSD was associated with significantly worse social functioning. These results controlled for selected background characteristics, current major depressive episode, and current alcohol/drug use disorder.
In an urban general medicine setting, trauma exposure was related to BD, and the frequency of PTSD among patients with BD appears to be common and clinically significant. These results suggest an unmet need for mental health care in this specific population and are especially important in view of available treatments for BD and PTSD.
bipolar disorder; posttraumatic stress disorder; primary care; trauma exposure
To screen for posttraumatic stress disorder (PTSD) in primary care patients 7–16 months after the 9/11 attacks and to examine its comorbidity, clinical presentation, and relationships with mental health treatment and service utilization.
A systematic sample (n = 930) of adult patients seeking primary care at an urban general medicine clinic was interviewed using the PTSD Checklist, the Primary Care Evaluation of Mental Disorders (PRIME-MD) Patient Health Questionnaire, and the Medical Outcomes Study 12-Item Short Form Health Survey (SF-12). Health care utilization data were obtained by cross-linkage to an administrative computerized database.
Prevalence estimates of current 9/11-related probable PTSD ranged from 4.7% (based on a PCL-C cutoff score of ≥50) to 10.2% (based on the DSM-IV criteria). A comorbid mental disorder was more common among patients with PTSD than among patients without PTSD (80% vs. 30%). Patients with PTSD were more functionally impaired and reported increased use of mental health medication compared with patients without PTSD (70% vs. 18%). Among patients with PTSD, there was no increase in hospital and emergency room (ER) admissions or outpatient care during the first year after the attacks.
In an urban general medicine setting, 1 year after 9/11, the frequency of probable PTSD appears to be common and clinically significant. These results suggest an unmet need for mental health care in this clinical population and are especially important in view of available treatments for PTSD.
Primary care; Posttraumatic stress disorder; 9/11 attacks
Adiposity is associated with C-reactive protein level in healthy children and with other markers of endothelial activation in adults, but data are lacking in very young children. Data from 491 healthy Hispanic children were analyzed. Mean age was 2.7 years (S.D. 0.5, range 2 to 3 years); mean body mass index (BMI) was 17.2 kg/m2 (S.D. 1.9) among boys and 17.1 kg/m2 (S.D. 2.1) among girls. E-selectin level was associated with BMI (R = 0.11; p < 0.02), ponderal index (p < 0.02), waist circumference (p = 0.02), fasting insulin (p < 0.02), and insulin resistance (p ≤ 0.05); these associations remained significant after adjustment for age, sex, and fasting glucose. sVCAM was also associated with BMI (R = 0.12; p < 0.05). These observations indicate that adiposity is associated with inflammation and endothelial activation in very early childhood.
children; adiposity; E-selectin; sICAM; sVCAM
Diurnal variation in nitrogen homeostasis is observed across phylogeny, but whether this variation reflects endogenous rhythms and, if so, the molecular mechanisms that link nitrogen homeostasis to the circadian clock remain unknown. Here, we provide evidence that a clock-dependent peripheral oscillator, Krüppel-like factor 15, transcriptionally coordinates rhythmic expression of multiple enzymes involved in mammalian nitrogen homeostasis. In particular, Krüppel-like factor 15-deficient mice exhibit no discernible amino acid rhythm, and the rhythmicity of ammonia-to-urea detoxification is impaired. Of the external cues, feeding plays a dominant role in modulating the Krüppel-like factor 15 rhythm and nitrogen homeostasis. Further, when all behavioral, environmental, and dietary cues were controlled in humans, nitrogen homeostasis still expressed endogenous circadian rhythmicity. Thus, in mammals, nitrogen homeostasis exhibits circadian rhythmicity and is orchestrated by Krüppel-like factor 15.
To prospectively examine the association of retinal microvascular signs with incident diabetes and impaired fasting glucose (IFG) in a multi-ethnic population-based cohort.
The Multi-Ethnic Study of Atherosclerosis comprised Caucasian, African-American, Hispanic, and Chinese participants aged 45–84 years. Retinal vascular calibre and retinopathy were quantified from baseline retinal photographs. Incident diabetes and IFG were ascertained prospectively.
After a median follow-up of 3 years, 243 (4.9%) people developed diabetes and 565 (15.0%) developed IFG. After adjusting for known risk factors, participants with wider retinal arteriolar calibre had a higher risk of developing diabetes [HR: 1.60; 95% CI: 1.12–2.29, p = 0.011 comparing highest with lowest arteriolar calibre tertile]. In ethnic subgroup analysis, the association between wider retinal arteriolar calibre and incident diabetes was stronger and statistically significant only in Caucasians [HR: 2.78; 95% CI: 1.37–5.62, p = 0.005]. Retinal venular calibre and retinopathy signs were not related to risk of diabetes or IFG.
Wider retinal arteriolar calibre is independently associated with an increased risk of diabetes, supporting a possible role for early arteriolar changes in diabetes development. This effect was largely seen in Caucasians, and not in other ethnic groups, and may reflect ethnic differences in susceptibility to diabetes from microvascular pathways.
Retinal microvascular calibre; Retinopathy; Diabetes; Impaired fasting glucose
Poor cell survival and difficulties with visualization of cell delivery are major problems with current cell transplantation methods. To protect cells from early destruction, microencapsulation methods have been developed. The addition of a contrast agent to the microcapsule also could enable tracking by MR, ultrasound, and X-ray imaging. However, determining the cell viability within the microcapsule still remains an issue. Reporter gene imaging provides a way to determine cell viability, but delivery of the reporter probe by systemic injection may be hindered in ischemic diseases.
In the present study, mesenchymal stem cells (MSCs) were transfected with a triple fusion reporter gene containing red fluorescent protein, truncated thymidine kinase (SPECT/PET reporter), and firefly luciferase (bioluminescence reporter). Transfected cells were microencapsulated in either unlabeled or perfluorooctylbromide (PFOB)-impregnated alginate. The addition of PFOB provided radiopacity to enable visualization of the microcapsules by X-ray imaging. Before intramuscular transplantation in rabbit thigh muscle, the microcapsules were incubated with D-luciferin, and bioluminescence imaging (BLI) was performed immediately. Twenty-four and forty-eight hours post transplantation, c-arm CT was used to target the luciferin to the X-ray-visible microcapsules for BLI cell viability assessment, rather than relying on systemic reporter probe injections. Bioluminescent signal emission from the PFOB-encapsulated MSCs was confirmed relative to non-encapsulated, naked MSCs, and over 90% of injection sites of PFOB-encapsulated MSCs were visible on c-arm CT. The latter aided successful targeting of the reporter probe to injection sites using conventional X-ray imaging to determine cell viability at 1–2 days post transplantation. Blind luciferin injections to the approximate location of unlabeled microcapsules resulted in successful BLI signal detection in only 18% of injections. In conclusion, reporter gene probes can be more precisely targeted using c-arm CT for in vivo transplant viability assessment, thereby avoiding large and costly systemic injections of a reporter probe.
mesenchymal stem cells; reporter gene; microencapsulation; bioluminescence; c-arm CT; probe targeting.
The mammalian central circadian pacemaker (the suprachiasmatic nucleus, SCN) contains thousands of neurons that are coupled through a complex network of interactions. In addition to the established role of the SCN in generating rhythms of ∼24 hours in many physiological functions, the SCN was recently shown to be necessary for normal self-similar/fractal organization of motor activity and heart rate over a wide range of time scales—from minutes to 24 hours. To test whether the neural network within the SCN is sufficient to generate such fractal patterns, we studied multi-unit neural activity of in vivo and in vitro SCNs in rodents. In vivo SCN-neural activity exhibited fractal patterns that are virtually identical in mice and rats and are similar to those in motor activity at time scales from minutes up to 10 hours. In addition, these patterns remained unchanged when the main afferent signal to the SCN, namely light, was removed. However, the fractal patterns of SCN-neural activity are not autonomous within the SCN as these patterns completely broke down in the isolated in vitro SCN despite persistence of circadian rhythmicity. Thus, SCN-neural activity is fractal in the intact organism and these fractal patterns require network interactions between the SCN and extra-SCN nodes. Such a fractal control network could underlie the fractal regulation observed in many physiological functions that involve the SCN, including motor control and heart rate regulation.
Sudden cardiac death exhibits diurnal variation in both acquired and hereditary forms of heart disease [1, 2], but the molecular basis is unknown. A common mechanism that underlies susceptibility to ventricular arrhythmias is abnormalities in the duration (e.g. short or long QT syndromes, heart failure) [3–5] or pattern (e.g. Brugada syndrome) [6] of myocardial repolarization. Here we provide the first molecular evidence that links circadian rhythms to vulnerability to ventricular arrhythmias in mice. Specifically, we show that cardiac ion channel expression and QT interval duration (an index of myocardial repolarization) exhibit endogenous circadian rhythmicity under the control of a novel clock-dependent oscillator, Krüppel-like factor 15 (Klf15). Klf15 transcriptionally controls rhythmic expression of KChIP2, a critical subunit required for generating the transient outward potassium current (Ito) [7]. Deficiency or excess of Klf15 causes loss of rhythmic QT variation, abnormal repolarization, and enhanced susceptibility to ventricular arrhythmias. In sum, these findings identify circadian transcription of ion channels as a novel mechanism for cardiac arrhythmogenesis.
The prevalence of hypertension is higher among African-Americans than whites. However, inconsistent findings have been reported on the incidence of hypertension among middle-aged and older African-Americans and whites and limited data are available on the incidence of hypertension among Hispanics and Asians in the US. Therefore, this study investigated the age-specific incidence of hypertension by ethnicity for 3,146 participants from the Multi-Ethnic Study of Atherosclerosis. Participants, age 45–84 years at baseline, were followed for a median of 4.8 years for incident hypertension, defined as systolic blood pressure ≥ 140 mmHg, diastolic blood pressure ≥ 90 mmHg, or the initiation of antihypertensive medications. The crude incidence rate of hypertension, per 1,000 person-years, was 56.8 for whites, 84.9 for African-Americans, 65.7 for Hispanics, and 52.2 for Chinese. After adjustment for age, gender, and study site, the incidence rate ratio (IRR) for hypertension was increased for African-Americans age 45–54 (IRR=2.05, 95% CI=1.47, 2.85), 55–64 (IRR=1.63, 95% CI=1.20, 2.23), and 65–74 years (IRR=1.67, 95% CI=1.21, 2.30) compared with whites, but not for those 75–84 years of age (IRR=0.97, 95% CI=0.56, 1.66). Additional adjustment for health characteristics attenuated these associations. Hispanic participants also had a higher incidence of hypertension compared with whites; however, hypertension incidence did not differ for Chinese and white participants. In summary, hypertension incidence was higher for African-Americans compared with whites between 45 and 74 years of age but not after age 75 years. Public health prevention programs tailored to middle-age and older adults are needed to eliminate ethnic disparities in incident hypertension.
hypertension; race/ethnicity; epidemiology; incidence
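The crude rates quoted above are events divided by accumulated person-time, scaled to 1,000 person-years. As a minimal illustration (the person-year denominator below is invented for the example, not taken from MESA):

```python
def incidence_rate_per_1000_py(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return 1000.0 * events / person_years

# e.g., 142 incident cases accrued over 2,500 person-years of follow-up
rate = incidence_rate_per_1000_py(142, 2500)  # 56.8 per 1,000 person-years
```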
To examine whether improved diabetes control is related to better cognitive outcomes.
Randomized controlled trial.
A randomized trial of telemedicine vs. usual care in elderly persons with type 2 diabetes.
Participants were 2169 persons 55 years and older with type 2 diabetes from New York City and Upstate New York.
The diabetes case management intervention was implemented by a diabetes nurse, via a telemedicine unit in the participant’s home, and in coordination with the primary care physician.
Hemoglobin A1c (HbA1c), systolic blood pressure (SBP), and low-density lipoprotein cholesterol (LDL) were measured at a baseline visit and at up to 5 annual follow-up visits. Global cognition was measured at those visits with the Comprehensive Assessment and Referral Evaluation (CARE).
In mixed models, the intervention was associated with slower global cognitive decline (p = 0.01). Improvements in HbA1c (p = 0.03), but not SBP or LDL, mediated the effect of the intervention on cognitive decline.
Improved diabetes control in the elderly following existing guidelines through a telemedicine intervention was associated with less global cognitive decline. The main mediator of this effect seemed to be improvements in HbA1c.
Diabetes treatment; cognitive impairment; clinical trials
Blood pressure (BP) usually decreases during nocturnal sleep and increases during daytime activities. Whether the endogenous circadian control system contributes to this daily BP variation has not been determined under appropriately controlled conditions.
To determine whether there exists an endogenous circadian rhythm of BP in humans.
Methods and Results
In 28 normotensive adults (16 men), we assessed BP across three complementary, multi-day, in-laboratory protocols performed in dim light, throughout which behavioral and environmental influences were controlled and/or uniformly distributed across the circadian cycle via: (1) a 38-h ‘constant routine’, including continuous wakefulness; (2) a 196-h ‘forced desynchrony’ with seven recurring 28-h sleep/wake cycles; and (3) a 240-h ‘forced desynchrony’ with twelve recurring 20-h sleep/wake cycles. Circadian phases were derived from core body temperature. Each protocol revealed significant circadian rhythms in systolic and diastolic BP, with almost identical rhythm profiles among protocols. The peak-to-trough amplitudes were 3–6 mmHg for systolic BP and 2–3 mmHg for diastolic BP (always P<0.05). All six peaks (systolic and diastolic BP in three protocols) occurred at a circadian phase corresponding to ~9 PM. Based on substantial phase differences among circadian rhythms of BP and other variables, the rhythm in BP appeared to be unrelated to circadian rhythms in cortisol, catecholamines, cardiac vagal tone, heart rate, or urine flow.
There exists a robust endogenous circadian rhythm in BP. The highest BP occurred at the circadian time corresponding to ~9 PM, suggesting that the endogenous BP rhythm is unlikely to underlie the well-documented morning peak in adverse cardiovascular events.
Circadian; Blood Pressure; Humans; Myocardial Infarction; Stroke