Background: Childhood obesity and hypertension are global problems that are on the rise in India. Increasing physical activity is an accepted first-line strategy for addressing poor body composition, hypertension and reduced cardiorespiratory fitness (CRF), all of which are considered independent risk factors for the development of future cardiovascular complications.
Aim: The present study was conducted to evaluate the effect of regular unstructured physical training and athletic-level training on anthropometric measures, body composition, blood pressure and cardiorespiratory fitness in adolescents.
Settings and Design: This was a collaborative study between the Department of Physiology, Jawaharlal Institute of Postgraduate Medical Education and Research, and the residential school Jawahar Navodhya Vidyalaya, Puducherry, India.
Methods and Material: Student volunteers in the age group of 12–17 years were classified into athletes (group 1) and physically active non-athletes (group 2). The parameters measured and calculated were weight, height, body mass index (BMI), waist and hip circumference, body fat percentage (BF%), fat-free mass (FFM), systolic (SBP) and diastolic blood pressure (DBP), mean arterial pressure (MAP), rate pressure product (RPP) and predicted VO2 max.
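Several of the derived measures listed above have conventional textbook definitions. The abstract does not state the exact equations the authors used, so the following Python sketch simply illustrates the standard formulas (BMI = weight/height², MAP ≈ DBP plus one third of pulse pressure, RPP = heart rate × SBP); function names are illustrative, not from the study.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def mean_arterial_pressure(sbp: float, dbp: float) -> float:
    """Conventional resting approximation: DBP plus one third of pulse pressure."""
    return dbp + (sbp - dbp) / 3.0

def rate_pressure_product(heart_rate: float, sbp: float) -> float:
    """RPP (heart rate x SBP), a common index of myocardial oxygen demand."""
    return heart_rate * sbp
```

For example, a student with an SBP of 120 mm Hg, a DBP of 80 mm Hg and a heart rate of 70 bpm would have a MAP of about 93 mm Hg and an RPP of 8400.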
Statistical Analysis Used: Mean differences between the groups were analysed using the unpaired Student's t-test. All statistical analyses were carried out for two-tailed significance at the 5% level using SPSS version 19 (SPSS Inc., USA).
Results: Anthropometric measures, body composition measures and blood pressure values of students in both groups were within normal limits. There was no significant difference in anthropometric and body composition parameters between group 1 and group 2 students. DBP, MAP and RPP were significantly lower in group 1 students than in group 2 students. VO2 max values were higher in group 1 girls than in group 2 girls, while the values for boys were comparable between the two groups.
Conclusion: Regular unstructured physical activity for 60 minutes daily over one year can help students maintain their anthropometric parameters, body composition measures and CRF on par with athletes of the same age and gender. However, athletic-level training further reduces the cardiovascular load on adolescent students.
Physical activity; Body composition; Fat free mass; Cardiorespiratory fitness
Background & objectives:
Intensive regular physical exercise training is associated with physiological changes in left ventricular (LV) morphology and function. This cardiac remodelling observed in athletes reflects the specific haemodynamic requirements of the exercise undertaken. The main objective of this study was to evaluate the effect of endurance training on cardiac morphology, systolic and diastolic LV function and haemodynamic parameters in both male and female athletes.
Seventy-nine healthy athletes (age 20.0 ± 2.6 yr; 49% male) and 82 healthy sedentary adolescents (age 20.8 ± 2.2 yr; 49% male) volunteered to participate in this study. All subjects underwent transthoracic echocardiography and impedance cardiography.
Both female and male athletes had greater LV end-diastolic cavity sizes, LV mass and stroke volume (SV) values when compared with controls. Also, in male athletes, LV mass index was higher than in female athletes. While male athletes had lower resting heart rate compared to female athletes, they had higher mean arterial blood pressure. In male athletes, basal septal and mid septal strain values were higher compared to controls. There were no significant differences in strain and peak systolic strain rate values between female athletes and controls. In male athletes, there was a weak positive correlation between SV and LV mass, basal lateral and septal strain values. In female athletes, only a weak positive correlation was found between SV and basal septal strain values.
Interpretation & conclusions:
Endurance-trained male and female athletes had higher LV mass, LV cavity dimensions and SV than sedentary controls. Although there was no difference in diastolic cardiac function between athletes and controls, locally enhanced systolic function was found along with the increase in SV. Both the morphological and the haemodynamic differences were more evident in male athletes.
Athlete's heart; endurance training; impedance cardiography; strain imaging; tissue Doppler
Elite endurance athletes typically have larger arteries contributing to greater skeletal muscle blood flow, oxygen and nutrient delivery and improved physical performance. Few studies have examined structural and functional properties of arteries in power athletes.
To compare the size and vasoreactivity of the brachial artery of elite power athletes with those of age-matched controls. It was hypothesized that the brachial artery diameters of athletes would be larger, with less vasodilation in response to cuff occlusion but more constriction after a cold pressor test, than those of age-matched controls.
Eight elite power athletes (age = 23±2 years) and ten controls (age = 22±1 years) were studied. High-resolution ultrasonography was used to assess brachial artery diameters at rest and following 5 minutes of forearm occlusion (brachial artery flow-mediated dilation, BAFMD) and a cold pressor test (CPT). Basic fitness measures included a handgrip test and a 3-minute step test.
Brachial arteries of athletes were larger (Athletes: 5.39±1.51 vs. Controls: 3.73±0.71 mm, p<0.05) and had greater vasodilatory (BAFMD%: Athletes: 8.21±1.78 vs. Controls: 5.69±1.56%) and constrictor (CPT%: Athletes: −2.95±1.07 vs. Controls: −1.20±0.48%) responses compared to controls. Vascular operating range (VOR = peak dilation + peak constriction) was also greater in athletes (VOR: Athletes: 0.55±0.15 vs. Controls: 0.25±0.18 mm, p<0.05). Athletes had superior handgrip strength (Athletes: 55.92±17.06 vs. Controls: 36.77±17.06 kg, p<0.05) but similar heart rate responses at peak (Athletes: 123±16 vs. Controls: 130±25 bpm, p>0.05) and at 1 minute of recovery (Athletes: 88±21 vs. Controls: 98±26 bpm, p>0.05) following the step test.
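The vascular operating range reported above is simply the dilation above the resting diameter plus the constriction below it. A minimal Python sketch of that definition (a hypothetical helper, not code from the study):

```python
def vascular_operating_range(rest_mm: float, peak_dilated_mm: float,
                             peak_constricted_mm: float) -> float:
    """VOR = peak dilation + peak constriction magnitude, in mm.
    Equivalent to the span between the most dilated and the most
    constricted diameter."""
    dilation = peak_dilated_mm - rest_mm          # response to cuff occlusion
    constriction = rest_mm - peak_constricted_mm  # response to cold pressor test
    return dilation + constriction
```

Note that the sum telescopes to `peak_dilated_mm - peak_constricted_mm`, which makes clear why VOR is larger when both responses are larger.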
Elite power athletes have larger brachial arteries and greater vasoreactivity (greater vasodilatory and constrictor responses) than age-matched controls, contributing to a significantly greater VOR. These data extend the concept of an 'athlete's artery', previously shown for elite endurance athletes, to elite power athletes and present a hypothetical explanation for the functional significance of the 'power athlete's artery'.
Context: As the number of female college students participating in athletics has grown dramatically in the last few decades, sports medicine health care providers have become more aware of the unique health concerns of athletic women. These concerns include disordered eating, amenorrhea, and osteoporosis: the female athlete triad. Disordered eating appears to be central in the triad, and the literature has conflicting data regarding the influence of athletic participation on disordered-eating behaviors.
Objective: To compare disordered-eating symptoms between collegiate athletes (in lean and non-lean sports) and nonathletes.
Design: A volunteer, cross-sectional cohort study of female students during the 2002–2003 academic year.
Setting: A National Collegiate Athletic Association Division I institution.
Patients or Other Participants: Undergraduate females, including 84 collegiate athletes and 62 nonathletes.
Main Outcome Measure(s): Symptoms associated with disordered eating were assessed using the Eating Disorders Inventory-2, a self-report measure of 91 items, and self-reported weight and menstrual function.
Results: The athletes had significantly lower scores in body dissatisfaction (P = .01) and ineffectiveness (P = .002). No difference in mean body weight was noted between the 2 groups, but the nonathlete group had a significantly lower desired body weight (P = .004). Lean-sport athletes had a higher score on body dissatisfaction (P = .008) and lower actual (P = .024) and desired body weight (P = .002) than non–lean-sport athletes. A total of 7.1% of the collegiate athletes and 12.9% of the nonathletes were classified as having a high risk for disordered eating. Within the athlete sample, the high-risk group included 2.9% of the non–lean-sport athletes and 25% of the lean-sport athletes.
Conclusions: In our study, female athletes did not exhibit more disordered-eating symptoms than women who did not participate in collegiate sports. However, our data suggest that lean-sport athletes are at greater risk for disordered eating than athletes in non-lean sports.
female athlete triad; nutrition; psychology
The aim of the study was to assess pre-, during-, and post-exercise compartment pressures in the anterior tibial compartment in asymptomatic long-distance runners (5000 m) and recreational athletes. Forty-eight participants (n = 48; 24 females and 24 males) underwent the experimental procedures and were assigned to 4 groups of 12 volunteers. Intracompartmental pressures were recorded 1 minute before exercise, at the 1st minute after the onset of treadmill exercise, and 5 minutes after its completion. The wick catheter technique was used to measure intracompartmental pressure values. Post hoc analysis of the group-by-measures interaction indicated that all pairwise comparisons among pre-test (1 minute before exercise), during-test (1st minute of exercise), and post-test measures (5 minutes after exercise) were statistically significant for male controls (p < .001), male athletes (p < .001), female controls (p < .001) and female athletes (p < .001). The results support an association between long-distance running and an increased risk of developing chronic exertional compartment syndrome (CECS).
Key points
- Compartment syndrome is a condition characterised by increased intracompartmental pressures within the inelastic fascia that surrounds muscular compartments.
- Initial CECS symptomatology is not clear and increases gradually.
- All the study participants presented the lowest intracompartmental pressure values one minute before the beginning of exercise (at rest), with the highest values recorded at the first minute of exercise.
- The control population had lower intracompartmental pressure than professional runners.
- One minute after the beginning of exercise, the control and athlete men groups showed higher intracompartmental pressures than the control and athlete women groups, indicating a probable sex difference in both athletes and controls.
- Further studies on predisposing factors for CECS, such as increased intracompartmental pressure values in asymptomatic populations, are needed to establish the diagnosis in a timely manner.
Compartment syndrome; athletes; wick catheter; intracompartmental pressures; runners
Over the last two decades, morphological cardiac changes induced by athletic conditioning have been of great interest. Several studies have therefore been conducted to delineate the electrocardiography (ECG), echocardiography, and heart rate variability (HRV) findings in athletes.
To assess the ECG, echocardiographic, and HRV findings in a group of dynamic- and static-type athletes.
Fifty professional athletes (20 static- and 30 dynamic-exercise athletes) and 50 healthy nonathletes (control group) were recruited. Standard 12-lead ECG and transthoracic echocardiography were performed on all athletes and the control group. Through echocardiography, variables including left ventricular (LV) end-diastolic/systolic diameter, LV mass, and left atrial volume index were measured. In addition, both the athletes and the control group underwent ECG Holter monitoring for 15 minutes, and several parameters related to HRV (time and frequency domain) were recorded.
The most common ECG abnormalities among the athletes were sinus bradycardia and incomplete right bundle branch block. LV end-diastolic diameter and left atrial volume index were significantly greater in the dynamic athletes (P < 0.001). LV end-systolic diameter was significantly lower in the static group (P < 0.001). LV mass of the dynamic and static athletes was significantly greater than that of the controls (P < 0.001). Among the ECG Holter monitoring findings, the dynamic athletes had lower systolic blood pressure than the controls (P = 0.01). Heart rate was lowest in the control group (P < 0.001).
The most common ECG abnormalities among the adolescent Iranian athletes were sinus bradycardia and incomplete right bundle branch block. Static exercise seemed to reduce LV end-systolic diameter, while dynamic exercise resulted in increased LV end-diastolic diameter and left atrial volume index. Additionally, the Iranian athletes showed no differences in HRV parameters, apart from heart rate and systolic blood pressure, compared with the nonathletes.
athlete’s heart; electrocardiography; echocardiography; heart rate variability
We investigated the response of insulin-like growth factor-I (IGF-I), insulin-like growth factor binding protein-3 (IGFBP-3) and some hormones, i.e., testosterone (T), growth hormone (GH), cortisol (C), and insulin (I), to maximal exercise in road cyclists with and without diagnosed left ventricular hypertrophy. M-mode and two-dimensional Doppler echocardiography was performed in 30 professional male endurance athletes and a group of 14 healthy untrained subjects using a Hewlett-Packard Image Point HX ultrasound system with standard imaging transducers. Echocardiography and an incremental physical exercise test were performed during the competitive season. Venous blood samples were drawn before and immediately after the maximal cycling exercise test for determination of somatomedin and hormonal concentrations. The basal concentration of IGF-I was significantly higher (p < 0.05) in athletes with left ventricular muscle hypertrophy (LVH) than in athletes with a normal upper limit of the left ventricular wall (LVN) (p < 0.05) and in the control group (CG) (p < 0.01). The IGF-I level increased significantly at maximal intensity of incremental exercise in CG (p < 0.01), LVN (p < 0.05) and LVH (p < 0.05) compared with the respective resting values. Long-term endurance training induced an increase in the resting (p < 0.01) and post-exercise (p < 0.05) IGF-I/IGFBP-3 ratio in athletes with LVH compared with LVN. The testosterone (T) level at rest was lower in LVH than in the LVN and CG groups (p < 0.05). These results indicate that resting serum IGF-I concentrations were higher in trained subjects with LVH than in athletes without LVH. The elevation of serum IGF-I/IGFBP-3 at rest and after exercise might suggest that IGF-I acts as a potent stimulant of left ventricular hypertrophy in chronically trained endurance athletes.
Key points
- In sports training, athletes engaged in the same training regimen acquired different degrees of cardiac hypertrophy.
- Physical exercise had a significant effect on serum insulin-like growth factor-I concentration, depending on maximal oxygen uptake during endurance exercise.
- Athletes with clinically diagnosed physiological left ventricular hypertrophy had higher resting serum insulin-like growth factor-I concentrations than those without left ventricular hypertrophy and sedentary subjects.
- Increased insulin-like growth factor-I release during long-term training seems to contribute significantly to sport-specific functional adaptation of the left ventricle.
Echocardiography; heart; somatomedins; anabolic hormones; endurance training.
The purpose of this study was to examine exercise-induced arterial adaptations in elite male and female Judo athletes. Twenty-seven male Judo athletes (age 24.06 ± 2 years), 11 female judokas (age 24.27 ± 1 years), 27 sedentary healthy men (age 24.01 ± 2 years) and 11 sedentary healthy women (age 24.21 ± 1 years) participated in the study. The examined vessels were the brachial, radial, ulnar, popliteal, and anterior and posterior tibial arteries. The parameters were recorded at rest using Duplex ultrasound. The diastolic diameter and mean blood flow velocity of the examined arteries were significantly greater in the Judo athletes (p < 0.05) than in the control groups. In male Judo athletes, the brachial (p < 0.001), radial (p < 0.001), and anterior tibial (p < 0.001) arteries presented the greatest differences in diastolic diameter compared with the male control group. In female Judo athletes, the ulnar (p < 0.001), radial (p < 0.001), and brachial (p < 0.001) arteries showed the greatest differences in diastolic diameter. The highest mean blood flow velocities were recorded in the ulnar (p < 0.001) and popliteal (p < 0.001) arteries of the Judo athlete groups. Regarding sex differences, male participants presented larger arteries than females. In conclusion, Judo is a highly demanding physical sport involving the upper and lower limbs and leading to significant arterial adaptations. These vascular parameters provide a useful tool for the medical team, not only for enhancing the efficacy of physical training but also for examining as yet unexplored parameters that may influence the athletic performance of both male and female elite judokas.
Key points
- Judo athletes demonstrated a generally homogenous increase in the arterial functionality of the upper and lower limbs compared with the control groups.
- Diastolic diameter was found to be significantly increased in male and female Judo athletes, highlighting the effects of exercise training on the vascular system.
- Judo athletes showed a statistically significant increase in mean blood flow velocity in all examined arteries compared with the relevant control group.
- The current study underscores the impact of Judo training on the structure and function of the arterial system.
- Clinically, the increased arterial parameters in elite Judo athletes may be essential elements of improved athletic performance.
- Sports medicine practitioners should give special attention to vascular functionality in several physiological and medical tests.
Diastolic diameter; blood mean flow velocity; duplex sonography; judo athletes
To examine the impact of the life-stress sources that student athletic trainers encountered over the course of an academic year, to investigate sex differences in stress sources and symptoms, and to provide athletic training staffs with suggestions on ways to assist student athletic trainers.
Design and Setting:
In a classroom setting, the 25-item Quick Stress Questionnaire (QSQ) was administered to all subjects at the beginning of each month during an academic year. The QSQ, which can be completed in approximately 5 minutes, uses a 9-point Likert scale ranging from 1 (little stress) to 9 (extreme stress) to measure sources of stress and stress-related symptoms.
The sample consisted of 11 male and 9 female student athletic trainers enrolled in a Commission on Accreditation of Allied Health Education Programs (CAAHEP)-accredited undergraduate program at a mid-Atlantic university.
We computed descriptive statistics for the stress items and symptoms (ie, cognitive, somatic, and behavioral) and graphed them according to sex. Separate sex × time analyses of variance were performed to investigate changes in cognitive, somatic, and behavioral stress over the course of the study and to determine if these changes were different for male and female student athletic trainers.
Academic and financial concerns represented the greatest sources of stress for student athletic trainers. Repeated-measures analyses of variance indicated that stress levels fluctuated significantly during the academic year, with peak stress levels experienced during midterm and at the end of the spring semester. Although female student athletic trainers consistently reported higher levels of stress than their male counterparts, these differences were not statistically significant.
Student athletic trainers exhibited fluctuations in their stress levels throughout an academic calendar. Academic and financial concerns were the most common stressors. Certified athletic trainers should take an interest in their student athletic trainers and be willing to provide assistance in times of need. Additional research is needed regarding student athletic trainers and stress.
burnout; college students; coping; stress
Anabolic androgenic steroids (AAS) are sometimes used by power athletes to improve performance by increasing muscle mass and strength. Recent bioptical data have shown that in athletes under the pharmacological effects of AAS, a focal increase in myocardial collagen content might occur as a repair mechanism against myocardial damage.
To investigate the potential underlying left ventricular myocardial dysfunction after chronic misuse of AAS in athletes by use of Doppler myocardial imaging (DMI) and strain rate imaging (SRI).
Standard Doppler echocardiography, DMI, SRI and an ECG treadmill test were undertaken by 45 bodybuilders (20 athletes who had misused AAS for at least 5 years (users) and 25 anabolic‐free bodybuilders (non‐users)) and by 25 age‐matched healthy sedentary controls, all men. The mean (SD) number of weeks of AAS use per year was 31.3 (6.4) in users, over a mean of 8.9 (3.8) years of use, and the mean weekly dosage of AAS was 525.4 (90.7) mg.
The groups were matched for age. Systolic blood pressure was higher in athletes (145 (9) vs 130 (5) mm Hg) than in controls. Left ventricular mass index did not significantly differ between the two groups of athletes. In particular, both users and non‐users showed increased wall thickness and relative wall thickness compared with controls, whereas left ventricular ejection fraction, left ventricular end‐diastolic diameter and transmitral Doppler indexes were comparable for the three groups. Colour DMI analysis showed significantly lower myocardial early to atrial diastolic wave ratios in users at the level of the basal interventricular septum (IVS) and left ventricular lateral wall (p<0.01), in comparison with both non‐users and controls. In addition, in users, peak systolic left ventricular strain rate and strain were both reduced in the middle IVS (both p<0.001) and in the left ventricular lateral free wall (both p<0.01). By stepwise forward multivariate analyses, the sum of the left ventricular wall thicknesses (β coefficient = −0.32, p<0.01), the number of weeks of AAS use per year (β = −0.42, p<0.001) and the weekly dosage of AAS (β = −0.48, p<0.001) were the only independent determinants of middle IVS strain rate. In addition, impaired left ventricular strain in users was associated with reduced performance during physical effort (p<0.001).
Several years of chronic misuse of AAS leaves power athletes with a subclinical impairment of both systolic and diastolic myocardial function, strongly associated with the mean dosage and duration of AAS use. The combined use of DMI and SRI may therefore be useful for the early identification of patients with more diffuse cardiac involvement, and eventually for investigating the reversibility of such myocardial effects after discontinuation of the drug.
Lower bone density in young amenorrheic athletes (AA) compared to eumenorrheic athletes (EA) and non-athletes may increase fracture risk during a critical time of bone accrual. Finite element analysis (FEA) is a unique tool to estimate bone strength in vivo, and the contribution of cortical microstructure to bone strength in young athletes is not well understood.
We hypothesized that FEA-estimated stiffness and failure load are impaired in AA at the distal radius and tibia compared to EA and non-athletes despite weight-bearing exercise.
DESIGN AND SETTING
Cross-sectional study; Clinical Research Center
34 female endurance athletes involved in weight-bearing sports (17 AA, 17 EA) and 16 non-athletes (aged 14-21 years) of comparable age, maturity and BMI
We used HR-pQCT images to assess cortical microarchitecture and FEA to estimate bone stiffness and failure load.
Cortical perimeter, porosity and trabecular area at the weight-bearing tibia were greater in both groups of athletes than in non-athletes, whereas the ratio (%) of cortical to total area was lowest in AA. Despite greater cortical porosity in EA, estimated tibial stiffness and failure load were higher than in non-athletes. However, this advantage was lost in AA. At the non-weight-bearing radius, failure load and stiffness were lower in AA than in non-athletes. After controlling for lean mass and menarcheal age, athletic status accounted for 5-9% of the variability in stiffness and failure load, menarcheal age for 8-23%, and lean mass for 12-37%.
AA have lower FEA-estimated bone strength at the distal radius than non-athletes, and lose the advantage of weight-bearing exercise seen in EA at the distal tibia.
athletes; adolescents; bone strength; stiffness; failure load
I evaluated the perceptions that student-athletes had of their athletic trainers and of the medical coverage provided to them by the athletic departments at their institutions. My intent was to assess differences between male and female athletes, between athletes in high-profile and low-profile sports, and between athletes who competed at the NCAA Division I and Division II levels. The research design was also directed at identifying any subgroup of student-athletes who demonstrated a significantly different perception of their athletic trainer(s).
Design and Setting:
Questionnaires were sent to 32 athletic training programs at 28 NCAA Division I and II institutions. Eighteen of the 32 programs participated, yielding a 56% response rate.
A total of 343 student-athletes from 18 selected athletic programs at both the NCAA Division I and II levels participated. One questionnaire contained response errors and was not included in the analysis.
A questionnaire was developed and pilot tested at 3 collegiate settings apart from those participating in the study. Validity and reliability analyses were conducted and confirmed by additional professionals in the field of athletic training. Cumulative mean perception scores between groups were measured using independent t tests. Differences in scores between subgroups were measured using a 1-way analysis of variance.
I observed significant differences in mean cumulative perception scores between sex and sport-profile groups. Male athletes and athletes in high-profile sports demonstrated a higher mean perception score than did females and athletes in low-profile sports. There was no difference in scores when compared across athletic divisions. Subgroups of all the athletes participating were identified. Several subgroups demonstrated significant differences in mean cumulative perception scores.
Males and females in low-profile sports at Division II schools and females in high-profile sports at Division II schools had significantly lower mean perception scores than did other subgroups of athletes.
relationship; attitude; responsibility; quality
Urine specific gravity is often used to assess hydration status. Athletes who are hypohydrated prior to exercise tend to ingest more fluid during the exercise, possibly to compensate for their pre-exercise fluid deficit. The purpose of this study was to evaluate the effect of additional fluid intake on fluid balance and gastrointestinal tract comfort during 1 h of running in a thermoneutral environment when athletes followed their habitual fluid and dietary regimens. Sixteen men and sixteen women ingested a 6% carbohydrate-electrolyte solution immediately prior to exercise and then every 15 minutes during two runs, at a consumption rate of 2 mL·kg-1 (LV, lower volume) or 3 mL·kg-1 (HV, higher volume) body mass. Urine specific gravity and body mass changes were determined before and after the tests to estimate hydration status. During exercise, subjects verbally responded to surveys inquiring about gastrointestinal symptoms, sensation of thirst and ratings of perceived exertion. Plasma glucose, heart rate and blood pressure were also evaluated. Men had higher pre-exercise urine specific gravity than women (1.025 vs. 1.016 g·mL-1 HV; and 1.024 vs. 1.017 g·mL-1 LV) and greater sweat loss (1.21 ± 0.27 L vs. 0.83 ± 0.21 L HV; and 1.18 ± 0.23 L vs. 0.77 ± 0.17 L LV). The prevalence of gastrointestinal discomfort increased after 45 min. No significant differences in heart rate, rating of perceived exertion, blood pressure or glycemia were observed with the additional fluid intake. From these results, it appears that additional fluid intake reduces body mass loss and thirst sensation. Compared with the men, however, pre-exercise euhydration was more common in the women, and an increased fluid intake increased the risk of body mass gain and gastrointestinal discomfort.
Key points
- There seems to be wide variability in pre-exercise hydration status between males and females, and efforts aimed at educating athletes about the importance of pre-game hydration must be emphasized.
- Fluid ingestion during running exercise in a moderate environment reduces body mass loss and thirst sensation, but an increased fluid intake at rates matching fluid loss might raise the risk of body mass gain in women during prolonged activities.
- Individual gastric tolerance and familiarization with fluid replacement should be taken into account when providing athletes with strategies for hydration during exercise.
Hydration; urine specific gravity; exercise; gender; gastrointestinal discomfort
The interleukin-1 (IL-1) family of cytokines is involved in the inflammatory and repair reactions of skeletal muscle during and after exercise. Specifically, plasma levels of the IL-1 receptor antagonist (IL-1ra) increase dramatically after intense exercise, and accumulating evidence points to an effect of genetic polymorphisms on athletic phenotypes. Therefore, the IL-1 family cytokine genes are plausible candidate genes for athleticism. We explored whether IL-1 polymorphisms are associated with athlete status in European subjects.
Genomic DNA was obtained from 205 (53 professional and 152 competitive non-professional) Italian athletes and 458 non-athlete controls. Two diallelic polymorphisms in the IL-1β gene (IL-1B) at -511 and +3954 positions, and a variable number tandem repeats (VNTR) in intron 2 of the IL-1ra gene (IL-1RN) were assessed.
We found a 2-fold higher frequency of the IL-1RN 1/2 genotype in athletes compared to non-athlete controls (OR = 1.93, 95% CI = 1.37-2.74, 41.0% vs. 26.4%), and a lower frequency of the 1/1 genotype (OR = 0.55, 95% CI = 0.40-0.77, 43.9% vs. 58.5%). Frequency of the IL-1RN 2/2 genotype did not differ between groups. No significant differences between athletes and controls were found for either -511 or +3954 IL-1B polymorphisms. However, the haplotype (-511)C-(+3954)T-(VNTR)2 was 3-fold more frequent in athletes than in non-athletes (OR = 3.02, 95% CI = 1.16-7.87). Interestingly, the IL-1RN 1/2 genotype was more frequent in professional than in non-professional athletes (OR = 1.92, 95% CI = 1.02-3.61, 52.8% vs. 36.8%).
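The odds ratios above follow from a standard 2×2 table. As an illustration (not the authors' code), the counts can be back-calculated approximately from the reported percentages: IL-1RN 1/2 in roughly 84 of 205 athletes (41.0%) and 121 of 458 controls (26.4%). A minimal Python sketch with a Wald confidence interval on the log scale:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for a 2x2 table:
    a = genotype-positive cases, b = genotype-negative cases,
    c = genotype-positive controls, d = genotype-negative controls.
    Returns (OR, lower, upper) with a Wald CI on the log scale."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Approximate counts reconstructed from the reported frequencies
# (41.0% of 205 athletes, 26.4% of 458 controls) -- illustrative only.
or_, lower, upper = odds_ratio_ci(84, 121, 121, 337)
```

With these reconstructed counts the sketch yields OR ≈ 1.93 with a 95% CI of about (1.37, 2.74), consistent with the estimate reported in the abstract.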
Our study found that variants of the IL-1ra gene are associated with athlete status. This confirms the crucial role that the cytokine IL-1ra plays in human physical exercise. The VNTR IL-1RN polymorphism may have implications for muscle health, performance, and/or recovery capacity. Further studies are needed to assess these specific issues. As the VNTR IL-1RN polymorphism is implicated in several disease conditions, athlete status may constitute a confounding variable that will need to be accounted for when examining associations of this polymorphism with disease risk.
This study tested the hypothesis that male athletes who feel pressured to maintain a specific body weight present an elevated risk of subclinical eating disorders. Twelve judoists (19.5 ± 0.5 yr), fifteen cyclists (21.2 ± 2.8 yr) and seventeen non-competitive students matched for BMI who served as controls (21.8 ± 1.8 yr) were studied using the Eating Attitudes Test (EAT-26). The Multidimensional Perfectionism Scale, the Body Esteem Scale and the Profile of Mood States were also used to evaluate the relationships between eating disorders and psychological characteristics. Athletes completed the tests during their competitive period, and controls completed the same scales at the same time. Athletes' scores differed significantly from those of the control group on the Global EAT (p < 0.01), Dieting (p < 0.01), and Bulimia (p < 0.05) subscales. Sixty percent of the athletes used weight-loss methods. Self-induced vomiting and the use of laxatives and diet pills were reported by 4%, 10%, and 8.5% of them, respectively. Increasing exercise was the primary method used by controls to lose body weight. Athletes reported more negative feelings about their physical appearance and body weight satisfaction than controls (p < 0.01 and p < 0.05, respectively). Our results also showed that depressed mood accounted for 73% of the variance in Bulimia scores and for 64% of the variance in Global EAT scores in athletes. Body-esteem Appearance and depressed mood accounted for a significant proportion of the variance in Dieting scores. There was no difference in perfectionism or mood between athletes and controls. This study highlights that these athletes may tread a fine line between optimal competitive attitudes and detrimental health behaviors.
Key pointsPrevalence of eating disorders has become a growing concern among athletic populations, but very little information is available concerning male athletes.This study highlights that these athletes may tread a fine line between optimal competitive attitudes and detrimental health behaviors.
Eating behavior; male athletes; perfectionism; body esteem; mood
AIM: There is little information on the plasma free amino acid patterns of elite athletes against which fatigue and nutrition can be considered. Therefore the aim was to include analysis of this pattern in the medical screening of elite athletes during both especially intense and light training periods. METHODS: Plasma amino acid analysis was undertaken in three situations. (1) A medical screening service was offered to elite athletes during an intense training period before the 1992 Olympics. Screening included a blood haematological/biochemical profile and a microbial screen in athletes who presented with infection. The athletes were divided into three groups who differed in training fatigue and were considered separately. Group A (21 track and field athletes) had no lasting fatigue; group B (12 judo competitors) reported heavy fatigue at night but recovered overnight to continue training; group C (18 track and field athletes, one rower) had chronic fatigue and had been unable to train normally for at least several weeks. (2) Athletes from each group were further screened during a post-Olympic light training period. (3) Athletes who still had low amino acid levels during the light training period were reanalysed after three weeks of additional protein intake. RESULTS: (1) The pre-Olympics amino acid patterns were as follows. Group A had a normal amino acid pattern (glutamine 554 (25.2) micromol/l, histidine 79 (6.1) micromol/l, total amino acids 2839 (92.1) micromol/l); all results are means (SEM). By comparison, both groups B and C had decreased plasma glutamine (average 33%; p<0.001) with, especially in group B, decreased histidine, glucogenic, ketogenic, and branched chain amino acids (p<0.05 to p<0.001). None in group A, one in group B, but ten athletes in group C presented with infection: all 11 athletes had plasma glutamine levels of less than 450 micromol/l.
No intergroup differences in haematological or other blood biochemical parameters, apart from a lower plasma creatine kinase activity in group C than in group B (p<0.05) and a low neutrophil to lymphocyte ratio in the athletes with viral infections (1.2 (0.17)), were found. (2) During post-Olympic light training, group A showed no significant amino acid changes. In contrast, group B recovered normal amino acid levels (glutamine 528 (41.4) micromol/l, histidine 76 (5.3) micromol/l, and total amino acids 2772 (165) micromol/l) (p<0.05 to p<0.001) to give a pattern comparable with that of group A, whereas, in group C, valine and threonine had increased (p<0.05), but glutamine (441 (24.5) micromol/l) and histidine (58 (5.3) micromol/l) remained low. Thus none in group A, two in group B, but ten (53%) in group C still had plasma glutamine levels below 450 micromol/l, including eight of the 11 athletes who had presented with infection. (3) With the additional protein intake, virtually all persisting low glutamine levels increased to above 500 micromol/l. Plasma glutamine rose to 592 (35.1) micromol/l and histidine to 86 (6.0) micromol/l. Total amino acids increased to 2761 (128) micromol/l (p<0.05 to p<0.001) and the amino acid pattern normalised. Six of the ten athletes on this protein intake returned to increased training within the three weeks. CONCLUSION: Analysis of these results provided contrasting plasma amino acid patterns: (a) a normal pattern in those without lasting fatigue; (b) marked but temporary changes in those with acute fatigue; (c) a persistent decrease in plasma amino acids, mainly glutamine, in those with chronic fatigue and infection, for which an inadequate protein intake appeared to be a factor.
The influence of different workloads of plyometric exercise on blood pressure responses, and on the fall in blood pressure after exercise (post-exercise hypotension), is not clear.
The purpose of this investigation was to examine the effects of a low, moderate and high workload of plyometric exercise on the post-exercise systolic (SBP) and diastolic blood pressure (DBP), heart rate (HR) and rate-pressure product (RPP) responses in athletes.
Material and methods
Ten male athletes (age: 22.6 ±0.5 years; height: 178.2 ±3.3 cm; body mass: 75.2 ±2.8 kg) underwent plyometric exercise (PE) protocols involving 5 × 10 reps (Low Workload – LW), 10 × 10 reps (Moderate Workload – MW), and 15 × 10 reps (High Workload – HW) of depth jumps from a 50-cm box on 3 non-consecutive days. After each exercise session, SBP, DBP and HR were measured every 10 min for a period of 70 min.
No significant differences in post-exercise SBP or DBP were observed among the three protocols (LW, MW and HW). The MW and HW protocols produced greater increases in HR than LW, and HW produced greater post-exercise increases in RPP than LW (p < 0.05).
All protocols increased SBP, HR and RPP at the 10th and 20th min post-exercise. Among the different workloads, the HW condition produced the greatest increases in HR and RPP; strength and conditioning professionals and athletes should keep in mind that high-workload plyometric exercise induces greater cardiovascular responses.
plyometric; systolic blood pressure; diastolic blood pressure; heart rate; different workload
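The rate-pressure product used as an outcome above is conventionally the product of heart rate and systolic blood pressure. A minimal sketch of the computation; the example readings are illustrative values, not data from the study:

```python
def rate_pressure_product(hr_bpm, sbp_mmhg):
    """Rate-pressure product (RPP), a standard index of myocardial
    oxygen demand: heart rate (beats/min) x systolic BP (mmHg)."""
    return hr_bpm * sbp_mmhg

# Illustrative resting vs. post-exercise readings:
print(rate_pressure_product(70, 120))   # 8400
print(rate_pressure_product(150, 160))  # 24000
```

Because RPP multiplies two exercise-sensitive quantities, it rises faster than either HR or SBP alone, which is why workload differences show up in RPP even when SBP alone does not differ.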
Objective: To establish a time profile to determine how athletic training students use their time in clinical placements and to determine the effects of academic standing, sex, sport type, and risk of injury associated with a sport during athletic training students' clinical placements on instructional, clinical, unengaged, managerial, and active learning time.
Design and Setting: Subjects were enrolled in clinical placements within National Collegiate Athletic Association Division I athletics, intramural sports, and a local high school. Students were individually videotaped for approximately 4 hours.
Subjects: A total of 20 undergraduate athletic training students (17 women, 3 men) from a Committee on Accreditation of Allied Health Education Programs (CAAHEP)-accredited athletic training education program.
Measurements: We created a conceptual behavioral time framework to examine athletic training students' use of clinical-placement time with the performance domains associated with the 1999 National Athletic Trainers' Association Board of Certification Role Delineation Study. Students' use of time was analyzed with the Behavior Evaluation Strategies and Taxonomies software.
Results: Students spent 7% of their overall clinical-placement time in instructional activities, 23% in clinical activities, 10% in managerial activities, and 59% in unengaged activities. Using multiple 3 × 3 factorial analyses of variance, we found that advanced students were engaged in significantly more active learning and clinical time compared with novice and intermediate students. Students assigned to sports in which injuries predominantly occur in the upper extremities (upper extremity sports) spent significantly more clinical-placement time unengaged compared with students assigned to sports in which injuries predominantly occur in the lower extremities (lower extremity sports) or in both upper and lower extremities (mixed extremity sports).
Conclusions: In this exploratory study, we examined only the clinical-placement component of 1 athletic training program; therefore, it may not be accurate to generalize the results for all CAAHEP-accredited programs. However, these results can be used by athletic training educators to examine the amount of time students are actually engaged in specific domains of athletic training, to determine the domains in which skills are most commonly being performed, to identify the relationships between the students and clinical instructors or supervisors, and to develop clinical placements in which students learn and practice clinical and educational competencies.
engaged time; clinical behaviors; active learning time
Athletic training educators often anecdotally suggest that athletic training students enhance their learning by teaching their peers. However, peer-assisted learning (PAL) has not been examined within athletic training education, either to document its current use or to evaluate it as a pedagogic tool.
To describe the prevalence of PAL in athletic training clinical education and to identify students' perceptions of PAL.
“The Athletic Training Student Seminar” at the National Athletic Trainers' Association 2002 Annual Meeting and Clinical Symposia.
Patients or Other Participants:
A convenience sample of 138 entry-level male and female athletic training students.
Main Outcome Measure(s):
Students' perceptions regarding the prevalence and benefits of and preferences for PAL were measured using the Athletic Training Peer-Assisted Learning Assessment Survey. The Survey is a self-report tool with 4 items regarding the prevalence of PAL and 7 items regarding perceived benefits and preferences.
A total of 66% of participants practiced a moderate to large amount of their clinical skills with other athletic training students. Sixty percent of students reported feeling less anxious when performing clinical skills on patients in front of other athletic training students than in front of their clinical instructors. Chi-square analysis revealed that 91% of students enrolled in Commission on Accreditation of Allied Health Education Programs–accredited athletic training education programs learned a minimal to small amount of clinical skills from their peers compared with 65% of students in Joint Review Committee on Educational Programs in Athletic Training–candidacy schools (χ2(3) = 14.57, P < .01). Multiple analysis of variance revealed significant interactions between sex and academic level on several items regarding benefits and preferences.
According to athletic training students, PAL is occurring in the athletic training clinical setting. Entry-level students are utilizing their peers as resources for practicing clinical skills and report benefiting from the collaboration. Educators should consider deliberately integrating PAL into athletic training education programs to enhance student learning and collaboration.
peer teaching; clinical instruction; athletic training students; peer education
Objective: To determine if approximate entropy (ApEn), a regularity statistic from non-linear dynamics, could detect changes in postural control during quiet standing in athletes with normal postural stability after cerebral concussion.
Methods: The study was a retrospective, case series analysis of centre of pressure (COP) data collected during the Sensory Organization Test (SOT) from NCAA Division I (USA) athletes prior to and within 48 h after injury. Subjects were 21 male and six female athletes from a variety of sports who sustained a cerebral concussion between 1997 and 2003. After injury, athletes displayed normal postural stability equivalent to preseason levels. For comparison, COP data also were collected from 15 male and 15 female healthy non-athletes on two occasions. ApEn values were calculated for COP anterior-posterior (AP) and medial-lateral (ML) time series.
Results: Compared to healthy subjects, COP oscillations among athletes generally became more regular (lower ApEn value) after injury despite the absence of postural instability. For AP time series, declines in ApEn values were much larger in SOT conditions 1 and 2 (approximately three times as large as the standard error of the mean) than for all other conditions. For ML time series, ApEn values declined after injury in all sensory conditions (F1,55 = 6.36, p = 0.02).
Conclusions: Athletes who demonstrated normal postural stability after concussion nonetheless displayed subtle changes in postural control. Changes in ApEn may have represented a clinically abnormal finding. ApEn analysis of COP oscillations may be a valuable supplement to existing concussion assessment protocols for athletes.
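The regularity statistic above follows Pincus' definition of approximate entropy, ApEn = φ(m) − φ(m+1), where φ(m) averages the log-frequency with which length-m templates repeat within tolerance r (self-matches included). This is a generic sketch, not the authors' code, using the common choices m = 2 and r = 0.2·SD of the series:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D time series.

    Lower ApEn = more regular (more predictable) signal.
    r defaults to 0.2 * SD(x), a conventional tolerance.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(mm):
        # All overlapping templates of length mm.
        templ = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-match included,
        # as in Pincus' original definition, so log() is always defined).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))  # highly regular signal
noisy = rng.standard_normal(500)                   # irregular signal
print(approximate_entropy(regular) < approximate_entropy(noisy))  # True
```

The example mirrors the study's interpretation: a lower ApEn for the COP time series after concussion indicates oscillations that became more regular, even when summary stability measures looked normal.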
[Purpose] The aim of this study was to analyze stabilometry in athletes during an indoor season in order to determine whether injured athletes show different stabilometric values before injury than non-injured athletes in two different training periods (volume and pre-competition periods). [Subjects] The subjects were 51 athletes from the Unicaja athletic club who trained regularly. [Methods] At the end of the preseason and volume periods, athletes were subjected to bipodal and monopodal stabilometry. In addition, all injuries occurring in the periods after stabilometry (volume and pre-competition periods) were tracked. [Results] Variance analysis of bipodal stabilometric measurements taken at the end of the preseason period showed that athletes with higher values for the center-of-pressure spread variables suffered injuries during the volume period. The right-leg monopodal stabilometric measurements taken at the end of the volume period showed that athletes with higher values in the center-of-pressure position variables suffered injuries during the pre-competition period. [Conclusion] Athletes showing the worst values for center-of-pressure spread variables are more prone to sports injuries in the subsequent training period. In monopodal measurements, athletes with poorer mediolateral stability were more prone to injuries in the subsequent training period.
Sports injury; Athletes; Postural stability
Background and Aim:
The Functional Movement Screen (FMS™) is a screening instrument that evaluates selective fundamental movement patterns. The main aim of this study was to investigate the relationship between the FMS™ score and history of injury, and attempt to determine which active students are prone to injury.
One hundred physically active students (50 female, 50 male), between 18 and 25 years of age, with no recent (<6 weeks) history of musculoskeletal injury were recruited. All participants performed the FMS™ and were scored using the previously established standardized FMS™ criteria. The chi-square, independent t‐test, one‐way analysis of variance, and post hoc Bonferroni tests were used for data analysis with a preset alpha value of p < 0.05.
Of the 100 subjects, 35 suffered an acute lower extremity (ankle = 20, knee = 15) injury in practice or competition. An odds ratio of 4.70 was calculated, meaning that an athlete had approximately 4.7 times greater odds of suffering a lower extremity injury during a regular competitive season if he or she scored less than 17 on the FMS™. There were statistical differences between the pre‐season FMS™ scores of the injured and non‐injured groups, the ankle injury, knee injury, and non‐injured groups, and also between contact injury, non‐contact injury, and non‐injured groups.
Discussion and Conclusion:
This cross‐sectional study provides FMS™ reference values for physically active students, which will assist in the interpretation of individual scores when screening athletes for musculoskeletal injury and performance factors. More research is still necessary before implementing the FMS™ into a pre‐participation physical examination (PPE) for athletics, but due to the low cost and its simplicity to implement, it should be considered by clinicians and researchers in the future.
Level of Evidence:
Athletic performance; Functional Movement Screen™; injury risk; physically active students; pre‐participation screening
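The odds ratio reported above follows from a 2×2 table of screening result against injury status. The abstract gives only the resulting OR (4.70), not the underlying cell counts, so the counts below are hypothetical, chosen only to match the study's totals (35 injured, 65 uninjured):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
        a = low score, injured      b = low score, uninjured
        c = high score, injured     d = high score, uninjured
    """
    return (a / b) / (c / d)

# Hypothetical counts (the abstract does not report the table itself):
print(odds_ratio(21, 16, 14, 49))  # 4.59375
```

With real data, a confidence interval around the OR (e.g. via `scipy.stats.fisher_exact` or the log-OR standard error) would indicate whether the association is statistically reliable.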
Pre-exercise sports drinks (PRX) are commonly used as ergogenic aids in athletic competitions requiring aerobic power. However, in most cases, claims regarding their effectiveness have not been substantiated. In addition, the ingredients in PRX products must be deemed acceptable by the athletic governing bodies that regulate their use in training and competition. The purpose of this study was to examine the effects of a modified PRX formulation (known as EM·PACT™) from earlier investigations on factors related to maximal aerobic performance during a graded exercise test. The modification consisted of removing creatine to meet the compliance standards set forth by various athletic organizations that regulate the use of nutritional supplements.
Twenty-nine male and female college students of varying aerobic fitness participated in a randomized crossover administration of PRX (containing 14 g/serving of fructose, medium-chain triglycerides, and amino acids mixed with 8 oz. of water) and placebo (PL) 30 minutes prior to performing a treadmill test, with approximately one week separating the trials. VO2max, maximal heart rate (HR), time to exhaustion (Time), and percentage estimated non-protein fat substrate utilization (FA) during two a priori selected submaximal stages of the graded exercise test were evaluated.
The VO2max mean value of the PRX trial was significantly greater than the PL trial (P < 0.01). The mean value for Time was also observed to be greater for the PRX trial compared to PL (P < 0.05). Additionally, percentage of FA during submaximal stages of the exercise test was greater for PRX trial in comparison to PL (P < 0.01).
The modified PRX formulation utilized in this investigation supports the findings of the previous investigation, demonstrating efficacy for enhancing indices of aerobic performance (specifically VO2max, Time, and FA) during graded exercise testing.
To prospectively observe and compare injury patterns between hypermobile and nonhypermobile NCAA athletes.
Design and Setting:
Athletes were screened for generalized joint hypermobility before the 1995 lacrosse season. Injuries were recorded through the end of the postseason and compared in hypermobile and nonhypermobile athletes.
A total of 310 male and female volunteers from 17 lacrosse teams participated in the study.
Hypermobility was evaluated with the technique of Carter and Wilkinson (as modified by Beighton and colleagues), which uses 9 joint measurements to assess global joint mobility. For an athlete to be considered hypermobile, 5/9 of these measurements must have been positive. Next, certified athletic trainers prospectively recorded injuries and hours of practice and game participation on a standard form. After the season, all data forms were returned to us for analysis. Significance was set at P = .05, and χ² and independent t tests were used to compare injuries between groups.
Twenty of 147 men (13.6%) and 54 of 163 women (33.1%) were hypermobile, yielding an overall hypermobility prevalence of 23.8%. One hundred athletes sustained 134 injuries. There was no significant difference in overall injury rate between hypermobile (2.29/1000 hours) and nonhypermobile (3.54/1000 hours) athletes. Nonhypermobile athletes suffered contact injuries at a higher rate (1.38/1000 hours) than hypermobile athletes (0.52/1000 hours). Hypermobile athletes showed an increased rate of ankle injuries, and nonhypermobile athletes showed a trend toward an increased rate of strains. Multiple approaches to analysis of the data revealed no other significant findings.
There was no difference in overall injury rates between hypermobile and nonhypermobile athletes in this sample. This finding is somewhat surprising in light of significant evidence that hypermobility appears to be a factor in joint complaints among nonathletes. Additional research is needed to clearly determine whether a relationship exists between hypermobility and injury rates among athletes.
athletic injury surveillance; laxity; injury risk; rheumatology
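The injury rates above are expressed per 1000 athlete-exposure hours: injuries divided by recorded practice and game hours, scaled by 1000. A sketch of the arithmetic; the exposure-hour figure is hypothetical, since the abstract reports the rates but not the raw hours:

```python
def injuries_per_1000h(n_injuries, exposure_hours):
    """Injury incidence per 1000 hours of practice and game exposure."""
    return 1000 * n_injuries / exposure_hours

# Hypothetical exposure total for illustration only:
print(round(injuries_per_1000h(134, 45000), 2))  # 2.98
```

Normalizing by exposure hours, rather than comparing raw injury counts, is what makes the hypermobile and nonhypermobile groups comparable despite different group sizes and playing time.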
A fall in FEV1 of ⩾10% following bronchoprovocation (eucapnic voluntary hyperventilation (EVH) or exercise) is regarded as the gold standard criterion for diagnosing exercise induced asthma (EIA) in athletes. Previous studies have suggested that mid‐expiratory flow (FEF50) might be used to supplement FEV1 to improve the sensitivity and specificity of the diagnosis. A study was undertaken to investigate the response of FEF50 following EVH or exercise challenges in elite athletes as an adjunct to FEV1.
Sixty six male (36 asthmatic, 30 non‐asthmatic) and 50 female (24 asthmatic, 26 non‐asthmatic) elite athletes volunteered for the study. Maximal voluntary flow‐volume loops were measured before and 3, 5, 10, and 15 minutes after stopping EVH or exercise. A fall in FEV1 of ⩾10% and a fall in FEF50 of ⩾26% were used as the cut off criteria for identification of EIA.
There was a strong correlation between ΔFEV1 and ΔFEF50 following bronchoprovocation (r = 0.94, p < 0.001). Sixty athletes had a fall in FEV1 of ⩾10% leading to the diagnosis of EIA. Using the FEF50 criterion alone led to 21 (35%) of these asthmatic athletes receiving a false negative diagnosis. The lowest fall in FEF50 in an athlete with a ⩾10% fall in FEV1 was 14.3%. Reducing the FEF50 criterion to ⩾14% led to 13 athletes receiving a false positive diagnosis. Only one athlete had a fall in FEF50 of ⩾26% in the absence of a fall in FEV1 of ⩾10% (ΔFEV1 = 8.9%).
The inclusion of FEF50 in the diagnosis of EIA in elite athletes does not enhance the sensitivity or specificity of the diagnosis, and FEF50 used alone is insufficiently sensitive to diagnose EIA reliably in elite athletes.
asthma; sensitivity; diagnosis; eucapnic voluntary hyperventilation; elite athletes
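The diagnostic performance of the FEF50 criterion can be reconstructed from the counts in the abstract, taking a ⩾10% fall in FEV1 as the reference standard (60 positives and 56 negatives among the 116 athletes). With the ⩾26% FEF50 cut-off there were 21 false negatives and 1 false positive:

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity and specificity from diagnostic test counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts derived from the abstract (FEV1 fall >=10% as reference standard):
# 60 true asthmatics, of whom FEF50 >=26% identified 39 (missed 21);
# 56 non-asthmatics, of whom 1 was falsely flagged.
sens, spec = sens_spec(tp=39, fn=21, fp=1, tn=55)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# sensitivity=0.65, specificity=0.98
```

A sensitivity of 0.65 makes concrete why the authors conclude FEF50 alone is insufficient: roughly one in three asthmatic athletes would be missed despite excellent specificity.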