Body mass index (BMI) is widely accepted as a tool for determining obesity. Skinfold thickness measurements are commonly used to determine percentage of body fat.
The authors hypothesize that because BMI does not measure fat directly but relies on body weight alone, a large percentage of athletic adolescents will be misclassified as obese by BMI.
To compare BMI and skinfold measurements as indicators for obesity in the adolescent athletic population, anthropometric data (height, weight, percentage body fat, age, and sex) were recorded from 33 896 student athletes (average age, 15 years; range, 11-19 years) during preparticipation physical examinations from 1985 to 2003. BMI was calculated from height and weight. Percentage of body fat was determined by measuring skinfold thickness.
According to their BMI percentile, 13.31% of adolescent athletes were obese. Using the skinfold method, only 5.95% were obese. Of those classified as obese by the BMI, 62% were considered false positives by the skinfold method. In contrast, there was a 99% probability that the nonobese by BMI would not be obese by the skinfold method (negative predictive value = 0.99).
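The predictive values reported above follow from a 2 × 2 classification table. As a rough sketch, the counts below are reconstructed from the abstract's percentages, not taken from the raw data:

```python
# Counts reconstructed from the abstract's percentages (illustrative only)
n = 33896
bmi_obese = round(n * 0.1331)        # flagged obese by BMI percentile
skin_obese = round(n * 0.0595)       # obese by the skinfold criterion

fp = round(bmi_obese * 0.62)         # BMI-obese but lean by skinfold
tp = bmi_obese - fp                  # obese by both methods
fn = skin_obese - tp                 # skinfold-obese missed by BMI
tn = (n - bmi_obese) - fn            # nonobese by both methods

ppv = tp / (tp + fp)                 # probability a BMI-obese athlete is truly obese
npv = tn / (tn + fn)                 # probability a BMI-nonobese athlete is truly nonobese
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # PPV = 0.38, NPV = 0.99
```

The low positive predictive value (0.38, i.e. 62% false positives) alongside the high negative predictive value (0.99) is exactly the asymmetry the abstract describes.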
BMI is a measurement of relative body weight, not body composition. Because lean mass weighs far more than fat, many adolescent athletes are incorrectly classified as obese based on BMI. Skinfold testing provides a more accurate body assessment than BMI in adolescent athletes.
Correct body composition data can help to provide better diet and activity guidelines and prevent the psychological problems associated with being labeled as obese.
body mass index; skinfold; adolescent; obesity
Consumption of energy drinks has become widespread among athletes. The effectiveness of Red Bull and Hype energy drinks on selected indices of maximal cardiorespiratory fitness and blood lactate levels in male athletes was examined in this study.
Ten male student athletes (age: 22.4 ± 2.1 years, height: 180.8 ± 7.7 cm, weight: 74.2 ± 8.5 kg) performed three randomized maximal oxygen consumption tests on a treadmill. Each test was separated by four days, and participants were asked to ingest Red Bull, Hype, or a placebo drink 40 minutes before the exercise bout. VO2max, time to exhaustion, heart rate, and blood lactate were measured to determine whether the caffeine-based beverages influence performance. ANOVA was used to analyze the data.
Greater values were observed in VO2max and time to exhaustion for the Red Bull and Hype trials compared with the placebo trial (p < 0.05). No significant difference was found in pre- and post-test heart rate for the two drinks (p > 0.05). No significant changes in blood lactate levels were observed before and two minutes after the test (p > 0.05).
Ingestion of Red Bull or Hype prior to exercise testing improves some indices of cardiorespiratory fitness but does not affect blood lactate levels.
Energy Drink; Caffeine; Taurine; VO2max; Blood Lactate; Male Athletes
Although young African-American males are at particularly high risk of developing hypertension at an early age, dietary interventions that have successfully reduced blood pressure among African-American adults have not been translated into programs for this group. Life contexts such as school enrollment, competitive athletics, and employment influence the daily activities and meal patterns of African-American males. This study explored the activities of young African-American males to identify opportunities to increase healthful food choices. A purposive sample was recruited which included five groups of African-American males (15–22 years of age, n=106): high school athletes and non-athletes, college athletes and non-athletes, and non-students. A structured interview guided participants through a description of their activities, meal patterns, and food choices over the course of a typical weekday. Common elements emerged that provided a contextual view of the participant meal patterns and food choices. These elements were sports team participation, college employment, school as a food source, non-student status, and eating dinner at home. These findings suggest opportunities for the design of dietary interventions for young African-American males which take into consideration how school, athletics and employment may influence opportunities to eat regular meals that include healthful foods.
minority health; hypertension prevention; meal patterns; food choice; African-American men’s health; qualitative research
A general delay in menarche in female athletes has been confirmed based on comparisons of mean ages between athletes and non-athletes; however, it has not been possible to judge such delays individually. If delayed menarche could be evaluated for an individual, the athlete could be advised as to necessary precautions. In this study, the age at maximum peak velocity (MPV) of height, adopted as an index of physical maturation, was identified by the wavelet interpolation method (WIM). The relationship between the age at menarche and age at MPV of height in female athletes and non-athletes was then examined. For the athlete group, health examination records of 90 female ball game players in the first year of university in the Tokai area, all of whom had participated in national level competitions, were reviewed for the period from the first grade of elementary school until the final year of high school (from 1985 to 1996). A similar examination was conducted for the control group, among whom a final group of 78 female non-athletes were selected. The age at menarche was determined by questionnaires, and the longitudinal data for height and weight were obtained from the health examination records. Based on a comparison of the difference between the age at MPV of height and age at menarche in ball game players and the control group, a tendency was seen for the difference between the two ages to narrow as the age at MPV of height rose. A corrected regression evaluation of age at menarche against age at MPV of height was derived in the control group, and the evaluation system was applied to ball game players. The delay in menarche in ball game players could be individually evaluated.
delay in menarche; maximum peak velocity (MPV); ball game players; wavelet; regression analysis
Increased attention has been directed toward assessing and improving academic quality in athletic training education. The educational process has been assessed from a global level, but little is known about how athletic training students learn. The purpose of this investigation was to assess the learning styles of undergraduate athletic training students.
Design and Setting:
Undergraduate students enrolled in a Committee on Accreditation of Allied Health Education Programs (CAAHEP)-accredited athletic training education program completed a learning styles inventory during a regularly scheduled athletic training class at the start of the spring semester.
Twenty-seven student athletic trainers (age range, 19–30 years; mean age, 20.5 years) served as subjects. Sixteen subjects (7 male, 9 female) were in the first year of this 3-year program. Eleven subjects (7 male, 4 female) were second-year students.
Learning style was assessed using the Productivity Environmental Preference Survey.
Parametric and nonparametric one-way analyses of variance for each learning subscale by sex and by year in program revealed significant differences (P < .05) in light preferences between male and female students. There were also significant differences (P < .05) between first- and second-year students in preferences for afternoon learning activities.
These findings suggest that undergraduate athletic training students function best as learners in a well-lit learning environment. The significance of afternoon as the preferred time for learning reinforces the importance of the clinical setting in the introduction and mastery of skills. Athletic training educators and clinical instructors can use these results as they examine their teaching strategies and educational environments.
learning preferences; Productivity Environmental Preference Survey
Goal setting difficulty has been shown to contribute to athletic performance (Burton et al., 2000). However, the potential mediating mechanism of goal difficulty on performance is unclear. Therefore, the purpose of this study was to examine the effect of goal setting difficulty on serving success in table tennis, and determine if self-regulation is the mediating variable. The current study used serving success within a one-minute period as the task, and the “Athlete’s Self-regulation in Motor Learning” as the measurement tool. The experiment was designed as a 3 (serving frequency: 20/min, 23/min, and 26/min) × 2 (serving placement: left “small triangle”, and right “small triangle”) model. Participants (N = 60) in the current study were students from a physical education school. These participants were randomly assigned into the experimental and control groups. After the intervention, differences in self-regulation (p < 0.001) and serving success (p < 0.05) between the experimental and control groups were significant. For the experimental groups, there was a significant difference in self-regulation (p < 0.001) and serving success (p < 0.05) before and after the experiment. Serving frequency had a main effect on self-regulation (F(5, 24) = 12.398, p < 0.01) and serving success (F(5, 24) = 37.601, p < 0.001). Moderately difficult goal setting contributed to athletic performance. Regression analysis using bootstrapping methods revealed that self-regulation partially mediated the relationship between goal difficulty and serving success.
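A bootstrapped test of an indirect (mediated) effect of the kind used above can be sketched on synthetic data; the path coefficients, noise levels, and variable names below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
# Synthetic stand-ins: goal difficulty (x), self-regulation (m_), serving success (y)
x = rng.normal(size=n)
m_ = 0.6 * x + rng.normal(scale=0.8, size=n)            # a path: x -> mediator
y = 0.5 * m_ + 0.3 * x + rng.normal(scale=0.8, size=n)  # b path plus a direct effect

def indirect_effect(idx):
    xs, ms, ys = x[idx], m_[idx], y[idx]
    a = np.polyfit(xs, ms, 1)[0]                       # slope of mediator on x
    design = np.column_stack([np.ones(len(idx)), xs, ms])
    b = np.linalg.lstsq(design, ys, rcond=None)[0][2]  # slope of y on mediator, controlling x
    return a * b                                       # indirect effect a*b

boot = [indirect_effect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.2f}, {hi:.2f}]")
```

If the bootstrap confidence interval excludes zero, mediation is supported; "partial" mediation corresponds to the direct path remaining nonzero as well.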
table tennis athletes; goal setting difficulty; self-regulation; serving success
A fall in FEV1 of ⩾10% following bronchoprovocation (eucapnic voluntary hyperventilation (EVH) or exercise) is regarded as the gold standard criterion for diagnosing exercise induced asthma (EIA) in athletes. Previous studies have suggested that mid‐expiratory flow (FEF50) might be used to supplement FEV1 to improve the sensitivity and specificity of the diagnosis. A study was undertaken to investigate the response of FEF50 following EVH or exercise challenges in elite athletes as an adjunct to FEV1.
Sixty six male (36 asthmatic, 30 non‐asthmatic) and 50 female (24 asthmatic, 26 non‐asthmatic) elite athletes volunteered for the study. Maximal voluntary flow‐volume loops were measured before and 3, 5, 10, and 15 minutes after stopping EVH or exercise. A fall in FEV1 of ⩾10% and a fall in FEF50 of ⩾26% were used as the cut off criteria for identification of EIA.
There was a strong correlation between ΔFEV1 and ΔFEF50 following bronchoprovocation (r = 0.94, p < 0.001). Sixty athletes had a fall in FEV1 of ⩾10%, leading to the diagnosis of EIA. Using the FEF50 criterion alone led to 21 (35%) of these asthmatic athletes receiving a false negative diagnosis. The lowest fall in FEF50 in an athlete with a ⩾10% fall in FEV1 was 14.3%. Reducing the FEF50 criterion to ⩾14% led to 13 athletes receiving a false positive diagnosis. Only one athlete had a fall in FEF50 of ⩾26% in the absence of a fall in FEV1 of ⩾10% (ΔFEV1 = 8.9%).
The inclusion of FEF50 does not enhance the sensitivity or specificity of the diagnosis of EIA in elite athletes, and the use of FEF50 alone is insufficiently sensitive to diagnose EIA reliably.
asthma; sensitivity; diagnosis; eucapnic voluntary hyperventilation; elite athletes
Context: As the number of female college students participating in athletics has grown dramatically in the last few decades, sports medicine health care providers have become more aware of the unique health concerns of athletic women. These concerns include disordered eating, amenorrhea, and osteoporosis: the female athlete triad. Disordered eating appears to be central in the triad, and the literature has conflicting data regarding the influence of athletic participation on disordered-eating behaviors.
Objective: To compare disordered-eating symptoms between collegiate athletes (in lean and non-lean sports) and nonathletes.
Design: A volunteer, cross-sectional cohort study of female students during the 2002–2003 academic year.
Setting: A National Collegiate Athletic Association Division I institution.
Patients or Other Participants: Undergraduate females, including 84 collegiate athletes and 62 nonathletes.
Main Outcome Measure(s): Symptoms associated with disordered eating were assessed using the Eating Disorders Inventory-2, a self-report measure of 91 items, and self-reported weight and menstrual function.
Results: The athletes had significantly lower scores in body dissatisfaction (P = .01) and ineffectiveness (P = .002). No difference in mean body weight was noted between the 2 groups, but the nonathlete group had a significantly lower desired body weight (P = .004). Lean-sport athletes had a higher score on body dissatisfaction (P = .008) and lower actual (P = .024) and desired body weight (P = .002) than non–lean-sport athletes. A total of 7.1% of the collegiate athletes and 12.9% of the nonathletes were classified as having a high risk for disordered eating. Within the athlete sample, the high-risk group included 2.9% of the non–lean-sport athletes and 25% of the lean-sport athletes.
Conclusions: In our study, female athletes did not exhibit more disordered-eating symptoms than women who did not participate in collegiate sports. However, our data suggest that lean-sport athletes are at greater risk for disordered eating than athletes in non-lean sports.
female athlete triad; nutrition; psychology
Elite endurance athletes typically have larger arteries contributing to greater skeletal muscle blood flow, oxygen and nutrient delivery and improved physical performance. Few studies have examined structural and functional properties of arteries in power athletes.
To compare the size and vasoreactivity of the brachial artery of elite power athletes with those of age-matched controls. It was hypothesized that the brachial artery diameters of athletes would be larger, with less vasodilation in response to cuff occlusion but more constriction after a cold pressor test, than those of age-matched controls.
Eight elite power athletes (age = 23±2 years) and ten controls (age = 22±1 yrs) were studied. High-resolution ultrasonography was used to assess brachial artery diameters at rest and following 5 minutes of forearm occlusion (Brachial Artery Flow Mediated Dilation = BAFMD) and a cold pressor test (CPT). Basic fitness measures included a handgrip test and 3-minute step test.
Brachial arteries of athletes were larger (athletes: 5.39±1.51 vs controls: 3.73±0.71 mm, p<0.05) and had greater vasodilatory (BAFMD%: athletes: 8.21±1.78 vs controls: 5.69±1.56%) and constrictor (CPT%: athletes: −2.95±1.07 vs controls: −1.20±0.48%) responses compared with controls. Vascular operating range (VOR = peak dilation + peak constriction) was also greater in athletes (VOR: athletes: 0.55±0.15 vs controls: 0.25±0.18 mm, p<0.05). Athletes had superior handgrip strength (athletes: 55.92±17.06 vs controls: 36.77±17.06 kg, p<0.05) but similar heart rate responses at peak (athletes: 123±16 vs controls: 130±25 bpm, p>0.05) and at 1 minute of recovery (athletes: 88±21 vs controls: 98±26 bpm, p>0.05) following the step test.
Elite power athletes have larger brachial arteries and greater vasoreactivity (greater vasodilatory and constrictor responses) than age-matched controls, contributing to a significantly greater VOR. These data extend the concept of an ‘athlete’s artery’, previously described in elite endurance athletes, to elite power athletes and offer a hypothetical explanation for the functional significance of the ‘power athlete’s artery’.
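As a rough check of the VOR arithmetic using the athletes' group means from the abstract (the reported 0.55 mm was presumably computed per athlete, so group means need not reproduce it exactly):

```python
# Illustrative arithmetic from the athletes' group means in the abstract
base = 5.39                                  # resting brachial diameter, mm

peak_dilation = base * (1 + 8.21 / 100)      # +8.21% BAFMD response
peak_constriction = base * (1 - 2.95 / 100)  # -2.95% after cold pressor

fmd_mm = peak_dilation - base                # dilatory excursion, mm
cpt_mm = base - peak_constriction            # constrictor excursion, mm
vor = fmd_mm + cpt_mm                        # vascular operating range, mm
print(f"VOR ~= {vor:.2f} mm")                # ~0.60 mm from group means
```

The group-mean estimate (~0.60 mm) lands near the reported per-athlete mean of 0.55±0.15 mm, and both are roughly double the controls' 0.25 mm.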
The subjects of the study were 418 highly successful female athletes and 512 female non-athletes drawn from all over Nigeria. Recall procedures were used to ascertain the age at menarche. The study gave the following results: in general, the overall mean menarcheal age of athletes (14.13 years) was significantly higher (p < .05) than that of non-athletes (13.57 years). Menarche was significantly delayed (14.41 years; p < .05) in those athletes (n = 272) who started physical activities before the onset of menstruation. The mean menarcheal age of non-athletes (i.e., the general population) was significantly lower (p < .05) than that established thirty years ago in Nigerian women.
Obesity and increased blood pressure are identified as risk factors for cardiac and pulmonary disorders. Iron deficiency, another preventable condition, is common in adolescence and is associated with health impairment. The present study evaluates body mass index (BMI) and its association with blood pressure and hematological indices in freshman students entering the University of Isfahan in 2009.
All the 1675 students who entered the University of Isfahan in September 2009 were examined. Height, weight, BMI, blood pressure, hemoglobin (Hb) and red blood cell (RBC) indices of these students were measured. The prevalence of high blood pressure, its association with BMI and the relation between BMI and anemia, iron deficiency and educational achievement were assessed.
All participants, including 514 males and 1161 females, underwent clinical examination. The average age was 20.7 ± 3.8 years. Among the students, 18.2% of males and 20% of females were underweight. High systolic blood pressure was more common in students with BMI > 25 kg/m2 (p < 0.001). Anemia was seen in 8.7% of females. In males, however, the relation between anemia frequency and BMI < 18.5 kg/m2 was more distinct (p = 0.002). There was no association between anemia and students’ average test scores.
The high incidence of abnormal BMI in the study population, and its association with systolic blood pressure, indicates the importance of nutritional guidelines and counseling programs for freshman students. Likewise, the high incidence of anemia in this population underscores the need for anemia screening programs before academic studies begin.
Body mass index; Systolic blood pressure; Iron deficiency anemia
Background: Subjects exercising without fluid ingestion in desert heat terminated exercise when the total loss in body weight exceeded 7%. It is not known if athletes competing in cooler conditions with free access to fluid terminate exercise at similar levels of weight loss.
Objectives: To determine any associations between percentage weight losses during a 224 km Ironman triathlon, serum sodium concentrations and rectal temperatures after the race, and prevalence of medical diagnoses.
Methods: Athletes competing in the 2000 and 2001 South African Ironman triathlon were weighed on the day of registration and again immediately before and immediately after the race. Blood pressure and serum sodium concentrations were measured at registration and immediately after the race. Rectal temperatures were also measured after the race, at which time all athletes were medically examined. Athletes were assigned to one of three groups according to percentage weight loss during the race.
Results: Body weight was significantly (p<0.0001) reduced after the race in all three groups. Serum sodium concentrations were significantly (p<0.001) higher in athletes with the greatest percentage weight loss. Rectal temperatures were the same in all groups, with only a weak inverse association between temperature and percentage weight loss. There were no significant differences in diagnostic indices of high weight loss or incidence of medical diagnoses between groups.
Conclusions: Large changes in body weight during a triathlon were not associated with a greater prevalence of medical complications or higher rectal temperatures but were associated with higher serum sodium concentrations.
Context: Athletic training education programs must provide the proper type and amount of clinical supervision in order for athletic training students to obtain appropriate clinical education and to meet Board of Certification examination requirements.
Objective: To assess athletic training students' perceptions of the type and amount of clinical supervision received during clinical education.
Design: Cross-sectional design.
Setting: 124 CAAHEP-accredited NCAA institutions.
Patients or Other Participants: We obtained a national stratified random sample (by National Athletic Trainers' Association district) of undergraduate athletic training students from 61 Commission on Accreditation of Allied Health Education Programs–accredited athletic training education programs. A total of 851 athletic training students participated in the study.
Main Outcome Measure(s): Differences among athletic training students with first-aider/provider qualifications, student supervision during moderate-risk and increased-risk sports, program/institutional characteristics, type and amount of clinical supervision, and students' academic level and mean percentage of time spent in different types of clinical supervision.
Results: A total of 276 (32.4%) of the students reported that they supplied medical care and athletic training–related coverage beyond that of a first aider/provider. Athletic training students stating that they traveled with teams without supervision numbered 342 (40.2%). A significant difference was noted between the amount of supervision reported by sophomore and senior students (P < .01).
Conclusions: Athletic training students do not seem to be receiving appropriate clinical supervision and are often acting outside the scope of clinical education.
clinical experience; field experience; clinical instruction; athletic training education
To document the relation between serum creatinine concentration and body mass index in elite athletes from five different sports, and to study potential differences among athletes performing different sports with different features and requirements.
Before the start of the competitive season, serum creatinine was measured in 151 elite athletes from five different sports: rugby (n = 44), soccer (n = 27), alpine skiing (n = 34), sailing (n = 22), cycling (n = 24). Pearson's correlation analysis was used to evaluate the relation between serum creatinine and body mass index (BMI). Analysis of variance and unpaired Student's t test were used to compare creatinine concentration and BMI in different sport disciplines.
In the whole group of athletes, a positive correlation between serum creatinine and BMI was found (r = 0.48, p<0.001). Significant differences in creatinine concentration and BMI were found between athletes competing in different sports: their mean (SD) values were respectively 1.31 (0.12) mg/dl and 28.83 (2.41) for rugby players, 1.27 (0.10) mg/dl and 23.10 (1.01) for soccer players, 1.15 (0.11) mg/dl and 25.8 (1.50) for skiers, 1.08 (0.11) mg/dl and 26.93 (2.36) for sailors, and 0.91 (0.07) mg/dl and 21.33 (1.21) for cyclists.
There is a correlation between creatinine concentration and BMI in elite athletes competing in sports characterised by different kinds of training, competitive seasons, and involvement of aerobic and anaerobic metabolism. Interpretation of creatinine concentrations in male athletes should consider professional status as well as the specific sport performed. Until equations that incorporate creatinine and the factors affecting its concentration become available, athletes should be monitored with serial creatinine assessments, using the concentration determined before the start of training and the competitive season as the baseline and taking the specific sport and BMI into account.
creatinine; athletes; body mass index
Objective: To determine if approximate entropy (ApEn), a regularity statistic from non-linear dynamics, could detect changes in postural control during quiet standing in athletes with normal postural stability after cerebral concussion.
Methods: The study was a retrospective, case series analysis of centre of pressure (COP) data collected during the Sensory Organization Test (SOT) from NCAA Division I (USA) athletes prior to and within 48 h after injury. Subjects were 21 male and six female athletes from a variety of sports who sustained a cerebral concussion between 1997 and 2003. After injury, athletes displayed normal postural stability equivalent to preseason levels. For comparison, COP data also were collected from 15 male and 15 female healthy non-athletes on two occasions. ApEn values were calculated for COP anterior-posterior (AP) and medial-lateral (ML) time series.
Results: Compared to healthy subjects, COP oscillations among athletes generally became more regular (lower ApEn value) after injury despite the absence of postural instability. For AP time series, declines in ApEn values were much larger in SOT conditions 1 and 2 (approximately three times as large as the standard error of the mean) than for all other conditions. For ML time series, ApEn values declined after injury in all sensory conditions (F1,55 = 6.36, p = 0.02).
Conclusions: Athletes who demonstrated normal postural stability after concussion nonetheless displayed subtle changes in postural control. Changes in ApEn may have represented a clinically abnormal finding. ApEn analysis of COP oscillations may be a valuable supplement to existing concussion assessment protocols for athletes.
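The approximate entropy statistic used above can be sketched as follows; `m = 2` and `r = 0.2 × SD` are common defaults from the literature, not necessarily this study's exact parameters:

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D time series.

    Lower values indicate a more regular signal, which is the direction
    of change the study reports in concussed athletes.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()  # tolerance as a fraction of signal variability

    def phi(m):
        # All overlapping length-m template vectors
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included,
        # so the argument of log is always positive)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A periodic signal is more regular (lower ApEn) than white noise
t = np.linspace(0, 8 * np.pi, 300)
print(approx_entropy(np.sin(t)), approx_entropy(np.random.default_rng(0).normal(size=300)))
```

Applied to a COP time series, a drop in ApEn after injury would indicate more regular sway, even when the amplitude of sway (the usual stability measure) is unchanged.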
The serum concentration of creatine kinase (CK) is used widely as an index of skeletal muscle fibre damage in sport and exercise. Since athletes have higher CK values than non‐athletes, comparing the values of athletes to the normal values established in non‐athletes is pointless. The purpose of this study was to introduce reference intervals for CK in athletes.
CK was assayed in serum samples from 483 male athletes and 245 female athletes, aged 7–44. Samples had been obtained throughout the training and competition period. For comparison, CK was also assayed in a smaller number of non‐athletes. Reference intervals (2.5th to 97.5th percentile) were calculated by the non‐parametric method.
The reference intervals were 82–1083 U/L (37°C) in male and 47–513 U/L in female athletes. The upper reference limits were twice the limits reported for moderately active non‐athletes in the literature or calculated in the non‐athletes in this study. The upper limits were up to six times higher than the limits reported for inactive individuals in the literature. When reference intervals were calculated specifically in male football (soccer) players and swimmers, a threefold difference in the upper reference limit was found (1492 vs 523 U/L, respectively), probably resulting from the different training and competition demands of the two sports.
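The nonparametric percentile method used above is straightforward to sketch; the CK values below are a right-skewed synthetic stand-in for the athletes' measurements, which are not public:

```python
import numpy as np

# Hypothetical CK values (U/L), lognormal to mimic the right-skewed
# distribution typical of serum CK -- illustration only, not study data
rng = np.random.default_rng(0)
ck = rng.lognormal(mean=5.6, sigma=0.6, size=483)

# Nonparametric reference interval: 2.5th to 97.5th percentiles
lower, upper = np.percentile(ck, [2.5, 97.5])
print(f"reference interval: {lower:.0f}-{upper:.0f} U/L")
```

Because the method makes no distributional assumption, it handles the skew of CK directly; sport-specific intervals follow by computing the same percentiles within each sport's subsample.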
Sport training and competition have profound effects on the reference intervals for serum CK. Introducing sport‐specific reference intervals may help to avoid misinterpretation of high values and to optimise training.
To examine the impact of life-stress sources that student athletic trainers encountered over the course of an academic year, to investigate the existence of sex differences in stress source symptoms, and to provide athletic training staffs with suggestions on ways to assist student athletic trainers.
Design and Setting:
In a classroom setting, the 25-item Quick Stress Questionnaire (QSQ) was administered to all subjects at the beginning of each month during an academic year. The QSQ, which can be completed in approximately 5 minutes, uses a 9-point Likert scale ranging from 1 (little stress) to 9 (extreme stress) to measure sources of stress and stress-related symptoms.
The sample consisted of 11 male and 9 female student athletic trainers enrolled in a Commission on Accreditation of Allied Health Education Programs (CAAHEP)-accredited undergraduate program at a mid-Atlantic university.
We computed descriptive statistics for the stress items and symptoms (ie, cognitive, somatic, and behavioral) and graphed them according to sex. Separate sex × time analyses of variance were performed to investigate changes in cognitive, somatic, and behavioral stress over the course of the study and to determine if these changes were different for male and female student athletic trainers.
Academic and financial concerns represented the greatest sources of stress for student athletic trainers. Repeated-measures analyses of variance indicated that stress levels fluctuated significantly during the academic year, with peak stress levels experienced during midterm and at the end of the spring semester. Although female student athletic trainers consistently reported higher levels of stress than their male counterparts, these differences were not statistically significant.
Student athletic trainers exhibited fluctuations in their stress levels throughout an academic calendar. Academic and financial concerns were the most common stressors. Certified athletic trainers should take an interest in their student athletic trainers and be willing to provide assistance in times of need. Additional research is needed regarding student athletic trainers and stress.
burnout; college students; coping; stress
This study was conducted in order to measure the reported pain caused by cold immersions over a 5-day period to determine if habituation to the perception of cold pain occurs. Numerous authors have described a habituation phenomenon to therapeutic ice bath immersions. Athletic trainers often explain to athletes that their perceptions of the pain induced by a therapeutic ice bath will decrease each day as they proceed through therapy. Essentially, it is assumed that there is a habituation to the perception of cold-induced pain shortly after initiation of the treatment regime. The subjects were 22 male and female college students who had limited experience with cold immersion. The subjects' right feet and ankles were immersed in an ice bath for 21 minutes on 5 consecutive days followed by a 21-minute recovery period. The McGill Pain Questionnaire (MPQ) was used to measure pain during the immersions. Sensory, affective, evaluative, and miscellaneous qualities of pain were determined from the MPQ. During the testing session, each subject completed the MPQ 30 seconds following immersion and then every 3 minutes until completion of the test. Repeated measures analyses of variance (ANOVAs) adjusted according to the Bonferroni correction revealed no significant differences for any of the qualities of pain over a 5-day period. The subjects' perception of cold-induced pain did appear to decrease during the immersion and there was a trend towards decreasing pain during day five, but a habituation effect was not documented in this study.
To determine the relationship between relative body composition and body mass to height, anterior knee pain, or patellofemoral pain (PFP) in adolescent female athletes.
Patellofemoral pain is common in female athletes and has an undefined etiology. The purpose of this study was to examine whether there was an association among higher body mass index (BMI), BMI z-scores, and relative body fat percentage in the development of PFP in an adolescent female athlete population. We hypothesized that female athletes who developed PFP over the course of a competitive basketball season had higher relative body mass or body fat percentage compared with those who did not develop PFP.
Fifteen middle school basketball teams that consisted of 248 basketball players (mean age, 12.76 ± 1.13 years; height, 158.43 ± 7.78 cm; body mass, 52.35 ± 12.31 kg; BMI, 20.73 ± 3.88 kg/m2) agreed to participate in this study over the course of 2 basketball seasons, resulting in 262 athlete-seasons. Testing included the completion of the Anterior Knee Pain Scale (AKPS), International Knee Documentation Committee (IKDC) form, standardized history, physician-administered physical examination, maturational estimates, and anthropometrics.
Of the 262 athlete-seasons monitored, 39 athletes developed PFP over the course of the study. The incidence rate of new PFP was 1.57 per 1000 athlete-exposures, and the cumulative incidence of PFP was 14.9%. There was no difference in BMI between those who developed PFP (mean BMI, 20.2 kg/m2; 95% CI, 18.9–21.4) and those who did not develop PFP (mean BMI, 20.8 kg/m2; 95% CI, 20.3–21.3; P > 0.05). BMI z-scores likewise did not differ between those who developed PFP (mean, 0.3; 95% CI, 0.7–0.6) and those who did not (mean, 0.4; 95% CI, 0.3–0.6; P > 0.05). Relative body fat percentage showed the same pattern, with similar means in those who developed PFP (22.2%; 95% CI, 19.4–24.9) and in the referent group who did not (22.9%; 95% CI, 21.8–24.1; P > 0.05).
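The two incidence figures above follow from simple ratios. A quick check in Python; note the athlete-exposure denominator is not reported in the abstract and is back-calculated here as an assumption:

```python
# Cumulative incidence and incidence rate from the reported counts.
new_cases = 39
athlete_seasons = 262
athlete_exposures = 24_840  # ASSUMED: back-calculated from 39 cases at 1.57/1000 AEs

cumulative_incidence = new_cases / athlete_seasons * 100   # percent of athlete-seasons
incidence_rate = new_cases / athlete_exposures * 1000      # cases per 1000 athlete-exposures

print(round(cumulative_incidence, 1))  # 14.9
print(round(incidence_rate, 2))        # 1.57
```

Cumulative incidence uses athlete-seasons as the denominator, whereas the incidence rate uses total athlete-exposures, which is why the two numbers differ so much in magnitude.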
Our results do not indicate a relationship between relative body composition or relative body mass to height and the propensity to develop PFP in middle school–aged female basketball players. Although previous data indicate a relationship between higher relative body mass and overall knee injury, these data did not support that association with PFP specifically. These data suggest the underlying etiology of PFP may be neuromuscular in nature. Further research is needed to understand the predictors, etiology, and ultimate prevention of this condition.
patellofemoral pain; anterior knee pain; biomechanics; body mass index; BMI z-score; anthropometrics; body fat
Post-exercise hypotension (PEH) following prolonged dynamic exercise arises from an increase in total vascular conductance (TVC) via skeletal muscle vasodilation. However, arterial vasodilation of the skeletal musculature does not entirely account for the rise in TVC. The aim of the present study was to determine the contribution of the vascular conductance (VC) of the legs, arms, kidneys, and viscera to TVC during PEH.
Eight subjects performed a single bout of cycling at 60% of heart rate (HR) reserve for 60 minutes. Blood flow in the right renal, superior mesenteric, right brachial, and right femoral arteries was measured by Doppler ultrasonography with the subject supine before exercise and during recovery. HR and mean arterial pressure (MAP) were measured continuously. MAP decreased significantly from approximately 25 minutes after exercise cessation compared with the pre-exercise baseline. TVC increased significantly (approximately 23%; P < 0.05) after exercise compared with baseline, a rise that resulted from increased VC in the leg (approximately 33%) and arm (approximately 20%), but not in the abdomen.
PEH was induced not by a decrease in cardiac output but by an increase in TVC, approximately two-thirds of which can be attributed to increased VC in the active and inactive limbs.
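Vascular conductance in studies of this kind is conventionally computed as blood flow divided by mean arterial pressure (and TVC as cardiac output divided by MAP). A minimal sketch with illustrative numbers, not values from this study:

```python
# Conductance = flow / perfusion pressure, here in ml/min/mmHg.
# All numbers below are illustrative, not data from the study.
def conductance(flow_ml_min: float, map_mmhg: float) -> float:
    return flow_ml_min / map_mmhg

pre_vc  = conductance(350.0, 90.0)  # hypothetical femoral flow at baseline
post_vc = conductance(430.0, 83.0)  # hypothetical flow during PEH (MAP lower)
pct_rise = (post_vc / pre_vc - 1.0) * 100.0  # percent rise in VC
```

Because MAP falls during PEH while limb flow rises, conductance can increase substantially even when the flow change alone looks modest.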
Regional hemodynamics; Central hemodynamics; Post-exercise hypotension; Doppler ultrasonography
Left ventricular dimensions and wall stress were measured echocardiographically before and immediately after exercise in 14 athletes and 7 control subjects. Our findings suggest that afterload is an important determinant of cardiac performance and wall hypertrophy in athletes. In spite of major changes in heart rate and blood pressure, left ventricular wall stress remains unchanged following submaximal exercise in both trained and untrained hearts. It would appear that changes in heart size during exercise are to a large extent limited in untrained ventricles, as smaller left ventricular dimensions are required to "normalise" wall stress. This results in a lower stroke volume for a given dimensional change. Consequently, cardiac output is a function of heart rate rather than stroke volume in untrained subjects. The effect of increased muscle mass in athletes is to permit larger left ventricular dimensions for a given afterload, so that stroke volume can be augmented. The increased h/R ratio suggests that afterload is more important than preload in the development of left ventricular hypertrophy in rowers and swimmers.
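The h/R ratio matters because of the Laplace relation, under which wall stress scales as pressure times radius divided by twice the wall thickness, so a thicker wall (larger h for a given R) normalises stress at a higher afterload. A sketch with purely illustrative numbers:

```python
# Laplace estimate of ventricular wall stress: sigma = P * R / (2 * h).
# Units and values are illustrative, not measurements from the study.
def wall_stress(pressure: float, radius: float, thickness: float) -> float:
    return pressure * radius / (2.0 * thickness)

baseline = wall_stress(120.0, 2.5, 1.0)  # thinner wall
thicker  = wall_stress(120.0, 2.5, 2.0)  # doubled h/R at the same pressure
print(baseline, thicker)  # 150.0 75.0
```

Doubling h/R at the same pressure halves the estimated wall stress, which is the mechanistic sense in which hypertrophy "normalises" stress.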
Objectives: To assess the type and amount of clinical supervision athletic training students received during clinical education.
Design and Setting: An online survey was conducted with a questionnaire developed specifically for this study.
Subjects: Head athletic trainers from National Collegiate Athletic Association Division I (28), Division II (34), and Division III institutions (30). Thirty-four represented Commission on the Accreditation of Allied Health Education Programs-accredited athletic training education programs, 20 represented athletic training programs in Joint Review Commission on Athletic Training candidacy, and 35 offered the internship route.
Measurements: Descriptive statistics were computed. Three sets of chi-square analyses were completed to assess associations among athletic training students' first-responder qualifications, program and institution characteristics, certified athletic trainer medical coverage of moderate- and increased-risk sports, and clinical supervision. A trend analysis of students' class standing and time spent in different types of clinical supervision was also completed. The alpha level was set at .05.
Results: Most of the athletic training students (83.7%), particularly those in accredited programs, had first-responder qualifications. More than half of the head athletic trainers (59.8%) indicated that athletic training students were authorized to provide medical care coverage without supervision. A minimal amount of the medical care coverage of moderate- and increased-risk sports was unsupervised. No significant association between the size of the education or athletic program and the type and amount of clinical supervision was noted. Freshman athletic training students spent more time under direct clinical supervision and less time in unsupervised experience; the opposite was true for senior students.
Conclusions: Athletic training students are being used beyond the bounds of appropriate clinical supervision and the scope of clinical education. Future research should employ nonparticipant observation of clinical instructors' supervision of students, as well as students' own perceptions of their clinical supervision.
clinical education; clinical experience; field experience; clinical instruction
Context: To assist athletes in maintaining optimal health, athletic trainers must work with athletes of both sexes.
Objective: To examine athletic trainers' comfort levels in providing care for gender-specific and non-gender-specific injuries and issues.
Design: We mailed 235 Gender Comfort in Athletic Training Questionnaires to program directors, who were asked to distribute and collect them.
Setting: We randomly selected 21 athletic training education program directors and invited them by e-mail to participate in the study. Fourteen program directors representing the 10 National Athletic Trainers' Association districts agreed to participate.
Patients or Other Participants: A total of 192 participants returned completed questionnaires, for a response rate of 82% (103 women, 89 men; 101 senior athletic training students, 91 certified athletic trainers).
Main Outcome Measure(s): The questionnaire presented a female athlete scenario and a male athlete scenario, each consisting of 17 injuries and issues common to both sexes; three gender-specific items were added to each scenario. Responses were scored on a 5-point scale anchored by 1 (very uncomfortable) and 5 (very comfortable). Participants were asked to indicate the reason for any degree of discomfort. Internal consistency, determined by the Cronbach alpha, was .92 for the female athlete scenario and .93 for the male athlete scenario.
Results: We found significant differences between female and male certified athletic trainers for both the female and the male athlete scenarios. Overall, women were more comfortable caring for female injuries and issues, whereas men were more comfortable caring for male injuries and issues. Certified athletic trainers reported more comfort overall than athletic training students. The most common reason reported for discomfort in caring for female and male injuries and issues was experience level.
Conclusions: Athletic training education programs should provide early and more deliberate experiences with injuries and issues of a more intimate nature, including those that are gender specific and non-gender specific. These experiences may increase athletic trainers' level of comfort in providing care to athletes of the opposite sex.
comfort level; same-sex health care; opposite-sex health care; health care
Between January and March 1989, I surveyed the athletic directors of the 711 high schools of the Michigan High School Athletic Association to determine the level of medical care available for students who participate in various sports. The results were compared with previous studies done in Michigan and in other states to determine whether there had been any increase in the number of athletic trainers working in a high school setting or any improvement in their educational backgrounds, with certification by the National Athletic Trainers' Association (NATA) used as the measure of educational background. With 57% of the 711 athletic directors responding, 41% reported that they had the services of an athletic trainer for at least one sport during the year. The percentage of schools with athletic trainers varied directly with school size, the more populous schools having the greatest percentage. Seventy percent of the athletic trainers were reported to be certified by the NATA. Compared with the two earlier Michigan studies and with surveys in other states, these findings indicate an increase in the availability of athletic trainers, particularly certified athletic trainers, at the secondary level.
Suicide is the third leading cause of death among US adolescents aged 15–24, with males incurring higher rates of completion than females. This study used hierarchical logistic regression analysis to test whether athletic participation was associated with lower rates of suicidal ideation and behavior among a nationally representative sample of over 16,000 US public and private high school students. Net of the effects of age, race/ethnicity, parental educational attainment, and urbanicity, high school athletic participation was significantly associated with reduced odds of considering suicide among both females and males, and reduced odds of planning a suicide attempt among females only. Though the results point to favorable health outcomes for athletes, athletic participation was also associated with higher rates of injury to male athletes who actually attempted suicide.
adolescence; athletic participation and gender; health; suicide