Summary: Game demands and training practices in team sports such as Australian football (AF) have changed considerably over recent decades, including an increased requirement for coaching staff to effectively control, manipulate, and monitor training and competition loads. The purpose of this investigation was to assess differences in external and internal physical load measures between games and training in elite junior AF. Twenty-five male adolescent players (mean ± SD: age 17.6 ± 0.5 y) recruited from three elite under-18 AF clubs participated. Global positioning system (GPS), heart rate (HR), and rating of perceived exertion (RPE) data were obtained from 32 game files across four games and 84 training files across 19 training sessions. Matched-pairs statistics, together with Cohen’s d effect sizes and percent differences, were used to compare game and training events. Players were exposed to a higher physical load in the game environment, for both external (GPS) and internal (HR, Session-RPE) load parameters, than in in-season training. Session time (d = 1.23; percent difference = 31.4% (95% confidence interval = 17.4 – 45.4)), total distance (3.5; 63.5% (17.4 – 45.4)), distance per minute (1.93; 33.0% (25.8 – 40.1)), high-speed distance (2.24; 77.3% (60.3 – 94.2)), number of sprints (0.94; 43.6% (18.9 – 68.6)), mean HR (1.83; 14.3% (10.5 – 18.1)), minutes spent above 80% of predicted HRmax (2.65; 103.7% (89.9 – 117.6)), and Session-RPE (1.22; 48.1% (22.1 – 74.1)) were all higher in competition than in training. While training should not be expected to fully replicate competition, the observed differences suggest that monitoring physical load in both environments is warranted to allow comparisons and to evaluate whether training objectives are being met.
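For illustration only (not part of the original abstract), the paired game-versus-training comparison described above can be sketched as below. This is a minimal sketch, assuming Cohen's d for paired data is computed as the mean difference divided by the SD of the differences, that the percent difference is the mean difference expressed relative to the training mean (with its CI derived from the paired differences), and using hypothetical data; the `paired_comparison` helper and the NumPy/SciPy usage are assumptions, not the authors' analysis code.

```python
import numpy as np
from scipy import stats

def paired_comparison(game: np.ndarray, training: np.ndarray, alpha: float = 0.05):
    """Compare matched game vs. training values for one load measure.

    Returns the matched-pairs t-test p-value, Cohen's d for paired data,
    and the percent difference with a 95% confidence interval.
    """
    diff = game - training                      # paired differences per player
    t_stat, p = stats.ttest_rel(game, training) # matched-pairs t-test

    # Cohen's d for paired data: mean difference / SD of the differences
    d = diff.mean() / diff.std(ddof=1)

    # Percent difference relative to the training mean, with an approximate
    # 95% CI built from the standard error of the mean paired difference
    pct = 100 * diff.mean() / training.mean()
    se_pct = 100 * stats.sem(diff) / training.mean()
    t_crit = stats.t.ppf(1 - alpha / 2, df=len(diff) - 1)
    ci = (pct - t_crit * se_pct, pct + t_crit * se_pct)

    return p, d, pct, ci

# Hypothetical example: total distance (m) for 8 players, game vs. training session
game = np.array([12500, 13100, 11800, 12900, 13400, 12200, 12700, 13000])
train = np.array([7600, 8100, 7400, 7900, 8300, 7500, 7800, 8000])
p, d, pct, ci = paired_comparison(game, train)
print(f"p = {p:.3f}, d = {d:.2f}, diff = {pct:.1f}% (95% CI {ci[0]:.1f} to {ci[1]:.1f})")
```

The CI here treats the training mean as a fixed denominator, which is a simplification; the paper's exact effect-size and interval calculations may differ.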