Successful Injury Prevention Interventions



Fig. 15.1
Traditional four-step model of injury prevention





Essential Elements for Military Injury Prevention Program Success


In 2004, the Military Training Task Force of the Defense Safety Oversight Council chartered a Joint Services Physical Training Injury Prevention Working Group to establish the evidence base, prioritize, and recommend proven injury prevention programs to the Secretary of Defense [3]. Additionally, this working group was tasked with substantiating the need for further research and evaluation on interventions and programs likely to reduce physical training-related injuries—namely, MSK-Is. The results of an expedited systematic review determined four elements essential to injury prevention success: education, surveillance, leadership support, and adequate resources for research and program evaluation.


Education


Successful injury prevention should involve informing and educating those involved in all aspects of military training. Injury prevention programs that involve education are more likely to increase adherence and show success [4]. Education includes disseminating information on proven injury prevention strategies and educating both those who deliver training programs and those responsible for training troops at all levels. Perhaps most importantly, effective education is a crucial component for obtaining military commanders’ support for evidence-based injury prevention interventions that are aligned with their responsibility to protect service members [3].


Surveillance


Without adequate and widespread surveillance, it is difficult to appreciate the magnitude of the MSK-I problem in the military, which presents challenges for deciding where to intervene. Surveillance reporting on specific injuries (e.g., stress fracture) and training events (e.g., pugil sticks) provides the foundation for identifying problem areas and informing improvement strategies. Through the synthesis of information about injury rates and training by unit level, training cycle, fiscal or calendar year, and goals for improvement, targeted interventions can be thoroughly evaluated and recommendations made in an evidence-based manner.

Numerous Department of Defense (DoD)-wide surveillance systems exist, including the Defense Medical Surveillance System (DMSS) administered by the Armed Forces Health Surveillance Center (AFHSC), and the Military Health System Data Repository (MDR). However, more importantly, most training units carefully track their own outcomes, including number of dropped trainees, unit fitness test performance, and in coordination with medical staff, in-house injury rates. Routine surveillance of unit-level injuries and fitness can and should be used as an indicator of physical training program success or failure and is an invaluable tool for garnering leadership support. Local injury surveillance infrastructure and data are also critical for providing timely and actionable data to military leaders at the strategic, operational, and tactical levels.


Leadership Support


Leadership focus at all levels of the organization, from the highest-level military commanders to the squad leader, has the greatest influence on MSK-I rates and on whether an injury prevention intervention will be successful. Simply understanding the current state of specific injuries and their contributing causes, setting goals to improve outcomes, and monitoring success through surveillance can be an effective way to gain leadership support for injury prevention initiatives. High injury rates indicate a need to modify existing training programs, with command-level decisions having broad-reaching impacts. Regular reporting of injury data through the chain of command may encourage greater command responsibility for unit performance, including MSK-I rates. Most importantly, recent work shows that successful short-term prevention program implementation and long-term sustainability are only possible with complete leadership buy-in and support [5]. Key to obtaining this support is aligning the mutual goal of injury prevention with the overall mission of military leaders (i.e., operational readiness). Buy-in and support from leaders and key stakeholders are paramount to garnering commitment for successful implementation and ultimate sustainability of injury prevention initiatives [5].


Research and Program Evaluation


While there have been many successful injury prevention programs reported, there are many more that lack sufficient evidence for implementation. There is a great need for branch or service-level research and program evaluation of multiple types of injury prevention strategies in military populations. More importantly, there is a great need for the willingness and desire to devote resources to implement, disseminate, track, and evaluate new strategies for injury prevention. The following section describes the latest research and program evaluation for successful military injury prevention strategies.


Successful Injury Prevention Strategies


While there have been numerous interventions, three broad categories have shown the most success: (1) preventing overtraining; (2) ankle bracing; and (3) performing multiaxial, neuromuscular, proprioceptive, and agility training.


Risk Factor Identification


Training-related risk factors for MSK-I have been clearly identified [6–8]. Risk factors have traditionally been categorized as intrinsic or extrinsic. Intrinsic risk factors are inherent to the individual, while extrinsic risk factors are environmental factors acting on the individual. Intrinsic risk factors can be further grouped into demographic factors (age, gender, race, tobacco use, history of previous MSK-I), anatomical factors (high arches and genu valgus), and physical fitness factors (low aerobic fitness, endurance, and strength; see Table 15.1). Extrinsic risk factors vary with the training environment. High running mileage, certain training companies, older running shoes, and the summer season have been identified as risk factors for overuse injury in Basic Combat Training (BCT) [9]. Injury prevention strategies for training-related injuries attempt to modify both intrinsic and extrinsic risk factors.


Table 15.1
Risk factors for training-related injury

Demographic factors | Anatomical factors | Physical fitness factors
Age > 24 years (N) | Genu valgus (N) | Low levels of physical activity before training (M)
Caucasian race (N) | Q-angle > 15° (N) | Low aerobic fitness (M)
Female gender (N) | Decreased ankle dorsiflexion (M) | Extremes of flexibility (M)
Previous musculoskeletal injury (N) | Rearfoot hyperpronation (M) | Low muscular strength and endurance (M)
Tobacco use (M) | Extremes of arches (pes cavus/pes planus) (M) | Extremes of BMI and body composition (M)

N non-modifiable risk factor for injury, M modifiable risk factor for injury, BMI body mass index

However, a recent paradigm shift in the conceptualization of risk factors has begun to focus on whether risk factors for MSK-I are “modifiable” or “non-modifiable.” Intrinsic and extrinsic risk factors may be interrelated and influence each other, whereas the modifiable/non-modifiable classification allows straightforward identification of the specific risk factors that are amenable to change (i.e., modifiable) [10, 11]. In order to prevent MSK-I, it is critical to identify and focus on the modifiable risk factors associated with injury, as these factors are likely amenable to intervention. As a result, classifying risk factors by whether they are modifiable or non-modifiable is much more relevant from a clinical and injury prevention perspective. Non-modifiable risk factors are important to help identify which populations are at greatest risk for injury so that prevention resources can be justifiably directed to these populations. However, as many intrinsic and extrinsic risk factors are in essence modifiable, they hold the most promise as targets for truly successful injury prevention interventions. Table 15.1 describes identified risk factors for training-related MSK-I across various categories and whether they are considered modifiable (M) or non-modifiable (N).


Preventing Overtraining



Training Modification—Decreasing Running Mileage


Overtraining has been established as a primary cause of MSK-I in military training populations. Overtraining is defined as “the physiology of musculoskeletal overuse due to exercise or physical training” [3]. In addition to overuse injuries, overtraining can lead to a decrement in performance, fatigue, and immune dysfunction. It is estimated that up to 80 % of the lower extremity injuries suffered in basic training are of the overuse type and are likely attributable to recruits’ low baseline levels of fitness [3]. While physical training is an essential part of military training and practice, the overwhelming increase in unfit individuals entering military service and the associated alarming increase in stress fracture incidence in basic training (see Chap. 5 on initial entry training (IET) injuries) has led to the establishment of several graduated and interval training interventions designed to increase baseline levels of fitness while preventing overtraining and MSK-I [12–15].

Evidence from survey and epidemiologic research has demonstrated that high running volume is strongly associated with overtraining and lower extremity injury. Several studies have established that altering running volume can prevent MSK-I in training troops without negative effects on fitness. A study by Shaffer et al. [16] on Marine Corps trainees found that decreasing running mileage in basic training by 40 % decreased stress fracture incidence by more than 50 %, and, most importantly, did so without affecting physical fitness test performance [17] (see Table 15.2). The dramatic reduction of stress fracture rates from simply reducing running mileage was estimated to have saved $4.5 million in direct medical care costs and nearly 15,000 training days per year [16].


Table 15.2
Stress fracture incidence by mileage and run time

Marines (n) | Total run distance (km) | Stress fracture incidence (n/100) | Final 3-mile run times (min)
1136 | 89 | 3.7 | 20.3
1117 | 66 | 2.7 | 20.7
1097 | 53 | 1.7 | 20.9
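The dose–response pattern in Table 15.2 can be sanity-checked with a few lines of arithmetic. This is an illustrative sketch using only the published figures, not code from the original study:

```python
# Cohort rows from Table 15.2: (marines, total run distance in km, stress fractures per 100)
cohorts = [
    (1136, 89, 3.7),  # highest-mileage cohort
    (1117, 66, 2.7),
    (1097, 53, 1.7),  # lowest-mileage cohort
]

high, low = cohorts[0], cohorts[-1]
mileage_cut = 1 - low[1] / high[1]    # fraction of running mileage removed
incidence_cut = 1 - low[2] / high[2]  # fraction of stress fractures avoided

print(f"mileage reduced {mileage_cut:.0%}, incidence reduced {incidence_cut:.0%}")
```

This reproduces the chapter’s claim: a roughly 40 % mileage cut corresponds to a greater than 50 % reduction in stress fracture incidence, while final 3-mile run times slowed by only about 0.6 min.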

Similarly, a study in Army infantry recruits found that decreasing running mileage during basic training resulted in fewer lower extremity injuries. Jones et al. (1993) compared two different training strategies in separate Army infantry companies during 12 weeks of recruit training [18]. Both companies trained 5–6 days per week, spending similar time in calisthenics, stretching, drill, and ceremony; they also completed approximately 40 min of marching and running per day. However, the low running group spent only 8 of the 40 min running, compared to 18 min in the high running group. At the end of the 12-week training period, the low running group had run a total of 56 miles and marched 121 miles, compared to 130 miles run and 68 miles marched in the high running company. Overall, the incidence of sustaining a lower extremity injury was greater in the high running group (Risk Ratio, RR = 1.3; 95 % Confidence Interval, CI = 1.0–1.7). However, when adjusted for other factors, such as age, cigarette use, prior history of injury, job activity, physical activity, and flexibility, the effect was diminished (Odds Ratio, OR = 1.6; 95 % CI = 0.9–2.7). Nevertheless, it is remarkable that a 57 % reduction in running mileage was associated with a reduction in lower extremity injury incidence (41.8 % in the high mileage group versus 32.5 % in the low mileage group) without affecting overall fitness scores [1].
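As an illustrative check (using the summary values above, not the study’s original analysis), the cited mileage reduction and unadjusted risk ratio follow directly from the totals and cumulative incidences:

```python
# Jones et al. (1993): 12-week infantry training, high- vs. low-running companies
high_miles, low_miles = 130, 56   # total miles run per company
high_inc, low_inc = 0.418, 0.325  # cumulative lower extremity injury incidence

run_cut = 1 - low_miles / high_miles  # ~0.57: the cited 57 % mileage reduction
rr = high_inc / low_inc               # ~1.29, reported rounded as RR = 1.3

print(f"running cut {run_cut:.0%}, unadjusted risk ratio {rr:.2f}")
```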

Another prospective cohort study, in male US Navy recruits, examined the impact of self-selected training load, leaving the amount of training mileage to the discretion of the division. Out of 25 training divisions, recruits in the divisions that ran the most mileage had a significantly higher injury rate (22.4 versus 17.2 %; P < 0.02) after 8 weeks of training, without any difference in overall run times in the 1.5-mile final run [19]. Results from both of these studies lend support to standardizing training mileage, volume, and intensity as a way to effectively reduce MSK-I in military training populations.

Similar studies in other countries have also shown positive results from training modifications. Several Australian military studies have demonstrated lower injury rates with reduced running mileage. Rudzki et al. (1997) [20] examined 350 male recruits who were cluster randomized to a weighted march activity versus routine standard training. The weighted group initially carried a load of 16.2 kg, with the weight progressively increased by 2.5 kg starting at week 5. The routine standard training group showed an increased risk of lower limb injury (RR = 1.65, 95 % CI = 1.21–2.25) and knee injuries (RR = 2.14, 95 % CI = 1.21–3.79) compared to the weighted march group over the 12 weeks of training. Another Australian military study prospectively followed 1634 male and 318 female recruits after changes were made to the Australian Army recruit training program. Interval runs (400–800 m) replaced road runs, test runs were reduced from 5 km to 2.4 km, route marches were standardized, and deep-water running was introduced. Following implementation, injury rates decreased by 46.6 % (χ2 = 14.31, P < 0.001) [21]. Finally, an intervention study of pelvic stress fractures in female Australian Army recruits found that a multi-intervention program focusing on reduced running mileage and march speed decreased pelvic stress fractures by 91 % (from 11.2 to 0.6 %) relative to the year prior to the intervention [22].


Training Modification—Physical Readiness Training Implementation


Additional training modification studies have also demonstrated positive cardiovascular effects and injury reduction. In US Army BCT at Fort Benning, a modified Physical Readiness Training (PRT) program was compared with the traditional physical training program, examining injury rates and Army Physical Fitness Test (APFT) scores [23]. The new PRT intentionally decreased overall formation running mileage and included a gradual increase in distance running. The PRT program standardized basic training warm-ups and physical training and also incorporated new evidence-based calisthenics, dumbbell drills, movement drills, interval training, and flexibility training with a progressive increase in repetitions and intensity. At the end of the 9-week BCT, the PRT group had a higher pass rate on first-time administration of the final APFT and had fewer APFT failures. Also, the PRT group, despite running 54 % fewer formation miles (17.1 miles compared to 37.2 miles), demonstrated a 52 % decrease in the overuse injury rate in males and a 46 % decrease in females, without any deleterious effects on APFT run times. Even when controlling for other risk factors, there was a significant decrease in time-loss overuse injuries in the PRT group for both males and females: males in the traditional training group had a 52 % increased risk (RR = 1.52, 95 % CI = 1.12–2.07), and women in the traditional group a 46 % increased risk (RR = 1.46, 95 % CI = 1.19–1.80), of sustaining a time-loss overuse injury compared to the new PRT group [23].

The new PRT method was also found to be successful at reducing injury in Advanced Individual Training (AIT), or secondary training, in the Army. Similar to BCT, the traditional training group had a higher risk of a time-loss injury (RR = 1.5, 95 % CI = 1.2–1.8) without any difference in APFT scores compared to the new PRT group [24]. These studies and others provided strong evidence for the recommendation that the new PRT be adopted Army-wide in 2004. Since then, there has been a 21 % decrease in the injury rate during BCT/AIT compared to the period prior to the change in training.


Ankle Bracing—Athletic and Training Activities


Ankle sprains in the military occur at a rate of almost 35 sprains per 1000 person-years at risk, five times the rate reported in civilian populations [25]. Therefore, prevention of ankle sprains has become a top priority. Ankle bracing has been shown to effectively prevent ankle injuries in several well-designed studies, especially in those who have had previous ankle sprains. A randomized controlled trial by Sitler et al. [26] on 1601 cadets at the United States Military Academy at West Point found that ankle brace use significantly reduced ankle injury during required intramural basketball. Cadets were randomized to a semirigid ankle stabilizer or a control condition, stratified by previous ankle injury status, and followed for 2 years. After 2 years, there were a total of 46 ankle injuries: 11 in the ankle stabilizer group and 35 in the control group (χ2 = 12.29; P < 0.01). The ankle stabilizer group had a significantly lower rate of ankle sprains than the control group (1.6 versus 5.2 per 1000 athlete-exposures). Although athletes often raise concerns about performance decrements and comfort with ankle bracing, a recent study in military cadets examining obstacle course times and dynamic lower extremity reach found no effect of bracing on performance compared to non-bracing [27].
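The magnitude of the bracing effect in the Sitler et al. trial can be expressed as a relative rate reduction. This is an illustrative calculation from the reported rates, not the trial’s own statistics:

```python
# Sitler et al.: required intramural basketball at West Point, 2-year follow-up
brace_rate, control_rate = 1.6, 5.2       # ankle sprains per 1000 athlete-exposures
brace_injuries, control_injuries = 11, 35  # injury counts by group

rate_ratio = brace_rate / control_rate  # ~0.31
relative_reduction = 1 - rate_ratio     # ~69 % fewer sprains with bracing

print(f"relative reduction with bracing: {relative_reduction:.0%}")
```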


Ankle Bracing—Parachuting


Ankle bracing has also been shown to prevent ankle injuries related to parachuting. A study by Shumaker et al. [28] found that during airborne jump operations, those wearing an outside-the-boot brace had 0.6 inversion ankle injuries per 1000 jumps compared to 3.8 injuries per 1000 jumps for those who did not wear the brace, roughly a sixfold difference favoring Army Rangers who wore braces [28]. According to a recent systematic review of the parachute ankle brace (PAB), not wearing a PAB approximately doubled the overall incidence of ankle injury, ankle sprain, or ankle fracture. In addition, the calculated cost-effectiveness of the PAB showed that for every $1 spent on the brace, $7–9 in combined limited-duty and medical costs were returned, a significant savings [29]. Overall, there appears to be a significant benefit to prophylactic bracing in preventing ankle injuries during airborne training and operations, particularly in participants with a history of previous ankle injury.
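Using the figures cited above, the protective effect of the PAB can be sketched in a few lines. This is illustrative arithmetic only; the $7–9 return per dollar is the systematic review’s estimate [29], restated here rather than derived:

```python
# Shumaker et al.: inversion ankle injuries per 1000 airborne jumps
unbraced, braced = 3.8, 0.6
rate_ratio = unbraced / braced  # ~6.3x higher injury rate without the brace

# Cost-effectiveness estimate from the systematic review [29]:
# each $1 spent on a PAB returns $7-9 in avoided limited-duty and medical costs
return_per_dollar = (7, 9)

print(f"unbraced-to-braced injury rate ratio: {rate_ratio:.1f}")
```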


Multiaxial, Neuromuscular, Proprioceptive, and Agility Training for Injury Prevention in Troops


Specific programs that incorporate neuromuscular control, proprioceptive, and agility training have been shown to decrease the incidence of anterior knee pain, stress fractures, and other lower extremity MSK-I during military training [8, 12, 30]. Anterior knee pain, or patellofemoral pain syndrome (PFPS), is a common overuse injury in the military. In United States Naval Academy midshipmen, the prevalence of PFPS upon entry to the academy is as high as 15 % in females and 12 % in males. Female midshipmen develop PFPS at a rate of 33/1000 person-years (95 % CI = 20–45/1000 person-years), while males have a rate of 15/1000 person-years (95 % CI = 7–22/1000 person-years) [31]. In British Army recruits, an intervention of stretching and strengthening was found to reduce PFPS during entry-level training [30]. This randomized controlled trial compared the stretching and strengthening intervention (n = 759) to standard warm-ups (n = 743) during the 14-week basic training cycle. The eight intervention exercises were performed as part of regular physical training and included isometric hip abduction against a wall in standing, forward lunges, single-legged step downs, single-legged squats, quadriceps stretching, iliotibial band stretching, hamstring stretches, and calf stretches. Most importantly, an emphasis was placed on form. In total, 46 cases of diagnosed anterior knee pain were reported: 36 (4.8 %) in the control group and 10 (1.3 %) in the intervention group (P < 0.01). Surprisingly, no gender differences were noted. Perhaps the most significant finding from this study was that despite a PFPS diagnosis, 90 % of cases from the intervention group completed training, while only 44 % of the PFPS cases in the control group did so [30].
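The headline percentages from the British Army RCT follow directly from the case counts and group sizes reported above; as an illustrative check (not the trial’s own analysis):

```python
# British Army PFPS prevention RCT: anterior knee pain cases over 14-week training
control_cases, control_n = 36, 743  # standard warm-up group
interv_cases, interv_n = 10, 759    # stretching-and-strengthening group

control_inc = control_cases / control_n  # ~4.8 %
interv_inc = interv_cases / interv_n     # ~1.3 %
rr = interv_inc / control_inc            # ~0.27, i.e. ~73 % relative risk reduction

print(f"control {control_inc:.1%}, intervention {interv_inc:.1%}, RR {rr:.2f}")
```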

Jul 3, 2016 | Posted in MUSCULOSKELETAL MEDICINE
