Neuromuscular Research Laboratory/Warrior Human Performance Research Center, Department of Sports Medicine and Nutrition, University of Pittsburgh, United States of America
Physiological and psychological stressors can degrade soldiers' readiness and performance during military training and operational environments. Integrative and holistic assessments of biomarkers across diverse human performance optimization domains during multistressor training can be leveraged to provide actionable insight to military leadership regarding service member health and readiness.
Design/Method
A broad categorization of biomarkers, to include biochemical measures, bone and body composition, psychometric assessments, movement screening, and physiological load can be incorporated into robust analytical pipelines for understanding the complex factors that impact military human performance.
Results
In this perspective commentary we overview the rationale, selection, and methodologies for monitoring biomarker domains that are relevant to military research and specifically highlight methods that have been incorporated in a research program funded by the Office of Naval Research, Code 34 Biological and Physiological Monitoring and Modeling of Warfighter Performance.
Conclusions
The integration of screening and continuous monitoring methodologies via robust analytical approaches will provide novel insight for military leaders regarding health, performance, and readiness outcomes during multistressor military training.
•
A broader definition of biomarkers that is inclusive of multiple biological domains can be used to assess military readiness, injury risk, and performance outcomes.
•
Screening and continuous monitoring methodologies can be integrated during multistressor military training environments to understand the impacts on health, performance, and readiness.
•
Robust analytical approaches are necessary to convey the intricacies of large datasets that span multiple domains in a manner that allows for actionable decision-making.
1. Background
The occupational demands experienced during multistressor training and operational environments for military service are physiologically and psychologically significant. The subsequent strain on soldier physiology can degrade health and performance and increase the likelihood of fatigue and injury, which can negatively impact readiness and deployability.
Strategies for monitoring physiological biomarkers that include an integrative and holistic approach of assessing various human performance optimization domains that are feasible, acceptable, and suitable are needed for the military.
Our operational definition of biomarker is broad and inclusive as a variety of domain modalities may allow for a better understanding of human performance (Fig. 1). In this perspective commentary, we overview the rationale, biomarker domains, and assessment methodologies/technologies relevant to military research, with an additional focus on those incorporated in a research program funded by the Office of Naval Research, Code 34 Biological and Physiological Monitoring and Modeling of Warfighter Performance: “Developing a Warfighter Mobility Signature and Predictive Algorithm for Musculoskeletal Injury Risk During Marine Corps Candidate School”, “Development of a Physical Readiness Decision Tool to Leverage Wearable Technologies for Monitoring Warfighters Mobility and Load Exposure”, and “Physiological Monitoring and Assessment of Marine Physical Readiness During Arduous Military Training and Operations”.
Fig. 1. Biomarker domains from screening and continuous monitoring during multistressor military training environments relating to service member health, performance, and readiness.
2. Biochemical biomarkers
Biomarkers analyzed from blood, saliva, and/or urine can be used to understand components of the biological landscape underlying physiological adaptation, injury risk, and resiliency during military training. Several studies to date have characterized changes in the hormonal and biochemical profile of individuals exposed to multistressor military environments.
Prior work has provided key observations on the effects of extended energy deficits on soldier performance, body composition, and hormonal biomarker responses. Data leveraged from such studies can provide important mechanistic insight regarding the impact of military stressors on physiology, be utilized as part of a screening or risk assessment profile, and/or identify potential targets for intervention strategies to optimize health and performance during training and operational environments.
Studies conducted in diverse laboratory and field settings have investigated the neuroendocrine response following exposure to multistressor environments to include short- and long-term timeframes and during initial and advanced training programs. A complete description of the biochemical biomarker response has been summarized elsewhere.
General findings that have been consistently reported indicate an altered hormonal milieu characterized by increased cortisol concentrations and decreased concentrations of IGF-1 and testosterone.
This catabolic shift may compromise functional capabilities and musculoskeletal health by impairing lean mass accrual, bone adaptation, and musculoskeletal resiliency.
Musculoskeletal injuries (MSKIs) subsequently present a major burden to the military due to lost or limited duty days, attrition from training, and high financial cost.
Bone stress injuries (BSIs) are one such MSKI commonly affecting military personnel, and studies have investigated the adaptive bone response to military training to elucidate factors associated with BSI etiology.
Indeed, bone-specific biomarkers, such as those that reflect processes associated with bone remodeling, i.e., bone formation (P1NP, BAP) and resorption (CTx, TRAP5b), have been investigated during initial and advanced training programs. Initial training environments that rapidly increase loading to relatively naïve bone subsequently increase bone formation.
Rather than absolute concentrations, however, it may be the relative balance between formation and resorption that translates to structural changes in bone. Alternatively, advanced training that includes more extreme stressors can both reduce bone formation and increase resorption, and therefore may inhibit expected benefits of loading on bone and compromise skeletal integrity.
Future work is needed to elucidate the independent and combined effects of military stressors on bone turnover and relate changes in bone turnover markers to structural adaptation. Moreover, bone turnover markers have been tested for a potential association with BSI risk, although the evidence does not currently support a predictive link between bone turnover markers and stress fracture in military recruits.
The association between bone turnover markers and MSKI risk may, however, still be of interest, as previous work relied on a selective set of measures, whereas novel exerkines, myokines, and osteokines released during training and exercise may hold promise for a better understanding of the mechanisms underlying tissue adaptations.
Understanding the physiological consequences of stressors can also lead to the development of potential countermeasures to maintain readiness and mitigate negative outcomes during arduous military training. For example, energy restriction is often purposefully induced as part of training objectives or due to logistical constraints, but can negatively affect health and performance.
As such, nutritional interventions to provide supplemental calories to limit or eliminate the energy deficit, while maintaining all other stressors, have been tested.
Prevention of an energy deficit in men during an 8-week combat course attenuated losses in fat and lean mass and improved bone formation compared to a control group that had reduced bone turnover alongside decrements in metabolic and reproductive hormones.
Similarly, the provision of additional calories during US Army Ranger training was sufficient to attenuate declines in body weight and fat free mass and delayed the onset of elevated cortisol concentrations.
Furthermore, a short-term period of refeeding within the training course, in the presence of all other stressors, temporarily restored concentrations of IGF-1 and testosterone.
Reductions in the energy deficit may allow for training objectives to be met while alleviating physiological costs.
Alternatively, oftentimes stressors are unavoidable or cannot be altered; therefore, other efforts have examined the efficacy of pharmacological hormone supplementation for its ability to mitigate training consequences. Specifically, the effectiveness of exogenous testosterone replacement during an energy deficit on physical and cognitive performance has been tested as a strategy for countering declines in testosterone, lean body mass, and physical function.
Although the results of this particular investigation did not support a recommendation for testosterone supplementation, this line of inquiry has provided rationale for follow-up investigations.
Despite the depth of information garnered from biochemical factors, the collection process can be difficult to implement due to associated costs, expertise, and invasiveness, particularly for blood samples. Therefore, acquiring less burdensome samples, such as saliva, has become a widely accepted method that is utilized in laboratory and applied settings.
Salivary collection presents a non-invasive alternative, while still being valid and reliable for select analytes, thereby facilitating more frequent assessments over consistent periods of time to better characterize the physiological effects of training. In military populations, saliva has been collected serially throughout the course of intensive basic training as well as during an in-field training exercise to track physiological and psychological stress.
Notably, most work to date has been conducted in men, but there is now a pressing need to investigate sex-specific responses to the various training programs, particularly those that were previously barred to women. Suppression of the hypothalamic–pituitary–gonadal axis, indicated by subtle and clinically evident menstrual disturbances, has been documented in female service members during military training.
Alterations in reproductive hormone exposure will likely impact readiness. As such, additional work is necessary to understand how multistressor military training may impact the neuroendocrine profile of women and to elucidate the effects of reproductive hormone exposure on military relevant health and performance outcomes. While the most clinically severe menstrual disturbances, amenorrhea and oligomenorrhea, are dependent on the frequency of menses and can be detected via self-report menstrual tracking, subclinical conditions, such as luteal phase defects and anovulation, are only evident with thorough biochemical assessments (e.g., quantification of estrogen and progesterone metabolites from serial urine samples during a menstrual cycle
). Assessing the frequency of menses, presence of ovulation, and measuring reproductive hormone concentrations in women will provide important insight into sex-differences in physiological adaptations to military training and improve our understanding of risk factors for musculoskeletal injuries.
3. Body composition and bone imaging
Imaging technologies can be a valuable resource for tracking how body size and composition respond to military training and their relation to injury risk. Service members regularly engage in physically demanding training and operational environments that can alter body dimensions and composition depending on the program and objectives. Notably, changes in body composition may not be evident without dedicated assessment methodologies, such as skinfolds, bioelectrical impedance analysis (BIA), or dual-energy x-ray absorptiometry (DXA). For example, during 10 weeks of Army basic combat training, total body mass did not change in women, but there was a significant reduction in fat mass and an increase in lean mass, suggesting a beneficial adaptation to training that would not otherwise be evident if relying solely on body mass or BMI.
Additionally, work is ongoing to examine how multiple methods of assessment, to include DXA, 3D body scanners, and BIA, can be implemented to evaluate and potentially update body composition standards to balance health, performance, fitness, and military appearance.
Three-dimensional body scanners are currently used by the apparel industry to measure body dimensions for clothing and uniform fitting, and have also been proposed for estimating body composition. Stationary imaging models (e.g., Sizestream) allow for automated anthropometric evaluation by utilizing visible and infrared light to produce an avatar of the human body. By providing >200 measurements of an individual's body size in <1 min, with precise circumference and volume estimates, such technology may have utility for collecting common anthropometric variables used for assessing military body composition standards. Estimates of body composition, such as body fat prediction formulas, have been developed but require additional validation for longitudinal tracking.
Compartment-based methods of assessing body composition, such as skinfolds, BIA, and DXA, are often implemented in military research settings. Skinfold measurements are a two-compartment method that relates subcutaneous fat deposits to total body density, which is then converted to percent body fat and fat-free mass via specific prediction equations.
Skinfold assessments are common yet highly dependent on proper training to ensure that the appropriate sites are measured consistently according to recommended techniques.
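As a concrete illustration of the two-compartment workflow described above, the sketch below converts a skinfold sum to body density and then to percent body fat using one widely cited equation pairing (a Jackson and Pollock three-site equation for men with the Siri conversion). The input values are hypothetical, and the coefficients should be verified against the original publications before use.

```python
# Illustrative two-compartment conversion: three-site skinfolds (chest,
# abdomen, thigh; men) -> body density -> percent body fat.
# Coefficients are population-specific; different equations apply for
# women and for other site combinations.

def body_density_jp3_men(sum_skinfolds_mm: float, age_yr: float) -> float:
    """Jackson & Pollock three-site body density estimate for men (g/cm^3)."""
    s = sum_skinfolds_mm
    return 1.10938 - 0.0008267 * s + 0.0000016 * s**2 - 0.0002574 * age_yr

def percent_fat_siri(density: float) -> float:
    """Siri two-compartment conversion from body density to % body fat."""
    return (4.95 / density - 4.50) * 100.0

density = body_density_jp3_men(sum_skinfolds_mm=45.0, age_yr=22.0)
print(f"Body density: {density:.4f} g/cm^3")
print(f"Percent body fat: {percent_fat_siri(density):.1f} %")
```

The prediction-equation step is where training populations matter most: equations derived in civilian samples may not transfer cleanly to service members.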
Bioelectrical impedance, also a two-compartment model, estimates total body water via an electrical current, which is then converted to fat-free mass based on prediction equations.
Bioelectrical impedance results are, therefore, highly dependent on hydration status and can vary based on the type of machine and the body parts analyzed.
DXA, a three-compartment method, is the current gold standard for body composition assessment owing to its speed, accuracy, and reproducibility. However, its implementation does require low-dose radiation exposure, specialized training, and access to the machinery. DXA relies on the attenuation of a low-dose x-ray beam as it passes through different body tissues and provides estimates of bone mineral content and density, fat-free soft tissue mass, and fat tissue mass for both the whole body and specific body regions.
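The dual-energy principle that DXA relies on can be illustrated as a small linear system: attenuation measured at two photon energies yields two equations in the two unknown areal densities (bone mineral and soft tissue). The coefficients and measurements below are illustrative placeholders, not calibrated values.

```python
# Sketch of the dual-energy attenuation principle behind DXA.
# ln(I0/I) = mu_bone * A_bone + mu_soft * A_soft at each photon energy,
# giving a 2x2 linear system for the two areal densities.
import numpy as np

mu = np.array([[0.60, 0.25],    # low energy:  [mu_bone, mu_soft] (illustrative)
               [0.30, 0.18]])   # high energy: [mu_bone, mu_soft]
log_attenuation = np.array([1.10, 0.62])   # measured ln(I0/I), illustrative

areal_density = np.linalg.solve(mu, log_attenuation)   # g/cm^2
print(f"Bone mineral areal density: {areal_density[0]:.3f} g/cm^2")
print(f"Soft tissue areal density:  {areal_density[1]:.3f} g/cm^2")
```

Because the bone coefficient differs more between energies than the soft-tissue coefficient, the two measurements can be separated; this is why a single-energy scan cannot distinguish the compartments.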
DXA has the added benefit of quantifying bone mineral density (BMD), in addition to fat and lean mass, which can provide greater insight into musculoskeletal health and adaptation.
Although DXA can evaluate BMD, it does so only in two dimensions and cannot accurately estimate bone biomechanical characteristics.
Incorporation of three-dimensional bone imaging, such as via peripheral quantitative computed tomography (pQCT) and high-resolution pQCT (HRpQCT), can allow for measurements of true volumetric BMD, differentiate cortical and trabecular compartments, and accurately measure the components underlying bone strength.
As previously discussed, military training can impact bone health and recent evidence highlights the impact of mechanical loading during physical training on adaptive bone formation to alter bone structure and improve bone strength.
4. Psychometric assessments
Psychological resilience, the ability to resist and adapt to significant levels of stress and adversity, is a potential protective factor during military training.
Although resilience is often considered a “trait,” or long-term characteristic that is unlikely to change significantly over time, resilience may actually present like a “state” characteristic that changes based on circumstances and may manifest itself differently across different life domains, times, and environments.
Previous research examining the relationships among psychological resilience, stress, and attrition from advanced and basic military training reports that higher resilience scores were related to successful completion of training and lower levels of perceived stress.
Therefore, those entering military training with higher resilience may adapt more favorably to rigorous military training than those with lower resilience.
Additionally, if resiliency can be effectively screened at the start of training, individuals that may benefit from mental resilience training can be identified to allow for intervention.
Measurements of psychological resilience and perceived stress are often obtained using self-report questionnaires. The Connor–Davidson Resilience Scale (CD-RISC) is a valid and reliable measure of psychological resilience consisting of 25 self-report items designed to quantify resilience on a scale from 0 to 100.
The Perceived Stress Scale (PSS) is a widely used and accepted measure of perceptions of stress based on Lazarus's transactional stress model that takes into consideration the appraisal of a stressor along with one's perception of their ability to cope with the stressor.
Future research on psychological resilience would benefit from an in-depth analysis of the construct to determine if it is better suited as a “trait” or “state” and to investigate whether psychological stress and resiliency can be utilized as part of screening or mitigation techniques, such as identifying those that may be better suited for more grueling jobs or who may benefit from mental resilience training.
Another aspect important to warfighter success in combat and training operations is cognitive performance. Like resilience, cognitive performance is a multidimensional construct that consists of multiple modalities, such as memory, reaction time, and attentional vigilance.
Cognition is an integral part of many military tasks where soldiers must maintain vigilance in their attention, process incoming communication effectively, and use executive processing to respond to threats in their environment.
Decreased cognitive performance has been linked to high rates of military accidents; therefore, the ability to maintain adequate levels of cognitive performance while experiencing physical and psychological stress is directly related to a warfighter's operational readiness.
To assess cognitive performance, several tests are available, and researchers must first decide what modality of cognition they aim to measure. For example, the Cognition test battery was created to measure cognitive performance in astronauts by using a combination of well-established neurocognitive tests.
The Cognition test battery consists of 10 tests to measure cognitive domains such as reaction time, spatial memory, working memory, emotion identification, and vigilant attention.
Future research would benefit from validation of the Cognition test battery in a more diverse representation of military populations.
5. Movement screening
Assessments of functional movement patterns may have utility within a physically active population for identifying those predisposed to MSKIs and as an intervention target to reduce dysfunction. Several methods are currently available for assessing and quantifying movement, including laboratory-grade and field-expedient techniques. To be implemented appropriately, practitioners must understand the limitations and strengths of available tools, and how they may relate to outcomes of interest (i.e., MSKIs). The Functional Movement Screen (FMS), a subjective test designed to identify movement pattern asymmetries and/or deficiencies by scoring individuals on a scale of 0–3 for their ability to complete specific movements (e.g., overhead squat, hurdle step, lunge), has been implemented as a common method to screen for MSKIs; however, the FMS has shown poor internal consistency in a military population.
Furthermore, the FMS testing battery is restricted in the number and types of movements tested, whereas complex dynamic movements, such as the countermovement jump or drop jump, may have greater applicability to military populations who must navigate obstacles to complete occupational tasks. Jump landing tasks have therefore been suggested as relevant tests for measuring neuromuscular readiness in military populations.
The Landing Error Scoring System (LESS) is a valid and reliable clinical assessment tool that utilizes standard cameras and an itemized list to identify movement errors during a drop landing task.
However, the LESS requires retrospective review of videos to evaluate the itemized performance list, which further adds to the time burden, and subjective bias could be introduced if the evaluator has not been fully trained.
Objective measurements of dynamic movements are also available and provide high fidelity data, but are often time consuming and require subject matter technical expertise. Marker-based motion capture (e.g., VICON), which utilizes reflective markers for optical tracking of human motion, is the gold standard for capturing kinetic and kinematic variables and can be paired with inlaid multi-axial force plates to capture ground reaction forces during distinct movements (e.g., jumps, running, load carriage). Strengths of marker-based motion capture include the reliability and validity of the capture technique for quantitatively analyzing movement strategies, and its ability to accurately predict MSKIs from single-leg landing mechanics in US special forces.
Limitations include the time burden and expertise required for individually placing markers on subjects, and the back-end post-processing required to evaluate joint kinematics and kinetics via inverse kinematics and dynamics using third-party software or self-developed MATLAB scripts for singular output variables (e.g., joint moment, joint flexion).
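To make the inverse-dynamics step concrete, a heavily simplified two-dimensional sketch is shown below: a net joint moment estimated from a lever arm and a force-plate ground reaction force. Full pipelines also include segment mass, inertia, and angular acceleration terms; all values here are hypothetical.

```python
# Minimal 2-D illustration of one inverse-dynamics output: the net joint
# moment as the planar cross product of the lever arm (joint centre to
# centre of pressure) and the ground reaction force. Static, single-joint
# case only; all positions and forces are hypothetical.
import numpy as np

joint_center = np.array([0.10, 0.08])        # ankle position, m
center_of_pressure = np.array([0.16, 0.0])   # from the force plate, m
grf = np.array([-50.0, 1200.0])              # ground reaction force, N

r = center_of_pressure - joint_center        # lever arm, m
net_moment = r[0] * grf[1] - r[1] * grf[0]   # z-component of r x F, N*m
print(f"Net ankle joint moment: {net_moment:.1f} N*m")
```

Real marker-based pipelines repeat this computation at every frame and joint, which is precisely the post-processing burden noted above.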
Markerless motion capture, which mitigates the need for markers by relying on color density models and artificial intelligence to automatically detect a human within the system,
may be a field-expedient, non-laboratory-based alternative approach. Additionally, commercial grade markerless motion capture systems (e.g., DARI) have introduced rapid post-processing techniques that immediately push captured data to internally developed software for the provision of real-time results. Markerless motion capture can also be paired with force plates, as demonstrated by a recent investigation that incorporated a single axial force plate inlaid within the DARI system to relate countermovement jump outputs to MSKIs in military trainees.
Thus, a lower-fidelity system may still be conducive to investigating MSKI risk compared to current gold standards, particularly in situations that require testing a high throughput of subjects in a dynamic military setting.
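Analyses of this kind sometimes group jump trials into movement strategies with unsupervised clustering. The sketch below runs a minimal k-means on hypothetical countermovement jump features; the feature names, values, and two-cluster choice are illustrative assumptions, not a validated pipeline.

```python
# Sketch of unsupervised clustering on countermovement-jump features.
# Feature columns and values are hypothetical; published work uses richer,
# validated feature sets.
import numpy as np

rng = np.random.default_rng(0)
# columns: jump height (m), peak force asymmetry (%), eccentric duration (s)
strategy_a = rng.normal([0.40, 3.0, 0.55], [0.03, 1.0, 0.05], size=(20, 3))
strategy_b = rng.normal([0.30, 9.0, 0.75], [0.03, 1.0, 0.05], size=(20, 3))
X = np.vstack([strategy_a, strategy_b])
Xz = (X - X.mean(0)) / X.std(0)          # z-score features before clustering

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's algorithm; keeps a centre in place if its cluster empties."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new_centers = []
        for j in range(k):
            members = X[labels == j]
            new_centers.append(members.mean(0) if len(members) else centers[j])
        centers = np.array(new_centers)
    return labels, centers

labels, centers = kmeans(Xz, k=2)
print("cluster sizes:", np.bincount(labels))
```

In practice the recovered clusters would then be compared against MSKI outcomes to test whether a movement strategy carries elevated risk.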
In addition to their implementation alongside motion capture technology, force plates alone may also provide insight into physical performance and MSKI risk during arduous military training. Furthermore, recent directives from the National Defense Authorization Act require the investigation of force plate technology to lessen the burden of MSKIs.
6. Physiological load monitoring with wearables
To quantify the physiological load and capture real-time assessments of an individual's status during training or discrete operational tasks, strategies and technologies for continuous monitoring via wearables are emerging. Wearables can be categorized as internal or external monitoring devices and have been extensively reviewed for their role in understanding components of performance and resiliency.
Internal monitoring can include heart rate (HR), HR variability (HRV), and oxygen saturation via watches or chest straps to provide insight regarding the internal physiological demands placed on the subject by the external conditions. Alternatively, monitoring via global positioning system (GPS), accelerometers, and inertial measurement units (IMUs) worn on segments of the body can evaluate external load.
Heart rate measurements can provide insight on training load and stress placed on an individual via metrics such as training impulse (TRIMP), which incorporates the intensity and duration of exercise.
Additionally, HRV can be calculated as the variability between heart beats (e.g., RMSSD) and is typically implemented to evaluate for overtraining and/or under-recovery.
The combination of HR and HRV metrics can be used to describe the internal load and whether sufficient recovery has occurred following an exercise bout. For example, increased TRIMP during a field training exercise followed by persistently low RMSSD, compared to baseline values, could be indicative of incomplete recovery from training.
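The two internal-load metrics described above can be sketched in a few lines: Banister's TRIMP from session duration and heart-rate reserve, and RMSSD from successive RR intervals. The exponential weighting shown (0.64, 1.92) is the commonly cited male coefficient set, and all input values are illustrative.

```python
# Sketch of two internal-load metrics: Banister's TRIMP and RMSSD.
# Heart rates and RR intervals below are illustrative, not real data.
import numpy as np

def trimp_banister(duration_min, hr_ex, hr_rest, hr_max):
    """Banister TRIMP: duration weighted by fractional heart-rate reserve."""
    hr_r = (hr_ex - hr_rest) / (hr_max - hr_rest)
    return duration_min * hr_r * 0.64 * np.exp(1.92 * hr_r)  # male coefficients

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

print(f"TRIMP: {trimp_banister(60, 150, 55, 190):.1f} AU")
rr = [812, 790, 835, 804, 821, 799, 828]   # ms, illustrative
print(f"RMSSD: {rmssd(rr):.1f} ms")
```

Tracked together over a training block, a rising TRIMP with a persistently suppressed RMSSD would flag the incomplete-recovery pattern described above.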
External monitoring, via accelerometers or GPS, can provide volume-based measurements (e.g., distance traveled, step count), as well as measures of intensity (e.g., distance traveled at greater than 90 % of peak velocity, time spent in moderate-to-vigorous physical activity) for tracking physiological load. Actigraphy is commonly used in research and clinical settings to monitor physical activity levels and, depending on the specific model implemented, can be worn on the wrist or hip to determine daily step counts, classify physical activity intensity, estimate energy expenditure, and/or monitor sleep/wake activity. In situations where geographical location information can be utilized, GPS uses satellite-based navigational technology to measure movement, speed, distance covered, and accelerations/decelerations. Many commercially available IMUs (e.g., ImeasureU) incorporate accelerometers with other sensors (e.g., magnetometer, gyroscope) to monitor the orientation of loading in the extremities, such as quantifying cumulative impacts and impact asymmetries of the lower limb, at higher fidelity than accelerometry alone.
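As a sketch of how minute-level accelerometer counts are classified into intensity categories, the example below applies Freedson-style adult cut-points; actual cut-points are device- and population-specific and should be verified for the monitor in use.

```python
# Illustrative classification of minute-by-minute accelerometer counts into
# intensity categories. Thresholds follow the widely used Freedson adult
# cut-points, but cut-points vary by device, wear location, and population.
import numpy as np

counts_per_min = np.array([40, 350, 2500, 6100, 1800, 80, 4200])  # illustrative

def classify(cpm):
    if cpm < 100:
        return "sedentary"
    if cpm < 1952:
        return "light"
    if cpm < 5725:
        return "moderate"
    return "vigorous"

labels = [classify(c) for c in counts_per_min]
mvpa_minutes = sum(l in ("moderate", "vigorous") for l in labels)
print(labels)
print(f"MVPA minutes: {mvpa_minutes}")
```

Summing the moderate and vigorous epochs is how daily time in moderate-to-vigorous physical activity is typically reported.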
While these technologies are beneficial for providing continuous data during timeframes of interest, their functionality can be limited by battery life and internal device storage. Balancing the amount of data collected against device constraints is therefore essential for effective implementation. For example, capturing beat-to-beat HR will provide ~100,000 data points daily while compromising the device's local storage and battery life, whereas reducing the frequency of data capture may extend battery life and reduce unnecessary data inputs; some commercial grade companies have addressed this issue with frequent but smaller sampling windows. Similarly, GPS systems may not need to geocoordinate a location every second during a week-long training exercise, and reducing the sampling frequency may significantly increase battery life. For some measures, however, decreasing the frequency at which data are collected may also decrease the level of accuracy.
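The storage trade-off described above is simple arithmetic; the sketch below shows how daily sample counts scale with sampling interval (the 4-byte sample size is an illustrative figure, not a device specification).

```python
# Back-of-envelope sampling/storage arithmetic for continuous monitoring.
# At ~70 beats/min, beat-to-beat HR accumulates on the order of 100,000
# samples per day; thinning the sampling rate shrinks storage proportionally.
SECONDS_PER_DAY = 24 * 60 * 60

def samples_per_day(sample_interval_s: float) -> int:
    return round(SECONDS_PER_DAY / sample_interval_s)

beat_interval_s = 60 / 70   # one sample per beat at an assumed 70 bpm
for interval, label in [(beat_interval_s, "beat-to-beat"),
                        (5, "every 5 s"),
                        (60, "every 60 s")]:
    n = samples_per_day(interval)
    print(f"{label:>12}: {n:,} samples/day (~{n * 4 / 1e6:.2f} MB at 4 B each)")
```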
The level of personalization that can be implemented varies by device, with some providing fixed manufacturer specifications for the frequency of data inputs while others can be modified during setup to change the sampling frequency. Commercial grade technologies have also been adjusted for greater user accommodation by facilitating the off-loading of data from the device via Bluetooth or WiFi transmission to a tablet or local laptop and providing immediate feedback through customizable dashboards at the user interface, which will likely improve their utility and application. Wearable technologies are increasingly pervasive in the market, and expanding practitioner knowledge of device limitations, strengths, and specific use cases is necessary for their proper implementation.
7. Data outputs from commercial grade technology
Due to the quantity of data outputs collected from wearable and screening devices, many technology companies have hired support personnel skilled in data science and analytics to formulate proprietary aggregate measures and user-friendly dashboards. Although promising for “user-friendliness” and expedient “one-stop shop answers,” many of these aggregate measures have yet to be validated by high-quality research for consumer usage.
Commercial grade equipment typically yields three data types: raw time series, aggregate (averaged) data, and/or proprietary black-box cumulative scores. Many commercial devices were primarily developed for general population usage and are thus heavily reliant on user-friendly aggregate measures, black-box scores, and visual trends for data monitoring. However, commercial devices have also been implemented in research settings, which often require access to the raw time series data necessary to validate commercial grade technologies and/or investigate questions of interest. Unfortunately, many commercial grade technologies do not allow access to raw data, due to the potential for "reverse engineering" of proprietary algorithms, or require third-party applications for collection, all of which can hinder research efforts. Additionally, while proprietary black-box scores are often used to give a "one-stop shop" answer, such scores could provide faulty indications of readiness, fatigue, and MSKI risk if not properly verified.
One investigation reported that proprietary algorithms from two commercial grade technologies (SPARTA and DARI) were not predictive of MSKIs in military trainees. Additional work is necessary and currently underway to test the predictive efficacy of commercial grade algorithms. Notably, many such companies are still in their infancy and are working to create robust normative ranges in specific populations, which may improve their algorithms and subsequent predictive utility.
Ultimately, the end goal is to provide actionable usage of the technology for military application. Commercial grade options are promising as industry has the necessary resources available to develop and deploy novel technologies and algorithms for widespread use; however, greater access to raw data beyond algorithm outputs would allow for impartial assessments of device reliability and validity.
8. Analytic pipeline
Assessments of biomarkers during military training and operational scenarios have the potential to improve our understanding of physiology and can be leveraged to optimize performance and reduce injury risk. However, to do so, the data collected must be analyzed and interpreted appropriately to provide actionable information for military leadership (Fig. 2). A review by Merrigan et al.
details a comprehensive approach for how to communicate and visualize data through analytical techniques, which can be applied to all screening and continuous monitoring discussed in this review. Descriptive trends, such as Z-scores and (percentage or absolute) deviations, may give insight into fluctuations during a training cycle and are visually appealing to leadership as quick decision aids, but they are not sufficient for research analyses. Alternatively, data science approaches, such as machine learning and robust statistical analyses, are necessary for research but are rarely used to communicate results in a sport science setting. Therefore, the specific audience and setting must be considered when compiling and reporting data.
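As an illustration of the descriptive trends mentioned above, a Z-score and a percentage deviation can be computed for a new observation against a baseline window of biomarker values. The sketch below uses hypothetical values and helper names of our own choosing, not a method from the cited review:

```python
import statistics

# Hypothetical baseline window of a biomarker (e.g., weekly creatine kinase, U/L)
baseline = [180.0, 210.0, 195.0, 200.0, 205.0]

def z_score(value, reference):
    """Z-score of a new observation relative to the baseline window."""
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)  # sample standard deviation
    return (value - mu) / sigma

def pct_deviation(value, reference):
    """Percentage deviation of a new observation from the baseline mean."""
    mu = statistics.mean(reference)
    return 100.0 * (value - mu) / mu

new_value = 260.0  # observation during a high-stress training block
print(round(z_score(new_value, baseline), 2))       # large positive Z flags an upward shift
print(round(pct_deviation(new_value, baseline), 1))  # deviation as a percentage of baseline
```

Dashboards for leadership would typically plot these trend values over a training cycle rather than report them numerically.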
Fig. 2 Schematic diagram of how biomarker data can be tracked during military training and presented for decision-making purposes. Soldier physiological stress (y-axis) is plotted over time (x-axis) during a challenging multistressor program. Spider charts are presented at varying time points (Baseline, T1, T2, and T3) to represent how biomarker domains may change in response to physiological stress. Spider charts can be used as a visual aid for displaying an individual's biomarker performance at each timepoint and how it may differ from Baseline. For example, at Baseline, similar component scores of Movement, Biochemistry, Bone, Body Composition, and Internal Load are observed (solid line with light fill). From Baseline to T1, physiological stress increases, and the Baseline vs. T1 spider chart displays Movement, Internal Load, and Biochemistry greatly reduced from Baseline (Baseline = solid line with light fill; T1 = dashed line with dark fill). The subsequent spider charts display the following timepoints compared to Baseline, wherein domains return toward baseline following reduced physiological stress and recovery.
Understanding the complex readiness and performance outcomes assessed via multiple biomarker domains during military training will likely require a robust multivariate approach that accommodates multiple physiological inputs with variable cut-offs. Thus, moving away from single risk associations (i.e., a single predictor) toward a web of determinants (i.e., multiple predictors) for understanding factors such as MSKI risk
is desired. Supervised and unsupervised machine learning techniques are two such approaches that may add utility to predicting MSKIs from a battery of human physiological tests. Additionally, this approach can assess multiple domains of human physiology within the same analysis, in which all predictors (e.g., blood, bone, movement) are used to predict a single outcome (i.e., MSKIs).
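As a sketch of the multi-predictor approach (not the pipeline used in the cited work), the example below trains a simple logistic-regression classifier on synthetic data in which four hypothetical predictors drawn from different biomarker domains jointly predict a binary MSKI outcome. All values, distributions, and names are illustrative assumptions:

```python
import math
import random

random.seed(0)

# Synthetic multi-domain predictors per trainee:
# [serum biomarker, bone density, jump height, internal training load]
def make_trainee():
    load = random.gauss(60, 15)
    features = [random.gauss(25, 5), random.gauss(1.2, 0.1),
                random.gauss(30, 8), load]
    label = 1 if load + random.gauss(0, 10) > 70 else 0  # synthetic MSKI outcome
    return features, label

data = [make_trainee() for _ in range(300)]
X = [x for x, _ in data]
y = [t for _, t in data]

# Standardize each predictor so domains measured on different scales are comparable
n_feat = len(X[0])
means = [sum(row[j] for row in X) / len(X) for j in range(n_feat)]
sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X) / len(X))
       for j in range(n_feat)]
Xs = [[(row[j] - means[j]) / sds[j] for j in range(n_feat)] for row in X]

def sigmoid(z):
    # Numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Logistic regression fit by stochastic gradient descent (illustrative only)
w, b, lr = [0.0] * n_feat, 0.0, 0.1
for _ in range(300):
    for x, t in zip(Xs, y):
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        err = p - t
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

# Training-set accuracy of the fitted multi-predictor model
accuracy = sum(
    (sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5) == bool(t)
    for x, t in zip(Xs, y)
) / len(y)
print(accuracy)
```

In practice such a model would be evaluated with held-out data, and regularization and calibration would be needed before any field use; the point here is only that all domains enter one analysis to predict one outcome.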
9. Conclusions
Military readiness in the 21st century will benefit from appropriately leveraging integrative and holistic monitoring technologies for soldier performance, together with data-analytical frameworks that provide military leaders with real-time physiological insight for actionable decisions.
Fig. 3 portrays considerations for maximizing the potential impact of biomarker research, to include 1) multi-modal variable domains, 2) multi-disciplinary teams, 3) laboratory-to-field research, 4) advanced data analytics and modeling, 5) inclusion of the end-user as a key stakeholder, 6) provision of near-real-time feedback, 7) use of feasible, acceptable, and suitable technologies, and 8) a focus on quality science, transition, and impact. The extent to which the US military can leverage biomarker physiology to inform and enhance military training and readiness will confer competitive advantages over near-peer adversaries in multi-domain battle. As demonstrated by the Code 34 Warfighter Performance research program described herein, and consistent with Office of Naval Research efforts to identify innovative scientific and technological solutions to address current and future military requirements, the utilization of novel technologies and physiological predictors of readiness is an important priority for ensuring the health of military service members.
Fig. 3 Considerations for researchers to maximize the potential impact of biomarker research.
We acknowledge the efforts of the following individuals for their assistance in initializing and supporting our work at Quantico, VA Officer Candidates School, and its continued progress: COL Stephen Armes (COL, USMC, ret.), COL Michael Lee Rush, COL David Hyman, TECOM Brian McGuire (COL, USMCR, ret.), LCDR Josh Swift, LT Garrett Morgan, CAPT Lindsay Carrick, CAPT Alexzander Szallar, CAPT Whitney Staton, Angelique Bannister, Angelito Vera Cruz, Leah Watson, LCDR Lauren Specht, and all other OCS staff who assisted.
The following projects were funded by the Office of Naval Research: Developing a Warfighter Mobility Signature and Predictive Algorithm for Musculoskeletal Injury Risk During Marine Corps Candidate School, Grant Number: N00014-20-C-2020; Development of a Physical Readiness Decision Tool to Leverage Wearable Technologies for Monitoring Warfighter's Mobility and Load Exposure, Grant Number: N00014-21-1-2725. The funding agency did not have a role in the development of the manuscript.
References
Nindl B.C., Williams T.J., Deuster P.A., et al. Strategies for optimizing military physical readiness and preventing musculoskeletal injuries in the 21st century.
Utility of circulating IGF-I as a biomarker for assessing body composition changes in men during periods of high physical activity superimposed upon energy and sleep restriction.
Physiological and psychological effects of testosterone during severe energy deficit and recovery: a study protocol for a randomized, placebo-controlled trial for Optimizing Performance for Soldiers (OPS).
Reductions in urinary collection frequency for assessment of reproductive hormones provide physiologically representative exposure and mean concentrations when compared with daily collection.
Body composition for health and performance: a survey of body composition assessment practice carried out by the Ad Hoc Research Working Group on Body Composition, Health and Performance under the auspices of the IOC Medical Commission.
Comparison of air-displacement plethysmography with hydrostatic weighing and bioelectrical impedance analysis for the assessment of body composition in healthy adults.
Unsupervised clustering techniques identify movement strategies in the countermovement jump associated with musculoskeletal injury risk during US Marine Corps Officer Candidates School.