Editor’s note: Last year, Army Futures Command’s Directorate of Concepts announced an essay contest to generate new ideas and expand the community of interest for the Army’s next operating concept focused on 2040. Contest entries were invited to respond to the following question: With AI maturing, autonomous systems and robotics becoming more prevalent on the battlefield, and battlefield transparency increasing, how should Army forces operate, equip, organize, and array themselves on the battlefield of 2040 to overcome those challenges? This entry, from Lieutenant Colonel Brian Forester, was selected as the overall winner.
Technology alone is not enough for the Army to win in the data-rich environment of 2040. Data already permeates virtually every facet of life. However, the earlier promises of big data have been replaced by the perils of data overload. The Army’s emerging approach to the data overload challenge emphasizes technological tools such as advanced analytics software, powered by artificial intelligence (AI) and enabled by a tech-savvy talent base. The promise of AI is that it can close the gap between the information-processing requirements of future military operations and the limitations of human cognition. Sophisticated algorithms cull vast amounts of data, identify patterns, and forecast outcomes.
Will AI-driven analytics lead to better military decision-making? A pitfall lurks at the intersection of proliferating data, increasingly advanced analytical tools, and decision-making tendencies embedded in human psychology: analysis paralysis. Under this condition, organizations become consumed with iteratively collecting and analyzing data at the expense of making timely decisions. With a near-infinite array of variables and model specifications, and the tools to explore them, the options for data analysis are virtually endless. As psychologist Barry Schwartz captures in his work on the paradox of choice, more options heighten the fear of making the wrong choice, leading to paralysis. Not surprisingly, this phenomenon is increasingly common in organizations employing analytics to aid decision-making. Risk-averse organizations, such as the Army, are especially susceptible, as leaders default to gathering and analyzing more data rather than committing to a decision.
The conditions for analysis paralysis are ripe now and will be riper still for the Army of 2040. Commanders and staffs eager to feed new data to their algorithms will run the risk of pushing their organizations into analysis paralysis. The very technological tools designed to help military organizations make decisions with data may have the opposite effect. The solution to this quandary is not found in technology, or even in building a cadre of uniformed data specialists. It requires decision makers—commanders and the staffs supporting them—who think clearly about what they hope to learn from data. The data-driven Army of 2040 must have leaders who think clearly about potential outcomes of interest, the uncertainty inherent in statistical estimates, and efficient strategies to uncover relationships among relevant variables. Clear-thinking leaders will turn data centricity into future battlefield victory.
Clear Thinking Is Paramount
Avoiding analysis paralysis begins with a crystallized understanding of the question underlying any analytic effort. It requires thinking. University of Chicago political scientists Ethan Bueno de Mesquita and Anthony Fowler argue this point explicitly in Thinking Clearly with Data: A Guide to Quantitative Reasoning and Analysis. The authors open with the assertion that “thinking clearly in a data-driven age is, first and foremost, about staying focused on ideas and questions.” This maxim is especially relevant for Army leaders who will navigate the complexity of data-driven operations in 2040. Clear thinking in data-driven operations is less about technical acumen than about applying sound quantitative reasoning to outcomes of interest, the uncertainty of estimates, and relationships among variables.
To enhance decision-making, analytics focus on how some feature of the world (the outcome) relates to other features of the world (the predictors). For instance, how battlefield performance (outcome) relates to military technology (predictor) is one such relationship of interest to Army leaders. Bueno de Mesquita and Fowler note that one of the most common mistakes when reasoning with data is “selecting” on the outcome variable. When we only examine cases with similar or identical values of the outcome (e.g., battlefield victories), we lose the information associated with the alternative outcome (e.g., battlefield losses). Such one-sided analysis is common in studies of military innovation. Eliminating variation in the outcome variable prevents the discovery of correlations with predictors, as “correlation requires variation.” Clear-thinking commanders and staffs will ensure they obtain data with variation on the outcome variable. Failing to do so will result in a biased analysis with low predictive power. Frustrated by the mismatch between expectations and reality, organizations will continue churning on new data and only deepen paralysis.
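To make the point concrete, the short sketch below uses synthetic data and a hypothetical predictor (a technology-advantage score) to show why correlation requires variation: once the sample is restricted to victories alone, the outcome no longer varies, and nothing can be learned about the predictor.

```python
# Illustrative sketch (synthetic data): why "correlation requires variation."
# Restricting the sample to a single outcome value (victories only) destroys
# the variation needed to estimate any relationship with a predictor.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

tech_advantage = rng.normal(size=n)                                # hypothetical predictor
victory = (tech_advantage + rng.normal(size=n) > 0).astype(float)  # hypothetical outcome

# Full sample: the outcome varies, so a correlation can be estimated.
print(np.corrcoef(tech_advantage, victory)[0, 1])  # roughly 0.56

# "Selected" sample: victories only -- the outcome no longer varies.
wins = victory == 1
print(victory[wins].std())                         # 0.0: no variation left,
# so no correlation with the predictor can be recovered from this sample.
```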
Battlefield performance, like most outcomes encountered in the world, is a function of both signal and noise. Signal is the meaningful and systematic pattern of activity, the detection of which is the focus of machine learning algorithms. Sophisticated algorithms use predictors to detect signal and predict future outcomes. Numerous predictors can carry the signal of battlefield performance, including military skill, will, organization, and leadership. Noise is the idiosyncratic, random component of outcomes, the realm of chance that interferes with prediction. Unexpected weather, equipment breakdowns, and breaks in communications are examples of the noise leading to uncertainty in battlefield performance.
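A minimal sketch, again on synthetic data with hypothetical numeric predictors standing in for skill, will, organization, and leadership, illustrates the distinction: a simple model can recover the signal weights, but the noise component sets a floor under prediction error that no algorithm can remove.

```python
# Minimal sketch (synthetic data): an outcome as signal plus noise.
# A fitted model can recover the systematic signal carried by the predictors,
# but the noise component sets a floor on how well any model can predict.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Hypothetical numeric predictors: skill, will, organization, leadership.
X = rng.normal(size=(n, 4))
true_weights = np.array([0.8, 0.6, 0.4, 0.5])    # the "signal"
noise = rng.normal(size=n)                       # chance: weather, breakdowns
performance = X @ true_weights + noise

# Ordinary least squares recovers the signal weights...
est_weights, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(np.round(est_weights, 2))                  # close to the true weights

# ...but the residual error never falls below the noise level.
residuals = performance - X @ est_weights
print(round(residuals.std(), 2))                 # roughly 1.0, the noise scale
```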
Commanders and staffs must think clearly about both signal and noise. Estimates of the signal will carry uncertainty, and the noisier the data, the higher this uncertainty will be. Staff officers must clearly communicate the uncertainty associated with an estimated outcome to their commanders. Conversely, commanders must resist the temptation to overcorrect when the observed outcome of an operation does not go exactly as an algorithm predicted. As Bueno de Mesquita and Fowler note, extreme observations will typically be followed by observations closer to the mean (average) for any outcome that is a function of both signal and noise. If bad luck with chance variables such as weather or timing affects the outcome, it may be prudent for commanders to exercise tactical patience before hastily changing course and levying new analysis demands on their organizations. In other cases, the problem may be with signal detection, and reassessing the weight of predictors underlying the algorithm may be warranted. In the early phases of Russia’s war in Ukraine, for example, Western analysts arguably overweighted skill and technology while undervaluing will and leadership. Clear thinking about signal and noise is essential for efficient analytical strategies that avoid paralysis.
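The sketch below, again on synthetic data, illustrates this regression to the mean: units with the most extreme first-operation results post second-operation results noticeably closer to the average, even though nothing about the units themselves changed.

```python
# Illustrative sketch (synthetic data) of regression to the mean: when an
# outcome mixes signal and noise, extreme results tend to be followed by
# results closer to the average even though the underlying unit is unchanged.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

skill = rng.normal(size=n)                  # persistent signal for each unit
first = skill + rng.normal(size=n)          # first operation: signal + luck
second = skill + rng.normal(size=n)         # second operation: fresh luck

worst = first < np.percentile(first, 5)     # units with extreme first results
print(round(first[worst].mean(), 2))        # far below average (about -2.9)
print(round(second[worst].mean(), 2))       # partway back toward 0 (about -1.5)
# The rebound is luck evening out, not proof that a hasty change of course worked.
```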
Sometimes the goal is not prediction but understanding the relationship between a specific predictor and an outcome. Exploring such a relationship requires careful consideration of—and controlling for—confounding variables that may influence both the predictor and the outcome. For example, democratic regime type has long been associated with superior battlefield performance; all else equal, democracies outfight nondemocracies. Subsequent research demonstrates, however, that democracy does not predict battlefield performance when also controlling for economic development. The exclusion of economic development from statistical models in earlier research biased the estimate of democracy’s effect. In the same way, Army leaders must think clearly about potential confounders that influence both the predictors and outcomes they care about. Failing to account for them can bias estimates of the predictor’s effect, leading to incorrect conclusions regardless of model sophistication.
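The sketch below uses synthetic data constructed to mirror this logic; the variable names are illustrative, not drawn from the cited research. When economic development drives both regime type and performance, a model that omits development misattributes its effect to democracy, while controlling for it removes the bias.

```python
# Hedged sketch (synthetic data) of a confounder: here economic development
# drives both regime type and battlefield performance, and democracy has no
# true effect. Omitting development biases the estimate; controlling for it
# removes the bias.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

development = rng.normal(size=n)                        # confounder
democracy = (development + rng.normal(size=n) > 0).astype(float)
performance = development + rng.normal(size=n)          # no true democracy effect

ones = np.ones(n)

# Naive model: performance ~ democracy (development omitted).
X_naive = np.column_stack([ones, democracy])
print(np.round(np.linalg.lstsq(X_naive, performance, rcond=None)[0], 2))
# The democracy coefficient is large and positive -- pure confounding bias.

# Controlled model: performance ~ democracy + development.
X_ctrl = np.column_stack([ones, democracy, development])
print(np.round(np.linalg.lstsq(X_ctrl, performance, rcond=None)[0], 2))
# With development included, the democracy coefficient shrinks toward zero.
```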
Experimentation is an efficient way to analyze the relationship between a treatment (predictor) and outcome. Robust experimentation is essential to keep pace with the rapidly changing character of combined arms warfare. Properly designed randomized controlled trials cut through the noise of data with random assignment to treatment and control conditions. Using the scientific method as a guide, Army leaders considering experimentation must think clearly about the unit of analysis, the hypothesis of interest, and the assignment of units to treatment and control. Well-designed experiments can quickly lead to robust learning about the relationship using relatively limited data. Conversely, poorly designed experiments can lead to confusion over the nature of the relationship, increasing the need for further analysis.
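A final sketch, again on synthetic data with a hypothetical treatment (say, a new piece of equipment fielded to half of a set of platoons), shows how random assignment lets a simple difference in means recover the treatment effect, along with the uncertainty that should be reported alongside it.

```python
# Minimal sketch (synthetic data) of a randomized experiment: random assignment
# of units to treatment and control lets a simple difference in means estimate
# the treatment effect without modeling every source of variation.
import numpy as np

rng = np.random.default_rng(4)
n = 400                                     # hypothetical units (e.g., platoons)

baseline = rng.normal(size=n)               # unobserved differences among units
treated = rng.permutation(n) < n // 2       # random assignment, half and half
true_effect = 0.5                           # hypothetical effect of new equipment
outcome = baseline + true_effect * treated + rng.normal(scale=0.5, size=n)

diff = outcome[treated].mean() - outcome[~treated].mean()
se = np.sqrt(outcome[treated].var(ddof=1) / treated.sum()
             + outcome[~treated].var(ddof=1) / (~treated).sum())
print(f"estimated effect: {diff:.2f} +/- {1.96 * se:.2f}")
# The +/- band is the uncertainty a staff should report with the point estimate.
```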
Toward Clear-Thinking Leaders
How does the Army cultivate clear-thinking leaders for the data-driven force of 2040? An initial step is the reform of all levels of its professional military education system to give leaders a baseline set of critical-thinking skills to operate in a data-saturated environment. Thinking Clearly with Data provides a good reference point for material that could serve as a foundation for such efforts. Cultivating widespread habits of thought, though, will necessarily go beyond professional military education. A number of innovative data-literacy programs now exist through Army People Analytics, XVIII Airborne Corps, and Joint Special Operations Command. Expansion of these types of training programs across the force is critical to the development of the clear-thinking leaders needed in 2040. And perhaps more important than workforce development, unit-level training programs foster the broader cultural change the Army needs: one that emphasizes evidence-based reasoning and clear thinking with data.
Several minor structural changes could also help leaders employ sound quantitative reasoning techniques. Private industry increasingly recognizes the value of analytics translators in connecting decision makers with the technical specialists analyzing data. Critically, these personnel translate a leader’s intent into variables of interest that can be collected, measured, and analyzed by technical specialists. As the synchronizer and functional integrator, the operations officer should fill this role in Army organizations. Intermediate Level Education should be adapted to train this skill set.
The Army should also consider placing data-management specialists in operations sections down to the battalion level. While most AI-driven capabilities will likely reside at echelons above battalion, the effectiveness of those capabilities depends on the quantity of high-quality data fed to them. The demand for high-quality data will thus be constant. Data-management specialists at the battalion level should be highly skilled in the data preparation necessary to keep pace with demand, while also serving as a source of expertise for data efforts across the organization. Much as a trained master gunner is indispensable to a combined arms battalion, the data-management specialist will be a critical asset to data-driven operations at the tactical level. Finally, the Army must continue experimenting with new formations that enhance data-centric problem solving among commanders and staffs. Such initiatives will have a broader spillover effect across the Army.
AI-driven analytics are rightly making a splash in Army modernization efforts. The Army is rapidly moving to develop the technological tools and talent necessary for a data-driven force. But exploiting the advantages of such a force on future battlefields will hinge on the leaders directing it. The Army must ensure its leaders know how to learn from data, while avoiding the pernicious effects of analysis paralysis increasingly prevalent in data-driven organizations. Doing so requires clear thinking. Commanders and staffs must think clearly about outcomes, recognize the uncertainty inherent in noisy data, and employ efficient strategies to disentangle relationships among variables of interest. Otherwise, the Army of 2040 will struggle to capitalize on the advantages offered by the technological tools and talent it is investing in today.
Lieutenant Colonel Brian Forester is an Army officer and Goodpaster Scholar who recently completed a doctorate at the University of North Carolina at Chapel Hill. His research uses computational and design-based tools of social science to analyze international military cooperation.
The views expressed are those of the author and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Image credit: Maj. Jason Elmore, US Army