Our nation has given soldiers, sailors, airmen, and Marines access to computers that can perform millions of operations per second and store more information than a human can read in a lifetime. Most use them to create PowerPoint presentations and send emails. The obvious mismatch between capability and application screams that something has gone wrong. Integrating advanced digital technology into warfare should be about more than designing and delivering new applications and hardware. While technology changes can drive changes in warfare, we cannot expect warfighters to change the way they fight without also changing the way they think. That statement may sound like a cliché, but military planning and decision-making processes have changed little in decades, do not take advantage of the information-processing power now available, and instead look like PowerPoint replications of pen-and-paper processes.
Computational thinking “is the thought processes involved in formulating problems and their solutions so that the solutions are represented in a form that can be effectively carried out by an information-processing agent.” Phrased differently, computational thinking involves breaking problems into their component pieces and identifying which can be optimized, automated, or scaled by a computer. If the military does not improve how it teaches computational thinking, warfighters will not be able to capitalize on the many advantages of machine speed and scale, or to build, interact with, and responsibly delegate authority and decision-making to digital systems. The lessons learned process offers an example of the impact computational thinking can have on military operations, even without the addition of new technology.
The Traditional Approach: Lessons Learned Documents
Our fictitious unit is conducting a training event centered on how likely an electronic warfare (EW) system is to work in different circumstances. Because these training events take place infrequently, most of the personnel who participate will no longer be in the unit during the next iteration. Good units try to ensure they share the benefits of their training with those who follow them.
One way to share knowledge gained through training is to draft lessons learned documents. Well-disciplined military units collect and record lessons learned after training exercises and operations. Typically, units capture lessons during after action reviews and record them in slide decks and memos. Those documents are then shared internally, with neighboring units, or more broadly through institutions like the Center for Army Lessons Learned and databases like the Joint Lessons Learned Information System. The most important lessons are integrated into the unit’s standard operating procedures (SOPs).
This collection and dissemination process is the approach most good units take, but it has to overcome several challenges to maximize effectiveness. Often, servicemembers are hard-pressed to find time to review lessons learned documents. Units preparing for a major training event typically have many administrative and staff requirements competing for limited time and personnel. Unfortunately, in the rush to complete these tasks, reviewing lessons learned documents often falls by the wayside. Even when lessons learned have been incorporated into SOPs, they are often removed when new chains of command review and update those instructions. Thus, lessons learned are easily forgotten. Finally, even when servicemembers read and sufficiently understand lessons learned, they can easily—either consciously or unconsciously—default back to their own experiences and biases.
Building Models
Organizations that prioritize computational thinking have alternative approaches that avoid many of the above challenges and tap into the advantages of machine speed and scale. Rather than drafting a lessons learned document, units could create models. The most basic version would be a series of If-Then statements in a program like Microsoft Excel. To determine the likelihood the EW system will work, servicemembers would identify the variables that influence the EW system, their possible values, and their effects, then encode them as If-Then statements.
As an example, the EW system may have four influential variables, each with three possible values, for a total of eighty-one (three to the fourth power) different scenarios that must be analyzed. After establishing the variables, the soldiers would set an outcome for each of the scenarios in probabilistic terms. If filling in eighty-one scenarios seems excessive, keep in mind that this is the exact same volume of information that a thorough lessons learned document would need to address to maximize its effectiveness.
Once the model is complete, users would simply select values for each of the four variables, and the model would provide the probability that the EW system would produce the desired effect. It would be relatively simple to combine models for multiple systems to understand which would have the highest probability of producing a desired outcome in a given scenario.
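To make the If-Then approach concrete, here is a minimal sketch of such a lookup model. It is written in Python for readability, though the same table lives just as naturally in eighty-one spreadsheet rows with a lookup formula. The variable names (terrain, weather, range, emitter power) and every probability are illustrative assumptions, not data about any real system.

```python
from itertools import product

# Illustrative variables and values; real ones would come from the unit's
# experience with its EW system. All names and numbers here are hypothetical.
VARIABLES = {
    "terrain": ["open", "urban", "forest"],
    "weather": ["clear", "rain", "storm"],
    "range": ["short", "medium", "long"],
    "emitter_power": ["low", "medium", "high"],
}

# One If-Then entry per scenario: 3 x 3 x 3 x 3 = 81 combinations in total.
# A unit would fill these in from after action reviews; 0.5 is a placeholder.
effectiveness = {combo: 0.5 for combo in product(*VARIABLES.values())}

# A few hand-set entries standing in for lessons the unit actually learned.
effectiveness[("open", "clear", "short", "high")] = 0.90
effectiveness[("urban", "storm", "long", "low")] = 0.15

def probability_of_effect(terrain, weather, range_, emitter_power):
    """Return the recorded probability that the EW system works in this scenario."""
    return effectiveness[(terrain, weather, range_, emitter_power)]

print(probability_of_effect("open", "clear", "short", "high"))  # 0.9
```

Combining models for multiple systems would amount to querying each system’s table for the same scenario and comparing the returned probabilities.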
Alternatively, units could also build much more powerful Bayesian models, still just using a spreadsheet. Bayesian models leverage prior knowledge and assumptions, then incorporate updates based on a stream of data points. In this case, the prior knowledge and assumptions would be experience-based knowledge about the probability of EW system effectiveness in various scenarios, much like the If-Then statements. The stream of data points would come from evaluations of the EW system’s effects after additional uses. This would enable the model to improve over time, becoming more effective as long as the effects of the scenario’s influential variables remain consistent.
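One common and simple way to build such a model is a Beta-Binomial update, sketched below for a single scenario, again in Python for readability; the same arithmetic fits in a handful of spreadsheet cells. The prior counts are hypothetical stand-ins for the unit’s experience-based estimate, not real data.

```python
# Beta prior for one scenario (e.g., open terrain, clear weather): prior
# experience encoded as "roughly 9 successes in 10 tries." Hypothetical numbers.
prior_successes = 9.0
prior_failures = 1.0

def update(successes, failures, worked):
    """Fold one new observation of the EW system into the model."""
    return (successes + 1, failures) if worked else (successes, failures + 1)

def estimate(successes, failures):
    """Posterior mean probability that the system works in this scenario."""
    return successes / (successes + failures)

s, f = prior_successes, prior_failures
print(f"prior estimate: {estimate(s, f):.2f}")    # 0.90

# The system fails twice during the next exercise; the estimate shifts.
for worked in (False, False):
    s, f = update(s, f, worked)
print(f"updated estimate: {estimate(s, f):.2f}")  # 0.75
```

Each new employment of the system is a one-line update, which is what lets the model keep improving for as long as the influential variables behave consistently.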
Creating a model can be a much more effective way to share lessons learned than creating a memorandum or slide deck. While lessons learned documents take time to review and may not be understood even when read, servicemembers could use the model with very little time and effort by inputting key variables and reading the model’s output. This would also shorten decision-making timelines, especially in environments where commanders and staffs are suffering from information overload and fatigue. While servicemembers would still need to understand how the EW system works, a model would reduce the amount of knowledge and experience needed to employ it.
Data-informed models also benefit from a strong empirical basis. In the military, lessons learned documents tend to rely on intuitively derived lessons and are intended to help shape a reader’s intuition. While that works adequately in many cases, and is often the only way to function in an ambiguous and uncertain environment, it is incredibly susceptible to bias, including unconscious bias. While models will never fully escape their developers’ biases, a firm empirical grounding will improve the model’s accuracy as long as the servicemembers who develop and maintain the model are adequately trained.
A Bayesian model’s greatest advantage over a lessons learned document is that it can learn at machine speed and scale. Computers can perform millions of calculations per second and digest vast amounts of information. Limits arise from processing power, bandwidth, and access to data but not from the number of people involved. Organizations that identify choke points in their workflow and use machines to do that work can deliver value faster and at a much greater scale than those that remain bound by human limitations. Humans continue to contribute by guiding the machines and by performing the tasks machines cannot.
When units rely on lessons learned documents alone, the rate at which they learn is limited by human learning speed. First, the initial author must learn the right lessons from their experiences, incorporate that information into a lessons learned document, and share the document. A second human then needs to find, read, understand, and apply those lessons for them to be truly learned. As a result, the entire organization can only learn at the speed that individuals can learn and share lessons, no matter how many people are involved.
In contrast, a Bayesian model can learn and improve its outputs as quickly as new data is added, slowed only by processor speed and bandwidth. Updates can come from an entire network. In this case, every servicemember using the EW system could use and update the same model, capturing the lessons of dozens or hundreds of users rather than just a handful in one unit or formation. The more users the model has, the more lessons it captures and the faster it learns. The model will improve even faster when data collection and integration are automated, as long as it is regularly validated to confirm its accuracy and guard against optimizing toward an implicit bias.
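As a rough sketch of what network-scale updating might look like, the loop below folds a stream of reports from many hypothetical users into one shared set of per-scenario counts. The scenarios and outcomes are invented for illustration, and an automated ingestion pipeline would replace the hand-written list.

```python
from collections import defaultdict

# Shared model: every report of an EW employment, from any unit, updates the
# same per-scenario counts. Weak hypothetical prior of [1 success, 1 failure].
counts = defaultdict(lambda: [1.0, 1.0])

reports = [
    # (scenario, whether the system produced the desired effect) -- invented
    (("urban", "rain", "short", "high"), True),
    (("urban", "rain", "short", "high"), True),
    (("urban", "rain", "short", "high"), False),
    (("open", "clear", "long", "low"), False),
]

for scenario, worked in reports:  # in practice, ingested automatically
    counts[scenario][0 if worked else 1] += 1

for scenario, (s, f) in counts.items():
    print(scenario, f"estimated effectiveness: {s / (s + f):.2f}")
```

The regular validation the paragraph describes would sit alongside this loop, comparing the model’s estimates against held-out outcomes before anyone acts on them.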
Skills Needed
Switching from capturing lessons learned in PowerPoints to building models seems like a sharp departure from the military’s way of doing business, but it is much more feasible than it might appear. Revising our way of processing information can be done by the current force with just a few additional skills. First, servicemembers need to understand the actual problem they are trying to build a model to address. In the above scenario, it would be very easy to assume that the challenge is to write a better lessons learned document. That is ancillary to the real problem. In reality, the challenge is to share the benefits of a specific experience without requiring others to repeat the experience and its associated cost in time and money.
Second, they would need enough of a basic understanding of models to know that a Bayesian model might solve their challenge more effectively than a lessons learned document. They would also need the ability to create such a model. This would not require a mastery of Bayesian inference or statistics—just fundamental skills in Microsoft Excel and access to tutorials on YouTube. They would also need to understand how to collect and enter data to update the model, how to interpret outputs stated in probabilistic terms, and when to trust their model. Models are not well suited for every type of problem. Even when they are well suited, servicemembers still need to think critically about the model’s output and to become comfortable with trusting the model.
It is worth noting that all of this can be done without any knowledge of computer languages or coding. Servicemembers still need to understand their EW system. Models can be helpful, and would somewhat deskill the decision-making process, but they are not a substitute for understanding how a system functions, its likely faults and failures, how it interacts with other weapon systems, and where its use fits into the larger operational plan. Computational thinking, though, rapidly expands the reach of that knowledge.
Moving Forward
Everything described up to this point is possible using simple, unsophisticated tools. Computational thinking will take on an entirely new level of importance as the military begins to integrate artificial intelligence into everyday operations. Unfortunately, leaders cannot just wish to improve their servicemembers’ computational thinking and make it so. To turn this aspiration into a reality, the military services need discrete topics they can train and teach.
The National Security Commission on Artificial Intelligence recommends the military integrate five topics into junior leader training and education. They are:
- Problem curation, or discovering the causal mechanisms that lead to problems and associated issues. Problem curation is important for understanding the root cause of an issue and for determining whether a computer can help solve it.
- The AI lifecycle, which is a framework that simplifies the complex process of developing and deploying an AI-enabled application. A better understanding of the AI lifecycle will encourage leaders to invest in the important but deeply unsexy work that must take place before applications are developed, and prevent leaders from trying to “sprinkle on some AI” after a system is mature.
- Data collection and management. The military’s data sets are often terrible, hamstringing efforts to develop AI-enabled systems. This is a problem everyone in the military contributes to, and one everyone will need to help solve. Much like equipment maintenance, marksmanship, and first aid, data collection and management need to become skills for every servicemember.
- Probabilistic reasoning and data visualization. Many AI applications and models present their outputs in probabilistic terms. Servicemembers need to understand enough about probabilistic reasoning to understand their system’s output in a given situation. They also need to understand the limits of probabilistic reasoning as part of problem curation.
- Data-informed decision-making, or the ability to use data to generate insights, then act on those insights. This is as opposed to purely intuitive or experience-based reasoning.
Computers already help military personnel perform their jobs more effectively. Digital technology is becoming ever more powerful, and has the potential to change the way the military does its normal business, plans, and conducts operations, especially as the Department of Defense integrates AI. But most servicemembers have only begun to scratch the surface of what the average laptop can do for them. Without better training, servicemembers will be left relying on analog tools and processes little changed since the 1980s while the rest of the world, including potential adversaries, continues to move into the machine age. No matter how powerful technology becomes, we cannot just build systems and applications and expect our warfighters to use them to their full potential. For that, they need to learn to think differently.
Justin Lynch served as an Army officer and at the House Armed Services Committee, the National Security Commission on Artificial Intelligence, and the Office of Science and Technology Policy. He is a Nonresident Fellow at the Modern War Institute and a term member of the Council on Foreign Relations. The views expressed in this article are his own and not necessarily those of his employer.
Alexander Mann is a graduate student at the University of Maryland and contributed to the National Security Commission on Artificial Intelligence’s recommendations. Previously, Alex co-authored papers on defense industrial mobilization, AI chips, and the semiconductor supply chain. Prior to entering national security policy, Alex worked in consulting as an industrial engineer.
The views expressed are those of the authors and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Image credit: Sergeant First Class Brent Powell, US Army