Image courtesy of Flickr user US Army.
By Captain Justin Lynch, US Army
Series Introduction
This paper is the second of a three-part discussion of uncertainty and unpredictability in warfare. The first reviewed strategic unpredictability caused by an opaque enemy decision-making process. This second installment discusses uncertainty and unpredictability at the operational level of war, particularly for counterinsurgencies. The third will discuss uncertainty and unpredictability at the platoon and company level, both for linear warfare and counterinsurgencies.
Unpredictability at the Operational Level of War
The last 14 years of warfare have seen a shift in the American military’s focus. Instead of confronting traditional state actors, most fighting has been an attempt to establish order in lawless areas. These conflicts have pitted, and will continue to pit, militaries with a preponderance of combat power against elusive forces that use a strategy of exhaustion. The military’s greatest challenge is not its enemy’s tactical virtuosity. It is an uncertain and unpredictable environment.
The Creation of Extremes
Warfare lends itself to the creation of extremes. One extreme is the desire for information. Planning creates a reciprocal action between planners and intelligence personnel. Planners want more information. Their intelligence, surveillance, and reconnaissance assets provide what information they can, but never enough to sate the planner. Lack of satisfaction drives demand for more information, and the development of new systems and technology to provide it. The new systems and technology create information abundance, establishing a new standard for the minimum amount of information required for operations. Taken to its logical extreme, the reciprocal action leads to absolute knowledge of the operating environment.
Two limiting factors prevent militaries from realizing absolute knowledge. The first is uncertainty, defined as a lack of comprehension of the operating environment, due to a lack of precise knowledge, an inability to understand what data is relevant, or the possession of incorrect data. The second is unpredictability, the inability to forecast the outcome of actions.
The limiting factors pull reality away from the logical endpoint of the planning process’s reciprocal action. Militaries operate in an environment where they have some grasp of their surroundings and the consequences of their actions, but never enough to definitively know the right course of action. Militaries that perceive the structure of uncertainty and unpredictability will understand their environment more effectively, and can use that structure as another planning factor, and possibly a weapon, instead of merely a source of friction. The best way to understand the structure of unpredictability and uncertainty in modern war is to view states as complex adaptive systems.
Modeling the World
Humans understand the world by developing models of cause and effect. To predict consequences, people need models based on the correct concepts. One of the standard methods for prediction, empiricism followed by data analysis, does not give sufficient attention to causality, relies too heavily on correlation, and is open to inaccuracies. Without the correct conceptual framework, data is easily misinterpreted. Once the correct model exists, if it accounts for uncertainty and unpredictability, the understanding gained can help soldiers operate more effectively. Complex adaptive systems are the most effective model of states waging counterinsurgencies.
Four characteristics define complex adaptive systems: they have many agents whose interactions produce nonlinear results, their agents are interdependent, their structures span multiple scales, and they display emergent behavior.[1] The many agents in a complex adaptive system can be the large number and variety of organisms in an ecosystem, or the numerous buyers and sellers in a stock market. In nonlinear systems, behavior at one point may not predict behavior at other points, while in a linear system performance at any point can forecast the entire system. A linear function with a slope of two at one point will have a slope of two at every point, so behavior viewed at any point indicates the entire function’s appearance. In a nonlinear function, a point with a slope of two indicates nothing about the slope at any other point. While this difference is not particularly insightful when viewing graphical displays of functions, for real world systems it means that proportionate causes do not have proportionate or predictable effects. An input of two at one point results in an output of six, while at another point it causes an output of 90, reducing the effectiveness of empiricism and analysis.
The agents in a complex adaptive system are interdependent. Returning to the ecosystem and market examples, in each case if a significant portion of the system were to vanish, the behavior of the remaining agents would change in response. In a system of independent agents, such as a bucket of water, removing a significant portion of the constituent parts, the water molecules, would not significantly change the remaining agents’ behavior. Interdependency causes many actions to have consequences that unpredictably ripple across the entire system, especially when combined with nonlinearity.
Complex adaptive systems have structures that span multiple scales. An ecosystem’s structure exists at the level of molecular biology, individual organisms, predator-prey behaviors, and the entire ecosystem as a whole. Similarly, in a stock market, an individual agent has a financial structure that exists at a different scale than a hedge fund, a stock exchange, or the global market.
Emergent behavior is system behavior observed at one scale that scientists cannot predict by examining the constituent parts at the scale below, breaking down the power of traditional reductive analysis. Biologists cannot predict the impressive structures built by some termite species by observing individual termites.[2] Likewise, observing an isolated stock broker will not reveal the rise and fall of an entire market. One effect of emergent behavior is that scientists cannot accurately think of a complex adaptive system as the sum of its parts. Instead, it is the sum of its parts and the unpredictable interactions between those parts.
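Emergence is easier to see in a toy model than to define. The sketch below is a minimal illustration in the spirit of Schelling-style segregation models, not drawn from the sources cited here; every parameter is invented. Agents with only a mild preference about their neighbors produce stark clustering that no individual sought, and that no single agent’s rule would reveal.

```python
import random

# Illustrative sketch of emergent behavior: a one-dimensional neighborhood of
# two agent types ('A' and 'B') with some empty cells ('.'). Each agent merely
# wants at least a third of its nearby neighbors to share its type.
random.seed(1)
SIZE, RADIUS, TOLERANCE = 60, 2, 1 / 3
grid = [random.choice("AB.") for _ in range(SIZE)]

def unhappy(i):
    """An agent is unhappy if too few of its nearby neighbors match it."""
    if grid[i] == ".":
        return False
    neighbors = [grid[j] for j in range(max(0, i - RADIUS), min(SIZE, i + RADIUS + 1))
                 if j != i and grid[j] != "."]
    return bool(neighbors) and sum(n == grid[i] for n in neighbors) / len(neighbors) < TOLERANCE

for step in range(500):
    movers = [i for i in range(SIZE) if unhappy(i)]
    if not movers:
        break
    i = random.choice(movers)
    j = random.choice([k for k in range(SIZE) if grid[k] == "."])
    grid[j], grid[i] = grid[i], "."  # the unhappy agent moves to an empty cell

# Clusters of like types form even though each individual preference is mild.
print("".join(grid))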
States combating insurgencies are complex adaptive systems. The many different individuals and groups in a state are the agents, and their interactions have the disproportionate consequences characteristic of nonlinearity. A state’s agents are interdependent economically, socially, and structurally. A state’s structure spans many scales, from the individual to the city and the region. States display emergent behavior, including party politics, crime, and industrial development.
The Utility of Complexity Theory
In complex adaptive systems, the relationships between actors shape the system’s behavior more than the nature of the actors themselves. Similar relationships in stock markets, immune systems, and states cause them to display qualitatively similar emergent behavior, despite the dissimilarity of their constituent parts. These similarities have proven true in real world applications.[3] The common characteristics of interest here are nonlinearity, sensitive dependence on initial conditions, catastrophe theory, and lateral relationships. There are enough characteristics that lead to unpredictability to fill an entire book, but these are some of the most accessible and applicable to the operational level of war.
Nonlinear systems, especially those driven by multiple nonlinear behaviors, are difficult to reverse engineer from periodic data sampling. Because of this, analysis of real world systems rarely creates effective models of nonlinear environments, making predictive analysis very difficult. In a linear model, if military personnel’s sampling of a village showed a steady increase in satisfaction with law enforcement as the number of law enforcement personnel increased, it would follow that, until a point of diminishing returns, increasing the number of law enforcement personnel would continue to increase satisfaction. But that is a linear interpretation. In a nonlinear system, continuing to gradually increase the number of law enforcement personnel could have any number of effects, from communal violence to complete peace and happiness.
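To make the pitfall concrete, consider a deliberately invented response curve for the village example. Everything in the sketch below is hypothetical, including the function, the numbers, and the sampling points; it shows only how cleanly a linear fit can match early samples and then fail completely outside them.

```python
# Hypothetical illustration: satisfaction responds nonlinearly to police
# presence (steady gains until presence feels like occupation, then a sharp
# reversal), but the early samples look perfectly linear.

def satisfaction(police):
    return 2 * police if police <= 30 else 60 - 5 * (police - 30)

samples = [(p, satisfaction(p)) for p in (5, 10, 15, 20)]  # early surveys
(p0, s0), (p1, s1) = samples[0], samples[-1]
slope = (s1 - s0) / (p1 - p0)  # linear trend fitted to the early samples

for p in (25, 35, 45):
    forecast = s0 + slope * (p - p0)
    print(f"police={p:2d}  linear forecast={forecast:5.1f}  actual={satisfaction(p):5.1f}")
```

Run it, and the linear forecast is exact at 25 officers, badly wrong at 35, and predicts rising satisfaction at 45 while the invented system has collapsed into open hostility.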
Sensitive dependence on initial conditions, popularly known as the butterfly effect, means that small changes in a single variable can have a counterintuitively large impact over time. Poincaré speculated about this as early as 1890.[4] Edward Lorenz brought the concept into the computer age when he re-entered a rounded data point into a program that used multiple nonlinear equations to simulate weather. The rounding, equivalent to the wind generated by a butterfly flapping its wings, drove the simulated weather to a completely different outcome.[5] Observers never precisely know a system’s conditions; they can only estimate them. Combined with sensitive dependence on initial conditions, this means that nonlinear systems have inherently unpredictable long term trends, and even small choices can have unpredictable long-term consequences.
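The effect is easy to reproduce without a weather model. The sketch below uses the logistic map, a standard one-line chaotic system (not Lorenz’s equations, but it exhibits the same sensitivity): two trajectories that begin one billionth apart track each other for a few dozen steps and then decorrelate entirely.

```python
# Two runs of the logistic map x -> r*x*(1-x) with r = 4, a standard chaotic
# regime. The initial conditions differ by one part in a billion.

def logistic(x, r=4.0):
    return r * x * (1 - x)

a, b = 0.400000000, 0.400000001
for step in range(60):
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.2e}")
    a, b = logistic(a), logistic(b)
```

By roughly step 30 the gap has grown from a billionth to order one, and the two trajectories are no more alike than two random numbers; no refinement of measurement eliminates the problem, it only delays it.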
Catastrophe theory is also relevant. Beginning as a subset of topology, catastrophe theory shows that one or more of a system’s input variables can change steadily while the output holds a steady state, then reach a threshold where the output rapidly shifts to a dramatically different steady state.[6] Take a ball rolling across the surface of a table. Changes in the ball’s location on the table do not affect its height from the floor until it reaches the edge of the table. Once it reaches the edge, a slight change in location causes it to fall to the floor, establishing a new steady state at floor level. While this seems both obvious and unimportant on a table, it becomes more significant when used to explain the interaction between less intuitive variables. Suppose that in a village, a minority population gradually moves to another location outside the village, homogenizing the population, while the local economy slowly improves due to an increase in government spending. As both variables slowly change, the level of insurgent violence suddenly drops significantly. Linear thinking would indicate that a significant change in another variable caused the drop in insurgent violence, such as a new alliance between militias and government forces, or a large increase in the size of coalition forces in the area. While catastrophe theory does not rule this out, it also raises the possibility that gradual or even unnoticed changes in other variables caused the sudden drop in violence.
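The table edge is an instance of what mathematicians call a fold catastrophe, and a few lines of code can reproduce it. In the illustrative sketch below (the system and all numbers are invented for the demonstration), the state x continuously relaxes to an equilibrium of dx/dt = a + x - x³ while the control parameter a creeps upward. x barely moves for most of the sweep, then jumps to a distant new steady state when a crosses the fold point near 0.385.

```python
# Fold catastrophe sketch: the state relaxes to a stable equilibrium of
# dx/dt = a + x - x**3 while the control parameter a increases slowly.

def relax(x, a, steps=2000, dt=0.01):
    """Integrate the fast dynamics until x settles near an equilibrium."""
    for _ in range(steps):
        x += dt * (a + x - x**3)
    return x

x = -1.0                                  # start on the lower branch
for i in range(23):                       # a sweeps from -0.50 to +0.60
    a = -0.5 + 0.05 * i
    x = relax(x, a)
    print(f"a={a:+.2f}  x={x:+.3f}")
```

The printout shows x drifting gently from about -1.2 to -0.6 over most of the sweep, then leaping to about +1.2 in a single step: a smooth cause, a discontinuous effect.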
Lateral relationships between actors are important as well. Many models track the flow of information, logistics, and other variables from input to output. While the relationships found in this flow are a necessary part of understanding a system’s dynamics, they are not sufficient if the lateral relationships are excluded. When counterinsurgents interact with a village leader, they observe their own input and the eventual output, the leader’s behavior. The lateral relationships affecting the village leader are far less visible, but at least as important to him. His actions affect other villages, the relationship between his village and others, and his relationship with his competitors for power. If the system’s scale is large enough, lateral relationships expand to include everyone in an entire city or region. Unfortunately for the counterinsurgent, the number of lateral relationships grows explosively with the number of actors in the system (pairwise ties grow quadratically, and possible coalitions exponentially), and they are difficult to perceive as an outsider.
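The arithmetic behind that growth is easy to check. The snippet below is a quick illustrative count, assuming relationships can form between any actors: pairwise ties number n(n-1)/2, while possible multi-actor groupings, every subset of two or more actors, number 2^n - n - 1.

```python
from math import comb

# Pairwise ties grow quadratically; possible coalitions grow exponentially.
for n in (5, 10, 20, 40):
    pairs = comb(n, 2)              # n * (n - 1) / 2 pairwise relationships
    coalitions = 2**n - n - 1       # every subset of two or more actors
    print(f"{n:2d} actors: {pairs:4d} pairwise ties, {coalitions:,} possible coalitions")
```

Forty actors, roughly one extended family network in a single village, already yield 780 pairwise ties and over a trillion possible groupings, none of which an outsider can enumerate.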
The effects of these four characteristics are ubiquitous throughout complex adaptive systems, and therefore throughout states fighting counterinsurgencies. Lateral relationships make understanding human terrain extremely difficult, increasing uncertainty. Nonlinearity makes predictive analysis problematic at best. Catastrophe theory makes determining causality difficult. Sensitive dependence on initial conditions means that every action can have large consequences, and large consequences can have small causes.
How to Thrive in an Unpredictable Environment
When operating in an unpredictable environment, two possible responses emerge. The first is to reduce uncertainty. The American military’s current attempt at this approach began in the 1970s, in response to the uncertainty found in Vietnam and as part of the second offset strategy. Led by Lieutenant General James Clapper, the effort to establish information dominance leveraged digital capabilities to create real time updates for combat forces.[7] Information dominance failed, however, when confronted with insurgencies in Iraq and Afghanistan. The approach did not map human terrain, closely examine cause and effect, or address the inherent unpredictability of nonlinear warfare. It did create a force reliant on information superiority, and then placed that force in an environment where enemy insurgents often enjoyed information superiority or even dominance. That is not to say that information dominance does not have a valuable place in some forms of warfare, or even in counterinsurgencies when combined with an understanding of complex adaptive systems.
For the second approach, leaders must embrace the unpredictability of their environment and mold a force that prioritizes adaptability and responsiveness over information superiority. Adaptability is the ability to change perspectives, plans, and courses of action, and it shapes responsiveness, the speed at which individuals or organizations react to their environment. A rapidly adaptable and responsive organization can prevail by relying on its ability to recognize and solve new problems quickly, and by reducing rigid planning that does not account for ever-changing circumstances. To do so, leaders must encourage their subordinates to examine their environment, challenge their assumptions, and adapt quickly to the necessary course of action.
While coalition forces in Iraq and Afghanistan took a gigantic leap toward closely examining their environment by mixing troops with the local population, using unmanned aerial vehicles, increasing language training, and pushing intelligence assets down to the company level, much of the environment remains opaque to these assets. Instead of merely examining system outputs, occupying forces need to delve deeper, understand relationships, and determine causality, not just sequences of events. Only then can they begin to influence their environment effectively.
Leaders should also foster a culture that encourages subordinates to challenge assumptions. Doctrine already codifies questioning assumptions in the military decision making process, but more is required: planners need to understand their assumptions. Instead of merely challenging the assumption that a set of bridges will be intact, allowing friendly tanks to cross them, planners should be comfortable challenging the assumption that the enemy will be on the other side of the bridge, that an armored formation is the best way to fight the enemy, or that the enemy should be fought at all.
These ideas deserve two words of caution. The first is the danger of frequent, rapid fluctuations in campaign planning. The classic system delays-inventory experiment demonstrates that rapid responses intended to create a steady state can actually generate large, unwanted oscillations.[8] While adaptability and responsiveness are crucial, they should give far-thinking leaders the tools to understand their environment, not encourage responses to random noise, an inability to hold a course, or a fear of commitment. The second is that while leaders cannot base their actions on a firm belief in the outcome of those actions, inaction is just as much of a choice, with consequences just as real.
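The delays-inventory dynamic is easy to caricature in code. The toy model below is a minimal sketch with invented numbers, in the spirit of the experiment Meadows describes rather than its exact setup: a manager holds inventory against a target but sees reports three periods late and misses a quiet shift in demand. A gentle correction settles; an aggressive one produces exactly the large, unwanted oscillations described above.

```python
# Toy delays-inventory model: the manager targets 100 units, sees inventory
# reports three periods late, and assumes demand is still 10 after it has
# quietly risen to 15. All numbers are illustrative.

def simulate(aggressiveness, periods=40, delay=3, target=100):
    inventory = 100.0
    perceived = [100.0] * delay            # stale inventory reports in transit
    trace = []
    for t in range(periods):
        demand = 10 if t < 5 else 15       # unnoticed shift in demand
        orders = 10 + aggressiveness * (target - perceived[0])
        inventory = max(0.0, inventory + orders - demand)
        perceived = perceived[1:] + [inventory]
        trace.append(round(inventory))
    return trace

for k in (0.2, 1.0):
    print(f"response strength {k}: {simulate(k)}")
```

With a response strength of 0.2 the inventory glides to a new, slightly low steady state; at 1.0 the same rule, reacting hard to delayed data, whipsaws between glut and shortage without ever settling.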
Uncertainty and unpredictability are intrinsic parts of the nature of war. They have challenged America’s military in every war until now, and will continue to challenge it in the next, whether it is a repeat of the first Gulf War or a counterinsurgency in a jungle. The challenge is by no means insurmountable. If leaders approach it with an open mind, develop a model that accounts for the nature of unpredictability, and rely on adaptability and responsiveness, the United States has the ability to succeed.
Captain Justin Lynch graduated from the United States Military Academy with a B.S. in Military History and commissioned in the Army. He has served as a platoon leader in Afghanistan, a company executive officer in Iraq, an assistant operations officer, and a company commander, and is currently the training officer at the Northern Warfare Training Center. He has written for Infantry, Cicero Magazine, Small Wars Journal, War Council, and War on the Rocks. He is a member of the Military Writers Guild.
[1]. Michael Baranger, “Chaos, Complexity, and Entropy,” New England Complex Systems Institute, 2001, 9-10.
[2]. Self-organizing book referencing termites.
[3]. Donella H. Meadows and Diana Wright, Thinking in Systems (White River Junction: Sustainability Institute, 2008), 11-17.
[4]. Henri Poincaré, The Value of Science: Essential Writings of Henri Poincaré (New York: Random House, 2001), 416-418.
[5]. James Gleick, Chaos: Making a New Science (New York: Penguin, 1987), 11-18.
[6]. Martin Golubitsky, “An Introduction to Catastrophe Theory and its Applications,” SIAM Review, 1978, 352-357.
[7]. Robert Tomes, US Defense Strategy from Vietnam to Operation Iraqi Freedom (Oxon: Routledge, 2007), 84.
[8]. Donella H. Meadows and Diana Wright, Thinking in Systems (White River Junction: Sustainability Institute, 2008), 52-57.