Today’s American military is, arguably, the most tactically adept fighting force in the world—perhaps of all time. It is, without question, the best-resourced military in human history. Our technological advantage is unprecedented, as cutting-edge hardware and software platforms deliver extraordinary capabilities in areas ranging from SIGINT to targeting to command and control. Taken together, the United States wields a tactically, financially, and technologically superior warfighting machine.
Why do we have so little to show for it? What accounts for our twenty-first-century inability to translate tactical excellence, technological dominance, and near-boundless resources into durable strategic outcomes in our post-9/11 “long war”?
There is a long list of potential scapegoats. Misguided political leadership. Imperial over-reach. The proliferation of complex asymmetric threats. Hyper-partisan domestic politics. The horizon-lowering influence of the twenty-four-hour news cycle.
The list goes on.
However, when confronting our recent strategic woes—from Afghanistan to Iraq and beyond—the defense establishment must step forward to shoulder its share of the burden. We are present en masse from the front lines of conflict to the inner corridors of power. We have opportunities to shape debates and affect outcomes at every level. A problem of this sort, and of this magnitude, is one that we must own and confront within our ranks.
The Intelligence Cycle Is Broken
A key part of the answer lies in what’s known as the “intelligence cycle”—the process through which we investigate, analyze, and decide how to act upon the world around us. This process has developed systemic, structural flaws. The platforms through which we gather information, and the processes and mediums through which we conduct analysis, are not adequately capturing ground truth. We are not integrating a granular, nuanced understanding of locality—and the potential strategic implications thereof—into the intellectual foundations of our strategic thinking. This failure has corrupted our ability to root strategy in the realities of the battlefield. Instead, when strategic decisions are made (by men and women who are, inevitably, both physically and psychologically isolated from the front lines), debate takes place in a virtual reality that has been constructed by the intelligence cycle—and that may bear only a passing resemblance to the facts on the ground.
A confluence of factors has led us to this point:
1. The Ascendance of Technology and the Primacy of Targeting
Technological innovation has revolutionized tactical intelligence in the twenty-first century. Most dramatically, technology has driven extraordinary advances in our targeting capabilities. The global reach of our targeting platforms is unprecedented and unmatched. We are able to track the enemy with a diverse and sophisticated suite of tools, fixing him in time and space, so that we might bring our exceptional lethality to bear. However, a growing focus on targeting has drawn front-line attention away from deeper strategic concerns. This, in turn, has affected the inputs that we feed into the intelligence cycle in subtly pernicious ways.
Appreciating this dynamic is critical to understanding why our targeting excellence has not delivered comparable success at the strategic level. It is a root cause of why we appear to be engaged in a never-ending game of whack-a-mole with our enemies.
Our view of the fight at the tactical level, from counterterrorism operations in the Sahel to counterinsurgency in Afghanistan, is structured around our view of the enemy. Put another way, the operational networks of our enemies are the framework through which we see the battlefield, and the targeting process is our lens.
In and of itself, this is perfectly natural. Why shouldn’t front-line military units take an enemy-centric view of the battlefield?
The problem is twofold:
First, our current approach to network targeting takes an extremely limited view of the enemy. Our men and women on the ground are zeroed in on tactical intelligence about the enemy—the who and where. Questions of strategic intelligence—the why—are marginalized.
Technology has been a driving force behind this phenomenon. With the advent of dynamic targeting software platforms, our actions at the tactical level now center on feeding inputs into the technological tools that underpin the targeting cycle. We conceptualize the enemy in link analysis charts, and we strive to “connect the dots” and generate actionable intelligence.
The resulting reach and specificity of our knowledge are extraordinary. We are able to map out shadowy global networks with speed and precision. Yet this knowledge lacks depth and substance. Link diagrams may be geo-located, thus ostensibly connecting the enemy to locality, but our reporting processes provide little incentive to root our understanding of the enemy in meaningful local context. We excel at connecting the dots and mapping the network, but our view of the enemy is two-dimensional, as the sketch at the end of this section illustrates.
Second, this enemy-focused lens has been transposed to the strategic level, with disastrous consequences. Our two-dimensional, network-centric view of the enemy is a limitation at the tactical level, where it inhibits our ability to anticipate second- and third-order effects. It is a catastrophe at the strategic level, where it frames our worldview. Our tactical-level representation of the enemy as a “Palantir Bonsai Tree” (rather than as an organic outgrowth of local society) has become the intellectual framework for strategic decision-making. No wonder, then, that our efforts to prune limbs prompt new growth in unanticipated directions.
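To make the two-dimensionality described in the first point concrete, consider the sketch below: a deliberately toy rendering, in Python, of the kind of record a geo-located link diagram reduces to. Every class, field, and value here is invented for illustration; the point is not this particular schema, but what any schema of this kind has room for and what it leaves out.

```python
# A minimal, invented illustration of a geo-located link diagram's contents.
# All names, fields, and values are hypothetical.
from dataclasses import dataclass


@dataclass
class Node:
    """A 'dot' on the chart: identity and location, nothing more."""
    node_id: str
    role: str      # e.g., "facilitator", "courier"
    lat: float
    lon: float


@dataclass
class Link:
    """An observed association between two dots."""
    source: str
    target: str
    basis: str     # e.g., "phone contact", "co-location"


network = {
    "nodes": [
        Node("N-001", "facilitator", 34.52, 69.18),
        Node("N-002", "courier", 34.55, 69.21),
    ],
    "links": [
        Link("N-001", "N-002", "phone contact"),
    ],
}

# The who and the where are captured with precision. But there is no field
# for the why: no tribal or family ties, no economic role, no grievance, no
# standing in the community. The schema itself enforces the two dimensions.
```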
2. The Fetishization of Data
As front-line intelligence analysts have become consumed within the targeting process, strategic decision-makers have doubled down on technology and data as a means to understand the battlefield. With fewer and fewer substantive, qualitative inputs into the intelligence cycle, we have compensated by harvesting ever-larger quantities of data.
Dramatic breakthroughs in the fields of big data, predictive analytics, and artificial intelligence are pulling us further and further toward a “data-driven” understanding of the world. We break the battlefield down into measurable parts. Those parts are then measured by a diverse array of sensors. From that baseline, we establish metrics that can be objectively assessed over time, in a process that can, increasingly, be automated.
The intellectual appeal of this approach is obvious. The defense establishment, like any large organization engaged in complex operations in dynamic environments, likes quantitative data. It is clean. Objective. Unambiguous. Scalable. It enables the clear measurement of progress, and the indisputable demonstration of results.
The problem, however, is that quantitative data is reductive. On the battlefield, we can quantify and measure an extraordinary range of things, from incidences of violence to the price of bread to the movement of displaced people. Yet once we quantify something, stripping away its contextual meaning and turning it into a data point, it loses all of its explanatory power. A quantitative data set cannot tell us anything about the significance of changing rates of violence, price fluctuations, or patterns of migration. Is an uptick in violence the result of the enemy’s growing strength? Is it tied to a rogue commander who has broken with the enemy’s central leadership? Is it the final death throes of an insurgent movement that has lost local support? The data cannot tell us. Interpretation requires deep, localized, contextual understanding. Yet this sort of information is not being adequately captured by the intelligence cycle at the tactical level—and, critically, only tactical-level personnel have the firsthand access to ground truth that is essential to acquire this information.
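The point can be made concrete with a trivial, invented example, sketched below in Python. The numbers are illustrative placeholders, not real reporting; what matters is that the computation runs happily to completion while the question that actually matters never enters the code at all.

```python
# An invented, purely illustrative series of weekly violent-incident counts.
weekly_incidents = [12, 11, 14, 13, 19, 24, 31]

# Any metrics pipeline can detect and report the trend...
change = weekly_incidents[-1] - weekly_incidents[0]
pct_change = 100 * change / weekly_incidents[0]
print(f"Violent incidents up {change} ({pct_change:.0f}%) over seven weeks")

# ...but the same numbers are equally consistent with each of these
# hypotheses, and nothing in the data itself can arbitrate between them:
competing_explanations = [
    "the enemy is growing stronger",
    "a rogue commander has broken with the central leadership",
    "a collapsing insurgency is lashing out as local support evaporates",
]
```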
As such, our strategic reliance on quantitative data to compensate for a paucity of substantive, qualitative understanding is dangerously misguided. We are asking data sets to explain what is happening on the battlefield—but the data itself has nothing to say. Strategic-level planners and policymakers are being fed vast quantities of de-contextualized data points, to which they (or those around them) are compelled to ascribe meaning.
The rapidly growing size of our data streams is particularly dangerous in this respect. The very bigness of our data imparts an illusion of understanding. If we have terabytes of data, after all, surely we must know what we are looking at? But data sets, no matter their size, can never answer the question “why?” Indeed, context-free data sets can be structured to say virtually anything—and this is where our shift toward quant goes from being an analytical limitation to a terrifying strategic liability. In a highly politicized environment where no one in the room has an intuitive feel for ground truth, and where qualitative context is largely absent from the intelligence cycle, the field is left open for mistaken assumptions, political manipulation, and old-fashioned careerist bullshitting.
3. The Cult of the Operator
Our preoccupation with targeting, and our growing reliance on technology-driven quantitative analysis, have proceeded hand in hand with a shift in tactical-level organizational culture. Following the example set by our most elite units, the men and women on the ground have embraced the ideal of “the operator.” It is now the archetype of professional competency. Tactical lethality is championed, at the inevitable, albeit unspoken, expense of sophistication and strategic effect.
Fueled by our targeting prowess, this shift has fed a growing anti-intellectualism. On the surface, this is not an unexpected development. Killing the enemy is the core business of the military. The targeting process is a natural lens through which tactical elements of the military should view the world. It is logical that our most prolific kinetic targeting capabilities will command respect and admiration.
However, if no one at the tactical level is looking beyond the immediate demands of the targeting process to collect substantive and meaningful contextual detail on the enemy, then that information will never enter the intelligence cycle. Instead, it will be left to others (who lack direct access) to invent narratives that ascribe meaning to our network targeting packages and quantitative data sets.
No one, except our front-line personnel, has access to ground truth. Yet with few exceptions, front-line intelligence analysts are not asked to think strategically, to wrestle with ambiguity, or to make big-picture sense of local-level uncertainty. High-level strategy is something that is done by others, elsewhere. “Every Soldier is a Sensor,” goes the mantra, but we have calibrated our sensors to maximize the uptake of reductive, quantitative data points, while we indulge the false humility that tactical-level personnel “are just grunts” and thus beneath the plane of strategic thinking.
A Wealth of Data, A Poverty of Insight
The status quo is a recipe for yet more unproductive tactical excellence. The prevailing currents in the defense sector, meanwhile, are pulling us further toward the extremes noted above. Technology is ascendant. Big data, algorithmic processing, predictive analytics, machine learning, and artificial intelligence are the buzzwords of the day among our best and brightest. The zeitgeist tells us that the future lies with the large-scale quantification of the world around us, and we are following corporate America’s lead toward the technologization of everything.*
Total information awareness is a realistic objective, the technologists tell us, because of impending advances in the industry. Our ability to leverage technology toward tactical objectives is already the driving force behind how we operate on the battlefield. Looking ahead, the ability of technology to convey strategic understanding is central to our thinking about war and intelligence in the twenty-first century as well. Future intelligence analysis, according to this vision, will be grounded in automated data collection and analysis platforms that deliver both tactical acuity and strategic clarity, harvesting and processing unfathomable quantities of data from sources as diverse as social media platforms, classified reporting databases, and weather satellites.
The American military will continue to pursue technology-driven solutions. On the one hand, this pursuit will enable us to get better and better at targeting. All organizations like to focus on their strengths. Yet there is a looming danger that our tech-enabled excellence at the point of execution will be held up as validation that technology is the path to success at all levels, and that technology can deliver strategic results as well. On the other hand, technology will deliver an increasingly compelling illusion of situational understanding. As we are able to harvest more and more data, and then to process and visualize that data in ever-more dynamic ways, it will become increasingly tempting to believe that we simply must know what is happening—that the fundamental reductiveness and explanatory impotence of quantitative data has been transcended by sheer volume.
Yet the fact that we have achieved so little while pursuing this track to date should give us pause. The allure of a high-tech, plug-in solution to understanding the world must be tempered by an appreciation of what technology and quantitative data can and cannot do.
What if our wholesale embrace of technology—as the lens through which we see the battlefield, the brain that processes its dynamics, and the central nervous system that guides and shapes our actions thereupon—is a root cause of our recent strategic malaise? What if a complex human environment, packed with layers of historical, cultural, and social meaning, and inextricably intertwined with political and economic systems, cannot be broken down into patterns of ones and zeroes and then reconstructed in any remotely meaningful way? What if, instead, it presents us with increasingly complex patterns of correlation that we are increasingly ill equipped to contextualize?
How Can We Fix the System?
First, let us be clear about what we should not do. Of course, we should not (and could not) disavow technology. This is not a call for a return to an idealized past, where human intelligence reporting reigned supreme, and long-form narrative was the standard medium. Nor is it proposing that we inject reams of speculative, unstructured, qualitative text into the intelligence cycle, or that we flood the intelligence community with academics.
What we must do, however, is re-structure the intelligence cycle so that contextual detail is fed into the military’s central nervous system. It is not a question of “cultural awareness.” It is not an abstract, open-ended inquiry into “local context.” It is a question of adding an essential layer of depth and meaning to what has become a two-dimensional targeting process that is, in turn, driving an increasingly reductive and de-contextualized intelligence cycle.
Our current embrace of network-centric targeting must be expanded to incorporate economic, social, and political context. We must force qualitative, granular detail into the intelligence cycle at the tactical level in a structured, methodologically consistent fashion. We must develop a three-dimensional view of the enemy from the bottom up, one that captures a network’s connectivity to its local environment. From this knowledge, we can attack not only the nodes and linkages within the enemy’s network, but also the network’s linkages to locality.
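One minimal, hypothetical way to picture this, offered again in illustrative Python rather than as a prescription, is to give the same kind of targeting record a structured slot for its local context. The field names below are invented; the argument is for structure and consistency, not for this particular schema.

```python
# A hypothetical sketch: the targeting "dot," plus a structured slot for the
# qualitative, local context that currently has nowhere to live. All field
# names are invented for illustration.
from dataclasses import dataclass


@dataclass
class LocalContext:
    """Structured qualitative detail, authored by front-line collectors."""
    social_ties: str          # e.g., tribal, familial, or patronage links
    economic_role: str        # e.g., "controls the district fuel trade"
    political_grievance: str  # stated or assessed motivation
    community_standing: str   # assessed legitimacy among the local population
    collected_by: str         # the front-line element that reported it


@dataclass
class ContextualNode:
    """The network node, plus its connectivity to locality."""
    node_id: str
    role: str
    lat: float
    lon: float
    context: LocalContext     # the third dimension
```

The particular fields are beside the point. What matters is that contextual detail enters the cycle in a structured, consistent form, authored by the people with firsthand access to ground truth, and that it travels upward alongside the targeting data rather than being reconstructed after the fact by rear-echelon analysts.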
Technology and data have vital roles to play in this process, but they cannot deliver the needed insights by themselves. This will require contextual inputs and structural direction from a cadre of highly skilled front-line personnel who can leverage technology as a force multiplier of human expertise. Elements of this capability will be tech-centric, as we continue to reap the extraordinary tactical advantages provided by technological innovation. Yet our work must be executed with an ethnographer’s ear for meaning, and a historian’s eye for context—and situated in an organizational culture where the collectors, producers, and consumers of intelligence possess a shared understanding of the limitations of quantitative data.
This sounds academic. It may even sound pretentious. But in fact, the way ahead is straightforward. Academics have cloaked the skill sets of academia (and of the social sciences in particular) in deliberately complex language, presenting them as things that require uniquely academic expertise. Academia, as a whole, has charted a course toward obscurity, celebrating its “cult of irrelevance.” Yet academics are not priests. It is not their place to reveal or withhold sacred wisdom. The methods and the literature are open and available to us all, and both have much to offer in regard to our current challenges. There is nothing to prevent us from demystifying and utilizing academic skills for ourselves, and integrating academic methods into front-line intelligence collection and analysis.
The integration of structured, qualitative detail into the intelligence cycle is an essential step in the rehabilitation of our strategic thinking. Macro-level strategic debates cannot be allowed to proceed without connectivity to micro-level detail. Our leadership must be forced—by the nature of the inputs that we direct into the intelligence cycle—to engage with contextual nuance. The facts on the ground must be integral to strategic debate, and they must be provided by the men and women on the front lines (rather than projected over data sets by rear-echelon analysts). Technology alone cannot supply the insights that we need to make the right choices, at the right moment, for the right reasons.
* For a discussion of the downside of corporate America’s infatuation with technology-driven solutions, see Sensemaking: The Power of the Humanities in the Age of the Algorithm, by Christian Madsbjerg (Hachette Books, 2017). Madsbjerg is a Danish strategy consultant and an advocate of the value of human intuition. The book offers a range of highly relevant lessons for the defense sector, related to the limitations of quantitative data and the ways in which technology affects our view of the world around us.