Editor’s note: This article is part of a series, “Full-Spectrum: Capabilities and Authorities in Cyber and the Information Environment.” The series endeavors to present expert commentary on diverse issues surrounding US competition with peer and near-peer competitors in the cyber and information spaces. Read all articles in the series here.

Special thanks to series editors Capt. Maggie Smith, PhD, of the Army Cyber Institute and MWI fellow Dr. Barnett S. Koven.

“It is in the cognitive space where we must prevail.”

— General Richard Clarke, commander, United States Special Operations Command, May 20, 2021

The current focus on influence and information campaigns as tools of great power competition raises questions about how well prepared the United States is to compete in “the human domain.” The term, coined in 2010 by retired Lieutenant General Charles Cleveland, then the commander of United States Special Operations Command – Central, unsurprisingly emphasizes humans—their beliefs, their networks, their values—as a center of gravity in twenty-first-century nation-state competition. While competition for cognitive and social influence is as old as the history of military thought, the human domain has received renewed attention from scholars, policymakers, and operational communities. Despite that renewed interest, however, neither the Department of Defense nor the interagency is postured or organized for success in the human domain and the information environment. This article, therefore, offers a number of recommendations to enable DoD and the interagency to compete more effectively in multi-domain competition by focusing on the human domain.

The US military and interagency understand how to compete and dominate in the air, land, sea, cyber, and space domains, but are less clear on how to compete in the human domain. In part, the problem is rooted in institutional culture—particularly within DoD. The department has well-developed intuitions for understanding kinetic and material effects through repeated training and education, especially for those effects that are governed by physics. Even nonspecialists can easily grasp how direct fires can reduce a bridge; they can estimate roughly how far out in front of artillery they can push ground forces and still maintain fire support; and they know how quickly an armored formation can advance across different terrain and conditions.

Further, where DoD and the US government must compete in the human domain without sacrificing key values or crossing ethical lines, our adversaries often show no such constraint, as evidenced by the growing number of examples of their efforts to destabilize and paralyze free societies. Small wonder, then, that the human domain is attracting increasing adversary attention and gaining importance within the context of great power competition. Over the last decade, Russia and China in particular have recognized that the United States has neither an information nor a human domain grand strategy and has struggled to adapt to this domain. Because military force is most consistently discussed in hard-power terms, DoD has been slow to adapt its training to soft-power tactics and requirements. Kinetic power has not become irrelevant, but adversaries are increasingly engaging US forces and the American public directly via soft-power tactics, intending to sow discord and undermine democracy. Both Russia and China have adapted to this new environment faster than the United States. There is extensive writing on the Russian concept of information war, and in 2013 Russia created the organizational structure to implement these concepts. The Chinese have also written extensively on information war and have likewise created supportive organizational structures. For its part, the United States has continued to grapple with these concepts, challenged by organizational structures optimized for national security before the rise of information technologies. (It is worth noting, however, that the US private sector—through investment and advancements in advertising technologies, marketing, branding, and reputation management—is ahead of the US government in this area.) Indeed, it was only in 2016 that serious concern over the offensive and defensive components of the human domain emerged in the US government.

Below are a series of recommended actions to improve the US government’s human domain capabilities and prowess in the context of great power competition.

Leadership and Accountability

For too long, information oversight—as a function—has not been adequately resourced or prioritized. To drive evolution and innovation within the national security apparatus and to compete with our adversaries in the human domain, information-oversight management positions must be prioritized as mission-critical assignments to ensure the right personnel are in place to face the formidable challenges inherent to the domain. Namely, the rapid turnover of information technologies, the dynamic information environment itself, and our adversaries’ real-time access to global audiences—to include the American public—all present information-oversight challenges that require steadfast and accountable leadership. Additionally, the power of disinformation, fueled by artificial intelligence (AI) and emerging technologies, presents a broad threat to our national security. It is essential, therefore, that senior leaders be able to represent requirements, champion solutions, and negotiate defense contributions to the interagency and our international partners, all while overseeing, harnessing, and guiding a diffuse effort.

Better Integration of Social and Behavioral Sciences Across the Defense Enterprise

From policy to training to research and development, the social sciences are integral to DoD missions and have been increasingly leveraged in the twenty years since the attacks of September 11, 2001. But the scope and scale of social science utilization across the defense enterprise is minuscule compared to the application of the natural sciences. As R. K. Merton observed, “between twentieth-century physics and twentieth-century sociology stand billions of man-hours of sustained, disciplined, and cumulative research.” Partly responsible is the historic military focus on building and employing kinetic weapons on a physical battlefield rather than on employing the social sciences in a military or national security context. Instead of trying to understand the human domain by reusing frameworks designed for planning and measuring effects that rely on physical and causal relationships, new frameworks should be developed that account for, and appreciate, the nuance and uncertainty inherent in human interaction.

Because any measure of human cognitive activity is imperfect—no system currently exists that can fully map the dynamics of human behavior—the defense enterprise should stop trying to assess the human domain in concrete terms and become more comfortable with imperfection and uncertainty. The numerous attempts to map social networks and influence in Iraq, Afghanistan, and other complex operating environments reinforce the observation that kinetic tools and interpretations do not adequately capture the human domain. Retired Lieutenant General Ed Cardon recommends a different approach, emphasizing that, when considering the human domain, we try to be too perfect, the drive for perfection slows us down, and we are often wrong anyway, especially in the beginning. The approach to influence operations should instead aim for roughly 50 percent accuracy and then adjust as we go, adopting an act-sense-decide-adapt (ASDA) mentality to complement the more common observe-orient-decide-act (OODA) paradigm.

Human Feedback Loops are Essential

Understanding whether and when it might be better to use an influence network versus dropping twenty thousand leaflets is a wicked problem, and social science is not going to provide a 100 percent solution to it or similar problems. But the social sciences can help make better, timelier decisions that embrace some uncertainty instead of erroneously assuming that a single, correct solution to every problem exists. The human domain requires a mentality that accepts the risks of learning where and when we are wrong and then quickly adjusting to new information, instead of waiting for an impossible capability that will ensure we are never wrong. Leaders should be empowered to accept high levels of risk when operating in the human domain to prevent analysis paralysis and inaction.

Change the Way We Train and Exercise for the Human Domain

The military intuitions discussed above are shaped and sharpened by the extensive education and training our personnel receive in their specializations and in the military arts and sciences. However, unless the social and behavioral sciences are integrated more fully into military education, those working in the human domain are forced to rely on on-the-job learning alone. Some of the best information and influence operators have had to develop their skills over years of trial and error, relying on instinct and anecdotal evidence. Formalized training and education are widely needed and cannot remain limited to the small cadre of information professionals. A broader approach to educating the force on the human domain is necessary to generate the institutional change required to overcome antiquated battlefield training scenarios, an unwillingness to accept anything but small amounts of risk—even in the information environment—and the notion that domains of warfare are distinct and isolated. Integrating capabilities to achieve desired outcomes requires that all members of the force fully understand how the human domain permeates and influences all other domains.

Additionally, training exercises should regularly incorporate influence and information operations into their scenarios. Most large-scale wargames and training exercises that include influence or information operations do so in name only. The operations themselves are “white carded,” meaning the training is not part of a dynamic scenario that trainees have to counter, but is instead a binary event (e.g., the adversary operation was either effective or ineffective) decided at the discretion of the adjudicator without any reference to testing or data. Not only does the static white-card approach to testing units on their ability to counter an adversary’s information or influence operations diminish the perceived threat in the human domain, but it also fails to force commanders and leaders to respond to a complex and nonlinear threat or to identify their own formations’ vulnerabilities in the human domain. Additionally, few opportunities are available for units to test messaging concepts or information operations capabilities. Rehearsals and training events do not force commanders to account for the time it takes a message or any psychological operation to enter and populate enemy communications and thinking—and even less consideration is given to the civilian population (friendly or enemy), how long it will take to get approval to act, or whether the messages will be effective. Estimates like these are outside the normal intuitions of most serving decision-makers because the answers are more contingent and less certain than for kinetic operations. Because human social behaviors are complex, there is an enduring need for models and simulations to help the force better understand, train for, and operate in the human domain. Significant research and development advances have combined social science and computing, and should generate optimism about future tools for creating realistic and dynamic training, but first the social and behavioral sciences must be reincorporated into the entire defense enterprise.

Learn to Leverage the Space

History suggests that building a network of networks is the most effective way to influence the human domain—in other words, a short, targeted influence operation is unlikely to have a long-lasting impact on a community. Instead, as retired General Stanley McChrystal observed, it takes a network to defeat a network—and US adversaries have been strategically building disinformation networks in spaces of strategic interest to influence the human domain. To shift the power balance in the information space, the United States should involve stakeholders outside the defense enterprise, to include organizations external to the US government. Since adversary efforts to influence the human domain target social divisions and undermine democratic institutions, community organizations and civil society leaders who hold the public’s trust need to be included. Additionally, relationships with the private sector, international partners, and influential nongovernmental organizations are crucial because their interests often intersect with issues of global and national security. Messaging campaigns to discredit disinformation sources should be coordinated across sectors and identify crosscutting themes and content that appeal to a wider, global audience. Ultimately, a large, inclusive, and global network is required to defeat the adversarial networks of disinformation targeting the human domain.

Take a Hard Look at Authorities

The consequences of ignoring the human domain are more than theoretical. The defense enterprise is predisposed to consider kinetic solutions to global security issues because the authorities to act are clear, the operational risks are well understood, and the planning process is defined. Operations in the human domain are more complicated: the authorities are ambiguous, the risks are uncertain, and the planning process is dynamic and highly contingent. Absent a deeper understanding of the “human stuff,” commanders are understandably biased toward choosing a kinetic option because they inherently possess authority over kinetic capabilities and often lack the authority, or permissions to act, in the information space. The defense community has tremendous potential to complement broader US government security and stability goals related to information and influence. Balancing the authorities and associated permissions to bring capability to bear against our adversaries—whose intent is to maintain a competitive advantage in the information space short of physical conflict—is crucial. Achieving that balance requires a concerted call to action from the defense community and the legislative and executive branches.

Build a Nimble and Proactive Force

Current defense enterprise operations in the influence and information spaces are a work in progress. Several major reorganization and realignment efforts are underway, from the continuing evolution of overall responsibility for DoD information operations to the development of a concept of operations for the information environment. These initiatives are critical, but success in the information and influence spaces remains hampered by long approval chains and a lack of operational autonomy and support for those conducting human domain–oriented operations. As noted above, the defense enterprise does not rehearse and train shaping and influence operations with any fidelity. Field commanders—much less line operators—possess little authority to conduct information and influence operations based on specific operational goals. Ultimately, we deny ourselves the opportunity to take the initiative and so, by default, find ourselves relegated to reactive instead of proactive engagements in the human domain.

Finally, there is a strategic-level consideration that is inextricably linked to many of the recommendations made in this article. The concept of a whole-of-government and whole-of-society approach to tackling issues in the information environment (e.g., disinformation, adversarial influence campaigns) is currently popular. However, the authors remain concerned that there are significant hurdles to achieving a truly societal response—in terms of structure and capabilities—without a genuine grand strategy for the human domain.

Our reflections and recommendations are geared toward the defense enterprise, but we recognize that any capabilities, regardless of sector, are inherently linked and must be connected to a higher-level policy. Without a grand strategy for the human domain, the current challenges facing DoD will also continue to be present in the interagency and in society.

The human domain problem is more than academic—the domain is where great power competition is playing out, in ways that seem both familiar and unprecedented. Indeed, most engagements are now won or lost in the competition phase—before conflict, much less kinetic activity, is even in play. Yet the defense enterprise is unprepared for persistent engagement in the human domain. Moreover, since the current competition space does not include large-scale kinetic conflict, conventional capabilities are not an effective deterrent or response to adversary operations in the information environment. To secure our democracy against authoritarian adversaries who currently operate freely within the human domain, the United States must embrace the new information paradigm and begin orienting capabilities toward agile processes and tactics for operating effectively in a contested and uncertain environment. Ultimately, we need to remember that the secret to getting ahead is getting started.

Austin Branch is professor of the practice at the University of Maryland’s Applied Research Laboratory for Intelligence and Security (ARLIS). After retiring from the US Army as one of the first information operations officers, he served as a senior civilian leader for defense information operations in the Department of Defense.

Retired Lt. Gen. Ed Cardon is professor of the practice at ARLIS. He had a distinguished career in the US Army, where he served as head of Army Cyber Command, helped establish Army Futures Command, and created and led US Cyber Command Task Force ARES, among many other leadership roles.

Devin Ellis is a faculty affiliate at ARLIS and director of the ICONS Project, which has led innovation in wargaming information and influence operations.

Adam Russell is the chief scientist at ARLIS. He previously served as a program manager at the Defense Advanced Research Projects Agency and the Intelligence Advanced Research Projects Activity, working on behavioral and social science programs.

Authors are listed in alphabetical order.

The views expressed are those of the authors and do not reflect the official position of the United States Military Academy, Department of the Army, Department of Defense, US government, or any organization with which the authors are affiliated.

Image credit: Brian Merrill