This article is part of the series “Compete and Win: Envisioning a Competitive Strategy for the Twenty-First Century.” The series endeavors to present expert commentary on diverse issues surrounding US competitive strategy and irregular warfare with peer and near-peer competitors in the physical, cyber, and information spaces. The series is part of the Competition in Cyberspace Project (C2P), a joint initiative by the Army Cyber Institute and the Modern War Institute. Read all articles in the series here.

Special thanks to series editors Capt. Maggie Smith, PhD, C2P director, and Dr. Barnett S. Koven.


Recently, on Saturday Night Live, actor James Austin Johnson played President Joseph Biden in one of the show's famous cold opens and pondered the problem of Russian disinformation in the ongoing conflict in Ukraine. The political satire is lighthearted (it even includes a TikTok dance), but the comedic performance belies the gravity of Russian adversarial tactics in the information environment and the emerging threat of disinformation as an element of modern conflict.

Spreading disinformation is an asymmetric tactic that is easy to execute, low cost, highly transmissible, and challenging to attribute, all while remaining squarely below the threshold of armed conflict, which makes military escalation unlikely. Some tactics and procedures might appear novel, but familiarity with marketing techniques helps explain why they can be effective and why an adversary might embrace new approaches in the information operations space.

Adversarial disinformation campaigns are like a low-grade fever that erodes collective national will unless more robust mitigation measures are implemented. There is no figurative vaccine that offers full protection against mis- and disinformation. However, lessons from marketing research on inoculation theory, together with efforts ranging from education to greater regulation of technology companies, offer hope of resilience against evolving adversarial information operations.

Tactics and Strategy: How Marketing Techniques Translate to Disinformation Tactics

With the average American spending over 1,300 hours on social media in 2021, advertisers have embraced influencer marketing. Brands are expected to invest over $15 billion worldwide on influencer marketing this year alone. Influencer advertising allows for more human-centered engagement via appealing, entertaining, niche content. It is not reserved for celebrities and mega-influencers; it also relies on nano-, micro-, and macro-influencers, individuals with fewer than one million organic followers, including those with under one thousand. These categories of influencers can have deeper connections with their followers, appear more relatable, and, as a result, be more persuasive. They often build followings around expertise such as travel or cuisine and can distribute information by leveraging in-group bias, the phenomenon of increased trust when interacting with someone from our own social group versus an outsider.
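Industry definitions of these influencer tiers vary, but they are conventionally bucketed by follower count. The short sketch below illustrates that bucketing in Python; the specific cutoffs are common rules of thumb assumed for illustration, not figures from this article.

```python
def influencer_tier(organic_followers: int) -> str:
    """Bucket an account by organic follower count.

    The cutoffs below are common industry rules of thumb, not fixed
    standards; definitions vary by platform and agency.
    """
    if organic_followers < 10_000:
        return "nano"   # smallest accounts, sometimes under one thousand followers
    if organic_followers < 100_000:
        return "micro"
    if organic_followers < 1_000_000:
        return "macro"
    return "mega"       # celebrity scale: one million organic followers and up


print(influencer_tier(850))          # nano
print(influencer_tier(45_000))       # micro
print(influencer_tier(2_500_000))    # mega
```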

Unfortunately, the transactional tactics that work well for large brands are quickly co-opted by our adversaries. Recently, independent Russian firms and Chinese-associated organizations attempted to recruit and pay social media influencers. Unlike celebrities with media teams or journalists who adhere to professional standards, influencers with smaller digital footprints are more susceptible to exploitative partnerships. Russian firms allegedly tried to solicit influencers to spread divisive content about vaccines. China partnered with Western influencers, sponsoring their travel to China to generate favorable reviews of the country and its way of life. China even hired a US media firm prior to the 2022 Olympics to employ influencers to share content about Chinese history, culture, modern life, and diplomacy to shape perceptions. The influencers tergiversate, avoiding sensitive topics like human rights violations and the handling of the COVID-19 pandemic, ultimately spreading confusion and misleading information by never providing a wholly accurate description of the situation. Influencers are also a vehicle to access platforms, like YouTube, Facebook, Instagram, and Twitter, that might be inclined to censor messages coming directly from Russia or China.

Effective marketing campaigns are often multipronged, with numerous digital touchpoints to create trust and accelerate adoption along the customer journey. Likewise, in the information operations space, adversaries create multiple touchpoints via fake accounts known as "sock puppets" to increase exposure. Unfortunately, sock puppets can be deceptively real, as in the case of the "Jenna Abrams" account on Twitter. Created by the Internet Research Agency, a Russian government-funded troll farm, the account amassed over seventy thousand Twitter followers and was quoted by major news outlets for the persona's conservative, xenophobic views. The humanization of "Jenna Abrams" was a well-crafted, deliberate approach to amplifying a loud, divisive voice. More recently, Russian campaigns used artificial intelligence to create realistic, original human faces to spread disinformation about the Ukrainian government during the invasion. Strategically, Russian disinformation campaigns do not offer a concise narrative. Rather, they push high volumes of content through multiple channels, flooding the ecosystem with persistent messaging that seeds confusion, fear, insecurity, and division around their targets' political and social values.

Conversely, Chinese disinformation campaigns are more inclined to replicate a marketing tactic known as "comparative advertising," in which a brand argues that its offering is superior to a competitor's. For example, a persuasive video released by New China TV, with over 2.3 million views, features a cheeky exchange between cartoon Lego figures representing China and the United States. The storyline centers on China's eagerness to help the world navigate the COVID-19 pandemic, contrasted against a narrative of US refusal to listen. Part of the appeal of this approach is its storytelling, which evokes emotion and relatability instead of solely presenting facts. Although China, like Russia, uses scale to spread messages, the distinction is that China seeks to positively shape behavior and perceptions of the Chinese Communist Party in comparison to the American way of life and the US response to sensitive issues.

Approaches to Countering Disinformation

There is no clear-cut solution to countering mis- and disinformation, but inaction or passivity is not an approach the United States should take. The research of the late Dr. William McGuire, a social psychologist from Yale who coined the term "inoculation theory," offers inspiration for techniques to help individuals become less susceptible to adversarial disinformation campaigns. Often used in advertising and public relations, inoculation theory is a model for building resistance to persuasion attempts by exposing people to weakened arguments against their beliefs and giving them counterarguments to refute those attacks, similar to how a vaccine works in fighting disease. Inoculation messages, like vaccines, expose users to the threat and build up resistance for when they are unsuspectingly exposed. Recently, Google's Jigsaw and American University's Polarization and Extremism Research Innovation Lab studied inoculation in a controlled setting and found that giving individuals advance warnings about manipulative content improved their ability to discern the truth.

Inoculation theory is also evident in the US approach of "prebunking" rather than debunking disinformation surrounding the Russian invasion of Ukraine. Instead of countering disinformation after it spreads, NATO forces are trying to shape and control the narrative by releasing information early and staying ahead of false Russian claims. For instance, the White House openly gathered thirty TikTok stars on a Zoom call to educate and inform them on US strategy toward Russia, work with NATO, and assistance to Ukraine. This approach parallels what our adversaries are doing, with one key difference: the influencers were offered a journalistic-style briefing rather than a paid script. Ultimately, the counter to mis- and disinformation is accurate information itself.

Inoculation against mis- and disinformation is already practiced and widely adopted in the European Union. In Finland, a country that shares a border with Russia, media literacy and the skills to spot fake news are taught to residents, students, journalists, and politicians. The Netherlands has also launched public awareness campaigns to educate Dutch voters about disinformation and help people recognize it. Public urgency galvanized a movement of "elves" to counter Russian "trolls" in Estonia and Lithuania; these volunteers document and report what they believe is hate speech and pro-Russia propaganda. Digital literacy aimed at spotting disinformation is normalized in Estonian classrooms. Sweden launched the Swedish Psychological Defense Agency in January 2022 with the explicit mission of identifying and countering information influence activities, and the agency will work toward helping the public spot disinformation. A similar agency is being developed in France. These efforts are central to the European Union's strategy to counter Russian disinformation. Due to Russia's proximity and the presence of Russian citizens in these countries, the European Union encounters many Russian disinformation tactics before the United States does, making its progress in countering mis- and disinformation a vital azimuth to consider when evaluating policy approaches.

Another approach to countering mis- and disinformation was highlighted by the recent Aspen Institute Commission on Information Disorder, whose report details how civic empowerment, fostered by tools that allow for greater information transparency and funded by the major online platforms, could improve the information environment. One valuable idea from the report is an "amplification flow tool" that would illuminate which influencers and groups shared the same content, helping to distinguish falsehoods from truths. The commission also recommended holding social media sites more responsible, which aligns with bipartisan interest in Congress.
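The commission's report does not prescribe an implementation, but the basic intuition behind an amplification flow tool can be sketched in a few lines: group posts by their content and surface content that many distinct accounts pushed in identical form. The data fields, sample posts, and coordination threshold below are illustrative assumptions, not the commission's design.

```python
from collections import defaultdict

# Hypothetical post records: (account, text) pairs for illustration only.
posts = [
    {"account": "@acct_a", "text": "Claim X about the conflict"},
    {"account": "@acct_b", "text": "Claim X about the conflict"},
    {"account": "@acct_c", "text": "Claim X about the conflict"},
    {"account": "@acct_d", "text": "Unrelated local news item"},
]

def amplification_flows(posts, min_accounts=3):
    """Group identical content and return items amplified by many distinct accounts."""
    sharers = defaultdict(set)
    for post in posts:
        sharers[post["text"].strip().lower()].add(post["account"])
    # Content pushed by several distinct accounts is a candidate for closer
    # scrutiny; heavy amplification alone does not make it false.
    return {text: accounts for text, accounts in sharers.items()
            if len(accounts) >= min_accounts}

for text, accounts in amplification_flows(posts).items():
    print(f"{len(accounts)} accounts amplified: {text!r}")
```

A production tool would of course need fuzzier matching (near-duplicate text, images, and links) and carefully chosen thresholds, but even this simple grouping shows how transparency about who amplified what could help readers and researchers weigh a message's provenance.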

In October 2021, Frances Haugen, a former Facebook data scientist and whistleblower, testified before Congress and offered a behind-the-scenes look at her previous employer's failure to control misleading information. She described how Facebook had avoided implementing simple tactics to slow the sharing of unverified information, such as asking users whether they really wanted to share it, and focused on profitability instead of information accuracy. Her testimony highlighted opportunities for external audits and safety measures and made the case that government oversight is necessary to improve online safety.

Haugen's testimony generated bipartisan interest in and support for the issue. It could spur federal regulation, the value of which Meta CEO Mark Zuckerberg himself acknowledged, saying, "We're committed to doing the best work we can, but at some level, the right body to assess the tradeoffs between social equities is our democratically elected Congress." Regulatory oversight will need to balance consumer privacy and individual freedoms, much as the Federal Trade Commission does when regulating advertising.

For state and nonstate actors operating in the gray zone, disinformation campaigns offer a cost-effective, low-skill way to sow discord and chaos without the burden of conventional military hardware, personnel, or formidable infrastructure. Information warfare can degrade a population's will to defend its state, creating a favorable environment for possible military intervention. Democratic norms and freedom of speech on social media, combined with the expansion of remote study and work-from-home arrangements, offer a broad attack surface for actors seeking to seed disruptive practices.

Unfortunately, as the COVID-19 pandemic endures and dependency on digital networks continues to increase, the aperture and creativity of our adversaries will likely keep pace with, or exceed, the tactics commonly adopted from advertising. Russian and Chinese malign influence campaigns can be expected to evolve as marketing techniques evolve; tomorrow's disinformation may not look like today's. Fortunately, just as tactics can be borrowed from industry and marketing, so can countermeasures like inoculation theory and greater regulation.

Laura Keenan is a lieutenant colonel in the District of Columbia Army National Guard and is currently assigned as the J55, Division Chief for Policy and Strategy, in the Cyber National Mission Force.

The views expressed are those of the author and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense, or of any organization the author is affiliated with, including the Army National Guard and US Cyber Command.

Image credit: ajay_suresh