As the digital age accelerates the spread of information across the globe, the scourge of disinformation has emerged as a dire threat to the fabric of democracy and the integrity of national security. High-profile incidents on the international stage, such as the Rohingya crisis in Myanmar and the anti-balaka violence in the Central African Republic, underscore the devastating impact of disinformation. These events, fueled by misleading narratives spread via social media and other channels, led to severe humanitarian crises and acts of violence. Similarly, in the United States, disinformation has significantly escalated protests and violence across the ideological spectrum, as evidenced by events like the Pizzagate conspiracy, the January 6 Capitol protest, and some of the Black Lives Matter and Portland protests.
A 2022 report reveals a concerning trend: a majority of Americans, cutting across political lines, now view disinformation as a more critical issue than a range of threats, from infectious diseases to terrorism to climate change. This growing concern is reflected in statistics: 71 percent of respondents believe disinformation exacerbates political polarization, 63 percent see it as a violation of human rights, and over 50 percent report feelings of anxiety and stress upon encountering false information.
Confronting the proliferation of disinformation presents a complex and multifaceted challenge for US policymakers and senior leaders. Their tasks include navigating a rapidly evolving communications landscape, countering global powers intent on rewriting international norms, and bridging the divides within the nation’s political arena. Striking a balance between safeguarding free speech and ensuring the dissemination of truthful information is an intricate endeavor demanding nuanced consideration. The collaborative effort between the executive branch and Congress in understanding and managing disinformation is a topic of ongoing debate and reflection. Recognizing the depth and breadth of disinformation is crucial in devising more effective defense mechanisms. Currently, there exists a significant gap in our comprehension of disinformation and our capability to mitigate its detrimental effects. The proposal for a bipartisan commission to bridge this gap is gaining traction. Such a body would be committed to an in-depth study of disinformation, crafting potent countermeasures, and formulating strategies to protect the United States and its global allies from the adverse consequences of these deceptive practices.
A declassified intelligence report shed light on the international dimensions of disinformation campaigns during the 2020 US elections. It revealed that Russia, under President Vladimir Putin’s direction, orchestrated operations aimed at undermining the candidacy of Joe Biden while bolstering support for then-President Donald Trump. Concurrently, Iran, authorized by Supreme Leader Ali Khamenei, conducted covert cyber campaigns designed to impede Trump’s bid for reelection. Despite backing opposing candidates, both nations shared a mutual goal: to diminish public trust in US institutions, erode confidence in the democratic process, and intensify political divisions within the United States.
The report highlights an environment in which heightened international meddling feeds a major internal threat: political instability. A Quinnipiac University poll underscores this sentiment, revealing that 76 percent of Americans perceive domestic political instability as a greater danger than foreign adversaries, whom just 19 percent of respondents considered the bigger threat. This widespread concern transcends political divides, pointing to a collective anxiety. Such internal discord, if left unchecked, not only undermines the nation’s stability but also potentially aids foreign powers like Russia and Iran in their efforts to erode trust in US institutions and deepen societal divisions.
The national security implications are significant and deeply concerning. If the United States is susceptible to disinformation, its policy is at risk of being influenced or even manipulated, potentially leading to inaction or misguided decisions in crucial international engagements where its vital interests are at stake. Moreover, the spillover of societal division into the military sphere could jeopardize the unity and cohesion necessary for military readiness and effectiveness. This vulnerability opens the door for further external influences, potentially compromising the nation’s ability to respond effectively to global threats and undermining its strategic position internationally.
Amid these challenges, both the executive branch and Congress have strived to address the multifaceted issue of disinformation and misinformation. However, their efforts have often been fragmented and have lacked a cohesive strategy. During the 117th congressional session, Congress introduced 199 bills targeting disinformation yet enacted only five, including two that were appropriations based on previous legislation. This reflects a significant gap in prioritization. The laws that did pass have predominantly focused on countering foreign disinformation from nations like Russia and China impacting regions such as Western Europe, Asia, and Latin America. Yet they fall short in addressing the growing concern of domestic disinformation within the United States. The RENACER Act, for example, targets Russian misinformation in Nicaragua, but the enacted bills largely overlook similar issues within US borders.
Recognition across political lines that foreign actors exploit political divisions, especially evident in the January 6 Capitol riot, where domestic disinformation played a key role, further underscores the pressing need for a unified approach. The legislative response to this threat has been underwhelming. Important initiatives like the Promoting Public Health Information Act and the COVID-19 Disinformation Research and Reporting Act of 2021 failed to pass, highlighting the difficulties in building legislative consensus to combat disinformation effectively. Even the bipartisan Madeleine K. Albright Democracy in the 21st Century Act, aimed at defending democracy and countering misinformation globally, did not succeed.
These legislative shortcomings indicate the urgent need for a more robust and comprehensive strategy addressing the immediate effects of disinformation and its underlying causes. In this context, the ongoing debates around Section 230 of the Communications Decency Act, which is pivotal in governing online content, are particularly relevant. Lawmakers have yet to agree on how to modify this legislation effectively, with proposals ranging from its total repeal to amendments aimed at curbing political bias and censorship. Adding to this complexity, a recent federal court decision has clarified the parameters of the current administration’s engagement with social media companies in matters of content moderation. This ruling defines the extent to which the government can influence these companies’ practices in monitoring and managing content, emphasizing the legal boundaries in the context of free speech considerations.
This legal backdrop is particularly relevant when considering the ongoing challenges in combating misinformation online. For example, the case of the “disinformation dozen”—twelve social media influencers identified by the Center for Countering Digital Hate as spreading false information about COVID-19 and vaccines—exemplifies these challenges. Despite some account suspensions and bans, ten of these users remain active and influential on major platforms, collectively reaching an audience of over six million followers. Even with concerted efforts from platforms like Facebook to mitigate misinformation, these individuals persist in adapting their strategies, continuing to sow doubt about vaccines and COVID-19. This situation not only underscores the difficulties in moderating online content but also brings into question the potential impact of any changes to Section 230 on such efforts.
The Department of Homeland Security (DHS) has faced significant challenges in addressing disinformation, despite its efforts such as the 2019 Strategic Framework for Countering Terrorism and Targeted Violence and the 2020 Homeland Threat Assessment. In 2022, DHS established the Disinformation Governance Board, intended to tackle national security threats stemming from disinformation. The board, however, faced numerous challenges from its inception. Concerns were raised about its objectives and potential impact on First Amendment rights, leading to its suspension after just three weeks and the subsequent resignation of its leader. Ultimately, the board was discontinued following the Homeland Security Advisory Council’s recommendation.
The difficulties faced by DHS highlight the complexity of combating disinformation in a politically charged environment. Yet public sentiment increasingly favors action to address the issue, as demonstrated by a recent Pew Research Center survey. The survey shows a significant shift in American attitudes toward online false information, with a growing call for both government and technology companies to play a role in addressing disinformation. Currently, 48 percent of Americans advocate for governmental action to restrict false information, a notable increase from 39 percent in 2018. Still, the majority of survey respondents (59 percent) lean toward expecting technology companies to address disinformation. This view is not without partisan divides: 70 percent of Republicans emphasize the importance of information freedom, even at the cost of false information circulating, while 65 percent of Democrats support governmental intervention in curbing misinformation.
To effectively counteract the disinformation epidemic, senior leaders and policymakers must tackle its underlying causes, not just its symptoms. This requires a deep understanding of American perceptions of disinformation and the development of strategies to mitigate its harmful effects. Key to this effort is promoting media literacy, bolstering fact-checking initiatives, and enhancing transparency, thereby equipping citizens to discern truth from falsehood. Such measures are crucial in safeguarding democratic processes and public discourse from the corrosive impact of disinformation. A holistic approach, encompassing education, collaboration, regulatory measures, and technological innovation, is essential in this fight.
The United States has a history of establishing commissions to uncover truth and uphold justice, as seen in the Warren Commission and 9/11 Commission. These bodies have been instrumental in investigating significant national events and providing clarity. Inspired by these models and the National Security Commission on Artificial Intelligence, policymakers could form a new bipartisan commission to thoroughly investigate the nuances of disinformation campaigns, delving into their origins, tactics, and societal impacts with a rigorous and impartial approach similar to that of its predecessors. The bipartisan nature of this commission is essential, as it ensures a balanced and fair investigation, free from partisan biases, and reflects a united front against the challenges posed by disinformation.
As the Brookings Institution illustrates, addressing disinformation in the modern era demands a multifaceted strategy. Policymakers must foster a society that values critical thinking and media literacy, enabling Americans to adeptly navigate the flood of information. Collaborating with social media platforms, industry leaders, and civil society is essential for developing effective verification and moderation practices. This need becomes even more pressing with the rapid advancement of artificial intelligence, a double-edged sword that could greatly enhance the reach and influence of disinformation campaigns. The sophistication of these AI-driven operations, involving audience manipulation and realistic digital personas, underscores the need for action. A bipartisan commission is vital to developing comprehensive strategies that combine education, policy, and technology to counter these threats, thereby protecting democratic integrity and public trust in an era when AI-fueled disinformation can decisively impact elections and democratic stability.
Major Nicholas Dockery is a Special Forces officer, a fellow at the Modern War Institute at West Point, a United States Military Academy alumnus, and a General Wayne Downing scholar. He holds a master's degree in public policy from the Yale Jackson School of Global Affairs.
The views expressed are those of the author and do not reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Image credit: Tom Thai
Oceania has always been at war with Eastasia!
Such a perfect response. Bravo, sir.
Intentional disinformation today, whether of the foreign or the domestic variety, frequently seems to stem from a common source: those conservative individuals and groups (both here at home in the U.S./the West and abroad) who believe, or who fully understand, that they have lost, and will continue to lose, power, influence, control, status, privilege, safety, security, etc., if initiatives sponsored by progressive elements (both at home and abroad) and based largely on the truth are (a) uncontested by disinformation/lies and thus (b) allowed to see the light of day and bear fruit.
(In the scenarios noted above, these initiatives threaten conservative elements, both at home in the U.S./the West and abroad, who depend on the status quo (or on a status quo ante, if too much unwanted change has already taken place) for the power, influence, control, etc., that they currently enjoy or wish to enjoy once again. In this regard, see the "my nation's enemy is now my group's friend" phenomenon noted in my two examples below:
“Liberal democratic societies have, in the past few decades, undergone a series of revolutionary changes in their social and political life, which are not to the taste of all their citizens. For many of those, who might be called social conservatives, Russia has become a more agreeable society, at least in principle, than those they live in. Communist Westerners used to speak of the Soviet Union as the pioneer society of a brighter future for all. Now, the rightwing nationalists of Europe and North America admire Russia and its leader for cleaving to the past.” [See “The American Interest” article “The Reality of Russian Soft Power” by John Lloyd and Daria Litinova.]
“Compounding it all, Russia’s dictator has achieved all of this while creating sympathy in elements of the Right that mirrors the sympathy the Soviet Union achieved in elements of the Left. In other words, Putin is expanding Russian power and influence while mounting a cultural critique that resonates with some American audiences, casting himself as a defender of Christian civilization against Islam and the godless, decadent West.” [See the “National Review” item entitled: “How Russia Wins” by David French.] )
Also, in support of my argument above, note that all of the anti-disinformation legislation cited by Major Nicholas Dockery in his eighth paragraph above (the Promoting Public Health Information Act, the COVID-19 Disinformation Research and Reporting Act of 2021, and the Madeleine K. Albright Democracy in the 21st Century Act) seems to have been defeated largely by conservative elements. (Did I get this right?)
Bottom Line Thought — Based on the Above:
There are times when "the truth" — clearly — does not and will not serve one's — and/or one's group's — power, influence, control, etc., interests.
It is in times such as these that lies/disinformation — from those both here at home (ex: our conservatives) and there abroad (ex: Putin) — whose power, influence, control, etc., are commonly threatened by the truth — may become weaponized?
Note that, from the exact same perspective and for the exact same reasons that I provide above, should we not expect Major Dockery's recommended bipartisan disinformation commission to fail to get off the ground as well, going the same way as the Promoting Public Health Information Act, the COVID-19 Disinformation Research and Reporting Act of 2021, and the Madeleine K. Albright Democracy in the 21st Century Act?
r/ActiveMeasures
We are all arbiters of our own truth. Whether bad information is believed or not, the onus is on him who must filter out that noise (technical term). None of this is new in concept, just in focus.
Everyone has been exposed to facts that simply ain’t true. Since the dawn of communication, people have had to deal with people who have no understanding of that which they speak.
So, we need to silence Free Speech so the Marxist/Socialist/Dem Party don’t lose their political power ?? Got it.
No, quite the opposite. This calls for Congress to understand what it is not getting right. It has nothing to do with socialism (hence the whole bipartisan approach). If I read it right, they are trying to help reduce violent outbreaks from disinformation, but in no way does it call for limiting free speech. Maybe you missed the point.
We live in a time of groupthink. The foremost transformational leader told his lieutenants, “Enter by the narrow gate. For the gate is wide and the way is easy that leads to destruction, and those who enter by it are many. For the gate is narrow and the way is hard that leads to life, and those who find it are few.” Obviously, he was speaking on a spiritual plane, but it is relevant to our physical world, driving our spirit to seek truth. The mantra “diversity is our strength” is sung by a choral legion, except when it pertains to thought. Do we not already have boardrooms, parliaments, and various delegations all full of yes men?