
How Disinformation Blew Up the Campaign Playbook

Written by John Connors | Apr 16, 2025 7:04:47 PM

Sasha Issenberg explains why the future of campaigning starts with letting go of control.

What to Know: 

  • Consultants can no longer track what voters are seeing—media is too decentralized.
  • Viral misinformation often comes from anonymous or foreign actors, not political opponents.
  • Fact-checking rarely changes minds and can even backfire by amplifying lies.
  • Most viral attacks aren’t worth responding to—“99% of the time, the right move is to ignore it.”
  • Campaigns must shift from command-and-control to flexible, humble strategies built for chaos.

The landscape of political consulting has been irrevocably changed by the disinformation era. This shift was the central theme in a recent AAPC podcast featuring journalist and author Sasha Issenberg, whose work spans publications like The New York Times and The Atlantic. Speaking with host Tracy Dietz, CEO of DonorBureau, Issenberg dove into the thicket of disinformation and what it means for campaign strategy going forward.

Source: Sasha Issenberg; Wikimedia Commons

Disinformation, as Issenberg explains, isn’t simply a battle of truth versus lies. Instead, it's a symptom of a larger transformation: the decentralization of media. "The big challenge here isn't really about how accurate a claim is," he said. "It's about the fact that consultants trained in the era of mass media are now navigating a completely different world."

From Newspapers to Newsfeeds

In decades past, campaign strategists could rely on a predictable media environment. They could map a voter's exposure by monitoring local newspapers and TV ad buys. "You knew what people were hearing," Issenberg reflected. But now, with voters absorbing information from an endless sprawl of social media channels, that certainty has evaporated. This murkier media landscape makes core decisions—like whether to respond to an attack or correct a lie—infinitely harder.

Issenberg emphasized a new "asymmetry" in political battles. Consultants are no longer facing off only against other candidates or PACs. "Your opponent might not be a party or a super PAC," he said. "It might be a kid in a basement, or someone trying to game ad clicks, or even a foreign actor. They're not playing by the same rules." These actors don't disclose their finances, aren't subject to FEC rules, and, most importantly, remain fundamentally unaccountable.

The Federal Election Commission (FEC) enforces campaign finance laws for U.S. federal elections, but anonymous and foreign actors spreading disinformation online often operate beyond its regulatory reach. Image source: Getty Images

This reality leaves political professionals guessing not just about tactics, but about motives. "You used to know what your opponent would do with a new issue contrast," Issenberg said. "Now you have no idea who your opposition is, what they want, or why they're doing it."

The Metrics of Misinformation

The viral nature of content adds another layer of complexity. When a piece of false information racks up millions of views, consultants are left asking: is this dangerous, or just noise? According to Issenberg, understanding the real reach of disinformation requires nuance. "Are those millions of views in the U.S.? Are they among persuadable voters? Are they in states that matter to your strategy?" he asked. The answer isn’t always obvious.

To make sense of this chaos, some campaigns now employ "landscape analyses," a method championed by one of Issenberg's key sources, Jari Craig. As Issenberg describes it, this involves mapping the normal level of conversation around specific topics, so campaigns can recognize whether a viral surge is an actual threat or just part of the daily noise. "You need to know if this is the usual crowd talking about vaccines or if it's breaking through to a broader electorate," he said.
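To make the idea concrete, here is a minimal sketch of what that kind of baseline-versus-surge check might look like in code. It is an illustration under assumptions, not anything Issenberg or Craig prescribes: the function name flag_surges, the 28-day baseline window, the z-score threshold, and the sample data are all hypothetical, and it presumes a campaign already pulls daily mention counts for a topic from a social-listening tool.

```python
from statistics import mean, stdev

def flag_surges(daily_mentions, baseline_days=28, z_threshold=3.0):
    """Flag days where chatter on a topic rises far above its usual baseline.

    daily_mentions: daily mention counts for one topic, oldest first.
    Returns (day_index, count, z_score) tuples for days that break out.
    """
    surges = []
    for i in range(baseline_days, len(daily_mentions)):
        window = daily_mentions[i - baseline_days:i]  # the "normal" level
        mu, sigma = mean(window), stdev(window)
        if sigma == 0:
            continue  # perfectly flat baseline; nothing to compare against
        z = (daily_mentions[i] - mu) / sigma
        if z >= z_threshold:
            surges.append((i, daily_mentions[i], z))
    return surges

# Hypothetical topic that hums along at ~50 mentions/day, then spikes.
history = [50 + (i % 7) for i in range(28)] + [55, 58, 480]
for day, count, z in flag_surges(history):
    print(f"Day {day}: {count} mentions (z = {z:.1f}) -- worth a closer look")
```

In this toy run, only the 480-mention day trips the threshold; the everyday wobble stays invisible, which is exactly the distinction Issenberg draws between the usual crowd and a genuine breakout.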

Fact-Checking in a Post-Truth Era

Fact-checking, once considered a silver bullet, has largely lost its power. Issenberg pointed to cognitive science research showing that repeating false claims—even to refute them—can cement them further in people's minds. "There's a way to do fact-checking that makes the problem worse," he warned. And in today’s hyper-polarized media environment, sources like The Washington Post or Associated Press only work as validators if audiences already trust them. "The bigger the conspiracy theory, the more people believe the fact-check is part of the conspiracy," Issenberg said.

While some countries have moved toward more aggressive regulation of disinformation, Issenberg doubts that's likely in the U.S. Citing Brazil's strict election laws—including court orders to jail disinformation spreaders—he noted, "To an American like me, who lives on free expression, that's pretty scary." Brazil has no First Amendment equivalent, and its model is incompatible with the U.S. legal structure.

A more plausible scenario here might involve repealing Section 230, which shields tech platforms from liability for user-generated content. Such a repeal could prompt companies to moderate more aggressively, not because a regulator dictates what to remove, but out of legal self-preservation.

Source: Summary of Section 230; Congress Website

The Text Message Tsunami

Dietz also brought up the explosion of text messages from Democratic campaigns. Issenberg noted the shift was largely logistical. "It used to take a human being to send a text," he said, referencing Bernie Sanders' 2016 campaign, which relied on thousands of volunteers to click 'send' manually to comply with regulations. "That's no longer enforced, so now it's like robocalls in the late '90s—cheap, easy, and scalable."

Yet even these tools are losing their edge. "We used to get excited when we got a text. Now it's like our email inbox or junk mail," Issenberg quipped. The novelty is gone, and with it, much of the power.

Wrap Up

Disinformation has fundamentally altered the structure and strategy of political campaigns. Consultants can no longer rely on centralized media systems to track voter exposure or control messaging. Instead, they face an environment where false information is created and amplified by anonymous individuals, foreign actors, or non-political influencers. These sources are not bound by campaign finance laws or disclosure requirements, making them effectively unaccountable. 

Traditional campaign tools—TV ads, press releases, mailers—are increasingly ineffective at competing with viral, emotionally charged content that circulates on platforms like TikTok, YouTube, and Twitter. The scale and speed of disinformation distribution means that by the time a campaign detects and responds to a falsehood, the damage may already be done.

Attempts to fact-check or correct false claims often fail to change public opinion and may unintentionally reinforce the original misinformation. Voters exposed to these narratives are unlikely to trust corrections, especially when they come from institutions they already distrust. In most cases, engaging with viral falsehoods only increases their reach due to algorithmic amplification. Campaigns must now build strategies around monitoring digital conversations, assessing the actual impact of viral content, and resisting the reflex to respond to every attack.