Disinformation is hard to write about.
No doubt it’s a sexy topic. Russian intelligence, internet conspiracy, secret funding — these are Hollywood pitches. An old friend with interesting passport stamps begged me to be cautious after my recent Sandworm essay. My 68-subscriber blog is certainly beneath Russia’s notice, but it’s an exciting idea, right?
That tingle gives me pause. A hallmark of ADHD is that we underproduce dopamine and compensate by seeking adrenaline, so I cultivated a habit of filtering things that seize my attention through my reality sniffer.
That’s tough to do online.
The internet is a mysterious shadow land where whatever you imagine becomes real. I mean this literally. Bitcoin, NFTs, AI: these are pure imagination rendered real enough to change our lives through the mediation of a virtual world that also transmutes our actual lived lives into mirage.
Disinformation sits at this nexus, its tendrils manipulating decisions that shape our lives — vaccination, investment, politics, love — but real people animate that infiltration, drawing us out and inviting us to taste this conspiracy, sniff at that intrigue.
Troll farms are just marketing agencies that specialize in conspiracies; inhabit fake, inflammatory accounts; and invent memes that stroke your emotions. Exploiting the liminal space between real and virtual to inveigle you past truth, toward doubt, into suspicion, they’ll convince you truth doesn’t even exist.
Not out here, in the mundane world, anyway.
People all over the political spectrum are disappearing down rabbit holes that remind me of the hyper-focused flights of fancy dopamine-seeking ADHD folks sometimes chase. We like to think disinformation only affects the gullible or the stupid, but intellect is a leaky life vest.
A friend’s retired RN mom thinks these new vaccines are dangerous. An electrical engineer of my acquaintance is removing his organ donor consent, worried that profit motive will affect emergency treatment. Another friend, with whom I share most positions on COVID, recently re-posted a speculative fever dream of “airborne HIV.”
These are intelligent, sensible people who embraced paranoid (but well-crafted) narratives. Each narrative features verifiable facts sandwiching a savory whopper of dystopian fantasy. Eat that ergot-infested truth sandwich, and reality bends. The retired nurse is a Fox News watcher, and while they couldn't convince her to distrust vaccination, they sowed enough doubt about COVID vaccines that she skipped hers. At her age, that's a potentially deadly choice. I can't trace the vector of disinformation for the other two so clearly, but I cherish a strong hunch.
Both fears were rooted in truth. A legal change last year did allow for-profits to bid for transplant contracts. COVID does damage the immune system, sometimes even after mild cases. Grafted to those healthy, foundational truths were malformed, fear-laden conclusions. These two friends hold strong (opposing) opinions on a polarizing topic heavily leveraged by Russian trolls. Had my friends been drawn down dark alleys of the internet, where fearmongering and conspiracy lurk?
Or is that my own paranoia?
Done well, propaganda can be subtle. It sneaks in around the edges, slowly normalizing banal nonsense into “common sense.” Once you acclimate to it, it becomes odorless, inveigling you into ever more extreme positions, whispering intrigue.
This isn’t new, of course. Joe McCarthy had no internet to fuel his years-long, paranoid purge. The Protocols of the Elders of Zion was printed and distributed (free of charge!) by none other than Henry Ford. Tabloids have long seized bored housewives’ imaginations, featuring pictures of Hillary’s alien baby amid weight-loss miracles, and who can forget the pyramid-power craze of the 70s? You’ll never have to sharpen your razor again!
The internet just makes it faster, easier, and so transparent we can’t see through it.
For at least a decade, Russia’s information-warfare tactic of choice has been stoking American polarization. Their trolls work both sides of the aisle. In 2016, Russian agents started two Facebook groups that scheduled competing rallies specifically intended to trigger violence on our streets. [1] Russian trolls funded activists chosen to escalate internal American division. [2] To subvert limits on foreign political donations, they created fake personas based on real people’s identities and Social Security numbers, complete with bank accounts. [3] Perhaps my well-traveled old friend isn’t so far off: the ante to add someone to a Dark Web target list is trivial.
So far, I’ve got obscurity on my side.
Russia is investing heavily — time and money — in our mutual distrust and paranoia. They want us polarized, locked into divisions of “us” and “them” that we find all too gratifying. Although we know they spread fake information, we still cite anecdotes found online as though they’re legitimate evidence. We share data-gathering memes, authorize games to troll our friend list, and boost sensational hot takes. We do this knowing there is a branch of the Russian military that swayed the 2016 election through exactly these tactics.
Isn’t that reason enough to resist it?
I’m fighting my own impulse to splash and shout from the shallow end of the pool as though I’m drowning, and instead to check that my feet are firmly on the bottom. I’m carefully noting any speculation I indulge in. I’m trying to damp my natural tendency toward hyperfixation and drama.
I’m trying, somehow, to hold your interest while avoiding gratuitous sensationalism. I’m asking you to contemplate a threat I believe is significant, but without exacerbating the panic and fear on which that threat feeds. I’m asking you to peer into the void without vertigo.
Polarization divides us, not just from each other but from discovering solutions. Complex problems defy resolution when artificial litmus tests — mask-wearing, use of public restrooms, library books — define shallow political identities that stand in for personalities. Polarization may feel easier than thinking, but it damages not only civil society but our very health. [4]
And it looks like that’s Russia’s goal.
If that’s true, then damping down reactivity is our first line of defense against a malevolent cyberwar that threatens both our virtual security and our real-world safety.
Sober consideration seems like an awfully boring response to a sensational problem.
How do I write about that?
[1] https://www.npr.org/2017/11/01/561427876/how-russia-used-facebook-to-organize-two-sets-of-protesters
[2] https://www.theguardian.com/world/2017/oct/17/russian-troll-factory-activists-protests-us-election
[3] https://www.npr.org/2020/03/05/812497423/report-russian-election-trolling-becoming-subtler-tougher-to-detect
[4] https://pubmed.ncbi.nlm.nih.gov/36712795/