How to Stay Sane in a World of Spin, Fakes, and AI Lies
Including 11 trusted tools to sharpen your judgment and restore reality
Author’s Note: This long-form post was written to equip you with 11 practical tools to push back against misinformation, from trusted fact-checking platforms to hands-on verification strategies. It's both a commentary and a field manual for staying grounded in a world that’s rapidly losing its grip on reality.
Two days before the January 2024 New Hampshire primary, a fake robocall mimicking President Biden's voice urged Democrats to skip the vote, a real example of AI-enabled election interference (AP News). Within hours, thousands had heard it. Fact-checkers debunked the call, but the damage went beyond any one candidate. We'd lost a little more faith in our ability to tell what's real.
One fake video might seem like a small thing. But we're living through something bigger: Artificial intelligence now generates convincing lies faster than humans can verify the truth. Our information ecosystem has become a battlefield. The very idea of shared reality is under assault.
You've probably asked yourself, "Who can I believe anymore?" or "How do I function when I can't trust what I'm seeing?" The danger goes beyond personal exhaustion. Democracy itself is at stake.
The Fog of AI-Enhanced Information War
We are living through what the RAND Corporation calls "Truth Decay": facts and opinions blur together deliberately, and truth and spin become indistinguishable. Today's version comes supercharged with artificial intelligence.
RAND's 2023 research warns that Truth Decay now poses a risk to U.S. national security by weakening military readiness and eroding America's credibility with allies.
The 2024 election cycle became a testing ground for this new reality. AI tools made it cheap, easy, and fast to create convincing fake video, audio, and text. Fabricated candidate statements appeared alongside synthetic robocalls that mimicked familiar voices.
Researchers discovered something unexpected: traditional "cheap fakes" that didn't use AI were used seven times more often than AI-generated content.
Cheap fakes (videos or audio edited without AI, using misleading cuts, speed alterations, or mislabeled clips) remain the dominant form of manipulation because they are fast and inexpensive to produce.
Paradoxically, AI has done more to corrode trust: it's not the volume of deepfakes that's most dangerous, but the erosion of confidence that any media can be real. This is the real threat: not that everyone believes the fake, but that no one believes the truth.
AI ultimately eroded people's faith in truth rather than influencing their vote. Experts refer to this as "the liar's dividend": when the mere possibility of fakery prompts people to doubt everything, including authentic evidence.
Disinformation campaigns have evolved into strategic and well-funded endeavors. They exploit basic flaws in how our minds work. Daniel Kahneman's research on System 1 and System 2 thinking reveals that our brains tend to favor fast, emotional judgments over slow, analytical ones. In an AI-accelerated media environment, this leaves us vulnerable to manipulation.
Why This Matters: The Strategic Costs of Digital Confusion
When people can't agree on basic facts, the consequences ripple outward:
Climate action stalls when manufactured doubt overwhelms scientific consensus.
Democratic institutions weaken when citizens question basic facts about elections.
Public health suffers when medical misinformation spreads faster than accurate guidance.
Social cohesion fractures when conspiracy theories become tribal identities.
When citizens lose the ability to distinguish credible information from sophisticated manipulation, governance becomes impossible. We don't get healthy pluralism. We get paralysis.
As Kathleen Hall Jamieson, director of the Annenberg Public Policy Center, warns, we risk losing more than factual accuracy. We're losing the very idea that facts matter at all.
In a world where anything can be faked, everything becomes suspect.
A Framework For Information Verification
If you're looking to build verification skills, there are free and low-cost educational resources designed to help you:
Crash Course in Navigating Digital Information – A free, structured course on evaluating digital media, hosted by John Green and offered by Crash Course and MediaWise
Checkology by the News Literacy Project – An interactive online course that teaches how to spot misinformation, verify sources, and evaluate evidence
Google Fact Check Tools – Use Google's Fact Check Explorer to search claims and see what reputable fact-checkers have said
Information Futures Lab at Brown University – A research and public engagement hub that develops strategies and tools to combat health and information harms, offering practical guidance on digital verification and media resilience
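Google's Fact Check Tools can also be queried programmatically. The sketch below uses the public `claims:search` endpoint of the Fact Check Tools API; the endpoint and response fields match the v1alpha1 API, but you need to supply your own API key (reading it from a `GOOGLE_API_KEY` environment variable is this example's assumption, not a requirement of the API).

```python
"""Minimal sketch: search published fact-checks for a claim via
Google's Fact Check Tools API (claims:search, v1alpha1)."""
import json
import os
import urllib.parse
import urllib.request

API_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def build_query_url(claim_text: str, api_key: str, language: str = "en") -> str:
    """Build the claims:search request URL for a claim to verify."""
    params = urllib.parse.urlencode(
        {"query": claim_text, "languageCode": language, "key": api_key}
    )
    return f"{API_URL}?{params}"

def summarize_claims(response: dict) -> list[dict]:
    """Reduce an API response to (claim, publisher, rating) summaries."""
    summaries = []
    for claim in response.get("claims", []):
        for review in claim.get("claimReview", []):
            summaries.append({
                "claim": claim.get("text", ""),
                "publisher": review.get("publisher", {}).get("name", ""),
                "rating": review.get("textualRating", ""),
            })
    return summaries

if __name__ == "__main__":
    key = os.environ.get("GOOGLE_API_KEY", "")  # assumed env var, see lead-in
    if key:
        url = build_query_url("Biden robocall New Hampshire", key)
        with urllib.request.urlopen(url) as resp:
            print(summarize_claims(json.load(resp)))
```

Each summary pairs a claim with who reviewed it and their textual rating (e.g., "False"), which is exactly the context the Fact Check Explorer surfaces in its web interface.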
Here's a filter for navigating today's information landscape, grounded in evidence from digital literacy experts and designed for the AI age:
1. SIFT before you share
The SIFT method, developed by digital literacy expert Mike Caulfield, consists of four essential moves:
STOP: Pause before reacting or sharing
INVESTIGATE the source: Who's behind this information?
FIND better coverage: What do other sources say?
TRACE claims back to their origin: Where did this really start?
2. Practice lateral reading
Don't just read down the page. Read across the web. Open multiple tabs to verify claims and check what other sources say about the same information. Professional fact-checkers spend more time investigating sources than reading content.
3. Triangulate across ecosystems
Don't rely on single sources or ideological bubbles. Tools like AllSides and Ground News help you see how different political perspectives cover the same story. If only one ecosystem is reporting something major, ask why.
4. Examine digital fingerprints
Look for verification signals, such as direct quotes with attribution, hyperlinks to primary sources, author credentials, and publication dates. Be especially wary of content lacking these digital fingerprints. They often signal manufactured information.
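Some of these signals can even be checked mechanically. The sketch below scans an article's HTML for outbound hyperlinks, an author byline, and a publication date, using only Python's standard library. The meta-tag names it looks for (`author`, `article:published_time`) are common publishing conventions rather than universal requirements, so treat missing signals as a prompt for closer reading, not proof of fabrication.

```python
"""Minimal sketch: scan article HTML for the verification signals
described above (outbound links, author, publication date)."""
from html.parser import HTMLParser

class FingerprintScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = 0
        self.author = None
        self.published = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and a.get("href", "").startswith("http"):
            self.links += 1  # count outbound hyperlinks to sources
        if tag == "meta":
            # conventional metadata names; not every site uses them
            name = a.get("name") or a.get("property")
            if name == "author":
                self.author = a.get("content")
            elif name == "article:published_time":
                self.published = a.get("content")

def scan(html: str) -> dict:
    """Return a summary of the page's digital fingerprints."""
    scanner = FingerprintScanner()
    scanner.feed(html)
    return {
        "outbound_links": scanner.links,
        "author": scanner.author,
        "published": scanner.published,
    }
```

A page with zero outbound links, no named author, and no date isn't automatically fake, but it has none of the fingerprints that make claims traceable.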
5. Check your emotional temperature
Ask yourself: Am I reacting or reflecting? AI-generated disinformation specifically targets emotional triggers. Manipulators aim to provoke rather than inform. Rage and fear are their preferred weapons.
6. Use technology to fight technology
Install browser extensions that help identify questionable sources. While NewsGuard faces some political controversy, alternatives like AllSides ratings and Media Bias/Fact Check provide helpful context. Ground News aggregates coverage across the political spectrum and accurately labels bias.
7. Prepare for future threats
Generative AI is advancing rapidly, and detection tools will always lag behind synthesis tools. Readers should stay informed about provenance technologies, such as C2PA (Coalition for Content Provenance and Authenticity), and push for the broader adoption of media authenticity standards. Platform responsibility and government oversight will be essential components of long-term resilience.
Mental Hygiene For The AI Era
You cannot process infinite information streams and remain clear-headed. Essential practices:
Time-boxing consumption: Check the news at set intervals, rather than continuously. Constant updates create an artificial sense of urgency while reducing comprehension and clarity.
Diversifying inputs: Read across disciplines and formats. Listen to long-form interviews or debates like those at Open to Debate (formerly Intelligence Squared U.S.). Seek out perspectives that challenge your assumptions.
Building delay into response: Avoid sharing or forwarding in emotionally charged moments. Give claims 24 hours. What feels urgent now may be irrelevant tomorrow or proven false.
Protecting cognitive resources: The American Psychological Association documents how unfiltered media exposure contributes to stress, anxiety, and decision fatigue. Sometimes logging off isn't avoiding reality; it's preserving the mental clarity you need to face it.
Learning to live with uncertainty: Perfect information is impossible. The goal involves making reasonable decisions with incomplete data, not eliminating all doubt.
Becoming A Force for Signal Over Noise
Individual actions add up to collective defense:
Verify before amplifying: Be the person in your network who checks sources before sharing. This simple step breaks the chains of viral misinformation.
Call out provable falsehoods: When you see demonstrably false information causing harm, speak up. Use evidence and alternative sources rather than partisan attacks.
Support quality journalism: Subscribe to outlets committed to verification standards. Fund fact-checking organizations. Quality information costs money to produce.
Teach critical thinking: Help young people develop skills for evaluating sources, recognizing manipulation techniques, and thinking independently. Focus on how to think rather than what to think.
Stay engaged: Cynicism is not wisdom. A functioning democracy requires discernment rather than disengagement. We need citizens who can navigate complexity, not retreat from it.
Design institutional resilience: Civic leaders, schools, and employers must treat information integrity as a strategic priority. Incorporating media literacy into curricula, supporting trusted local news, and investing in digital safeguards can help scale solutions beyond individual awareness.
The Path Forward
We don't need to believe everything we encounter. But we must believe in the possibility of truth. Facts exist. Evidence matters. Careful analysis can distinguish reliable information from manipulation.
We're building better systems for the future: stronger institutions, smarter technologies, and more skilled citizens. The alternative is a world where nothing can be verified, and everything is suspect. In that world, democracy cannot function, and science cannot advance.
The information war is a reality, and it demands a response. Every time we choose verification over virality, evidence over emotion, and patience over passion, we strengthen our collective defense against manipulation. This preserves both sanity and freedom.
For Leaders, Educators, And Institutions
If you're in a leadership position, such as in a school, newsroom, agency, or workplace, you're likely facing an institutional design issue. You shape the information norms others live within. The digital integrity of your organization is part of national resilience. Leaders should treat the risk of misinformation with the same seriousness as cybersecurity or supply chain reliability.
Equip your teams to recognize manipulation, reduce information overload, and rebuild trust in evidence. This goes beyond civic hygiene. It's strategic foresight. The lesson goes beyond whether AI threats were overblown. Our response matters. Communities with strong media literacy programs, fact-checking infrastructure, and diverse information diets proved more resilient to manipulation attempts.