What happens when inconvenient truths are edited out instead of investigated?
We live in a time when every bold claim, medical decision, and controversial opinion is just one step away from being “fact-checked.” At first glance, that sounds like a good thing. After all, who wouldn’t want accurate information?
But what happens when the mechanism used to validate truth becomes a tool to reinforce a particular worldview—or worse, suppress legitimate questions?
This article explores the increasingly blurry line between fact-checking and reframing. One corrects misinformation. The other edits reality to fit a preferred narrative.
Fact-Checking: The Ideal
In its truest form, fact-checking is one of the most important pillars of public discourse. When done right, it investigates claims thoroughly, provides source material, offers context, and helps the public make informed decisions. It asks, “Is this accurate?” and goes looking for answers with transparency and intellectual humility.
But increasingly, we see something else wearing fact-checking’s name tag.
Reframing: The Quiet Shift
Reframing happens when something isn’t actually fact-checked—it’s just reshaped to be easier to dismiss. A claim might be labelled false because it deviates from consensus, not because it’s provably incorrect. Or the focus shifts from what was actually said to what could be implied if taken out of context.
It’s subtle. It rarely looks like censorship. It sounds more like, “This has been debunked,” followed by a source that never actually addresses the original point.
When consensus becomes the threshold for truth, and nuance gets cut for convenience, we’re no longer fact-checking. We’re reframing.
Reframing is fact-checking’s polite, institutional cousin—the one who always agrees with authority, no matter the evidence.
Case in Point: Our Vaccine Study Critique
We recently published a critique of a major COVID-19 vaccine study that claimed to demonstrate safety in pregnancy. Our point was straightforward: the study excluded over 20,000 non-live pregnancies when evaluating outcomes like miscarriage and birth defects.
We didn’t claim vaccines were unsafe. We didn’t deny scientific consensus. We simply asked: how can a study assess risk if it removes the very outcomes it was supposed to track?
The fact-check that followed ignored this entirely. It reframed our article as anti-vaccine and flagged it as misleading—not because our facts were wrong, but because our conclusion didn’t align with the consensus.
This isn’t correction. It’s containment.
Consensus vs. Curiosity
There’s value in scientific consensus. It helps guide policy, build confidence, and streamline communication. But science itself was never meant to be static. It is a method—not a marketing campaign.
When dissent is viewed as dangerous, curiosity becomes a liability. And that’s a dangerous place to land.
The Real Danger of Reframing
When reframing becomes the dominant response to complexity, we lose more than just good journalism. We lose trust. We lose open debate. We lose the space to explore uncertainty without fear of reputational damage.
In time, the truth becomes whatever survives the edit.
So What Can We Do?
We can start by asking better questions. Reading beyond headlines. Paying attention to how criticism is handled—not just what’s being said, but how it’s being dismissed. We can value transparency over credentialism. And most of all, we can stay curious, even when it’s uncomfortable.
We don’t claim to have all the answers. But we do claim the right to ask better questions—without being misrepresented, reframed, or erased for doing so.
That’s not misinformation. That’s critical thinking.
This article is part of our “Automated Authority” series: exploring how machines, institutions, and algorithms are reshaping our relationship with truth. Start with the series introduction or read our investigation into Algorithmic Truth Engines: Why AI Can’t Be Trusted to Fact-Check Science.
Further Reading: Cutting Through the Noise
If this article resonated with you—if you’ve felt gaslit by headlines, baffled by contradictory “facts,” or simply unsure who to trust—you’re not alone. Here are some curated books and articles that dig deeper into the murky space between information, control, and critical thought.
Books
The following books are linked to Amazon.com for your convenience. If you decide to purchase through these links, we may earn a small commission — at no extra cost to you.
The Death of Expertise: The Campaign against Established Knowledge and Why it Matters
By Tom Nichols
A sharp look at how institutional trust has eroded, and what happens when everyone thinks they’re right—and anyone who questions consensus is shut down.
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
By Cathy O’Neil
An eye-opening account of how algorithms, built with bias and shielded from accountability, shape decisions in education, employment, and public perception.
The Black Box Society: The Secret Algorithms That Control Money and Information
By Frank Pasquale
Explores how powerful institutions use opaque algorithms to shape reality—and why we must demand transparency.
Articles & Reports
There’s a Missing Human in Misinformation Fixes – Scientific American
A reminder that automated systems can’t replace human discernment when it comes to truth and trust.
https://www.scientificamerican.com/article/theres-a-missing-human-in-misinformation-fixes/
How Facebook Got Addicted to Spreading Misinformation – MIT Technology Review
A deep dive into how platform algorithms prioritize engagement—even at the cost of accuracy.
https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation
Algorithmic Accountability: A Primer – Data & Society
A foundational guide to understanding the risks posed by unregulated algorithmic decision-making.
https://datasociety.net/library/algorithmic-accountability-a-primer/
Reframing may be subtle, but its impact is real. Keep reading. Keep questioning. Because reclaiming truth begins with recognizing where—and how—it’s being reshaped.
Image acknowledgment
The image on this page was created using Canva.com