
Live-Birth Bias: How Research Can Hide Harm by Design

What happens when the most telling data never makes it into the study?

What do you call a study on pregnancy outcomes that excludes miscarriages?

Apparently, peer-reviewed science.

Welcome to the subtle but serious world of live-birth bias—a research design flaw that doesn’t make headlines but quietly shapes what we believe about risk, safety, and certainty.

At Critical MindShift, we don’t claim to have the answers. But we do ask: Who decided the rules of what gets counted, and what gets conveniently left out? And what are the consequences of only counting the survivors?


Understanding Live-Birth Bias

Live-birth bias occurs when a study about pregnancy outcomes includes only live births, and excludes pregnancies that end in miscarriage, stillbirth, or termination. This is more common than you might think.

And it’s not always accidental.

By excluding adverse outcomes from the dataset, a study can unintentionally (or intentionally) downplay potential risks. The result? Seemingly reassuring conclusions—without ever grappling with what might have gone wrong.

If harm occurs before birth, and the study counts only babies born, how can we possibly detect that harm, let alone identify its cause?


Why It Matters

Pregnancy is a uniquely vulnerable state. What a mother is exposed to—medications, infections, environmental factors—can profoundly impact the fetus. When we study those effects, it matters what outcomes we count.

If 100 women receive a drug during pregnancy, and only 80 have live births, studying only those 80 may miss the very reason the other 20 didn’t make it to delivery.
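To make that arithmetic concrete, here is a toy simulation in Python. The numbers are invented for illustration (a hypothetical exposure that doubles the rate of pregnancy loss while leaving outcomes among live births unchanged); it is a sketch of the selection effect, not real data.

```python
import random

random.seed(0)

def simulate(n, loss_rate, complication_rate=0.05):
    """Simulate n pregnancies with made-up rates (illustration only)."""
    outcomes = []
    for _ in range(n):
        if random.random() < loss_rate:
            outcomes.append("loss")
        elif random.random() < complication_rate:
            outcomes.append("live birth, complication")
        else:
            outcomes.append("live birth, healthy")
    return outcomes

# Assumption for this sketch: exposure doubles pregnancy loss (20% vs 10%)
# but does not change complication rates among babies who are born.
exposed = simulate(100_000, loss_rate=0.20)
unexposed = simulate(100_000, loss_rate=0.10)

def complication_rate_among_live(outcomes):
    """The live-birth-only view: condition on surviving to delivery."""
    live = [o for o in outcomes if o != "loss"]
    return sum(o.endswith("complication") for o in live) / len(live)

def loss_rate_overall(outcomes):
    """The all-pregnancies view: count every enrolled pregnancy."""
    return outcomes.count("loss") / len(outcomes)

# Live-birth-only analysis: both groups look the same (~5% complications),
# so the study "finds no increased risk".
print(complication_rate_among_live(exposed),
      complication_rate_among_live(unexposed))

# All-pregnancies analysis: the doubled loss risk is plainly visible.
print(loss_rate_overall(exposed), loss_rate_overall(unexposed))
```

By construction, the harm in this sketch happens entirely before birth, so an analysis restricted to live births cannot see it no matter how large the sample.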

Would you trust a safety report that excludes the accidents?


Real-World Examples

This bias is not theoretical. It has shown up in multiple high-profile studies, including research on medications, vaccines, and environmental toxins.

In one recent example, a study on COVID-19 vaccine safety during pregnancy excluded over 20,000 pregnancies that did not end in a live birth from its analysis—yet still concluded there was no increased risk.

The authors acknowledged the omission. But the conclusion stood.

Regulators and media picked up the headline. Few questioned the math.


Is This Just Bad Science?

Not necessarily. Sometimes the data on miscarriages or stillbirths is incomplete or harder to access. But the issue is not just what’s missing—it’s how what’s missing is treated.

Were the exclusions clearly disclosed? Were alternative explanations considered? Did anyone attempt to model how the missing data could change the outcome?

Or were we reassured by results built on a subset that survived?
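One of those questions—modeling how the missing data could change the outcome—can be sketched as a simple bounding exercise. The figures below reuse the hypothetical 100-women example from earlier, with an assumed count of 4 adverse outcomes among the live births; they are illustrative assumptions, not data from any study.

```python
# Toy sensitivity bound, using assumed numbers from the 100-women example.
observed_adverse = 4   # assumed adverse outcomes among the 80 live births
live_births = 80
excluded = 20          # pregnancies that never reached delivery
total = live_births + excluded

# What the live-birth-only analysis reports.
rate_live_only = observed_adverse / live_births

# Best case: none of the excluded pregnancies reflect harm.
best = observed_adverse / total
# Worst case: all of them do.
worst = (observed_adverse + excluded) / total

print(f"live-birth-only rate: {rate_live_only:.1%}")
print(f"range once exclusions are modeled: {best:.1%} to {worst:.1%}")
```

Even this crude bound shows the point: a reassuring 5% becomes "somewhere between 4% and 24%" once the excluded pregnancies are acknowledged. Honest studies report that range, or something like it, instead of the single flattering number.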

It’s not the presence of uncertainty that’s the problem. It’s the illusion of certainty built on partial truths.


How Can We Do Better?

Countering live-birth bias isn’t about alleging conspiracy. It’s about questioning assumptions.

  • What outcomes were measured?
  • Who defined the criteria for inclusion?
  • Were the most vulnerable pregnancies counted—or discarded?
  • Is the conclusion one of safety, or one of statistical survival?

These are the questions researchers should be asking. And so should we.


Why This Isn’t Just About Pregnancy

Live-birth bias is a metaphor for something bigger: the way evidence is shaped, cleaned, or filtered to fit a preferred narrative. The principle applies far beyond maternal health. We see similar patterns in climate data, pharmaceutical trials, and even economic modeling.

When the data that doesn’t fit is excluded, the story we’re told becomes less about truth and more about convenience.


Truth doesn’t always survive the edit. But questions do. And that’s where we start.

This article is part of our “Automated Authority” series, exploring how design choices in science and technology shape what we believe. Read more in our investigation on algorithmic truth engines or explore the full series index.

Coming next in this series: What happens when entire categories of people—biological women, sex-based outcomes—are removed from research? We’ll explore how erasing sex and gender in science creates risk, not equity.


Image acknowledgment

The image on this page was created using Canva.com
