
The rabbit hole and the way out: escaping YouTube’s conspiracy theory echo chamber 

Paper Title: “I think youtube’s turning me into a flat earther”: Social media’s role in ex-conspiracy theorists entering and exiting anti-scientific communities.

Author(s) and Year: Channais Matthias, Yael Benn, Ben Harkin, 2025

Journal: PLOS One (open access)

TL;DR: This study examined the experiences of four anonymous former conspiracy theorists as they transitioned into and out of anti-science communities. The researchers found that YouTube videos were one of their main gateways in, while empathetic guidance toward conclusions reached through critical thinking was their main way out.

Why I chose this paper: If you were to ask at any random moment of the day what I’m doing, the answer would probably be—for better or for worse—watching a YouTube video. While I enjoy how accessible scientific content is on this platform, this very trait can also make YouTube one of the most versatile tools for spreading disinformation. That’s why I found this paper so interesting, as it not only shows the role YouTube plays in attracting new conspiracy theorists, but it also highlights the limitations of the “debunking videos” that are often created to counter science deniers.

If you’re in the sciences, chances are you’ve had your fair share of interactions with conspiracy theorists. Whether they are anti-vaxxers or flat-earthers, you’ve probably tried to show them the facts that debunk their ideas, which, without fail, leaves them with an unchanged view and you with frustration. But according to this research, you may have been doing more harm than good.

The Background

Flat Earthers vs the Globe Bandwagon

The population’s trust in science and the development of science-based policies go hand in hand. If one is eroded, support for the other will dwindle. Unfortunately, now more than ever, false information and conspiracy theories are being easily shared through social media. 

YouTube, in particular, has become a hub for spreading anti-vax and flat-earth arguments, among others. However, few studies have looked into why this platform is so effective, and what about it draws people into these conspiracy communities and keeps them immersed.

To address this, the researchers, Matthias et al., interviewed four former conspiracy theorists. By analyzing their background and experiences, the authors aimed to explore the pressures that initially attracted these individuals to conspiracy groups, and to what extent YouTube videos played a role. The authors also examined what led the former conspiracy theorists to leave these communities, hoping to identify strategies for effectively correcting misinformation.

The Methods

The interviewees were four anonymous individuals from three countries spread across different continents. The researchers asked them questions that covered a range of topics, including what these former conspiracy theorists were initially searching for when they first came across anti-scientific videos, what kept them hooked, how their personal lives were affected by the conspiracy community, and why they began to doubt the communities they had joined.

The interviews were analyzed using Reflexive Thematic Analysis (RTA), a qualitative method that identifies patterns relevant to the research questions by combining what the participants said with the researchers’ own subjective interpretations. Patterns with similar meanings were then grouped into themes.

The Results

At the right place and the right time

The researchers identified four themes that encapsulated the journey these participants went through.

  • Theme 1: Gateway into the echo chamber

The participants indicated that significant and emotional life events, such as the end of a relationship, the death of a loved one, or the birth of their first child, coincided with the discovery of conspiracy videos. Loneliness and isolation, which often accompany these kinds of life events, led to binge-watching videos fed to them by the YouTube recommendation system, turning their feeds into echo chambers. “It just went from there,” one of the participants said.

  • Theme 2: The puzzle of scientific illiteracy

According to the participants, many people in conspiracy theory communities have a limited understanding of scientific concepts, mainly due to a lack of higher or science education. Conspiracy theories, therefore, offer “simple and accessible explanations for complex phenomena,” write the researchers. 

These communities will also cherry-pick scientific content to support their beliefs, and encourage others to “do your own research,” which mostly means watching more YouTube videos. The ease of access and sharing, plus the emotional appeal, make this platform their main source of information.

  • Theme 3: The pull and power of the conspiracy community

These groups offer a genuine sense of community and purpose, something the interviewees said their members often otherwise lack.

However, the participants agreed that once someone starts questioning other members’ arguments, those members go from being family to “lower than dirt.” Other examples they cited ranged from being ostracised and devalued to extreme cases where former members were harassed, doxed, and had their employers contacted.

  • Theme 4: Escaping the echo chamber

All participants reported (perhaps retrospectively, according to the researchers) never fully believing the community’s arguments as a reason for wanting to leave. However, they also all had help from external individuals who taught them how to debunk misinformation.

Nonetheless, severing bonds with these conspiracy communities can also mean returning to loneliness or even suffering economic losses, making leaving extremely difficult, and relapsing all too easy.

The researchers concluded that these themes could be applicable to other similar groups of people. Still, it should be emphasized that such a small sample size may not be representative.

The Impact 

The researchers’ suggestions

In light of their findings, the researchers propose three key intervention strategies.

A- Enhancing literacy among YouTube users:

The participants indicated that YouTube videos intended to debunk misinformation are often too complicated or condescending, so content creators should aim to make respectful and accessible content that encourages viewers to self-reflect, rather than simply “learn facts.” In other words, rather than making videos emphasizing how wrong conspiracy theorists are, content creators should focus on why these people’s logic might be flawed and help them reach more accurate conclusions.

B- Recognizing the influence of personal experiences:

Content creators should foster supportive YouTube communities where viewers can share their experiences while accessing factual information. However, the echo chamber can work both ways, so creators and consumers should avoid encouraging an “us vs them” mentality that alienates the very people they are trying to help.

C- Implementing empathic, platform-specific interventions:

Creators on YouTube should also take a compassionate approach when challenging inaccurate beliefs. As the authors mentioned, “empathic engagement, rather than confrontational fact-checking,” is a more effective way to encourage someone to reassess their beliefs. However, content creators also need to strike a balance between validating and correcting.

The researchers also argue that YouTube itself is responsible for ensuring that the platform promotes verified information, and they recommend “emotionally and personally directed interventions alongside conspiracy-related videos.”

Life after the echo chamber

By understanding why members are drawn to conspiracy theory groups, and more importantly, why some decide to leave, we can develop empathetic strategies that offer a safer way out to more people. Still, more studies focusing on former members and their exit journeys are needed before these findings can be considered conclusive.

Nevertheless, this study showed the importance of encouraging conspiracy theorists to reach conclusions through critical thinking, rather than simply instilling doubts and telling them how wrong they are.

But YouTube and its creators also need to do their part to make the platform double as a safe space for raising questions, rather than one that shuns individuals who will ultimately find answers elsewhere, even if those answers are not the correct ones.

My key takeaway:

If one of the most appealing characteristics of conspiracy groups is their community aspect, it should be all the more important to develop a healthy scientific community; one that welcomes and encourages newcomers, rather than pretending that science is for a privileged group that “just gets it.” Only then will we see the prevalence of conspiracy theories dwindle.

Alt text for featured image:  Hand holding a smartphone displaying the YouTube logo on a white background. A red light illuminates the scene from below, creating a red-to-dark gradient background.

Written by Diego Ramírez Martín del Campo

Edited by Alex Music and Krystal Vasquez

Featured image credit: Szabó Viktor