
Study Shows How the ‘Intellectual Dark Web’ Is a Gateway to the Far Right

People don’t just adopt far-right extremist views overnight — they have to become radicalized, and YouTube is the perfect place for that to happen

Interest in provocative academics like Jordan Peterson and commentator Ben Shapiro, who are considered part of the "Intellectual Dark Web," has been shown to be a precursor to far-right radicalization.


When we talk about the far right, it’s generally agreed upon that its (mostly white, mostly male) members aren’t born, but made, usually following an intense process of radicalization. But the radicalization process is akin to that of a frog being dropped in a pot of lukewarm water: The temperature has to be slowly turned up over the course of an extended period of time, degree by degree, until the frog becomes so accustomed to its surroundings that it doesn’t even realize it’s being boiled alive.

A new study from the Federal University of Minas Gerais (UFMG) in Brazil sheds some light on this phenomenon. The study analyzed more than 331,000 videos from what the study authors categorize as a broad, right-wing spectrum to paint a portrait of exactly how viewers become acclimated to increasingly far-right views — and the central role that YouTube’s algorithm, which recommends related videos for its users, plays in the radicalization process.

“Other researchers, NGOs, and the media have indicated or hypothesized that this radicalization process [has] occurred” for those on the far right, Manoel Ribeiro, the lead author for the study and a PhD candidate at EPFL in Switzerland, tells Rolling Stone. “What drove us to do this project was that we wanted to be able to quantitatively assess, to find if this process really existed, and if it was significant. In a way, we wanted to come up with a methodology that ended this discussion for good.”

To trace precisely how the YouTube radicalization process happens, the study’s authors started by looking at videos from content creators associated with the “Intellectual Dark Web,” a term used to identify a community of thinkers and writers who don’t explicitly self-identify as conservative, but whose views fall outside the mainstream left-leaning establishment. The study classified right-wing commentator and podcast host Ben Shapiro and Jordan Peterson, the controversial psychology professor and author who has criticized, among other things, the concept of white privilege and the use of gender-neutral pronouns on campuses, as examples of prominent IDW thinkers.

In the middle of the spectrum, the study looked at content creators who flirt with far-right extremist ideology without openly self-identifying as white supremacists, which it classified as “alt-lite.” Creators in this category included Stefan Molyneux and Steven Crowder, whose channel was demonetized after Crowder used racist and homophobic slurs to mock gay journalist Carlos Maza. On the far-right end of the spectrum were alt-right vloggers like Faith Goldy, who has been referred to by The Cut as a “white nationalist poster girl” and appeared on the podcast of the neo-Nazi website the Daily Stormer; and James Allsup, a member of the white supremacist group Identity Evropa (whose account was banned by the platform as of Tuesday). The researchers defined all three as “contrarian” communities, or groups formed in opposition to the perceived wave of political correctness on the left.

Depressingly, but perhaps unsurprisingly, the researchers found that all three communities had grown exponentially since 2015, the year Donald Trump announced his candidacy for president. But they also wanted to determine whether there was any weight to previous claims by people like former far-right extremist Caleb Cain, who was featured in a New York Times report analyzing his YouTube history, that IDW and alt-lite content could lead to increasing interest in extremist white-nationalist content. After looking at more than 79 million comments on hundreds of thousands of videos, they found that, as Ribeiro puts it, “there is migration among users from the Alt-lite and the I.D.W. to the Alt-right,” confirming that less extreme right-wing content did indeed serve as a gateway of sorts for radicalization.
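To make the comment-based approach concrete, here is a minimal sketch of how migration between communities can be estimated from commenting histories. It is not the researchers’ actual pipeline; the user IDs, category labels, and records below are hypothetical, and a real analysis would run over tens of millions of comments with far more careful user tracking.

```python
# Hypothetical sketch: count users whose first IDW/alt-lite comment predates
# their first alt-right comment, i.e. users who appear to "migrate."
from collections import defaultdict

# Toy comment records: (user_id, channel_category, year of comment)
comments = [
    ("user_a", "IDW", 2016), ("user_a", "alt-lite", 2017), ("user_a", "alt-right", 2018),
    ("user_b", "IDW", 2017), ("user_b", "IDW", 2018),
    ("user_c", "alt-lite", 2016), ("user_c", "alt-right", 2017),
]

# For each user, record the first year they commented in each category.
first_seen = defaultdict(dict)
for user, category, year in comments:
    prev = first_seen[user].get(category)
    if prev is None or year < prev:
        first_seen[user][category] = year

# A user counts as "migrated" if an IDW or alt-lite comment came before
# their first alt-right comment.
migrated = [
    user for user, seen in first_seen.items()
    if "alt-right" in seen
    and any(seen.get(c, float("inf")) < seen["alt-right"] for c in ("IDW", "alt-lite"))
]
exposed = [user for user, seen in first_seen.items() if "IDW" in seen or "alt-lite" in seen]

print(f"{len(migrated)} of {len(exposed)} IDW/alt-lite commenters "
      f"later commented on alt-right channels")
```

Tracking the first year a user comments in each category, rather than raw comment counts, is one simple way to capture the direction of movement over time.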

As for whether YouTube’s recommendation algorithm played a role in introducing such content to viewers, the researchers noted that the algorithm did not recommend the more extreme white-supremacist content (i.e., videos in the “alt-right” bucket). It did, however, frequently recommend so-called IDW and “alt-lite” content, and the study noted that the algorithm sometimes recommended alt-right channels, if not their individual videos.

Critics have noted some shortcomings to the study. For starters, it did not determine exactly how long this radicalization process takes (though Ribeiro says he hopes to answer that question in the future). Further, online commenters (and YouTube commenters specifically) make up a relatively small subset of total viewers, with only the most engaged among them actually taking the time to post; so drawing broader conclusions about YouTube audiences from comment data alone is hardly foolproof.

Further, as the researchers themselves note, YouTube has faced an immense amount of criticism for turning a blind eye to, or even actively boosting, extremist content, and has made a number of policy changes accordingly, including banning content such as “videos alleging that a group is superior in order to justify discrimination, segregation, or exclusion” and trying to recommend more authoritative content. YouTube also discontinued its “related channels” feature, which figures heavily in the study, back in May, according to a post on the YouTube community forum. But even in light of these changes, the lack of transparency surrounding YouTube’s algorithms makes it difficult to know how effective they have been, or whether an even broader swath of YouTube users was exposed to such content before the researchers conducted their study.

In a statement sent to Rolling Stone, Farshad Shadloo, a YouTube spokesperson, says: “Over the past few years, we’ve invested heavily in the policies, resources and products needed to protect the YouTube community. We changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations and begun reducing recommendations of borderline content and videos that could misinform users in harmful ways. Thanks to this change, the number of views this type of content gets from recommendations has dropped by over 50% in the U.S. While we welcome external research, this study doesn’t reflect changes as a result of our hate speech policy and recommendations updates and we strongly disagree with the methodology, data and, most importantly, the conclusions made in this new research.”

Ribeiro says his team did not have any contact with YouTube while conducting the study, though he notes that the funding came in part from an academic award from Google, which owns the platform. He also says the study has drawn some backlash from some of the channels implicated, though he’s quick to note that was not his intention. “The paper isn’t about attributing responsibility or pointing fingers at anyone, but about measuring the strength of an effect, and it should primarily be read that way,” he says. “That said, we hope our work can indeed serve as a starting point for starting to think about how to decrease this effect.”

One possibility would be to tweak the algorithms slightly to recommend a more diverse range of content to viewers, reflecting a wider range of viewpoints and mitigating the effect of the online “echo chamber,” as Ribeiro puts it. But he’s quick to acknowledge that there will likely be no one “silver bullet” to remedy the issue. Nonetheless, the researchers’ work confirms what many who study the far right and extremist fringe groups have long hypothesized: that radicalization does not occur overnight, but is instead the culmination of a lengthy, extensive process in which the platform’s algorithms play a central role.

Wed., Aug. 28, 3:11 p.m.: This story has been updated to include comment from YouTube. 
