
YouTube to Ban Videos Promoting Extremist, Supremacist Views

Policy change will target clips glorifying Nazism, denying tragedies like Holocaust, Sandy Hook shooting


YouTube has announced a new plan to ban and remove videos promoting extreme views such as white supremacy, sexism and racism in its latest attempt to rid the site of hate speech.

The company, which is owned by Google, detailed its plan in a blog post Wednesday, writing that the new rules will specifically prohibit “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory.”

YouTube added that the new policy will also target videos that try to deny that “well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.”

Over the past few years, YouTube, Twitter, Facebook and other social media giants have faced criticism for not doing enough to stymie the spread of hate speech, extremism and “fake news” on their platforms. The tech giants have enacted some changes in response, as when Facebook followed a 2018 ban of Alex Jones and InfoWars by similarly de-platforming extremists like Laura Loomer, Milo Yiannopoulos and Louis Farrakhan in May.

YouTube, for its part, has previously prohibited ads from running on videos that promote hateful content, and in 2017 it introduced policies meant to curb the spread of such videos by limiting or disabling sharing and commenting features. But until now, the site had not taken steps to outright ban certain videos or personalities. (YouTube has not specifically named any pages, videos or personalities that will be de-platformed under the new policy.)

Along with its decision to remove hateful and supremacist content, YouTube said it will continue to improve its powerful recommendations algorithm, which has been accused of leading viewers to ever more extreme content in order to keep them on the site. In January, the company announced an update to the algorithm meant to target the spread of “borderline content” — videos that may include misinformation but don’t actually violate the company’s policies — and it now claims the tweak has caused recommendations to such videos to drop by more than 50 percent in the United States.

YouTube said the algorithm will continue to get better at handling borderline content, and the company plans to bring the updated algorithm to more countries by the end of the year. Along with trying to cut the flow of misinformation, YouTube said it will start promoting more authoritative sources in the “watch next” panel that follows a piece of borderline content.

YouTube also plans to strengthen its advertising policies and will suspend channels that “repeatedly brush up against our hate speech policies” from the YouTube Partner Program, meaning they will no longer be able to run ads or use other monetization features.
