YouTube doesn’t exactly have a sterling reputation among parents these days. From reports of trolls gaming the algorithm to target kids with disturbing or violent content to parents freaking out over the (most likely extremely fake) “Momo” suicide challenge to the mere existence of the Paul brothers, YouTube is known for being something of a content cesspool. That reputation was only bolstered by recent reports that pedophiles were lurking in the comments section of kids’ videos, swapping contact info and even posting time-stamps where kids were featured in compromising poses.
In the wake of such bad press, YouTube is taking drastic measures to solve the problem: On Thursday, the video-sharing platform announced in a blog post that it would be disabling comments for all videos featuring minors, ostensibly as a way to protect kids from predators on the platform.
“We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience,” the blog post said. “At the same time, the important steps we’re sharing today are critical for keeping young people safe.”
YouTube CEO Susan Wojcicki also issued a statement on Twitter explaining the change:
Recently, there have been some deeply concerning incidents regarding child safety on YouTube. Nothing is more important to us than ensuring the safety of young people on the platform. More on the steps we're taking to better protect children & families: https://t.co/5ZYaMrMpsI
— Susan Wojcicki (@SusanWojcicki) February 28, 2019
The blog post said that over the past week, YouTube had quietly disabled comments on “tens of millions of videos that could be subject to predatory behavior,” though it’s unclear what exactly those criteria entail. The post also stated that while a “small number of creators” would be allowed to have comments sections on their videos, those creators would have to “actively moderate their comments” beyond YouTube’s existing moderation tools, and would have to “demonstrate a low risk of predatory behavior.” In an email to Rolling Stone, a spokesperson for YouTube reiterated the blog post: “Over the past week, we’ve been taking a number of steps to better protect children and families, including suspending comments on tens of millions of videos. Now, we will begin suspending comments on most videos that feature minors, with the exception of a small number of channels that actively moderate their comments and take additional steps to protect children.”
Again, it’s not totally clear from the post how creators would be expected to moderate these comments, nor what the standards are for demonstrating a low risk of predatory behavior.
What is clear, however, is that this is not a new problem. Back in 2017, the Times of London reported that ads for major brands like Adidas, eBay and Amazon ran on videos featuring scantily clad young women that had garnered hundreds of lascivious comments, prompting said advertisers to yank ad money from YouTube (and YouTube to subsequently remove 150,000 videos from the platform).
Also that year, tech writer James Bridle published an essay on Medium documenting how content creators exploit YouTube’s algorithm to market violent and disturbing content to children, such as videos of popular characters like Peppa Pig engaging in violent acts. In his essay, Bridle called out YouTube for failing to protect children from such content. In response, YouTube scrubbed many videos from the platform, and promised to hire and train employees to screen kids’ content.
Earlier this month, vlogger Matt Watson released a video alleging that YouTube was host to a “soft-core pedophile ring,” claiming that the platform turned a blind eye to adult users leaving sexualized comments on children’s videos and even swapping child porn with other users. The video, which garnered more than 3 million views, led to an onslaught of negative press, prompting advertisers like Disney, McDonald’s and Hasbro to pull ads from the platform.
Although YouTube’s new policy is clearly an attempt to remedy its latest PR disaster, it’s already sparking furor among its loyal base of content creators, many of whom rely on comments sections to connect with other YouTubers. (It’s also worth noting that many popular YouTubers are under the age of 18, as is much of the platform’s audience; according to data from Google, which owns the platform, 8 out of 10 18-to-49-year-olds in the United States watch YouTube.) Indeed, vloggers have already been posting videos angrily lashing out at YouTube for disabling their comments sections, claiming they are being punished for the actions of a small minority of predators:
But in the absence of moderators obsessively screening every single video on YouTube (a system, by the way, that hasn’t worked out super well for Facebook), it seems the concerns of YouTube users are taking a backseat to those of advertisers, who provide Google with nearly $3.9 billion in ad revenue, according to Statista. While it may be an unpopular decision, disabling comments on videos aimed at children seems to be the quickest way for YouTube to do damage control — and keep the money coming in at the same time. “We understand that comments are an important way creators build and connect with their audiences, [but] we also know that this is the right thing to do to protect the YouTube community,” the spokesperson told Rolling Stone.
Update: This story has been updated to include comment from YouTube.