“False flag” allegations in the aftermath of mass shootings have become something of an Alex Jones franchise. His conspiracy theories about the massacres at Tucson, Arizona; Aurora, Colorado; and Newtown, Connecticut were broadcast on terrestrial radio, streamed on Jones’ website, and, most importantly from an audience standpoint, chopped into bits for chumming views on YouTube. Last week, following the school shooting in Parkland, Florida, a YouTube clip on Jones’ channel, which suggested the student protesters from Marjory Stoneman Douglas were carpet-bagging “crisis actors,” became the top trending video on the site. But this time, YouTube did something different: It sanctioned Jones’ channel with a “strike” for violating the site’s community content standards. A few days later, a video titled, “David Hogg Can’t Remember His Lines In TV Interview” – which similarly questioned the authenticity of the 17-year-old student activist – earned Jones’ channel a second strike. According to the company’s policy, one more strike within the next three months and Jones’ channel would be permanently banned from the streaming service.
For a decade, Jones’ YouTube channel has been the engine behind Infowars, his Austin-based peddler of rightwing news, batshit-populism, conspiracy analysis and dubious nutritional supplements. The channel’s value has less to do with its long subscriber list – currently at 2.2 million – than with its role as a vortex for sucking in a steady stream of newbies. Thanks to YouTube’s carefully designed algorithms, Jones was a regular in “recommended” and “up next” lists, transfixing online souls with red-faced rants about the New World Order, feminizing juice boxes and DMT elves. When I interviewed Jones two years ago at the Republican National Convention, he twice mentioned the YouTube milestone coming into view. “We’re going to hit a billion views soon,” he said. “We’re reaching more people every day.”
Jones has responded to YouTube’s threats with a mix of bravado, victimhood and panic. The strikes, he said, are part of a “CNN lobbying campaign.” He’s launched a fundraising marathon, issued familiar denunciations of his would-be globalist censors and announced a series of legal actions “to defend the First Amendment.” But Jones knows as well as anyone that YouTube’s actions have nothing to do with his “revolution.” Just like Infowars, Google-owned YouTube is a company dedicated to making money; and for a long time the two worked in peaceful symbiosis to help each other make an awful lot of it.
For years, YouTube profited off all kinds of extremist content; its three-strike policy was directed at copyright infringement. Its current and newly aggressive posture towards content stems from the advertiser revolt that erupted following Trump’s surprise victory. Within weeks of the 2016 election, brands like Johnson & Johnson, and ad-tech companies like AppNexus, began taking steps to distance themselves from Breitbart and other purveyors of “fake news” and extremist content. In early 2017, companies like Starbucks and Walmart started pulling their ads from YouTube, worried that their marketing was sandwiched between clips featuring foaming-at-the-mouth racists and child abusers. In a watershed moment, the global buying agency Havas pulled its ads from Google/YouTube U.K., after the Times of London detailed how ads for well-known charities were supporting Neo-Nazi articles and videos. When the influential research group Pivotal downgraded Google stock from a buy to a hold, Google suddenly grew concerned about the kind of content its proprietary algorithms had been promoting for years – intentionally and by design.
This is not a conspiracy theory worthy of a “strike,” but the testimony of a former YouTube engineer named Guillaume Chaslot, who was profiled by the Guardian in early February. Chaslot, who holds a Ph.D. in artificial intelligence, explained how his team at YouTube was tasked with designing algorithms that prioritized “watch time” alone. “Watch time was the priority,” he told the paper. “Everything else was considered a distraction… There are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see… I tried to change YouTube from the inside, but it didn’t work.”
When Chaslot conducted an independent study of how his algorithms worked in the real world, he found that during recent elections in France, Germany and the U.K., YouTube “systematically amplifie[d] videos that are divisive, sensational and conspiratorial.” (His findings can be seen at Algotransparency.org.) At the height of the advertising revolt, in March of last year, YouTube announced that it was “taking a hard look at our existing community guidelines to determine what content is allowed on the platform – not just what content can be monetized.” CEO Susan Wojcicki announced the company would hire thousands of human moderators to watch and judge all content on the site.
YouTube’s new policies were part of an industry-wide course correction. Over the past year, under the banner of combating hate speech and fake news, Google and Facebook began to cut off search traffic and monetized content-creator accounts, not only to dangerous scam-artists like Jones, but to any site that garnered complaints or didn’t meet newly enforced and vaguely defined criteria of “credible” and “quality.”
A number of left-leaning sites of substance, including AlterNet.org and Truthout.org, along with mainstream human rights groups like Amnesty International, were affected. Although “conspiracies,” like allegations of CIA drug trafficking, tend to be better sourced than “Pizzagate,” the algorithm didn’t seem to make a distinction. As a result, these sites saw their traffic fall off a cliff. “Many of the largest progressive news organizations watched their online audiences from Google and Facebook decline as much as 60 percent in 2017,” says Jan Frel, associate publisher of AlterNet.org. “And they’re still falling.”
Jones’ many critics might keep this in mind as they indulge in some easy schadenfreude. Beneath all the noise, an Internet designed to calm and please advertisers is quietly kneecapping small and independent news sites. It is being sold under the guise of fighting “fake news” and Russian disinformation, and any focus on Alex Jones conveniently deflects attention from the larger implications of this shift. As loathsome and dangerous as Jones’ schtick has become, it would be a mistake to think of his newly “public-minded” enemies in Big Tech as benevolent protectors.
Which isn’t to say anyone should weep for Alex Jones, either, or come to his defense. He’s guilty as charged, and he’ll absorb the loss of his YouTube channel by selling right-wing rage and virility pills on terrestrial radio and the Infowars website. Given Jones’ record of being ahead of the technology curve, he might even find a new outlet for influence that the rest of us have yet to figure out.