Just as America was despondently staring down a 2022 of school closures, Covid-19 exposures, repeated quarantines, and testing shortages, lo, the universe delivered up a New Year’s miracle: Twitter permanently suspended Marjorie Taylor Greene’s personal account. No more can @mtgreenee wantonly tweet that the 2020 election was stolen, that Democrats are plotting a Communist takeover, that Fauci should be fired, or that Covid vaccines have led to large numbers of ignored deaths (the tweet that finally, belatedly, got her deplatformed). No longer can Greene egg on 465,000 followers to behave in a manner that endangers their health and the welfare of those hapless citizens who run afoul of them. In these dark times, such a piece of good news seems cause for jubilation.
Or does it? No sooner had Greene been suspended than she huffed off to GETTR — a social media platform run by a former Trump adviser — to flex that “Twitter is an enemy to America and can’t handle the truth,” adding that she’ll “show America we don’t need them and it’s time to defeat our enemies.” In short order, then, one began to wonder if maybe she was right, if maybe social media bans just create an elaborate game of whack-a-mole, sending offenders off to GETTR pastures where they can reach just as many like-minded Americans as they did before. Luckily, a number of studies have sought to suss out whether booting bad actors from social platforms is actually effective. The results are mixed but not entirely discouraging.
The least promising offering, from the May 2021 issue of the journal Information, Communication & Society, studied the effects of Facebook banning two high-profile Covid-19 conspiracy theorists, David Icke and Kate Shemirani. The study found that the conspiracy theories themselves did not seem to suffer from the deplatforming of two of their loudest cheerleaders, and that, in fact, there was ample evidence of a “compensatory ‘blowback’ reaction, whereby multiple new profiles representing the banned person were created” in order to perpetuate their disinformation campaigns. In the case of Icke, for instance, 18 public Facebook pages using his name were created the year he was deplatformed, 10 of them that very month. Icke himself may have been off lurking in the digital shadows, but unsurprisingly, his ideas lived on in so-called “minion” accounts.
Better news could be found in “Deplatforming the Far-Right: An Analysis of YouTube and BitChute,” co-authored this past June by a researcher affiliated with Harvard’s Berkman Klein Center for Internet & Society. The study focused on the reach of far-right channels that had decamped to BitChute after removal from YouTube, and found that “deplatforming is effective in minimizing the reach of disinformation and extreme speech, as alternative platforms that will allow this kind of content cannot mitigate the negative effect of being deplatformed on YouTube.” What is BitChute, exactly? No one knows, which goes a long way toward explaining this data.
GETTR’s not quite a banner name, either (though it does sound disturbingly pervy), so it was gratifying to check out Volume 5 of Proceedings of the ACM on Human-Computer Interaction, which contained “Evaluating the Effectiveness of Deplatforming as a Moderation Strategy,” a case study of what happened when three high-profile right-wingers — Alex Jones, Milo Yiannopoulos, and Owen Benjamin — were kicked off Twitter. Not only did the study find that this “significantly reduced the number of conversations about all three individuals on Twitter,” but that it also led to a decline in “the overall activity and toxicity levels of supporters.” Removing someone from Twitter, the study found, had a direct and ameliorating effect.
All of which seems to indicate that though deplatforming Marjorie Taylor Greene may not make her toxic conspiracy theories go away, it could make her go away — at least to some extent. And that’s the best news I’ve heard all year.