Facebook Bans New Political Ads a Week Before Election Day - Rolling Stone

Facebook Excited to Announce Election-Week Pause in New Disinformation

For the week before the election, Mark Zuckerberg will take a stand for democracy by only allowing recycled bullshit on his platform


Facebook CEO Mark Zuckerberg

Andrew Harnik/AP Images

In what CEO Mark Zuckerberg described as an effort to “protect our democracy,” Facebook on Thursday announced a series of meager safeguards aimed at preventing the spread of misinformation ahead of the election — or at least at insulating the company from criticism that it didn’t take sufficient action.

In a lengthy post, Zuckerberg wrote that Facebook will block campaigns from purchasing new political ads a week before the election. Facebook will also take various actions to prevent the spread of misinformation that could lead to voter suppression, such as creating a Voter Information Center to appear at the top of its News Feed, and removing or flagging posts with false information about the voting process.

Though campaigns will be unable to purchase new political ads from October 27th up until the election on November 3rd, existing political ads will remain in place, as will the ability of campaigns to target users with those ads. Zuckerberg rationalized this on Thursday by noting that these existing ads “will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them.”

Placing the onus on journalists and users (or anyone but itself, really) has been a theme of Facebook’s approach to combating misinformation, which has come under heavy scrutiny since Russia co-opted the platform to influence the 2016 election. Though Facebook spent billions to beef up security, the spread of misinformation has exploded. “I would actually say we’re in a worse place because the amount of domestic misinformation is much more serious than what we saw in the lead-up to 2016,” Claire Wardle, executive director of misinformation nonprofit First Draft, recently told Rolling Stone.

Facebook’s response essentially has been to throw up its hands and point to the First Amendment, especially regarding highly visible political ads. When Rep. Alexandria Ocasio-Cortez (D-N.Y.) confronted Zuckerberg last October about Facebook’s decision not to fact-check these ads, the CEO said he believes “lying is bad” before arguing that users should “judge for themselves” whether information coming from a political campaign is accurate.

President Trump’s campaign has taken full advantage of Facebook’s laissez-faire approach, routinely buying ads to spread disinformation about Joe Biden, from falsely accusing the former vice president of trying to bribe Ukraine, to, more recently, falsely claiming he wants to defund the police. Though Facebook’s independent network of fact-checkers unanimously agreed the latter accusation was false, the claim remained on the platform and, according to the Washington Post, had been served at least 22.5 million times across over 1,400 ads as of early August.

The new suite of changes Zuckerberg announced on Thursday does nothing to curtail the Trump campaign’s ability to spread disinformation on the platform, outside of preventing it from buying new ads in that seven-day window before Election Day. After November 3rd, it will once again be able to do so, even if the election results still hang in the balance.
