
10 Shocking Revelations From the Facebook Papers

Thousands of internal documents leaked by Frances Haugen have inspired a bonanza of reporting on the beleaguered social media giant. Here’s what you may have missed


Facebook CEO Mark Zuckerberg speaks at the Paley Center in New York, on Oct. 25, 2019.

Mark Lennihan/AP

Facebook has had a rough couple of months.

It started in early September, when The Wall Street Journal began publishing a series of reports based on tens of thousands of internal documents a whistleblower named Frances Haugen had turned over to the paper. “The Facebook Files,” as the stories were dubbed, revealed the extent to which the company was aware of the damage its platforms were doing to everything from the push to get Americans vaccinated to the self-esteem of teenage girls. Misinformation was spreading. Hate speech was rampant. People were even using Facebook to sell human organs. The leaked documents make clear that the company did little to stop any of it and, in some cases, deliberately buried damning internal research laying bare just how toxic the social network had become.

“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Facebook said after Haugen appeared on 60 Minutes earlier this month. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

On Sunday, Ben Smith of The New York Times reported that days after Haugen appeared on 60 Minutes, she agreed to share the trove of documents with 17 other news outlets, provided they agreed to hold any stories until a mutually agreed-upon date. These outlets spent the ensuing weeks not only combing through the leaked information but also conducting their own reporting into Facebook’s practices. The embargo has now been lifted, and on Monday the internet exploded with bombshell stories about the social media behemoth, many of them based on the documents Haugen provided.

It’s been a little overwhelming. Here’s what you may have missed:

1. Facebook ignored internal complaints about the spread of posts with political misinformation leading up to the 2020 election

According to a New York Times report published on Friday, freshly surfaced documents show that Facebook repeatedly ignored internal complaints about political posts packed with misinformation and conspiracy theories from before and after the 2016 election. While Facebook has publicly touted its efforts to police the platform and protect users, whistleblowers allege that much was swept under the rug as executives at the company turned a blind eye — presumably, they say, to let misinformation spread and keep site traffic high.

A few days before Election Day last November, one employee sounded an internal alarm to warn team members that “comments with the most incendiary election misinformation were being amplified to appear at the top of comment threads,” according to the Times. Then, on November 9th, a Facebook data scientist messaged colleagues with an eyebrow-raising stat: 10 percent of the U.S. users posting political content were claiming that Joe Biden’s win was fraudulent and that the process was rigged to keep Donald Trump from remaining in office. Additionally, some of these posts seemed to aim to incite violence.

2. Facebook didn’t do enough to tamp down Stop the Steal groups ahead of the insurrection: “Enforcement was piecemeal”

After Trump-supporting rioters stormed the U.S. Capitol on January 6th, some employees complained that Facebook had been ill-prepared to respond with the necessary speed. The New York Times recently reported that several messages from employees cited “dozens” of Stop the Steal groups pushing lies about the election that remained active right up until the insurrection. “I’ve always felt that on the balance my work has been meaningful and helpful to the world at large,” one message read. “But, honestly, this is a really dark day for me here.”

An internal review in March assessing the company’s handling of Stop the Steal groups found that “enforcement was piecemeal.”

3. Facebook watched as QAnon conspiracy theories mushroomed

Long before the election, employees noticed — and raised red flags over — the dangers of radicalization on Facebook, documents reportedly show, and years went by without change substantial enough to slow the snowball effect. In the summer of 2019, one of the company’s researchers created a fake account for an imaginary conservative woman residing in North Carolina, named Carol Smith, NBC News reported last week. The fictitious 41-year-old’s profile had no photo but plenty of interests, including Fox News, parenting, and Christianity. Within days, Facebook began recommending QAnon-related pages to “Smith.” In less than a month, her feed “became a constant flow of misleading, polarizing and low-quality content,” the researcher wrote. The researcher left Facebook altogether about a year after creating the account — just around the time the company began noticeably regulating and restricting the spread of QAnon content.

4. Facebook “routinely makes exceptions for powerful actors when enforcing content policy”

“Facebook routinely makes exceptions for powerful actors when enforcing content policy,” a data scientist wrote in documents created for a Facebook presentation, which Politico reported on Monday. This analyst also pointed out that “final calls about content policy are routinely made by senior executives … sometimes Mark Zuckerberg.” They added that “it’s unclear why executives would be consulted” and questioned “if there was an unwritten aspect to our policies, namely to protect sensitive constituencies.” The analyst referenced “many” communications with colleagues on Facebook’s content policy team, which is based in Washington, D.C., who feel “pressure to ensure their recommendations align with the interests of policymakers.”

According to information obtained by Politico, the lobbying and government relations team overseen by former Republican operative Joel Kaplan regularly weighs in on content and communications decisions involving right-wing figures and ads from former President Donald Trump, as well as “the aftermath of the George Floyd protests in June 2020.” Facebook claims that Kaplan’s group is “just one of many” the company consults in making content decisions.

5. Mark Zuckerberg bowed to Vietnam’s censorship demands

The Washington Post reported on Monday that in 2020 Mark Zuckerberg agreed to clamp down on rhetoric critical of the Vietnamese government after the Communist Party in charge of the country asked him to. If he didn’t comply, Facebook risked losing the estimated $1 billion in revenue generated by the company’s presence in the Southeast Asian nation. Facebook’s own transparency report revealed the impact of the decision. The company blocked 834 posts by Vietnamese users in the first half of 2020, compared with more than 2,200 in the second half.

Zuckerberg reportedly defended the decision by arguing that depriving the Vietnamese people of Facebook entirely would be worse for free speech than censoring them. But according to the advocates and activists interviewed by the Post, Facebook gave “the government near-total control over the platform.”

6. Documents show Facebook didn’t do much to stop the spread of violent rhetoric around the Ethiopian civil war

For over a year, Ethiopia has been mired in a turbulent civil war, and Facebook’s neglect of the country has allegedly made matters worse. Documents reviewed by CNN show that the company failed to give local teams the staffing and resources needed to navigate such a fragile period, despite employees repeatedly alerting the company to foreign organizations and armed groups spreading hateful, violence-inciting content. When Haugen came forward, she specifically cited “how badly Facebook is handling places like Ethiopia” as one of her reasons for doing so, adding that “the raw version [of Facebook] roaming wild in most of the world doesn’t have any of the things [related to online safety and curation] that make it kind of palatable in the United States” and expressing fear for “a lot of lives on the line.”

Facebook has publicly acknowledged Ethiopia as a problem and a priority, but the company doesn’t seem to be moving fast enough or forcefully enough. For one example, while Facebook gatekeepers have recommended taking down official accounts supporting the Fano militia — a group at the center of many controversies including killings, lootings, and rape, according to CNN — other employees say individuals promoting Fano messaging are still slipping through.

In the summer of 2020, one employee internally shared a report that “found significant gaps” in how Facebook monitors Ethiopian goings-on, detects hate speech, and flags misinformation. Still, Haugen argues that, as CNN reported, Facebook offers “even slight language support” in only two of the country’s many native languages. One researcher, Berhan Taye, penned an open letter begging Facebook to staff its local team with enough culturally knowledgeable people to properly respond to the hours of triggering footage created by the area’s millions of daily users. Taye said little has changed in over a year.

7. Apple almost booted Facebook and Instagram from the App Store because it wasn’t doing enough to stop human trafficking in the Middle East

Facebook was being used to buy and sell maids in the Middle East who were then abused — a human-trafficking problem that employees flagged and Facebook did little to correct. The Wall Street Journal reported on the issue in September, and the Associated Press followed up with a new report on Monday. It got so bad, the outlets noted, that Apple threatened to pull Facebook and Instagram from the App Store. Facebook admitted in internal documents that it was “under-enforcing on confirmed abusive activity,” and the AP notes that it still isn’t hard to find listings selling women for domestic help on the platform.

8. Facebook allowed Arabic hate content to spread across its platforms … a lot

At the end of 2020, Facebook realized it had a very real algorithm problem: Only 40 percent of Arabic hate content was being detected proactively. It was even worse on Instagram, which detected only six percent despite having 95 percent “more actioned hate speech violations” than Facebook in the Middle East and North Africa (MENA), according to an internal report created by a Facebook employee. This goes beyond posts and extends to online advertisements as well, according to the report, which cites “a lot” of harmful ads targeting women and the LGBTQ community. These ads, however, were rarely flagged by Facebook in MENA territories. Meanwhile, Facebook was incorrectly flagging terrorist content across the region, resulting in the wrongful deletion of nonviolent Arabic content 77 percent of the time, according to a Politico report published Monday.

9. Iraqi militias have been posting child nudity on rivals’ Facebook pages

Some supporters of Sunni and Shia militias have been posting graphic images, including child nudity, on their rivals’ Facebook pages in hopes of getting those adversaries’ accounts shut down, Politico reported on Monday. In another post reviewed by Politico, Islamic State fighters used a photo of Mark Zuckerberg to help a post applauding the killing of 13 Iraqi soldiers fly under the company’s radar. It’s unclear whether Facebook has done anything to respond to these specific incidents.

10. Facebook has been allowing anti-Muslim hate content to spread throughout India

When religious protests flooded India, Facebook’s biggest market, in late 2019, “inflammatory content” primarily targeting Muslims surged by 300 percent on the platform, The Wall Street Journal reported on Saturday. Documents show that Facebook researchers recommended the company remove one of two Hindu nationalist groups for posting hateful, anti-Muslim content — and that Facebook didn’t listen. (The other wasn’t deemed dangerous enough for removal, due to “political sensitivities,” despite things like “misinformation claiming the Quran calls for men to rape their female family members.”) Facebook rep Andy Stone responded to the Journal with a statement about the company’s “careful, rigorous, and multidisciplinary” banning processes.

Much like their Ethiopian counterparts, researchers in India asked for more resources to detect and police hate-spreading users, according to the Journal, which cited another report in which researchers flagged a different Hindu nationalist organization, the Bajrang Dal, for allegedly using the Facebook-owned WhatsApp to “organize and incite violence.” A spokesperson for the Bajrang Dal denied these claims to the Journal, then asked, “If they say we have broken the rules, why haven’t they removed us?”

Politico reported Monday on documents from an internal Facebook presentation in which one data scientist referenced an Indian politician who “regularly posted hate speech” but “was exempted by the Indian Public Policy team from normal punishment explicitly for political considerations.” A Facebook spokesperson “would only go as far as saying that this ‘wasn’t the sole factor,’” the analyst wrote.
