What’s inside the Facebook whistleblower papers?

Company's response to vaccine misinformation raises questions about whether it prioritised division over users' health


Seventeen US news organisations and a separate consortium in Europe published dozens of stories on Monday meticulously detailing how Facebook has stoked political and ethnic violence, sowed division and kept investors in the dark amid a drop in teenage users.

These organisations scoured thousands of pages of internal company documents obtained by Frances Haugen, a former Facebook employee-turned-whistleblower who claims the company has sidestepped user safety in favour of profits.

Here is what's inside the Facebook papers:

Vaccine misinformation in the US

Facebook failed to enact most recommendations from a March study on mitigating the spread of vaccine misinformation on its platforms, and the changes it did make came too late, The Associated Press reported on Tuesday.

The company's response to the study raises questions about whether it prioritised division and user engagement over its users' health.

The recommendations included altering how posts about vaccines are ranked in news feeds and temporarily disabling comments on vaccine posts, but critics said Facebook failed to enact these changes because doing so could have hurt its profits.

“Very interested in your proposal to remove all in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2.

But that suggestion went nowhere.

Some of the changes were not enacted until April, a month after the recommendations were made and during a crucial stage when Covid-19 vaccines were being distributed.

Company research in February found that 60 per cent of comments on vaccine posts were either anti-vaccine or vaccine hesitant, The Associated Press said.

Employees admitted they did not know how to detect those kinds of comments, and Facebook's lack of a policy for removing them allowed users to bombard posts from news outlets and humanitarian organisations with negative comments.

Facebook instead began adding labels to vaccine posts describing the vaccines as safe, a move that allowed the platform to keep engagement levels high.

Language gaps and fuelling ethnic violence

Amid a decline in users in the US and Western Europe, Facebook pushed for growth outside those regions. But the company failed to anticipate the consequences of registering millions of new users without the support staff and systems needed to curb hate speech and calls for violence, The Associated Press said.

In Afghanistan and Myanmar, extremist language has flourished due to a systemic lack of language support for content moderation. In Myanmar, it has been linked to atrocities committed against the country’s minority Rohingya Muslim population.

Facebook places Ethiopia, which has been embroiled in a civil war for the past year, in its highest-priority tier of countries at risk of conflict, but the company did little to limit posts inciting violence, CNN reported.

Documents viewed by CNN show employees warned managers that the social media platform was being used by “problematic actors” to spread hate speech.

Speaking before the UK Parliament on Monday, Ms Haugen likened the situations in Myanmar and Ethiopia to the “opening chapters of a novel that is going to be horrific to read”.


Facebook also struggled to moderate content throughout the Middle East because it lacked language support. Across the region, algorithms failed to detect terrorist content while erroneously deleting non-violent Arabic content 77 per cent of the time, Politico reported.

The company's automated systems deleted about 2 per cent of hate speech in 2019, NBC News said.

Delayed response to US Capitol insurrection

Internal documents reviewed by Politico and CNN suggest Facebook was slow to enact countermeasures against the “Stop The Steal” movement that culminated in the January 6 insurrection.

Content delegitimising US elections fell into “harmful non-violating categories” that stopped short of breaking Facebook's rules, leaving employees scrambling to respond as violence escalated at the US Capitol.

By midday on January 6, when rioters breached the Capitol, Facebook still had not turned on certain “break-the-glass” measures to limit misinformation, The Financial Times reported.

“How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?” one employee posted on a January 6 message board in response to a memo from Facebook chief executive Mark Zuckerberg and chief technology officer Mike Schroepfer.

“Rank-and-file workers have done their part to identify changes to improve our platform but have been actively held back.”

Facebook algorithms sow division

In 2019, Facebook researchers created three dummy accounts to study how the platform recommends content in its News Feed, and found that all three accounts were served increasingly partisan and extreme content within days.

The research shows the company was aware that its algorithms, which predict what content users will engage with, sow division and lead users “down the path to conspiracy theories,” NBC News said.


After rejigging its algorithm in 2018 in an attempt to increase engagement, the company found the change isolated users instead.

“We know that many things that generate engagement on our platform leave users divided and depressed,” one Facebook researcher wrote in a 2019 report.

The report specified the types of content users wanted to see more of, but Facebook ignored the findings for “business reasons”.

Investors left in dark over drop in teenage users

A report analysed by Bloomberg showed Facebook is struggling to retain its key demographic: teenagers and young people, who view the platform as an “outdated network”.

Time spent by US teenagers on Facebook dropped by 16 per cent year over year, fewer teenagers were signing up for the platform and young people were taking much longer to join Facebook than in years past.

Young adults “are choosing other apps to share day-to-day moments and life moments,” read one internal report obtained by Bloomberg.

Despite the decline in younger users, Facebook's audience has consistently grown for years and its market value is close to $1 trillion, meaning the shortcomings in its key demographic have been invisible to outsiders. Facebook does not break down its user numbers by age group.

Many Facebook and Instagram profiles are secondary accounts owned by one person, which Ms Haugen points to as evidence that Facebook misrepresents its numbers to advertisers.

Facebook's user growth and audience engagement are its two most important selling points for advertisers and investors.

Agencies contributed to this report
