Seventeen US news organisations and a separate consortium in Europe <a href="https://www.thenationalnews.com/business/technology/2021/10/25/facebook-papers-reveal-deep-conflict-between-profit-and-people/" target="_blank">published dozens of stories</a> on Monday meticulously detailing how Facebook has stoked political and ethnic violence, sowed division and kept investors in the dark amid a drop in teenage users.

These organisations scoured thousands of pages of internal company documents obtained by <a href="https://www.thenationalnews.com/business/2021/10/05/facebook-fuelling-world-violence-whistle-blower-testifies/" target="_blank">Frances Haugen</a>, a former <a href="https://www.thenationalnews.com/business/technology/2021/10/25/facebook-q3-profit-up-17-despite-multiple-controversies/" target="_blank">Facebook</a> employee turned whistleblower who claims the company has sidestepped user safety in favour of profits.

Here is what's inside the Facebook papers:

Facebook failed to enact most recommendations from a March study on mitigating the spread of vaccine misinformation on its platforms, and the changes it did make came too late, The Associated Press reported on Tuesday. <a href="https://www.thenationalnews.com/world/us-news/2021/07/16/covid-19-misinformation-on-social-media-is-killing-people-biden-says/" target="_blank">The company's response</a> to the study raises questions about whether it prioritised division and user engagement over its users' health.

The recommendations included altering how posts about vaccines are ranked in news feeds and temporarily disabling comments on vaccine posts, but critics said <a href="https://www.thenationalnews.com/world/europe/facebook-fails-to-clamp-down-on-covid-19-misinformation-in-europe-1.1208168" target="_blank">Facebook failed to enact these changes</a> because they could have hurt profits.

“Very interested in your proposal to remove all in-line comments for vaccine posts as a stopgap solution until we can sufficiently detect vaccine hesitancy in comments to refine our removal,” one Facebook employee wrote on March 2. But that suggestion went nowhere. Some of the changes were not enacted until April, a month after the recommendations were made and during a crucial stage of Covid-19 vaccine distribution.

Company research in February found that 60 per cent of comments on vaccine posts were either anti-vaccine or vaccine-hesitant, The Associated Press said. Employees admitted they did not even know how to catch those kinds of comments, and Facebook's lack of a policy for taking them down allowed users to bombard posts from news outlets and humanitarian organisations with negative comments.

Facebook instead began labelling vaccine posts with messages describing the vaccines as safe, which allowed the platform to keep engagement high.

Amid a decline in users in the US and Western Europe, Facebook pushed for growth outside those regions. But the company failed to anticipate the unintended consequences of registering millions of new users without the support staff and systems to curb hate speech and calls to violence, The Associated Press said. In Afghanistan and Myanmar, extremist language has flourished owing to a systemic lack of language support for content moderation.
In Myanmar, it has been linked to atrocities committed against <a href="https://www.thenationalnews.com/world/asia/rohingya-crisis-from-the-killing-fields-to-exile-without-hope-1.766534" target="_blank">the country’s minority Rohingya Muslim population</a>.

Facebook places <a href="https://www.thenationalnews.com/world/africa/2021/10/22/ethiopia-air-strikes-hit-tigray-for-fourth-time-in-a-week/" target="_blank">Ethiopia</a>, which has been embroiled in a civil war for the past year, in its highest-priority tier for countries at risk of conflict, but the company did little to limit posts inciting violence, CNN reported. Documents viewed by CNN show employees warned managers that the social media platform was being used by “problematic actors” to spread hate speech.

Speaking before the UK Parliament on Monday, Ms Haugen likened the situations in Myanmar and Ethiopia to the “opening chapters of a novel that is going to be horrific to read”.

Facebook also struggled to moderate content throughout the <a href="https://www.thenationalnews.com/world/uk-news/2021/09/15/facebook-told-to-get-independent-reviewer-amid-claims-of-bias-on-israel-palestine-content/" target="_blank">Middle East</a> because it lacked language support. Across the region, algorithms failed to detect terrorist content while erroneously deleting non-violent Arabic content 77 per cent of the time, <i>Politico</i> reported. The company's automated systems deleted about 2 per cent of hate speech in 2019, NBC News said.

Internal documents reviewed by <i>Politico</i> and CNN suggest Facebook belatedly enacted countermeasures in the lead-up to the “Stop The Steal” movement that culminated in the <a href="https://www.thenationalnews.com/world/us-news/2021/10/14/us-insurrection-investigators-to-vote-on-holding-trump-ally-steve-bannon-in-contempt/" target="_blank">January 6 insurrection</a>. Content delegitimising US elections fell into “harmful non-violating categories” that stopped short of breaking rules, leaving employees to scramble in response to the escalating violence at the US Capitol. By midday on January 6, when rioters breached the Capitol, Facebook still had not turned on certain “break-the-glass” measures to limit misinformation, <i>The Financial Times</i> reported.

“How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today?” one employee posted on a January 6 message board in response to a memo from Facebook chief executive Mark Zuckerberg and chief technology officer Mike Schroepfer. “Rank-and-file workers have done their part to identify changes to improve our platform but have been actively held back.”

In 2019, Facebook researchers created three dummy accounts to study how the platform recommends content on its News Feed, and found that all three accounts were served increasingly partisan and extreme content within days. The research shows the company was aware that its algorithms, which predict the content users will engage with, sow division and lead users “down the path to conspiracy theories”, NBC News said.

Despite rejigging its algorithm in 2018 to increase engagement, the company found the change isolated users instead. “We know that many things that generate engagement on our platform leave users divided and depressed,” one Facebook researcher wrote in a 2019 report. The report specified the types of content users wanted to see more of, but Facebook ignored the findings for “business reasons”.
A report analysed by <i>Bloomberg</i> showed Facebook is struggling to retain its key demographic: <a href="https://www.thenationalnews.com/business/2021/09/30/us-senators-grill-facebook-over-leaked-report-depicting-instagram-as-harmful-to-teens/" target="_blank">teenagers and young people</a>, who view the platform as an “outdated network”. Time spent by US teenagers on Facebook dropped 16 per cent year on year, fewer teenagers were signing up for the platform and young people were taking much longer to join Facebook than in years past.

Young adults “are choosing other apps to share day-to-day moments and life moments”, read one internal report obtained by <i>Bloomberg</i>.

Despite the decline in younger users, Facebook's audience has grown consistently for years and its market value is close to $1 trillion, meaning the weakness in its key demographic has been invisible to outsiders. Facebook does not break down its user numbers by age group.

Many Facebook and Instagram profiles are secondary accounts owned by a single person, which Ms Haugen points to as evidence that Facebook misrepresents its numbers to advertisers. User growth and audience engagement are Facebook's two most important selling points for advertisers and investors.

<i>Agencies contributed to this report</i>