<a href="https://www.thenationalnews.com/world/2021/10/20/facebook-plans-name-change-as-trust-for-brand-falls-in-middle-east/" target="_blank">Facebook</a> is not being “fully forthcoming” and transparent with users, the technology company's independent Oversight Board said in its latest report on Thursday. The <a href="https://scontent.fykz1-1.fna.fbcdn.net/m1/v/t0.41931-6/An_SieDzRxmkwChl2RsnXhvhnTW0JDTT6FvN4oTC3_jNiSjlSxLA5qG053VPQ_n79Bu5sOzX7hzVI0ROPFWlMX1OjtlK8EGtX5cdDZjtYxK1-R_BJ0WLx8lzsZ3rPVKsr16YrH-eb41gChD7lWQauLjr2g?ccb=10-5&oh=59a6a64c82a0a1472dff39f3bb2070a1&oe=6177406F&_nc_sid=340960" target="_blank">report</a>, which covered the fourth quarter of 2020 and the first half of this year, said transparency is clearly an area where Facebook is “falling short and must urgently improve”.

The Oversight Board, which was established in November 2018 to promote free expression on Facebook and Instagram, said the social media giant is unclear about how it exempted some high-profile users from its rules.

“The team tasked with providing information has not been fully forthcoming on cross-check. On some occasions, Facebook failed to provide relevant information to the board, while in other instances, the information it did provide was incomplete,” the board said.

When Facebook referred the case related to former US president Donald Trump to the board, it did not mention the cross-check system, the board added. “Given that the referral included a specific policy question about account-level enforcement for political leaders, many of whom the board believes were covered by cross-check, this omission is not acceptable.”

“Facebook only mentioned cross-check to the board when we asked whether Mr Trump’s page or account had been subject to ordinary content moderation processes,” the board said.
Facebook and its billionaire founder and chief executive <a href="https://www.thenationalnews.com/business/money/2021/10/05/mark-zuckerbergs-net-worth-drops-by-7bn-as-facebook-shares-plummet/" target="_blank">Mark Zuckerberg</a> have come under increasing criticism over the company’s practices and policies since whistle-blower <a href="https://www.thenationalnews.com/business/2021/10/05/facebook-fuelling-world-violence-whistle-blower-testifies/" target="_blank">Frances Haugen</a> testified before Congress on October 5.

Ms Haugen, who began working for the company in 2019 and resigned in April 2021, leaked internal documents to <i>The Wall Street Journal</i>, the Securities and Exchange Commission, Congress and other news outlets. The former Facebook employee told a Senate commerce subcommittee hearing that Facebook's algorithms promote posts with high levels of engagement, often pushing harmful or divisive content to users.

The Oversight Board said it is developing recommendations on how to improve Facebook. Pushing Facebook to be more transparent, to treat users fairly and to honour its human rights commitments is a long-term effort, the board said.

“We have consistently seen users left guessing about why Facebook removed their content,” the board said. “Our recommendations have repeatedly urged Facebook to follow some central tenets of transparency … make your rules easily accessible in your users’ languages, tell people as clearly as possible how you make and enforce your decisions and, where people break your rules, tell them exactly what they have done wrong.”

The board said Facebook is answering most of its questions, but not all of them. Of the 156 questions sent, Facebook answered 130, partially answered 12 and declined to answer 14.

Meanwhile, Facebook and Instagram users submitted 524,000 cases to the board between October last year and the end of June, with user appeals increasing in each quarter.
There were 114,000 cases in the fourth quarter of 2020, 203,000 cases in the first quarter of this year and nearly 207,000 cases in the second quarter.

“Having received over half a million appeals up until the end of June, we know these cases are just the tip of the iceberg. Right now, it’s clear that by not being transparent with users, Facebook is not treating them fairly,” the board said.

Nearly 36 per cent of cases related to content covered by Facebook’s rules on hate speech, followed by bullying and harassment (31 per cent), violence and incitement (13 per cent), adult nudity and sexual activity (9 per cent) and dangerous individuals and organisations (6 per cent).

Nearly half of the cases (46 per cent) came from the US and Canada, while 22 per cent came from Europe, 16 per cent from Latin America and the Caribbean and 4 per cent from the Middle East and North Africa.