A decade ago it was fashionable to talk about the social media platform Facebook as if it were a country. Commentators measured its active user base against the giant and growing populations of China and India to convey some sense of its <a href="https://www.thenationalnews.com/arts-culture/books/signing-online-increasingly-means-signing-away-your-privacy-1.459776">scale, reach and power</a>. Back then, Facebook had around 850 million users; now it is nearer 3 billion. Both China and India, by comparison, are home to around 1.4 billion people each today.

With such vast and sudden “population” explosions on social media, we worried about what these platforms were doing with all the data they were harvesting. Despite those serious concerns, social media was also viewed as a positive agent of change. The role that <a href="https://www.thenationalnews.com/uae/facebook-and-twitter-key-to-arab-spring-uprisings-report-1.428773">Facebook, Twitter and others</a> played in the 2011 Arab uprisings in bringing young people together was regularly cited as an example of the powerful catalyst these platforms could be.

In those days, Facebook’s log-in page featured the “it’s free and always will be” strapline. The phrase was key to explaining the platform’s popularity. The quid pro quo, such as it was, lay somewhere between being able to use a dazzling application without cost and surrendering some personal data to big tech.

Active users across all social media platforms in 2022 number closer to 5 billion people, equivalent to the entire population of Asia, and the intervening years have turned the so-called digital town square into a challenging space. The quid pro quo seems more inequitable than ever. Part of that is because the combination of social media and smartphones has made us into discrete, addicted and private beings. It is possible to spend hours on these platforms with an algorithm serving you an inexhaustible feed of machine-generated content.

Online giant Amazon’s recommendation algorithm famously started outperforming human editors’ picks long ago because it used a filtering system based on links between products rather than between customers. It learnt that if you were searching for a copy of, say, F Scott Fitzgerald’s <i>The Great Gatsby</i>, there was a <a href="https://www.thenationalnews.com/arts-culture/books/viktor-mayer-schonberger-more-data-is-being-collected-and-stored-about-each-one-of-us-than-ever-before-1.459310">good chance</a> you’d also be interested in works by Ernest Hemingway. It didn’t have to understand why customers might be interested in both authors; it just had to know that there was a correlation between products by those authors (the sketch at the end of this passage illustrates the idea).

Similarly, social media platforms don’t have to interrogate why I may be searching for and interested in content about a particular subject; they just have to feed that need, regardless of whether it is good for me or not. Some people used to think social media would expose all of us to a diverse range of opinions, but it has become far too easy to be caught in an echo chamber, consuming, liking and commenting on the posts that reinforce our own biases and disappearing down dangerous trapdoors with no easy way out. The separation between our digital personalities and our real ones seems to widen every year.

While user bases continue to grow across social media platforms – and TikTok now captures vast tracts of the attention economy – the inner workings of the technology need rethinking and rebuilding.
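What that product-to-product approach looks like in practice can be shown in a few lines. The sketch below is a minimal illustration of item-to-item collaborative filtering, the general technique described above; the purchase histories, book titles and the `recommend` helper are hypothetical inventions for this example, not Amazon’s actual system.

```python
from collections import defaultdict
from itertools import combinations

# Each inner list is one (hypothetical) customer's purchase history.
purchases = [
    ["The Great Gatsby", "The Sun Also Rises", "Tender Is the Night"],
    ["The Great Gatsby", "The Sun Also Rises"],
    ["The Great Gatsby", "Moby-Dick"],
    ["A Farewell to Arms", "The Sun Also Rises"],
]

# Count how often each pair of items appears in the same basket.
# The system never asks *why* two books travel together, only *that* they do.
co_occurrence = defaultdict(int)
for basket in purchases:
    for a, b in combinations(sorted(set(basket)), 2):
        co_occurrence[(a, b)] += 1

def recommend(item: str, top_n: int = 3) -> list[str]:
    """Return the items most often bought alongside `item`."""
    scores = defaultdict(int)
    for (a, b), count in co_occurrence.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("The Great Gatsby"))
# ['The Sun Also Rises', 'Tender Is the Night', 'Moby-Dick']
```

The point to notice is that the code never models the customer at all: it only counts which products co-occur, which is precisely why such systems scale so well and understand so little.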
A year ago, <a href="https://www.thenationalnews.com/business/technology/2021/10/21/facebooks-oversight-board-reprimands-the-company-over-lack-of-transparency/">Frances Haugen</a>, a former Facebook product manager turned whistleblower, told a US congressional hearing that the platform’s algorithms promote posts with high levels of engagement, often pushing harmful content towards users (a simplified sketch of this kind of ranking appears at the end of this piece). Facebook vigorously denied the accusation. Founder Mark Zuckerberg said the allegations that the company puts profit before well-being were “just not true”. Ms Haugen has since joined a new group, the Council for Responsible Social Media, which launched this week and is pressing for urgent change.

Last month, an <a href="https://www.thenationalnews.com/world/uk-news/2022/09/30/molly-russell-social-media-firms-told-to-find-moral-compass-after-teens-inquest/">inquest in the UK</a> ruled that the death of teenager Molly Russell in November 2017 came after exposure to the negative effects of online content. The inquest heard how Molly, who was 14 years old when she died, saved, shared or liked 16,300 posts on Instagram in the six-month period before her death. Of those, 2,100 were related to depression, self-harm or suicide. Her case is likely to provide the springboard for action in the UK.

Any proposed legislation will need careful calculation and application. Too strict, and it becomes unworkable; too lenient, and it is toothless. So how do you realistically govern a digital territory now inhabited by almost 5 billion people?

Beyond broad actions that favour steady hands and good faith, a prescription for change requires the following. The platforms themselves need to further develop their internal audit and oversight procedures. Self-regulation may sound like a terrible oxymoron, but it may also be the best way forward, as long as it champions prevention and solutions. The platforms need to be more transparent and accountable with users about why content is being served, and swift to address inappropriate or harmful posts. The algorithms need rebuilding, and the machine learning behind them needs to understand why, as well as what, its users are interested in. And finally, safeguarding legislation must be drafted with purpose rather than symbolism.
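To make the structure of the allegation concrete, here is a deliberately simplified, hypothetical sketch of engagement-weighted ranking. Nothing below is Facebook’s actual code; the `Post` fields and the weights are assumptions made for illustration. What matters is the shape of the objective: it counts reactions and nothing else, so whether content is harmful never enters the calculation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    """Score a post on raw interaction alone. Comments and shares are
    weighted more heavily because they tend to trigger further engagement.
    (These weights are illustrative, not any platform's real values.)"""
    return post.likes + 5 * post.comments + 10 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing in this objective asks whether the content is good for the
    # user: whatever provokes the strongest reaction rises to the top.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("calm news summary", likes=120, comments=4, shares=2),
    Post("outrage-bait rumour", likes=80, comments=60, shares=40),
]
print([p.text for p in rank_feed(feed)])
# ['outrage-bait rumour', 'calm news summary']
```

Rebuilding such a system along the lines argued above would mean adding terms to that objective for user well-being, not just predicted interaction.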