<a href="https://www.thenationalnews.com/tags/religion" target="_blank">Religious experts </a>have said it is up to people of faith working in the tech industry to help shape how their companies address the opportunities and risks of <a href="https://www.thenationalnews.com/tags/artificial-intelligence" target="_blank">Artificial Intelligence.</a>

“There are a huge number of people of faith working in the tech industry. They want to feel that they are working for the good of humanity,” said John Wyatt, Professor of Ethics and Perinatology at University College <a href="https://www.thenationalnews.com/tags/london" target="_blank">London</a>.

Speaking at a conference on the implications of AI for faith and society on Thursday, Prof Wyatt added that those people had the capacity to advocate for better uses of AI, or to act as “whistleblowers”.

Artificial Intelligence allows a computer to solve complex problems, create text and images based on prompts, and learn new tasks – making it almost human-like in its abilities. But its rapid progress, and the limited governance around the technology, have prompted concerns about the risks it poses to society and security.

“Along with the great potential offered by these technologies, they also open the door to risks and fears of abuse and misuse, which could lead to catastrophic consequences for our shared humanity,” said Sheikh Abdullah bin Bayyah, President of the Abu Dhabi Forum for Peace, which hosted the conference.

Former UK cabinet minister Sajid Javid, who spoke at the conference, highlighted the opportunities AI currently offers health care, as well as the risks it could pose, including child sexual abuse, information warfare and the empowerment of hostile states.

“AI is a revolution that will touch every aspect of society; nothing will be left untouched,” he said.

AI’s ability to learn human-like tasks has made it fertile ground for questions of faith.

“AI raises questions that go beyond AI. It brings into sharp relief questions about what it means to be human,” said Dr Harris Bor, research fellow at the London School of Jewish Studies.

A religious perspective on AI could shed light on the ways in which “humanity will be diminished” by the technology, Dr Bor added. “Machines can do things better than we can. When they become more like ourselves, and better than ourselves, how can we not want to follow them?” he said.

But the technology’s rise has also raised concerns that the values and biases of the companies developing AI software will dominate as it is more widely adopted.

“AI is driven by western and Chinese technology companies […] which raises the risk of cultural hegemony,” said Graham Budd, director of the Faraday Institute for Science and Religion – which co-hosted the event.

The <a href="https://www.thenationalnews.com/tags/vatican" target="_blank">Vatican’s </a>Rome Call for AI Ethics in 2020 highlighted the need to ensure that no community is excluded from AI.

Dr Yaqub Chaudhary of the Cambridge Muslim College warned of an emerging “algorithmic monoculture”, as the networks and funding bodies behind the development of AI are rooted in a narrow set of philosophical movements currently popular in the US tech industry. These include effective altruism, which aims to use the time and money given to causes as effectively as possible, and long-termism, the view that humans should work towards enhancing their long-term future.
“These metaphysical ideologies are fully embedded in the way AI is being developed,” he said. “When it comes to this new type of utilitarianism, you don't just think about human beings that are alive today. You think about all human lives a billion years into the future, and that part is highly contentious.”

This could lead to an AI-powered world that suits the whims of Silicon Valley engineers but is ultimately dissatisfying or exclusionary to others.

“Tech-utopianism dreams of a frictionless future, where every desire, every whim, can be satisfied with minimum effort. But is this frictionless technological future the kind of future in which human beings can flourish?” asked Prof Wyatt.