Law enforcement agencies’ use of <a href="https://www.thenationalnews.com/business/technology/2022/02/22/uae-to-boost-number-of-specialised-ai-laboratories-in-the-country/" target="_blank">artificial intelligence</a> and <a href="https://www.thenationalnews.com/world/uk-news/2022/03/25/ukraine-uses-facial-recognition-to-identify-russian-soldiers-killed-in-war/" target="_blank">facial recognition</a> technology is not subject to proper oversight and could exacerbate discrimination, a House of Lords committee has warned.

New technologies were being developed in a “new Wild West”, with the law and public awareness failing to keep pace, it said. The Lords Justice and Home Affairs Committee warned that the lack of oversight meant “users are in effect making it up as they go along”.

The cross-party group said <a href="https://www.thenationalnews.com/uae/2022/02/25/uae-pupils-offered-next-gen-internships-in-cryptocurrency-and-ai/" target="_blank">AI</a> could improve people’s lives but could also have “serious implications” for human rights and civil liberties in the justice system.

“Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline,” the committee said.

But it said there was no scrutiny to ensure new tools were “safe, necessary, proportionate and effective”. “Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.”

Police forces and other agencies were buying equipment in a “worryingly opaque” market, with details of how systems work kept secret because of companies’ insistence on commercial confidentiality.

The peers also raised concerns about <a href="https://www.thenationalnews.com/weekend/2022/01/21/extreme-circumstances-ai-pioneer-rolls-out-hospital-beds-system-saving-lives-in-covid/" target="_blank">AI</a> being used in “predictive policing” – forecasting crime before it happens. There was a danger it could worsen discrimination by embedding in algorithms the “human bias” contained in the original data.

Prof Karen Yeung, an expert in law, ethics and informatics at the University of Birmingham, told the committee that “criminal risk assessment” tools were not focused on white-collar crimes such as insider trading, because of a lack of data. Instead, they were used for the kinds of crime about which more information was available, Prof Yeung said.

“This is really pernicious,” she said. “We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people. We are leaving whole swathes of society untouched by those tools.”

The peers called for a mandatory register of algorithms used in criminal justice tools, a national body to set standards and certify new technology, and local ethics committees to oversee its use.

“What would it be like to be convicted and imprisoned on the basis of AI, which you don’t understand and which you can’t challenge?” said Baroness Hamwee, the Liberal Democrat chairwoman of the committee.

“Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose and not be used unchecked.”