A ‘deepfake’ audio recording was used in a UK child custody battle in an effort to discredit a Dubai resident, it has been revealed.

Byron James, a lawyer in the emirate, said a heavily doctored recording of his client had been presented in court as evidence in a family dispute.

In the edited version of the audio, the child’s father was heard making direct and “violent” threats towards his wife. But when experts examined the recording, they found it had been manipulated to include words the client had never used.

“This is a case where the mother has denied the father access to the children and said he was dangerous because of his violent and threatening behaviour,” Mr James said. “She produced an audio file that she said proved he had explicitly threatened her.

“We were able to see it had been edited after the original phone call took place, and we were also able to establish which parts of it had been edited. The mother used software and online tutorials to put together a plausible audio file.”

Manipulated video or audio recordings, commonly known as deepfakes, risk becoming a growing problem for police, the courts and other authorities.

Mr James said it was the first time he had encountered doctored audio evidence in his career, but that all courts needed to be vigilant.

The custody battle was heard in secret, as required by law, in a UK court last year and as a result went unreported. The case remains ongoing and none of those involved can be identified. It is understood, however, that the court has already dismissed the audio recording as fake.

Speaking to The National, Mr James outlined what had happened to his client, who lives in the Emirates.

“We were lucky to get the original audio file and be able to study the metadata on the recording,” he said. “She [the wife] said [the doctored recording] justified her stance and that he [the husband] should not be allowed to see the children.

“If we hadn’t been able to challenge this piece of evidence, it would have negatively affected him and portrayed him as a violent and aggressive man. It raises all sorts of questions about what sort of evidence you can actually rely on. Is there sufficient judicial training around the world to identify evidence that has been manipulated in this manner?”

Mr James, a partner at law firm Expatriate Law, went on to suggest that it would never occur to most judges that deepfake material could be submitted as evidence. He said the danger was that recordings could be taken at face value, unfairly influencing the outcome of trials.

“The software is easy to use and widely available, but many judges are in their 50s and 60s and wouldn’t be au fait with the latest technology,” he said.