A <a href="https://www.thenationalnews.com/weekend/2023/07/14/do-deepfakes-and-ai-threats-require-a-un-watchdog/" target="_blank">deepfake video</a> call appearing to be an urgent plea for help from Dubai was used to con a man out of thousands of dirhams.

A retired government official living in Kerala, India, received a call last week from someone he thought was a friend at Dubai Airport. He received audio and video calls, during which he saw his <a href="https://www.thenationalnews.com/uae/2023/03/17/how-the-future-of-cybercrime-could-involve-fake-voice-messages-from-loved-ones/" target="_blank">friend's face</a>.

Seeing his friend chatting away to him as usual put him at ease, and he did not hesitate to act when asked for cash to help with an urgent medical emergency.

“I had no reason in the world to think that it was a scam. I have known him for more than 40 years. I even saw his face on my phone's screen and I thought it was him,” said Padiyeri Radhakrishnan, 73. “We were in the same company for many years, and I immediately recognised his voice.

“He started the conversation asking about how my retired life is in Calicut, and also about my daughter, who recently moved to Singapore from Gurgaon in Haryana.

“We also spoke at length about all our friends who are now settled in different parts of India.”

Deepfake is the name given to a form of synthetic media created by artificial intelligence and machine-learning algorithms. The technology manipulates images, audio and video recordings to create convincing fake video or audio messages from particular people. Its use is becoming rife around the world.

The UAE's National Programme for Artificial Intelligence and the Council for Digital Well-being <a href="https://www.thenationalnews.com/uae/2021/07/09/uae-asks-public-to-help-tackle-deepfakes/" target="_blank">published a guide</a> to raise awareness of deepfake technology in 2021.
Mr Radhakrishnan was unaware he was speaking to a digital copy of his friend, so convincing was the voice created to swindle him out of his money.

As the conversation flowed, the caller's tone took an urgent turn. He said he was at the airport in Dubai waiting for his flight to Mumbai. The voice on the other side explained he was travelling to see his gravely ill sister, who urgently needed money for life-saving treatment.

“He said he called several of his relatives, but no one was picking up as it was early hours on a weekend,” Mr Radhakrishnan said. “It felt genuine. But I still wanted to make sure that I was not making a mistake.

“I told him that I was worried about transferring money as there were several scams doing the rounds, and he immediately said he would call me on video.

“I saw him on the screen, and he started talking to me. His eyes and lips were moving, and it all looked perfect except that he was holding the phone too close to his face.”

Mr Radhakrishnan said he immediately transferred 40,000 Indian rupees (Dh2,000) to the account provided by his friend.

“Within minutes, he called again confirming the receipt and assured me that he would transfer back the money as soon as he reached Mumbai,” he said.

But a red flag was raised when the friend asked for more money.

“Suddenly, I felt something was odd. Why would he ask me a second time?” said Mr Radhakrishnan. “I excused myself, saying I do not have enough cash in my account, and hung up.”

To assuage his doubts, Mr Radhakrishnan then called his friend on the number that he had originally saved.

“To my shock, my friend picked up, and the first thing he said was '[it's been] a long time, I haven’t heard from you',” he said. “That was the shock of my life. Was he kidding me?

“My friend did not understand what was going on, and he hung up saying he was boarding a flight and would call when he landed.

“It was hard to believe because I spoke to the man, and even saw him.
I could not believe what had just happened.”

A senior official from Kerala’s cyber police, who investigated the case, told <i>The National</i> the money had been traced successfully.

“The fraudster had transferred the money to multiple banks at first and later converged it to a single account in a private bank in Maharashtra. That account is frozen,” said the officer.

The officer added that the money would be returned to Mr Radhakrishnan as soon as a court ordered the release of the funds.

Deepfake frauds have surfaced in many countries recently, with experts and authorities quick to sound warnings. Earlier this month, a new deepfake video of MoneySavingExpert.com founder Martin Lewis circulated on social media, with AI mimicking his face and voice to promote an app associated with Tesla and Twitter owner Elon Musk.

“As technology gets more sophisticated, identifying these scams becomes harder but not impossible,” said Candid Wuest, Acronis's vice president of cyber protection research. “Pay attention to what is said, how it is said, and whether or not this is normal behaviour.

“For example, if someone without a history of asking you for money all of a sudden needs money right away, your antenna should be up.

“Where did you first meet this person? AI probably won’t know that unless you’ve talked about it before. Specifics are sometimes the best way to identify deep fakes.”