Hollywood actors and writers have joined forces in an <a href="https://www.thenationalnews.com/arts-culture/2023/07/13/hollywood-shuts-down-as-sag-aftra-actors-go-on-strike/" target="_blank">industrywide strike</a> for the first time in more than 60 years.

The shutdown began over pay, but the refusal of studios such as Netflix and Disney to rule out using artificial intelligence to replace human creators has fuelled anger on the picket line.

AI programs have unnerved many in the industry by showing they can mimic human conversation and digitally recreate people’s likenesses. But are they capable of writing feature-length films and undercutting A-list actors on set?

<i>The National</i> spoke to computer scientist Robert Wahl of Concordia University in Wisconsin to find out what AI is capable of doing in Hollywood today – and what could come next.

The concept of using AI in films is not entirely new, and audiences are already accustomed to seeing computer-generated actors on the silver screen.

“The most popular one is probably Carrie Fisher [in <i>Star Wars</i>],” Mr Wahl told <i>The National</i>. “When she died, they wanted to create some final scenes with her, so they de-aged her digitally and used her voice to put her back into the films.”

A similar method was used after the death of <i>Fast & Furious</i> actor Paul Walker in a 2013 car accident.

“Paul Walker died, but they used his face and image to finish <i>Furious 7</i> because they were halfway through filming,” Mr Wahl said.

Producers mapped Walker’s face on to stand-in actors, in this case two of his brothers, to shoot the final scenes.

There was little controversy surrounding this at the time, but since the launch of OpenAI’s conversational chatbot <a href="https://www.thenationalnews.com/business/money/2023/02/10/how-will-chatgpt-affect-jobs/" target="_blank">ChatGPT</a>, studios appear to have set their sights on harnessing AI’s power.

This year, a group of top Hollywood executives gathered at the Milken Institute Global Conference in Beverly Hills, where they talked up the future of the technology.

“In the next three years, you’re going to see a movie that was written by AI … a good one,” said producer Todd Lieberman.

“Not just scripts. Editing, all of it … storyboarding a movie, anything,” said Fox Entertainment chief executive Rob Wade. “If we’re talking 10 years? AI is going to be able to do all of these things.”

Comments like these are making industry creators, such as writers and actors, nervous about what comes next.

“It’s a very big talking point,” said Mr Wahl. “And frankly, people just don’t know what the future holds.”

A <a href="https://www.thenationalnews.com/weekend/2022/06/24/how-deepfakes-are-blurring-the-lines-in-art-and-film/" target="_blank">viral TikTok account</a> offers a glimpse of so-called deepfake technology and how it might be used in future Hollywood films.

In a clip that has gathered more than 90 million views, <a href="https://www.thenationalnews.com/arts-culture/film-tv/2023/07/13/tom-cruise-surprises-mission-impossible-fans-at-washington-cinema/" target="_blank">Tom Cruise</a> seemingly dances flamboyantly in his dressing gown. He turns his head from side to side and flicks his hair back in a remarkable display.

It is not the real Tom Cruise, of course, but a deepfake created by AI company Metaphysic, which says it helps users make hyper-realistic, AI-generated immersive content.
“When you look at those videos you cannot tell that anything is fake,” said Mr Wahl. “It just looks like him, it sounds like him, the mannerisms are the same.”

Deepfakes are made by compiling a huge library of clips, sounds or images of a person and using it to train a system on how they look, sound and behave. “The more data that you feed these systems, the better off they are,” said Mr Wahl.

The technology’s potential goes beyond visual likeness. There are also tools that can digitally recreate people’s voices, something that was <a href="https://www.thenationalnews.com/arts-culture/film/2021/07/17/why-the-anthony-bourdain-voice-cloning-creeps-people-out/" target="_blank">controversially used</a> to mimic Anthony Bourdain in the documentary <i>Roadrunner</i>, after his death in 2018.

“If I wanted to conduct this interview and sound like Morgan Freeman, I could do that,” said Mr Wahl. “And it would be hard-pressed on the other side to realise that it is somebody else. It doesn’t sound robotic at all, it just sounds like that person.”

But examples such as these raise the question of who owns a person’s likeness and what permission is needed to use it.

One issue for the strikers is the creation of synthetic performers from an amalgamation of actors’ images. Studio sources said this has not happened yet, although studios are seeking to reserve that right as part of contract talks.

The chief negotiator for the actors’ union, Duncan Crabtree-Ireland, said AI posed an “existential crisis” for actors who worry their past, present and future work will be used to generate “synthetic performers who can take their place”.

But Mr Crabtree-Ireland said the union was not seeking an outright ban on AI, rather that companies consult it and obtain approval before casting a synthetic performer in place of an actor.

Mr Wahl believes future battles over this sticking point could end up in litigation.

“It’s either going to get settled through things like the strikes that are going on currently, or it’s going to end up in the federal court system,” he said. “And they’re going to have to put down some rules and regulations.”

Others in the industry are less concerned. Director <a href="https://www.thenationalnews.com/arts-culture/film-tv/2023/06/23/titanic-director-james-cameron-says-he-wishes-hed-sounded-alarm-over-lost-submersible/" target="_blank">James Cameron</a>, famous for telling the story of a murderous, self-aware AI system called Skynet in the 1984 classic <i>The Terminator</i>, has played down the idea of creatives losing their jobs.

“I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said … about the life they’ve had, about love, about lying, about fear, about mortality … and just put it all together into a word salad and then regurgitate it … I don’t believe that’s something that’s going to move an audience,” Mr Cameron told CTV News.

And while deepfakes such as the Tom Cruise clips may impress in short form, Mr Wahl believes the technology is not yet ready for longer films.

“For a feature-length film of maybe an hour and a half? I don’t know that it can be pulled off accurately,” he said.

Then there is the phenomenon known as the “uncanny valley”, an unsettling feeling people experience in response to not-quite-human figures such as <a href="https://www.thenationalnews.com/business/technology/2023/07/07/ai-robots-at-un-conference-were-not-here-to-take-your-jobs/" target="_blank">robots</a> and computer-generated characters.
“As humans, going to a theatre or watching something on our televisions, we have the ability to say that looks fake,” said Mr Wahl. “Something is off; the eyes don’t look quite right, the mouth doesn’t respond the way it should.

“I’m definitely not in the panic state. But [AI] is going to be big, it’s going to affect a lot of things that we do … and there’s currently no rules or regulations on this.”

<i>AFP and Reuters contributed to this story.</i>