As <a href="https://www.thenationalnews.com/future/technology/2024/10/29/apple-intelligence-is-here-first-look-at-new-ai-features-on-iphone/" target="_blank">artificial intelligence</a> advances rapidly towards potentially surpassing human capabilities, leaders in <a href="https://www.thenationalnews.com/future/technology/2024/10/29/pwc-to-offer-customised-chatgpt-enterprise-services-to-its-middle-east-clients/" target="_blank">technology</a> have emphasised the critical need for balanced regulatory measures.

While AI could bring significant benefits across industries, experts warn that unregulated growth may carry serious risks.

Billionaire businessman and Tesla chief executive Elon Musk anticipated that, within the next couple of years, AI could perform any task a human can. He suggested that in about three more years, by 2029, AI could do what all humans can do collectively.

Mr Musk, who joined the <a href="https://www.thenationalnews.com/business/economy/2024/10/28/saudi-arabia-future-investment-initiative/" target="_blank">Future Investment Initiative summit</a> in Riyadh online on Tuesday, aligned with previous assessments by AI pioneer Geoffrey Hinton, noting that while AI developments would probably be beneficial, there was also a chance of adverse outcomes.

While AI is “most likely going to be great, there's some chance, which could be 10 to 20 per cent, that it goes bad,” Mr Musk said during an interview at the FII.

Ultra-intelligent AI, or Artificial Super Intelligence (ASI), will be 10,000 times smarter than the human brain and available within eight years, said Masayoshi Son, chief executive of SoftBank.

“That's my definition of ASI, and that's coming in 2035,” he said during a fireside chat at the FII summit on Tuesday.
He said ASI would be significantly more impactful than Artificial General Intelligence (AGI), software that is humanlike in intelligence and thought capability.

“AGI definition is the same as a human brain,” said the head of the global investment holding company, which has its headquarters in Tokyo. “ASI's future capabilities are still unknown and can be the source of great promise or global harm, depending on how it's managed.”

The anticipated rise of ASI could fundamentally transform economies and societal foundations, prompting urgent calls for responsible oversight to ensure the technologies improve human welfare.

“If such a superpower comes and there's no regulation, it could be super dangerous,” said Mr Son.

But Mr Musk argued that excessive regulation was stifling growth, particularly in the US, describing it as a “slow strangulation”. He warned that, without efforts to reduce regulatory barriers, it could become nearly impossible to undertake large-scale projects, stalling innovation and development.

“The bureaucracy in the US has been growing every year and has particularly grown under the Biden administration,” he added.

Mr Musk identified AI as a significant near-term existential threat, but considered global population decline an even greater concern over the longer term. He explained that, if current low birth rates continue, countries including South Korea could lose up to two thirds of their populations within a few generations, with Europe's potentially dropping by half. He stressed that maintaining or increasing birth rates should be a critical priority for governments worldwide.

“I think AI is a significant existential threat and something we should be paying close attention to,” Mr Musk said. “It's probably the most significant near-term threat … but longer term than that is the global population collapse. Birth rates have been collapsing pretty much worldwide.
“If that [population decline] continues, you would see many countries become 5 per cent of their current size or less within three generations.”

In the past, Mr Musk, along with philosopher Nick Bostrom and physicist Stephen Hawking, has highlighted concerns about highly intelligent AI systems making decisions that conflict with human welfare, or acting autonomously in ways that surpass human control.

Mr Musk also emphasised the significant energy demands of digital intelligence and electric transport, predicting that solar energy will dominate future energy supply.

Mr Son discussed the perceived hype of an AI bubble, exemplified by investors' muted reaction to semiconductor chip maker Nvidia's third-quarter forecast in August. The US company said it expected revenue of $32.5 billion for the quarter, plus or minus 2 per cent. Although this was above the average analyst estimate, it did not meet the high expectations of investors, some of whom had anticipated a figure closer to $37.9 billion.

Focusing on the current returns of the AI industry neglects the bigger, more important picture, he said.

“I think Nvidia is undervalued, because the future is much bigger,” he added.

“The cumulative capex [capital expenditure] is $9 trillion,” he said, estimating that this would amount to only about 5 per cent of GDP 10 years from now. “It's too much investment for many people's views. I say it's still a very reasonable capex,” he said, given that he expects it to yield about $4 trillion in net income per year. “So $9 trillion is not too big, maybe too small,” he added.

Mr Musk recommended that nations establish their own AI clusters, comparing them to essential infrastructure such as electricity grids. He anticipated a future in which every country has AI compute clusters as standard infrastructure.
He predicted that by 2040, humanoid robots could outnumber humans, suggesting a future workforce largely composed of robotic helpers. He mentioned Tesla's Optimus robots, whose price is likely to become more accessible as production increases.

Mr Musk, founder of SpaceX, also shared plans for uncrewed Starship missions to Mars within two years, with human missions to follow two years later. He expressed frustration over regulatory delays, which he views as an impediment to ambitious space projects, including the colonisation of Mars.