The computational power needed to support the growth of artificial intelligence is doubling about every 100 days. To achieve a tenfold improvement in AI models, the demand for computational power could increase by as much as 10,000 times. The energy required for AI tasks is therefore accelerating quickly, at an annual growth rate of between 26 per cent and 36 per cent. By 2028, AI could consume more electricity than the entire country of Iceland did in 2021, according to the World Economic Forum.

On average, a query to OpenAI's generative AI platform ChatGPT uses significantly more energy than a traditional Google search – up to 10 times more, by some estimates. This also means there is growing global demand for data centres to run AI supercomputers. The power consumption of those centres is expected to double by 2030 to 150 gigawatts, rising to 330GW by 2040. Meeting that demand would require $600 billion in annual infrastructure investment and 80 million kilometres of power grid upgrades by 2040.

But it is not only the power used to run the supercomputers at data centres that is the issue: the servers that power those applications also need to be cooled. New analysis has shown that ChatGPT, which uses the GPT-4 language model, consumes 519 millilitres of water – a little more than one bottle – to write a 100-word email, according to research by <i>The Washington Post</i> in collaboration with the University of California, Riverside. The same 100-word email generated by an AI chatbot using GPT-4 requires as much energy as running 14 LED light bulbs for one hour, <i>The Washington Post</i> added.

However, individual GPT prompts are not the biggest energy drain. Most power is consumed during the training of the GPT language model, which relies on powerful supercomputers processing vast amounts of text from the internet and various other sources.
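The growth and per-email figures above can be sanity-checked with some simple arithmetic. The sketch below is illustrative only: the 10-watt figure for a typical LED bulb is an assumption, not a number from the reporting.

```python
# Sanity-check arithmetic for the figures quoted above.

# Compute doubling every 100 days implies an annual growth factor of 2^(365/100),
# i.e. roughly 12-13 times more compute each year.
annual_growth = 2 ** (365 / 100)
print(f"Annual compute growth factor: {annual_growth:.1f}x")

# Energy per 100-word GPT-4 email, expressed as 14 LED bulbs running for one hour.
# Assumption (not from the article): a typical LED bulb draws about 10 W.
bulb_watts = 10
email_kwh = 14 * bulb_watts * 1 / 1000  # watts x hours -> kilowatt-hours
print(f"Approximate energy per 100-word email: {email_kwh:.2f} kWh")
```

On these assumptions, a single 100-word email works out to roughly 0.14 kWh, which is why training – not individual prompts – dominates overall consumption.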
Analysis by TRG Datacentres examined the energy use of leading chatbots based on the time taken to train the models, processing power and model architecture, such as the number of parameters, and found that Microsoft Bing consumed more than five times as much energy as ChatGPT-3.

As AI technology progresses, its energy demands are surging. By 2030, Gartner Consulting predicts that AI could account for up to 3.5 per cent of global electricity consumption – twice the energy demand of France. This directly affects climate change, driving an increase in greenhouse gas emissions. Researchers are developing energy-efficient models and optimised algorithms to mitigate AI's carbon footprint.

A report from Adnoc, Masdar and Microsoft, presented this week at Adipec 2024 in Abu Dhabi, highlighted the potential of AI to improve traditional energy practices, enhance energy efficiency and accelerate the transition to cleaner energy sources. Balancing AI's potential with its environmental impact is quickly becoming one of the tech industry's greatest challenges, but some are looking to address it.

Dr Sultan Al Jaber, UAE Minister of Industry and Advanced Technology, and managing director and group chief executive of Adnoc, said at Adipec: "We are at a pivotal moment for human progress driven by three megatrends: the rise of the Global South, the accelerated energy transition and the rapid growth of AI."

Read more on <a href="https://www.thenationalnews.com/business/energy/2024/11/04/enact-majlis-abu-dhabi-hosts-global-energy-tech-ai-and-climate-leaders-on-eve-of-adipec/" target="_blank">ADIPEC 2024</a> and the future of <a href="https://www.thenationalnews.com/future/2024/11/05/can-openai-take-on-google-and-bing-with-real-time-feature-chatgpt-search/" target="_blank">AI search engines</a>.