Expensive energy: The UK and USA’s big AI problem
AI has been one of the fastest-growing industries in recent years and is already transforming the way the world works. The UK and the US have both pledged to embrace this evolving technology and the economic benefits it could yield, but their ambitions may be thwarted by the massive amounts of energy AI consumes. Not only will this have an environmental impact, but the costs involved risk causing a resurgence in fossil fuels – a potential roadblock on the path to net zero.

TL;DR
Despite its many valuable uses, AI consumes a vast amount of energy. The data centres it requires account for around 2% of the world’s electricity usage, but the power needed to sustain its growth is doubling every 100 days.
The UK has committed to becoming an AI powerhouse and pledged to increase public AI compute power twentyfold by 2030.
However, the high cost of energy in the UK means it could cost four times more to power a 100 MW data centre than it would in the US.
In the US, President Trump has announced plans to allocate more public land to drilling for oil, axe offshore wind projects, and remove tax breaks that favour renewable energy to fuel the growth of AI.
To reduce the reliance on fossil fuels and make AI more energy efficient, several mitigations should be pursued – these include making technological efficiencies, introducing regulation, continuing to invest in renewable power, and educating the public on the true cost of using AI.
The detail
While artificial intelligence has been helping make technology more efficient and intuitive since the so-called ‘AI boom’ in the early 1980s, its impact has been particularly felt in the past five years.
It has myriad uses, from content generation and virtual assistants to navigation and robotics. To work effectively, AI systems require training. This process involves huge amounts of information being fed into a computer so that it can start to identify patterns or generate text and images.
AI has both champions and critics, with many fence sitters nestled in between.
In industry, it has the potential to transform the way many administrative, legal and management roles are performed, finding efficiencies that could boost the global economy by 7% (albeit by replacing the equivalent of 300 million full-time jobs).
However, creative industries are concerned about the ethics and legality of harvesting existing works to create new content. Others worry about fairness: Margrethe Vestager, the former Executive Vice-President of the European Commission for a Europe Fit for the Digital Age, has highlighted the ways AI can unwittingly amplify bias and discrimination.
The true cost of ChatGPT
Arguably the most worrying impact of AI is on the environment.
In short, AI is thirsty, and the amount of energy required to power it is substantial. Generative AI systems are estimated to use roughly 33 times more energy to complete a task than task-specific software would. Training a model like GPT-3 used just under 1,300 MWh of electricity, while GPT-4 may have used 50 times more.
Most of this energy is consumed by data centres. According to the International Energy Agency, data centres used 460 TWh in 2022, and this figure could rise to over 1,000 TWh by 2026. AI currently represents around 2% of the world’s electricity usage, but the computational power needed to sustain the technology’s growth is doubling roughly every 100 days. Indeed, it’s one of the biggest reasons why Google’s greenhouse gas emissions in 2023 were almost 50% higher than they were just four years earlier.
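A demand that doubles every 100 days compounds remarkably quickly. A back-of-the-envelope sketch, using only the doubling figure quoted above, shows the implied annual growth:

```python
# Compounding sketch: if compute demand doubles every 100 days,
# growth over a 365-day year is 2^(365/100).
doubling_period_days = 100
annual_growth_factor = 2 ** (365 / doubling_period_days)
print(f"Implied annual growth: ~{annual_growth_factor:.1f}x")  # ~12.6x
```

In other words, the quoted doubling rate implies demand growing more than twelvefold in a single year – which is why the TWh projections above rise so steeply.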
An action plan geared for growth
The AI industry is growing worldwide and has become a priority for Keir Starmer’s Labour Government in the UK. With its AI Action Plan, the Government has outlined plans to turn the country into an AI superpower. The UK is already ranked fourth in the world when it comes to implementation, investment and innovation in AI – now, AI will be at the heart of its strategy to drive economic growth and enhance public service delivery.
In 2020, UK AI and its related infrastructure consumed approximately 3.6 TWh of electricity, and the pledge to increase public AI compute power twentyfold by 2030 will put immense strain on the grid. In fact, the National Grid predicts that electricity demand in the UK will rise six-fold in the next 10 years due to AI growth.
It’s not just grid constraints that are threatening progress. High electricity costs and a lack of low-carbon energy alternatives are also detracting from the UK’s attractiveness as an AI destination. The country’s industrial energy prices mean that it could cost four times more to power a 100 MW data centre in the UK than it would in the US. Understandably, cost will be an important consideration for the big tech companies currently spending tens of billions each quarter to fund AI accelerators. Capital expenditure for these firms rose by more than 35% year-on-year in 2024, with between $200 billion and $210 billion spent on AI infrastructure.
As competitors on the continent can supply greener electricity at lower prices, the UK has been urged by Amazon and OpenAI to overhaul the electricity market and split it into different zones. This would mean making energy costs higher in areas where power is in short supply and cheaper in areas with lower demand, such as Scotland. If this plan were implemented, it’s predicted that a data centre in Aberdeen could benefit from electricity costs 65% lower than one based in Slough.
An argument for fossil fuels?
The US is facing similar challenges. The country is currently home to 33% of the world’s data centres, and their electricity consumption is expected to rise to 260 TWh in 2026. If it does hit this figure, that would be equivalent to 6% of the country’s total electricity consumption, up from the 4.4% that AI currently accounts for. The Department of Energy predicts that data centre electricity demand will triple in the next three years.
Vice President JD Vance has made the Trump administration’s stance on AI clear by refusing to sign an international agreement on artificial intelligence put forth at a global summit in Paris (the UK also didn’t sign). While the agreement pledged to take an open, inclusive and ethical approach to AI development so that it is made sustainable for people and the planet, Vance was quoted as saying that the US would not squander the opportunity presented by AI and that pro-growth policies should be prioritised over safety.
The Trump administration’s response has been to allocate more public land to drilling for oil, axe offshore wind projects, and remove tax breaks that favour renewable energies. Oil giants Chevron and Exxon have both also published plans to build more natural gas-powered facilities directly connected to data centres.
Efficiency and education
But is there a way forward that doesn’t involve reverting to fossil fuels and putting the energy transition at risk?
Continued investment in renewables is certainly an option. Critics of the Trump administration’s approach note the fact that solar panels, wind turbines and batteries are becoming cost-competitive with natural gas, and that AI firms are already major investors in clean energy.
The UK Government also believes that clean and renewable solutions will be needed to meet the energy demands of AI. As part of its AI Action Plan, the Science and Technology Secretary and the Energy Secretary will co-chair a new AI Energy Council and look to accelerate investment in clean energy solutions such as Small Modular Reactors (SMRs).
Regulation may have a role in reducing the energy impact of AI. In the European Union, data centres larger than 500 kW already have to file regular emissions reports so that their energy consumption can be properly monitored, and legislators are starting to make it a requirement for systems to be designed with energy consumption tracking in mind.
Meanwhile, consumers could be encouraged to be more mindful of their use of AI by following the French trend towards digital sobriety. With a typical ChatGPT request requiring around 10 kJ of energy – roughly 10 times as much as a standard Google search – users could reduce their carbon footprint by only submitting a request to ChatGPT when it’s the best tool for the job. Dark data – data that is generated and stored but never used again – could also be reduced.
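Those per-request figures can be turned into a rough annual estimate. A minimal sketch, where the 10 kJ-per-request figure comes from above but the usage pattern of ten requests a day is a hypothetical assumption:

```python
# Rough per-user estimate; the 10 kJ figure is quoted in the text,
# the 10 requests/day usage pattern is a hypothetical assumption.
KJ_PER_REQUEST = 10
KJ_PER_KWH = 3600  # 1 kWh = 3,600 kJ
requests_per_day = 10

annual_kwh = KJ_PER_REQUEST * requests_per_day * 365 / KJ_PER_KWH
print(f"~{annual_kwh:.1f} kWh per year")  # ~10.1 kWh
```

Around 10 kWh a year per heavy user is small on its own, but multiplied across hundreds of millions of users it helps explain the data centre totals discussed earlier.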
The team at MIT’s Lincoln Laboratory Supercomputing Center is actively looking for ways to improve the energy efficiency of AI. Its experts have found that limiting the amount of power a GPU (graphics processing unit) can draw could cut energy consumption by around 12-15%. The only downside is that task time increases by around 3% – when GPU power was limited to 150 watts, for example, researchers saw training time increase by two hours (from 80 to 82 hours) but saved the equivalent of a week’s energy for a US household. MIT has also created a model that can examine the rate at which an AI tool is learning, stopping underperforming models early and reducing energy use by 80%.
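The trade-off follows from energy = power × time: a lower power cap saves energy even though jobs run longer. A minimal sketch, where the 150 W cap and the 80-to-82-hour run times come from the text but the 180 W uncapped baseline is a hypothetical assumption chosen for illustration:

```python
# Energy = power x time. The 150 W cap and 80 -> 82 h run times are from
# the text; the 180 W uncapped baseline is a hypothetical assumption.
def energy_kwh(power_watts: float, hours: float) -> float:
    return power_watts * hours / 1000  # Wh -> kWh

uncapped = energy_kwh(180, 80)  # hypothetical baseline run
capped = energy_kwh(150, 82)    # power-capped run
saving_pct = 100 * (1 - capped / uncapped)
print(f"Per-GPU energy saving: ~{saving_pct:.0f}%")  # ~15%
```

Under that assumed baseline the saving lands in the 12-15% range the MIT team reports: a small time penalty buys a much larger energy reduction.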
Within data centres, operators are experimenting with more efficient cooling and heat reuse technologies such as high-density water-cooled racks to reduce energy consumption. Operations can be optimised so computations take place when power is cheaper and more readily available, while upgraded semiconductor process technology is improving the chips used. Optical waveguides can provide more energy-efficient connections between GPUs, and computer chip maker Nvidia claims that its new ‘superchip’ can boost generative AI performance 30-fold while using 25 times less energy. Progress is being made and will continue to be.
Making compromises for climate change
Despite the many valid concerns surrounding AI, what is clear is that it is here to stay.
It also doesn’t have to spell disaster for the renewable energy industry – in fact, it can be harnessed by clean energy providers to improve predictions surrounding supply and demand and to monitor any maintenance needs of assets like solar PV panels and wind turbines.
The challenge, therefore, lies not in trying to prevent or limit the growth of AI, but in making it more energy efficient.
Finding a solution will require a multi-pronged approach: sustained investment in renewable energy and the corresponding infrastructure, efficiency improvements in AI technology, better regulation, and increased public awareness of the true cost of models like ChatGPT. Together, these can reduce the energy consumption – and the cost – of AI.
— Lew 👋
As ever, your feedback is important to me. Please help by letting me know what you love or what you think could be improved.
The Transition’s work is provided for informational purposes only and should not be construed as advice in any capacity. Always do your own research.

We’re part of the Climate Media Collective - an initiative brought to you by 4WARD