Powering AI, Heating Earth: The Data Centre Climate Dilemma

The numbers should give anyone pause. Data centres worldwide consumed approximately 415 terawatt hours of electricity in 2024, representing about 1.5 per cent of global electricity consumption, according to the International Energy Agency. By 2030, that figure is projected to reach 945 terawatt hours, more than doubling in just six years. The culprit driving much of this growth has a familiar name: artificial intelligence. The same technology that promises to optimise our energy grids, monitor deforestation from orbit, and accelerate the discovery of climate solutions is itself becoming one of the most rapidly growing sources of energy demand on the planet.
This is the defining contradiction of our technological moment. We are building systems powerful enough to model the entire Earth's climate, predict extreme weather events with unprecedented accuracy, and optimise the operation of cities in real time. Yet these very systems require data centres that consume as much electricity as 100,000 households. The largest facilities under construction today will use twenty times that amount. Training a single large language model can emit more carbon dioxide than five cars produce over their entire lifetimes. And as AI becomes embedded in everything from web searches to medical diagnostics to autonomous vehicles, its aggregate energy footprint is accelerating faster than almost any other category of industrial activity.
The question is no longer abstract. It is urgent, measurable, and contested. Will artificial intelligence prove to be our most powerful tool for addressing climate change, or will its insatiable appetite for energy accelerate the very crisis it promises to solve?
Monitoring the Planet from Above
The most compelling case for AI's climate potential begins not in server rooms but in orbit. Climate TRACE, a global coalition co-led by former United States Vice President Al Gore, uses artificial intelligence to analyse satellite imagery and remote sensing data, generating emissions estimates from over 352 million sources worldwide. Unlike traditional emissions reporting, which relies on self-reported data from governments and corporations, Climate TRACE provides independent verification at a granularity that was impossible just a decade ago.
The platform's AI systems can identify activities including fuel combustion, deforestation, methane flaring, and industrial production across every major emitting sector. Its December 2025 release includes monthly emissions data through October of that year. For the first time, policymakers and researchers can see, in near real time, which specific facilities and regions are driving climate change. The world lost eighteen football fields' worth of tropical primary forest every minute in 2024, according to the University of Maryland's Global Land Analysis and Discovery Lab. That deforestation released 3.1 gigatonnes of greenhouse gas emissions. Satellite AI makes such destruction visible and attributable in ways it never was before; shame alone has not stopped it, but accountability might.
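To make the approach concrete, here is a minimal sketch of the kind of change detection that underpins satellite forest monitoring, using a standard vegetation index over two invented image tiles. The pixel values and thresholds are illustrative assumptions; Climate TRACE's production pipelines combine many sensors and far richer models.

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); dense forest typically scores above 0.6.
def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

# Two hypothetical 2x2 reflectance tiles of the same area, a year apart.
before = ndvi(nir=np.array([[0.5, 0.5], [0.5, 0.5]]),
              red=np.array([[0.1, 0.1], [0.1, 0.1]]))
after = ndvi(nir=np.array([[0.5, 0.2], [0.5, 0.2]]),
             red=np.array([[0.1, 0.2], [0.1, 0.2]]))

# Flag pixels that flipped from dense forest to bare ground.
lost = (before > 0.6) & (after < 0.3)
print(f"forest loss detected in {lost.mean():.0%} of pixels")  # -> 50%
```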
Research published in 2025 demonstrated that machine learning and neural network systems can reduce data reporting latency from 24 hours to just one hour, increase spatial resolution from 30 metres to 10 metres, and enhance detection accuracy from 80 per cent to 95 per cent. A collaboration between Planet Labs and Anthropic, announced in March 2025, combines daily geospatial satellite data with Claude's language model capabilities for pattern recognition at scale. NASA's Earth Copilot, developed with Microsoft using Azure's OpenAI Service, aims to make the space agency's vast Earth science datasets accessible to researchers worldwide.
The implications extend beyond monitoring to prediction. NVIDIA's Earth-2 platform, launched in 2024, accelerates detailed climate simulations far beyond what traditional computational models could achieve. Google's flood forecasting system now produces seven-day flood predictions across more than 80 countries, reaching approximately 460 million people. Prior to devastating floods in Brazil in May 2024, Google worked with Brazil's Geological Service to monitor over 200 new locations, helping authorities deploy effective crisis response strategies. These are not hypothetical capabilities. They are operational systems making measurable differences in how communities prepare for and respond to climate disasters.
Smart Grids and Energy Optimisation at City Scale
The Municipality of Trikala in Greece offers a glimpse of what AI-optimised urban energy management might look like at scale. As a designated city in the European Union's Mission Cities initiative, Trikala is deploying ABB's OPTIMAX platform to manage approximately 10 megawatts of energy infrastructure. The system integrates near real-time data from over 130 assets including public buildings, water infrastructure, schools, and future photovoltaic installations. Using cloud-based analytics and AI algorithms, the platform performs intraday and day-ahead optimisation to support the city's goal of achieving climate neutrality by 2030.
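As a toy illustration of what day-ahead optimisation involves, the sketch below schedules a single deferrable load against assumed hourly prices using a linear programme. All the numbers, and the one-asset framing, are invented for clarity; OPTIMAX co-optimises more than 130 assets under far richer constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Assumed day-ahead prices (EUR/MWh): cheapest overnight, peaking at midday.
hours = np.arange(24)
prices = 60 + 25 * np.sin((hours - 6) * np.pi / 12)

total_energy = 40.0  # MWh of deferrable load to place somewhere in the day
hourly_cap = 4.0     # MW ceiling in any single hour

result = linprog(
    c=prices,                                    # minimise total purchase cost
    A_eq=np.ones((1, 24)), b_eq=[total_energy],  # all energy must be delivered
    bounds=[(0.0, hourly_cap)] * 24,
)
print(result.x.round(1))             # MWh per hour, packed into cheapest hours
print(f"cost: {result.fun:.0f} EUR")
```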
Across the Atlantic, the PJM regional grid serves 65 million people in the eastern United States. During the June 2024 heatwave, demand spiked well beyond normal peaks. Subsequent analysis suggested that hyper-local, AI-driven weather forecasts could have anticipated the surge and allowed operators to allocate resources ahead of the crisis, proactively redistributing power to avoid blackouts and price shocks.
In the United Kingdom, National Grid ESO's collaboration with the nonprofit Open Climate Fix has produced breakthrough results in solar nowcasting. By training AI systems to read satellite images and track cloud movements, the platform provides highly accurate forecasts of solar generation several hours in advance. Open Climate Fix's transformer-based AI models are three times more accurate at predicting solar energy generation than the forecasts produced by traditional methods. The practical benefit is direct: with greater confidence in solar output predictions, National Grid ESO can reduce the backup gas generation it keeps idling, saving millions of pounds in fuel and balancing costs whilst cutting carbon emissions.
National Grid Partners announced in March 2025 a commitment to invest 100 million dollars in artificial intelligence startups advancing the future of energy. The funds target development of more efficient, resilient, and dynamic grids. Part of this investment went to Amperon, a provider of AI-powered energy forecasting whose technology helps utilities manage demand and ensure grid reliability. In Germany, E.ON uses AI to predict cable failures, cutting outages by 30 per cent. Italy's Enel reduced power line outages by 15 per cent through AI monitoring sensors. Duke Energy in the United States collaborates with Amazon Web Services on AI-driven grid planning.
Google reported that its AI increased the value of wind farm output by 20 per cent through better forecasting. Research indicates that generative AI models using architectures such as Generative Adversarial Networks and transformers can reduce root mean square error by 15 to 20 per cent in solar irradiance forecasting, significantly enhancing the ability to integrate renewables into power systems.
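For readers unfamiliar with the metric, the snippet below computes root mean square error for two synthetic irradiance forecasts, the second with roughly 15 to 20 per cent lower error, in line with the reported improvement. The data is simulated, not drawn from any of the studies cited.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic clear-sky irradiance over one day, in W/m^2.
actual = np.clip(800 * np.sin(np.linspace(0, np.pi, 48)), 0, None)
baseline = actual + rng.normal(0, 80, 48)  # a conventional forecast's errors
ai_model = actual + rng.normal(0, 65, 48)  # ~15-20 per cent lower error

def rmse(pred):
    return np.sqrt(np.mean((pred - actual) ** 2))

print(f"baseline: {rmse(baseline):.0f} W/m^2, AI: {rmse(ai_model):.0f} W/m^2")
```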
The market recognises the opportunity. The global market for AI in renewable energy was valued at 16.19 billion dollars in 2024 and is projected to reach 158.76 billion dollars by 2034, representing a compound annual growth rate exceeding 25 per cent. Approximately 74 per cent of energy companies worldwide are implementing or exploring AI solutions.
The Energy Footprint That Cannot Be Ignored
Here is where the story turns. For all the promise of AI-optimised climate solutions, the technology itself has become a significant and rapidly growing source of energy demand.
Google's 2025 Sustainability Report revealed a 27 per cent year-over-year increase in global electricity usage, bringing its total to roughly 32 terawatt hours. Microsoft similarly reported a 27 per cent rise in electricity usage for fiscal year 2024, reaching approximately 30 terawatt hours. Both companies' electricity consumption has roughly doubled since the 2018 to 2020 period, coinciding directly with their generative AI push. Barclays analysts noted that these gains put hyperscalers on track for a seventh consecutive year of electricity growth exceeding 25 per cent, and that was before the surge in AI inference demand.
The United States now accounts for the largest share of global data centre electricity consumption at 45 per cent, followed by China at 25 per cent and Europe at 15 per cent. American data centres consumed 183 terawatt hours of electricity in 2024, more than 4 per cent of the country's total electricity consumption. By the end of this decade, the country is on course to consume more electricity for data centres than for the production of aluminium, steel, cement, chemicals, and all other energy-intensive goods combined.
Training large language models requires staggering amounts of energy. The training of GPT-3 consumed approximately 1,287 megawatt hours, accompanied by over 552 tonnes of carbon emissions. GPT-4, with an estimated 1.75 trillion parameters, required more than 40 times the electricity of its predecessor. A 2019 study found that training a model using neural architecture search could emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car. According to MIT researcher Noman Bashir, a generative AI training cluster might consume seven or eight times more energy than a typical computing workload.
But training is not the largest concern. Inference is. Google estimates that of the energy used in AI, 60 per cent goes towards inference and 40 per cent towards training. Once deployed, models are queried billions of times. OpenAI reports that ChatGPT serves more than 2.5 billion queries daily. If the commonly cited estimate of 0.34 watt hours per query holds, that amounts to 850 megawatt hours daily, enough to charge thousands of electric vehicles every single day.
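The arithmetic is easy to check, and the electric vehicle comparison below assumes a 75 kilowatt hour battery, a figure chosen for illustration rather than taken from the sources.

```python
queries_per_day = 2.5e9   # OpenAI-reported daily ChatGPT queries
wh_per_query = 0.34       # the commonly cited per-query estimate
mwh_per_day = queries_per_day * wh_per_query / 1e6

ev_battery_kwh = 75       # assumed battery size, not from the sources
ev_charges = mwh_per_day * 1000 / ev_battery_kwh
print(f"{mwh_per_day:.0f} MWh/day, roughly {ev_charges:,.0f} full EV charges")
# -> 850 MWh/day, roughly 11,333 full EV charges
```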
Research by Sasha Luccioni, the Climate Lead at Hugging Face, found that day-to-day emissions from using AI far exceeded the emissions from training large models. For very popular models like ChatGPT, usage emissions could exceed training emissions in just a couple of weeks. A single ChatGPT image generation consumes as much energy as fully charging a smartphone. Generating 1,000 images produces as much carbon dioxide as driving 6.6 kilometres in a petrol-powered car.
The energy demands come with water costs. A typical data centre uses 300,000 gallons of water each day for cooling, equivalent to the demands of about 1,000 households. The largest facilities can consume 5 million gallons daily, equivalent to a town of 50,000 residents. The Lawrence Berkeley National Laboratory estimated that in 2023, American data centres consumed 17 billion gallons of water directly for cooling. By 2028, those figures could double or even quadruple. Google's data centre in Council Bluffs, Iowa consumed 1 billion gallons of water in 2024, its most water-intensive facility globally.
Scientists at the University of California, Riverside estimate that each 100-word AI prompt uses roughly one bottle of water, approximately 519 millilitres. Global AI-related water demand is expected to reach 4.2 to 6.6 billion cubic metres by 2027, exceeding Denmark's entire annual water consumption. An assessment of 9,055 data centre facilities indicates that by the 2050s, nearly 45 per cent may face high exposure to water stress.
The Jevons Paradox and the Efficiency Trap
There is a seductive notion that efficiency improvements will solve the energy problem. As AI models become more efficient, surely their energy footprint will shrink? History suggests otherwise.
The Jevons Paradox, first observed during the Industrial Revolution, demonstrated that as coal-burning technology became more efficient, overall coal consumption rose rather than fell. Greater efficiency made coal power more economical, spurring adoption across more applications. The same dynamic threatens to unfold with AI. As models become cheaper and faster to run, they proliferate into more applications, driving up total energy demand even as energy per operation declines.
Google's report on its Gemini model illustrated both sides of this coin. Over a recent 12-month period, the energy and carbon footprint of the median Gemini Apps text prompt dropped by 33 and 44 times respectively, all whilst delivering higher-quality responses. Yet Google's total electricity consumption still rose 27 per cent year over year. Efficiency gains are real, but they are being overwhelmed by the velocity of adoption.
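A rough calculation shows how both statements can be true at once. If total energy is per-prompt energy multiplied by prompt volume, the two figures above imply an enormous rise in volume; treat the result as purely illustrative, since Google's total electricity covers far more than Gemini prompts.

```python
per_prompt_drop = 33   # median Gemini text prompt energy fell ~33x
total_growth = 1.27    # Google's total electricity still rose 27%

# If total = volume * energy_per_prompt, volume must have grown by roughly:
implied_volume_growth = per_prompt_drop * total_growth
print(f"~{implied_volume_growth:.0f}x growth in prompt volume (illustrative)")
```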
The projections are sobering. Between 2024 and 2030, data centre electricity consumption is expected to grow at roughly 15 per cent per year, more than four times faster than total electricity consumption from all other sectors combined. AI-optimised data centres specifically are projected to see their electricity demand more than quadruple by 2030. By 2028, more than half of the electricity going to data centres will be used specifically for AI. At that point, AI alone could consume as much electricity annually as 22 per cent of all American households.
Microsoft announced in May 2024 that its carbon dioxide emissions had risen nearly 30 per cent since 2020 due to data centre expansion. Google's 2023 greenhouse gas emissions were almost 50 per cent higher than in 2019, largely due to energy demand tied to data centres. Research published in Nature Sustainability found that the AI server industry is unlikely to meet its net-zero aspirations by 2030 without substantial reliance on highly uncertain carbon offset and water restoration mechanisms.
The Nuclear Response
The tech industry's appetite for electricity has sparked a remarkable revival in nuclear power investment, driven not by governments but by the companies building AI infrastructure.
In September 2024, Microsoft and Constellation Energy announced a 20-year power purchase agreement to bring the dormant Unit 1 reactor at Three Mile Island back online. Microsoft will purchase a significant portion of the plant's 835 megawatt output to power its AI data centres in the mid-Atlantic region. The project, renamed the Crane Clean Energy Center in honour of the late Exelon chief executive Christopher M. Crane, represents the first time a retired nuclear reactor in the United States is being restored to serve a single corporate customer. In November 2025, the United States Department of Energy Loan Programs Office closed a 1 billion dollar federal loan to Constellation Energy, lowering the barrier to the restart. The reactor is targeted to resume operation in 2028.
Big tech companies signed contracts for more than 10 gigawatts of potential new nuclear capacity in the United States over the past year. Amazon Web Services secured a 10-year agreement to draw hundreds of megawatts from Talen Energy's Susquehanna nuclear plant in Pennsylvania. It subsequently obtained a 1.92 gigawatt power purchase agreement from the same facility and invested 500 million dollars in small modular reactor development. Google partnered with startup Kairos Power to deploy up to 500 megawatts of advanced nuclear capacity by the early 2030s. Kairos received a Nuclear Regulatory Commission construction permit in November 2024 for its 35 megawatt Hermes demonstration reactor in Oak Ridge, Tennessee.
Meta announced in June 2025 a 20-year agreement to buy 1.1 gigawatts of nuclear energy from the Clinton Clean Energy Center in Illinois. The commitment will support an expansion of the facility's output and deliver 13.5 million dollars in annual tax revenue to the surrounding community.
These deals represent an extraordinary acceleration in corporate energy procurement. Global electricity generation for data centres is projected to grow from 460 terawatt hours in 2024 to over 1,000 terawatt hours in 2030 and 1,300 terawatt hours by 2035. Nuclear offers carbon-free baseload power, but new reactors take years to build. The question is whether nuclear capacity can scale fast enough to meet AI's demand growth, or whether fossil fuels will fill the gap in the interim.
Quantifying the Trade-off
The most important question is whether AI's climate benefits outweigh its energy costs. Recent research offers the most rigorous attempt yet to answer it.
A study published in Nature's npj Climate Action by the Grantham Research Institute on Climate Change and the Environment and Systemiq found that AI advancements in power, transport, and food consumption could reduce global greenhouse gas emissions by 3.2 to 5.4 billion tonnes of carbon dioxide equivalent annually by 2035. In the power sector, AI could enhance renewable energy efficiency to reduce emissions by approximately 1.8 gigatonnes annually. In food systems, AI could accelerate adoption of alternative proteins to replace up to 50 per cent of meat and dairy consumption, saving approximately 3 gigatonnes per year. In mobility, AI-enabled shared transport and optimised electric vehicle adoption could reduce emissions by roughly 0.6 gigatonnes annually.
The IEA's own analysis supports a positive net impact. The adoption of existing AI applications in end-use sectors could lead to 1,400 megatonnes of carbon dioxide emissions reductions in 2035 in a Widespread Adoption scenario. That figure does not include breakthrough discoveries that might emerge thanks to AI over the next decade. By comparison, the IEA's base case projects total data centre emissions rising from approximately 180 million metric tonnes of carbon dioxide today to 300 million metric tonnes by 2035, potentially reaching 500 million metric tonnes in a high-growth scenario.
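A back-of-envelope check makes the comparison concrete.

```python
reductions_mt = 1400     # potential cuts in 2035, Widespread Adoption scenario
emissions_base_mt = 300  # IEA base case data centre emissions, 2035
emissions_high_mt = 500  # IEA high-growth scenario, 2035

print(reductions_mt / emissions_high_mt)  # 2.8x even in the high-growth case
print(reductions_mt / emissions_base_mt)  # ~4.7x against the base case
```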
On these numbers, the potential emissions reductions from AI applications would be roughly three to five times larger than the total emissions from the data centres running them. AI's net impact, the research suggests, remains overwhelmingly positive, provided it is intentionally applied to accelerate low-carbon technologies.
But that conditional is doing a great deal of work. The IEA cautioned that there is currently no momentum ensuring widespread adoption of beneficial AI applications. Their aggregate impact could be marginal if the necessary enabling conditions are not created. Barriers include constraints on access to data, absence of digital infrastructure and skills, regulatory and security restrictions, and social or cultural obstacles. Commercial incentives to apply AI in socially productive climate applications may be weak without active public policy.
Google Maps' eco-friendly routing uses AI to suggest routes with fewer hills, less traffic, and constant speeds. It has helped prevent over 1 million tonnes of carbon dioxide annually in its initial rollout across selected cities in Europe and the United States, equivalent to taking 200,000 cars off the road. But that application exists because it aligns with user preferences for faster routes. Many climate applications require explicit investment with less obvious commercial return.
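Under the hood, eco-routing can be framed as a shortest-path search in which edge weights encode estimated energy use rather than travel time. The toy router below runs Dijkstra's algorithm over an invented graph; Google's actual models and cost functions are proprietary.

```python
import heapq

# Edges: (neighbour, estimated energy cost) reflecting distance, gradient
# and congestion rather than time alone. Values are invented.
graph = {
    "A": [("B", 4.0), ("C", 2.5)],
    "B": [("D", 3.0)],
    "C": [("B", 1.0), ("D", 5.5)],
    "D": [],
}

def lowest_energy_route(start, goal):
    queue, settled = [(0.0, start, [start])], {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue
        settled[node] = cost
        for nxt, edge_cost in graph[node]:
            heapq.heappush(queue, (cost + edge_cost, nxt, path + [nxt]))
    return None

print(lowest_energy_route("A", "D"))  # -> (6.5, ['A', 'C', 'B', 'D'])
```

The flatter detour via C and B beats the seemingly direct route through B alone, which is exactly the kind of trade-off an energy-weighted cost function surfaces.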
Efficiency Gains and Green AI
Research is advancing on making AI itself more efficient. A report published by UNESCO and University College London found that small changes to how large language models are built and used can dramatically reduce energy consumption without compromising performance. Model compression through techniques such as quantisation can save up to 44 per cent in energy while maintaining accuracy. Experimental results show that such optimisation methods can reduce energy consumption and carbon emissions by up to 45 per cent, making compressed models suitable for resource-constrained environments.
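Post-training quantisation, one of the compression techniques the report highlights, takes only a few lines in practice. The sketch below applies PyTorch's dynamic quantisation to an invented toy model; the energy savings quoted above come from the report's measurements, not from this snippet.

```python
import torch
import torch.nn as nn

# An invented toy network standing in for a language model block.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Post-training dynamic quantisation: weights are stored as 8-bit integers
# and activations are quantised on the fly at inference time.
quantised = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
print(quantised(torch.randn(1, 512)).shape)  # same interface, smaller weights
```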
Luccioni's research at Hugging Face demonstrated that using large generative models to create outputs is far more energy-intensive than using smaller AI models tailored for specific tasks. Using a generative model to classify movie reviews consumes around 30 times more energy than using a fine-tuned model created specifically for that purpose. The implication is significant: not every application requires a massive general-purpose model.
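In practice this means reaching for a fine-tuned task model where one exists. Here is a minimal sketch using the Hugging Face pipeline API to load a small off-the-shelf sentiment classifier rather than prompting a large generative model; the roughly 30-fold energy gap is Luccioni's measurement, not something this snippet itself demonstrates.

```python
from transformers import pipeline

# A compact (~66 million parameter) model fine-tuned for exactly this task.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("A beautifully shot film let down by a hollow script."))
# e.g. [{'label': 'NEGATIVE', 'score': 0.99...}]
```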
IBM released architecture details for its Telum II Processor and Spyre Accelerator, designed to reduce AI-based energy consumption and data centre footprint. Power-capping hardware has been shown to decrease energy consumption by up to 15 per cent whilst only increasing response time by a barely noticeable 3 per cent.
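Software power-capping of the kind described above can be applied through NVIDIA's management library, as in this hedged sketch; it requires administrator privileges and supported hardware, and the 75 per cent cap is an arbitrary illustration rather than the setting used in the cited measurements.

```python
import pynvml  # bindings from the nvidia-ml-py package

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Query the allowed power-limit range for this device (milliwatts).
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

# Cap the card at 75 per cent of its maximum draw (requires root).
target_mw = max(min_mw, int(max_mw * 0.75))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```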
The training of Hugging Face's BLOOM model with 176 billion parameters consumed 433 megawatt hours of electricity, resulting in 25 metric tonnes of carbon dioxide equivalent. The relatively modest figure owes to its training on a French supercomputer powered mainly by nuclear energy, demonstrating that where and how AI is trained matters as much as model size.
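Dividing emissions by energy makes the point numerically: the figures cited in this piece imply very different grid carbon intensities for the two training runs.

```python
# Implied carbon intensity of each training run, from the figures above.
# Note: kg CO2e per MWh is numerically equal to g CO2e per kWh.
bloom = 25_000 / 433     # ~58 g CO2e/kWh, a nuclear-heavy French grid
gpt3 = 552_000 / 1_287   # ~429 g CO2e/kWh, a far more fossil-heavy mix
print(round(bloom), round(gpt3))
```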
A new movement in green AI is emerging, shifting from the "bigger is better" paradigm to "small is sufficient", emphasising energy sobriety through smaller, more efficient models. Small models are particularly useful in settings where energy and water are scarce, and they are more accessible in environments with limited connectivity.
The Transparency Problem
Any honest assessment of AI's climate impact faces a fundamental obstacle: we do not actually know how much energy AI systems consume. Currently, there are no comprehensive global datasets on data centre electricity consumption or emissions. Few governments mandate reporting of such figures. All numbers concerning AI's energy and climate impact are therefore estimates, often based on limited disclosures and modelling assumptions.
Factors including which data centre processes a given request, how much energy that centre uses, and how carbon-intensive its energy sources are tend to be knowable only to the companies running the models. This is true for most major systems including ChatGPT, Gemini, and Claude. OpenAI's Sam Altman stated a figure of 0.34 watt hours per query in a blog post, but some researchers say the smartest models can consume over 20 watt hours for a complex query. The range of uncertainty spans nearly two orders of magnitude.
Luccioni has called for mandatory disclosure of AI systems' environmental footprints. She points out that current AI benchmarks often omit critical energy consumption metrics entirely. Without standardised reporting, neither researchers nor policymakers can make informed decisions about the technology's true costs and benefits.
The UK's AI Energy Council
The United Kingdom has taken early steps to coordinate AI and energy policy at a national level. The AI Energy Council held its inaugural meeting in April 2025 and set out its key areas of focus: ensuring the UK's energy system can support AI and compute infrastructure, promoting sustainability through renewable energy solutions, supporting safe and secure AI adoption across the energy system, and advising on how AI can support the transition to net zero.
The Council's membership spans major technology companies including Google, Microsoft, Amazon Web Services, ARM, and Equinix, alongside energy sector participants including the National Energy System Operator, Ofgem, National Grid, Scottish Power, EDF Energy, and the Nuclear Industry Association. The IEA shared analysis at Council meetings indicating that model inference, not training, will be the dominant driver of AI energy use going forward.
A National Commission was announced to accelerate safe access to AI in healthcare, with plans to publish a new regulatory framework in 2026. The NHS Fit For The Future 10 Year Health Plan, published in July 2025, identified AI alongside data, genomics, wearables, and robotics as strategic technological priorities.
These institutional developments reflect growing recognition that AI's energy demands cannot be managed through market forces alone. They require coordination between technology developers, energy providers, and government bodies.
Tension Without Resolution
The climate contradiction at the heart of artificial intelligence does not resolve itself through technological optimism or pessimism. Both narratives contain truth. AI genuinely offers capabilities for climate monitoring, energy optimisation, and scientific discovery that no other technology can match. AI also genuinely imposes energy and water costs that are growing faster than almost any other category of industrial activity.
The Grantham Institute and Systemiq research offers what may be the most useful framing. Using best available estimates, AI could add 0.4 to 1.6 gigatonnes of carbon dioxide equivalent annually by 2035 through data centre energy demand. If effectively applied to accelerate low-carbon technologies, AI could reduce emissions by 3.2 to 5.4 gigatonnes annually over the same period. The net balance favours climate benefit, but only if beneficial applications are actively developed and deployed.
This is not a technology problem. It is a policy problem. The commercial incentives driving AI development overwhelmingly favour applications that generate revenue: chatbots, image generators, productivity tools, advertising optimisation. Climate applications often require public investment, regulatory frameworks, and infrastructure that markets do not automatically provide.
Luccioni has expressed frustration with the current trajectory. “We don't need generative AI in web search. Nobody asked for AI chatbots in messaging apps or on social media. This race to stuff them into every single existing technology is truly infuriating, since it comes with real consequences to our planet.” Her critique points to a deeper issue. The AI systems consuming the most energy are not primarily those monitoring deforestation or optimising power grids. They are those generating text, images, and video for applications whose climate value is questionable at best.
The largest tech companies have all set targets to become water positive by 2030, committing to replenish more water than their operations consume. Amazon, Alphabet, Microsoft, and Meta have joined a pledge to triple the world's nuclear capacity by 2050. These commitments are meaningful, but they also constitute an acknowledgment that current trajectories are unsustainable. If the status quo were compatible with net-zero goals, such dramatic interventions would be unnecessary.
Where This Leaves Us
Will AI solve the climate crisis or accelerate it? The honest answer is that it depends entirely on choices that remain to be made.
If AI development continues primarily along commercial lines, with efficiency gains continually outpaced by proliferation into ever more applications, the technology's energy footprint will continue its rapid expansion. Data centre electricity demand doubling by 2030 is the baseline projection. Higher-growth scenarios are entirely plausible.
If governments, international institutions, and technology companies actively prioritise climate applications, if AI is deployed to optimise energy grids, accelerate materials discovery, monitor emissions, and transform food systems, the potential emissions reductions dwarf the energy costs of the technology itself.
The technology is agnostic. It will do whatever its builders and users direct it to do. A search chatbot and a deforestation monitoring system run on fundamentally similar infrastructure. The difference lies in what questions we ask and what answers we choose to act upon.
The IEA noted that nearly half of the emissions reductions required by 2050 will come from technologies not yet fully developed. AI could accelerate their discovery. DeepMind's AlphaFold has predicted the structures of more than 200 million proteins, unlocking advances in areas including alternative proteins and energy storage. An overly simplistic view of AI's impacts risks underestimating its potential to accelerate important climate-solution breakthroughs, such as developing less expensive and more powerful batteries in months rather than decades.
But those breakthroughs do not happen automatically. They require funding, institutional support, data access, and regulatory frameworks. They require deciding that climate applications of AI are as important as consumer applications, and investing accordingly.
The servers are humming. The electricity meters are spinning. The satellites are watching. The question is not whether artificial intelligence will shape our climate future. It is whether we will shape artificial intelligence to serve that future, or simply allow it to consume resources in pursuit of whatever generates the next quarterly return.
The answer will determine more than the trajectory of a technology. It will determine whether the most powerful tools humanity has ever built become instruments of our survival or accelerants of our crisis. The data centres do not care which role they play. That choice belongs to us.
References and Sources
International Energy Agency. (2025). “Energy and AI: Energy Demand from AI.” IEA Reports. https://www.iea.org/reports/energy-and-ai/energy-demand-from-ai
International Energy Agency. (2025). “AI and Climate Change.” IEA Reports. https://www.iea.org/reports/energy-and-ai/ai-and-climate-change
Climate TRACE. (2025). “Global Emissions Monitoring Platform.” https://climatetrace.org/
Pew Research Center. (2025). “What We Know About Energy Use at U.S. Data Centers Amid the AI Boom.” https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/
Carbon Brief. (2025). “AI: Five Charts That Put Data-Centre Energy Use and Emissions Into Context.” https://www.carbonbrief.org/ai-five-charts-that-put-data-centre-energy-use-and-emissions-into-context/
MIT Technology Review. (2025). “We Did the Math on AI's Energy Footprint. Here's the Story You Haven't Heard.” https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
MIT News. (2025). “Explained: Generative AI's Environmental Impact.” https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
Google DeepMind. (2016). “DeepMind AI Reduces Google Data Centre Cooling Bill by 40%.” https://deepmind.google/blog/deepmind-ai-reduces-google-data-centre-cooling-bill-by-40/
Google DeepMind. (2018). “Safety-First AI for Autonomous Data Centre Cooling and Industrial Control.” https://deepmind.google/blog/safety-first-ai-for-autonomous-data-centre-cooling-and-industrial-control/
Epoch AI. (2024). “How Much Energy Does ChatGPT Use?” https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
Ritchie, H. (2025). “What's the Carbon Footprint of Using ChatGPT?” https://hannahritchie.substack.com/p/carbon-footprint-chatgpt
S&P Global. (2025). “Global Data Center Power Demand to Double by 2030 on AI Surge: IEA.” https://www.spglobal.com/energy/en/news-research/latest-news/electric-power/041025-global-data-center-power-demand-to-double-by-2030-on-ai-surge-iea
World Economic Forum. (2025). “How Data Centres Can Avoid Doubling Their Energy Use by 2030.” https://www.weforum.org/stories/2025/12/data-centres-and-energy-demand/
Luccioni, S. (2025). “The Environmental Impacts of AI: Primer.” Hugging Face Blog. https://huggingface.co/blog/sasha/ai-environment-primer
MIT Technology Review. (2023). “Making an Image with Generative AI Uses as Much Energy as Charging Your Phone.” https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/
Springer Nature. (2024). “Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training.” https://link.springer.com/article/10.1007/s44163-024-00149-w
NPR. (2024). “Three Mile Island Nuclear Plant Will Reopen to Power Microsoft Data Centers.” https://www.npr.org/2024/09/20/nx-s1-5120581/three-mile-island-nuclear-power-plant-microsoft-ai
IEEE Spectrum. (2024). “Microsoft Powers Data Centers with Three Mile Island Nuclear.” https://spectrum.ieee.org/three-mile-island
Nature. (2025). “Will AI Accelerate or Delay the Race to Net-Zero Emissions?” https://www.nature.com/articles/d41586-024-01137-x
LSE Grantham Research Institute. (2025). “New Study Finds AI Could Reduce Global Emissions Annually by 3.2 to 5.4 Billion Tonnes of Carbon-Dioxide-Equivalent by 2035.” https://www.lse.ac.uk/granthaminstitute/news/new-study-finds-ai-could-reduce-global-emissions-annually-by-3-2-to-5-4-billion-tonnes-of-carbon-dioxide-equivalent-by-2035/
Nature. (2025). “Green and Intelligent: The Role of AI in the Climate Transition.” npj Climate Action. https://www.nature.com/articles/s44168-025-00252-3
MDPI. (2025). “AI-Based Energy Management and Optimization for Urban Infrastructure: A Case Study in Trikala, Greece.” https://www.mdpi.com/3042-5743/35/1/76
PV Magazine. (2025). “AI Powered Solar Forecasting Helps UK Grid Operator Reduce Balancing Costs.” https://www.pv-magazine.com/2025/11/07/ai-powered-solar-forecasting-helps-uk-grid-operator-reduce-balancing-costs/
NVIDIA Blog. (2024). “AI Nonprofit Forecasts Solar Energy for UK Grid.” https://blogs.nvidia.com/blog/ai-forecasts-solar-energy-uk/
GOV.UK. (2025). “AI Energy Council Minutes: Monday 30 June 2025.” https://www.gov.uk/government/publications/ai-energy-council-meetings-minutes/ai-energy-council-minutes-monday-30-june-2025-html
Brookings Institution. (2025). “AI, Data Centers, and Water.” https://www.brookings.edu/articles/ai-data-centers-and-water/
Environmental and Energy Study Institute. (2025). “Data Centers and Water Consumption.” https://www.eesi.org/articles/view/data-centers-and-water-consumption
Nature Sustainability. (2025). “Environmental Impact and Net-Zero Pathways for Sustainable Artificial Intelligence Servers in the USA.” https://www.nature.com/articles/s41893-025-01681-y
UNESCO. (2025). “AI Large Language Models: New Report Shows Small Changes Can Reduce Energy Use by 90%.” https://www.unesco.org/en/articles/ai-large-language-models-new-report-shows-small-changes-can-reduce-energy-use-90
U.S. Department of Energy. (2024). “AI for Energy: Opportunities for a Modern Grid and Clean Energy Economy.” https://www.energy.gov/sites/default/files/2024-04/AI%20EO%20Report%20Section%205.2g(i)_043024.pdf

Tim Green, UK-based Systems Theorist & Independent Technology Writer
Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.
His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.
ORCID: 0009-0002-0156-9795 Email: tim@smarterarticles.co.uk