Dr Philip Inglesant, SGR, looks at key developments in the civilian Artificial Intelligence industry over the past year, especially the massive spending increases, the rapid expansion in the number of data centres, and the continued deregulation. He points to numerous problems ahead.
Article from Responsible Science journal no. 8 (April 2026). Advance online publication 24 March 2026.
Responsible Artificial Intelligence (AI) development risks being overtaken by the rush to innovate, with very large investments by the major players. The largest share of this spending is on data centre capacity. Data centres require ever-larger amounts of power, which is projected to reach a significant percentage of worldwide demand. Lack of readily available power is a constraint on AI expansion, particularly in the USA, and it is planned to meet this extra demand, in part, by on-site generation, including Small Modular (Nuclear) Reactors and some fossil fuel generators. At the same time, numerous ethical issues are being sidelined or ignored as regulation is curtailed. This article outlines some key developments over the past year or so.
Deregulation gains pace
A year ago, I wrote about the dangers of unconstrained, irresponsible AI in Responsible Science.[1] At that time, we were beginning to see the shape of the new Trump administration in the USA, the UK Labour government was a few months into power, and Large Language Models (LLMs) were suddenly in the news and in public awareness. Much has changed in the short year since then – but much remains the same, only more so.
I wrote then, “AI should be judged not on how much it benefits business and ‘innovators’ but on how much it benefits humanity.” These views are widely held. The UN issued its report Governing AI for Humanity in September 2024;[2] the AI for Humanity programme ran a track at the Global Digital Collaboration Conference in Geneva in July 2025;[3] Humanity AI is gathering $500 million in (almost entirely USA-based) philanthropic donations with the aim that “our future with AI can and will be what we make it”.[4] As the UN report argues, “the development, deployment and use of such a technology cannot be left to the whims of markets alone”.
Unfortunately, despite these well-meaning interventions, AI continues to surge, with eye-watering investments by established and emerging corporations coinciding with governments’ decided lack of interest in regulation, or outright opposition to it.
In the USA, the Trump administration promulgated Executive Order (EO) 14356 Ensuring a National Policy Framework for Artificial Intelligence[5] in December 2025, as a follow-up to EO 14179 of January 2025[6] (discussed in my previous piece). The new EO is intended to prevent American states introducing their own regulations. In addition, EO 14319, with the extraordinary title of Preventing Woke AI in the Federal Government,[7] prohibits federal agencies from procuring LLMs incorporating principles of diversity, equity, and inclusion, critical race theory, etc. (“ideological biases or social agendas”) in AI models – while permitting naked racism, homophobia and misogyny, as long as it comes out of AI. These recent EOs have been clearly critiqued by the Congressional Black Caucus.[8],[9]
In the UK, the Labour government is intent on moving beyond the Sunak-era ‘distinctive UK’ approach to AI regulation, in favour of minimal regulation. Giving evidence to Parliament’s Science, Innovation and Technology Committee in December 2025, Secretary of State Liz Kendall stated that “My view is that there is no route to significantly better growth in this country… without science, technology and innovation absolutely leading at front and centre. That is my overwhelming focus”.[10] Meanwhile, a Private Member’s Bill, Artificial Intelligence (Regulation), introduced in the House of Lords by Conservative peer Lord Holmes, has not been allocated any parliamentary time since its first (and, so far, only) reading in March 2025.[11] There is, to be fair, concern for AI safety in areas such as security, sustainability, and children online – such as the recent belated move to block illegal content generated by Grok and similar chatbots[12] – but scarcely any mention of ethical issues such as mass surveillance, social sorting, or AI misuse, nor any sense that AI could be directed towards the real problems facing humanity.
The US and UK approaches contrast with the European Union’s Artificial Intelligence Act 2024, which is clearly regulatory.[13] This is not the place to go into the details of this Act, but there is an irony in the relative success of the EU in developing regulations across a non-federal union of nations, compared with the ‘states’ rights’ battleground in the USA, where individual states are responding to the anti-regulatory stance of the Trump administration (and the lack of regulation by the Biden administration) by developing a complex set of state-specific AI regulations.[14]
One especially important ethical issue is the increasing military use of AI, but this will be covered in a separate article.
The AI spending spree
Now, though, the widespread talk is less about AI as ‘the’ key to growth and prosperity and more about worries that AI will prove yet another ‘bubble’,[15],[16] given the enormous sums invested, the even larger future commitments, and the absence, so far, of clear pathways to profitability.
It is challenging to estimate just how much money has been raised in the race to develop and maintain leadership in AI. All told, AI took almost 50% of all global startup funding in 2025, rising to around $202bn, up from $114bn in 2024.[17] Of this, 79% went to US-based companies.
OpenAI, developer of the AI assistant ChatGPT, is now the most valuable private company of all time. Nvidia and Amazon have agreed to invest a combined $80bn in the company alongside another $30bn from Japan’s SoftBank.[18] OpenAI had a valuation of $500bn at a secondary share sale in October 2025,[19] and may reach a valuation of $750-830bn after further fundraising, but has about $1.4tr in spending obligations over eight years.[20],[21]
Meanwhile, Meta (owner of Facebook), Google, Microsoft, and Amazon – but notably not Apple, which appears to be sitting out the ‘AI capex arms race’ – have announced plans to spend a total of $660bn during 2026 in AI capital expenditure, a 60% rise over 2025.[22]
Where is all this money going? A large fraction will be spent on data centres. Data centre partners of OpenAI have already committed to spend $800bn.[23] Anthropic, developer of AI assistant Claude, has announced $50bn investment in data centres with its partner Fluidstack.[24] OpenAI CEO Sam Altman is reported to have told journalists over a dinner in San Francisco, “You should expect OpenAI to spend trillions of dollars on data centre construction in the not very distant future”.[25]
Looking forward, a report from McKinsey & Co predicts that data centres will require around $6.7tr of investment worldwide by 2030 to keep pace with demand, of which $5.2tr is related to AI demand.[26] However, this is the ‘Continued momentum’ scenario. If growth accelerates beyond it, then AI-related data centres might need up to $7.9tr of investment; if growth is more constrained, then this investment might be ‘only’ $3.7tr. This spending will not only be on data centre infrastructure: roughly two-thirds will go on IT equipment, and rather less than one-tenth on electrical power, according to the McKinsey report.
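To give a rough sense of how McKinsey’s headline figure divides up, the shares quoted above can be turned into a back-of-envelope calculation (a sketch only – the report’s own categories are more detailed, and the two-thirds and one-tenth shares are approximate):

```python
# Back-of-envelope breakdown of McKinsey's $6.7tr 'Continued momentum'
# data centre investment estimate (all figures in trillions of USD).
total = 6.7          # projected worldwide data centre investment by 2030
ai_related = 5.2     # of which AI-related
it_share = 2 / 3     # roughly two-thirds on IT equipment
power_share = 0.1    # rather less than one-tenth on electrical power

it_spend = total * it_share
power_spend = total * power_share

print(f"IT equipment:       ~${it_spend:.1f}tr")        # ~$4.5tr
print(f"Electrical power:  <~${power_spend:.2f}tr")      # <~$0.67tr
print(f"AI share of total:  {ai_related / total:.0%}")   # ~78%
```

On these rounded shares, IT equipment alone would absorb some $4.5tr – several times the entire power-related spend.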
As always, there is considerable uncertainty. New AI models might require less electricity for training and inference, or conversely new use cases and widespread access might increase demand. The supply chain is complex, and disruptions could slow the ability of companies to expand. Geopolitical tensions, and tariffs and trade controls, could impact demand one way or the other.
The data centre energy surge
This data centre expansion translates into raw electrical power, measured in gigawatts (GW). OpenAI ended 2025 with around 1.9 GW of data centre power, up from 0.6 GW in 2024,[27] and is aiming for 30 GW by 2030, while arch-rival Anthropic is aiming for around 6 GW in three years. These are explosive increases compared with the organic growth of ‘hyperscaler’ cloud providers AWS, Microsoft Azure, and Google, which each built around 10-12 GW of capacity over 15-17 years.[28],[29]
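The household-equivalent conversion in reference [29] – roughly 876,000 typical US homes per gigawatt – gives a feel for what these targets mean (an illustrative approximation only, since actual data centre load factors differ from household averages):

```python
# Rough household equivalents for the data centre power figures quoted,
# using the conversion in reference [29]: 1 GW ~ 876,000 typical US homes.
HOUSEHOLDS_PER_GW = 876_000

targets = [
    ("OpenAI, end of 2025", 1.9),
    ("OpenAI, 2030 target", 30),
    ("Anthropic, ~3-year target", 6),
]

for label, gw in targets:
    homes_millions = gw * HOUSEHOLDS_PER_GW / 1e6
    print(f"{label}: {gw} GW ~ {homes_millions:.1f} million US households")
```

By this measure, OpenAI’s 30 GW target alone is equivalent to the electricity use of over 26 million American homes.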
The AI behemoths believe that building more and more computing power is essential to maintain customer adoption and revenue. Sarah Friar, chief financial officer of OpenAI, has written that lack of current power is already a constraint on growth.[30]
The McKinsey report cited above[31] suggests that, in the Continued momentum scenario, AI would demand 156 GW of data centre power worldwide by 2030, which is an increase of about 125 GW since 2024. Their higher estimate would demand a whopping 205 GW extra by 2030. And that’s only AI-related demand. Non-AI data centre workload is also expected to continue to grow, albeit more slowly, roughly doubling to 64 GW.
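Working backwards from these rounded figures gives the implied 2024 baselines, which is a useful sanity check on the scale of growth being projected (the arithmetic below uses the report’s rounded numbers, so the baselines are approximate):

```python
# Sanity check of the McKinsey 'Continued momentum' power figures (GW).
ai_2030 = 156        # AI-related data centre demand by 2030
ai_increase = 125    # stated increase over 2024
non_ai_2030 = 64     # non-AI workload by 2030, 'roughly doubling'

ai_2024 = ai_2030 - ai_increase      # implied 2024 AI baseline
non_ai_2024 = non_ai_2030 / 2        # implied 2024 non-AI baseline
total_2030 = ai_2030 + non_ai_2030   # combined 2030 demand

print(f"Implied 2024 AI demand:     ~{ai_2024} GW")        # ~31 GW
print(f"Implied 2024 non-AI demand: ~{non_ai_2024:.0f} GW")  # ~32 GW
print(f"Combined demand by 2030:    ~{total_2030} GW")     # ~220 GW
```

In other words, AI-related demand was roughly level with non-AI demand in 2024, but is projected to be nearly two and a half times larger by 2030.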
To put this into perspective, it is projected that US data centres could consume 11.7% of the entire US power grid by 2030.[32] This is higher than in other countries because the USA hosts around 45% of the world’s data centres, with particular concentrations in Virginia, Texas, and California,[33],[34] although in Ireland data centres already account for around 20% of domestic electricity demand.[35] Across Europe, data centre power demand is projected to reach 4-5% of total electricity consumption by 2030, up from around 2-3% in 2024.
Access to electricity has become a critical factor in the drive for new data centre builds. As the power ecosystem struggles to meet this demand, it faces constraints in securing reliable and sustainable energy sources, grid infrastructure, power equipment, and the skilled workers to build them. For example, the lead time to power new data centres in Northern Virginia can be more than three years.[36] In Dublin, the state-owned electricity operator imposed a moratorium on data centres in early 2022 and set out conditions for connecting new ones to the grid. In places including West London and the south of Sweden, businesses are waiting years to be connected.[37]
Despite this rapid growth, it should be remembered that data centre electricity demand is still far from the largest driver of global growth in final electricity demand, sitting behind air conditioning and the electrification of industries.[38] Globally, around 85% of the increase in electricity demand over 2024-2027 is projected to come from China, India, and other emerging economies, but demand in the ‘advanced’ economies – virtually static for some years – is also expected to grow strongly, driven by the deployment of electric vehicles, air conditioners, and heat pumps, as well as data centres.[39]
Embracing nuclear and fossil fuels
With the US power grid unable to keep pace with demand, there is increasing interest in onsite power generation: around 30% of data centres are expected to use at least some onsite power, supplemental to the grid, by 2030, according to a survey of hyperscaler and colocation developer companies.[40] However, onsite or local power generation can be reliant on fossil fuels and even non-compliant with basic clean air regulations.[41] Under Trump, the Environmental Protection Agency’s enforcement actions are at a record low, while the federal government is actively restraining moves towards decarbonisation.[42]
The rapidly increasing demand also threatens firms’ sustainability targets, outpacing their green agenda.[43] The promise of stable, low-carbon, reliable power has led leading AI companies to turn towards nuclear power, especially so-called Small Modular Reactors (SMRs).[44] Sam Altman was chair of an SMR company called Oklo but stepped down in April 2025 to avoid a potential conflict of interest.[45] In October 2024, Amazon, together with partners, agreed to invest $500m in SMR start-up X-Energy, aiming to bring 5 GW online in the United States by 2039.[46] Google has ordered six to seven SMRs from nuclear energy company Kairos Power specifically to power its AI processing, aimed to come onstream between 2030 and 2035.[47]
However, these timescales are unlikely to be achieved due to numerous technical obstacles.[48] Indeed, the desperation of Big Tech has perhaps been shown most clearly by attempts to restart recently closed nuclear reactors. For example, in September 2024, Microsoft signed a 20-year deal to take electricity generated by a newly re-opened Unit 1 at Three Mile Island. This is next to Unit 2, site of the 1979 partial meltdown, the worst accident in the history of US commercial nuclear power. Constellation Energy, the owner of Three Mile Island Unit 1, has felt it necessary to rebrand the facility as the Crane Clean Energy Center.[49] Especially disturbing is the news that the Trump administration has recently watered down nuclear safety regulations to help developers.[50]
Pushback
In addition to the initiatives from the UN and international bodies discussed earlier, numerous civil society organisations and others are increasingly speaking out against these poorly regulated developments. New networks and coalitions are forming, including PauseAI[51] and, in the UK, Pull The Plug.[52] In addition, a range of new ‘AI-free’ consumer labels are being piloted in sectors where there are concerns that AI is a threat to jobs.[53] We must all become more engaged in this debate if society is to redirect AI onto a more responsible path.
Dr Philip Inglesant spent 20 years teaching and researching social aspects of computing and Responsible Innovation in areas including AI, quantum computing, and information technologies more broadly at universities throughout the UK. He is now retired. He is an Advisor to SGR’s Board of Directors.
Image credit: Ron Lach via Pexels.
References
[1] Inglesant P (2025). Responsible AI: are governments and corporations giving up? Responsible Science, no.7. February. https://www.sgr.org.uk/resources/responsible-ai-are-governments-and-corporations-giving
[2] United Nations Advisory Body on Artificial Intelligence (2024). Governing AI for humanity: final report. https://digitallibrary.un.org/record/4062495?v=pdf
[3] AI for Humanity (2025). Key Takeaways from the AI For Humanity Track of the 2025 Global Digital Collaboration Conference. https://ai4humanity.ai/pdf/GDC%20AI%20For%20Humanity%20Track%20Report%20-%20v2.0.pdf
[4] Humanity AI (non-dated). Our future with AI can and will be what we make it. https://humanityai.ai/
[5] Trump D (2025a). Ensuring a National Policy Framework for Artificial Intelligence. Executive Order 2025/12/11. https://www.whitehouse.gov/presidential-actions/2025/12/eliminating-state-law-obstruction-of-national-artificial-intelligence-policy/
[6] Trump D (2025b). Removing Barriers To American Leadership In Artificial Intelligence. Executive Order 2025/01/23. https://www.whitehouse.gov/presidential-actions/2025/01/removing-barriers-to-american-leadership-in-artificial-intelligence/
[7] Trump D (2025c). Preventing Woke AI in the Federal Government. Executive Order. 2025/07/23. https://www.whitehouse.gov/presidential-actions/2025/07/preventing-woke-ai-in-the-federal-government/
[8] Congressional Black Caucus Foundation (2025a). CBCF Executive Order Tracker: Ensuring a National Policy Framework for Artificial Intelligence. https://www.cbcfinc.org/wp-content/uploads/2025/12/Ensuring-a-National-Policy-Framework-for-Artificial-Intelligence-.pdf
[9] Congressional Black Caucus Foundation (2025b). CBCF Executive Order Tracker: Preventing Woke AI in the Federal Government. https://www.cbcfinc.org/wp-content/uploads/2025/08/Preventing-Woke-AI-in-the-Federal-Government-.pdf
[10] UK Parliament, Science, Innovation and Technology Committee (2025). Oral evidence: Work of the Secretary of State for the Department for Science, Innovation and Technology, HC 1543. https://committees.parliament.uk/oralevidence/16843/pdf
[11] UK Parliament (2025). Artificial Intelligence (Regulation) Bill [HL]. March. https://bills.parliament.uk/bills/3942
[12] Novik M (2026). UK to tighten online safety laws to include AI chatbots. Financial Times. 15 February. https://www.ft.com/content/15917aa4-2d40-49be-85c3-da395b16e7f1
[13] Wikipedia (non-dated). Artificial Intelligence Act. https://en.wikipedia.org/wiki/Artificial_Intelligence_Act
[14] Susaria A (2025). How states are placing guardrails around AI in the absence of strong federal regulation. The Conversation. 6 August. https://theconversation.com/how-states-are-placing-guardrails-around-ai-in-the-absence-of-strong-federal-regulation-260683
[15] Roberts M (2026). AI and creative destruction. Blog post. 3 February. https://thenextrecession.wordpress.com/2026/02/03/ai-and-creative-destruction/
[16] Heaven W (2025). The great AI hype correction of 2025. MIT Technology Review. 15 December. https://www.technologyreview.com/2025/12/15/1129174/the-great-ai-hype-correction-of-2025/
[17] Teare G (2025). 6 Charts That Show The Big AI Funding Trends Of 2025. Crunchbase. 16 December. https://news.crunchbase.com/ai/big-funding-trends-charts-eoy-2025/
[18] OpenAI (2026). Scaling AI for everyone. 27 February. https://openai.com/index/scaling-ai-for-everyone/
[19] Sigalos M (2025). OpenAI wraps $6.6 billion share sale at $500 billion valuation. CNBC. 2 October. https://www.cnbc.com/2025/10/02/openai-share-sale-500-billion-valuation.html
[20] Campbell D, Bergman B (2026). Can OpenAI make the numbers meet? It’s a trillion-dollar question. Business Insider. 10 February. https://www.businessinsider.com/openai-profitability-analyst-investor-opinions-funding-ipo-2026-2
[21] Altman S (2025). I would like to clarify a few things. X. 6 November. https://x.com/sama/status/1986514377470845007
[22] Morris et al (2026). Big Tech’s ‘breathtaking’ $660bn spending spree reignites AI bubble fears. Financial Times. 5 February. https://www.ft.com/content/0e7f6374-3fd5-46ce-a538-e4b0b8b6e6cd
[23] Choi E (2026). Who wins the AI race? X. 23 January. https://x.com/EthanChoi7/status/2014629421089800690
[24] Fluidstack (2025). Fluidstack selected by Anthropic to deliver custom data centers in New York and Texas. 12 November. https://www.fluidstack.io/about-us/blog/fluidstack-selected-by-anthropic-to-deliver-custom-data-centers-in-the-us
[25] Heath A (2025). I talked to Sam Altman about the GPT-5 launch fiasco. The Verge. 15 August. https://www.theverge.com/command-line-newsletter/759897/sam-altman-chatgpt-openai-social-media-google-chrome-interview
[26] Noffsinger et al (2025). The cost of compute: A $7 trillion race to scale data centers. McKinsey Report. April. https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers
[27] Friar S (2026). A business that scales with the value of intelligence. OpenAI. https://openai.com/index/a-business-that-scales-with-the-value-of-intelligence/
[28] Choi (2026). Op. cit.
[29] A power plant with capacity of one gigawatt could power roughly 876,000 typical US households. https://www.carboncollective.co/sustainable-investing/gigawatt-gw
[30] Friar (2026). Op. cit.
[31] Noffsinger et al (2025). Op. cit.
[32] Green et al (2024). How data centers and the energy sector can sate AI’s hunger for power. McKinsey & Co. September. https://www.mckinsey.com/industries/private-capital/our-insights/how-data-centers-and-the-energy-sector-can-sate-ais-hunger-for-power
[33] Cargoson Transport Management Software (2025). Number of Data Centers by Country. November. https://www.cargoson.com/en/blog/number-of-data-centers-by-country
[34] Datacenter Map (non-dated). Explore The Leading Global Data Center Database! https://www.datacentermap.com/
[35] de Roucy-Rochegonde L, Buffard A (2025). AI, Data Centers and Energy Demand: Reassessing and Exploring the Trends. IFRI Report. https://www.ifri.org/sites/default/files/2025-02/ifri_buffard-rochegonde_ai_data_centers_energy_2025_2.pdf
[36] Green et al (2024). Op. cit.
[37] de Roucy-Rochegonde and Buffard (2025). Op. cit.
[38] Ibid.
[39] International Energy Agency (2025). Electricity 2025: Analysis and forecast to 2027. February. https://www.iea.org/reports/electricity-2025
[40] Bloom Energy (2025). From Gridlock To Growth: How Power Bottlenecks Are Reshaping Data Center Strategies. 2025 Data Center Power Report. January. https://media.datacenterdynamics.com/media/documents/2025-Data-Center-Power-Report-1_-_Matthew_Westwood.pdf
[41] Simon E (2026). ‘A different set of rules’: thermal drone footage shows Musk’s AI power plant flouting clean air regulations. The Guardian. 13 February. https://www.theguardian.com/environment/2026/feb/13/elon-musk-xai-datacenters-air-pollution-mississippi
[42] Dolphin G (2025). Trump 2.0: US climate policy in retreat. Oxford Economics. May. https://www.oxfordeconomics.com/resource/trump-2-0-us-climate-policy-in-retreat/
[43] de Roucy-Rochegonde and Buffard (2025). Op. cit.
[44] Intelligent Data Centres (2025). Editor’s question: Powering tomorrow’s data centres – are SMRs the game-changer? 15 July. https://www.intelligentdatacentres.com/2025/07/15/editors-question-powering-tomorrows-data-centres-are-smrs-the-game-changer/
[45] Wikipedia (non-dated). Oklo, Inc. https://en.wikipedia.org/wiki/Oklo_Inc
[46] X-energy (2024). Amazon Invests in X-energy to Support Advanced Small Modular Nuclear Reactors and Expand Carbon Free Power. 16 October. https://x-energy.com/media/news-releases/amazon-invests-in-x-energy-to-support-advanced-small-modular-nuclear-reactors-and-expand-carbon-free-power
[47] Wikipedia (non-dated). Kairos Power. https://en.wikipedia.org/wiki/Kairos_Power
[48] See, for example: Thomas S (2023). Small Modular Reactors: the last-chance saloon for the nuclear industry? Responsible Science, no.5. https://www.sgr.org.uk/resources/small-modular-reactors-last-chance-saloon-nuclear-industry ; Ramana M (2024). Nuclear is Not the Solution. Verso.
[49] Constellation (2024). Constellation to Launch Crane Clean Energy Center, Restoring Jobs and Carbon-Free Power to The Grid. 20 September. https://www.constellationenergy.com/news/2024/Constellation-to-Launch-Crane-Clean-Energy-Center-Restoring-Jobs-and-Carbon-Free-Power-to-The-Grid.html
[50] NPR (2026). The Trump administration has secretly rewritten nuclear safety rules. January. https://www.npr.org/2026/01/28/nx-s1-5677187/nuclear-safety-rules-rewritten-trump
[51] PauseAI. https://pauseai.info/
[52] Pull The Plug. https://pulltheplug.uk/
[53] BBC News (2026). Is this product 'human-made'? The race to establish an AI-free logo. https://www.bbc.co.uk/news/articles/cj0d6el50ppo