Big Tech AI capex now drives three-quarters of US GDP growth
AI infrastructure spending by Amazon, Microsoft, Alphabet, Meta and Oracle accounted for roughly 75 per cent of US economic growth in the first quarter of 2026. Halt the build-out, and a recession headline follows within two quarters.

Capital expenditure on artificial-intelligence infrastructure by the five largest US technology companies is now responsible for roughly 75 per cent of all US economic growth, according to estimates compiled this week by the analytics firm TECHi, raising the political stakes for any policy change that might curb the build-out.
The figures, drawn from the disclosed 2026 capex guidance of Amazon, Microsoft, Alphabet, Meta and Oracle and from Bureau of Economic Analysis quarterly releases, show combined committed AI capex above $440 billion for 2026 and on track to exceed $580 billion by 2028. AI-related capex is expected to add roughly 2.5 percentage points to US GDP growth in 2026 and more than 3 points in 2027.
In the first quarter of 2026 alone, AI-related capex accounted for 75 per cent of the 2 per cent annualised GDP growth recorded by the Bureau of Economic Analysis. Strip out the AI build-out and the US economy grew at roughly 0.5 per cent annualised, effectively flat, despite the war-driven energy price shock and accelerating inflation.
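The arithmetic behind those claims is straightforward to check. A minimal sketch, using only the figures quoted above (illustrative arithmetic, not BEA methodology):

```python
# Back-of-envelope check of the Q1 2026 growth decomposition.
# All inputs are the article's own figures.

gdp_growth = 2.0   # per cent, annualised, Q1 2026 (BEA)
ai_share = 0.75    # AI capex share of that growth (TECHi estimate)

ai_contribution = gdp_growth * ai_share      # percentage points from AI capex
ex_ai_growth = gdp_growth - ai_contribution  # growth with AI capex stripped out

# Implied two-year growth rate of committed hyperscaler capex,
# $440bn (2026) -> $580bn (2028)
capex_cagr = (580 / 440) ** 0.5 - 1

print(f"AI contribution: {ai_contribution:.1f} pp")   # 1.5 pp
print(f"Ex-AI growth:    {ex_ai_growth:.1f} %")       # 0.5 %
print(f"Capex CAGR:      {capex_cagr:.1%}")           # ~14.8%
```

The 1.5 percentage points from AI capex, against 0.5 points from everything else, is what makes the "effectively flat" characterisation defensible.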
"Big Tech is spending so much money building AI infrastructure that it is, quite literally, carrying the entire US economy on its back," wrote Jazib Zaman, the analyst who compiled the figures.
The findings are likely to become an early data point in the political fight over AI regulation. The Federal Trade Commission has flagged the vertical integration of AI infrastructure providers as a competition concern, the European Union continues to litigate the Digital Markets Act, and Beijing has tripled state-directed AI investment since 2024, framing the build-out as a national-security imperative.
The token-factory phase
The current build-out is in what Zaman called the "token factory" phase. Money is flowing into physical AI data centres stuffed with NVIDIA Blackwell GPUs, AMD MI400 accelerators, Google's custom TPU v6, AWS Trainium 3 and Microsoft's Maia 200 silicon. The output of these facilities is not yet a commercial service — it is the capacity to produce inference and training tokens, the unit of work that powers every modern AI workload. Even non-hyperscaler companies have started restructuring around AI cost bases in anticipation of the next phase of the cycle.
The economic literature on previous general-purpose technology cycles — electricity, the internet, mobile — suggests that returns on capital from a build-out of this scale typically arrive only after the deployment phase has largely concluded. Microsoft's most recent quarterly disclosure shows roughly $13 billion in annualised AI revenue against $89 billion in AI capex, a 6.8x gap. That ratio cannot persist indefinitely, but it is consistent with the early years of every previous infrastructure cycle.
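One rough way to frame how long a 6.8x gap takes to close: if revenue compounds faster than capex, the catch-up time falls out of a simple log ratio. The growth rates below are illustrative assumptions for the sketch, not company guidance:

```python
import math

def years_to_parity(gap: float, revenue_growth: float, capex_growth: float) -> float:
    """Years until revenue catches capex, assuming constant annual
    growth rates: solves gap / ((1+r)/(1+c))**n = 1 for n."""
    return math.log(gap) / math.log((1 + revenue_growth) / (1 + capex_growth))

gap = 89 / 13  # ~6.8x, from Microsoft's disclosed figures above

# If AI revenue doubled every year while capex grew 15% a year
# (both assumed rates), parity would arrive in roughly 3.5 years:
print(f"{years_to_parity(gap, 1.00, 0.15):.1f} years")
```

Slow the assumed revenue growth and the horizon stretches fast, which is the crux of the bull-versus-bear disagreement described below.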
The bull case for AI capex requires the gap to close from the revenue side: AI agents at scale write production code, automate enterprise workflows, generate drug-discovery candidates and replace customer-support headcount. The bear case requires the gap to close from the capex side: hyperscalers cut spending because the marginal data centre cannot be filled with paying workloads. Investors are sharply divided on which case prevails.
Power, not silicon, is the binding constraint
The single most important physical constraint on the AI build-out has shifted from silicon to power. Grid operators in PJM, ERCOT and MISO are running up against interconnection and capacity limits. Constellation Energy and NextEra have multi-year nuclear power purchase agreements pre-booked with hyperscalers. GE Vernova's gas-turbine order book is a leading indicator for the next phase of the build-out, and the company's stock multiple has expanded in line with the thesis.
A senior technology analyst at one large US asset manager, who asked not to be named because they are not authorised to speak publicly, said the next two percentage points of GDP contribution from AI capex were "almost entirely a power story." The bottleneck is no longer GPU supply, the analyst said. It is grid interconnect, transmission permitting, and behind-the-meter generation.
Hyperscalers have begun signing direct power deals with Constellation, NextEra, and several private utilities in Texas. Microsoft signed a 20-year power purchase agreement in February for a restarted Three Mile Island Unit 1, the first US restart of a fully decommissioned nuclear reactor. Google signed a deal in April with Kairos Power for small modular reactor capacity. Amazon and Oracle have both pursued behind-the-meter generation strategies in Virginia and Texas.
The IA13 cohort
The capital flowing through the hyperscalers does not vanish. Every billion that Amazon, Microsoft, Google, Meta and Oracle spend on AI infrastructure flows directly into a narrow basket of providers — what TECHi has dubbed the IA13. The cohort of roughly thirteen public companies spans GPU and accelerator silicon (NVIDIA, AMD, Broadcom), networking (Arista, Cisco, Marvell), data-centre physical infrastructure (Vertiv, Eaton, Trane), and power generation (Constellation, NextEra, GE Vernova).
The defining feature of the IA13 thesis is concentration. Only a handful of companies on the planet can supply hyperscale-grade AI infrastructure at the volumes required, and order books are committed 12 to 18 months out for most of them. The AMD-Meta multi-year compute deal disclosed in March is the latest data point on how locked-in this demand is. Goldman Sachs noted in a Tuesday client memo that the hyperscaler-IA13 procurement structure was "the closest thing the public equity market has had to a regulated utility cycle since the 1990s telecom build-out."
What happens if hyperscaler capex flat-lines
The first quarter where two or more hyperscalers cut their forward capex guidance simultaneously would be the cleanest possible top-of-cycle signal. So far, every major hyperscaler has revised its 2027 guidance upward for six consecutive quarters. Alphabet's $4.67 trillion market capitalisation, after Google Cloud's most recent quarter at 63 per cent year-on-year revenue growth, is now within $200 billion of NVIDIA — a gap that has tightened by half since January.
Sovereign AI buildouts in the United Arab Emirates, Saudi Arabia, India, Japan and a growing list of mid-size nations are committing tens of billions to indigenous AI capacity. These are largely additive to US hyperscaler spending, not competitive with it, and the supplier base overlaps almost completely with the IA13. Sovereign demand could add another 50 to 100 basis points to global AI capex growth through 2028, on top of the hyperscaler totals.
The political math
The political economy of AI capex is increasingly difficult to separate from the macro picture. Pulling back AI spending in the name of antitrust, climate, or labour policy would mechanically produce a recession headline within two quarters. No administration is going to volunteer for that outcome, particularly with the political framing of the AI build-out dominated by US-versus-China strategic competition. Beijing has tripled state-directed AI investment since 2024 and is racing to deploy domestic GPU alternatives at scale.
That framing virtually guarantees continued policy support for hyperscaler capex through tax incentives, accelerated permitting for power and data centres, and continued export-control posture against advanced semiconductor sales to China. NVIDIA's position in the geopolitical equation is the single most important corporate variable, and its relationship with both US export-control authorities and Chinese customers is now, in effect, a US foreign policy file.
For investors, the question is when the convergence between AI capex and AI revenue arrives. For Washington, the question is whether to manage the build-out or just enable it. For the median household, the question is more immediate: how much of the additional GDP shows up in wages, prices, or productivity, and how soon. None of those questions has an answer yet. What is clear is that, for now, the AI build-out is the US economy.
Kai Mendel
Technology editor covering fintech, AI and the platform economy. Reports from San Francisco.


