
Note · 30.04.2026

Infrastructure is becoming the true center of gravity of AI

The market no longer looks only at models. It looks at chips, electrical capacity, cloud and compute contracts.

A common mistake is still to assume that AI value will concentrate first in the most visible models. The fortnight examined here suggests the opposite: the economic center of gravity is shifting toward compute capacity, chips, cloud agreements, data centre density and the forms of commercial lock-in that make this infrastructure scarce. Market performance confirms it: semiconductor and robotics vehicles outperformed the broad indices even as rate cuts remain elusive and geopolitical risk has intensified. AI is increasingly traded as a critical-infrastructure theme, not only as a software theme.

OpenAI triggers a new equilibrium

On 27 April, OpenAI and Microsoft formalised an amended partnership. Microsoft remains the principal cloud partner, but the licence becomes non-exclusive and OpenAI can now serve its products on any cloud. A day later, OpenAI's latest models and its Codex coding agent arrived on Bedrock at AWS. The sequence reduces OpenAI's single-supplier dependency, broadens AWS's competitive room and turns frontier model access into a cloud orchestration question. Value can shift from labs toward infrastructure operators able to secure capacity, distribution and tooling.

Amazon assembles a near-industrial platform

The group disclosed that its chip business (Graviton, Trainium and Nitro) already exceeds an annualised pace of USD 20 billion. It announced about 2 gigawatts of Trainium capacity for OpenAI, up to 5 gigawatts for Anthropic, more than 2.1 million AI chips shipped over twelve months and more than one million NVIDIA GPUs to be deployed from 2026. Bedrock customer spend grew 170% quarter on quarter. This is no longer a pure cloud bet; it is the build-out of a compute platform where proprietary chips, multi-year agreements and managed services reinforce each other. The boundary between cloud provider, silicon designer and model distributor is blurring.

Alphabet embraces the hardware option

Google Cloud is no longer just a passive recipient of external AI demand. Management stated that cloud revenue grew 63% on the back of demand for its AI products and infrastructure, and TPUs can now be sold directly to certain clients. As long as a hyperscaler keeps its accelerators for internal use, the market values it as an integrated operator. As soon as it begins selling them, it also becomes a strategic hardware supplier. The upside on gross margin and ecosystem depth grows, but so do future capex needs and supply chain sensitivity.

Microsoft: scarcity is the new normal

Management insisted on a crucial point: demand continues to exceed supply, and the company expects to remain capacity-constrained at least through end-2026 despite annual capex of around USD 190 billion. The build-out cycle is far from over. Scarcity is not only about models; it also extends to data centre land, advanced components, assembly lines, electricity, networking equipment and chips. This is why the sector remains expensive while still able to advance.

Dispersion across stocks, dispersion across narratives

AMD gained 20.1% over the period, supported by the idea that the compute boom cannot remain concentrated on a single GPU vendor forever. NVIDIA was nearly flat, with the market beginning to price in buyer diversification, the weight of in-house chips and the rise of challengers. ASML fell 5.0%, which does not contradict the long-term thesis but shows that even picks-and-shovels names can correct when the rotation becomes more selective. AMD sustained its momentum by announcing Advancing AI 2026 as its global flagship event, built around an end-to-end AI offering.

Geopolitics enters the valuation

The US administration is considering new restrictions to prevent Chinese foundry Hua Hong from accessing certain US manufacturing technologies through foreign subsidiaries. China launched a campaign against AI misuse targeting deepfakes, financial disinformation and manipulated content. The AI value chain is becoming a question of economic and informational security. For a family office, this argues for a stricter mapping of dependencies by jurisdiction, cloud provider, China exposure and concentration in critical components.

Three layers to distinguish

The integrated infrastructure layer (cloud plus silicon plus tools) remains the most robust today. The hardware-challenger layer can capture a growing share of buyer budgets as customers diversify suppliers. Finally, the equipment and subsystem layer retains strong value creation but delivers more intermittent equity performance after large moves. In AI, the economic scarcity is no longer the model; it is the capacity.