Mustafa Suleyman: AI Development Won't Hit a Wall — 1,000x Compute by End of 2028
Microsoft AI CEO Mustafa Suleyman writes in MIT Technology Review that AI training compute has grown 1 trillion times since 2010, predicting another 1,000x in effective compute by end of 2028.
Microsoft AI CEO Mustafa Suleyman has written an op-ed in MIT Technology Review explaining why AI development won't hit a wall.
According to Suleyman, the amount of training compute has grown 1 trillion times since 2010 (from 10^14 to 10^26 FLOP). Nvidia GPU performance has increased more than 7x in six years (312 to 2,250 teraflops), and Microsoft's Maia 200 chip delivers 30% better performance per dollar than any other hardware in the company's fleet.
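The quoted figures are easy to sanity-check. A quick sketch (the 312-teraflop baseline is whichever 2019-era GPU the op-ed compares against; that pairing is an assumption here):

```python
# Sanity-check the growth figures quoted from Suleyman's op-ed.
compute_2010 = 1e14  # FLOP of training compute, 2010
compute_now = 1e26   # FLOP of training compute today
growth = compute_now / compute_2010
print(f"Training compute growth: {growth:.0e}x")  # 1e+12, i.e. 1 trillion times

gpu_then = 312   # teraflops, six years ago (which chip this refers to is an assumption)
gpu_now = 2250   # teraflops today
print(f"GPU speedup: {gpu_now / gpu_then:.1f}x")  # ~7.2x, "over 7x"
```

Both numbers line up: 10^26 / 10^14 is exactly a trillionfold, and 2,250 / 312 ≈ 7.2.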
HBM3 (High Bandwidth Memory) provides 3x the bandwidth of its predecessor, while NVLink and InfiniBand connect hundreds of thousands of GPUs into warehouse-scale supercomputers. Training that took 167 minutes on 8 GPUs in 2020 now takes under 4 minutes on equivalent modern hardware.
Suleyman predicts another 1,000x in effective compute by end of 2028 and envisions 200GW of compute coming online annually by 2030 — equivalent to the combined peak energy use of the UK, France, Germany, and Italy. He forecasts a transition from chatbots to 'near-human-level agents' capable of executing weeks-long projects autonomously.
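It is worth noting what a 1,000x gain by end of 2028 implies about the growth rate. Assuming a window of roughly three years from now (the exact window is my assumption, not stated in the summary), effective compute would have to double about every 3.6 months:

```python
import math

# 1,000x growth over ~36 months (window assumed) implies this doubling time.
months = 36
doublings = math.log2(1000)          # ~9.97 doublings needed for a 1,000x gain
doubling_time = months / doublings
print(f"~{doubling_time:.1f} months per doubling")  # ~3.6 months
```

That pace is faster than Moore's law alone, which is consistent with the op-ed's framing: the gains come from chips, memory bandwidth, interconnect, and algorithms compounding together rather than from transistor scaling by itself.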