
Mustafa Suleyman: AI Development Won't Hit a Wall — 1,000x Compute by End of 2028

Microsoft AI CEO Mustafa Suleyman writes in MIT Technology Review that AI training compute has grown 1 trillion times since 2010, and predicts another 1,000x in effective compute by the end of 2028.

Tags: Microsoft · Mustafa Suleyman · Compute · GPU · Superintelligence

Microsoft AI CEO Mustafa Suleyman has written an op-ed in MIT Technology Review explaining why AI development won't hit a wall.


According to Suleyman, the amount of training compute has grown 1 trillion times since 2010 (from 10^14 to 10^26 FLOPs). Nvidia GPU performance has increased more than 7x in six years (312 to 2,250 teraflops), and Microsoft's Maia 200 chip delivers 30% better performance per dollar than any other hardware in its fleet.
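The figures in the paragraph above are internally consistent, as a quick arithmetic check shows (all input values are taken from the article; the GPU comparison is between the ~2019-era and current chips it cites):

```python
# Sanity-check the growth figures cited in the op-ed.
compute_2010 = 1e14   # total training compute in 2010, FLOPs (article figure)
compute_now = 1e26    # total training compute today, FLOPs (article figure)
growth = compute_now / compute_2010
print(f"Training-compute growth: {growth:.0e}x")   # 1e+12, i.e. 1 trillion times

gpu_old = 312.0       # teraflops, GPU of six years ago (article figure)
gpu_new = 2250.0      # teraflops, current GPU (article figure)
print(f"GPU performance growth: {gpu_new / gpu_old:.1f}x")   # ~7.2x, "over 7x"
```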


HBM3 (High Bandwidth Memory) provides 3x the bandwidth of its predecessor, while NVLink and InfiniBand connect hundreds of thousands of GPUs into warehouse-scale supercomputers. Training that took 167 minutes on 8 GPUs in 2020 now takes under 4 minutes on equivalent modern hardware.
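The training benchmark above implies roughly a 40x wall-clock speedup at the same GPU count:

```python
# Speedup implied by the benchmark cited in the article:
# 167 minutes on 8 GPUs in 2020 vs. under 4 minutes on modern hardware.
old_minutes = 167
new_minutes = 4
print(f"~{old_minutes / new_minutes:.0f}x faster")   # ~42x faster
```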


Suleyman predicts another 1,000x in effective compute by end of 2028 and envisions 200GW of compute coming online annually by 2030 — equivalent to the combined peak energy use of the UK, France, Germany, and Italy. He forecasts a transition from chatbots to 'near-human-level agents' capable of executing weeks-long projects autonomously.
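Taken at face value, a 1,000x gain in effective compute by the end of 2028 implies roughly 10x growth per year (assuming, as a rough reading of the prediction, a three-year window starting from the end of 2025):

```python
# Annualized growth rate implied by "1,000x by end of 2028".
# The three-year window is an assumption, not stated in the article.
total_factor = 1000
years = 3
annual_factor = total_factor ** (1 / years)
print(f"~{annual_factor:.0f}x effective compute per year")   # ~10x per year
```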
