
Microsoft launches Maia 200 AI chip to rewrite the rules of cloud inference
Microsoft has pulled back the curtain on Maia 200, a custom AI accelerator built for Azure. The company claims up to 3x the FP4 inference performance of Amazon's Trainium3 and higher FP8 throughput than Google's TPU v7, positioning the chip to cut inference costs while scaling frontier models.
