Product Launch · ai accelerator · llm · microsoft · inference efficiency
Microsoft Announces Maia 200 AI Accelerator
Relevance Score: 10.0
Microsoft today announced the Maia 200 AI accelerator, the successor to Maia 100, built on TSMC’s 3nm process with more than 100 billion transistors. Microsoft says Maia 200 delivers three times the FP4 performance of Amazon’s Trainium Gen3 and stronger FP8 performance than Google’s TPU v7, will host OpenAI’s GPT-5.2, improves inference cost-efficiency by roughly 30 percent, and is beginning deployment in the Azure US Central region.
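The announcement does not say how the roughly 30 percent cost-efficiency figure is measured. As a minimal sketch, it could mean either 30 percent more tokens served per dollar or 30 percent lower cost per token; both readings are assumptions, and the baseline price below is a hypothetical placeholder, not a published number.

```python
# Illustrative arithmetic only: the ~30% figure is Microsoft's claim; the
# baseline serving cost below is a hypothetical placeholder, not real pricing.

baseline_cost_per_million_tokens = 1.00  # hypothetical baseline (USD)

# Reading 1: 30% better cost-efficiency = 30% more tokens per dollar.
cost_reading_1 = baseline_cost_per_million_tokens / 1.30
print(f"Reading 1 cost per million tokens: {cost_reading_1:.2f} (about 23% lower)")

# Reading 2: 30% better cost-efficiency = 30% lower cost per token.
cost_reading_2 = baseline_cost_per_million_tokens * 0.70
print(f"Reading 2 cost per million tokens: {cost_reading_2:.2f} (30% lower)")
```

Under the first reading the per-token cost falls by about 23 percent rather than 30, so the two interpretations are not interchangeable when comparing against other accelerators.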



