Microsoft introduced Maia 200, a next-generation AI inference chipset built on TSMC 3nm technology, with over 140 billion transistors, high-bandwidth memory and scalable networking. Maia 200 aims to accelerate large-scale AI workloads in Azure with improved performance per dollar and support for demanding models such as GPT-5.2, and comes with an SDK for developers.