Microsoft has launched the Maia 200 chip, designed specifically for AI inference workloads. The chip packs over 100 billion transistors and delivers more than 10 petaflops at 4-bit (FP4) precision, a substantial performance increase over the Maia 100. Microsoft claims the Maia 200 outperforms Amazon's Trainium3 by 3x in FP4 performance.

From techcrunch.com (2 min read)