Microsoft launched the Maia 200 chip, designed specifically for AI inference workloads. The chip features over 100 billion transistors and delivers over 10 petaflops at 4-bit precision, a substantial performance increase over the Maia 100. Microsoft claims the Maia 200 outperforms Amazon's Trainium3 by 3x in FP4 performance.
From techcrunch.com