Major tech companies are investing billions in nuclear power to fuel AI infrastructure: Meta signed a 20-year deal for 1,121 MW, Microsoft is spending $1.6B to restart Three Mile Island, and Amazon invested $1.15B in nuclear technology. This represents a fundamental failure in optimization: instead of making AI models more efficient, Big Tech is scaling power infrastructure. Meanwhile, DeepSeek trained a competitive 671B-parameter model for $5.5M, Berkeley replicated OpenAI's reasoning model for $450 in 19 hours, and Stanford did it for $50 in 26 minutes. The real competition isn't about who has more compute—it's about who can achieve the same results with dramatically less energy through smarter engineering and architectural decisions.
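To make the scale of the gap concrete, here is a quick back-of-the-envelope calculation using only the dollar figures quoted above (the figures themselves are as reported in the text; no other numbers are assumed):

```python
# Cost figures as quoted in the article (USD).
microsoft_tmi_restart = 1.6e9   # Three Mile Island restart
deepseek_training     = 5.5e6   # DeepSeek 671B-parameter model
berkeley_replication  = 450     # Berkeley reasoning-model replication
stanford_replication  = 50      # Stanford reasoning-model replication

# Ratios: how many cheaper training runs fit in each budget.
print(round(microsoft_tmi_restart / deepseek_training))  # ~291 DeepSeek-scale runs
print(round(deepseek_training / berkeley_replication))   # ~12222 Berkeley-scale runs
print(round(deepseek_training / stanford_replication))   # 110000 Stanford-scale runs
```

Even taking the quoted numbers at face value, the restart budget alone covers hundreds of DeepSeek-scale training runs, which is the article's core point: efficiency gains dwarf infrastructure spending.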