AI poses significant environmental challenges: inference consumes enormous energy, GPU hardware churns every 2-3 years, and users are shielded from the true costs. Speaking at QCon London, Ludi Akue argued that technical solutions such as model compression, quantization, RAG, and small language models are necessary but insufficient.

5m read · From infoq.com
