Data centers already consume 1.5% of global electricity, and AI workloads threaten to double that by 2030. Cloud carbon accounting is difficult because over 80% of emissions are Scope 3, requiring both billing-level and infrastructure-level measurement. Open source tools like Kepler (a CNCF project using eBPF and ML to estimate workload power consumption), the Software Carbon Intensity (SCI) ISO standard, and KEIT are emerging as the foundation for carbon-aware computing. Efficiency techniques like bin-packing, right-sizing, and Karpenter-based autoscaling can cut compute costs by 40%+ while reducing carbon intensity. However, the Jevons paradox means efficiency gains in AI may increase total energy demand, making clean energy grids and policy equally important alongside engineering improvements. The author calls on the cloud native community to treat carbon efficiency as seriously as cost and reliability.
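The SCI standard mentioned above scores software as carbon emitted per unit of useful work: ((E × I) + M) / R, where E is energy consumed, I is grid carbon intensity, M is embodied (hardware) emissions, and R is the functional unit. A minimal sketch of that calculation, with all the input numbers being hypothetical illustration values:

```python
def sci(energy_kwh: float, grid_intensity_g_per_kwh: float,
        embodied_g: float, functional_units: float) -> float:
    """Software Carbon Intensity: ((E * I) + M) / R, in gCO2e per functional unit."""
    operational_g = energy_kwh * grid_intensity_g_per_kwh  # E * I
    return (operational_g + embodied_g) / functional_units

# Hypothetical service: 120 kWh consumed, on a 400 gCO2e/kWh grid,
# with a 50,000 gCO2e amortized hardware share, serving 1,000,000 requests.
print(sci(120, 400, 50_000, 1_000_000))  # gCO2e per request
```

Note that the functional unit R is the operator's choice (per request, per user, per job), which is what makes SCI a comparable efficiency rate rather than a raw emissions total.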
Table of contents
- Why Cloud Carbon Accounting is so Hard—and so Necessary
- The Emerging Stack: Kepler, SCI, KEIT, and the OSS Engine Behind Green Cloud
- Efficiency is the New Cloud Frontier
- The AI Paradox: Why Efficiency Alone Won’t Save Us
- What’s Next: A Call to the Cloud Native Community