Gemma 4, Google's latest generation of open multimodal models, is now available on Docker Hub as OCI artifacts. The lineup includes four model variants, ranging from small, efficient models (E2B at 5.1B parameters, E4B at 8B parameters) to a Mixture-of-Experts model (26B A4B, i.e. 26B total parameters with 4B active per token) and a flagship dense model (31B), with context windows of up to 512K tokens.
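Since the models ship as OCI artifacts, they can be managed with Docker Model Runner's `docker model` commands. A minimal sketch, assuming the Gemma 4 variants follow the same `ai/` namespace convention as earlier model releases on Docker Hub (the `ai/gemma4` name here is illustrative, not a confirmed tag):

```shell
# Pull the model as an OCI artifact from Docker Hub
# (the ai/gemma4 name is an assumption based on the ai/ namespace convention)
docker model pull ai/gemma4

# Run a one-off prompt against the locally pulled model
docker model run ai/gemma4 "Summarize the benefits of OCI-packaged models."

# List models available locally
docker model list
```

Because the model is packaged as a standard OCI artifact, it benefits from the same registry tooling as container images: versioned tags, content-addressed layers, and distribution through Docker Hub.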

3 min read · From docker.com
Table of contents

- What Docker Brings to Gemma 4
- What’s New in Gemma 4?
- Technical Specifications
- Build the Future of AI with Docker Hub
