LiteRT, Google's on-device AI inference framework and successor to TensorFlow Lite, has graduated its advanced acceleration capabilities into full production. Key highlights include cross-platform GPU support (Android, iOS, macOS, Windows, Linux, and Web) via the ML Drift engine, delivering 1.4x faster performance than legacy TFLite.

From developers.googleblog.com · 7 min read
Table of contents

- High-performance cross-platform GPU acceleration
- Streamlined NPU integration with peak performance
- Superior cross-platform GenAI support
- Broad ML framework support
- Reliability and compatibility you can trust
- What's next
- Acknowledgements
