Google AI Edge Torch is a new offering that provides a direct path from PyTorch to the TensorFlow Lite runtime for high-performance inference of PyTorch models on mobile devices. It offers easy conversion, broad model coverage, and strong CPU performance. The release includes coverage and performance improvements over existing workflows, early adoption by Shopify for on-device background removal, and partnerships with hardware companies for improved performance and coverage. Qualcomm has also released a new TensorFlow Lite delegate that delivers significant speedups using DSPs and neural processing units.
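As a rough illustration of the conversion path described above, the sketch below converts a small PyTorch module to a `.tflite` file with `ai_edge_torch.convert`. The `SmallModel` class, the sample input shape, and the output path are illustrative assumptions, not from the post; consult the official AI Edge Torch documentation for the exact API surface.

```python
import torch

class SmallModel(torch.nn.Module):
    """A tiny example model; stands in for any PyTorch nn.Module."""

    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

def convert_to_tflite(model, sample_input, out_path="model.tflite"):
    """Convert an eval-mode PyTorch module to a TFLite flatbuffer.

    Assumes the ai_edge_torch package is installed; convert() takes the
    module plus a tuple of sample inputs used to trace the graph, and the
    returned edge model can be serialized with export().
    """
    import ai_edge_torch  # imported lazily; pip install ai-edge-torch

    edge_model = ai_edge_torch.convert(model.eval(), (sample_input,))
    edge_model.export(out_path)
    return edge_model

# Usage (sketch):
#   model = SmallModel()
#   convert_to_tflite(model, torch.randn(1, 4))
# The resulting model.tflite then runs under the TensorFlow Lite runtime
# on-device, including via hardware delegates such as Qualcomm's.
```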

Source: developers.googleblog.com (3-minute read)
Table of contents

- A simple, PyTorch-centric experience
- Coverage & Performance
- Early Adoption and Partnerships
- Silicon partnerships & delegates
- What's next?
