Databricks has launched Qwen3-Embedding-0.6B in public preview on its platform. Built on the Qwen3 foundation, this 0.6B-parameter multilingual embedding model supports a context length of up to 32k tokens, covers 100+ languages, and outperforms flagship models from OpenAI and Cohere on MTEB benchmarks while rivaling much larger 7B+ models.
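As a rough sketch of how such a model might be used once deployed: Databricks serving endpoints expose an OpenAI-compatible API, so an embedding call plus a similarity check could look like the following. The endpoint name `qwen3-embedding-0-6b` and the environment variables are illustrative assumptions, not values from the announcement.

```python
import os
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def embed(texts):
    """Embed a batch of texts via a Databricks serving endpoint.

    Assumes an OpenAI-compatible endpoint; the endpoint name below
    is a hypothetical placeholder for a deployed Qwen3-Embedding-0.6B.
    """
    from openai import OpenAI

    client = OpenAI(
        api_key=os.environ["DATABRICKS_TOKEN"],
        base_url=os.environ["DATABRICKS_HOST"] + "/serving-endpoints",
    )
    resp = client.embeddings.create(model="qwen3-embedding-0-6b", input=texts)
    return [d.embedding for d in resp.data]
```

Usage would be along the lines of `vecs = embed(["query text", "document text"])` followed by `cosine_similarity(vecs[0], vecs[1])` to score relevance; because the model is multilingual, the two texts need not share a language.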

5-minute read · From databricks.com