Float16 Vector Type Support in SingleStore: Cheaper, Faster, Better
SingleStore 9.1 introduces the VECTOR(<N>, F16) data type, enabling 16-bit floating point storage for vector embeddings. Benchmarks on the GIST 1M dataset show F16 cuts storage by ~50% (1.79GB vs 3.58GB) and speeds up exact kNN queries by ~38% compared to F32, while maintaining statistically equivalent recall (~96-98%). ANN index search times also improve slightly due to better cache utilization. The post explains why F16 precision is sufficient for most vector search use cases (semantic search, RAG), covers migration steps from F32 to F16, and notes that F32 remains preferable for high-precision domains like finance or medical imaging.
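As a rough illustration of the data type described above, the following sketch shows how a VECTOR(<N>, F16) column might be declared and queried in SingleStore; the table and column names here are hypothetical, and the exact syntax should be checked against the SingleStore 9.1 documentation.

```sql
-- Hypothetical table using 16-bit float embeddings (960-dim, as in GIST 1M)
CREATE TABLE docs (
  id BIGINT PRIMARY KEY,
  embedding VECTOR(960, F16)
);

-- Exact kNN: order by dot-product similarity against a query vector
SELECT id, embedding <*> @query_vec AS score
FROM docs
ORDER BY score DESC
LIMIT 10;
```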
Table of contents
Introduction
Cheaper
Faster
ANN Indexed Search
Better
Putting F16 Vectors into Practice
Conclusion
Appendix