Dragonfly, a CNCF Graduated P2P file distribution system, now natively supports hf:// and modelscope:// URL protocols for downloading AI models directly from the Hugging Face and ModelScope hubs. Instead of each GPU node independently downloading large models (e.g., a 130 GB DeepSeek-V3 across 200 nodes = 26 TB of origin traffic), Dragonfly fetches the model from the hub once and distributes it peer-to-peer across the cluster.
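The article's 26 TB figure can be checked with quick arithmetic. The sketch below uses the numbers from the example; the "with P2P" line assumes the ideal case where the hub is hit exactly once and every other node receives pieces from its peers.

```python
# Back-of-envelope comparison of hub (origin) egress with and
# without P2P distribution, using the article's example numbers.

MODEL_SIZE_GB = 130  # e.g. DeepSeek-V3 weights
NODES = 200          # GPU nodes pulling the model

# Every node downloads independently from the hub.
naive_egress_tb = MODEL_SIZE_GB * NODES / 1000

# Idealized P2P: one origin fetch, all other nodes served by peers.
p2p_egress_tb = MODEL_SIZE_GB / 1000

print(f"naive: {naive_egress_tb:.1f} TB, p2p: {p2p_egress_tb:.2f} TB")
# naive: 26.0 TB, p2p: 0.13 TB
```

In practice origin traffic sits slightly above the ideal single fetch (seed peers may re-fetch pieces), but the savings remain roughly proportional to cluster size.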

From cncf.io (12 min read)
Table of contents

- The problem: AI model distribution is broken at scale
- What Is Dragonfly?
- Introducing native model hub protocols in Dragonfly
- The hf:// Protocol — Hugging Face hub
- The modelscope:// Protocol — ModelScope hub
- Under the hood: Technical deep dive
- Real-world impact: Where this matters
- Comparison: Why not just use platform CLIs?
- Getting started
- What's next
- Contributing
- Conclusion
