DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model that rivals GPT-4 Turbo on code-specific tasks. It supports 338 programming languages and significantly outperforms its predecessor, DeepSeek-Coder-33B, across a range of benchmarks. The model is available in 16B and 236B total-parameter versions, with 2.4B and 21B active parameters per token, respectively.
