China's open-source AI ecosystem has shifted toward Mixture-of-Experts (MoE) architectures as the default choice, prioritizing cost-performance balance over maximum capability. Leading organizations have expanded beyond text models into multimodal domains (video, audio, 3D), with growing emphasis on small models (0.5B-30B parameters).
7 min read · From huggingface.co
Table of contents

- Mixture of Experts (MoE) as the Default Choice
- The Rush for Supremacy by Modality
- Big Preferences for Small Models
- More Permissive Open Source Licenses
- From Model-First to Hardware-First
- Reconstruction In Progress