Large Language Models
Kimi K2: A Trillion-Parameter Open-Source LLM
Explore Kimi K2, the 1.04T parameter open-source MoE model. Our deep dive covers its MuonClip optimizer, agentic AI training, and benchmark performance.
Ji Zhi Liu