Meituan launched its first open-source large language model, LongCat-Flash-Chat, on September 2 via Hugging Face, GitHub, and its website. The model uses a mixture-of-experts (MoE) architecture with 560 billion total parameters, only a fraction of which are activated per token, and reportedly benchmarks on par with Alibaba's Qwen, Anthropic's Claude, and Google's Gemini.