Moonshot's Kimi K2 uses a 1T-parameter MoE architecture with 32B active parameters and outperforms models like GPT-4.1 and DeepSeek-V3 on key benchmarks (Michael Nuñez/VentureBeat)
By Eresh • July 12, 2025 • Techmeme (http://www.techmeme.com/)
Michael Nuñez / VentureBeat:
Moonshot's Kimi K2 uses a 1T-parameter MoE architecture with 32B active parameters and outperforms models like GPT-4.1 and DeepSeek-V3 on key benchmarks — Moonshot AI, the Chinese artificial intelligence startup behind the popular Kimi chatbot, released an open-source language model on Friday …
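The headline's "1T total, 32B active" split reflects how sparse Mixture-of-Experts models work: all expert weights exist in the model, but a router selects only a few experts per token, so only a small fraction of parameters participate in any one forward pass. The sketch below is a minimal illustration of that idea with toy sizes and an assumed top-2 routing scheme; it is not Kimi K2's actual configuration or code.

```python
import numpy as np

# Minimal sketch of sparse Mixture-of-Experts (MoE) routing. Toy sizes and
# the top-k choice are illustrative assumptions, not Kimi K2's real setup.

rng = np.random.default_rng(0)

d_model = 64     # hidden size (toy)
d_ff = 256       # expert feed-forward width (toy)
n_experts = 32   # total experts in the layer (toy)
top_k = 2        # experts activated per token (assumption)

# Each expert is a small two-layer MLP: d_model -> d_ff -> d_model.
experts_w1 = rng.standard_normal((n_experts, d_model, d_ff)) * 0.02
experts_w2 = rng.standard_normal((n_experts, d_ff, d_model)) * 0.02
router_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                              # (tokens, n_experts)
    top_idx = np.argsort(-logits, axis=-1)[:, :top_k]  # chosen experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        gate_logits = logits[t, top_idx[t]]
        gates = np.exp(gate_logits - gate_logits.max())
        gates /= gates.sum()                           # softmax over selected experts
        for g, e in zip(gates, top_idx[t]):
            h = np.maximum(x[t] @ experts_w1[e], 0)    # ReLU MLP expert
            out[t] += g * (h @ experts_w2[e])
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_layer(tokens)

per_expert = experts_w1[0].size + experts_w2[0].size
total_params = n_experts * per_expert + router_w.size
active_params = top_k * per_expert + router_w.size     # used per token
print(f"output shape: {y.shape}")
print(f"total expert params: {total_params:,}")
print(f"active per token:    {active_params:,} ({active_params / total_params:.1%} of total)")
```

Scaled up, the same principle is why a trillion-parameter MoE can run inference with roughly the cost of a 32B dense model: the ratio of active to total parameters (about 3% here, similar in spirit to 32B/1T) governs per-token compute, while total parameters govern memory and capacity.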
from Techmeme https://ift.tt/TdCBvy2