fireworks/models/kimi-k2-instruct
Display Name: Kimi K2 Instruct
Kimi K2 is a state-of-the-art mixture-of-experts (MoE) language model with 32 billion activated parameters and 1 trillion total parameters. Trained with the Muon optimizer, Kimi K2 achieves exceptional performance across frontier knowledge, reasoning, and coding tasks while being meticulously optimized for agentic capabilities.
Specifications
Context: 128,000 tokens
Input: text
Output: text, JSON (see the example request below)
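The specifications above map directly onto a chat completion request. The following is a minimal sketch, assuming Fireworks' OpenAI-compatible endpoint at api.fireworks.ai/inference/v1 and the full model path accounts/fireworks/models/kimi-k2-instruct; the environment variable name and the prompt are illustrative, not taken from this page.

```python
# Minimal sketch: calling Kimi K2 Instruct through Fireworks' OpenAI-compatible
# chat completions API. Base URL, model path, and env var name are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["FIREWORKS_API_KEY"],           # hypothetical env var name
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/kimi-k2-instruct",  # assumed full model path
    messages=[
        {"role": "user", "content": "Summarize the Muon optimizer in one sentence."}
    ],
    max_tokens=256,
    # For the "json" output mode listed in the specs, an OpenAI-style
    # response_format={"type": "json_object"} is a plausible option; treat
    # that parameter as an assumption rather than confirmed behavior.
)
print(response.choices[0].message.content)
```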
Performance (7-day Average)
Tracked metrics: Uptime, TPS (tokens per second), RURT
Pricing
Input: $0.60 × 1.1 per 1M tokens
Output: $2.50 × 1.1 per 1M tokens
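Because pricing is quoted per million tokens, the cost of a request is a simple proportion of its token counts. The sketch below applies the rates as listed (including the ×1.1 factor); the token counts in the example are illustrative.

```python
# Sketch: estimate request cost from the per-million-token rates listed above.
# The 1.1 multiplier mirrors the listing; token counts below are illustrative.
INPUT_RATE_PER_M = 0.60 * 1.1   # USD per 1M input tokens
OUTPUT_RATE_PER_M = 2.50 * 1.1  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (
        input_tokens / 1_000_000 * INPUT_RATE_PER_M
        + output_tokens / 1_000_000 * OUTPUT_RATE_PER_M
    )

# Example: a 2,000-token prompt with an 800-token completion.
print(f"${estimate_cost(2_000, 800):.6f}")
```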