🤖 Cogito v2 from @DeepCogito just hit Together AI and these open-source reasoning models are approaching frontier performance 🤯 Four models spanning 70B dense, 109B MoE, 405B dense, and 671B MoE - designed to think smarter instead of just thinking longer.
💡 The breakthrough: self-improvement via distillation. Instead of searching longer at inference time, these models learn efficient reasoning patterns and distill them back into their parameters. They develop actual "intuition" for correct reasoning trajectories. Trained for <$3.5M total. Open frontier AI is real 🤖
🛠️ Deploy on Together AI with zero ops: ⚡Frontier 671B MoE via simple API ⚡Serverless infrastructure handles complexity ⚡You focus on building, we handle scaling
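A minimal sketch of what "simple API" means here: Together AI exposes an OpenAI-compatible chat completions endpoint, so a request is just JSON over HTTPS. The model slug below is an assumption - check Together AI's model catalog for the exact Cogito v2 identifier.

```python
# Sketch: build a chat completion request for the 671B MoE model on
# Together AI's OpenAI-compatible endpoint. The MODEL slug is an assumed
# name, not verified against the live catalog.
import json
import urllib.request

API_URL = "https://api.together.xyz/v1/chat/completions"
MODEL = "deepcogito/cogito-v2-preview-deepseek-671b"  # assumed slug

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) the HTTP request for one chat completion."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Prove that sqrt(2) is irrational.", "YOUR_API_KEY")
# urllib.request.urlopen(req) would send it; omitted here to stay offline.
```

Swapping the model string is all it takes to move between the 70B, 109B, 405B, and 671B variants - the serverless backend handles the rest.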