Forbes March 5, 2025
Imagine an AI that doesn’t just guess an answer but walks through each solution, like a veteran scientist outlining every decision point. That’s R1, an open-source language model from DeepSeek. R1 has 671 billion parameters in total but “activates” only about 37 billion at once, thanks to a Mixture-of-Experts (MoE) architecture. Coupled with mixed-precision floating point operations, R1 provides deep, step-by-step logic at a fraction of the usual cost. For business leaders, R1’s biggest draw is how it combines transparent reasoning, competitive scalability and the freedom of open source, all while delivering advanced analytical insights.
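To make the "activates only 37 billion of 671 billion parameters" idea concrete, here is a toy sketch of top-k expert routing, the core trick behind Mixture-of-Experts. Everything here (the expert names, scores, and parameter counts) is invented for illustration; DeepSeek's actual router is far more sophisticated, but the principle is the same: a gating function scores the experts for each token and only the few highest-scoring ones run.

```python
# Toy illustration of Mixture-of-Experts (MoE) routing -- not DeepSeek's code.
# A router scores every expert for a given input; only the top-k experts
# actually execute, so just a fraction of total parameters is "active."

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Expert:
    name: str
    params: int                    # hypothetical parameter count for this expert
    fn: Callable[[float], float]   # the expert's (toy) computation

def moe_forward(x: float, experts: List[Expert],
                scores: List[float], k: int = 2) -> Tuple[float, int]:
    """Run only the k highest-scoring experts and blend their outputs."""
    ranked = sorted(range(len(experts)), key=lambda i: scores[i], reverse=True)[:k]
    total = sum(scores[i] for i in ranked)
    weights = [scores[i] / total for i in ranked]          # normalize over top-k
    output = sum(w * experts[i].fn(x) for w, i in zip(weights, ranked))
    active_params = sum(experts[i].params for i in ranked) # what actually ran
    return output, active_params

experts = [
    Expert("A", 10, lambda v: v + 1.0),
    Expert("B", 10, lambda v: v * 2.0),
    Expert("C", 10, lambda v: v - 1.0),
    Expert("D", 10, lambda v: v / 2.0),
]
scores = [0.1, 0.6, 0.05, 0.25]  # hypothetical router scores for one token

out, active = moe_forward(3.0, experts, scores, k=2)
print(f"activated {active} of {sum(e.params for e in experts)} total parameters")
```

Only 20 of the 40 toy parameters run for this input; scale the same ratio up and you get R1's roughly 37B-of-671B activation pattern, which is where the cost savings come from.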
Based on my 6.5 years of experience running the AI team at a unicorn AI startup with over 1 million users, I'd like to...