Forbes March 12, 2025
Thanks to innovations like DeepSeek, training AI has become cheaper. However, inference is becoming more demanding as we ask AI to think harder before answering our questions. Nvidia, Groq, and Cerebras Systems (clients of Cambrian-AI Research) have all released massive accelerators and infrastructure to support this trend. I suspect we will see more from Nvidia next week about inference than about training, including clouds, robots, and cars. Jensen Huang has said this reasoning style of inference processing is 100 times more computationally demanding. In a recent experiment, I found that reasoning can be as much as 200 times more expensive, but far more intelligent and more valuable!
Cerebras Takes Inference To a New Level
Cerebras Systems, the creator of wafer-scale, Frisbee-sized AI chips,...