Forbes March 12, 2025
Karl Freund

Thanks to innovations like DeepSeek, training AI has become cheaper. However, inference is becoming more demanding as we ask AI to think harder before answering our questions. Nvidia, Groq, and Cerebras Systems (clients of Cambrian-AI Research) have all released massive accelerators and infrastructure to support this trend. I suspect we will hear more from Nvidia next week about inference than about training, spanning clouds, robots, and cars. Jensen Huang has said that this reasoning style of inference processing is 100 times more computationally demanding. In a recent experiment, I found that reasoning can be as much as 200 times more expensive, but also far more intelligent and valuable!

Cerebras Takes Inference To a New Level

Cerebras Systems, the creator of wafer-scale, Frisbee-sized AI chips,...
