Forbes February 1, 2025
In today’s column, I examine the sudden and dramatic surge of interest in an AI model architecture known as a mixture-of-experts (MoE). This approach to building generative AI and large language models (LLMs) has been around for quite a while, and its formulation is relatively well-known. I will lay out the particulars for you so that you’ll have a solid notion of what mixture-of-experts is all about.
MoE suddenly garnered its moment in the spotlight because of the release of DeepSeek’s R1 model, which uses MoE extensively.
In case you haven’t been plugged into the latest AI revelations, DeepSeek is a China-based AI company that has made available a ChatGPT-like LLM. They also made a bold...