Forbes, November 15, 2024
By John Werner


If you’re paying attention to what people are saying online, you may have seen some of the more prominent items around AI advancement, like Eric Schmidt suggesting that neural network scaling laws are not experiencing diminishing returns (or, more accurately, that there’s ‘no evidence’ of this).

Or you may have seen this confusing post on X by researcher and mathematician Ethan Caballero:

Ethan Caballero on X: https://t.co/QAHC5BUQey

Recent research has led some in the industry to suggest that new models aren’t getting the same amount of juice out of their scaling, and that we may be experiencing a plateau in development.

Some of them cite the limited availability of high-quality training data.
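To make “diminishing returns” concrete: empirical scaling laws typically model a network’s loss as a power law in parameter count and training data. A simplified sketch in the style of the widely cited “Chinchilla” analysis (Hoffmann et al., 2022), with all symbols illustrative rather than drawn from this article:

$$L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}$$

Here $L$ is the model’s loss, $N$ the parameter count, $D$ the number of training tokens, $E$ an irreducible floor, and $A$, $B$, $\alpha$, $\beta$ fitted constants. The current debate is over whether frontier models are still tracking curves of this shape, or whether, with $D$ effectively capped by the supply of high-quality data, observed performance has stopped improving as the fit predicts.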

But...
