Forbes November 15, 2024
If you’ve been paying attention to the conversation online, you may have seen some of the more prominent claims about AI progress, like Eric Schmidt suggesting that neural network scaling laws are not hitting diminishing returns (or, more precisely, that there is “no evidence” they are).
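To make the idea concrete, here is a minimal sketch of the kind of power-law curve that “scaling laws” describe; the constants here are purely illustrative and are not taken from Schmidt’s remarks or from any published model.

```python
# A toy illustration of a scaling-law curve, using made-up constants
# (a, alpha, irreducible). Loss is modeled as a power law in compute:
# each 10x increase in compute shaves off a smaller absolute amount of
# loss, which is what "diminishing returns" refers to -- even though
# the power-law relationship itself keeps holding.

def predicted_loss(compute: float, a: float = 10.0, alpha: float = 0.05,
                   irreducible: float = 1.7) -> float:
    """Toy power-law curve: loss = irreducible + a * compute^(-alpha)."""
    return irreducible + a * compute ** -alpha

if __name__ == "__main__":
    for exponent in range(20, 27):  # compute from 1e20 to 1e26 FLOPs
        c = 10.0 ** exponent
        print(f"compute 1e{exponent}: predicted loss {predicted_loss(c):.3f}")
```

Run it and the gap between successive lines shrinks with each tenfold jump in compute: the curve keeps improving, just by less and less each time. That shrinking margin is the crux of the debate below.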
Or you may have seen this confusing post on X by researcher and mathematician Ethan Caballero:
https://t.co/QAHC5BUQey
Recent research has led some in the industry to suggest that new models aren’t getting the same returns from scaling, and that development may be hitting a plateau.
Some point to the dwindling supply of high-quality training data.
But...