VentureBeat February 27, 2021
Dattaraj Rao, Persistent Systems

The most impressive thing about OpenAI’s natural language processing (NLP) model, GPT-3, is its sheer size. With 175 billion parameters (the learned weights of its network), the decoder-only transformer model blows its 1.5-billion-parameter predecessor, GPT-2, out of the water. That scale lets the model generate surprisingly human-like text after being fed only a few examples of the task you want it to do.

Its release in 2020 dominated headlines, and people were scrambling to get on the waitlist to access its API hosted on OpenAI’s cloud service. Now, months later, as more users have gained access to the API (myself included), interesting applications and use cases have been popping up every day. For...
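
This few-shot behavior is driven entirely by the prompt: you show the model a handful of input/output pairs and it continues the pattern, with no retraining involved. Below is a minimal sketch of a few-shot sentiment classifier against the GPT-3 completions API, using the openai Python library of that era; the engine name, example prompt, and sampling parameters here are illustrative assumptions rather than a prescribed recipe.

# pip install openai
import openai

openai.api_key = "YOUR_API_KEY"  # issued once you clear the waitlist

# A few-shot prompt: two labeled examples, then the new input.
# GPT-3 infers the task (sentiment labeling) purely from the pattern.
prompt = """Tweet: I loved the new Batman movie!
Sentiment: Positive

Tweet: The service at this restaurant was dreadful.
Sentiment: Negative

Tweet: This phone's battery lasts all day.
Sentiment:"""

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine available in early access
    prompt=prompt,
    max_tokens=1,       # the label is a single word
    temperature=0.0,    # make classification output deterministic
    stop="\n",          # stop at the end of the label line
)

print(response.choices[0].text.strip())  # expected output: Positive

Swapping in different examples turns the same call into a translator, a summarizer, or a code generator, which is exactly why the waitlisted API set off such a wave of experimentation.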
