VentureBeat February 27, 2021
Dattaraj Rao, Persistent Systems

The most impressive thing about OpenAI’s natural language processing (NLP) model, GPT-3, is its sheer size. With 175 billion parameters, the weighted connections the model learns between tokens, the decoder-only transformer blows its 1.5-billion-parameter predecessor, GPT-2, out of the water. That scale allows the model to generate surprisingly human-like text after being fed only a few examples of the task you want it to do.
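In practice, "feeding the model a few examples" means packing demonstrations into the prompt itself and letting the model continue the pattern. A minimal sketch of how such a few-shot prompt is assembled (the sentiment task, labels, and layout here are invented for illustration; the article does not specify a task):

```python
# Few-shot prompting: the task is "taught" entirely through a handful
# of input/output examples embedded in the prompt text itself.
# The review/sentiment pairs below are hypothetical.

def build_few_shot_prompt(examples, query):
    """Concatenate example pairs, then the new query, leaving the
    final completion blank for the model to fill in."""
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model continues from here
    return "\n".join(lines)

examples = [
    ("I loved this movie.", "Positive"),
    ("A complete waste of time.", "Negative"),
]
prompt = build_few_shot_prompt(examples, "The plot was gripping.")
print(prompt)
```

With API access, a string like this would be sent to OpenAI's hosted completion endpoint; the model's continuation after the final "Sentiment:" is the answer, with no fine-tuning involved.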

Its release in 2020 dominated headlines, and people were scrambling to get on the waitlist to access its API hosted on OpenAI’s cloud service. Now, months later, as more users have gained access to the API (myself included), interesting applications and use cases have been popping up every day. For...
