VentureBeat February 27, 2021
Dattaraj Rao, Persistent Systems

The most impressive thing about OpenAI's natural language processing (NLP) model, GPT-3, is its sheer size. With more than 175 billion parameters (the learned weights of its transformer network), the model blows its 1.5-billion-parameter predecessor, GPT-2, out of the water. This scale allows GPT-3 to generate surprisingly human-like text after being shown only a few examples of the task you want it to do.
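That "few examples" workflow, commonly called few-shot prompting, boils down to packing labeled examples and an unlabeled query into a single prompt. Here is a minimal sketch; the translation task, example pairs, and engine name are illustrative assumptions, not details from the article:

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labeled examples followed by an unlabeled query.

    GPT-3 infers the task (here, English-to-French translation)
    from the pattern in the examples, with no fine-tuning.
    """
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

# Hypothetical examples for illustration
examples = [
    ("cheese", "fromage"),
    ("good morning", "bonjour"),
]
prompt = build_few_shot_prompt(examples, "thank you")
print(prompt)

# With API access (as hosted by OpenAI circa 2021), the prompt would
# then be submitted for completion, e.g.:
#   import openai
#   openai.Completion.create(engine="davinci", prompt=prompt, max_tokens=16)
```

The model completes the final line in the style of the preceding examples, which is what makes the prompt itself the "programming interface."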

Its release in 2020 dominated headlines, and people were scrambling to get on the waitlist to access its API hosted on OpenAI’s cloud service. Now, months later, as more users have gained access to the API (myself included), interesting applications and use cases have been popping up every day. For...
