VentureBeat February 27, 2021
Dattaraj Rao, Persistent Systems

The most impressive thing about OpenAI’s natural language processing (NLP) model, GPT-3, is its sheer size. With 175 billion parameters — the learned weights of the network — the decoder-only transformer model blows its 1.5-billion-parameter predecessor, GPT-2, out of the water. This scale allows the model to generate surprisingly human-like text after being fed only a few examples of the task you want it to do.

Its release in 2020 dominated headlines, and people were scrambling to get on the waitlist to access its API hosted on OpenAI’s cloud service. Now, months later, as more users have gained access to the API (myself included), interesting applications and use cases have been popping up every day. For...
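That "few examples" workflow is the few-shot prompting pattern users of the API were experimenting with. As a rough sketch of the idea (not code from the article — the task, examples, and labels below are illustrative assumptions), a few-shot prompt is just a task description, a handful of worked examples, and the new query concatenated into one block of text that the model then completes:

```python
# Sketch of few-shot prompt assembly for a text-completion model like GPT-3.
# Only the prompt text is built here; no API call is made. The sentiment
# task and examples are hypothetical, chosen for illustration.

def build_few_shot_prompt(task, examples, query):
    """Concatenate a task description, worked examples, and a new query."""
    lines = [task, ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # The prompt ends mid-pattern so the model's completion supplies the label.
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Classify the sentiment of each movie review.",
    [("A delightful surprise from start to finish.", "Positive"),
     ("Two hours I will never get back.", "Negative")],
    "An instant classic.",
)
print(prompt)
```

The trailing `Sentiment:` is the point of the pattern: the model sees two completed example pairs and continues the sequence, filling in the label for the new review without any fine-tuning.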
