VentureBeat February 7, 2025
Michael Nuñez

Hugging Face and Physical Intelligence have quietly launched Pi0 (Pi-Zero) this week, the first foundation model for robots that translates natural language commands directly into physical actions.

“Pi0 is the most advanced vision language action model,” Remi Cadene, a principal research scientist at Hugging Face, announced in an X post that quickly gained attention across the AI community. “It takes natural language commands as input and directly outputs autonomous behavior.”

This release marks a pivotal moment in robotics: the first time a foundation model for robots has been made widely available through an open-source platform. Much like ChatGPT revolutionized text generation, Pi0 aims to transform how robots learn and execute tasks.
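Pi0 is distributed through Hugging Face's open-source LeRobot library. The following is a minimal sketch of how a developer might load the checkpoint and turn a language instruction plus a camera observation into a robot action; the module path, checkpoint id, batch keys, and tensor shapes shown here are assumptions for illustration, not a confirmed API.

```python
# Minimal sketch: querying Pi0 through Hugging Face's LeRobot library.
# The module path, checkpoint id, batch keys, and shapes are assumptions;
# consult the LeRobot documentation for the exact interface.
import torch

from lerobot.common.policies.pi0.modeling_pi0 import PI0Policy  # assumed import path

# Load the pretrained vision-language-action policy (assumed checkpoint id).
policy = PI0Policy.from_pretrained("lerobot/pi0")
policy.eval()

# One observation: a camera frame, the robot's proprioceptive state,
# and a natural language instruction describing the task.
observation = {
    "observation.images.top": torch.zeros(1, 3, 224, 224),  # placeholder camera image
    "observation.state": torch.zeros(1, 14),                 # placeholder joint state
    "task": ["pick up the red block and place it in the bin"],
}

# The policy maps the instruction and observation directly to a low-level action.
with torch.no_grad():
    action = policy.select_action(observation)

print(action.shape)  # action vector to send to the robot controller
```

In this style of interface, the language instruction travels through the same batch as the camera frames, which is what lets a single checkpoint handle many tasks without task-specific heads.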
