VentureBeat October 5, 2023
Sean Michael Kerner

Underneath just about every generative AI application for training or inference today, you’ll likely find Docker containers as the primary approach to deployment.

Today at the DockerCon conference in Los Angeles, Docker Inc., the eponymous company behind the open source Docker container technology, is taking a dive into the deep end of AI with a series of initiatives designed to help developers more rapidly build generative AI applications.

Among the efforts is the launch of a new GenAI stack that integrates Docker with the Neo4j graph database, LangChain model chaining technology and Ollama for running large language models (LLMs). The new Docker AI product is also debuting at DockerCon, as an integrated way for developers to get AI-powered insights...
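Docker has not detailed the stack's exact template here, but a minimal sketch of how these pieces are typically wired together in Python, assuming the langchain package's Ollama and Neo4jGraph integrations and locally running Ollama and Neo4j services (the connection details and model name below are hypothetical placeholders), might look like this:

```python
# Hedged sketch: not Docker's official GenAI stack template.
# Assumes `pip install langchain neo4j` plus local Ollama and Neo4j services;
# URLs, credentials and the model name are placeholder assumptions.
from langchain.llms import Ollama
from langchain.graphs import Neo4jGraph
from langchain.prompts import PromptTemplate

# Ollama serves a local LLM over HTTP (default port 11434).
llm = Ollama(base_url="http://localhost:11434", model="llama2")

# Neo4j holds the application's knowledge graph.
graph = Neo4jGraph(
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",  # placeholder credential
)

# Pull some graph context and chain it into a prompt for the local model.
context = graph.query("MATCH (n) RETURN n LIMIT 5")
prompt = PromptTemplate.from_template(
    "Using this graph data: {context}\nAnswer the question: {question}"
)
answer = llm(prompt.format(context=context, question="What is in the graph?"))
print(answer)
```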
