Docker Inc. today announced the launch of major new capabilities designed to make it dramatically easier to build, run and deploy agentic AI applications.
The company, which simplifies the building, testing and deployment of applications by packaging them in lightweight, portable software containers, said it's extending its Docker Compose tool to support AI agents and AI models so that developers can deploy them at large scale with ease. The company is also introducing Docker Offload, which lets developers run AI models in the cloud, and is collaborating with integration partners such as Google Cloud, Microsoft Azure and various AI software development kit providers.
"Agentic applications are rapidly evolving, but building production-grade agentic systems is still too hard," said Tushar Jain, executive vice president of engineering at Docker. "We're now making agentic apps accessible to every developer by making agent-based development as easy, secure and repeatable as container-based app development has always been."
Agentic AI is part of a new wave of AI software that uses large language models to power tools that act autonomously and achieve complex goals with minimal human oversight. Unlike traditional AI chatbots, which typically rely on direct interaction such as question and answer, AI agents can make decisions, plan actions and adapt to changing circumstances to complete step-by-step objectives through problem-solving.
Docker Compose has long been one of Docker's go-to tools for developers running applications that span multiple containers — standardized software packages that include everything needed to run an application, including code, runtime, system tools, libraries and configuration.
The company said it's extending Compose to address the challenges of the agentic era by letting developers define agentic architectures made up of the AI models and tools needed to take them into production. Developers can declare agents, models and tools in a single Compose file, then run agentic workloads locally or deploy them seamlessly to cloud services.
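In practice, such a file might look like the following sketch. This is illustrative only: the service and model names are hypothetical, and the exact schema for agentic workloads may differ from what Docker ships.

```yaml
# Hypothetical compose.yaml for an agentic app.
# Names, images and model references are illustrative assumptions,
# not Docker's confirmed syntax.
services:
  agent:
    build: .          # the agent's own application code
    models:
      - llm           # give this service access to the model below
    depends_on:
      - mcp-gateway   # tool discovery via the MCP Gateway

  mcp-gateway:
    image: docker/mcp-gateway   # assumed image name

models:
  llm:
    model: ai/gemma3  # example model reference
```

With a single file along these lines, `docker compose up` would bring up the agent, its model and its tooling together locally, and the same definition could be handed to a cloud service for deployment.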
Compose also allows developers to connect securely to Docker's Model Context Protocol Gateway, which facilitates communication with, and discovery of, other AI tools and data services. The protocol enables developers to integrate large language models and AI applications with data services without rewriting code or building complex application interfaces.
"Expanding Docker Compose to give developers the same familiar, easy experience for AI deployments as they have for traditional apps is exactly what we need," said Torsten Volk, principal analyst at Enterprise Strategy Group. "Plus, the new capability to run AI models directly in the cloud — without clogging up your laptop — is another major step forward. This can make a real difference in how quickly enterprises can start adopting AI at scale."
Docker introduces Offload
Agentic AI applications demand far more graphics processing unit power than standard AI model usage because of the complex tasks they complete. Local machines often lack the necessary capacity, leading to frustratingly slow results.
To address this challenge, Docker today unveiled Docker Offload in beta. The new service allows developers to offload AI and GPU-intensive workloads to the cloud whenever they need to.
According to Docker, the new service lets developers maintain local speed and reach for the cloud only when needed: Large models and multi-agent systems can be offloaded to high-performance cloud environments, and developers can choose where and when to offload workloads based on privacy, cost and performance needs.
The new Offload capability integrates directly into Docker Desktop, making it easy to access with ready-made configuration options.
Integration partners for this cloud availability include Google Cloud via serverless environments and, coming soon, Microsoft Azure. Compose integrations also support popular agentic AI frameworks, including CrewAI, Embabel, Google's Agent Development Kit, LangGraph, Spring AI and the Vercel AI SDK.
Image: SiliconANGLE/Microsoft Designer