Optimizing AI Application Development with Docker Desktop and NVIDIA AI Workbench


Are you looking to streamline how you incorporate LLMs into your applications? Would you prefer to do this using the products and services you’re already familiar with? This is where Docker Desktop comes into play, especially when paired with the advanced capabilities of the Docker Business subscription tier and NVIDIA’s cutting-edge technology.

Imagine a development environment where setting up and managing AI workloads is as intuitive as the everyday tools you’re already using. With our deepening partnership with NVIDIA, we are committed to making this a reality. This collaboration not only enhances your ability to leverage Docker containers but also significantly improves your overall experience of building and developing AI applications.

What’s more, this partnership is designed to support your long-term growth and innovation goals. Docker Desktop with Docker Business, combined with NVIDIA software, provides the perfect launchpad for developers who want to accelerate their AI development journey — whether it’s building prototypes or deploying enterprise-grade AI applications. This isn’t just about providing tools; it’s about investing in your abilities, your career, and the innovation capabilities of your organization.

With Docker Business, you gain access to advanced capabilities that enhance security, streamline management, and offer unparalleled support. Meanwhile, NVIDIA AI Workbench provides a robust, containerized environment tailored for AI and machine learning projects. Together, these solutions empower you to push the boundaries of what’s possible, bringing AI into your applications more effortlessly and effectively.

What is NVIDIA AI Workbench?

NVIDIA AI Workbench is a free developer toolkit powered by containers that enables data scientists and developers to create, collaborate, and migrate AI workloads and development environments across GPU systems. It targets scenarios like model fine-tuning, data science workflows, retrieval-augmented generation, and more. Users can install it on multiple systems but drive everything from a client application that runs locally on Windows, Ubuntu, and macOS. NVIDIA AI Workbench helps enable collaboration and distribution through Git-based platforms, like GitHub and GitLab. 

How does Docker Desktop relate to NVIDIA AI Workbench?

NVIDIA AI Workbench requires a container runtime. Docker’s container runtime (Docker Engine), delivered through Docker Desktop, is the recommended AI Workbench runtime for developers using AI Workbench on Windows and macOS. Previously, AI Workbench users had to install Docker Desktop manually. With this newest release of AI Workbench, developers who select Docker as their container runtime will have Docker Desktop installed on their machine automatically, with no manual steps required.

 You can learn about this integration in NVIDIA’s technical blog.

Moving beyond the AI application prototype

Docker Desktop is more than just a tool for application development; it’s a launchpad that provides an integrated, easy-to-use environment for developing a wide range of applications, including AI. What makes Docker Desktop particularly powerful is its ability to seamlessly create and manage containerized environments, ensuring that developers can focus on innovation without worrying about the underlying infrastructure.

For developers who have already invested in Docker, this means that the skills, automation, infrastructure, and tooling they’ve built up over the years for other workloads are directly applicable to AI workloads as well. This cross-compatibility offers a huge return on investment, as it allows teams to extend their existing Docker-based workflows to include AI applications and services without needing to overhaul their processes or learn new tools.

Docker Desktop’s compatibility with Windows, macOS, and Linux makes it an ideal choice for diverse development teams. Its robust features support a wide range of development workflows, from initial prototyping to large-scale deployment, ensuring that as AI applications move from concept to production, developers can leverage their existing Docker infrastructure and expertise to accelerate and scale their work.

For those looking to create high-quality, enterprise-grade AI applications, Docker Desktop with Docker Business offers advanced capabilities. These include enhanced security, management, and support features that are crucial for enterprise and advanced development environments. With Docker Business, development teams can build securely, collaborate efficiently, and maintain compliance, all while continuing to utilize their existing Docker ecosystem. By leveraging Docker Business, developers can confidently accelerate their workflows and deliver innovative AI solutions with the same reliability and efficiency they’ve come to expect from Docker.

Accelerating developer innovation with NVIDIA GPUs

In the rapidly evolving landscape of AI development, the ability to leverage GPU capabilities is crucial for handling the intensive computations required for tasks like model training and inference. Docker is working to offer flexible solutions to cater to different developers, whether you have your own GPUs or need to leverage cloud-based compute. 

Running containers with NVIDIA GPUs through Docker Desktop 

GPUs are at the heart of AI development, and Docker Desktop is optimized to leverage NVIDIA GPUs effectively. With Docker Desktop 4.29 or later, developers can enable CDI support in the Docker daemon and make all NVIDIA GPUs available in a running container by passing them to the --device option as CDI devices.
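
The CDI feature is switched on through the daemon configuration. A minimal sketch, assuming the feature flag exposed by recent Docker Engine releases (editable in Docker Desktop under Settings > Docker Engine), looks like this:

{
  "features": {
    "cdi": true
  }
}

Once the daemon restarts with this setting, CDI device names such as nvidia.com/gpu=all become valid arguments to the --device flag.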

For instance, the following command makes all NVIDIA GPUs available in a container and runs nvidia-smi to list them (using the standard ubuntu image as an illustration):

docker run --device nvidia.com/gpu=all ubuntu nvidia-smi

For more information on how Docker Desktop supports NVIDIA GPUs, refer to our GPU documentation.

No GPUs? No problem with Testcontainers Cloud

Not all developers have local access to powerful GPU hardware. To bridge this gap, we’re exploring GPU support in Testcontainers Cloud. This will allow developers to access GPU resources in a cloud environment, enabling them to run their tests and validate AI models without needing physical GPUs. With Testcontainers Cloud, you will be able to harness the power of GPUs from anywhere, democratizing high-performance AI development.

Trusted AI/ML content on Docker Hub

Docker Desktop provides a reliable and efficient platform for developers to discover and experiment with new ideas and approaches in AI development. Through its trusted content program, Docker works with open source and commercial communities to select and curate high-quality images, distributing them on Docker Hub as Docker Official Images, Docker-Sponsored Open Source, and Docker Verified Publisher content. With a wealth of AI/ML content, including NVIDIA software offerings and many more, Docker makes it easy for users to discover and pull the images they need for experimentation, allowing developers to get started quickly and efficiently.
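
As an illustration, pulling curated AI/ML content from Docker Hub is a single command per image (the images below are examples of publicly available AI/ML repositories, not an exhaustive or endorsed list):

docker pull ollama/ollama
docker pull pytorch/pytorch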

Accelerated builds with Docker Build Cloud

Docker Build Cloud is a fully managed service designed to streamline and accelerate the building, testing, and deployment of any application. By leveraging Docker Build Cloud, AI application developers can shift builds from local machines to remote BuildKit instances — resulting in up to 39x faster builds. By offloading the complex build process to Docker Build Cloud, AI development teams can focus on refining their models and algorithms while Docker handles the rest.
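
As a sketch of what that shift looks like in practice (the organization and builder names below are placeholders), a team creates a cloud builder once and then points existing builds at it:

docker buildx create --driver cloud <ORG>/<BUILDER_NAME>
docker buildx build --builder cloud-<ORG>-<BUILDER_NAME> --tag my-ai-app:latest .

Because the build runs on a remote BuildKit instance with a shared cache, repeat builds of the same AI application image can reuse layers produced by teammates and CI.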

Docker Business users get Docker Build Cloud minutes included as part of their subscription, so they can experience faster, more efficient builds and reproducible AI deployments.

Ensuring quality with Testcontainers

As AI applications evolve from prototypes to production-ready solutions, ensuring their reliability and performance becomes critical. This is where testing frameworks like Testcontainers come into play. Testcontainers allows developers to test their applications against real containerized dependencies, making it easier to validate application logic that uses AI models in self-contained, idempotent, and reproducible ways.

For instance, developers working with LLMs can write Testcontainers-based tests that exercise their application against any model available on Hugging Face, using the recently released Ollama container. A sketch of such a test appears below.
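
Here is a minimal sketch in Java using the Testcontainers Ollama module (the image tag, model name, and JUnit 5 setup are illustrative assumptions; a real test would exercise your own application code against the container’s endpoint):

import org.junit.jupiter.api.Test;
import org.testcontainers.ollama.OllamaContainer;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import static org.junit.jupiter.api.Assertions.assertEquals;

class OllamaModelTest {

    @Test
    void ollamaServesAModel() throws Exception {
        // Start a disposable Ollama server in a container (image tag is illustrative).
        try (OllamaContainer ollama = new OllamaContainer("ollama/ollama:0.1.26")) {
            ollama.start();

            // Pull a small example model into the running container; models converted
            // from Hugging Face to the Ollama format could be pulled the same way.
            ollama.execInContainer("ollama", "pull", "all-minilm");

            // Query the Ollama HTTP API exposed by the container and check it responds.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(ollama.getEndpoint() + "/api/tags"))
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            assertEquals(200, response.statusCode());
        }
    }
}

Because the test starts from a clean container on every run, it stays self-contained and reproducible, and the same pattern extends to validating your application’s own endpoints against the model.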

Wrap up

The collaboration between Docker and NVIDIA marks a significant step forward in the AI development landscape. By integrating Docker Desktop into NVIDIA AI Workbench, we are making it easier than ever for developers to build, ship, and run AI applications. Docker Desktop provides a robust, streamlined environment that supports a wide range of development workflows, from initial prototyping to large-scale deployment. 

With advanced capabilities from Docker Business, AI developers can focus on innovation and efficiency. As we deepen our partnership with NVIDIA, we look forward to bringing even more enhancements to the AI development community, empowering developers to push the boundaries of what’s possible in AI and machine learning. 

Stay tuned for more exciting updates as we work to revolutionize AI application development.
