Meet Gordon: An AI Agent for Docker

This ongoing Docker Labs GenAI series explores the exciting space of AI developer tools. At Docker, we believe there is a vast scope to explore, openly and without the hype. We will share our explorations and collaborate with the developer community in real time. Although developers have adopted autocomplete tooling like GitHub Copilot and use chat, there is significant potential for AI tools to assist with more specific tasks and interfaces throughout the entire software lifecycle. Therefore, our exploration will be broad. We will be releasing software as open source so you can play, explore, and hack with us, too.

In previous articles, we focused on how AI-based tools can help developers streamline tasks and offered ideas for enabling agentic workflows, like reviewing branches and understanding code changes.

In this article, we’ll explore our experiments around the idea of creating a Docker AI Agent — something that could both help new users learn about our tools and products and help power users get things done faster.

During our explorations around this Docker Agent and AI-based tools, we noticed that the main pain points we encountered were often the same:

LLMs need good context to provide good answers (garbage in -> garbage out).

Using AI tools often requires context switching (moving to another app, to a different website, etc.).

We’d like agents to be able to suggest and perform actions on behalf of the users.

Direct product integrations with AI are often more satisfying to use than chat interfaces.

At first, we tried to see what’s possible using off-the-shelf services like ChatGPT or Claude. 

By using test prompts such as “optimize the following Dockerfile, following all best practices” and providing the model with a subpar but common Dockerfile, we could sometimes get decent answers. Often, though, the resulting Dockerfile had subtle bugs or hallucinations, or simply wasn’t optimized and didn’t follow many of the best practices we’d hoped for. Thus, this approach was not reliable enough.

Data ended up being the main issue. LLM training data is always somewhat outdated, and the bad Dockerfiles you can find online vastly outnumber the up-to-date Dockerfiles that follow best practices.

After doing proof-of-concept tests using a RAG approach, including some documents with lots of useful advice for creating good Dockerfiles, we realized that the AI Agent idea was definitely possible. However, setting up all the things required for a good RAG would’ve taken too much bandwidth from our small team.

Because of this, we opted to use kapa.ai for that specific part of our agent. Docker already uses kapa.ai to provide the AI docs assistant on Docker docs, so most of our high-quality documentation is already available for us to reference through their service. Using kapa.ai allowed us to experiment more, get high-quality results faster, and try different ideas around the AI agent concept.

Enter Gordon

Out of this experimentation came a new product that you can try: Gordon. With Gordon, we’d like to tackle these pain points. By integrating Gordon into Docker Desktop and the Docker CLI (Figure 1), we can:

Access much more context that can be used by the LLMs to best understand the user’s questions and provide better answers or even perform actions on the user’s behalf.

Be where the users are. If you launch a container via Docker Desktop and it fails, you can quickly debug with Gordon. If you’re in the terminal hacking away, Docker AI will be there, too.

Avoid being a purely chat-based agent by providing Gordon-based features directly as part of Docker Desktop UI elements. If Gordon detects certain scenarios, like a container that failed to start, a button will appear in the UI to directly get suggestions, or run actions, etc. (Figure 2).

Figure 1: Gordon icon on Docker Desktop.

Figure 2: Ask Gordon (beta).

What Gordon can do

We want to start with Gordon by optimizing for Docker-related tasks — not general-purpose questions — but we are not excluding expanding the scope to more development-related tasks as work on the agent continues.

Work on Gordon is at an early stage and its capabilities are constantly evolving, but it’s already really good at some things (Figure 3). Here are things to definitely try out:

Ask general Docker-related questions. Gordon knows Docker well and has access to all of our documentation.

Get help debugging container build or runtime errors.

Remediate policy deviations from Docker Scout.

Get help optimizing Docker-related files and configurations.

Ask it how to run specific containers (e.g., “How can I run MongoDB?”).

Figure 3: Using Gordon to understand a Dockerfile.

How Gordon works

The Gordon backend lives on Docker servers, while the client is a CLI that lives on the user’s machine and is bundled with Docker Desktop. Docker Desktop uses the CLI to access the local machine’s files, asking the user for the directory each time it needs that context to answer a question. When using the CLI directly, it has access to the working directory it’s executed in. For example, if you are in a directory with a Dockerfile and you run “Docker AI, rate my Dockerfile”, it will find the one present in that directory.
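As an illustrative sketch (the exact command name and syntax may differ in the beta; the docker ai subcommand shown here is an assumption), a CLI session might look like this:

# Run from the directory that contains the Dockerfile you want reviewed
cd my-project
docker ai "rate my Dockerfile"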

Currently, Gordon does not have write access to any files, so it will not edit any of your files. We’re hard at work on future features that will allow the agent to do the work for you, instead of only suggesting solutions. 

Figure 4 shows a rough overview of how we are thinking about things behind the scenes.

Figure 4: Overview of Gordon.

The first step of this pipeline, “Understand the user’s input and figure out which action to perform”, is done using “tool calling” (also known as “function calling”) with the OpenAI API. 

Although this is a popular approach, we noticed that the documentation online isn’t very good, and general best practices aren’t well defined yet. This led us to experiment a lot with the feature and try to figure out what works for us and what doesn’t.
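For illustration, here is a minimal sketch of how a tool definition and call might look with the OpenAI Python client. The tool name, description, and parameters below are hypothetical examples, not Gordon’s actual implementation:

from openai import OpenAI

client = OpenAI()

# A hypothetical tool with an in-depth description and an example,
# which in our experience helps the model pick the right tool.
tools = [{
    "type": "function",
    "function": {
        "name": "rate_dockerfile",
        "description": (
            "Analyze a Dockerfile and report deviations from best practices. "
            "Example: for 'FROM python:3.13' with unpinned dependencies, "
            "suggest a slim base image and a multi-stage build."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "dockerfile": {"type": "string", "description": "Dockerfile contents"}
            },
            "required": ["dockerfile"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Rate my Dockerfile"}],
    tools=tools,
    tool_choice="auto",
)

# If the model decided a tool should run, the call (name + arguments) is here.
print(response.choices[0].message.tool_calls)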

Things we noticed:

Tool descriptions are important, and we should prefer more in-depth descriptions with examples.

Testing around tool-detection code is also important. Adding new tools to a request could confuse the LLM and cause it to no longer trigger the expected tool.

The LLM model used influences how the whole tool calling functionality should be implemented, as different models might prefer descriptions written in a certain way, behave better/worse under certain scenarios (e.g. when using lots of tools), etc.

Try Gordon for yourself

Gordon is available as an opt-in Beta feature starting with Docker Desktop version 4.37. To participate in the closed beta, all you need to do is fill out the form on the site.

Initially, Gordon will be available for use both in Docker Desktop and the Docker CLI, but our idea is to surface parts of this tech in various other parts of our products as well.

For more on what we’re doing at Docker, subscribe to our newsletter.

Learn more

Subscribe to the Docker Newsletter. 

Learn about accelerating AI development with the Docker AI Catalog.

Read the Docker Labs GenAI series.

Get the latest release of Docker Desktop.

Have questions? The Docker community is here to help.

New to Docker? Get started.


Unlocking Efficiency with Docker for AI and Cloud-Native Development

The need for secure and high-quality software becomes more critical every day as the impact of vulnerabilities increases and related costs continue to rise. For example, flawed software cost the U.S. economy $2.08 trillion in 2020 alone, according to the Consortium for Information and Software Quality (CISQ). And a software defect that might cost $100 to fix if found early in the development process can grow exponentially to $10,000 if discovered later in production.

Docker helps you deliver secure, efficient applications by providing consistent environments and fast, reliable container management, building on best practices that let you discover and resolve issues earlier in the software development life cycle (SDLC).

Shifting left to ensure fewer defects

In a previous blog post, we talked about using the right tools, including Docker’s suite of products to boost developer productivity. Besides having the right tools, you also need to implement the right processes to optimize your software development and improve team productivity. 

The software development process is typically broken into two distinct loops, the inner and the outer loops. At Docker, we believe that investing in the inner loop is crucial. This means shifting security left and identifying problems as soon as you can. This approach improves efficiency and reduces costs by helping teams find and fix software issues earlier.

Using Docker tools to adopt best practices

Docker’s products help you adopt these best practices — we are focused on enhancing the software development lifecycle, especially around refining the inner loop. Products like Docker Desktop allow your dev team in the inner loop to run, test, code, and build everything fast and consistently. This consistency eliminates the “it works on my machine” issue, meaning applications behave the same in both development and production.  

Shifting left lets your dev team identify problems earlier in your software project lifecycle. When you detect issues sooner, you increase efficiency and help ensure secure builds and compliance. By shifting security left with Docker Scout, your dev teams can identify vulnerabilities sooner and help avoid issues down the road. 

Another example of shifting left involves testing — doing testing earlier in the process leads to more robust software and faster release cycles. This is when Testcontainers Cloud comes in handy because it enables developers to run reliable integration tests, with real dependencies defined in code. 
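As a brief sketch of what “real dependencies defined in code” can look like (using the open source Testcontainers library for Python; Testcontainers Cloud can run the same tests against cloud-hosted containers, and the Postgres version here is arbitrary):

import sqlalchemy
from testcontainers.postgres import PostgresContainer

# Start a real PostgreSQL instance in a container for the duration of the test.
with PostgresContainer("postgres:17") as postgres:
    engine = sqlalchemy.create_engine(postgres.get_connection_url())
    with engine.connect() as conn:
        # The dependency is real, not mocked, so the query runs against Postgres.
        assert conn.execute(sqlalchemy.text("SELECT 1")).scalar() == 1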

Accelerate development within the hybrid inner loop

We see more and more companies adopting the so-called hybrid inner loop, which combines the best of two worlds — local and cloud. The results provide greater flexibility for your dev teams and encourage better collaboration. For example, Docker Build Cloud uses the power of the cloud to speed up build time without sacrificing the local development experience that developers love. 

By using these Docker products across the software development life cycle, teams get quick feedback loops and faster issue resolution, ensuring a smooth development flow from inception to deployment. 

Simplifying AI application development

When you’re using the right tools and processes to accelerate your application delivery and maximize efficiency throughout your SDLC, processes that were once cumbersome become your new baseline, freeing up time for true innovation. 

Docker also helps accelerate innovation by simplifying AI/ML development. We are continually investing in AI to help your developers deliver AI-backed applications that differentiate your business and enhance competitiveness.

Docker AI tools

Docker’s GenAI Stack accelerates the incorporation of large language models (LLMs) and AI/ML into your code, enabling the delivery of AI-backed applications. All containers work harmoniously and are managed directly from Docker Desktop, allowing your team to monitor and adjust components without leaving their development environment. Deploying the GenAI Stack is quick and easy, and leveraging Docker’s containerization technology helps speed setup and simplify scaling as applications grow.

Earlier this year, we announced the preview of Docker Extension for GitHub Copilot. By standardizing best practices and enabling integrations with tools like GitHub Copilot, Docker empowers developers to focus on innovation, closing the gap from the first line of code to production.

And, more recently, we launched the Docker AI Catalog in Docker Hub. This new feature simplifies the process of integrating AI into applications by providing trusted and ready-to-use content supported by comprehensive documentation. Your dev team will benefit from shorter development cycles, improved productivity, and a more streamlined path to integrating AI into both new and existing applications.

Wrapping up

Docker products help you establish sound processes and practices related to shifting left and discovering issues earlier to avoid headaches down the road. This approach ultimately unlocks developer productivity, giving your dev team more time to code and innovate. Docker also allows you to quickly use AI to close knowledge gaps and offers trusted tools to build AI/ML applications and accelerate time to market. 

To see how Docker continues to empower developers with the latest innovations and tools, check out our Docker 2024 Highlights.

Learn about Docker’s updated subscriptions and find the ideal plan for your team’s needs.

Learn more

Subscribe to the Docker Navigator Newsletter. 

Get the latest release of Docker Desktop.

Have questions? The Docker community is here to help.

New to Docker? Get started.


How to Dockerize a Django App: Step-by-Step Guide for Beginners

One of the best ways to make sure your web apps work well in different environments is to containerize them. Containers let you work in a more controlled way, which makes development and deployment easier. This guide will show you how to containerize a Django web app with Docker and explain why it’s a good idea.

We will walk through creating a Docker container for your Django application. Docker gives you a standardized environment, which makes it easier to get up and running and more productive. This tutorial is aimed at those new to Docker who already have some experience with Django. Let’s get started!

Why containerize your Django application?

Django apps can be put into containers to help you work more productively and consistently. Here are the main reasons why you should use Docker for your Django project:

Creates a stable environment: Containers provide a stable environment with all dependencies installed, so you don’t have to worry about “it works on my machine” problems. This ensures that you can reproduce the app and use it on any system or server. Docker makes it simple to set up local environments for development, testing, and production.

Ensures reproducibility and portability: A Dockerized app bundles all the environment variables, dependencies, and configurations, so it always runs the same way. This makes it easier to deploy, especially when you’re moving apps between environments.

Facilitates collaboration between developers: Docker lets your team work in the same environment, so there’s less chance of conflicts from different setups. Shared Docker images make it simple for your team to get started with fewer setup requirements.

Speeds up deployment processes: Docker makes it easier for developers to get started with a new project quickly. It removes the hassle of setting up development environments and ensures everyone is working in the same place, which makes it easier to merge changes from different developers.

Getting started with Django and Docker

Setting up a Django app in Docker is straightforward. You don’t need to do much more than add in the basic Django project files.

Tools you’ll need

To follow this guide, make sure you first:

Install Docker Desktop and Docker Compose on your machine.

Have a Docker Hub account to store and access Docker images.

Make sure Django is installed on your system.

If you need help with the installation, you can find detailed instructions on the Docker and Django websites.

How to Dockerize your Django project

The following six steps include code snippets to guide you through the process.

Step 1: Set up your Django project

1. Initialize a Django project. 

If you don’t have a Django project set up yet, you can create one with the following commands:

django-admin startproject my_docker_django_app
cd my_docker_django_app

2. Create a requirements.txt file. 

In your project, create a requirements.txt file to store dependencies:

pip freeze > requirements.txt

3. Update key environment settings.

You need to change some sections in the settings.py file to enable them to be set using environment variables when the container is started. This allows you to change these settings depending on the environment you are working in.

# settings.py needs "import os" at the top for these lookups
import os

# The secret key
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY")

DEBUG = bool(os.environ.get("DEBUG", default=0))

ALLOWED_HOSTS = os.environ.get("DJANGO_ALLOWED_HOSTS", "127.0.0.1").split(",")

Step 2: Create a Dockerfile

A Dockerfile is a script that tells Docker how to build your Docker image. Put it in the root directory of your Django project. Here’s a basic Dockerfile setup for Django:

# Use the official Python runtime image
FROM python:3.13

# Create the app directory
RUN mkdir /app

# Set the working directory inside the container
WORKDIR /app

# Set environment variables
# Prevents Python from writing pyc files to disk
ENV PYTHONDONTWRITEBYTECODE=1
# Prevents Python from buffering stdout and stderr
ENV PYTHONUNBUFFERED=1

# Upgrade pip
RUN pip install --upgrade pip

# Copy the Django project and install dependencies
COPY requirements.txt /app/

# Install all dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy the Django project to the container
COPY . /app/

# Expose the Django port
EXPOSE 8000

# Run Django’s development server
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]

Each line in the Dockerfile serves a specific purpose:

FROM: Selects the image with the Python version you need.

WORKDIR: Sets the working directory of the application within the container.

ENV: Sets the environment variables needed to build the application

RUN and COPY commands: Install dependencies and copy project files.

EXPOSE and CMD: Expose the Django server port and define the startup command.

You can build the Django Docker container with the following command:

docker build -t django-docker .

To see your image, you can run:

docker image list

The result will look something like this:

REPOSITORY TAG IMAGE ID CREATED SIZE
django-docker latest ace73d650ac6 20 seconds ago 1.55GB

Although this is a great start in containerizing the application, you’ll need to make a number of improvements to get it ready for production.

The CMD running manage.py runserver is only meant for development purposes and should be replaced with a production-grade WSGI server.

Reduce the size of the image by using a smaller base image.

Optimize the image by using a multistage build process.

Let’s get started with these improvements.

Update requirements.txt

Make sure to add gunicorn to your requirements.txt. It should look like this:

asgiref==3.8.1
Django==5.1.3
sqlparse==0.5.2
gunicorn==23.0.0
psycopg2-binary==2.9.10

Make improvements to the Dockerfile

The Dockerfile below has changes that solve the three items on the list. The changes to the file are as follows:

Updated the FROM python:3.13 image to FROM python:3.13-slim. This change reduces the size of the image considerably, as the image now only contains what is needed to run the application.

Added a multi-stage build process to the Dockerfile. When you build applications, there are usually many files left on the file system that are only needed during build time and are not needed once the application is built and running. By adding a build stage, you use one image to build the application and then move the built files to the second image, leaving only the built code. Read more about multi-stage builds in the documentation.

Added the Gunicorn WSGI server to enable a production-ready deployment of the application.

# Stage 1: Base build stage
FROM python:3.13-slim AS builder

# Create the app directory
RUN mkdir /app

# Set the working directory
WORKDIR /app

# Set environment variables to optimize Python
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Upgrade pip and install dependencies
RUN pip install --upgrade pip

# Copy the requirements file first (better caching)
COPY requirements.txt /app/

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Stage 2: Production stage
FROM python:3.13-slim

RUN useradd -m -r appuser && \
    mkdir /app && \
    chown -R appuser /app

# Copy the Python dependencies from the builder stage
COPY --from=builder /usr/local/lib/python3.13/site-packages/ /usr/local/lib/python3.13/site-packages/
COPY --from=builder /usr/local/bin/ /usr/local/bin/

# Set the working directory
WORKDIR /app

# Copy application code
COPY --chown=appuser:appuser . .

# Set environment variables to optimize Python
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Switch to non-root user
USER appuser

# Expose the application port
EXPOSE 8000

# Start the application using Gunicorn
CMD ["gunicorn", "–bind", "0.0.0.0:8000", "–workers", "3", "my_docker_django_app.wsgi:application"]

Build the Docker container image again.

docker build -t django-docker .

After making these changes, we can run a docker image list again:

REPOSITORY TAG IMAGE ID CREATED SIZE
django-docker latest 3c62f2376c2c 6 seconds ago 299MB

You can see a significant improvement in the size of the container.

The size was reduced from 1.55 GB to 299 MB, which leads to a faster deployment process when images are downloaded and lower storage costs when storing images.

You could also use the docker init command to generate the Dockerfile and compose.yml files for your application to get started.

Step 3: Configure the Docker Compose file

A compose.yml file allows you to manage multi-container applications. Here, we’ll define both a Django container and a PostgreSQL database container.

The compose file makes use of an environment file called .env, which will make it easy to keep the settings separate from the application code. The environment variables listed here are standard for most applications:

services:
  db:
    image: postgres:17
    environment:
      POSTGRES_DB: ${DATABASE_NAME}
      POSTGRES_USER: ${DATABASE_USERNAME}
      POSTGRES_PASSWORD: ${DATABASE_PASSWORD}
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    env_file:
      - .env

  django-web:
    build: .
    container_name: django-docker
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      DJANGO_SECRET_KEY: ${DJANGO_SECRET_KEY}
      DEBUG: ${DEBUG}
      DJANGO_LOGLEVEL: ${DJANGO_LOGLEVEL}
      DJANGO_ALLOWED_HOSTS: ${DJANGO_ALLOWED_HOSTS}
      DATABASE_ENGINE: ${DATABASE_ENGINE}
      DATABASE_NAME: ${DATABASE_NAME}
      DATABASE_USERNAME: ${DATABASE_USERNAME}
      DATABASE_PASSWORD: ${DATABASE_PASSWORD}
      DATABASE_HOST: ${DATABASE_HOST}
      DATABASE_PORT: ${DATABASE_PORT}
    env_file:
      - .env

volumes:
  postgres_data:

And the example .env file:

DJANGO_SECRET_KEY=your_secret_key
DEBUG=True
DJANGO_LOGLEVEL=info
DJANGO_ALLOWED_HOSTS=localhost
DATABASE_ENGINE=postgresql_psycopg2
DATABASE_NAME=dockerdjango
DATABASE_USERNAME=dbuser
DATABASE_PASSWORD=dbpassword
DATABASE_HOST=db
DATABASE_PORT=5432

Step 4: Update Django settings and configuration files

1. Configure database settings. 

Update settings.py to use PostgreSQL:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.{}'.format(
            os.getenv('DATABASE_ENGINE', 'sqlite3')
        ),
        'NAME': os.getenv('DATABASE_NAME', 'polls'),
        'USER': os.getenv('DATABASE_USERNAME', 'myprojectuser'),
        'PASSWORD': os.getenv('DATABASE_PASSWORD', 'password'),
        'HOST': os.getenv('DATABASE_HOST', '127.0.0.1'),
        'PORT': os.getenv('DATABASE_PORT', 5432),
    }
}

2. Set ALLOWED_HOSTS to read from environment files. 

In settings.py, set ALLOWED_HOSTS to:

# 'DJANGO_ALLOWED_HOSTS' should be a single string of hosts with a , between each.
# For example: 'DJANGO_ALLOWED_HOSTS=localhost,127.0.0.1,[::1]'
ALLOWED_HOSTS = os.environ.get("DJANGO_ALLOWED_HOSTS","127.0.0.1").split(",")

3. Set the SECRET_KEY to read from environment files.

In settings.py, set SECRET_KEY to:

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY")

4. Set DEBUG to read from environment files.

In settings.py, set DEBUG to:

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = bool(os.environ.get("DEBUG", default=0))

Step 5: Build and run your new Django project

To build and start your containers, run:

docker compose up --build

This command will download any necessary Docker images, build the project, and start the containers. Once complete, your Django application should be accessible at http://localhost:8000.

Step 6: Test and access your application

Once the app is running, you can test it by navigating to http://localhost:8000. You should see Django’s welcome page, indicating that your app is up and running. To verify the database connection, try running a migration:

docker compose run django-web python manage.py migrate

Troubleshooting common issues with Docker and Django

Here are some common issues you might encounter and how to solve them:

Database connection errors: If Django can’t connect to PostgreSQL, verify that your database service name matches in compose.yml and settings.py.

File synchronization issues: Use the volumes directive in compose.yml to sync changes from your local files to the container (see the sketch after this list).

Container restart loops or crashes: Use docker compose logs to inspect container errors and determine the cause of the crash.
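For the file-synchronization case, a minimal sketch of the volumes directive might look like the following, assuming the django-web service from the compose file above and mounting the project root over /app:

services:
  django-web:
    build: .
    volumes:
      - .:/app   # mount the local project directory into the container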

Optimizing your Django web application

To improve your Django Docker setup, consider these optimization tips:

Automate and secure builds: Use Docker’s multi-stage builds to create leaner images, removing unnecessary files and packages for a more secure and efficient build.

Optimize database access: Configure database pooling and caching to reduce connection time and boost performance.

Efficient dependency management: Regularly update and audit dependencies listed in requirements.txt to ensure efficiency and security.
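For example, you might combine pip’s outdated-package report with a Docker Scout scan of the image built earlier in this guide (docker scout cves is a Docker Scout CLI command; the image tag matches the one built above):

# List packages from requirements.txt that have newer releases available
pip list --outdated

# Scan the built image for known CVEs in its packages and base image
docker scout cves django-docker:latest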

Take the next step with Docker and Django

Containerizing your Django application with Docker is an effective way to simplify development, ensure consistency across environments, and streamline deployments. By following the steps outlined in this guide, you’ve learned how to set up a Dockerized Django app, optimize your Dockerfile for production, and configure Docker Compose for multi-container setups.

Docker not only helps reduce “it works on my machine” issues but also fosters better collaboration within development teams by standardizing environments. Whether you’re deploying a small project or scaling up for enterprise use, Docker equips you with the tools to build, test, and deploy reliably.

Ready to take the next step? Explore Docker’s powerful tools, like Docker Hub and Docker Scout, to enhance your containerized applications with scalable storage, governance, and continuous security insights.

Learn more 

Subscribe to the Docker Newsletter. 

Learn more about Docker commands, Docker Compose, and security in the Docker Docs. 

Find Dockerized Django projects for inspiration and guidance in GitHub. 

Discover Docker plugins that improve performance, logging, and security.

Get the latest release of Docker Desktop.

Have questions? The Docker community is here to help.

New to Docker? Get started.


Incident Update: Docker Desktop for Mac 

We want to inform you about a new issue affecting Docker Desktop for some macOS users. This causes Docker Desktop to not start. Some users may also have received malware warnings. Those warnings are inaccurate.  

Current status

We have identified the root cause. A temporary workaround that will restore functionality is available for any affected users. Detailed instructions for the workaround are available on GitHub.

Next steps

Our team is prioritizing this issue and working diligently on a permanent fix. If you prefer to wait for the longer-term patch update, please refrain from (re)-starting Docker Desktop.

We know how important Docker Desktop is to your work, and we’re committed to resolving this issue quickly and effectively. For assistance or additional information, please reach out to our Support team or check the Docker Status page for the latest updates.

How to Set Up a Kubernetes Cluster on Docker Desktop

Kubernetes is an open source platform for automating the deployment, scaling, and management of containerized applications across clusters of machines. It’s become the go-to solution for orchestrating containers in production environments. But if you’re developing or testing locally, setting up a full Kubernetes cluster can be complex. That’s where Docker Desktop comes in — it lets you run Kubernetes directly on your local machine, making it easy to test microservices, CI/CD pipelines, and containerized apps without needing a remote cluster.

Getting Kubernetes up and running can feel like a daunting task, especially for developers working in local environments. But with Docker Desktop, spinning up a fully functional Kubernetes cluster is simpler than ever. Whether you’re new to Kubernetes or just want an easy way to test containerized applications locally, Docker Desktop provides a streamlined solution. In this guide, we’ll walk through the steps to start a Kubernetes cluster on Docker Desktop and offer troubleshooting tips to ensure a smooth experience. 

Note: Docker Desktop’s Kubernetes cluster is designed specially for local development and testing; it is not for production use. 

Benefits of running Kubernetes in Docker Desktop 

The benefits of this setup include: 

Easy local Kubernetes cluster: A fully functional Kubernetes cluster runs on your local machine with minimal setup, handling network access between the host and Kubernetes as well as storage management. 

Easier learning path and developer convenience: For developers familiar with Docker but new to Kubernetes, having Kubernetes built into Docker Desktop offers a low-friction learning path. 

Testing Kubernetes-based applications locally: Docker Desktop gives developers a local environment to test Kubernetes-based microservices applications that require Kubernetes features like services, pods, ConfigMaps, and secrets without needing access to a remote cluster. It also helps developers to test CI/CD pipelines locally. 

How to start Kubernetes cluster on Docker Desktop in three steps

Download the latest Docker Desktop release.

Install Docker Desktop on the operating system of your choice. Currently, the supported operating systems are macOS, Linux, and Windows.

In the Settings menu, select Kubernetes > Enable Kubernetes and then Apply & restart to start a one-node Kubernetes cluster (Figure 1). Typically, the time it takes to set up the Kubernetes cluster depends on your internet speed to pull the needed images.

Figure 1: Starting Kubernetes.

Once the Kubernetes cluster is started successfully, you can see the status from the Docker Desktop dashboard or the command line.

From the dashboard (Figure 2):

Figure 2: Status from the dashboard.

The command-line status:

$ kubectl get node
NAME STATUS ROLES AGE VERSION
docker-desktop Ready control-plane 5d v1.30.2

Getting Kubernetes support

Docker bundles Kubernetes but does not provide official Kubernetes support. If you are experiencing issues with Kubernetes, however, you can get support in several ways, including from the Docker community, Docker guides, and GitHub documentation: 

Docker community forums

Deploy to Kubernetes | Docker Docs

Docker Docs

GitHub documentation:

Docker Desktop for Windows documentation

Docker Desktop for Mac documentation 

Docker Desktop for Linux documentation 

What to do if you experience an issue 

Generate a diagnostics file

Before troubleshooting, generate a diagnostics file using your terminal.

Refer to the documentation for diagnosing from the terminal. For example, if you are using a Mac, run the following command:

/Applications/Docker.app/Contents/MacOS/com.docker.diagnose gather -upload

The command will show you where the diagnostics file is saved:

Gathering diagnostics for ID into /var/folders/50/<Random Character>/<Random Character>/<Machine unique ID>/<YYYYMMDDTTTT>.zip.

In this case, the file is saved at /var/folders/50/<Random Characters>/<Random Characters>/<YYYYMMDDTTTT>.zip. Unzip the file (<YYYYMMDDTTTT>.zip) to find the log files for Docker Desktop.

Check for logs

Checking for logs instead of guessing the issue is good practice. Understanding what Kubernetes components are available and what their functions are is essential before you start troubleshooting. You can narrow down the process by looking at the specific component logs. Look for the keyword error or fatal in the logs. 

Depending on which platform you are using, one method is to use the grep command to search the unzipped files for the keyword, whether from the macOS terminal, a Linux distro under WSL2, or the Linux terminal:

$ grep -Hrni "<keyword>" <The path of the unzipped file>

## For example, one of the errors related to Kubernetes found in the "com.docker.backend.exe" logs:

$ grep -Hrni "error" *
[com.docker.backend.exe.log:[2022-12-05T05:24:39.377530700Z][com.docker.backend.exe][W] starting kubernetes: 1 error occurred:
com.docker.backend.exe.log: * starting kubernetes: pulling kubernetes images: pulling registry.k8s.io/coredns:v1.9.3: Error response from daemon: received unexpected HTTP status: 500 Internal Server Error

Troubleshooting example

Let’s say you notice there is an issue starting up the cluster. This issue could be related to the Kubelet process, which works as a node-level agent to help with container management and orchestration within a Kubernetes cluster. So, you should check the Kubelet logs. 

But, where is the Kubelet log located? It’s at log/vm/kubelet.log in the diagnostics file.

An example of a possible related issue can be found in kubelet.log: the images needed to set up Kubernetes cannot be pulled due to network or internet restrictions, so you might find errors about failing to pull the necessary Kubernetes images.

For example:

starting kubernetes: pulling kubernetes images: pulling registry.k8s.io/coredns:v1.9.3: Error response from daemon: received unexpected HTTP status: 500 Internal Server Error

Normally, 10 images are needed to set up the cluster. The following output is from a macOS running Docker Desktop version 4.33:

$ docker image ls
REPOSITORY TAG IMAGE ID CREATED SIZE
docker/desktop-kubernetes kubernetes-v1.30.2-cni-v1.4.0-critools-v1.29.0-cri-dockerd-v0.3.11-1-debian 5ef3082e902d 4 weeks ago 419MB
registry.k8s.io/kube-apiserver v1.30.2 84c601f3f72c 7 weeks ago 112MB
registry.k8s.io/kube-scheduler v1.30.2 c7dd04b1bafe 7 weeks ago 60.5MB
registry.k8s.io/kube-controller-manager v1.30.2 e1dcc3400d3e 7 weeks ago 107MB
registry.k8s.io/kube-proxy v1.30.2 66dbb96a9149 7 weeks ago 87.9MB
registry.k8s.io/etcd 3.5.12-0 014faa467e29 6 months ago 139MB
registry.k8s.io/coredns/coredns v1.11.1 2437cf762177 11 months ago 57.4MB
docker/desktop-vpnkit-controller dc331cb22850be0cdd97c84a9cfecaf44a1afb6e 3750dfec169f 14 months ago 35MB
registry.k8s.io/pause 3.9 829e9de338bd 22 months ago 514kB
docker/desktop-storage-provisioner v2.0 c027a58fa0bb 3 years ago 39.8MB

You can check whether you successfully pulled the 10 images by running docker image ls. If images are missing, a workaround is to save the missing image using docker image save from a machine that successfully starts the Kubernetes cluster (provided both run the same Docker Desktop version). Then, you can transfer the image to your machine, use docker image load to load the image into your machine, and tag it. 

For example, if the registry.k8s.io/coredns:v<VERSION> image is not available, you can follow these steps (a consolidated example follows this list):

Use docker image save from a machine that successfully starts the Kubernetes cluster to save it as a tar file: docker save registry.k8s.io/coredns:v<VERSION> > <Name of the file>.tar.

Manually transfer the <Name of the file>.tar to your machine.

Use docker image load to load the image on your machine: docker image load < <Name of the file>.tar.

Tag the image if needed. After docker image load, the image usually keeps its original tag; if it does not, tag the loaded image ID with docker image tag <IMAGE ID> registry.k8s.io/coredns:v<VERSION>.

Re-enable the Kubernetes from your Docker Desktop’s settings.

Check other logs in the diagnostics log.
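Putting those steps together, and using the coredns image and v1.11.1 tag from the docker image ls output above as a stand-in for <VERSION>, the workaround might look like this:

# On a machine where the Kubernetes cluster starts successfully:
docker image save registry.k8s.io/coredns/coredns:v1.11.1 > coredns.tar

# Transfer coredns.tar to the affected machine, then load it:
docker image load < coredns.tar

# Confirm the image and its tag are present before re-enabling Kubernetes:
docker image ls registry.k8s.io/coredns/coredns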

What to look for in the diagnostics log

In the diagnostics log, look for the folder named kube/. (Note that <kube> below refers to kubectl on macOS and Linux, and kubectl.exe on Windows.)

kube/get-namespaces.txt: Lists all the namespaces; equal to <kube> --context docker-desktop get namespaces.

kube/describe-nodes.txt: Describes the docker-desktop node; equal to <kube> --context docker-desktop describe nodes.

kube/describe-pods.txt: Description of all pods running in the Kubernetes cluster.

kube/describe-services.txt: Description of the services running, equal to <kube> --context docker-desktop describe services --all-namespaces.

You also can find other useful Kubernetes logs in the mentioned folder.

Search for known issues

For any error message found in the steps above, you can search for known Kubernetes issues on GitHub to see if a workaround or any future permanent fix is planned.

Reset or reboot 

If the previous steps weren’t helpful, try a reboot. And, if a reboot is not helpful, the last alternative is to reset your Kubernetes cluster, which often helps resolve issues:

Reboot: To reboot, restart your machine. Rebooting a machine in a Kubernetes cluster can help resolve issues by clearing transient states and restoring the system to a clean state.

Reset: For a reset, navigate to Settings > Kubernetes > Reset the Kubernetes Cluster. Resetting a Kubernetes cluster can help resolve issues by essentially reverting the cluster to a clean state, and clearing out misconfigurations, corrupted data, or stuck resources that may be causing problems.

Bringing Kubernetes to your local development environment

This guide offers a straightforward way to start a Kubernetes cluster on Docker Desktop, making it easier for developers to test Kubernetes-based applications locally. It covers key benefits like simple setup, a more accessible learning path for beginners, and the ability to run tests without relying on a remote cluster. We also provide some troubleshooting tips and resources for resolving common issues. 

Whether you’re just getting started or looking to improve your local Kubernetes workflow, give it a try and see what you can achieve with Docker Desktop’s Kubernetes integration.

Learn more

Subscribe to the Docker Newsletter. 

Get the latest release of Docker Desktop.

On-demand training: Container Orchestration 101

Docker and Kubernetes

Docker community forums

Deploy to Kubernetes | Docker Docs

Docker Docs

Docker Compose Bridge

GitHub documentation

Docker Desktop for Windows documentation

Docker Desktop for Mac documentation 

Docker Desktop for Linux documentation 

Have questions? The Docker community is here to help.

New to Docker? Get started.


Mastering Peak Software Development Efficiency with Docker

In modern software development, businesses are searching for smarter ways to streamline workflows and deliver value faster. For developers, this means tackling challenges like collaboration and security head-on, while driving efficiency that contributes directly to business performance. But how do you address potential roadblocks before they become costly issues in production? The answer lies in optimizing the development inner loop — a core focus for the future of app development.

By identifying and resolving inefficiencies early in the development lifecycle, software development teams can overcome common engineering challenges such as slow dev cycles, spiraling infrastructure costs, and scaling challenges. With Docker’s integrated suite of development tools, developers can achieve new levels of engineering efficiency, creating high-quality software while delivering real business impact.

Let’s explore how Docker is transforming the development process, reducing operational overhead, and empowering teams to innovate faster.

Speed up software development lifecycles: Faster gains with less effort

A fast software development lifecycle is a crucial aspect for delivering value to users, maintaining a competitive edge, and staying ahead of industry trends. To enable this, software developers need workflows that minimize friction and allow them to iterate quickly without sacrificing quality. That’s where Docker makes a difference. By streamlining workflows, eliminating bottlenecks, and automating repetitive tasks, Docker empowers developers to focus on high-impact work that drives results.

Consistency across development environments is critical for improving speed. That’s why Docker helps developers create consistent environments across local, test, and production systems. In fact, a recent study reported developers experiencing a 6% increase in productivity when leveraging Docker Business. This consistency eliminates guesswork, ensuring developers can concentrate on writing code and improving features rather than troubleshooting issues. With Docker, applications behave predictably across every stage of the development lifecycle.

Docker also accelerates development by significantly reducing time spent on iteration and setup. More specifically, organizations leveraging Docker Business achieved a three-month faster time-to-market for revenue-generating applications. Engineering teams can move swiftly through development stages, delivering new features and bug fixes faster. By improving efficiency and adapting to evolving needs, Docker enables development teams to stay agile and respond effectively to business priorities.

Improve scaling agility: Flexibility for every scenario

Scalability is another essential for businesses to meet fluctuating demands and seize opportunities. Whether handling a surge in user traffic or optimizing resources during quieter periods, the ability to scale applications and infrastructure efficiently is a critical advantage. Docker makes this possible by enabling teams to adapt with speed and flexibility.

Docker’s cloud-native approach allows software engineering teams to scale up or down with ease to meet changing requirements. This flexibility supports experimentation with cutting-edge technologies like AI, machine learning, and microservices without disrupting existing workflows. With this added agility, developers can explore new possibilities while maintaining focus on delivering value.

Whether responding to market changes or exploring the potential of emerging tools, Docker equips companies to stay agile and keep evolving, ensuring their development processes are always ready to meet the moment.

Optimize resource efficiency: Get the most out of what you’ve got

Maximizing resource efficiency is crucial for reducing costs and maintaining agility. By making the most of existing infrastructure, businesses can avoid unnecessary expenses and minimize cloud scaling costs, meaning more resources for innovation and growth. Docker empowers teams to achieve this level of efficiency through its lightweight, containerized approach.

Docker containers are designed to be resource-efficient, enabling multiple applications to run in isolated environments on the same system. Unlike traditional virtual machines, containers minimize overhead while maintaining performance, consolidating workloads, and lowering the operational costs of maintaining separate environments. For example, a leading beauty company reduced infrastructure costs by 25% using Docker’s enhanced CPU and memory efficiency. This streamlined approach ensures businesses can scale intelligently while keeping infrastructure lean and effective.

By containerizing applications, businesses can optimize their infrastructure, avoiding costly upgrades while getting more value from their current systems. It’s a smarter, more efficient way to ensure your resources are working at their peak, leaving no capacity underutilized.

Establish cost-effective scaling: Growth without growing pains

Similarly, scaling efficiently is essential for businesses to keep up with growing demands, introduce new features, or adopt emerging technologies. However, traditional scaling methods often come with high upfront costs and complex infrastructure changes. Docker offers a smarter alternative, enabling development teams to scale environments quickly and cost-effectively.

With a containerized model, infrastructure can be dynamically adjusted to match changing needs. Containers are lightweight and portable, making it easy to scale up for spikes in demand or add new capabilities without overhauling existing systems. This flexibility reduces financial strain, allowing businesses to grow sustainably while maximizing the use of cloud resources.

Docker ensures that scaling is responsive and budget-friendly, empowering teams to focus on innovation and delivery rather than infrastructure costs. It’s a practical solution to achieve growth without unnecessary complexity or expense.

Software engineering efficiency at your fingertips

The developer community consistently ranks Docker highly, including choosing it as the most-used and most-admired developer tool in Stack Overflow’s Developer Survey. With Docker’s suite of products, teams can reach a new level of efficient software development by streamlining the dev lifecycle, optimizing resources, and providing agile, cost-effective scaling solutions. By simplifying complex processes in the development inner loop, Docker enables businesses to deliver high-quality software faster while keeping operational costs in check. This allows developers to focus on what they do best: building innovative, impactful applications.

By removing complexity, accelerating development cycles, and maximizing resource usage, Docker helps businesses stay competitive and efficient. And ultimately, their teams can achieve more in less time — meeting market demands with efficiency and quality.

Ready to supercharge your development team’s performance? Download our white paper to see how Docker can help streamline your workflow, improve productivity, and deliver software that stands out in the market.

Learn more

Find a Docker plan that’s right for you.

Subscribe to the Docker Newsletter. 

Get the latest release of Docker Desktop.

New to Docker? Get started.


Why Secure Development Environments Are Essential for Modern Software Teams

“You don’t want to think about security — until you have to.”

That’s what I’d tell you if I were being honest about the state of development at most organizations I have spoken to. Every business out there is chasing one thing: speed. Move faster. Innovate faster. Ship faster. To them, speed is survival. There’s something these companies are not seeing — a shadow. An unseen risk hiding behind every shortcut, every unchecked tool, and every corner cut in the name of “progress.”

Businesses are caught in a relentless sprint, chasing speed and progress at all costs. However, as Cal Newport reminds us in Slow Productivity, the race to do more — faster — often leads to chaos, inefficiency, and burnout. Newport’s philosophy calls for deliberate, focused work on fewer tasks with greater impact. This philosophy isn’t just about how individuals work — it’s about how businesses innovate. Development teams rushing to ship software often cut corners, creating vulnerabilities that ripple through the entire supply chain. 

The strategic risk: An unsecured development pipeline

Development environments are the foundation of your business. You may think they’re inherently secure because they’re internal. Foundations crumble when you don’t take care of them, and that crack doesn’t just swallow your software — it swallows established customer trust and reputation. That’s how it starts: a rogue tool here, an unpatched dependency there, a developer bypassing IT to do things “their way.” They’re not trying to ruin your business. They’re trying to get their jobs done. But sometimes you can’t stop a fire after it’s started. Shadow IT isn’t just inconvenient — it’s dangerous. It’s invisible, unmonitored, and unregulated. It’s the guy leaving the back door open in a neighborhood full of burglars.

You need control, isolation, and automation — not because they’re nice to have, but because you’re standing on a fault line without them. Docker gives you that control. Fine-grained, role-based access ensures that the only people touching your most critical resources are the ones you trust. Isolation through containerization keeps every piece of your pipeline sealed tight so vulnerabilities don’t spread. Automation takes care of the updates, the patch management, and the vulnerabilities before they become a problem. In other words, you don’t have to hope your foundation is solid — you’ll know it is.

Shadow IT: A growing concern

While securing official development environments is critical, shadow IT remains an insidious and hidden threat. Shadow IT refers to tools, systems, or environments implemented without explicit IT approval or oversight. In the pursuit of speed, developers may bypass formal processes to adopt tools they find convenient. However, this creates unseen vulnerabilities with far-reaching consequences.

In the pursuit of performative busywork, developers often take shortcuts, grabbing tools and spinning up environments outside the watchful eyes of IT. The intent may not be malicious; it’s just human nature. Here’s the catch: What you don’t see, you can’t protect. Shadow IT is like a crack in the dam: silent, invisible, and spreading. It lets unvetted tools and insecure code slip into your supply chain, infecting everything from development to production. Before you know it, that “quick fix” has turned into a legal nightmare, a compliance disaster, and a stain on your reputation. In industries like finance or healthcare, that stain doesn’t wash out quickly. 

A solution rooted in integration

The solution lies in a unified, secure approach to development environments that removes the need for shadow IT while fortifying the software supply chain. Docker addresses these vulnerabilities by embedding security directly into the development lifecycle. Our solution is built on three foundational principles: control, isolation, and automation.

Control through role-based access management: Docker Hub establishes clear boundaries within development environments by enabling fine-grained, role-based access. You want to ensure that only authorized personnel can interact with sensitive resources, which will ideally minimize the risk of unintended or malicious actions. Docker also enables publishers to enforce role-based access controls, ensuring only authorized users can interact with development resources. It streamlines patch management through verified, up-to-date images. Docker Official Images and Docker Verified Publisher content are scanned with our in-house image analysis tool, Docker Scout. This helps find vulnerabilities before they can be exploited.

Isolation through containerization: Docker’s value proposition centers on its containerization technology. By creating isolated development spaces, Docker prevents cross-environment contamination and ensures that applications and their dependencies remain secure throughout the development lifecycle.

Automation for seamless security: Recognizing the need for speed in modern development cycles, Docker integrates recommendations with Scout through recommendations for software updates and patch management for CVEs. This ensures that environments remain secure against emerging threats without interrupting the flow of innovation.

Delivering tangible business outcomes

Businesses are always going to face this tension between speed and security, but the truth is you don’t have to choose. Docker gives you both. It’s not just a platform; it’s peace of mind. Because when your foundation is solid, you stop worrying about what could go wrong. You focus on what comes next.

Consider the example of a development team working on a high-stakes application feature. Without secure environments, a single oversight — such as an unregulated access point — can result in vulnerabilities that disrupt production and erode customer trust. By leveraging Docker’s integrated security solutions, the team mitigates these risks, enabling them to focus on value creation rather than crisis management.

Aligning innovation with security

As a previous post covers, securing the development pipeline is not simply deploying technical solutions but establishing trust across the entire software supply chain. With Docker Content Trust and image signing, organizations can ensure the integrity of software components at every stage, reducing the risk of third-party code introducing unseen vulnerabilities. By eliminating the chaos of shadow IT and creating a transparent, secure development process, businesses can mitigate risk without slowing the pace of innovation.

The tension between speed and security has long been a barrier to progress, but businesses can confidently pursue both with Docker. A secure development environment doesn’t just protect against breaches — it strengthens operational resilience, ensures regulatory compliance, and safeguards brand reputation. Docker empowers organizations to innovate on a solid foundation as unseen risks lurk within an organization’s fragmented tools and processes. 

Security isn’t a luxury. It’s the cost of doing business. If you care about growth, if you care about trust, if you care about what your brand stands for, then securing your development environments isn’t optional — it’s survival. Docker Business doesn’t just protect your pipeline; it turns it into a strategic advantage that lets you innovate boldly while keeping your foundation unshakable. Integrity isn’t something you hope for — it’s something you build.

Start today

Securing your software supply chain is a critical step in building resilience and driving sustained innovation. Docker offers the tools to create fortified development environments where your teams can operate at their best.

The question is not whether to secure your development pipeline — it’s how soon you can start. Explore Docker Hub and Scout today to transform your approach to innovation and security. In doing so, you position your organization to navigate the complexities of the modern development landscape with confidence and agility.

Learn more

Subscribe to the Docker Newsletter. 

Visit Docker’s trusted content page.

Get the latest release of Docker Desktop.

Have questions? The Docker community is here to help.

New to Docker? Get started.


The Model Context Protocol: Simplifying Building AI apps with Anthropic Claude Desktop and Docker

Anthropic recently unveiled the Model Context Protocol (MCP), a new standard for connecting AI assistants and models to reliable data and tools. However, packaging and distributing MCP servers is very challenging due to complex environment setups across multiple architectures and operating systems. Docker is the perfect solution for this: it allows developers to encapsulate their development environment into containers, ensuring consistency across all team members’ machines and making deployments consistent and predictable. In this blog post, we provide a few examples of using Docker to containerize Model Context Protocol (MCP) servers to simplify building AI applications.

What is Model Context Protocol (MCP)?

MCP (Model Context Protocol), a new protocol open-sourced by Anthropic, provides standardized interfaces for LLM applications to integrate with external data sources and tools. With MCP, your AI-powered applications can retrieve data from external sources, perform operations with third-party services, or even interact with local filesystems.

Among the use cases enabled by this protocol is the ability to expose custom tools to AI models. This provides key capabilities such as:

Tool discovery: Helping LLMs identify tools available for execution

Tool invocation: Enabling precise execution with the right context and arguments

Since its release, the developer community has been particularly energized. We asked David Soria Parra, Member of Technical Staff from Anthropic, why he felt MCP was having such an impact: “Our initial developer focus means that we’re no longer bound to one specific tool set.  We are giving developers the power to build for their particular workflow.”

How does MCP work? What challenges exist?

MCP works by introducing the concept of MCP clients and MCP servers: clients request resources, and servers handle those requests and perform the requested actions. MCP clients are often embedded in LLM-based applications, such as the Claude Desktop app. The client launches MCP servers, which then carry out the desired work using whatever additional tools, languages, or processes they need.

Examples of tools include filesystem access, GitHub and GitLab repo management, integrations with Slack, or retrieving or modifying state in Kubernetes clusters.

Figure 1: A high-level architecture diagram of MCP client and server interactions

The goal of MCP servers is to provide reusable toolsets that can be shared across clients like Claude Desktop: write one set of tools and use it across many LLM-based applications. But packaging and distributing these servers is currently a challenge. Specifically:

Environment conflicts: Installing MCP servers often requires specific versions of Node.js, Python, and other dependencies, which may conflict with existing installations on a user’s machine

Lack of host isolation: MCP servers currently run on the host, granting access to all host files and resources

Complex setup: MCP servers currently require users to download and configure all of the code and configure the environment, making adoption difficult

Cross-platform challenges: Running the servers consistently across different architectures (e.g., x86 vs. ARM) or operating systems (e.g., Windows vs. Mac) introduces additional complexity

Dependencies: Ensuring that server-specific runtime dependencies are encapsulated and distributed safely.

How does Docker help?

Docker solves these challenges by providing a standardized method and tooling to develop, package, and distribute applications, including MCP servers. Packaging MCP servers as containers removes the problems of environment conflicts and missing host isolation: users can simply run a container rather than spend time installing dependencies and configuring the runtime.
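
As an illustration of that difference, compare a host-based server entry in claude_desktop_config.json with a containerized one, using the filesystem reference server as an example. This is a sketch rather than a canonical configuration: the npm package name, the mcp/filesystem image, the mount syntax, and the /Users/you/Desktop path reflect the reference documentation as we understood it at the time of writing, so check the server’s listing for the exact entry, and remove the // annotations before saving the real JSON file:

{
  "mcpServers": {
    "filesystem-host": {
      // host-based: requires a local Node.js install and runs directly on the host
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/Desktop"]
    },
    "filesystem-docker": {
      // containerized: nothing to install locally; the server sees only the mounted directory
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--mount", "type=bind,src=/Users/you/Desktop,dst=/projects/Desktop",
        "mcp/filesystem", "/projects"
      ]
    }
  }
}

The host-based entry depends on whatever Node.js version happens to be installed, while the containerized entry pulls everything it needs as an image and exposes only the directory you explicitly mount.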

Docker Desktop provides a development platform to build, test, and run these MCP servers. Docker Hub is the world’s largest repository of container images, making it the ideal choice to distribute containerized MCP servers. Docker Scout helps ensure images are kept secure and free of vulnerabilities. Docker Build Cloud helps you build images more quickly and reliably, especially when cross-platform builds are required.

The Docker suite of products brings benefits to both publishers and consumers: publishers can easily package and distribute their servers, and consumers can easily download and run them with little to no configuration.

Again quoting David Soria Parra, 

“Building an MCP server for ffmpeg would be a tremendously difficult undertaking without Docker. Docker is one of the most widely used packaging solutions for developers. The same way it solved the packaging problem for the cloud, it now has the potential to solve the packaging problem for rich AI agents”. 

Figure 2: Architecture diagram demonstrating MCP servers running in a Docker container

As we continue to explore how MCP allows us to connect to existing ecosystems of tools, we also envision MCP bridges to existing containerized tools.

Figure 3: Architecture diagram that shows a single MCP server calling multiple tools in their own containers

Try it yourself with containerized reference servers

Alongside the specification, Anthropic published an initial set of reference servers. We have worked with the Anthropic team to create Docker images for these servers and make them available from the new Docker Hub mcp namespace.

You can try this out today by using Claude Desktop as the MCP client and Docker Desktop to run any of the reference servers; just update your claude_desktop_config.json file.

The list of current servers documents how to update claude_desktop_config.json to activate these MCP server Docker containers on your local host.

Using Puppeteer to take and modify screenshots with Docker

This demo uses Claude Desktop and Docker Desktop with the Puppeteer MCP server to take a screenshot of a website and invert its colors. Doing this without a containerized environment requires quite a bit of setup; with containers, it is fairly trivial.

Update your claude_desktop_config.json file to include the following entry, which extends Claude Desktop with the Puppeteer server for browser automation and web scraping:

{
  "mcpServers": {
    "puppeteer": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--init", "-e", "DOCKER_CONTAINER=true", "mcp/puppeteer"]
    }
  }
}

Restart Claude Desktop to apply the changed config file.

Submit the following prompt using the Claude 3.5 Sonnet model: “Take a screenshot of docs.docker.com and then invert the colors.”

Claude will run through several consent screens, asking you to confirm that you’re okay running these new tools.

After a brief moment, you’ll have your requested screenshot.

What happened? Claude planned out a series of tool calls, started the Puppeteer MCP server in a container, and then used the headless browser in that container to navigate to the site, take a screenshot, invert the colors on the page, and finally capture a screenshot of the altered page.
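
Under the hood, each of those steps is a tools/call request from Claude, routed by Claude Desktop to the containerized server. The sequence below is a simplified, hand-written sketch of that exchange: the tool names and argument fields reflect the Puppeteer reference server as we understood it at the time of writing and should be treated as illustrative, the JSON-RPC envelopes are omitted, and the // lines are annotations only:

// 1. Open the page in the container's headless browser
{ "method": "tools/call", "params": { "name": "puppeteer_navigate", "arguments": { "url": "https://docs.docker.com" } } }

// 2. Capture the original page
{ "method": "tools/call", "params": { "name": "puppeteer_screenshot", "arguments": { "name": "docs-original" } } }

// 3. Run JavaScript inside the page to invert the colors
{ "method": "tools/call", "params": { "name": "puppeteer_evaluate", "arguments": { "script": "document.documentElement.style.filter = 'invert(1)'" } } }

// 4. Capture the altered page
{ "method": "tools/call", "params": { "name": "puppeteer_screenshot", "arguments": { "name": "docs-inverted" } } }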

Figure 4: Running Dockerized Puppeteer in Claude Desktop to invert colors on https://docs.docker.com/

Next steps

There’s already a lot that developers can try with this first set of servers. For an educational glimpse into what’s possible with database containers, we recommend that you connect the sqlite server container and run the sample prompt that it provides. It’s an eye-opening display of what’s already possible today. Plus, the demo is containerized!
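
If you want to wire that up yourself, the claude_desktop_config.json entry can look roughly like the following. The mcp/sqlite image lives in the Docker Hub mcp namespace, but the volume name, database path, and --db-path flag shown here are our best reading of the server’s documentation at the time of writing, so verify them against the current listing and remove the // annotation before saving the file:

{
  "mcpServers": {
    "sqlite": {
      "command": "docker",
      // the named volume keeps the demo database around between container runs
      "args": ["run", "--rm", "-i", "-v", "mcp-test:/mcp", "mcp/sqlite", "--db-path", "/mcp/test.db"]
    }
  }
}

Restart Claude Desktop, ask Claude to run the sample prompt the server provides, and watch it create, populate, and query tables entirely inside the container.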

We’re busy adding more content to enable you to easily build and distribute your own MCP Docker images. We are also encouraging and working closely with the community to package more servers as Docker containers. Please reach out with questions in the discussion group.

Learn more

Read the Docker Desktop release collection to see even more updates and innovation announcements.

Subscribe to the Docker Navigator newsletter.

Subscribe to the Docker Labs: GenAI newsletter.

Discover the upgraded Docker plans. 

See what’s new in Docker Desktop.

Source: https://blog.docker.com/feed/

Recipe for Efficient Development: Simplify Collaboration and Security with Docker

Collaboration and security are essential for delivering high-quality applications in modern software development, especially in cloud-native environments. Developers navigate intricate workflows, connect diverse systems, and safeguard applications against emerging threats — all while maintaining velocity and efficiency.

Think of development as preparing a multi-course meal in a high-pressure, professional kitchen, where precision, timing, and communication are critical. Each developer is a chef working on different parts of the dish, passing ingredients (code) along the way. When one part of the system encounters delays, it can ripple across the process, impacting the final result. Similarly, poor collaboration or security gaps can derail a project, causing delays and inefficiencies. 

Docker serves as the kitchen manager, ensuring everything flows smoothly, ingredients are passed securely, and security is integrated from start to finish.

Seamless collaboration with Docker Hub and Testcontainers Cloud

Success in a professional kitchen depends on clear communication and coordination. In development, it’s no different. Docker’s collaboration tools, like Docker Hub and Testcontainers Cloud, simplify how teams work together, share resources, and test efficiently.

Docker Hub can be thought of as a kitchen’s “prepped ingredients station.” It’s where some of the most essential ingredients are always ready to go. With a vast selection of curated, trusted images, developers can quickly access high-quality, pre-configured containers, ensuring consistency and reducing the chance of mistakes.

Testcontainers Cloud is like the kitchen’s test station, providing on-demand, production-like environments for testing. Developers can spin up these environments quickly, reducing setup time and ensuring code performs in a real-world setting. 

Effective coordination is critical whether you’re in a kitchen or on a development team, especially when projects involve distributed or hybrid teams. Clear communication ensures everyone is aligned and productive. The Docker suite of products provides the tools that make it possible for companies to more easily break down silos, share resources seamlessly, and ensure alignment — no matter how large your team is or where they work!

By streamlining collaboration, Docker reduces complexity and allows teams to move forward with confidence. With Docker Hub, Testcontainers Cloud, and integrated security features, teams can share resources, track progress, and catch issues early, enabling them to deliver high-quality results on time.

These tools improve efficiency, reduce errors, and help teams move faster through the development inner loop by making collaboration seamless and resource sharing simple.

Integrated security from code to production

Embedding security into every step of development is essential to maintaining speed and delivering high-quality software. Docker builds security into the development process itself, so teams can identify and fix issues earlier than ever.

Docker Scout monitors container images in real time, surfacing vulnerabilities early so your software stays production-ready. By identifying and resolving risks early, developers can maintain high-quality standards and accelerate time to market.

Docker also integrates additional security features that work behind the scenes:

Image Access Management (IAM) ensures that only trusted images are used.

Registry Access Management (RAM) restricts access to approved registries.

Hardened Docker Desktop (HDD) strengthens container isolation, protecting against software supply chain risks like malware.

Trusted Content offers developers secure, reliable base images, reducing vulnerabilities in the app’s foundation.

By building security into the workflow, Docker helps teams identify risks earlier, improve code quality, and maintain momentum without compromising safety.

Efficiency in action with Docker

Speed, collaboration, and security are paramount in today’s development landscape. Docker simplifies and secures the development process, helping teams collaborate efficiently and deliver secure, high-quality software faster.

Just as a well-managed kitchen runs smoothly, Docker helps development teams stay coordinated, ensuring security and productivity work together in perfect harmony. Docker removes complexity, accelerates delivery, and embeds security, enabling teams to create efficient, secure applications on time.

Ready to boost efficiency and collaboration in your development process? Explore the Docker suite of products to see how they can streamline your workflow and improve your team’s productivity today. 

To learn more about fueling development efficiency, download our white paper, Reducing Every-Day Complexities for More Efficient Software Development with Docker.
Source: https://blog.docker.com/feed/

Building Trust into Your Software with Verified Components

Within software development, security and compliance are more than simple boxes to check. Each attestation and compliance check is backed by a well-considered risk assessment that aims to avoid ever-changing vulnerabilities and attack vectors. Software development teams don’t want to worry about vulnerabilities when they are focused on building something remarkable.

In this article, we explain how Docker Hub and Docker Scout can help development teams ensure a more secure and compliant software supply chain. 

Security starts with trusted foundations

Every structure needs a strong foundation. A weak base is where cracks begin to show. Using untrusted or outdated software is like building a skyscraper on sand, and security issues can derail progress, leading to costly fixes and delayed releases. By “shifting security left” — addressing vulnerabilities early in the development process — teams can avoid these setbacks down the road.  

Modern development demands a secure and compliant software supply chain. Unverified software or vulnerabilities buried deep within base images can become costly compliance issues, disrupting development timelines and eroding customer trust. One weak link in the supply chain can snowball into more significant issues, affecting product delivery and customer satisfaction. Without security and compliance checks, organizations will lack the credibility their customers rely on.

How Docker Hub and Scout help teams shift left

Software developers are like a construction crew building a skyscraper. The process requires specialized components — windows, elevators, wiring, concrete, and so on — which are found at a single supply depot and which work in harmony with each other. This idea is similar to microservices, which are pieced together to create modern applications. In this analogy, Docker Hub acts as the supply depot for a customer’s software supply chain, stocked with trusted container images that help developer teams streamline development.

Docker Hub is more than a container registry. It is the most widely trusted content distribution platform built on secure, verified, and dependable container images. Docker Official Images (DOI) and Docker Verified Publisher (DVP) programs provide a rock-solid base to help minimize risks and let development teams focus on creating their projects. 

Docker Hub simplifies supply chain security by ensuring developers start with trusted components. Its library of official and verified publisher images offers secure, up-to-date resources vetted for compliance and reliability, eliminating the risk of untrusted or outdated components.

Proactive risk management is critical to software development

To avoid breaking production environments, organizations need to plan ahead by catching and tracking common vulnerabilities and exposures (CVEs) early in the development process. Docker Scout enables proactive risk management by integrating security checks early in the development lifecycle. Scout reduces the likelihood of security incidents and streamlines the development process.

Additionally, Docker Scout Health Scores provide a straightforward framework for evaluating the security posture of container images used daily by development teams. Using an easy-to-understand alphabetical grading system (A to F), these scores assess CVEs in software components within Docker Hub. This feature lets developers quickly evaluate and select trusted content, ensuring a secure software supply chain.

Avoid shadow changes with IAM and RBAC for secure collaboration

Compliance is not glamorous, but it is essential to running a business. Development teams don’t want to have to worry about whether they are meeting industry standards — they want to know they are. Docker Hub makes compliance simple with pre-certified images and many features that take the guesswork out of governance. That means you can stay compliant while your teams keep growing and innovating.

The biggest challenge to scaling a team or growing your development operations is not about adding people — it’s about maintaining control without losing momentum. Tracking, reducing, and managing shadow changes means that your team keeps its flow state and its development velocity.

Docker Hub’s Image Access Management (IAM) enforces precise permissions to ensure that only authorized people have access to modify sensitive information in repositories. Additionally, with role-based access control (RBAC), you’re not just delegating; you’re empowering your team with predefined roles that streamline onboarding, reduce mistakes, and keep everyone moving in harmony.

Docker Hub’s activity logs provide another layer of confidence as they let you track changes, enforce compliance, and build trust. These capabilities enhance security and boost collaboration by creating an environment where team members can focus on delivering high-quality applications.

Built-in trust

Without verified components, development teams can end up playing whack-a-mole with vulnerabilities. Time is lost. Money is spent. Trust is damaged. Now, picture a team working with trusted content and images that integrate security measures from the start. They deliver on time, on budget, and with confidence.

Building security into your applications doesn’t slow you down; it’s your superpower. Docker weaves trust and security into every part of your development process. Your applications are safeguarded, your delivery is accelerated, and your team is free to focus on what matters most — creating value.

Start your journey today. With Docker, you’re not just developing applications but building trust. Learn how trusted components help simplify compliance, enhance security, and empower your team to innovate fearlessly. 

Learn more

Subscribe to the Docker Newsletter. 

Visit Docker’s trusted content page.

Get the latest release of Docker Desktop.

Have questions? The Docker community is here to help.

New to Docker? Get started.

Source: https://blog.docker.com/feed/