Scaling Docker Compose Up
Docker Compose’s simplicity — just run compose up — has been an integral part of developer workflows for a decade, with the first commit landing in 2013, back when the project was called Plum. Although the feature set has grown dramatically in that time, preserving that experience has always been central to the spirit of Compose.
In this post, we’ll walk through how to manage microservice sprawl with Docker Compose by importing subprojects from other Git repos.
Maintaining simplicity
Now, perhaps more than ever, that simplicity is key. The complexity of modern software development is undeniable regardless of whether you’re using microservices or a monolith, deploying to the cloud or on-prem, or writing in JavaScript or C.
Compose has not kept up with this “development sprawl” and can even become an obstacle when working on larger, more complex projects. Maintaining a Compose configuration that accurately represents an increasingly complex application can require its own expertise, often resulting in out-of-date YAML or a tangle of complex Makefile tasks.
As an open source project, Compose serves everyone from home lab enthusiasts to transcontinental corporations, which is no small feat, and our commitment to maintaining Compose’s signature simplicity for all users hasn’t changed.
The increased flexibility afforded by Compose watch and include means your project no longer needs to be one-size-fits-all. Now, it’s possible to split your project across Git repos and import services as needed, customizing their configuration in the process.
Application architecture
Let’s take a look at a hypothetical application architecture. To begin, the application is split across two Git repos:
backend — Backend in Python/Flask
frontend — Single-page app (SPA) frontend in JavaScript/Node.js
While working on the frontend, developers run without Docker or Compose, launching npm start directly on their laptops and proxying API requests to a shared staging server (rather than running the backend locally). Meanwhile, backend developers and CI (for integration tests) share a Compose file and rely on command-line tools like cURL to manually test functionality locally.
We’d like a flexible configuration that lets each group of developers keep their optimal workflow (e.g., leveraging hot reload for the frontend) while still sharing project configuration between repos. At first glance, these requirements can seem impossible to reconcile.
Frontend
We can start by adding a compose.yaml file to frontend:
services:
  frontend:
    pull_policy: build
    build:
      context: .
    environment:
      BACKEND_HOST: ${BACKEND_HOST:-https://staging.example.com}
    ports:
      - 8000:8000
Note: If you’re wondering what the Dockerfile looks like, take a look at this samples page for an up-to-date example of best practices generated by docker init.
This is a great start! Running docker compose up will now build the Node.js frontend and make it accessible at http://localhost:8000/.
The BACKEND_HOST environment variable controls where upstream API requests are proxied and defaults to our shared staging instance.
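Because BACKEND_HOST is interpolated with a fallback value, it’s easy for an individual developer to point their checkout somewhere else. As a rough sketch (the dev-backend.example.com URL is just a placeholder), a compose.override.yaml placed next to compose.yaml is merged in automatically:

# compose.override.yaml: hypothetical local override, typically not checked in.
# Compose merges this file with compose.yaml automatically when both are present.
services:
  frontend:
    environment:
      BACKEND_HOST: https://dev-backend.example.com  # placeholder for an alternate backend

Setting BACKEND_HOST in the shell before running docker compose up has the same effect, thanks to the default in the interpolation expression.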
Unfortunately, we’ve lost the great developer experience afforded by hot module reload (HMR) because everything now runs inside the container. By adding a develop.watch section, we can bring it back:
services:
  frontend:
    pull_policy: build
    build:
      context: .
    environment:
      BACKEND_HOST: ${BACKEND_HOST:-https://staging.example.com}
    ports:
      - 8000:8000
    develop:
      watch:
        - path: package.json
          action: rebuild
        - path: src/
          target: /app/src
          action: sync
Now, while working on the frontend, developers keep the rapid iteration cycles that HMR provides. Whenever a file in the src/ directory is modified locally, it’s synchronized into the container at /app/src. If the package.json file is modified, the entire container is rebuilt, so that the RUN npm install step in the Dockerfile is re-executed and installs the latest dependencies. The best part is that the only change to the workflow is running docker compose watch instead of npm start.
Backend
Now, let’s set up a Compose file in backend:
services:
  backend:
    pull_policy: build
    build:
      context: .
    ports:
      - 1234:8080
    develop:
      watch:
        - path: requirements.txt
          action: rebuild
        - path: ./
          target: /app/
          action: sync

include:
  - path: git@github.com:myorg/frontend.git
    env_file: frontend.env
frontend.env:
BACKEND_HOST=http://backend:8080
Much of this looks very similar to the frontend compose.yaml.
When files in the project directory change locally, they’re synchronized to /app inside the container, so the Flask dev server can handle hot reload. If requirements.txt is changed, the entire container is rebuilt, so that the RUN pip install step in the Dockerfile is re-executed and installs the latest dependencies.
However, we’ve also added an include section that references the frontend project by its Git repository. The custom env_file points to a local path (in the backend repo) that sets BACKEND_HOST, so the frontend service container proxies API requests to the backend service container instead of the staging default.
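To make the mechanism concrete, here’s a rough sketch of what the included frontend service effectively looks like once frontend.env has been applied during interpolation (simplified; the build and watch settings from the frontend compose.yaml carry over unchanged):

# Sketch of the interpolated frontend service after the include
services:
  frontend:
    environment:
      BACKEND_HOST: http://backend:8080  # value from frontend.env replaces the staging default
    ports:
      - 8000:8000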
Note: Remote includes are an experimental feature. You’ll need to set COMPOSE_EXPERIMENTAL_GIT_REMOTE=1 in your environment to use Git references.
With this configuration, developers can now run the full stack while keeping the frontend and backend Compose projects independent and even in different Git repositories.
As developers, we’re used to sharing code library dependencies, and the include keyword brings this same reusability and convenience to your Compose development configurations.
What’s next?
There are still some rough edges. For example, the remote project is cloned to a temporary directory, which makes watch mode impractical for imported services, since the files aren’t available locally for editing. Enabling bigger and more complex software projects to use Compose for flexible, personal environments is something we’re continuing to improve.
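In the meantime, if you happen to have both repositories cloned side by side, one workaround is to point include at the local checkout instead of the Git URL so the frontend sources remain editable; the ../frontend path below is an assumption about your directory layout:

include:
  - path: ../frontend/compose.yaml  # local clone instead of the remote Git reference
    env_file: frontend.env          # same override as before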
If you’re a Docker customer using Compose across microservices or repositories, we’d love to hear how we can better support you. Get in touch!