Docker used to be something I associated with backend developers and DevOps engineers. As a frontend developer, I thought it had nothing to do with me. Then I joined a team that used Docker for everything, including local development environments, CI pipelines, and production deployments. It took a while, but now I consider Docker one of the most useful tools in my toolkit. Here is the stuff I wish someone had explained to me when I started.

Dockerizing a Single-Page Application

The most common use case for frontend developers is containerizing a production build of your SPA. The pattern is straightforward: use a multi-stage build where the first stage compiles your app and the second stage serves it with nginx. Here is the Dockerfile I use for most Angular and Ionic projects:

# Stage 1: Build
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build -- --configuration=production

# Stage 2: Serve
FROM nginx:alpine
# Ionic builds to www; plain Angular builds to dist/<project-name>, so adjust this path
COPY --from=build /app/www /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
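Building and running this image locally looks like the following sketch (the image tag my-spa and the host port 8080 are just placeholders, so substitute your own):

```shell
# Build the production image from the Dockerfile above
docker build -t my-spa .

# Run it, mapping host port 8080 to nginx's port 80 inside the container
docker run --rm -d -p 8080:80 --name my-spa my-spa

# Inspect the final image size to confirm the multi-stage build paid off
docker image ls my-spa
```

The app is then available at http://localhost:8080, served entirely by nginx with no Node.js runtime in the container.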

The multi-stage approach is important because your final image only contains the compiled static files and nginx. It does not include node_modules, source code, or build tools. This keeps the image small, often under 30 MB for an SPA, compared to hundreds of megabytes if you ship the full Node.js build environment.

The Nginx Configuration

If you are deploying a SPA, you need nginx to handle client-side routing. Without this configuration, refreshing the browser on any route other than the root will return a 404. Here is the nginx config I pair with the Dockerfile above:

server {
    listen 80;
    server_name _;
    root /usr/share/nginx/html;
    index index.html;

    # Handle client-side routing
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Cache static assets aggressively
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff2?)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }

    # Don't cache index.html
    location = /index.html {
        expires -1;
        add_header Cache-Control "no-store, no-cache, must-revalidate";
    }

    # Gzip compression
    gzip on;
    gzip_types text/plain text/css application/json application/javascript text/xml;
    gzip_min_length 256;
}

The caching strategy here is important. Your JavaScript and CSS files should have content hashes in their filenames (Angular CLI does this by default), so they can be cached indefinitely. But index.html must never be cached, because it contains the references to those hashed files. If a user loads a cached index.html that references files that no longer exist on the server, the app will break.
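One way to sanity-check the caching strategy is to curl the running container and inspect the response headers. This sketch assumes the container from earlier is listening on port 8080; the hashed bundle filename is illustrative, so use a real filename from your own build output:

```shell
# index.html should come back with the no-store directive from the nginx config
curl -sI http://localhost:8080/index.html | grep -i cache-control

# A hashed bundle should come back with the long-lived public, immutable policy
# (substitute an actual hashed filename from your dist or www folder)
curl -sI http://localhost:8080/main.abc123def456.js | grep -i cache-control
```

Note that nginx's add_header only applies to successful responses, so the second check must point at a file that actually exists in the image.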

Docker for Local Development

Beyond production builds, Docker is incredibly useful for standardizing development environments. Instead of every developer installing different versions of Node.js and dealing with "works on my machine" issues, you can define the entire environment in a docker-compose.yml:

version: '3.8'
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "4200:4200"
    volumes:
      - .:/app
      - /app/node_modules
    environment:
      - API_URL=http://api:3000

  api:
    image: node:20-alpine
    working_dir: /api
    volumes:
      - ../api:/api
    ports:
      - "3000:3000"
    command: npm run dev

  db:
    image: mongo:7
    ports:
      - "27017:27017"
    volumes:
      - mongo_data:/data/db

volumes:
  mongo_data:

With this setup, a new developer joins the team, clones the repo, runs docker-compose up, and has the entire stack running in minutes. No installing MongoDB. No configuring Node versions. No wondering why the API is not connecting. Everything is defined in code and works the same on every machine.
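In practice, onboarding boils down to a handful of commands (the repository URL is a placeholder):

```shell
# Clone the project and enter it
git clone https://example.com/our-app.git
cd our-app

# Build the images and start the entire stack in the background
docker-compose up -d --build

# Tail the frontend's logs to confirm everything came up
docker-compose logs -f app
```

When you are done, docker-compose down stops everything, and docker-compose down -v also removes the named mongo_data volume if you want a clean slate.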

Optimizing Build Times

One thing that frustrated me early on was how slow Docker builds were. Every time I changed a single line of code, Docker would reinstall all of node_modules from scratch. The fix is understanding Docker's layer caching. Each instruction in a Dockerfile creates a layer, and Docker caches layers that have not changed. The key insight is to copy package.json and package-lock.json first, run npm ci, and only then copy the rest of your source code:

# This order is important for caching
COPY package*.json ./
RUN npm ci
# Source code changes won't invalidate the npm ci layer
COPY . .

With this ordering, the npm ci layer is cached unless package.json or package-lock.json changes. Source code changes only invalidate the final COPY layer, making rebuilds much faster.
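You can watch the caching in action by rebuilding after a source-only change. This is a sketch assuming the multi-stage Dockerfile from earlier; the touched file path is illustrative:

```shell
# First build: every layer executes, including the slow npm ci step
docker build -t my-spa .

# Change a source file, then rebuild
touch src/app/app.component.ts
docker build -t my-spa .
# In the second build's output, the COPY package*.json and RUN npm ci
# steps are reported as cached; only COPY . . and the build step re-run.
```

If the npm ci step re-runs even though you only touched source files, check that package.json and package-lock.json are not being modified by tooling, and that your .dockerignore is not excluding them.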

A Note on .dockerignore

Always create a .dockerignore file. Without it, Docker sends your entire project directory (including node_modules, .git, and build artifacts) to the Docker daemon as build context. This can add minutes to your build time for no reason:

node_modules
.git
dist
www
.angular
*.md
.env*

Docker is not something you need to master overnight. Start with a production Dockerfile for your SPA, then gradually adopt it for development environments and CI. Once you get comfortable with the basics, it becomes second nature, and you will wonder how you ever managed without it.