
Docker: The Superpower of Consistent Environments

February 2025 · 12 min read

So, what is Docker? Docker is a container platform that gives us the superpower of running the same environment on every laptop or system. Let me tell you why this matters.

The Problem We All Face

Imagine this: I'm developing an application. I've installed all the dependencies my code needs, specific versions of libraries, everything works perfectly on my local machine. After a year, I want to collaborate with my friend Skanda. I give him my GitHub link, and he clones it.

But here's where things go wrong. Skanda might face problems right from the start — version mismatches, missing dependencies, and he's on Windows while I'm on Linux. We can't just run the code on the first try because there will be issues. Sound familiar?

The Docker Solution

If I had used Docker in my codebase, Skanda would simply pull my Docker image and run it on his machine. The exact same versions, the same code, and all the dependencies that run in my environment would run in his environment, without any errors. That's the beauty of Docker.

Getting Started with Docker

When you install Docker, you can check that it's installed by typing docker in your terminal, which prints the list of available commands. Use docker -v (or docker --version) to check the version, and docker info to confirm the daemon is actually running.
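A quick sanity check might look like this (the exact output depends on your installed version):

```shell
# Print the installed client version
docker --version

# Confirm the daemon is running and show system-wide info.
# If the daemon is down, this fails with
# "Cannot connect to the Docker daemon ..."
docker info
```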

Let's install your first image:

docker run -it ubuntu

What does this command do? We're telling Docker to run the container interactively: -i keeps STDIN open and -t allocates a terminal, combined as -it. First, Docker looks for an image named "ubuntu" on your system. If it finds it, great! If not, it downloads the image from Docker Hub.
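After you exit the container, you can see what Docker downloaded and ran:

```shell
# List the images stored locally (ubuntu should now appear)
docker images

# List currently running containers
docker ps

# List all containers, including stopped ones
docker ps -a
```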

About Docker Hub

hub.docker.com is like GitHub.com. Just like how we add code to GitHub and people can pull it, we can pull public images from Docker Hub. It's that simple.

Once you have the image, run the command again. This time, Docker serves the cached Ubuntu image and creates a new container. The cool thing? Once you're inside the container, you can do literally anything, and it doesn't affect your actual system. That's isolation at its best.
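You can see that isolation for yourself. Files created inside the container never touch the host (a quick sketch; the file path is just an example):

```shell
# Start an Ubuntu container with a shell
docker run -it ubuntu bash

# Inside the container:
#   touch /tmp/only-in-container.txt
#   exit

# Back on the host, the file was never created here
ls /tmp/only-in-container.txt
```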

Docker Networking

Each container gets its own network namespace, so it's isolated from the host's interfaces. Out of the box, though, Docker attaches every container to a default bridge network, which gives containers outbound internet access and lets them reach each other by IP address.

How does it work? When Docker starts, it sets up these configs on your host machine:

- It creates a virtual ethernet bridge named docker0 on your host
- It attaches each container to this bridge through a virtual ethernet pair, so containers on the bridge can talk to each other
- It NATs container traffic through the host's network interface so containers can reach the outside world
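You can inspect this yourself. On a user-defined bridge, Docker also provides DNS, so containers can reach each other by name (the network and container names here are just examples):

```shell
# List the networks Docker created; "bridge" is the default, backed by docker0
docker network ls

# Create a user-defined bridge network
docker network create mynet

# Run two containers on it; they can reach each other by name
docker run -d --name web --network mynet nginx:alpine
docker run --rm --network mynet alpine ping -c 1 web
```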

Docker Volumes: Persisting Your Data

Here's the thing about containers: their filesystem is ephemeral, and when a container is removed, its writable layer and all the data in it are gone. That's where Docker volumes come in.

Docker volumes store data outside the container's filesystem, so it persists even after the container is killed or removed. When you spin up a new container, you can mount the same volume into it, and whatever data is in the volume is immediately available to the new container.
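A minimal sketch of that lifecycle (the volume and file names are just examples):

```shell
# Create a named volume
docker volume create mydata

# Write a file into the volume from one container, which is then removed
docker run --rm -v mydata:/data ubuntu bash -c "echo hello > /data/note.txt"

# A brand-new container mounting the same volume sees the same data
docker run --rm -v mydata:/data ubuntu cat /data/note.txt
```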

Example Dockerfile

Here's a simple Dockerfile I use in my projects:

# Small Node base image
FROM node:20-alpine
WORKDIR /usr/src/app

# Copy manifests and config first, then the monorepo source
COPY package.json package-lock.json turbo.json tsconfig.json ./
COPY apps ./apps
COPY packages ./packages

# Install dependencies, generate the database client, and build
RUN npm install
RUN npm run db:generate
RUN npm run build

# Start the user-facing app
CMD ["npm", "run", "start-user-app"]
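To build and run an image from this Dockerfile (the tag myapp is just an example, and I'm assuming the app listens on port 3000):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t myapp .

# Run it, mapping the container's port 3000 to the host
docker run -p 3000:3000 myapp
```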

Docker Compose: Running Multiple Containers

Docker Compose is a higher-level tool that lets you run your entire stack with one command. Multiple containers share the same network and volumes, which makes it perfect for local development and small multi-service deployments.

version: '3.8'

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/myapp
    depends_on:
      - db
    volumes:
      - ./src:/usr/src/app/src

  db:
    image: postgres:15-alpine
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=myapp
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:

Instead of running multiple docker run commands, you define everything in one file and use docker compose up (or docker-compose up with the older standalone binary). This starts all your services together with networking and volumes configured.
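The day-to-day workflow boils down to a handful of commands:

```shell
# Start every service in the background
docker compose up -d

# Tail the logs of all services
docker compose logs -f

# Stop and remove the containers (add -v to also remove named volumes)
docker compose down
```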

Docker isn't just a tool; it's a way of thinking about how we build and deploy applications. Once you start using it, you'll wonder how you ever lived without it.