Dockerize your Development Environment for NodeJS


Using Docker in your development workflow has a positive impact on your productivity. It eliminates the typical "It works on my machine" type of bugs, and setting up the project on a different machine requires nothing more than a running Docker daemon. Before we get started implementing, we will go over Docker real quick.

What is Docker?

Docker is a platform that can run containers, packages of software. To run these containers Docker uses OS-level virtualization. You can think of a container as a lightweight version of a virtual machine.

All containers you run on your Docker platform are isolated from one another. For example, the host on which Docker runs and a container running on that host do not share the same filesystem unless you explicitly tell them to.

To start a container you need a Docker image. This image is the blueprint for your container. You can use predefined images from Docker Hub or configure your own by writing a so-called Dockerfile.
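To see this in action, you can pull a predefined image from Docker Hub and start a throwaway container from it (the node:14-alpine tag here is just an example; any tag from the node repository works):

$ docker pull node:14-alpine
$ docker run --rm node:14-alpine node --version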

This is just a quick overview of Docker; if you want to dig deeper, I encourage you to start here.

Why would you dockerize your development workflow?

In the introduction, I already touched on one benefit of using Docker in your development environment: it gets rid of the typical "It works on my machine" issue. Some other benefits are:

  • Standardize development workflow between team members even more
  • Reduction of production-only bugs if you use Docker for deployment too (Configurations between production and development can be quite similar)
  • Getting rid of the aforementioned "Works on my machine" type of bugs

Getting started

We start out by creating a new folder in which we place our project, and we create our Dockerfile like this:

$ mkdir node-docker && cd node-docker
$ touch Dockerfile


The container that we will use for our express application will be configured in the Dockerfile. For that, we need to give it some life:

FROM node:latest

WORKDIR /usr/src/app
COPY package*.json ./

ENV PORT=5000

RUN npm cache clean --force && npm install

ENTRYPOINT ["npm", "start"]

FROM tells Docker to pull an image called node (tag: latest) from Docker Hub.

WORKDIR sets the directory in which all the upcoming commands will be executed.

COPY does exactly what it says: it takes package.json and package-lock.json and copies them into the WORKDIR.

ENV sets an environment variable inside the container with the name PORT and the value 5000.

RUN executes the commands we pass in. In this case, clearing the npm cache and then installing all the dependencies from package.json.

ENTRYPOINT defines the command that is executed right when the container starts, in our case npm start.
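Note that the Dockerfile only copies the package files into the image; the source code itself gets mounted in at runtime. Once the application files from the next section exist, you could already build and run the image manually (the image name node-docker is arbitrary, and the -v flags mimic what docker-compose will do for us below):

$ docker build -t node-docker .
$ docker run -p 5000:5000 -v "$(pwd)":/usr/src/app -v /usr/src/app/node_modules node-docker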

Simple Express App

Now that we have our Dockerfile ready to go we need a simple express application that we can run inside a container. For that, we create two new files like this:

$ touch server.js package.json

package.json will get two dependencies: express as a regular dependency and nodemon as a dev dependency:

{
  "name": "node-docker",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "start": "nodemon server.js"
  },
  "author": "Jakob Klamser",
  "license": "MIT",
  "dependencies": {
    "express": "^4.17.1"
  },
  "devDependencies": {
    "nodemon": "^2.0.4"
  }
}

The express application will just return simple HTML when hitting the main page. Therefore server.js should look like this:

const express = require('express');

const app = express();

const PORT = process.env.PORT || 5000;

app.get('/', (req, res) => {
  res.send(`
    <h1>Express + Docker</h1>
    <span>This project runs inside a Docker container</span>
  `);
});

app.listen(PORT, () => {
  console.log(`Listening on port ${PORT}!`);
});


Before we start setting up a MongoDB container together with our express container, we want to exclude some files from the running container. The syntax of a .dockerignore file is exactly the same as that of a .gitignore file:

# Git
.git
.gitignore

# Docker
Dockerfile
docker-compose.yml

# NPM dependencies
node_modules


Last but not least we want to define a docker-compose.yml. This file will contain all the information needed to run the express application and the MongoDB at the same time in two different containers. Let's go ahead and create the file.

$ touch docker-compose.yml

Now we configure it like this:

version: '3'
services:
  api:
    build: .
    ports:
      - "5000:5000"
    depends_on:
      - mongo
    volumes:
      - "./:/usr/src/app"
      - "reserved:/usr/src/app/node_modules"
  mongo:
    image: "mongo"
    ports:
      - "27017:27017"
volumes:
  reserved:

version: First we define the Compose file format version we want to use. There are quite a few differences between versions 3 and 2, so be careful when picking a version!

services: This is the section in which we define our express API (api) and the MongoDB (mongo).

build & image: build tells Docker to build an image out of a Dockerfile. In our case we want it to use the Dockerfile in the current directory; that's why we pass . as a parameter, which refers to the current directory. image tells Docker to pull an already existing image from Docker Hub.

ports & volumes: As the name suggests, we define the ports here. The colon is a mapping operator: we map port 5000 of the container to port 5000 of our host system (in this case our local machine) so that we can access the application outside of the container. The same goes for the port mapping of the MongoDB. volumes works similarly, but with directories instead of ports: we map our local directory, in which we write our code, into the WORKDIR of the container. This way the container reacts immediately when we change anything in the source code.

reserved: This is a named volume that makes sure the local node_modules folder, if it exists, doesn't override the node_modules folder inside the container.

If you run the following command Docker will create an image from our Dockerfile and then run both containers (api and mongo):

$ docker-compose up
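Once both containers are up, you can check that the API responds on the mapped port, for example with curl:

$ curl http://localhost:5000

This should return the HTML snippet from server.js.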

If you want to stop the containers just use this command:

$ docker-compose down
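Keep in mind that the dependencies are baked into the image at build time. If you add or update a dependency in package.json, rebuild the image before starting the containers again:

$ docker-compose up --build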


This is a simple Docker development environment setup that can easily be extended. If you want to change the database or add an Nginx container to serve your frontend, just go ahead and add a new service to docker-compose.yml or change an existing one.

You can also dockerize .NET Core, Java, or Go applications if you want to. Tell me about your experience with Docker in the comment section down below, I'd love to hear it!

The code for this is up on my GitHub as usual.

Photo by Dominik Lückmann on Unsplash

Fawas Kareem:

Thank you for writing this, but I am totally new to Docker; this is my first encounter with it. So after everything worked, where will I find the Docker image, and how do I use it? I am lost on this.

Jakob Klamser:

If you are in the root directory of the project and use the command "docker-compose up", you will start the Docker images.

Edidiong Asikpo:

Amazing article. Thanks for sharing.

Krys Alead:


Thanks for the article. I would nevertheless not use "latest" as a tag, as depending on when you build your image you get a different version of Node. Something I missed: when do you start your server (not the Docker image)? What is the magic that makes your docker-compose up command run the npm start command?

Krys Alead:

Hi Jakob Klamser

I thought your article was about developing with Docker and Node, and sharing the image with other developers requires fixing the version. And indeed, once you are happy with your dev environment you need to fix it :). I was reasoning about a development environment for a large team.

Jakob Klamser:

Hi Krys Alead

You are free to do so whenever you please. If you are sharing your development environment with your whole team, fixing the Node.js version should be prioritized.

Totally agree with you there :)

Bolaji Ayodeji:

Love this, thanks for sharing!