
Top 5 Tips and Tricks on Developing with Docker

We are going into our third year of fully Dockerized development for our web application work. This is especially beneficial for a team that maintains many web apps developed over a decade ago, in different languages, by developers who have long since left our clients' companies.

We have Ruby on Rails, Python, NodeJS, and many different versions of third party native dependencies. Getting up to speed on all these at once was always painful. Docker lets us keep the special snowflakes wrapped up inside the containers. It’s nice.

So here are my top 5 quick tips on how I stay productive with Docker.

5. Fully commit for the best results

If you are going to do 80% of your work in Docker containers, going all in has value. This will let you learn the edge cases and get good with the tool. In the Ruby on Rails world, it ends up replacing the need for rvm or rbenv to a great extent.

4. Embrace shell scripting

There are two places where shell scripting is very valuable:

1. Commands you need to run all the time

When we work on many different projects, it’s really nice to have a similar project structure that one can execute commands that are used regularly without having to remember the project specifics. We generally put these scripts inside the bin/docker folder within the project. Some common examples include:

Script               Purpose
bin/docker/setup     Build the Docker images and set up any data needed to test or poke at locally
bin/docker/bash      Open an interactive bash shell inside the app's Docker container
bin/docker/serve     Fire up the application so it can be tested locally, e.g., at http://localhost:3000
bin/docker/console   Open the app's interactive REPL, e.g., bundle exec rails c or python manage.py shell
bin/docker/migrate   Prepare and/or run database migrations
bin/docker/test      Run the automated test suite

The details of these scripts vary from project to project. For example, in a Django application, bin/docker/console may look like:

#!/usr/bin/env bash
####################################################
# Open Interactive Django Console in Docker Image
#
# Usage:
# bin/docker/console [OPTIONS]

set -e

docker-compose run django python manage.py shell "$@"

whereas for a Rails application it could be:

#!/usr/bin/env bash
####################################################
# Open Interactive Rails Console in Docker Image
#
# Usage:
# bin/docker/console [OPTIONS]

set -e

docker-compose up -d rails 2> /dev/null
docker-compose exec rails bundle exec rails c "$@"

This provides a convenient way to encapsulate the differences between your projects and reduce cognitive load while working on otherwise very different applications.

2. Conditional logic when your Docker image fires up

Shell scripting is very useful as the entrypoint for your application in your Dockerfile because it lets you use configuration (e.g., environment variables) to reuse the same Docker image for more than one job, such as the web server and the background job worker.

A Dockerfile for such a Rails app might look like:


FROM ruby:3.0.0

ENV RUNNING_IN_DOCKER "yes"
ENV PORT 3000
EXPOSE $PORT

# Install native Linux dependencies for the Gems, including spell checking
# and the PostgreSQL database client
RUN apt-get update && apt-get install -y \
    apt-utils \
    aspell \
    libaspell-dev \
    postgresql-client \
    texlive-extra-utils \
    iproute2

# Setup the App Directory for the Rails Application to be Mapped Into
RUN mkdir -p /app
WORKDIR /app

# Copy and Build the Ruby Gems
RUN gem update bundler
COPY Gemfile Gemfile.lock Rakefile ./
RUN gem install bundler && bundle install --jobs 20 --retry 5

COPY . ./

# Start the application based upon its role.
# Configure by setting the SERVER_ROLE environment variable to
# web or worker. Set SERVER_PORT to 3000 or 80
CMD /app/bin/start/run-with-role.sh


Notice that the final command doesn't fire up the server directly; instead it runs the run-with-role.sh script. This becomes a very convenient place to put conditional logic that helps the app deploy and run cleanly.

Here is an example from a real production application. Notice the references to environment variables, which allow the container to be configured without building a separate Docker image for each role. The script conditionally performs the tasks needed.

This way the same image can be used for multiple jobs when deployed to the cloud environment!

#!/usr/bin/env sh
# In your Task definition, set your SERVER_ROLE to be one of
# - web
# - worker
# Then this script will automatically start it when Docker starts

cd /app

if [ "$RUN_MIGRATIONS" = "yes" ]; then
  echo "Running database migrations if any"
  bundle exec rake db:migrate
fi

if [ "$SERVER_ROLE" = "web" ]; then
  #echo "Building server assets in the background"
  #bundle exec rake assets:precompile & 

  echo "Starting in web server mode..."
  bundle exec rails s -b 0.0.0.0 -p $SERVER_PORT
  echo "Done with web server mode, shutting down..."
elif [ "$SERVER_ROLE" = "worker" ]; then
  echo "Starting in delayed job worker mode..."
  bundle exec rake jobs:work
  echo "Done with worker mode, shutting down..."
else
  echo "Unknown server role '$SERVER_ROLE'. Must be 'web' or 'worker'."
  exit 1
fi

echo "Done."

This provides a convenient way to avoid building multiple Docker images for the different roles your application performs, and to keep configuration differences in the Git repository rather than in error-prone copy-and-paste in your cloud provider's configuration web UI.
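The dispatch logic in run-with-role.sh can also be factored into a small function, which makes it easy to sanity-check outside a container. A minimal sketch (command_for_role is my name for it; it echoes the command it would run rather than executing it):

```shell
#!/usr/bin/env sh
# Sketch: the role dispatch as a function that prints the command it
# would run. command_for_role is an illustrative name, not part of the
# script above.
command_for_role() {
  case "$1" in
    web)    echo "bundle exec rails s -b 0.0.0.0 -p ${SERVER_PORT:-3000}" ;;
    worker) echo "bundle exec rake jobs:work" ;;
    *)      echo "Unknown server role '$1'. Must be 'web' or 'worker'." >&2
            return 1 ;;
  esac
}

command_for_role "${SERVER_ROLE:-web}"
```

Because the function only prints the command, you can exercise all three branches in a plain shell before baking it into the image.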

3. Embrace native Linux dependencies if they are the best tool for the job

For a long time, a lot of Ruby on Rails development work was done on Apple MacBooks running macOS (Darwin), a Unix variant, while the applications were deployed to Linux servers that needed different native libraries or utilities. The systems were similar, but different. With Docker, you can use production dependencies in your development environment! It also means that your local tests passing correlates much more strongly with the application passing in CI and being ready for deployment.

2. Use the available tool images!

It’s nice to have ready-made containers so you don’t have to reinvent the wheel or commingle extra functionality into your own images!

Mailcatcher

Testing e-mail in development and ensuring that no mail goes out for real is important. In Ruby on Rails, the mailcatcher Gem is very useful, and with Docker you can run it standalone! In fact, we use it with our Python work as well!

In the docker-compose.yml:

mailcatcher:
    image: chatwork/mailcatcher
    ports:
      - 1025:1025
      - 1080:1080
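Your application still needs to be pointed at the mailcatcher service. One way is to pass SMTP settings through the environment (the variable names here are illustrative, not mandated by mailcatcher) and read them in your mailer configuration:

```shell
# Sketch: SMTP settings your app service could receive via docker-compose
# environment entries. "mailcatcher" is the compose service name, which
# doubles as the hostname on the Docker network; 1025 is mailcatcher's
# SMTP listener. SMTP_HOST/SMTP_PORT are illustrative names.
export SMTP_HOST=mailcatcher
export SMTP_PORT=1025

# Caught mail is then viewable in the browser:
echo "View captured mail at http://localhost:1080"
```

The nice part is that nothing framework-specific lives in the compose file; Rails and Django apps can both read the same two variables.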

Microsoft SQL Server

We have a legacy Ruby on Rails application that communicates with Microsoft SQL Server in production, and we now use Docker to test locally. This is set up in two steps:

  1. The official Microsoft SQL Server for Linux Docker image
  2. A custom Docker image that initializes the database with fixture data so that our Rails app is not responsible for it at all

In the docker-compose.yml:

microsoft-sql-database:
    image: "mcr.microsoft.com/mssql/server"
    environment:
      SA_PASSWORD: 'XXXX_CHANGE_THIS_PASSWORD_XXXX'
      MSSQL_SA_PASSWORD: 'XXXX_CHANGE_THIS_PASSWORD_XXXX'
      ACCEPT_EULA: 'Y'
    ports:
      - 1433 # Only open port inside Docker network, not localhost

microsoft-sql-server-setup:
    build: ./db/mssql/    # Custom folder with a Dockerfile that adds fixture data
    environment:
      SA_PASSWORD: 'XXXX_CHANGE_THIS_PASSWORD_XXXX'
      MSSQL_SA_PASSWORD: 'XXXX_CHANGE_THIS_PASSWORD_XXXX'
      ACCEPT_EULA: 'Y'
    command: /app/setup
    depends_on:
      - microsoft-sql-database
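One wrinkle: depends_on only orders startup, so the setup container must cope with SQL Server not yet accepting connections. A generic retry helper can gate the fixture load; this is a sketch (wait_for is my name for it, and the sqlcmd probe shown in the comments is one way to check readiness, not taken from our actual setup script):

```shell
#!/usr/bin/env sh
# Sketch: retry a probe command until it succeeds or we give up.
# Usage: wait_for <retries> <delay-seconds> <command...>
wait_for() {
  tries=$1; delay=$2; shift 2
  i=1
  while [ "$i" -le "$tries" ]; do
    if "$@" > /dev/null 2>&1; then
      return 0
    fi
    sleep "$delay"
    i=$((i + 1))
  done
  return 1
}

# In /app/setup this could gate the fixture load, e.g.:
#   wait_for 30 2 sqlcmd -S microsoft-sql-database -U sa \
#       -P "$SA_PASSWORD" -Q "SELECT 1"
#   sqlcmd -S microsoft-sql-database -U sa -P "$SA_PASSWORD" -i fixtures.sql
wait_for 3 0 true && echo "database is ready"
```

Because the probe command is a parameter, the same helper works for any service your setup containers depend on.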

1. Run on a Linux Host if You Have One

I personally do 99% of my development work on a Linux workstation or a Dell XPS 13 laptop. Both run Linux natively and are super fast when working with Docker. Don’t get me wrong, I also have an M1 MacBook Pro, and the tools to enable this work on that system are getting better. I tend to use the MacBook Pro for Mac-like things now, such as family photo maintenance and light video production, rather than raw development work.

Conclusion

Docker has proven to be a very useful tool for our company. While far from exhaustive, I hope you find these top five tips useful and can apply them to your own work.