Friendly Introduction to Docker
#Docker, #DevOps

Docker is an open-source tool that helps developers create, manage, and deploy containers. These containers can be run locally or easily deployed to the cloud with great support available in AWS, Azure, and DigitalOcean. In addition to the open-source tools, Docker is also a company that provides hosted services for running and managing containers.

Before discussing why Docker is so useful, it’s worth taking a step back to make sure we understand the concept of a container. Containers are essentially a way to isolate processes so that everything running inside a container has its own view of various resources (e.g., the process list, memory, filesystems). That means that if a process running in one container writes a file, a process running in another container won’t see those changes.
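As a quick illustration of that filesystem isolation (assuming Docker is installed and the small alpine image is available — the container name "writer" here is just for demonstration):

```shell
# Write a file inside one container.
docker run --name writer alpine sh -c 'echo hello > /data.txt && cat /data.txt'

# A second container started from the same image has its own filesystem,
# so the file written above does not exist there and ls reports an error.
docker run --rm alpine ls /data.txt

# Clean up the first container.
docker rm writer
```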


So how is this different from a virtual machine (VM)? While both containers and VMs isolate resources, VMs accomplish this by emulating the entire hardware layer. While hardware virtualization is relatively efficient these days, you still get the additional overhead of running a separate operating system environment. Containers, on the other hand, rely on an existing OS kernel for isolation.

Although the technical differences between VMs and containers may not seem like a big deal, containers have many real-world benefits. For example,

  • VMs take tens of seconds to start, while a container will often take less than a second.
  • Containers are much more efficient in terms of resource utilization.
  • Containers are usually single-purpose, making it easier to secure and deploy your application.

Prior to Docker, working with containers was very cumbersome and involved using a patchwork of tools (cgroups, LXC, etc.). Docker greatly simplified the process of creating, running, and managing containers by combining these utilities behind an easy-to-use API and command-line interface.

Use cases

While resource isolation isn’t particularly new, the ability to quickly deploy low-overhead containers has opened possibilities for some interesting use cases.

Synchronized environments
Once you create a Docker container of your working environment, the same container can be pushed to a production environment. Since the container contains all the dependencies needed for running, you can be sure that if the image runs on your machine, then it will successfully run on the production machine.
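A minimal sketch of that dev-to-production workflow might look like the following (the image name, tag, and registry address are all hypothetical placeholders):

```shell
# Build the image from your project's Dockerfile and tag it.
docker build -t my-app:1.0 .

# Re-tag it for your registry and push it (registry.example.com is a placeholder).
docker tag my-app:1.0 registry.example.com/my-app:1.0
docker push registry.example.com/my-app:1.0

# On the production machine, pull and run the exact same image.
docker pull registry.example.com/my-app:1.0
docker run -d registry.example.com/my-app:1.0
```

Because the image bundles the application and its dependencies, the container that runs in production is byte-for-byte the one you tested locally.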

Efficient consolidation
Containers are very lightweight compared to full-blown virtual machines, which means it’s possible to run many more containers on a single machine and run more services at lower cost. You can even run your very own “platform-as-a-service” like Heroku, and there are already some great tools to help with this.

Cloud independence
Once you’ve “dockerized” your application, you can choose between one of many public cloud providers to host your application. Amazon Web Services, Google, and DigitalOcean all support Docker. If you don’t like your current cloud provider, you can just switch. The beauty is that your application will run the same everywhere!

Installing Docker

If you’re running OS X or Windows, the easiest way to install Docker is with the official Docker Toolbox, which bundles all the tools and clients needed to get started. Download the Docker Toolbox and run the installer.
Be aware that you’ll need administrative privileges to install Docker.

If you’re using a popular Linux distro (e.g., Ubuntu, Red Hat/CentOS), Docker provides a script that will do most of the heavy lifting. Just type the following into your terminal. You’ll need root privileges, so you may want to prepend sudo to the command.

$ wget -qO- https://get.docker.com/ | sh
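Once the script finishes, a quick sanity check (a common Docker convention, not specific to this tutorial) is to run the tiny hello-world image:

```shell
# Pulls and runs a minimal test image; prints a greeting if the
# Docker daemon is installed and reachable.
docker run --rm hello-world
```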

Running Docker
Once Docker is installed, you’ll want to verify that everything is working. The way to do this will depend on whether you’re running on Windows, OS X, or Linux. Let’s say for the moment that you’re on OS X. First find and start Kitematic. The application should have been installed along with the Docker Toolbox.
Kitematic is an awesome graphical interface to browse, download, and run pre-built images on Docker Hub.

A common use case is to run a database in a Docker container. Let’s look for the MongoDB image and install that. First, search for “mongodb” in the Kitematic search bar. Then click the install button. This will automatically download and start the image.

Once the image has started, you’ll see it on the left-hand pane. If you click on that, it should give you information about how to connect to that instance under the Ports section.

If you’re not using Kitematic, you can still download and run the MongoDB image manually. Just type the following into your terminal:

$ docker pull mongo
$ docker run --name my-mongo-instance -P -d mongo
$ docker ps
CONTAINER ID    IMAGE          COMMAND                  CREATED         STATUS         PORTS                      NAMES
1b978f09287d    mongo:latest   "/entrypoint.sh mong…"   2 seconds ago   Up 1 second    0.0.0.0:32768->27017/tcp   my-mongo-instance

The connection information is located under the PORTS column. In this case, you should be able to connect to the MongoDB instance via localhost:32768.
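If you prefer the command line over a GUI client, you can also connect from inside the container itself, since the mongo image bundles the mongo shell client (a sketch, assuming the container name from above):

```shell
# Open an interactive mongo shell inside the running container.
docker exec -it my-mongo-instance mongo

# Or run a one-off command, e.g. report basic database statistics.
docker exec my-mongo-instance mongo --eval 'db.stats()'
```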

Now that you have MongoDB running, let’s connect to it to test everything out. For this I’ll be using Robomongo, an open-source MongoDB graphical user interface that’s available for Windows, OS X, and Linux. Using Robomongo, connect to a new database using the connection information from Kitematic and/or Docker.
There’s nothing in this MongoDB collection yet, so there’s not much to see. At this point you could insert some data into the collection or write an application that uses this MongoDB instance. Since we’re not doing that in this tutorial, let’s just shut down the instance. In Kitematic, click on the MongoDB instance and then press Stop.
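If you started the container from the command line instead, the equivalent is:

```shell
# Stop the running container; its data and configuration are kept.
docker stop my-mongo-instance

# Optionally remove the container entirely once you're done with it.
docker rm my-mongo-instance
```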

Conclusion & next steps

You should feel confident enough to start downloading and installing additional Docker images. Some popular ones include Redis, WordPress, and Node.js. The real fun, though, starts once you begin creating your own Docker images.

In Part 2 of this series, we’ll explore how to create your own Docker images using Dockerfiles. We’ll also cover how to upload and share those images on Docker Hub.