As the DevOps movement continues to evolve, it is becoming increasingly important for admins and engineers to work with a much broader set of tools than ever before. More importantly, admins and engineers increasingly need to understand diverse codebases and work alongside developers. Luckily, Docker makes much of this easier, and this article will break down how to set up Docker dev environments.

If you’re not familiar with Docker, it is a platform for running processes in isolated, contained environments. Instead of dedicating an entire VM to your code, Docker takes that abstraction one step further and isolates individual processes, which allows developers (and admins) to cleanly separate applications and code in a reproducible and convenient way. I’ll give a few examples later to help internalize the idea.

Getting Started

If you work on a Windows or OS X machine, there is currently no Docker support in either kernel, so a wrapper of sorts needs to be used, which is where docker-machine comes into play. This tool lets you install a minimal Linux VM with Docker and some other tools baked in to interact with the host system, including the Docker client CLI that you use to talk to the Docker VM. It’s not a perfect solution but, for the most part, it’s good enough. One issue, which I have written a little bit about before, is shared folders when using VirtualBox as the driver for the Docker VM.

I should note that Docker recently (as of March 24) announced a new beta program that brings more “native” support to both OS X and Windows; you can check out the announcement here. Essentially, the folks at Docker are streamlining their tools to work with each platform’s native hypervisor (Hyper-V on Windows and xhyve on OS X) and shipping native apps for both. This step forward should make the Docker experience across platforms even easier, so definitely keep an eye on developments.

Aside from the invitation-only beta program, the best and easiest way to get all of the necessary Docker tools right now is the Docker Toolbox, which provides a packaged installation of everything needed on OS X and Windows.

An alternate way to install the toolbox via CLI on OS X is to use brew.
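For example, something along these lines with Homebrew Cask (the cask name has changed over time, so check brew search if it doesn’t resolve):

    # Install the Docker Toolbox cask (cask name may vary by Homebrew version)
    brew cask install dockertoolbox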

Once the toolbox has been installed, the docker-machine VM will need to be created. That step is pretty easy and creates the most basic VM to get started working with Docker. Please take a look at the docker-machine documentation, because there are many more options for advanced use cases.
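For example, creating a basic VM named “default” with the VirtualBox driver looks like this (the machine name is arbitrary):

    # Create a docker-machine VM called "default" backed by VirtualBox
    docker-machine create --driver virtualbox default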

Once the docker-machine VM has been created, you should be able to take a look at it with the ls subcommand.
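Listing the machines is a one-liner:

    # List all docker-machine VMs along with their state, driver and Docker URL
    docker-machine ls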

You might have noticed that after the VM has finished bootstrapping, there is a message about configuring your shell. Just run the following to set up your environment variables. You will need to either run this command for every terminal you use that interacts with the VM or place the command in your bash profile (OS X).
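Assuming the VM is named “default”, the command looks like this:

    # Point the local Docker client at the docker-machine VM
    eval "$(docker-machine env default)"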

To test that things are working you can run a simple test.
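The classic smoke test is the hello-world image:

    # Pulls a tiny test image and runs it; prints a confirmation message on success
    docker run hello-world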

If this command works then your environment is configured. There are many more options and commands that allow you to interact with the docker-machine VM but that is out of scope for this post. Run “docker-machine help” for a full list of commands and options.

Working with a SQL Dump

Let’s take a look at another example. Say you need to look at a SQL dump but don’t want to screw around with installing MySQL on your machine or in a VM. Just pull and run the officially maintained image from Docker Hub (or whatever registry you want) and mount in the data. Note that official images on Docker Hub are maintained by people affiliated with both Docker and the upstream project, and are denoted by the “official” marker.

Also note that the official images have very good examples of how to do specific tasks so if you are having trouble getting your command to run, chances are good that there will be some example usage in the docs on Docker Hub.

Look up the MySQL version tag you would like to use, then drop into a shell with that version.
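As a rough sketch (the container name, password, tag and dump filename here are all placeholders), you can run a throwaway MySQL server that imports your dump on startup and then inspect it with the bundled client:

    # Start a disposable MySQL server; files mounted into
    # /docker-entrypoint-initdb.d are imported the first time the container starts
    docker run -d --name mysql-scratch \
      -e MYSQL_ROOT_PASSWORD=secret \
      -v "$(pwd)/dump.sql":/docker-entrypoint-initdb.d/dump.sql \
      mysql:5.7

    # Open a mysql client inside the running container to poke at the data
    docker exec -it mysql-scratch mysql -uroot -psecret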

One of the big advantages of using Docker is that you don’t need to install anything extra on your host OS and when you’re done with your container you can simply pitch it. If you need another MySQL container just spin up a new one. Docker images are immutable, so you can spin up containers and have confidence that they will always be identical when they get started fresh.
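Continuing the hypothetical example above, pitching the container and starting a fresh one is as simple as:

    # Throw the container (and its anonymous volumes) away...
    docker rm -f -v mysql-scratch

    # ...and spin up an identical, fresh one from the same immutable image
    docker run -d --name mysql-scratch -e MYSQL_ROOT_PASSWORD=secret mysql:5.7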

Docker Compose

Finally, I’d like to cover Docker Compose briefly. Docker Compose allows users to essentially stitch together containers to create fully functioning environments and application stacks. It also turns out that Docker Compose is a perfect tool for development.

The first step is to create a Dockerfile that sets up the environment and copies your code into the Docker image.
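A minimal sketch of what such a Dockerfile might look like, assuming the official logstash base image (the tag, paths and config filename are placeholders; the real file is in the repo linked below):

    # Start from the official Logstash image so the runtime is already in place
    FROM logstash:2.3

    # Bake the pipeline configuration into the image
    COPY logstash.conf /config/logstash.conf

    # Run Logstash against the bundled configuration
    CMD ["logstash", "-f", "/config/logstash.conf"]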

This will set up and install Logstash. Other Dockerfiles are omitted here, so be sure to check the link to the repo below for more specifics. Also check out the documentation for Dockerfiles if any of the implementation details aren’t clear.

The docker-compose.yml file for the stack might look similar to the following.
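As a rough, illustrative sketch only (an ELK-style stack is assumed here; the service names, image tags and ports are placeholders, and this uses the version 1 compose syntax that was current at the time):

    elasticsearch:
      image: elasticsearch:2.3
      ports:
        - "9200:9200"

    logstash:
      build: ./logstash        # builds the Dockerfile sketched above
      links:
        - elasticsearch
      ports:
        - "5000:5000"

    kibana:
      image: kibana:4.5
      links:
        - elasticsearch
      ports:
        - "5601:5601"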

This example obviously won’t work as-is because the additional files and code aren’t included; check here for the full example. It should give readers a good enough idea of how to get started in their own environments.

Please take a look at some other example Docker Compose files and the documentation provided by Docker for more complete details.

Concluding Thoughts on Docker Dev Environments

The Docker ecosystem continues to grow and evolve at a tremendous pace, and your Docker skill set should too. Learning Docker should be treated as an ongoing process: it is fairly easy to get the basics down, which is a good start, but it is just as important to follow the newest developments and keep your skills from going stale.

Docker is fast becoming an essential tool for admins and engineers as the DevOps movement continues to progress. Being able to quickly spin up Docker dev environments and work effectively across different codebases will only become a more important skill to have.

The Docker rabbit hole is a deep one and it is very easy to get immersed. I definitely recommend taking some time to play around with Docker on your own to see how it works and what is possible; there is a lot to explore.

About Josh Reichardt

Josh Reichardt is a DevOps Engineer with about.me and the owner of Practical System Administration, where he writes about scripting, devops, virtualization, hardware and policies.
