Influence of Docker on Development Processes

Phil Feb 27, 2018
The complexity of (web) applications has increased hugely in recent years. While many application environments used to consist of a web server and a database, today many other services are often required:

  • a JavaScript runtime for compiling JavaScript and CSS
  • a key-value store (e.g. Redis)
  • a library for processing images (e.g. ImageMagick)
  • a message queue (e.g. RabbitMQ)
  • additional, in-house developed (micro)services

In many cases the services themselves may have dependencies. In order to develop efficiently, all these services must be running on the software developer's computer.

Many development teams quickly recognized the need for an automated setup of the development environment, and many have implemented one. Virtualization solutions such as Vagrant made this possible: it is relatively easy to build a standardized virtual machine with all dependencies present, which developers can use to set up a working environment within a reasonable amount of time.
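As an illustration of this approach, a Vagrant setup along these lines provisions a standardized Linux VM automatically. This is a hypothetical sketch, not our actual configuration; the box name and the package list are assumptions:

```ruby
# Hypothetical Vagrantfile: a standardized Linux VM with common
# service dependencies installed by a provisioning script.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"      # illustrative base box
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y redis-server rabbitmq-server imagemagick
  SHELL
end
```

A single "vagrant up" then builds the same environment on every developer's machine.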

The Trouble With Dependencies

An example: For the management of our KVM-based virtualization environment, we have written our own Ruby on Rails application. If a developer wants to implement a new feature or fix a bug, up to 15 additional services and applications are required. In the past, these services all had to be run locally. One problem is immediately apparent: macOS doesn't support KVM. That's why we used Vagrant to build a virtual Linux machine that ran some of these services. Once set up, it worked very well, and the environment could be restored automatically.

That was the theory. Since some dependencies have changed in the meantime, in practice it was rarely possible to set up this environment successfully at the first attempt. During what was often many hours of debugging, the environment had to be tweaked until it ran. An added factor was that a large part of the application was still running on the developer's computer and, even there, changes crept in from one version of the operating system to the next, which could instantly stop the application from working.

Enter Docker!

About 18 months ago, we started putting the first applications into Docker containers and using them in our daily workflow. Since then, our life as developers has been dramatically simplified. All of an application's dependencies are now described in its "Dockerfile". All the services required (database, Redis, other microservices) are defined and configured in a "docker-compose.yml" file. Now all that is needed is a single command ("docker-compose up") to start the entire environment and immediately get to work on development. Other than Docker itself, no other dependencies need be installed on the developer's computer.
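To make this concrete, a "docker-compose.yml" for a Rails application with its supporting services might look something like this. It is a minimal sketch, not our actual file; the service names, images, and ports are illustrative:

```yaml
# Hypothetical docker-compose.yml: the application plus its services.
version: "3"
services:
  app:
    build: .             # built from the application's Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db
      - redis
  db:
    image: postgres:10
    environment:
      POSTGRES_PASSWORD: development
  redis:
    image: redis:4
```

With this file in place, "docker-compose up" starts the application together with its database and key-value store in one step.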

The same configurations can also be used on our CI server. Here too, no other services need be installed, except Docker. "It just works." 

The Path to Productive Work

The experience we have gained in setting up and maintaining this structure is now helping us move our applications from a classic Linux server to our container platform. Most of the preliminary work has already been done to ensure an application can run properly in a Docker container. By simplifying our development process, we have also created a solid foundation for improving our deployment and operational processes. We are now working on further refining these processes.

We recently started using GitLab and its pipelines. These allow us to completely automate the process from checking in source code to deployment. GitLab runs our unit and integration tests, builds the Docker image, and makes it available in the registry. With one click, the finished image is deployed to the production environment.
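A pipeline of this kind can be sketched in a ".gitlab-ci.yml" roughly as follows. This is an assumption about the shape of such a pipeline, not our actual configuration; the "deploy.sh" script is hypothetical, while "$CI_REGISTRY_IMAGE" and "$CI_COMMIT_SHA" are variables GitLab predefines:

```yaml
# Hypothetical .gitlab-ci.yml: test, build and push the image,
# then deploy on demand.
stages:
  - test
  - build
  - deploy

test:
  stage: test
  script:
    - docker-compose run --rm app rake test

build:
  stage: build
  script:
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA

deploy:
  stage: deploy
  when: manual           # the "one click" deployment step
  script:
    - ./deploy.sh $CI_COMMIT_SHA
```

The "when: manual" setting is what turns the final step into a one-click deployment in the GitLab UI.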

Dynamic Test Environments

Another great option we gain from using containers is dynamic staging environments. As part of a prototype, we created an application that listens to GitHub (or GitLab) webhooks and then creates a complete environment for the specific branch. Specifically, a previously specified OpenShift template is used. The URL for the new environment is then posted directly as a comment to the pull request. After the merge, the complete environment is removed again. The prototype application "Actuator" is open source and is available here.
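As a rough sketch of what such a tool does under the hood: derive an environment name from the branch in the pull-request payload, then instantiate (and later delete) an OpenShift project from a template via the "oc" CLI. The function names, the template file, and the "BRANCH" parameter are assumptions for illustration, not the Actuator's actual code:

```python
# Hypothetical sketch of a webhook-driven staging tool: map a branch
# name to OpenShift CLI commands that create and tear down a
# per-branch environment from a template.
import re

def env_name(branch):
    """Turn a branch name into a DNS-safe environment name."""
    return re.sub(r"[^a-z0-9-]", "-", branch.lower()).strip("-")

def create_commands(branch, template="staging-template.yml"):
    """Commands to instantiate a per-branch environment from a template."""
    name = env_name(branch)
    return [
        f"oc new-project {name}",
        f"oc process -f {template} -p BRANCH={branch} | oc apply -f -",
    ]

def teardown_command(branch):
    """Command to remove the environment again after the merge."""
    return f"oc delete project {env_name(branch)}"

print(create_commands("Feature/New_Login")[0])  # oc new-project feature-new-login
```

The webhook handler would run the create commands on a pull-request "opened" event and the teardown command on "merged", posting the resulting URL back as a comment.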

Conclusion

Containers completely transform familiar processes from development through to operation. But you should not underestimate the fact that quite a bit of new expertise is required. For us, however, the investment was definitely worth it. It has made many things easier and has got rid of some of the tedious manual work.

Would you like to use Docker for your application?

At the TechTalkThursday in March 2018, David spoke about «Docker for Developers» and gave an overview of how developers can use Docker and container technology. The video of the presentation, including transcripts, is available at the following link.

Watch the video presentation now