Beginning DevOps with Docker

By: Joseph Muli

Overview of this book

Making sure that your application runs across different systems as intended is quickly becoming a standard development requirement. With Docker, you can ensure that what you build will behave the way you expect it to, regardless of where it's deployed. By guiding you through Docker from start to finish (from installation, to the Docker Registry, all the way through to working with Docker Swarms), we’ll equip you with the skills you need to migrate your workflow to Docker with complete confidence.

How Docker Improves a DevOps Workflow


DevOps is a mindset, a culture, and a way of thinking. The ultimate goal is to continually improve and automate processes as much as possible. In layman's terms, DevOps asks you to take the laziest point of view and make most, if not all, processes as automatic as possible.

Docker is an open source containerization platform that improves the shipping process in the development life cycle. Note that it is not a replacement for existing platforms, nor does the organization behind it want it to be.

Docker abstracts away the complexity of configuration management tools such as Puppet; with this kind of setup, provisioning shell scripts become unnecessary. Docker can also be used on small or large deployments, from a hello world application to a full-fledged production server.

Whether you are a beginner or an expert developer, you may have used Docker without even realizing it. If you have set up a continuous integration pipeline to run your tests online, most CI servers use Docker to build and run those tests.
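As an illustration, here is a minimal sketch of how a CI server might run a project's test suite inside a container; the image name myapp-tests and the pytest command are hypothetical and assume a Python project with a Dockerfile at its root:

    # Build an image from the project's Dockerfile, then run the test suite
    # in a disposable container (--rm removes the container when it exits).
    docker build -t myapp-tests .
    docker run --rm myapp-tests pytest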

Docker has gained a lot of support in the tech community because of its agility, and as a result many organizations run containers for their services. Such organizations include the following:

  • Continuous integration and continuous delivery platforms such as Circle CI, Travis CI, and Codeship

  • Cloud platforms such as Amazon Web Services (AWS) and Google Cloud Platform (GCP), which allow developers to run applications in containers

  • Cisco and the Alibaba Group, which also run some of their services in containers

Docker's place in the DevOps workflow involves, but is not limited to, the following:

Note

Examples of Docker's use cases in a development workflow.

Unifying requirements refers to using a single configuration file. Docker abstracts and limits the requirements to a single Dockerfile.

Abstraction of the OS means one doesn't need to worry about building the operating system, because prebuilt base images already exist.

Velocity refers to the speed gained by defining a Dockerfile and building containers to test in, or by using an already-built image without writing a Dockerfile at all. Docker also lets development teams avoid investing in steep learning curves or in shell scripts written because "automation tool X" is too complicated.
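To make these points concrete, here is a minimal Dockerfile sketch; the base image, working directory, and entry point are assumptions for a simple Python application, not the book's own example:

    # Start from a prebuilt OS/runtime image instead of building an OS ourselves.
    FROM python:3.12-slim

    # Copy the project code base into the image and declare its requirements
    # in one place.
    WORKDIR /app
    COPY . .
    RUN pip install -r requirements.txt

    # Define how the application runs.
    CMD ["python", "app.py"]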

Recap of the Docker Environment

We walked through the fundamentals of containerization earlier. Allow me to emphasize the alternative workflow that Docker brings to us.

Normally, we have two pieces to a working application: the project code base and the provisioning script. The code base is the application code itself; it is managed by version control and hosted on GitHub, among other platforms.

The provisioning script could be a simple shell script to be run in a host machine, which could be anywhere from a Windows workstation to a fully dedicated server in the cloud.
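As a hypothetical illustration, such a provisioning script might look like the following sketch, assuming a Debian-based host and a Python application:

    #!/bin/sh
    # Install system dependencies on the host, then install the application's
    # requirements and start it.
    apt-get update && apt-get install -y python3 python3-pip
    pip3 install -r requirements.txt
    python3 app.py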

Using Docker does not interfere with the project code base, but innovates on the provisioning aspect, improving the workflow and delivery velocity. This is a sample setup of how Docker implements this:

The Dockerfile takes the place of the provisioning script. The two combined (project code and Dockerfile) make a Docker image. A Docker image can be run as an application. This running application sourced from a Docker image is called a Docker container.
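In command form, that workflow might look like this sketch; the image name myapp is a placeholder:

    # Build a Docker image from the project code and its Dockerfile.
    docker build -t myapp .

    # Run the image; the running instance is the Docker container.
    docker run myapp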

The Docker container allows us to run the application in a completely new environment on our computers, which is completely disposable. What does this mean?

It means that we are able to declare and run Linux or any other operating system on our computers and then, run our application in it. This also emphasizes that we can build and run the container as many times as we want without interfering with our computer's configuration.
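For example, the following sketch starts a disposable Ubuntu container and opens a shell inside it, regardless of the host operating system; ubuntu:22.04 is just one of many prebuilt images you could choose:

    # Start an interactive Ubuntu container; --rm removes it when the shell exits,
    # leaving the host machine's configuration untouched.
    docker run --rm -it ubuntu:22.04 bash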

With this, I have brought to your attention four key words: image, container, build, and run. We will get to the nitty-gritty of the Docker CLI next.