Docker for Serverless Applications

By: Chanwit Kaewkasi

Overview of this book

Serverless applications have gained a lot of popularity among developers and are among the current buzzwords in the tech market. Docker and serverless are two terms that go hand in hand. This book starts by explaining serverless and Function-as-a-Service (FaaS) concepts and why they matter. It then introduces containerization and shows how Docker fits into the serverless ideology. It explores the architectures and components of three major Docker-based FaaS platforms, how to deploy them, and how to use their CLIs. The book then discusses how to set up and operate a production-grade Docker cluster. We cover the concepts of FaaS frameworks with practical use cases, followed by deploying and orchestrating these serverless systems using Docker. Finally, the last chapter explores advanced topics and prototypes for FaaS architectures. By the end of this book, you will be in a position to build and deploy your own FaaS platform using Docker.

Chapter 1. Serverless and Docker

When talking about containers, most of us already know how to pack an application into a container as a deployment unit. Docker allows us to deploy applications in its de facto standard format virtually everywhere: our laptops, a QA cluster, a customer site, or a public cloud, as shown in the following diagram:

Figure 1.1: Deploying a Docker container to various infrastructures and platforms
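As a concrete sketch of such a deployment unit, a short Dockerfile is often all it takes to pack an application into an image. The application file `app.py`, its `requirements.txt`, and the listening port below are hypothetical placeholders, not taken from the book:

```Dockerfile
# Minimal sketch: pack a small Python web app into an image.
# app.py and requirements.txt are hypothetical placeholders.
FROM python:3-alpine

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .

EXPOSE 8080
CMD ["python", "app.py"]
```

The image built from this file (for example, with `docker build -t myapp .` and then `docker run -p 8080:8080 myapp`) can run unchanged on a laptop, a QA cluster, a customer site, or a public cloud.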

Running Docker containers on public clouds is considered normal these days. It brings benefits such as starting cloud instances on demand with pay-as-you-go billing. Freed from waiting for hardware purchases, teams can also move faster, using Agile methods and continuous delivery pipelines to optimize their resources.

According to a Docker report, the total cost of ownership (TCO) of one of its customers was cut by 66% when using Docker to migrate existing applications to the cloud. Not only can the TCO be dramatically reduced; companies using Docker can also accelerate their time to market from months to days. This is a huge win.

Deploying containers to cloud infrastructures, such as AWS, Google Cloud, or Microsoft Azure, already makes things simpler. Cloud infrastructures eliminate the need for organizations to buy their own hardware and to have a dedicated team for maintaining it.

However, organizations still require roles, such as that of the architect, to take care of site reliability and scalability even when they use public cloud infrastructure. Some of these people are known as SREs, or site reliability engineers.

In addition, organizations also need to take care of system-level packages and dependencies. They need to perform security patching for applications and the OS kernel on their own, because the software stack is constantly changing. In many scenarios, teams must scale their clusters up to serve unexpected peaks in load. Engineers also need to do their best to scale the clusters back down when possible, to reduce costs under the cloud's pay-as-you-go model.

Developers and engineering teams always work hard to deliver a great user experience and site availability. While doing so, overprovisioning on-demand instances, or underutilizing them, can be costly. According to an AWS white paper, as much as 85% of provisioned machines are underutilized.

Serverless computing platforms, such as AWS Lambda, Google Cloud Functions, Azure Functions, and IBM Cloud Functions, are designed to address these overprovisioning and underutilization problems.

The following topics will be covered in this chapter:

  • Serverless
  • The common architecture of a serverless FaaS
  • Serverless/FaaS use cases
  • Hello world, the FaaS/Docker way