Building CI/CD Systems Using Tekton

By: Joel Lord

Overview of this book

Tekton is a powerful yet flexible Kubernetes-native open source framework for creating continuous integration and continuous delivery (CI/CD) systems. It enables you to build, test, and deploy across multiple cloud providers or on-premise systems. Building CI/CD Systems Using Tekton covers everything you need to know to start building your pipeline and automating application delivery in a cloud-native environment. Using a hands-on approach, you will learn about the basic building blocks, such as tasks, pipelines, and workspaces, which you can use to compose your CI/CD pipelines. As you progress, you will understand how to use these Tekton objects in conjunction with Tekton Triggers to automate the delivery of your application in a Kubernetes cluster. By the end of this book, you will have learned how to compose Tekton Pipelines and use them with Tekton Triggers to build powerful CI/CD systems.
Table of Contents (20 chapters)

  • Section 1: Introduction to CI/CD
  • Section 2: Tekton Building Blocks
  • Section 3: Tekton Triggers
  • Section 4: Putting It All Together

Understanding the impacts of Agile development practices

At the same time as I was making all those round trips to my customer, a group of software practitioners met at a conference. These thinkers came out of this event with the foundation of what became the "Agile Alliance." You can find out more about the Agile Alliance and the manifesto they wrote at http://agilemanifesto.org.

The Agile Manifesto, which states the core values behind the methodology of the same name, can be summarized as follows:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

These values revolutionized software engineering. Agile was a significant departure from the Waterfall model, and it is now the approach used for most modern software development projects.

Had agile methodologies been used when I originally wrote my first piece of software, there are many things I would have done differently.

First, I would have worked much more closely with my customer. Right from my first release, it was apparent that we had a disconnect in the project's vision. Some of the features he needed were not implemented in a way that made sense for his day-to-day usage. Even though our team provided him with many documents and charts to explain what I was about to implement, it would probably have been easier to simply discuss how he planned to use the software. Picking up the phone or firing off an email to ask a question will always yield a better solution than blindly following a requirements document. Nowadays, there is tooling that makes it easier to collaborate closely and get better feedback.

One part of the software that I delivered, and that made me immensely proud, was an advanced templating engine that let the customer automate a mail-out process. It used a particular syntax, and I provided a guide that was a few pages long (yes, a hard copy!) so that the users could learn it. They barely ever used it, and I ultimately removed the engine in a later version in favor of a hardcoded template. They filled in one or two fields, clicked Submit, and they were done. When the template needed to change, I would update the software, and within a few hours, they had a patch with the new template. In this specific case, it didn't matter how well written my documentation was; the solution did not work for them.

This over-engineered feature is also a great example of why customer collaboration is so important. Had I worked more closely with the customer in this situation, I might have understood their needs better. Instead, I focused on the documentation that had been prepared in advance and stuck to it.

Finally, there's responding to change over following a plan. Months would go by between my updates. In this day and age, this might seem inconceivable. The planning processes were long, and it was common practice to publish all the requirements beforehand. Not only that, but deploying software was a lot harder than it is nowadays. Every time I needed to push an update, I had to meet with the system administrators a couple of weeks before the installation. The sysadmin would check the requirements, test everything out, and eventually prepare the desktops to receive the software's dependencies. On the day of installation, I had to coordinate with the users and system administrators to gain access to those machines, and only then could I manually install the latest version on each device. It required intervention from many people, and no one wanted me to come back two days later with a new update, which made it hard to respond to change.

Those agile principles might seem like the norm nowadays, but the world was different back then. A lot of those cumbersome processes were required due to technological limitations. Sending large files over the internet was tricky, and desktop applications were the norm. It was also the beginning of what came to be known as Web 2.0. With the emergence of new technologies such as PHP and ASP, more and more applications were being developed and deployed to the web.

It was generally easier to deploy applications to the web; doing so simply consisted of uploading files to an FTP server. It didn't require physical access to a computer, and it involved far fewer interactions with system administrators. The end users didn't need to update their application manually; they would access the application as they always did and notice a change in a feature or the interface. Only limited interaction between the software developers and the system administrators was needed to get a new version of the application up and running.

Yet, the Waterfall mentality was still strong. More and more software development teams were trying to implement agile practices, but the application deployment cycle was still somewhat slow. The main reason for this was that they were scared of breaking a production build with an update.

Here be testing

Software engineers adopted many strategies to mitigate the risk associated with deploying a new version of an application. One such strategy was unit testing, often paired with test-driven development. With unit testing, software developers could run a suite of tests against their code base to verify that the software still behaved as expected. By executing a test run, developers could be reassured that the new features they had implemented didn't break previously developed components.
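
To make this concrete, here is a minimal sketch of the kind of unit test being described. The render_mailout function and its tests are hypothetical Python examples, not taken from the book; they simply illustrate how a passing test run reassures a developer that an existing behavior, such as the hardcoded mail-out template mentioned earlier, still works after a change:

import unittest

# Hypothetical helper standing in for a small piece of application code.
def render_mailout(recipient_name: str) -> str:
    """Render the hardcoded mail-out template for a single recipient."""
    if not recipient_name:
        raise ValueError("recipient name is required")
    return f"Dear {recipient_name},\n\nThank you for your order."

class RenderMailoutTests(unittest.TestCase):
    def test_includes_recipient_name(self):
        # A regression here would be caught before the software is shipped.
        self.assertIn("Dear Alice", render_mailout("Alice"))

    def test_rejects_empty_name(self):
        with self.assertRaises(ValueError):
            render_mailout("")

if __name__ == "__main__":
    unittest.main()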

Having those tests in place made it much easier to build in small iterations and show the changes to a customer, knowing that the software had not suffered any regressions. The customer was then able to provide feedback much earlier in the development loop, and the development teams could react to those comments before investing too much time in a feature that would ultimately not satisfy the users.

It was a great win for the customers, but it also turned out to be a great way to help the system administrators. With tested software, there was far less chance of introducing a regression into the running application. Sysadmins were more confident in the builds and more willing to deploy the applications regularly, and they started to automate parts of the process with Bash scripts.

Still, some changes were harder to push. When a database needed to be modified or a runtime needed to be upgraded, operators were usually more hesitant to make those changes. They would need to set up a new environment to test the new software and ensure that the changes would not cause problems on the servers. That reality changed in 2006, when Amazon first introduced AWS.

Cloud computing was to technology what agile methodologies were to software development processes. The changes it brought transformed the way developers did their jobs. Now, let's dig deeper to see how the cloud impacted software engineering.