AWS Certified DevOps Engineer - Professional Certification and Beyond

By: Adam Book
Overview of this book

The AWS Certified DevOps Engineer certification is one of the highest AWS credentials, widely recognized in the cloud computing and software development industries. This book is an extensive guide to helping you strengthen your DevOps skills as you work with your AWS workloads on a day-to-day basis. You'll begin by learning how to create and deploy a workload using the AWS code suite of tools, and then move on to adding monitoring and fault tolerance to your workload. You'll explore enterprise scenarios that will help you understand various AWS tools and services. This book is packed with detailed explanations of essential concepts to help you get to grips with the domains needed to pass the DevOps professional exam. As you advance, you'll delve into AWS with the help of hands-on examples and practice questions to gain a holistic understanding of the services covered in the AWS DevOps professional exam. Throughout the book, you'll find real-world scenarios that you can easily incorporate into your daily activities when working with AWS, making you a valuable asset for any organization. By the end of this AWS certification book, you'll have gained the knowledge needed to pass the AWS Certified DevOps Engineer exam and be able to implement different techniques for delivering each service in real-world scenarios.
Table of Contents (31 chapters)

Section 1: Establishing the Fundamentals
Section 2: Developing, Deploying, and Using Infrastructure as Code
Section 3: Monitoring and Logging Your Environment and Workloads
Section 4: Enabling Highly Available Workloads, Fault Tolerance, and Implementing Standards and Policies
Section 5: Exam Tips and Tricks

Understanding DynamoDB Streams

There may be times when you have a table in DynamoDB and you want to either be notified when a change comes in or trigger an event-driven process. This is exactly why AWS created DynamoDB Streams. A stream is a time-ordered sequence of item modifications, such as insert, update, and delete operations.

When a stream in DynamoDB writes data, it does so in a strict ordering format. This means that as you write data to the table, depending on the configuration settings you have set for the stream, it will push out the items in the same order in which they were written to the table.
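A common way to consume these ordered records is an AWS Lambda function subscribed to the stream. The sketch below is a minimal, hypothetical handler: the event shape follows the documented DynamoDB Streams record format, but the table keys and attribute names are illustrative assumptions, not from a real workload.

```python
# Minimal sketch of a Lambda-style handler for a DynamoDB Streams batch.
# Each record carries an eventName of INSERT, MODIFY, or REMOVE, delivered
# in the order the item modifications occurred on the table.

def handle_stream_event(event):
    """Tally the insert, update, and delete records in one stream batch."""
    summary = {"INSERT": 0, "MODIFY": 0, "REMOVE": 0}
    for record in event.get("Records", []):
        name = record.get("eventName")
        if name in summary:
            summary[name] += 1
    return summary

# Hypothetical sample batch, shaped like the records Lambda receives.
sample_event = {
    "Records": [
        {"eventName": "INSERT", "dynamodb": {"Keys": {"pk": {"S": "order#1"}}}},
        {"eventName": "MODIFY", "dynamodb": {"Keys": {"pk": {"S": "order#1"}}}},
        {"eventName": "REMOVE", "dynamodb": {"Keys": {"pk": {"S": "order#2"}}}},
    ]
}
print(handle_stream_event(sample_event))  # {'INSERT': 1, 'MODIFY': 1, 'REMOVE': 1}
```

In a real deployment, the stream's view type (for example, `NEW_AND_OLD_IMAGES`) controls how much of each item's before and after state appears under the `dynamodb` key of each record.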

Global tables

There are times when you either need a high availability plan in place in case of a regional outage for a service such as DynamoDB, or need quicker local access to your data from a Region other than the one where you originally created it.
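Adding a replica Region to an existing table is done through the `UpdateTable` API with a `ReplicaUpdates` entry (global tables version 2019.11.21). The helper below only builds the request parameters so it can be shown without AWS credentials; the table name `Orders` and Region `eu-west-1` are illustrative assumptions.

```python
# Sketch of building UpdateTable parameters that add a replica Region to a
# DynamoDB table, turning it into a global table (version 2019.11.21).

def build_replica_update(table_name, region):
    """Return the parameter dict for adding one replica in another Region."""
    return {
        "TableName": table_name,
        "ReplicaUpdates": [{"Create": {"RegionName": region}}],
    }

params = build_replica_update("Orders", "eu-west-1")
# With boto3, these parameters would be passed as:
#   boto3.client("dynamodb").update_table(**params)
```

Note that the table must have streams enabled with `NEW_AND_OLD_IMAGES` before replicas can be added, since replication rides on the stream.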

Global Tables, even though they are replicas of an origin table, are all owned by a single...