Implementing AWS: Design, Build, and Manage your Infrastructure

By: Yohan Wadia, Rowan Udell, Lucas Chan, Udita Gupta

Overview of this book

With this Learning Path, you’ll explore techniques to easily manage applications on the AWS cloud. You’ll begin with an introduction to serverless computing, its advantages, and the fundamentals of AWS. The following chapters will guide you on how to manage multiple accounts by setting up consolidated billing, enhancing your application delivery skills, with the latest AWS services such as CodeCommit, CodeDeploy, and CodePipeline to provide continuous delivery and deployment, while also securing and monitoring your environment's workflow. It’ll also add to your understanding of the services AWS Lambda provides to developers. To refine your skills further, it demonstrates how to design, write, test, monitor, and troubleshoot Lambda functions. By the end of this Learning Path, you’ll be able to create a highly secure, fault-tolerant, and scalable environment for your applications.

This Learning Path includes content from the following Packt products:

  • AWS Administration: The Definitive Guide, Second Edition by Yohan Wadia
  • AWS Administration Cookbook by Rowan Udell, Lucas Chan
  • Mastering AWS Lambda by Yohan Wadia, Udita Gupta

Calculating DynamoDB performance


DynamoDB (DDB) is the managed NoSQL database service from AWS.

As DDB pricing is based on the amount of read and write capacity units provisioned, it is important to be able to calculate the requirements for your use case.

This recipe uses a simple formula to estimate the read capacity units (RCU) and write capacity units (WCU) that should be provisioned for your DDB table.
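The standard DynamoDB capacity rules are: one RCU supports one strongly consistent read per second of an item up to 4 KB (eventually consistent reads cost half), and one WCU supports one write per second of an item up to 1 KB. A minimal sketch of the estimate, using hypothetical helper names:

```python
import math


def required_rcu(reads_per_second, item_size_kb, eventually_consistent=False):
    """Estimate read capacity units for a given read rate and item size."""
    # One RCU = one strongly consistent read/sec of an item up to 4 KB,
    # so each read of a larger item consumes ceil(size / 4) units.
    units_per_read = math.ceil(item_size_kb / 4)
    rcu = reads_per_second * units_per_read
    if eventually_consistent:
        # Eventually consistent reads cost half as much.
        rcu = math.ceil(rcu / 2)
    return rcu


def required_wcu(writes_per_second, item_size_kb):
    """Estimate write capacity units for a given write rate and item size."""
    # One WCU = one write/sec of an item up to 1 KB.
    return writes_per_second * math.ceil(item_size_kb / 1)


# 100 strongly consistent reads/sec of 6 KB items:
# ceil(6 / 4) = 2 units per read -> 200 RCU
print(required_rcu(100, 6))

# 50 writes/sec of 2.5 KB items: ceil(2.5 / 1) = 3 -> 150 WCU
print(required_wcu(50, 2.5))
```

Note that these are per-second rates: bursty traffic needs headroom above the average, since throttling applies when consumed capacity exceeds what is provisioned.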

It is also crucial to remember that while new partitions will be automatically added to a DDB table, they cannot be automatically taken away. This means that excessive partitioning can have a long-term impact on your performance, so you should be aware of how many partitions your table is likely to have.
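As a rough sketch of why this matters, AWS has historically documented per-partition limits of roughly 10 GB of data, 3,000 RCU, and 1,000 WCU, with the partition count driven by whichever dimension demands more. The exact figures and behavior have changed over time (e.g. with adaptive capacity), so treat this as an estimate only:

```python
import math


def estimated_partitions(table_size_gb, rcu, wcu):
    """Rough estimate of DynamoDB partition count (historical guidance).

    Assumes the historically documented per-partition limits of
    ~10 GB of storage, 3,000 RCU, and 1,000 WCU.
    """
    by_size = math.ceil(table_size_gb / 10)
    by_throughput = math.ceil(rcu / 3000 + wcu / 1000)
    return max(by_size, by_throughput)


# 25 GB table with 6,000 RCU and 1,000 WCU:
# by size ceil(25/10) = 3, by throughput ceil(2 + 1) = 3 -> 3 partitions
print(estimated_partitions(25, 6000, 1000))
```

Because provisioned throughput is divided across partitions, a table that was temporarily scaled up (forcing extra partitions) and then scaled back down can end up with less capacity per partition than you expect.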

Getting ready

All of these calculations assume that you have chosen a good partition key for your data. A good partition key ensures the following:

  • Data is evenly spread across all the available partitions
  • Read and write activity is spread evenly in time
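One common technique for spreading activity across partitions when a logical key is "hot" is write sharding: appending a bounded random suffix to the partition key. This sketch is illustrative only (the helper name and suffix scheme are assumptions, not from the recipe):

```python
import random


def sharded_key(base_key, shard_count=10):
    """Spread writes for a hot logical key across multiple partition keys.

    Appends a random suffix in [0, shard_count) so that items sharing the
    same logical key land on different physical partitions. The trade-off:
    reading all items for base_key now requires querying (and merging)
    all shard_count suffixed keys.
    """
    return f"{base_key}#{random.randint(0, shard_count - 1)}"


# Writes for the same logical date key spread across up to 10 partition keys:
print(sharded_key("2024-01-01"))  # e.g. "2024-01-01#7"
```

This is most useful for time-series or counter-style workloads, where all writes would otherwise target a single partition key at once.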

Unfortunately, choosing a good partition key is very data-specific, and beyond...