Azure Serverless Computing Cookbook - Second Edition

By: Praveen Kumar Sreeram, Jason Marston
Overview of this book

Microsoft provides a solution for easily running small segments of code in the cloud with Azure Functions. The second edition of Azure Serverless Computing Cookbook starts with intermediate-level recipes on serverless computing, along with use cases demonstrating the benefits and key features of Azure Functions. You’ll explore the core aspects of Azure Functions, such as the services it provides, how to develop and write Azure Functions, and how to monitor and troubleshoot them. As you make your way through the chapters, you’ll find practical recipes on integrating DevOps with Azure Functions and providing continuous integration and continuous deployment with Azure DevOps. This book also provides hands-on, step-by-step tutorials based on real-world serverless use cases to guide you through configuring and setting up your serverless environments with ease. You will also learn how to build solutions for complex, real-world, workflow-based scenarios quickly and with minimal code using Durable Functions. In the concluding chapters, you will ensure enterprise-level security within your serverless environment. The most common tips and tricks that you need to be aware of when working with Azure Functions in production environments are also covered. By the end of this book, you will have all the skills required for working with serverless code architecture and providing continuous delivery to your users.

Auto-scaling Cosmos DB throughput

In the previous recipe, we read data from an Excel sheet into an employee collection. The next step is to insert that collection into a Cosmos DB collection. Before we do so, however, keep in mind that in real-world scenarios the number of records to import can be huge, so you might face performance issues if the provisioned capacity of the Cosmos DB collection is insufficient.
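As a rough illustration of this step, the following Python sketch inserts a list of employee records into a Cosmos DB container. The database and container names, the credentials, and the `employee_collection` list are all assumptions for illustration; the book's recipes may use a different SDK and naming.

```python
import uuid

# from azure.cosmos import CosmosClient  # assumed SDK: pip install azure-cosmos

def to_document(employee: dict) -> dict:
    """Cosmos DB requires a string 'id' on every item; add one if missing."""
    doc = dict(employee)
    doc.setdefault("id", str(uuid.uuid4()))
    return doc

def insert_employees(container, employees):
    """Upsert each employee record as a separate document."""
    for emp in employees:
        container.upsert_item(to_document(emp))

# Hypothetical wiring (names and credentials are placeholders):
# client = CosmosClient(account_url, credential=account_key)
# container = client.get_database_client("HRDB").get_container_client("Employees")
# insert_employees(container, employee_collection)
```

Generating an `id` up front avoids per-item failures during a large import, since Cosmos DB rejects documents without one.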

Cosmos DB collection throughput is measured in Request Units per second (RU/s) allocated to the collection. You can read more about this at https://docs.microsoft.com/en-us/azure/cosmos-db/request-units.
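To make the idea concrete, here is a minimal sketch of picking a valid RU/s value before a bulk import, assuming manually provisioned throughput (minimum 400 RU/s, set in increments of 100). The `headroom` factor and the estimate passed in are illustrative assumptions, not values from the recipe.

```python
def scale_throughput(estimated_ru_per_sec: float, headroom: float = 1.2) -> int:
    """Round an RU/s estimate (plus headroom) up to a valid manual-throughput
    setting: at least 400 RU/s, in increments of 100."""
    target = estimated_ru_per_sec * headroom
    # Ceil to the nearest 100, then enforce the 400 RU/s floor.
    return max(400, int(-(-target // 100) * 100))

# With the (assumed) azure-cosmos Python SDK, the value would be applied as:
# container.replace_throughput(scale_throughput(5000))
```

Computing the target once up front, rather than reacting to throttled (HTTP 429) responses mid-import, keeps a large load job predictable.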

Also, to lower costs, it is recommended for every service that you provision capacity at a lower level and increase it whenever needed. The Cosmos DB API allows us to control the number...