Effective Concurrency in Go

By: Burak Serdar

Overview of this book

The Go language has been gaining momentum due to its treatment of concurrency as a core language feature, making concurrent programming more accessible than ever. However, concurrency is still an inherently difficult skill to master, since it requires developing the right mindset to decompose problems into concurrent components correctly. This book will guide you in deepening your understanding of concurrency and show you how to make the most of its advantages. You’ll start by learning what guarantees are offered by the language when running concurrent programs. Through multiple examples, you will see how to use this information to develop concurrent algorithms that run without data races and complete successfully. You’ll also find out all you need to know about common concurrency patterns, such as worker pools, asynchronous pipelines, fan-in/fan-out, scheduling periodic or future tasks, and error and panic handling in goroutines. The central theme of this book is to give you, the developer, an understanding of why concurrent programs behave the way they do, and how they can be used to build correct programs that work the same way on all platforms. By the time you finish the final chapter, you’ll be able to develop, analyze, and troubleshoot concurrent algorithms written in Go.

Rate limiting

Limiting the rate of requests to a resource is important for maintaining a predictable quality of service. Rate control can be achieved in several ways; here, we will study two implementations of the same algorithm. The first is a relatively simple implementation of the token bucket algorithm that uses channels, a ticker, and a goroutine. We will then study a more advanced implementation that requires fewer resources.

First, let’s take a look at the token bucket algorithm and see how it can be used for rate limiting. Imagine a fixed-size bucket containing tokens. A producer process deposits tokens into this bucket at a fixed rate, say two tokens per second. Every 500 milliseconds, this process adds a token to the bucket if it has an empty slot; if the bucket is full, the producer waits another 500 milliseconds and checks the bucket again. There is also a consumer process that consumes tokens at random intervals. However, in order for the consumer...
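
To make this producer/consumer scheme concrete, here is a minimal sketch of a token bucket built from a buffered channel, a time.Ticker, and a refilling goroutine. It is only an illustration of the idea under the assumptions stated in the comments, not the implementation developed in this chapter; the Limiter type and its NewLimiter, Wait, and Stop functions are names chosen for this example.

```go
package main

import (
	"fmt"
	"time"
)

// Limiter is a hypothetical token bucket sketch: a buffered channel holds
// the tokens, and a goroutine driven by a time.Ticker refills it.
type Limiter struct {
	tokens chan struct{}
	done   chan struct{}
}

// NewLimiter creates a bucket with the given capacity and starts a producer
// goroutine that deposits one token per interval, discarding the token if
// the bucket is full. For two tokens/second, use interval = 500 * time.Millisecond.
func NewLimiter(capacity int, interval time.Duration) *Limiter {
	l := &Limiter{
		tokens: make(chan struct{}, capacity),
		done:   make(chan struct{}),
	}
	// Assumption for this sketch: start with a full bucket so the first
	// few requests are not delayed.
	for i := 0; i < capacity; i++ {
		l.tokens <- struct{}{}
	}
	go func() {
		ticker := time.NewTicker(interval)
		defer ticker.Stop()
		for {
			select {
			case <-ticker.C:
				select {
				case l.tokens <- struct{}{}: // deposit a token
				default: // bucket full, drop the token
				}
			case <-l.done:
				return
			}
		}
	}()
	return l
}

// Wait blocks until a token is available, then consumes it.
func (l *Limiter) Wait() {
	<-l.tokens
}

// Stop terminates the refilling goroutine.
func (l *Limiter) Stop() {
	close(l.done)
}

func main() {
	lim := NewLimiter(5, 500*time.Millisecond) // 2 tokens/second, burst of 5
	defer lim.Stop()
	for i := 0; i < 10; i++ {
		lim.Wait()
		fmt.Println("request", i, "at", time.Now().Format("15:04:05.000"))
	}
}
```

In this sketch, the buffered channel plays the role of the bucket: the non-blocking send in the goroutine deposits a token only when a slot is free, and the receive in Wait consumes one, blocking the caller until a token becomes available.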