Hands-On High Performance with Go

By Bob Strecansky

Overview of this book

Go is an easy-to-write language that is popular among developers thanks to its features such as concurrency, portability, and ability to reduce complexity. This Golang book will teach you how to construct idiomatic Go code that is reusable and highly performant. Starting with an introduction to performance concepts, you’ll understand the ideology behind Go’s performance. You’ll then learn how to effectively implement Go data structures and algorithms along with exploring data manipulation and organization to write programs for scalable software. This book covers channels and goroutines for parallelism and concurrency to write high-performance code for distributed systems. As you advance, you’ll learn how to manage memory effectively. You’ll explore the compute unified device architecture (CUDA) application programming interface (API), use containers to build Go code, and work with the Go build cache for quicker compilation. You’ll also get to grips with profiling and tracing Go code for detecting bottlenecks in your system. Finally, you’ll evaluate clusters and job queues for performance optimization and monitor the application for performance regression. By the end of this Go programming book, you’ll be able to improve existing code and fulfill customer requirements by writing efficient programs.
Table of Contents (20 chapters)
Section 1: Learning about Performance in Go
Section 2: Applying Performance Concepts in Go
Section 3: Deploying, Monitoring, and Iterating on Go Programs with Performance in Mind

Exploring queues

A queue is a pattern that is frequently used in computer science to implement a first in, first out (FIFO) data buffer: the first element to enter the queue is also the first to leave, so data is processed in the order in which it arrives. Adding an element to the back of the queue is known as enqueueing, and removing an element from the front of the queue is known as dequeuing. Queues are commonly used as a holding area in which data is stored so that it can be processed at a later time.
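A minimal sketch of this pattern in Go, using a slice as the backing store (the variable names here are illustrative, not taken from the book):

```go
package main

import "fmt"

func main() {
	// An empty FIFO queue backed by a slice.
	queue := []int{}

	// Enqueue: append elements to the back of the queue.
	queue = append(queue, 1)
	queue = append(queue, 2)
	queue = append(queue, 3)

	// Dequeue: take the element at the front, then re-slice
	// so the remaining elements move up.
	front := queue[0]
	queue = queue[1:]

	fmt.Println(front) // the first element enqueued is the first out
	fmt.Println(queue) // the remaining elements, still in order
}
```

Note that re-slicing with `queue[1:]` keeps the dequeued element reachable through the underlying array until the slice grows past it; for large queues, a ring buffer or `container/list` avoids holding on to that memory.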

Queues are beneficial because they don't have a fixed capacity: a new element can be added to the queue at any time, which makes a queue an ideal solution for asynchronous implementations such as a keyboard buffer or a printer queue. Queues are used in situations where tasks must be completed in the order in which they were received, but where completing them in real time may not be possible...
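In Go, a buffered channel is a common way to express this producer/consumer shape: sends enqueue work and receives dequeue it in arrival order. A short sketch along the lines of the printer-queue example above (the job strings and capacity are illustrative; unlike the unbounded queue just described, a channel's capacity is fixed when it is created):

```go
package main

import "fmt"

func main() {
	// A buffered channel acts as a bounded FIFO queue of jobs.
	jobs := make(chan string, 3)

	// Producer side: enqueue jobs as they arrive.
	jobs <- "print page 1"
	jobs <- "print page 2"
	jobs <- "print page 3"
	close(jobs) // no more jobs will be enqueued

	// Consumer side: dequeue and process jobs in the
	// order in which they were received.
	for job := range jobs {
		fmt.Println("processing:", job)
	}
}
```

The consumer does not need to run in real time: jobs wait in the channel's buffer until it is ready, which is exactly the decoupling the keyboard-buffer and printer-queue examples rely on.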