Learn LLVM 17 - Second Edition

By: Kai Nacke, Amy Kwan

Overview of this book

LLVM was built to bridge the gap between the theoretical knowledge found in compiler textbooks and the practical demands of compiler development. With a modular codebase and advanced tools, LLVM empowers developers to build compilers with ease. This book serves as a practical introduction to LLVM, guiding you progressively through complex scenarios and ensuring that you navigate the challenges of building and working with compilers like a pro. The book starts by showing you how to configure, build, and install LLVM libraries, tools, and external projects. You’ll then be introduced to LLVM's design, unraveling its applications in each compiler stage: frontend, optimizer, and backend. Using a real programming language subset, you'll build a frontend, generate LLVM IR, optimize it through the pipeline, and generate machine code. Advanced chapters extend your expertise, covering topics such as extending LLVM with a new pass, using LLVM tools for debugging, and enhancing the quality of your code. You'll also focus on just-in-time compilation issues and the current state of JIT-compilation support with LLVM. Finally, you’ll develop a new backend for LLVM, gaining insights into target description and how instruction selection works. By the end of this book, you'll have hands-on experience with the LLVM compiler development framework through real-world examples and source code snippets.
Table of Contents (20 chapters)

Part 1: The Basics of Compiler Construction with LLVM
Part 2: From Source to Machine Code Generation
Part 3: Taking LLVM to the Next Level
Part 4: Roll Your Own Backend

Building a JIT compiler class from scratch

With the layered approach of ORC, it is easy to build a JIT compiler customized to your requirements. There is no one-size-fits-all JIT compiler, and the first section of this chapter gave some examples. Let’s have a look at how to set up a JIT compiler from scratch.

The ORC API uses layers that are stacked together. The lowest level is the object-linking layer, represented by the llvm::orc::RTDyldObjectLinkingLayer class. It is responsible for linking in-memory objects and turning them into executable code. The memory required for this task is managed by an instance of the MemoryManager interface. There is a default implementation, but we can also provide a custom version if we need to.
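To make this concrete, here is a minimal sketch of how the object-linking layer might be created, assuming the LLVM 17 ORC headers; the createObjectLinkingLayer helper name is our own and not part of the ORC API:

#include "llvm/ExecutionEngine/Orc/Core.h"
#include "llvm/ExecutionEngine/Orc/RTDyldObjectLinkingLayer.h"
#include "llvm/ExecutionEngine/SectionMemoryManager.h"
#include <memory>

// Hypothetical helper: stacks an RTDyldObjectLinkingLayer on top of an
// existing execution session.
std::unique_ptr<llvm::orc::RTDyldObjectLinkingLayer>
createObjectLinkingLayer(llvm::orc::ExecutionSession &ES) {
  // Each materialized object gets its own SectionMemoryManager, the
  // default in-process implementation of the memory manager interface.
  // A custom memory manager could be returned from this callback instead.
  return std::make_unique<llvm::orc::RTDyldObjectLinkingLayer>(
      ES, [] { return std::make_unique<llvm::SectionMemoryManager>(); });
}

Returning a fresh SectionMemoryManager from the callback means every object gets its own memory manager, which is the usual choice for an in-process JIT.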

Above the object-linking layer is the compile layer, which is responsible for creating an in-memory object file. The llvm::orc::IRCompileLayer class takes an IR module as input and compiles it to an object file. The IRCompileLayer class is a subclass...
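As a rough sketch of how the compile layer could be stacked on top of the object-linking layer, again assuming the LLVM 17 ORC headers: the createCompileLayer helper below is hypothetical, and llvm::orc::ConcurrentIRCompiler is used here as the compiler instance, constructed from a JITTargetMachineBuilder describing the target to compile for.

#include "llvm/ExecutionEngine/Orc/CompileUtils.h"
#include "llvm/ExecutionEngine/Orc/Core.h"
#include "llvm/ExecutionEngine/Orc/IRCompileLayer.h"
#include "llvm/ExecutionEngine/Orc/JITTargetMachineBuilder.h"
#include "llvm/ExecutionEngine/Orc/RTDyldObjectLinkingLayer.h"
#include <memory>

// Hypothetical helper: creates an IRCompileLayer that compiles IR modules
// to in-memory object files and hands them to the object-linking layer.
std::unique_ptr<llvm::orc::IRCompileLayer>
createCompileLayer(llvm::orc::ExecutionSession &ES,
                   llvm::orc::RTDyldObjectLinkingLayer &ObjLinkingLayer,
                   llvm::orc::JITTargetMachineBuilder JTMB) {
  return std::make_unique<llvm::orc::IRCompileLayer>(
      ES, ObjLinkingLayer,
      std::make_unique<llvm::orc::ConcurrentIRCompiler>(std::move(JTMB)));
}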