Learn Quantum Computing with Python and IBM Quantum Experience

By Robert Loredo

Overview of this book

IBM Quantum Experience is a platform that enables developers to learn the basics of quantum computing by running experiments on both a quantum computing simulator and a real quantum computer. This book explains the basic principles of quantum mechanics, the principles behind quantum computing, and the implementation of quantum algorithms and experiments on IBM's quantum processors. You will start with simple programs that illustrate quantum computing principles and gradually work your way up to more complex programs and algorithms that leverage quantum computing. As you build on your knowledge, you'll come to understand the functionality of IBM Quantum Experience and the various resources it offers. You'll also learn the differences not only between the various quantum computers but also between the available simulators. Later, you'll explore the basics of quantum computing, quantum volume, and a few basic algorithms, all while making optimal use of the resources available on IBM Quantum Experience. By the end of this book, you'll be able to build quantum programs on your own and will have gained practical quantum computing skills that you can apply to your business.
Table of Contents (21 chapters)

Section 1: Tour of the IBM Quantum Experience (QX)
Section 2: Basics of Quantum Computing
Section 3: Algorithms, Noise, and Other Strange Things in Quantum World
Assessments
Appendix A: Resources

Chapter 5: Understanding the Quantum Bit (Qubit)

We are all very familiar with the classical bit, or simply the bit, from current computer hardware systems. It is the fundamental unit used to compute everything from simple mathematical problems, such as addition and multiplication, to more complex algorithms that operate on large collections of information.

Quantum computers have a similar fundamental unit called a quantum bit, commonly referred to as a qubit. In this chapter, we will describe what a qubit is, from both a mathematical (computational) and a hardware perspective. We will cover the differences between qubits and bits, particularly regarding how calculations are defined. This chapter will then transition from single qubits to multi-qubit systems and discuss the advantages they offer.
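Before diving in, the mathematical view of a qubit can be previewed with a small NumPy sketch. This example is not from the book; the variable names are illustrative. It represents a qubit as a two-amplitude state vector, computes measurement probabilities from the amplitudes, and combines two qubits with a tensor product:

```python
import numpy as np

# Computational basis states |0> and |1> as two-component complex vectors
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# A general qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# Here we choose the equal superposition (what a Hadamard gate produces from |0>).
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# Measuring the qubit yields 0 or 1 with probabilities given by the
# squared magnitudes of the amplitudes (the Born rule)
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(p0, p1)  # 0.4999999999999999 0.4999999999999999

# Two qubits combine via the tensor (Kronecker) product, so an n-qubit
# state holds 2**n amplitudes -- the source of quantum computing's scaling
two_qubit_state = np.kron(psi, ket0)  # |psi> tensor |0>
print(two_qubit_state.shape)  # (4,)
```

Note how the state space doubles with each added qubit: one qubit needs 2 amplitudes, two qubits need 4, and n qubits need 2**n, which is why multi-qubit systems can encode information far more densely than the same number of classical bits.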

We will also provide an overview of the various hardware implementations and how the different quantum systems implement their qubits to compute information. Finally, we will discuss how...