Learn Quantum Computing with Python and IBM Quantum Experience

By: Robert Loredo

Overview of this book

IBM Quantum Experience is a platform that enables developers to learn the basics of quantum computing by running experiments on both a quantum computing simulator and a real quantum computer. This book explains the basic principles of quantum mechanics, the principles involved in quantum computing, and the implementation of quantum algorithms and experiments on IBM's quantum processors. You will start with simple programs that illustrate quantum computing principles and work your way up to more complex programs and algorithms that leverage quantum computing. As you build on your knowledge, you'll come to understand the functionality of IBM Quantum Experience and the various resources it offers. You'll also learn the differences not only between the various quantum computers but also between the various simulators available. Later, you'll explore the basics of quantum computing, quantum volume, and a few basic algorithms, all while making optimal use of the resources available on IBM Quantum Experience. By the end of this book, you'll be able to build quantum programs on your own and will have gained practical quantum computing skills that you can apply to your business.
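To give a flavor of the kind of "simple program that illustrates quantum computing principles" the book builds toward, here is a minimal NumPy sketch (not code from the book, and not using IBM's Qiskit library) that simulates a two-qubit Bell-state circuit by hand: a Hadamard gate on the first qubit followed by a CNOT, producing an entangled state that measures as 00 or 11 with equal probability.

```python
import numpy as np

# Single-qubit Hadamard gate and the 2x2 identity
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

# CNOT with qubit 0 as control and qubit 1 as target
# (basis ordering |q0 q1>: 00, 01, 10, 11)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to qubit 0, then CNOT
state = np.array([1, 0, 0, 0], dtype=complex)
state = np.kron(H, I) @ state   # (|00> + |10>) / sqrt(2)
state = CNOT @ state            # (|00> + |11>) / sqrt(2), the Bell state

# Measurement probabilities: 0.5 for 00, 0.5 for 11
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.  0.  0.5]
```

On IBM Quantum Experience itself you would express the same circuit with quantum gates in Qiskit or the Circuit Composer and run it on a simulator or real device; the linear-algebra version above is just to show what the gates do to the state vector.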
Table of Contents (21 chapters)
Section 1: Tour of the IBM Quantum Experience (QX)
Section 2: Basics of Quantum Computing
Section 3: Algorithms, Noise, and Other Strange Things in Quantum World
Appendix A: Resources

Introducing quantum computing

Quantum computing isn't a subject as commonly taught as algebra or the literary classics. However, for most scientists and engineers, or anyone in a field that involves studying physics, quantum computing is part of the curriculum. Those of us who don't quite recall our studies in physics, or have never studied it, need not worry, as this section aims to provide information that will either refresh your recollection of the topic or at least help you understand what each of the principles used in quantum computing means. Let's start with a general definition of quantum mechanics.

Quantum mechanics, as defined by most texts, is the study of nature at its smallest scale – in this case, the subatomic scale. The study of quantum mechanics is not new. Its growth began in the early 1900s, driven by many physicists whose names still chime in many current theories and experiments. The names of such...