# Comparing a bit and a qubit

So, let's start with the obvious, or perhaps not-so-obvious, assumption that most readers of this book know what a bit is.

Intuitively, a bit is something that is either **zero** (**0**) or **one** (**1**). By putting many bits together, you can create bytes as well as arbitrarily large binary numbers, and with those, build the most amazing computer programs, encode digital images, encrypt your love letters and bank transactions, and more.
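As a quick illustration of how bits combine into larger numbers, here is a minimal Python sketch (the specific bit pattern is just an example):

```python
# Eight bits form a byte; each position contributes a power of two.
bits = [1, 0, 1, 1, 0, 0, 1, 0]  # most significant bit first

value = 0
for b in bits:
    value = (value << 1) | b  # shift left, then place the next bit

print(value)                  # the byte as a decimal integer: 178
print(format(value, "08b"))   # back to its binary form: 10110010
```

Chaining more bits in the same way yields arbitrarily large integers, which is all a classical computer ever manipulates under the hood.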

In a classical computer, a bit is realized by using low or high voltages across the transistors that make up the logic board, typically something such as 0 V and 5 V. On a hard drive, a bit might be a region magnetized in one direction to represent 0 and in the opposite direction for 1, and so on.

In books about quantum computing, the important point to drive home is that a classical bit can only be a 0 or a 1; it can never be anything else. In the computer example, you can imagine a box with...