The basic whole number type is the integer, or just int. Integers can be either positive only, called unsigned, or both negative and positive, called signed. As you might expect, the natural use for integers is to count things. You must specify the unsigned keyword explicitly if you know you will not need negative values.
To be explicit, the default type is signed int, where the keyword signed is optional.
An unsigned integer has its lowest value of 0 and its highest value when all bits are set to 1. For instance, a single byte has 256 possible values, but its range is 0 to 255. This is sometimes called the off-by-one problem: the starting value for counting is 0, not 1 as we were taught when we first learned to count. It is a problem because it takes some time for new programmers to adjust their thinking. Until you are comfortable thinking in this way, the off-by-one problem will be a common source of confusion and possibly the cause of...