Learn C Programming

By: Jeff Szuhay

Overview of this book

C is a powerful general-purpose programming language that is excellent for beginners to learn. This book will introduce you to computer programming and software development using C. If you're an experienced developer, this book will help you become familiar with the C programming language. This C programming book takes you through basic programming concepts and shows you how to implement them in C. Throughout the book, you'll create and run programs that make use of one or more C concepts, such as program structure with functions, data types, and conditional statements. You'll also see how to use looping and iteration, arrays, pointers, and strings. As you make progress, you'll cover code documentation, testing and validation methods, and basic input/output, and you'll learn how to write complete programs in C. By the end of the book, you'll have developed basic programming skills in C that you can apply to other programming languages, along with a solid foundation from which to advance as a programmer.
Table of Contents (33 chapters)

Section 1: C Fundamentals (begins at Chapter 1)
Section 2: Complex Data Types (begins at Chapter 10)
Section 3: Memory Manipulation (begins at Chapter 19)
Section 4: Input and Output (begins at Chapter 22)
Section 5: Building Blocks for Larger Programs (begins at Chapter 28)

Representing whole numbers

The basic whole-number type is the integer, or just int. Integers can be either positive only, called unsigned, or both negative and positive, called signed. As you might expect, the natural use for integers is to count things. If you know you will not need negative values, you must specify unsigned.

To be explicit, the default type is signed int, where the keyword signed is optional.
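
As a minimal sketch (the variable names here are illustrative only, not from the book), declaring and printing both kinds looks like this:

    #include <stdio.h>

    int main(void)
    {
        int temperature = -5;             /* plain int is signed; it can hold negative values */
        unsigned int applesCounted = 12u; /* unsigned: for counting; never negative */

        printf("temperature   = %d\n", temperature);    /* %d prints a signed int    */
        printf("applesCounted = %u\n", applesCounted);  /* %u prints an unsigned int */

        return 0;
    }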

An unsigned integer has its lowest value at 0 and its highest value when all of its bits are set to 1. For instance, a single-byte value has 256 possible values, but its range is 0 to 255, as the sketch after this paragraph demonstrates. This is sometimes called the off-by-one problem: counting starts at 0, not at 1 as we were taught when we first learned to count. It is a problem because it takes some time for new programmers to adjust their thinking. Until you are comfortable thinking in this way, the off-by-one problem will be a common source of confusion and possibly the cause of...
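
A small sketch, assuming a typical platform with 8-bit bytes, uses the UCHAR_MAX constant from limits.h to make the single-byte range concrete; the last step deliberately goes one past the highest value:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        unsigned char byte = UCHAR_MAX;   /* highest single-byte value: 255 when a byte is 8 bits */

        printf("unsigned char range: 0 to %u\n", (unsigned int)UCHAR_MAX);
        printf("number of values:    %u\n", (unsigned int)UCHAR_MAX + 1u);  /* 256 values, 0 through 255 */

        byte = byte + 1;   /* one step past the top wraps back around to 0 */
        printf("255 + 1 as a byte =  %u\n", (unsigned int)byte);

        return 0;
    }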