Computer Architecture with Python and ARM

By Alan Clements

Overview of this book

This comprehensive guide offers a unique and immersive learning experience by combining Python programming with ARM architecture. Starting with an introduction to computer architecture and the flow of data within a computer system, you’ll progress to building your own interpreter using Python. You’ll see how this foundation enables the simulation of computer operations and learn ways to enhance a simulator by adding new instructions and displaying improved results. As you advance, you’ll explore the TC1 Assembler and Simulator Program to gain insights into instruction analysis and work through practical examples of simulators. This will help you build essential skills in understanding complex computer instructions, strengthening your grasp of computer architecture. Moreover, you’ll be introduced to the Raspberry Pi operating system, preparing you to delve into the detailed language of the ARM computer. This includes exploring the ARM instruction set architecture, data-processing instructions, subroutines, and the stack. With clear explanations, practical examples, and coding exercises, this resource will enable you to design and construct your own computer simulator, simulate assembly language programs, and leverage the Raspberry Pi for ARM programming.
Table of Contents (18 chapters)

Part 1: Using Python to Simulate a Computer
Chapter 1: From Finite State Machines to Computers
Part 2: Using Raspberry Pi to Study a Real Computer Architecture

1

From Finite State Machines to Computers

In this chapter, you will discover the fundamental nature of computers. Our goal is to explain what makes a computer a computer. These concepts are important because you can’t understand how a computer works until you appreciate the implications of its sequential nature.

Once we’ve introduced the concept of digital systems, the next chapter will demonstrate how a computer operates by fetching instructions from memory and executing them. After that, we will introduce Python and demonstrate how you can write a program to simulate a computer and observe its operations. This book is all about learning by doing; by building a computer with software, you will learn how it operates and how to extend and modify it.

The remainder of this book will look at a real computer, a Raspberry Pi, and show you how to write programs for it and observe their execution. In doing so, we will move on from simulating a hypothetical computer to learning about a real computer.

A computer is a deterministic symbol processing machine. But what does that mean? Deterministic tells us that a computer always behaves in the same way when operating under the same conditions (that is, the same program and inputs). If you use a computer to evaluate an expression, you will always get the same result, no matter how many times you perform the operation. Not all systems are deterministic – if you toss a coin, the sequence of heads and tails is not predictable.
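To make this concrete, here is a minimal Python sketch (the expression and the coin-toss simulation are our own illustrations, not taken from the book) contrasting deterministic evaluation with a non-deterministic process:

import random

# Deterministic: evaluating the same expression always gives the same result.
for _ in range(3):
    print(2 + 2)                              # prints 4 every time

# Non-deterministic: tossing a coin gives an unpredictable sequence.
for _ in range(3):
    print(random.choice(['heads', 'tails']))  # heads or tails, unpredictably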

When we say that a computer is a symbol processing machine, we mean that it takes in symbols and operates on them to provide new symbols as an output. These symbols are anything that can be represented in digital form: letters, words, numbers, images, sound, and video. Consider a computer that’s playing chess. The input symbols that are received by the program correspond to the moves made by a player. The program operates on the input symbols according to a set of rules and produces an output symbol – the computer’s move.

Although we’ve just provided a theoretical definition of a computer, it is important to appreciate that programming involves translating information in the real world into symbols that can be manipulated by a computer – writing a set of rules (that is, a program) that tells the computer how to manipulate the symbols, and then converting the output symbols into a form that is meaningful to a human. The symbols that are processed by a computer have no intrinsic meaning to the computer – a certain pattern of bits (that is, the symbol) might represent a number, a name, a move in a game, and so on. The computer processes these bits to produce a new pattern of bits (that is, an output symbol) that has a meaning only to the programmer or user.
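As an illustration (this example is ours, not the author's), the following Python snippet shows how a bit pattern has no intrinsic meaning: the same 16 bits can be read as an unsigned integer or as two ASCII characters, depending on the interpretation the programmer chooses:

# The same 16-bit pattern, interpreted in two different ways.
pattern = 0b0100100001101001                        # hexadecimal 0x4869

# Interpreted as an unsigned integer...
print(pattern)                                      # 18537

# ...or as two ASCII characters.
print(pattern.to_bytes(2, 'big').decode('ascii'))   # 'Hi'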

We are going to pose a simple problem and then solve it. Our solution will lead us to the concepts of algorithms and computers, and also introduce key concepts such as discrete digital operations, memory and storage, variables, and conditional operations. By doing this, we can determine the types of operations a computer needs to perform to solve a problem. After this, we can ask, “How can we automate this? That is, how can we build a computer?”

It’s a trite statement, but once you understand a problem, you’re well on the way to finding a solution. When you first encounter a problem that requires an algorithmic solution, you have to think about what you want to do, rather than how you are going to do it. The worst approach to problem solving is to start writing an algorithm (or even actual computer code) before you have fully explored the problem. Suppose you were asked to design a cruise control system for an automobile. In principle, this is a very simple problem with an equally simple solution:

IF cruise control on THEN keep speed constant
                     ELSE read the position of the gas pedal

Couldn’t be simpler, could it? Well, what happens if you’ve selected cruise control and someone pulls out in front of you? You could brake, but this algorithm would attempt to keep the speed constant while you are braking by applying full throttle at the same time. Alternatively, you might suggest that the act of braking should disengage the cruise control mechanism. But is the cruise control to be disengaged permanently, or should the automobile accelerate to its previous speed once the braking action has ceased? You have to think about all the aspects of the problem.
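One way of capturing that extra thinking is sketched below in Python. The function name, its parameters, and the policy that braking simply overrides the throttle (leaving the question of re-engagement to the caller) are assumptions made for illustration, not a specification from the book:

def throttle_demand(cruise_engaged, braking, set_speed, current_speed, pedal_position):
    """Return a throttle demand between 0.0 (closed) and 1.0 (full throttle)."""
    if braking:
        return 0.0                             # never open the throttle while braking
    if cruise_engaged:
        error = set_speed - current_speed      # simple proportional correction
        return max(0.0, min(1.0, 0.1 * error))
    return pedal_position                      # manual driving: follow the gas pedal

print(throttle_demand(True, False, 70, 65, 0.0))   # cruising below the set speed: 0.5
print(throttle_demand(True, True, 70, 65, 0.0))    # braking overrides the cruise control: 0.0

Even this version leaves open the question raised above: should cruise_engaged be cleared permanently when the driver brakes, or should the car resume its previous speed afterwards? The code makes the conflict explicit rather than resolving it.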

Even if you design a correct algorithm, you have to consider the effect erroneous or spurious data will have on your system. One of the most popular criticisms of computers is that they produce meaningless results if you feed them with incorrect data. This idea is summed up by the expression garbage in, garbage out (GIGO). A well-constructed algorithm should detect and filter out any garbage in the input data stream.
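As a minimal sketch of that idea in Python (the notion of a speed sensor and the 0 to 300 km/h acceptance range are assumptions chosen purely for illustration), an input routine can reject obviously impossible readings before they reach the rest of the algorithm:

def read_speed(raw_value, min_speed=0.0, max_speed=300.0):
    """Convert a raw sensor reading to a speed in km/h, rejecting garbage values."""
    try:
        speed = float(raw_value)
    except (TypeError, ValueError):
        raise ValueError(f"not a number: {raw_value!r}")
    if not min_speed <= speed <= max_speed:
        raise ValueError(f"speed out of range: {speed}")
    return speed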

In this chapter, we will introduce the following topics:

  • The finite state machine
  • Solving a problem algorithmically