Applied Computational Thinking with Python

By: Sofía De Jesús, Dayrene Martinez

Overview of this book

Computational thinking helps you to develop logical processing and algorithmic thinking while solving real-world problems across a wide range of domains. It's an essential skill that you should possess to keep ahead of the curve in this modern era of information technology. Developers can apply their knowledge of computational thinking to solve problems in multiple areas, including economics, mathematics, and artificial intelligence. This book begins by helping you get to grips with decomposition, pattern recognition, pattern generalization and abstraction, and algorithm design, along with teaching you how to apply these elements practically while designing solutions for challenging problems. You’ll then learn about various techniques involved in problem analysis, logical reasoning, algorithm design, clusters and classification, data analysis, and modeling, and understand how computational thinking elements can be used together with these aspects to design solutions. Toward the end, you will discover how to identify pitfalls in the solution design process and how to choose the right functionalities to create the best possible algorithmic solutions. By the end of this algorithm book, you will have gained the confidence to successfully apply computational thinking techniques to software development.
Table of Contents (21 chapters)

Section 1: Introduction to Computational Thinking
Section 2: Applying Python and Computational Thinking
Section 3: Data Processing, Analysis, and Applications Using Computational Thinking and Python
Other Books You May Enjoy

Introduction to computer science

When looking for a definition of computer science, you will encounter many variations, but all agree that computer science encompasses all aspects of computers and computing, including both hardware and software. The hardware side is, for the most part, taught in engineering or computer engineering courses. The software side includes operating systems and applications, among other programming areas. For the purposes of this book, we will concentrate on the software side of computer science.

In this chapter, we'll look at some of the basic definitions, theories, and systems that are important as we delve deeper into the computational thinking world. Once we have identified key areas and defined the concepts, we will be ready to move on to the applications and real-world challenges we face in an ever-changing tech world while also exploring the elements of computational thinking and the Python programming capabilities that can help us tackle these challenges.

The range of topics in computer science is ever evolving and can be both daunting and exciting. These topics include game design, operating systems, mobile and desktop applications, robot programming, and much more. Constant breakthroughs in computers and computing open up new and exciting opportunities, many of which are still unknown to us. Having a basic understanding of the systems behind computer science helps us interact with technology and tackle problems more efficiently.

Learning about computers and the binary system

All computers store information as binary data. The binary system reads all information as a switch, which can be on or off, 0 or 1. The binary system is a base-2 system. You'll need a basic understanding of binary numbers and the binary system to progress in computer science.

The binary system translates all data so that it can be stored as strings using only two numbers: 0 and 1. Data is stored in computers using bits. A bit (which stands for binary digit) is the smallest unit of data you can find in a computer, that is, a 0 or a 1.
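
As a quick sketch in Python, the built-in bin() function displays the bits a whole number is stored as, and the bit_length() method reports how many bits that takes:

    # bin() shows the binary form of an integer as a string prefixed with '0b';
    # bit_length() tells us how many bits are needed to store it.
    number = 45
    print(bin(number))          # 0b101101
    print(number.bit_length())  # 6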

When counting in the binary system, the first two numbers are 0 (or 00) and 1 (or 01), much like in the base-10 number system we use in everyday life. If we were to continue counting in binary, our next number would be 10. Let's compare the first three numbers in the base-10 system and the binary system before we learn how to convert from one to the other:

Figure 1.1 – Base-10 and binary comparison

The next number in the base-10 system would be 3. In the binary system, the next number would be 11, which is read as one one. The first 10 numbers in the base-10 and binary systems are shown as follows:

Figure 1.2 – Base-10 and binary comparison (cont'd)
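
If you want to reproduce this comparison yourself, a short Python loop can print both columns; the layout here is just one possible formatting choice:

    # Print the first ten numbers in base-10 next to their binary form,
    # mirroring the comparison in Figure 1.2.
    for n in range(10):
        print(f"{n:>2}  {n:>4b}")   # 'b' formats the number in base 2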

As mentioned, the binary system is a base-2 system. This means that each digit of a binary number corresponds to a power of 2, and we use those powers to convert between the two systems. Understanding how to convert from base-2 to base-10 and vice versa gives us a better understanding of the relationship between numbers in the different systems.

Converting from binary to base-10

We will start with an example of converting a binary number to a base-10 number. Take the number 101101. To convert it, each digit is multiplied by the corresponding power of 2. The binary number has 6 digits, so the powers of 2 we use are 5, 4, 3, 2, 1, and 0. This means the number is converted as follows:

1 × 2^5 + 0 × 2^4 + 1 × 2^3 + 1 × 2^2 + 0 × 2^1 + 1 × 2^0 = 32 + 0 + 8 + 4 + 0 + 1 = 45

The binary number 101101 is equivalent to 45 in the base-10 system. In everyday life, we write numbers in base-10, so we understand the number 45 as written. However, our computers convert this information into binary to be able to process it, so the number becomes the binary number 101101 so that it can be easily read by the computer.
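
The same positional calculation can be written as a small Python sketch; Python's built-in int() function, which accepts a base argument, provides a quick check:

    # Convert the binary string '101101' to base-10 by hand:
    # multiply each digit by its power of 2 and add up the results.
    binary = "101101"
    total = 0
    for power, digit in enumerate(reversed(binary)):
        total += int(digit) * 2 ** power
    print(total)           # 45

    # int() with base 2 performs the same conversion for us.
    print(int(binary, 2))  # 45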

Converting from base-10 to binary

Again, let's start with an example to demonstrate the process of converting a base-10 number to a binary number. Take the number 591. To convert it to binary, we divide the number by 2 repeatedly and record the remainder of each division. If a division has no remainder, we record a 0; if it has a remainder of 1, we record a 1. Each new digit is written to the left of the digits recorded so far, so the first remainder becomes the right-most digit of the binary number.

When we divide 591 by 2, the result is 295 with a remainder of 1. That means our right-most number, which is our first number, is 1.

Now divide 295 by 2. The result is 147 with a remainder of 1. So, we insert a 1 to the left of the 1. Our number is now 11.

Now divide 147 by 2. The result is 73 with a remainder of 1. Our result is now 111. Now we'll carry out further divisions:

  • 73 ÷ 2 = 36 with a remainder of 1. Our number is now 1111.
  • 36 ÷ 2 = 18 with no remainder. Our number is now 01111.
  • 18 ÷ 2 = 9 with no remainder. Our number is now 001111.
  • 9 ÷ 2 = 4 with a remainder of 1. Our number is now 1001111.
  • 4 ÷ 2 = 2 with no remainder. Our number is now 01001111.
  • 2 ÷ 2 = 1 with no remainder. Our number is now 001001111.
  • 1 ÷ 2 = 0 with a remainder of 1. Our number is now 1001001111.

The number 591 in base-10 is equivalent to the number 1001001111 in the binary system.
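
The repeated-division procedure above translates directly into a short Python loop; bin() serves as a check on the result:

    # Convert 591 to binary by dividing by 2 repeatedly and writing each
    # remainder to the left of the digits found so far.
    number = 591
    digits = ""
    while number > 0:
        number, remainder = divmod(number, 2)  # quotient and remainder in one step
        digits = str(remainder) + digits
    print(digits)    # 1001001111
    print(bin(591))  # 0b1001001111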

Another way to convert the number is to use a table for the divisions:

Table 1.1 – Conversion of the base-10 number 591 to binary

Using the table, take the remainders from the right-most column and write them from the bottom row to the top. The result is 1001001111.
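
A few lines of Python can generate a table like this and then read the remainders from bottom to top; the exact columns shown are an assumption about how Table 1.1 is laid out:

    # Build the division table for 591, print each row, and then read the
    # remainder column from bottom to top to get the binary number.
    number = 591
    rows = []
    while number > 0:
        quotient, remainder = divmod(number, 2)
        rows.append((number, quotient, remainder))
        number = quotient

    for dividend, quotient, remainder in rows:
        print(f"{dividend:>4} / 2 = {quotient:>4}, remainder {remainder}")

    print("".join(str(remainder) for _, _, remainder in reversed(rows)))  # 1001001111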

Learning how to convert numbers is only a small piece of converting data to binary, but it is an important one. All information, including letters and symbols, must be converted to binary in order to be read by a computer. ASCII (which stands for American Standard Code for Information Interchange) is a character-encoding standard that has been adopted almost universally for this purpose. That said, parts of it are limited or obsolete, so other encodings use ASCII as a base and expand on its capabilities. Unicode, for example, is a widely used character set whose first 128 code points match ASCII.
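
In Python, the ord() and chr() functions expose this mapping between characters and numbers, which is a handy way to see ASCII/Unicode code points as bits:

    # ord() returns a character's code point; format(..., '08b') shows it as 8 bits.
    for character in "Hi!":
        code = ord(character)
        print(character, code, format(code, "08b"))

    print(chr(72))  # 'H' -- chr() reverses ord()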

In this section, we learned that information must be encoded, or converted, for a computer to be able to read it. Multiple systems and protocols exist, but for now, we will move on to computer science theory. Revisiting binary, ASCII, and Unicode as you work through problems can be helpful.