
Cryptography Algorithms

By: Massimo Bertaccini

Overview of this book

Cryptography Algorithms is designed to help you get up and running with modern cryptography algorithms. You'll not only explore old and modern security practices but also discover practical examples of implementing them effectively. The book starts with an overview of cryptography, exploring key concepts including popular classical symmetric and asymmetric algorithms, protocol standards, and more. You'll also cover everything from building crypto codes to breaking them. In addition to this, the book will help you to understand the difference between various types of digital signatures. As you advance, you will become well-versed with the new-age cryptography algorithms and protocols such as public and private key cryptography, zero-knowledge protocols, elliptic curves, quantum cryptography, and homomorphic encryption. Finally, you'll be able to apply the knowledge you've gained with the help of practical examples and use cases. By the end of this cryptography book, you will be well-versed with modern cryptography and be able to effectively apply it to security applications.
Table of Contents (15 chapters)
Section 1: A Brief History and Outline of Cryptography
Section 2: Classical Cryptography (Symmetric and Asymmetric Encryption)
Section 3: New Cryptography Algorithms and Protocols
Section 4: Homomorphic Encryption and the Crypto Search Engine

Notes on security and computation

All the algorithms we have seen in this chapter are symmetric. The basic problem that remains unsolved is the transmission of the key. As I've already said, this problem will be overcome by the asymmetric cryptography we will explore in the next chapter. In this section, we will analyze the computational problems underlying the security of cryptographic algorithms in general. Later in the book, we will focus on the security of each algorithm we analyze.

To draw an analogy, we can say that in cryptography the weakest link of the chain breaks the entire chain. It is the same problem as using a very strong cryptographic algorithm to protect data but leaving the password on the computer screen. In other words, every component of a cryptographic algorithm, including its underlying mathematical problems, must offer a similar grade of security. To clarify this concept with an example: factorization and the discrete logarithm problem are believed to have similar computational hardness for now; however, if one of these problems were solved tomorrow, an algorithm based on both would become useless.

Let's go deeper and analyze some of the principles universally recognized in cryptography. The first statement is: Cryptography has to be open source.

With the term open source, I am referring to the algorithm and not, obviously, to the key. In other words, we have to rely on Kerckhoffs' principle, which states the following:

"A cryptosystem should be secure even if everything about the system, except the key, is public knowledge.

Kerckhoffs' principle applies beyond codes and ciphers to security systems in general: every secret creates a potential failure point. Secrecy, in other words, is a prime cause of brittleness—and therefore something likely to make a system prone to catastrophic collapse. Conversely, openness provides ductility."

– Bruce Schneier

In practice, the algorithm that underlies the encryption code has to be known. It's not useful and is also dangerous to rely on the secrecy of the algorithm in order to exchange secret messages. The reason is that, essentially, if an algorithm has to be used by an open community (just like the internet), it is impossible to keep it secret.

The second statement is: The security of an algorithm depends largely on its underlying mathematical problem.

As an example, RSA, one of the most famous and most widely used algorithms in the history of cryptography, is supported by the mathematical problem of factorization.

Factorization is essentially the decomposition of a number into its prime factors:

21 = 3 x 7

For a small integer such as 21, it is very easy to find the factors, 3 and 7, but it is also well known that the difficulty of factorization grows dramatically as the number of digits increases.
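To make this concrete, here is a minimal trial-division sketch (illustrative only, not how serious factorization is done): it factors 21 instantly, but its running time grows roughly with the square root of the input, which is exponential in the number of digits.

```python
def trial_division(n: int) -> list[int]:
    """Return the prime factors of n in non-decreasing order."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:      # divide out each prime completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                  # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_division(21))      # [3, 7]
```

For a 4-digit semiprime this still runs in microseconds; for a 250-digit one, the loop would never finish in any realistic amount of time.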

We will analyze asymmetric algorithms such as RSA in depth later in this book, in particular in Chapter 3, Asymmetric Encryption. For now, it is sufficient to explain why RSA is used to protect financial, intelligence, and other kinds of very sensitive secrets.

The reason is that the mathematical problem underlying RSA (factorization) is still a hard problem to solve for current-generation computers. However, in this introductory section, I can't go deeper into analyzing RSA, so I will limit myself to saying that factorization is not RSA's only point of attack: there is another problem, equally hard in computational terms, which is the discrete logarithm problem. Later in the book, we will analyze both of these hard computational problems. For now, we assume (imprecisely, as 99% of cryptographic texts do) that the pillar of security underlying RSA is factorization. In Chapter 6, New Algorithms in Public/Private Key Cryptography, I will show an attack on the RSA algorithm that depends on a problem different from factorization. This is the weak-link analogy explained at the beginning of this section: if something in an algorithm goes wrong, the security of the whole algorithm fails.

Anyway, let's see what happens when we attempt to break RSA by brute force, relying only on the factorization problem. Factorizing a big semiprime is not easy at all when it has hundreds, or thousands, of digits. To give you an idea of the computational power required, consider RSA-250, an 829-bit number composed of 250 decimal digits that is very hard to break with a computer from the current generation.

This integer was factored in February 2020, with a total computation time of about 2,700 core-years on Intel Xeon Gold 6130 processors at 2.1 GHz. Like many factorization records, this one was performed on a grid of about 100 machines running a highly optimized implementation of the number field sieve.
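A rough back-of-the-envelope sketch shows why brute force is hopeless here. Assume (unlike the record, which used the number field sieve) that we attack an 829-bit semiprime by naive trial division: we would have to test divisors up to the square root of the number, on the order of 2^414 candidates. Even at an absurdly generous trillion divisions per second, the running time is astronomical:

```python
bits = 829
candidates = 2 ** (bits // 2)          # ~ sqrt(2^829) candidate divisors
ops_per_second = 10 ** 12              # a very generous 10^12 divisions/second
seconds_per_year = 365 * 24 * 3600

years = candidates / (ops_per_second * seconds_per_year)
print(f"~2^{bits // 2} candidate divisors")
print(f"~{years:.3e} years of computation")
```

The result is a number of years with more than a hundred digits, which is why real attacks rely on far more sophisticated algorithms, and even those took thousands of core-years.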

The third statement is: Practical security is always less secure than theoretical security.

For example, if we analyze the Vernam cipher, we can easily understand how difficult this algorithm is to implement in practice. So, we can say that Vernam is invulnerable in theoretical security, but not in practical security. A corollary of this is that implementing an algorithm means putting its theoretical scheme into practice and adding much more complexity to it. So, complexity is the enemy of security: the more complex a system is, the more points of attack can be found.
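Here is a minimal sketch of the Vernam cipher (one-time pad) in Python. Note the two assumptions baked into it, both hard to satisfy in practice: the key is truly random and exactly as long as the message, and it is used only once.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """Vernam cipher: XOR each message byte with the matching key byte."""
    assert len(key) == len(message), "the key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

# Decryption is the same XOR operation applied again.
otp_decrypt = otp_encrypt

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # truly random, never reused
ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message
```

The code is trivial; the practical difficulty lies entirely outside it: generating, distributing, and never reusing a key as long as every message you will ever send.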

Another consideration is related to the grade of security of an algorithm. We can better understand this concept by considering Shannon's theory and the concept of perfect secrecy. The definition of perfect secrecy given by Claude Shannon in 1949 is based on statistics and probabilities. Regarding the maximum grade of security, Shannon theorized that a ciphertext maintains perfect secrecy if an attacker's knowledge of the content of a message is the same before and after inspecting the ciphertext, even when attacking it with unlimited resources. That is, the ciphertext gives the adversary precisely no information about the message contents.

To better understand this concept, I invite you to think of security as a scale of levels or grades with a decreasing gradient. The highest level is the strongest and the lowest is the weakest, but in the middle there is a zone of indefinite grade that depends on the technological and computational level of the adversary.

It's not so important how many degrees are supposed to be secure and how many are not. Essentially, we have to consider what is certainly secure, what is not, and what can be accepted as secure at a given time. With that in mind, let's see the difference between perfect secrecy and security:

  • A cryptosystem can be considered to have perfect secrecy if it satisfies two conditions:
    • It cannot be broken even if the adversary has unlimited computing power.
    • It's not possible to get any information about the message, [m], or the key, [k], by analyzing the cryptogram, [c] (that is, Vernam is a theoretically perfect secrecy system, but only under certain conditions).
  • A cryptosystem is computationally secure if, even though an adversary could theoretically break it (for instance, with quantum computational power and an efficient factorization algorithm), its underlying mathematical problem is considered very hard to solve at that time. Under these conditions, ciphers such as RSA, Diffie-Hellman, and ElGamal can be used because, based on empirical evidence, factorization and discrete logarithms are still hard problems to solve.
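The perfect secrecy condition above can be sketched concretely for the Vernam cipher: given an intercepted ciphertext, every plaintext of the same length is explained by exactly one key, so the ciphertext alone tells the adversary nothing about which message was sent. (The ciphertext and candidate messages below are arbitrary illustrative values.)

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

ciphertext = bytes.fromhex("0b1d0e")   # an intercepted 3-byte cryptogram

# For ANY candidate plaintext there is a key that produces this exact
# ciphertext, so the adversary cannot prefer one plaintext over another.
for candidate in (b"YES", b"NOO", b"RUN"):
    key = xor_bytes(candidate, ciphertext)
    assert xor_bytes(candidate, key) == ciphertext
    print(candidate, "-> consistent with key", key.hex())
```

This is precisely why unlimited computing power doesn't help against a properly used one-time pad: there is nothing to search for, since all messages remain equally plausible.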

So, the concept of security is dynamic and quite fuzzy. What is secure now might not be secure tomorrow. What will happen to RSA and all of classical cryptography if quantum computers become effective, or if a powerful algorithm able to solve the factorization problem is discovered tomorrow? We will come back to these questions in Chapter 8, Quantum Cryptography. For now, I can say that most classical cryptographic algorithms would be broken by the disruptive computational power of quantum computers, but we don't know yet when this will happen.

Under some conditions, we will see that the quantum exchange of keys can be considered a perfect secrecy system. But it doesn't always work in practice, so it is not widely used yet. Some OTP systems could now be considered highly secure (maybe semi-perfect secrecy), but everything depends on the practical implementation. Finally, remember an important rule: a weak link in the chain destroys everything.

So, in conclusion, we can note the following:

  • Cryptography has to be open source (the algorithms have to be known); only the key stays secret.
  • The security of an algorithm depends largely on its underlying mathematical problem.
  • Complexity is the enemy of security.
  • Security is a dynamic concept: perfect secrecy is only a theoretical notion.