50 Algorithms Every Programmer Should Know - Second Edition

By: Imran Ahmad
Overview of this book

The ability to use algorithms to solve real-world problems is a must-have skill for any developer or programmer. This book will help you not only to develop the skills to select and use an algorithm to tackle problems in the real world but also to understand how it works. You'll start with an introduction to algorithms and discover various algorithm design techniques, before exploring how to implement different types of algorithms, with the help of practical examples. As you advance, you'll learn about linear programming, page ranking, and graphs, and will then work with machine learning algorithms to understand the math and logic behind them. Case studies will show you how to apply these algorithms optimally before you focus on deep learning algorithms and learn about different types of deep learning models along with their practical use. You will also learn about modern sequential models and their variants, algorithms, methodologies, and architectures that are used to implement Large Language Models (LLMs) such as ChatGPT. Finally, you'll become well versed in techniques that enable parallel processing, giving you the ability to use these algorithms for compute-intensive tasks. By the end of this programming book, you'll have become adept at solving real-world computational problems by using a wide range of algorithms.
Table of Contents (22 chapters)

  • Section 1: Fundamentals and Core Algorithms
  • Section 2: Machine Learning Algorithms
  • Section 3: Advanced Topics
  • Other Books You May Enjoy
  • Index

Leaky ReLU

In ReLU, a negative value of x results in a zero value of y. This means that some information is lost in the process, which makes training cycles longer, especially at the start of training. The Leaky ReLU activation function resolves this issue by allowing a small, non-zero output for negative inputs. Leaky ReLU is defined as follows:

$y = \beta x$ for $x < 0$

$y = x$ for $x \geq 0$

This is shown in the following diagram:

Figure 8.13: Leaky ReLU

Here, β is a parameter with a value less than one. It can be implemented in Python as follows:

def leakyReLU(x, beta=0.01):
    # For negative inputs, scale by beta instead of clamping to zero
    if x < 0:
        return beta * x
    else:
        return x

There are three ways of specifying the value for β:

  • We can specify a default value of β.
  • We can make β a parameter in our neural network and let the network learn its value during training (this is called parametric ReLU); see the sketch after this list.
  • We can make β a random value (this is called randomized ReLU).
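As a minimal sketch of the parametric variant, the following assumes PyTorch, whose nn.PReLU layer treats the negative slope (our β) as a learnable parameter; the layer sizes and inputs here are illustrative, not from the book:

import torch
import torch.nn as nn

# nn.PReLU learns the negative-slope coefficient (our beta) by
# backpropagation; init sets its starting value.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.PReLU(init=0.01),  # one learnable beta shared by all units
    nn.Linear(8, 1),
)

x = torch.randn(2, 4)  # a small batch of illustrative inputs
print(model(x))

During training, β is updated by gradient descent along with the weights, so the network itself decides how much of the negative signal to let through.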