Hands-On Deep Learning with R

By: Michael Pawlus, Rodger Devine

Overview of this book

Deep learning enables efficient and accurate learning from massive amounts of data. This book will help you overcome a range of challenges by applying various deep learning algorithms and architectures with R. It starts with a brief overview of machine learning and deep learning and shows you how to build your first neural network. You’ll understand the architecture of various deep learning algorithms and the fields they apply to, learn how to build deep learning models, optimize hyperparameters, and evaluate model performance. Deep learning applications in image processing, natural language processing (NLP), recommendation systems, and predictive analytics are also covered. Later chapters show you how to tackle recognition problems such as image recognition and signal detection, programmatically summarize documents, conduct topic modeling, and forecast stock market prices. Toward the end of the book, you will learn the common applications of GANs and how to build a face generation model using them. Finally, you’ll get to grips with using reinforcement learning and deep reinforcement learning to solve various real-world problems. By the end of this book, you will be able to build and deploy your own deep learning applications using appropriate frameworks and algorithms.
Table of Contents (16 chapters)

Section 1: Deep Learning Basics
Section 2: Deep Learning Applications
Section 3: Reinforcement Learning

Choosing the most appropriate activation function

The keras package offers a number of different activation functions. Some of these have been discussed in previous chapters; however, others have not yet been covered. We can begin by listing the ones we have already covered, with a quick note on each function (a short sketch implementing these formulas follows the list):

  • Linear: Also known as the identity function; it simply returns the value of x.
  • Sigmoid: Uses 1 divided by 1 plus the exponential of negative x, that is, 1 / (1 + e^(-x)).
  • Hyperbolic tangent (tanh): Uses the exponential of x minus the exponential of negative x, divided by the exponential of x plus the exponential of negative x, that is, (e^x - e^(-x)) / (e^x + e^(-x)). This has the same S-shape as the sigmoid function; however, the range along the y-axis goes from -1 to 1 instead of from 0 to 1.
  • Rectified Linear Units (ReLU): Uses the value of x if x is greater than 0; otherwise, it assigns a value of 0 if x is less than or equal to 0.
  • Leaky ReLU: Uses the same formula as ReLU when x is greater than 0; however, when x is less than or equal to 0, it assigns a small multiple of x (for example, 0.01x) instead of 0, so that a small gradient still flows for negative inputs.
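
To make these definitions concrete, here is a minimal sketch: plain R implementations of the formulas above, followed by a hypothetical keras model showing where each activation is specified. The layer sizes, input shape, and alpha value are illustrative assumptions rather than values from the book, and running the model code requires the keras package with a working TensorFlow backend:

```r
library(keras)

# Plain R implementations of the activation formulas listed above
linear     <- function(x) x                                        # identity: returns x unchanged
sigmoid    <- function(x) 1 / (1 + exp(-x))                        # squashes x into (0, 1)
tanh_fn    <- function(x) (exp(x) - exp(-x)) / (exp(x) + exp(-x))  # base R also provides tanh()
relu       <- function(x) ifelse(x > 0, x, 0)                      # zero for non-positive inputs
leaky_relu <- function(x, alpha = 0.01) ifelse(x > 0, x, alpha * x) # small slope below zero

# Quick check of the ranges: sigmoid stays in (0, 1), tanh in (-1, 1)
sigmoid(c(-5, 0, 5))   # ~0.007, 0.500, ~0.993
tanh_fn(c(-5, 0, 5))   # ~-0.999, 0.000, ~0.999

# A hypothetical keras model: most activations are passed by name to a layer,
# while Leaky ReLU is attached as its own layer because it takes an alpha parameter
model <- keras_model_sequential() %>%
  layer_dense(units = 16, input_shape = c(10)) %>%
  layer_activation_leaky_relu(alpha = 0.01) %>%
  layer_dense(units = 16, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")
```

Note the design difference in keras: "linear", "sigmoid", "tanh", and "relu" can all be passed as strings to the activation argument of a layer, whereas Leaky ReLU is added as a separate layer so that its slope parameter can be configured.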