R Deep Learning Cookbook

By: PKS Prakash, Achyutuni Sri Krishna Rao

Overview of this book

Deep learning is a fast-growing branch of machine learning, notable for its strong results on applications with large and complex data. At the same time, the R programming language is very popular among data miners and statisticians. This book will help you work through the problems you face while executing deep learning tasks, and will teach you practical techniques in deep learning, neural networks, and advanced machine learning. It also walks you through complex deep learning algorithms and the various deep learning packages and libraries available in R, starting with those packages and moving on to neural networks and their architectures. You will build applications in text mining and processing, and compare CPU and GPU performance along the way. By the end of the book, you will have a working understanding of deep learning and of the R packages best suited to solving your problems.
Table of Contents (17 chapters)

Defining the cost function used for optimization


The cost function is primarily used to evaluate the current performance of the model by comparing the true class labels (y_true_cls) with the predicted class labels (y_pred_cls). Based on the current performance, the optimizer then fine-tunes the network parameters, such as weights and biases, to further improve its performance.

Getting ready

The definition of the cost function is critical, as it determines the optimization criterion. The cost function requires both the true class labels and the predicted class labels for comparison. The objective function used in this recipe is cross entropy, which is suited to multi-class classification problems.
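To make the cross-entropy objective concrete, here is a minimal base-R sketch on a toy three-class example; the one-hot label and predicted probability values are illustrative assumptions, not taken from the recipe:

```r
# Toy example: 3 classes, true class is the second one (one-hot encoded)
y_true <- c(0, 1, 0)
# Hypothetical predicted class probabilities from a softmax output
y_pred <- c(0.2, 0.7, 0.1)

# Cross entropy: -sum over classes of true * log(predicted).
# Only the true class term survives, so this equals -log(0.7) here.
cross_entropy <- -sum(y_true * log(y_pred))
cross_entropy
```

The cost shrinks toward zero as the predicted probability of the true class approaches 1, which is why minimizing cross entropy pushes the network toward confident, correct predictions.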

How to do it...

  1. Evaluate the current performance of each image using the cross entropy function in TensorFlow. As the cross entropy function in TensorFlow internally applies softmax normalization, we provide the output of the fully connected layer post dropout (layer_fc2_drop) as an input along with true labels (y_true):
cross_entropy...
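The snippet above is truncated; a sketch of how this step could look with the tensorflow R package follows. The names layer_fc2_drop and y_true come from the recipe text, while assigning the result to cross_entropy and averaging it into a scalar cost with tf$reduce_mean are assumptions based on the truncated line and common TensorFlow practice:

```r
library(tensorflow)

# Per-image cross entropy: softmax normalization is applied internally
# to the logits, so we pass the raw output of the dropout-regularized
# second fully connected layer together with the one-hot true labels
cross_entropy <- tf$nn$softmax_cross_entropy_with_logits(
  logits = layer_fc2_drop,  # unnormalized class scores from the network
  labels = y_true           # one-hot encoded true class labels
)

# Average over the batch to obtain a single scalar cost
# that the optimizer can minimize
cost <- tf$reduce_mean(cross_entropy)
```

Passing logits rather than softmax outputs matters: computing softmax and cross entropy in one fused op is numerically more stable than applying softmax first and taking the log separately.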