
Java Deep Learning Cookbook

By: Rahul Raj

Overview of this book

Java is one of the most widely used programming languages in the world. With this book, you will see how to perform deep learning using Deeplearning4j (DL4J) – the most popular Java library for training neural networks efficiently. This book starts by showing you how to install and configure Java and DL4J on your system. You will then gain insights into deep learning basics and use your knowledge to create a deep neural network for binary classification from scratch. As you progress, you will discover how to build a convolutional neural network (CNN) in DL4J, and understand how to construct numeric vectors from text. This deep learning book will also guide you through performing anomaly detection on unlabeled data and help you set up neural networks in distributed systems effectively. In addition to this, you will learn how to import models from Keras and change the configuration in a pre-trained DL4J model. Finally, you will explore benchmarking in DL4J and tune your neural networks for the best results. By the end of this book, you will have a clear understanding of how you can use DL4J to build robust deep learning applications in Java.

Implementing frozen layers

We may want to limit training to certain layers: some layers can be kept frozen during the training instance so that we can focus on optimizing the remaining layers while the frozen layers stay unchanged. We saw two ways of implementing frozen layers earlier: using the regular transfer learning builder and using the transfer learning helper. In this recipe, we will implement frozen layers for transfer learning.

How to do it...

  1. Define frozen layers by calling setFeatureExtractor():
MultiLayerNetwork newModel = new TransferLearning.Builder(oldModel)
        .setFeatureExtractor(featurizeExtractionLayer) // layers up to and including this one are frozen
        .build();
  2. Call fit() to start the training instance (both steps are pulled together in the sketch that follows):
newModel.fit(trainIterator, numOfEpochs); // trainIterator: the training DataSetIterator; runs for numOfEpochs epochs
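Putting the two steps together, here is a minimal, self-contained sketch of the recipe. The wrapper method, the trainIterator parameter, and the simple FineTuneConfiguration with a Nesterovs updater are illustrative assumptions; only the TransferLearning.Builder, setFeatureExtractor(), and fit() calls come from the recipe itself:

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.learning.config.Nesterovs;

public class FrozenLayerTraining {
    // Freezes layers 0..lastFrozenLayerIndex of the pre-trained network and trains
    // only the remaining layers on the supplied data for numOfEpochs epochs.
    public static MultiLayerNetwork trainWithFrozenLayers(MultiLayerNetwork oldModel,
                                                          int lastFrozenLayerIndex,
                                                          DataSetIterator trainIterator,
                                                          int numOfEpochs) {
        FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                .updater(new Nesterovs(0.01))              // illustrative updater for the unfrozen layers
                .build();
        MultiLayerNetwork newModel = new TransferLearning.Builder(oldModel)
                .fineTuneConfiguration(fineTuneConf)
                .setFeatureExtractor(lastFrozenLayerIndex) // parameters of frozen layers stay unchanged
                .build();
        newModel.fit(trainIterator, numOfEpochs);          // only the unfrozen layers are updated
        return newModel;
    }
}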
...
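If the pre-trained model is a ComputationGraph rather than a MultiLayerNetwork (for example, a model imported from Keras), the same idea applies through TransferLearning.GraphBuilder, whose setFeatureExtractor() takes layer names instead of a layer index. A minimal sketch, assuming a hypothetical helper and a graph containing a layer named "fc2":

import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.transferlearning.TransferLearning;

public class FreezeGraphLayers {
    // Returns a transfer-learning copy of the graph in which all layers up to and
    // including lastFrozenLayer are frozen; later layers remain trainable.
    public static ComputationGraph withFrozenLayers(ComputationGraph pretrainedGraph,
                                                    String lastFrozenLayer) {
        return new TransferLearning.GraphBuilder(pretrainedGraph)
                .setFeatureExtractor(lastFrozenLayer) // frozen up to and including this layer
                .build();
    }
}

As with the MultiLayerNetwork builder, a FineTuneConfiguration can be supplied via fineTuneConfiguration() before build() to control how the unfrozen layers are trained.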