Let the coding begin! Neural networks in practice


In this book, we will cover the entire process of implementing a neural network using the Java programming language. Java is an object-oriented programming language created in the 1990s by a small group of engineers at Sun Microsystems, a company later acquired by Oracle in 2010. Nowadays, Java is present in many devices that are part of our daily lives.

In an object-oriented language such as Java, we deal with classes and objects. A class is a blueprint of something in the real world, and an object is an instance of that blueprint: for example, car (the class, referring to any car) versus my car (the object, referring to one specific car, mine). Java classes are usually composed of attributes and methods (or functions) that embody object-oriented programming (OOP) concepts. We are going to review these concepts only briefly, without diving deeper into them, since the goal of this book is to design and create neural networks from a practical point of view. Four concepts are relevant and need to be considered in this process (a short Java sketch illustrating them follows the list):

  • Abstraction: The transcription of a real-world problem or rule into a computer programming domain, considering only its relevant features and dismissing the details that often hinder development.

  • Encapsulation: Analogous to the packaging of a product: some relevant features are exposed openly (public methods), while others are kept hidden within their domain (private or protected), thereby preventing misuse or an excess of information.

  • Inheritance: In the real world, multiple classes of objects share attributes and methods in a hierarchical manner; for example, a vehicle can be a superclass for car and truck. So, in OOP, this concept allows one class to inherit all features from another one, thereby avoiding the rewriting of code.

  • Polymorphism: Closely related to inheritance, but with the difference that methods with the same signature present different behaviors in different classes.
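To make these four concepts concrete, here is a minimal, self-contained Java sketch built around the vehicle example above. The Vehicle, Car, Truck, and OopDemo classes are purely illustrative assumptions and are not part of the book's neural network code:

    // Illustrative only: a toy hierarchy showing the four OOP concepts.
    abstract class Vehicle {
        // Encapsulation: speed is hidden and reachable only through methods.
        private double speed;

        public double getSpeed() { return speed; }
        protected void setSpeed(double speed) { this.speed = speed; }

        // Abstraction: we model only the feature relevant to this example.
        public abstract String describe();
    }

    // Inheritance: Car and Truck reuse the attributes and methods of Vehicle.
    class Car extends Vehicle {
        @Override
        public String describe() { return "a car at " + getSpeed() + " km/h"; }
    }

    class Truck extends Vehicle {
        @Override
        public String describe() { return "a truck at " + getSpeed() + " km/h"; }
    }

    public class OopDemo {
        public static void main(String[] args) {
            // Polymorphism: the same describe() call behaves differently per class.
            Vehicle[] vehicles = { new Car(), new Truck() };
            for (Vehicle v : vehicles) {
                System.out.println(v.describe());
            }
        }
    }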

Using the neural network concepts presented in this chapter together with these OOP concepts, we are now going to design the very first class set that implements a neural network. As we have seen, a neural network consists of layers, neurons, weights, activation functions, and biases. There are three types of layers: input, hidden, and output. Each layer may have one or more neurons. Each neuron is connected either to a neural input/output or to another neuron, and these connections are known as weights.

It is important to highlight that a neural network may have many hidden layers or none at all, and the number of neurons in each layer may vary. However, the input and output layers have the same number of neurons as there are neural inputs and outputs, respectively.

So, let's start implementing. Initially, we are going to define the following classes (a skeletal sketch of this class set appears right after the list):

  • Neuron: Defines the artificial neuron

  • NeuralLayer: Abstract class that defines a layer of neurons

  • InputLayer: Defines the neural input layer

  • HiddenLayer: Defines the layers between input and output

  • OutputLayer: Defines the neural output layer

  • InputNeuron: Defines the neuron that is present at the neural network input

  • NeuralNet: Combines all previous classes into one ANN structure
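A skeletal sketch of this class set is shown below. Only the structure follows the list above; the field names (weights, bias, neurons, and so on) are illustrative assumptions and not necessarily the exact members used in the book's implementation:

    // Structural sketch only; field names are assumptions for illustration.
    import java.util.ArrayList;

    class Neuron {
        protected ArrayList<Double> weights = new ArrayList<>(); // one weight per input connection
        protected double bias = 1.0;
        protected double output;
    }

    // A neuron that simply forwards a value presented at the network input.
    class InputNeuron extends Neuron { }

    abstract class NeuralLayer {
        protected int numberOfNeurons;
        protected ArrayList<Neuron> neurons = new ArrayList<>();
    }

    class InputLayer extends NeuralLayer { }
    class HiddenLayer extends NeuralLayer { }
    class OutputLayer extends NeuralLayer { }

    // Combines the layers into one ANN structure.
    class NeuralNet {
        protected InputLayer inputLayer;
        protected ArrayList<HiddenLayer> hiddenLayers = new ArrayList<>();
        protected OutputLayer outputLayer;
    }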

In addition to these classes, we should also define an IActivationFunction interface for activation functions. This is necessary because activation functions behave like methods, but they need to be assigned as a property of a neuron. So we are going to define classes for activation functions that implement this interface (a sketch of the interface and two implementations follows the list):

  • Linear

  • Sigmoid

  • Step

  • HyperTan
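A minimal sketch of the interface and two of these implementations is shown below. The single method name, calc(), and the coefficient field are assumptions made for illustration; the book's actual signatures may differ:

    // Sketch: the activation function contract and two sample implementations.
    public interface IActivationFunction {
        double calc(double x);
    }

    // Sigmoid squashes any real input into the range (0, 1).
    class Sigmoid implements IActivationFunction {
        private double a = 1.0; // steepness coefficient (assumed default)

        @Override
        public double calc(double x) {
            return 1.0 / (1.0 + Math.exp(-a * x));
        }
    }

    // Linear simply scales its input.
    class Linear implements IActivationFunction {
        private double a = 1.0; // slope (assumed default)

        @Override
        public double calc(double x) {
            return a * x;
        }
    }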

The coding for our first chapter is almost complete. We need to define two more classes: one for handling exceptions that may be thrown (NeuralException) and another for generating random numbers (RandomNumberGenerator). Finally, we are going to separate these classes into two packages, as sketched after the following list:

  • edu.packt.neuralnet: For the neural network related classes (NeuralNet, Neuron, NeuralLayer, and so on)

  • edu.packt.neuralnet.math: For the math related classes (IActivationFunction, Linear, and so on)
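As a sketch of how these two packages might be laid out, the two utility classes could look as follows. The constructor and the generate() method are illustrative assumptions, not the book's exact API:

    // File: edu/packt/neuralnet/NeuralException.java
    package edu.packt.neuralnet;

    // Wraps errors raised while building or running a neural network.
    public class NeuralException extends Exception {
        public NeuralException(String message) {
            super(message);
        }
    }

    // File: edu/packt/neuralnet/math/RandomNumberGenerator.java
    package edu.packt.neuralnet.math;

    import java.util.Random;

    // Central place to draw random numbers, typically for weight initialization.
    public class RandomNumberGenerator {
        private static final Random RANDOM = new Random();

        public static double generate() {
            return RANDOM.nextDouble(); // uniform value in [0, 1)
        }
    }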

To save space, we are not going to write out the full description of each class; instead, we are going to address the key features of the most important classes. However, the reader is welcome to take a look at the Javadoc documentation of the code in order to get more details on the implementation.