Hands-On Neural Network Programming with C#

By: Matt Cole

Overview of this book

Neural networks have made a surprise comeback in the last few years and have brought tremendous innovation to the world of artificial intelligence. The goal of this book is to provide C# programmers with practical guidance on solving complex computational challenges using neural networks and C# libraries such as CNTK and TensorFlowSharp. The book takes you on a step-by-step practical journey, covering everything from the mathematical and theoretical aspects of neural networks to building deep neural networks into your applications with C# and the .NET Framework. It begins with a quick refresher on neural networks and shows you how to build a neural network from scratch using packages such as Encog, AForge, and Accord. You will learn about concepts and techniques such as deep networks, perceptrons, optimization algorithms, convolutional networks, and autoencoders, as well as ways to add intelligent features to your .NET apps, such as facial and motion detection, object detection and labeling, language understanding, knowledge, and intelligent search. Throughout the book, you will work on interesting demonstrations that make it easier to implement complex neural networks in your enterprise applications.

Fluent training with the MNIST database

In the following example, we will train our CNN on the MNIST database of handwritten digit images.

First, declare the method that will hold the demo:

private void MnistDemo()
{

Next, download the training and testing datasets with the following code:

var datasets = new DataSets();

Load the data, using 100 records for validation, and return early if loading fails:

if (!datasets.Load(100))
{
    return;
}

Now it's time to create the neural network using the Fluent API, as follows:

this._net = FluentNet<double>.Create(24, 24, 1)
    .Conv(5, 5, 8).Stride(1).Pad(2)
    .Relu()
    .Pool(2, 2).Stride(2)
    .Conv(5, 5, 16).Stride(1).Pad(2)
    .Relu()
    .Pool(3, 3).Stride(3)
    .FullyConn(10)
    .Softmax(10)
    .Build();
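To see how the data flows through this stack, apply the usual output-size formula, output = (input - kernel + 2 x padding) / stride + 1, to each layer. The trace below is derived from that formula and the parameters above; it is an explanatory aid, not part of the demo code:

// 24 x 24 x 1   input volume
// 24 x 24 x 8   after Conv(5, 5, 8) with Pad(2), Stride(1): (24 - 5 + 4) / 1 + 1 = 24
// 24 x 24 x 8   after Relu() (shape unchanged)
// 12 x 12 x 8   after Pool(2, 2) with Stride(2): (24 - 2) / 2 + 1 = 12
// 12 x 12 x 16  after Conv(5, 5, 16) with Pad(2), Stride(1)
// 12 x 12 x 16  after Relu()
//  4 x  4 x 16  after Pool(3, 3) with Stride(3): (12 - 3) / 3 + 1 = 4
//  1 x  1 x 10  after FullyConn(10)
//  1 x  1 x 10  after Softmax(10): class probabilities for digits 0 through 9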

Create the stochastic gradient descent trainer from the network with the following code:

this._trainer = new SgdTrainer<double>(this._net)
{
    LearningRate = 0.01,
    BatchSize = 20,
    L2Decay = 0.001,
    Momentum...
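Once the trainer is fully configured, it is driven in a loop that feeds it batches of images and their labels. The following is only a rough sketch of such a loop: GetNextTrainingBatch is a hypothetical placeholder (not part of ConvNetSharp or the code above), and the Train(x, y) call is assumed to accept a batch of input volumes together with their one-hot label volumes.

// Sketch only: GetNextTrainingBatch is a hypothetical helper, not ConvNetSharp API.
for (var step = 0; step < 1000; step++)
{
    // x: a batch of 24 x 24 x 1 input volumes; y: the matching one-hot label volumes
    var (x, y) = GetNextTrainingBatch(datasets, 20);

    // One SGD step: forward pass, loss computation, backpropagation, weight update
    this._trainer.Train(x, y);
}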