
Learning Neural Networks with Tensorflow [Video]

By: Roland Meertens

Overview of this book

<p><span id="description" class="sugar_field">Neural Networks are used all around us: they index photos into categories, translate text, suggest replies to emails, and beat the best human players at games. Many people are eager to apply this knowledge to their own data, but many fail to achieve the results they expect.</span></p> <p><span id="description" class="sugar_field">In this course, we’ll start by building a simple flower recognition program to make you feel comfortable with Tensorflow, and it will teach you several important concepts in Neural Networks. Next, you’ll start working with high-dimensional input to predict one output: 1,275 molecular features you can use to predict the atomization energy of a molecule. The next program we’ll create is a handwritten digit recognition system trained on the famous MNIST dataset. We’ll work our way up from a simple multilayer perceptron to a state-of-the-art Deep Convolutional Neural Network.</span></p> <p><span id="description" class="sugar_field">In the final program, we’ll estimate what a celebrity looks like, checking new pictures to see whether a celebrity is attractive, wears a hat, has lipstick on, and many more properties that are difficult to estimate with "traditional" computer vision techniques.</span></p> <p><span id="description" class="sugar_field">After the course, you’ll not only be able to build a Neural Network for your own dataset, you’ll also be able to reason about which techniques will improve your Neural Network.</span></p> <h2><span class="sugar_field">Style and Approach</span></h2> <p><span class="sugar_field"><span id="trade_selling_points_c" class="sugar_field">The video is packed with step-by-step instructions, working examples, and helpful advice about building your Neural Network with Tensorflow. You'll learn to build your own network. This practical course is divided into clear bite-size chunks so you can learn at your own pace and focus on the areas of most interest to you.</span></span></p>
Table of Contents (5 chapters)
Chapter 4
Recognizing Written Digits with the MNIST Dataset
Section 6
Optimization and Loss Functions
So far we have used the mean squared error loss function and plain gradient descent. The softmax cross-entropy loss function performs better for classification tasks. We will also look at the momentum and Adam optimizers, which often perform better.
- Understand what the softmax cross-entropy function does
- Understand what the momentum and Adam optimizers do
- Compare our performance with our previous performance
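As a rough illustration of what this section covers, here is a minimal NumPy sketch (not the course's own code) of what the softmax cross-entropy loss computes for a single example, plus a single momentum update step. In Tensorflow's 1.x API, the corresponding building blocks are `tf.nn.softmax_cross_entropy_with_logits`, `tf.train.MomentumOptimizer`, and `tf.train.AdamOptimizer`.

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit for numerical stability, then normalize
    # the exponentials so the outputs form a probability distribution.
    z = np.asarray(logits, dtype=float) - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def softmax_cross_entropy(logits, label):
    # Cross-entropy between softmax(logits) and a one-hot target,
    # where `label` is the index of the correct class.
    probs = softmax(logits)
    return -np.log(probs[label])

def momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
    # Classic momentum update: the velocity accumulates past gradients,
    # which smooths the descent direction across steps.
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity
```

The loss shrinks as the logit for the correct class grows relative to the others, which is exactly the behavior that makes it a better fit for classification than mean squared error.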