Before we start, there is a bit of good news: with TensorFlow, you don't need to write the backpropagation or gradient descent code yourself, and all common types of layers are already implemented, so things should be easier.
In the TensorFlow example here, we will change things a bit from what you learned in Chapter 1, Setup and Introduction to TensorFlow, and use the tf.layers
API to create whole layers of our network with ease:
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# MNIST data input (img shape: 28*28)
num_input = 28*28*1
# MNIST total classes (0-9 digits)
num_classes = 10

# Define model I/O (placeholders are used to send/get information from the graph)
x_ = tf.placeholder("float", shape=[None, num_input], name='X')
y_ = tf.placeholder("float", shape=[None, num_classes], name='Y')

# Add dropout to the fully connected layer
is_training...
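To build intuition for what a layer created by tf.layers.dense actually computes, here is a minimal NumPy sketch of a fully connected layer: each output unit is a weighted sum of all inputs plus a bias, optionally passed through a nonlinearity. The shapes and the choice of ReLU below are illustrative assumptions, not the book's exact model:

```python
import numpy as np

def dense(x, w, b, activation=None):
    # Fully connected layer: z = x @ W + b, then an optional
    # elementwise nonlinearity -- the same math tf.layers.dense performs.
    z = x @ w + b
    return activation(z) if activation is not None else z

rng = np.random.default_rng(0)
batch, num_input, num_hidden = 4, 28 * 28, 128  # MNIST-style shapes

x = rng.standard_normal((batch, num_input)).astype(np.float32)
w = rng.standard_normal((num_input, num_hidden)).astype(np.float32) * 0.01
b = np.zeros(num_hidden, dtype=np.float32)

h = dense(x, w, b, activation=lambda z: np.maximum(z, 0.0))  # ReLU
print(h.shape)  # (4, 128)
```

In TensorFlow, the weight matrix and bias vector are created and tracked for you as trainable variables; here they are explicit so the computation is visible.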