Sometimes (for example, when using pretrained networks), it is desirable to freeze some of the layers. We can do this when we are confident that certain layers, most often the first couple of layers (also known as the bottom of the network), have proven valuable as feature extractors. In the following recipe, we will demonstrate how to freeze part of the network and train only the remaining subset.
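The core mechanic of freezing in TensorFlow is to restrict the optimizer to a subset of the variables via the `var_list` argument of `minimize`. The sketch below uses hypothetical variable scopes (`frozen`, `head`) and a dummy loss purely for illustration; the full recipe builds the actual network:

```python
import tensorflow.compat.v1 as tf  # TF1-style graph mode, as used in this recipe
tf.disable_eager_execution()

# Hypothetical two-layer setup: the first layer is the part we want to freeze
with tf.variable_scope('frozen'):
    w1 = tf.get_variable('w1', shape=[784, 256])
with tf.variable_scope('head'):
    w2 = tf.get_variable('w2', shape=[256, 10])

# Collect only the variables we still want to train, by scope name
train_vars = [v for v in tf.trainable_variables()
              if v.name.startswith('head')]

# Dummy loss for illustration; minimize updates only train_vars,
# so 'frozen/w1' receives no gradient updates
loss = tf.reduce_sum(tf.matmul(tf.matmul(tf.ones([1, 784]), w1), w2))
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
    loss, var_list=train_vars)
```

An alternative is to create the frozen variables with `trainable=False`, which keeps them out of `tf.trainable_variables()` entirely; filtering by scope, as above, lets you decide at optimizer-construction time instead.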
- First, we load all libraries as follows:
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
- In TensorFlow, it's straightforward to load the MNIST dataset:
mnist = input_data.read_data_sets('Data/mnist', one_hot=True)
- Next, we define the placeholders:
n_classes = 10
input_size = 784
x = tf.placeholder(tf.float32, shape=[None, input_size])
y = tf.placeholder(tf.float32, shape=[None, n_classes])
keep_prob = tf.placeholder(tf.float32)
- We define some functions we want to use repeatedly in our network architecture...
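The book's actual helper functions are elided here; a typical pair of helpers for this kind of architecture (hypothetical names, standard initialization choices) might look like this:

```python
import tensorflow.compat.v1 as tf

def weight_variable(shape):
    # Weights drawn from a truncated normal with a small stddev,
    # a common default for fully connected MNIST networks
    return tf.Variable(tf.truncated_normal(shape, stddev=0.1))

def bias_variable(shape):
    # Small positive constant bias to avoid dead ReLU units
    return tf.Variable(tf.constant(0.1, shape=shape))
```

Such helpers keep the layer definitions short: for example, the first hidden layer's parameters become `weight_variable([784, 256])` and `bias_variable([256])`.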