In this recipe, we will normalize the outputs of the second fully connected layer with the softmax activation so that each of the 10 classes receives a probability between 0 and 1, and the probabilities across all 10 classes sum to 1.
The activation function is applied at the end of the pipeline, on the predictions generated by the deep learning model, so all preceding steps in the pipeline must have been executed first. The recipe requires the TensorFlow library.
- Run the `softmax` activation function on the output of the second fully connected layer:

```r
y_pred = tf$nn$softmax(layer_fc2_drop)
```
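The normalization that softmax performs can be sketched in plain NumPy (a Python illustration of the math only, not the recipe's R code; the `softmax` helper below is hypothetical):

```python
import numpy as np

def softmax(logits):
    # Subtract the row-wise max for numerical stability,
    # then exponentiate and normalize each row to sum to 1.
    z = logits - np.max(logits, axis=1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=1, keepdims=True)

# One sample with raw scores (logits) for 3 classes
logits = np.array([[2.0, 1.0, 0.1]])
probs = softmax(logits)
# Every entry lies in (0, 1) and each row sums to 1
```

The row-wise max subtraction does not change the result but prevents overflow for large logits.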
- Use the `argmax` function to determine the class number of the label; it is the index of the class with the largest probability:

```r
y_pred_cls = tf$argmax(y_pred, axis = 1L)
```
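The same class selection can be sketched with NumPy (a Python illustration of the operation, not the recipe's R code; the probability values are made up for the example):

```python
import numpy as np

# Two samples, each with softmax probabilities over 3 classes
probs = np.array([[0.1, 0.7, 0.2],
                  [0.5, 0.2, 0.3]])

# For each row, pick the index of the largest probability
pred_cls = np.argmax(probs, axis=1)
# → array([1, 0])
```

Taking the argmax along axis 1 collapses each row of class probabilities into a single predicted class index, one per sample.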