Neural network layers involve many element-wise operations, and the activation function is one of them. The cuDNN library provides six activation functions: sigmoid, ReLU, tanh, clipped ReLU, ELU, and identity. In cuDNN, cudnnActivationForward() performs the forward operation and cudnnActivationBackward() performs the backward operation.
Let's look at the cudnnActivationForward() function's interface, as follows:
cudnnStatus_t cudnnActivationForward(
    cudnnHandle_t                 handle,
    cudnnActivationDescriptor_t   activationDesc,
    const void                   *alpha,
    const cudnnTensorDescriptor_t xDesc,
    const void                   *x,
    const void                   *beta,
    const cudnnTensorDescriptor_t yDesc,
    void                         *y)
Using cudnnActivationDescriptor_t, we can specify which activation function to apply. alpha and beta are scaling factors that determine how much of the computed result is blended with the existing contents of the output tensor; the operation is y = alpha * op(x) + beta * y, so with alpha = 1 and beta = 0 the prior contents of y are simply overwritten.