At its core, TensorFlow relies on building a computation graph, and it uses this graph to derive the relationships between tensors from the input all the way to the output. Let's say we have rank 0 (scalar) tensors a, b, and c, and we want to evaluate a function z of these tensors. This evaluation can be represented as a computation graph, as shown in the following figure:
As we can see, the computation graph is simply a network of nodes. Each node represents an operation, which applies a function to its input tensor or tensors and returns zero or more tensors as the output.
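To make the idea concrete, here is a minimal, hypothetical sketch of such a graph in plain Python, independent of TensorFlow. The names (Node, constant, evaluate) and the example expression 2 × (a − b) + c are illustrative assumptions, not part of any library API:

```python
class Node:
    """A graph node: applies a function to the outputs of its input nodes."""
    def __init__(self, op, *inputs):
        self.op = op          # function applied to the inputs' values
        self.inputs = inputs  # upstream nodes

    def evaluate(self):
        # recursively evaluate upstream nodes, then apply this node's op
        return self.op(*(node.evaluate() for node in self.inputs))

def constant(value):
    # a leaf node that simply returns its stored value
    return Node(lambda: value)

# build a tiny graph for z = 2 * (a - b) + c
a, b, c = constant(5.0), constant(2.0), constant(3.0)
r1 = Node(lambda x, y: x - y, a, b)   # r1 = a - b
r2 = Node(lambda x: 2 * x, r1)        # r2 = 2 * r1
z  = Node(lambda x, y: x + y, r2, c)  # z  = r2 + c

print(z.evaluate())  # 2 * (5.0 - 2.0) + 3.0 = 9.0
```

Evaluating z walks the graph from the output node back to the leaves, which is exactly the dependency structure a framework like TensorFlow exploits to compute gradients.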
TensorFlow builds this computation graph and uses it to compute the gradients accordingly. The individual steps for building and compiling such a computation graph in TensorFlow are as follows:
1. Instantiate a new, empty computation graph.
2. Add nodes (tensors and operations) to the computation graph.
3. Execute the graph:
   - Start a new session
   - Initialize the variables in the graph
   - Run the computation graph in...
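Under TensorFlow 1.x graph semantics (exposed as tf.compat.v1 in TensorFlow 2), the steps above might look like the following sketch; the particular tensors a, b, w and the expression computed are illustrative assumptions:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()  # use graph mode when running under TensorFlow 2

# Step 1: instantiate a new, empty computation graph
g = tf.Graph()

# Step 2: add nodes (tensors and operations) to the graph
with g.as_default():
    a = tf.constant(1.0, name='a')
    b = tf.constant(2.0, name='b')
    w = tf.Variable(3.0, name='w')
    z = w * (a + b)
    init = tf.global_variables_initializer()

# Step 3: execute the graph
with tf.Session(graph=g) as sess:  # start a new session
    sess.run(init)                 # initialize the variables in the graph
    result = sess.run(z)           # run the computation graph
    print(result)
```

Note that nothing is computed while the graph is being built; the multiplication and addition only run when sess.run(z) is called inside the session.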