TensorFlow Machine Learning Cookbook

By: Nick McClure

Overview of this book

TensorFlow is an open source software library for Machine Intelligence. The independent recipes in this book will teach you how to use TensorFlow for complex data computations and will let you dig deeper and gain more insights into your data than ever before. You’ll work through recipes on training models, model evaluation, sentiment analysis, regression analysis, clustering analysis, artificial neural networks, and deep learning – each using Google’s machine learning library TensorFlow. This guide starts with the fundamentals of the TensorFlow library which includes variables, matrices, and various data sources. Moving ahead, you will get hands-on experience with Linear Regression techniques with TensorFlow. The next chapters cover important high-level concepts such as neural networks, CNN, RNN, and NLP. Once you are familiar and comfortable with the TensorFlow ecosystem, the last chapter will show you how to take it to production.

Declaring Tensors


Tensors are the primary data structure that TensorFlow uses to operate on the computational graph. We can declare these tensors as variables or feed them in as placeholders. First, we must know how to create tensors.

Getting ready

When we create a tensor and declare it to be a variable, TensorFlow creates several graph structures in our computation graph. It is also important to point out that just by creating a tensor, TensorFlow is not adding anything to the computational graph. TensorFlow does this only after we create a variable out of the tensor. See the next section on variables and placeholders for more information.

How to do it…

Here we will cover the main ways to create tensors in TensorFlow:

  1. Fixed tensors:

    • Create a zero-filled tensor. Use the following:

      zero_tsr = tf.zeros([row_dim, col_dim])
    • Create a one-filled tensor. Use the following:

      ones_tsr = tf.ones([row_dim, col_dim])
    • Create a constant-filled tensor. Use the following:

      filled_tsr = tf.fill([row_dim, col_dim], 42)
    • Create a tensor out of an existing constant. Use the following:

      constant_tsr = tf.constant([1,2,3])

    Note

    Note that the tf.constant() function can be used to broadcast a value into an array, mimicking the behavior of tf.fill(), by writing tf.constant(42, shape=[row_dim, col_dim])
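The fixed-tensor calls above can be combined into one short, runnable sketch. The dimensions here are example values chosen for illustration; any positive integers work:

```python
import tensorflow as tf

row_dim, col_dim = 3, 4  # example dimensions for illustration

zero_tsr = tf.zeros([row_dim, col_dim])       # 3x4 tensor filled with 0.0
ones_tsr = tf.ones([row_dim, col_dim])        # 3x4 tensor filled with 1.0
filled_tsr = tf.fill([row_dim, col_dim], 42)  # 3x4 tensor filled with 42
constant_tsr = tf.constant([1, 2, 3])         # 1-D tensor built from a Python list

print(zero_tsr.shape)    # (3, 4)
print(filled_tsr.dtype)  # tf.fill infers the dtype from the fill value, here int32
```

Because the dimensions are given as constants, the shapes and dtypes are known statically and can be inspected before any values are computed.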

  2. Tensors of similar shape:

    • We can also initialize variables based on the shape of other tensors, as follows:

      zeros_similar = tf.zeros_like(constant_tsr)
      ones_similar = tf.ones_like(constant_tsr)

    Note

    Note that since these tensors depend on prior tensors, we must initialize them in order. Attempting to initialize all the tensors at once will result in an error. See the There's more… section at the end of the next section on variables and placeholders.
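As a quick sketch of the shape-matching behavior, the new tensors inherit both the shape and the dtype of the source tensor:

```python
import tensorflow as tf

constant_tsr = tf.constant([1, 2, 3])

zeros_similar = tf.zeros_like(constant_tsr)  # same shape and dtype, filled with 0
ones_similar = tf.ones_like(constant_tsr)    # same shape and dtype, filled with 1
```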

  3. Sequence tensors:

    • TensorFlow allows us to specify tensors that contain defined intervals. The following functions behave similarly to Python's range() and NumPy's linspace(). See the following function:

      linear_tsr = tf.linspace(start=0.0, stop=1.0, num=3)
    • The resulting tensor is the sequence [0.0, 0.5, 1.0]. Note that this function includes the specified stop value. See the following function:

      integer_seq_tsr = tf.range(start=6, limit=15, delta=3)
    • The result is the sequence [6, 9, 12]. Note that this function does not include the limit value.
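Put together, the two sequence functions look like this. Note that tf.linspace() expects floating-point endpoints, while tf.range() works with integers:

```python
import tensorflow as tf

# Inclusive of the stop value: the sequence [0.0, 0.5, 1.0]
linear_tsr = tf.linspace(start=0.0, stop=1.0, num=3)

# Exclusive of the limit value: the sequence [6, 9, 12]
integer_seq_tsr = tf.range(start=6, limit=15, delta=3)
```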

  4. Random tensors:

    • The following generated random numbers are from a uniform distribution:

      randunif_tsr = tf.random_uniform([row_dim, col_dim], minval=0, maxval=1)
    • Note that this random uniform distribution draws from the half-open interval that includes minval but excludes maxval (minval <= x < maxval).

    • To get a tensor with random draws from a normal distribution, use the following:

      randnorm_tsr = tf.random_normal([row_dim, col_dim], mean=0.0, stddev=1.0)
    • There are also times when we wish to generate normal random values that are guaranteed to fall within certain bounds. The truncated_normal() function always picks normal values within two standard deviations of the specified mean. See the following:

      truncnorm_tsr = tf.truncated_normal([row_dim, col_dim], mean=0.0, stddev=1.0)
    • We might also be interested in randomizing entries of arrays. To accomplish this, there are two functions that help us: random_shuffle() and random_crop(). See the following:

      shuffled_output = tf.random_shuffle(input_tensor)
      cropped_output = tf.random_crop(input_tensor, crop_size)
    • Later on in this book, we will be interested in randomly cropping an image of size (height, width, 3), where there are three color channels. To fix a dimension in the cropped_output, you must give it the maximum size in that dimension:

      cropped_image = tf.random_crop(my_image, [height//2, width//2, 3])
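The random-tensor recipes above can be sketched as one runnable snippet. In TensorFlow 2 these functions live under the tf.random and tf.image namespaces (tf.random.uniform, tf.random.normal, tf.random.truncated_normal, tf.random.shuffle, and tf.image.random_crop); the 1.x names shown in this recipe take the same arguments. The dimensions and "image" here are stand-ins chosen for illustration:

```python
import tensorflow as tf

row_dim, col_dim = 4, 4  # example dimensions

randunif_tsr = tf.random.uniform([row_dim, col_dim], minval=0, maxval=1)
randnorm_tsr = tf.random.normal([row_dim, col_dim], mean=0.0, stddev=1.0)
truncnorm_tsr = tf.random.truncated_normal([row_dim, col_dim], mean=0.0, stddev=1.0)

# Shuffling permutes entries along the first dimension (here, the rows).
shuffled_output = tf.random.shuffle(randunif_tsr)

# A stand-in "image": height 8, width 10, 3 color channels.
my_image = tf.random.uniform([8, 10, 3])
cropped_image = tf.image.random_crop(my_image, [8 // 2, 10 // 2, 3])
```

Because the crop size fixes the channel dimension at its full extent (3), only the height and width are cropped randomly.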

How it works…

Once we have decided how to create the tensors, we may also create the corresponding variables by wrapping the tensor in the Variable() function, as follows. More on this in the next section:

my_var = tf.Variable(tf.zeros([row_dim, col_dim]))

There's more…

We are not limited to the built-in functions. We can convert any numpy array, Python list, or constant to a tensor using the convert_to_tensor() function. Note that this function also accepts tensors as an input, in case we wish to generalize a computation inside a function.
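A minimal sketch of the three kinds of input that convert_to_tensor() accepts:

```python
import numpy as np
import tensorflow as tf

list_tsr = tf.convert_to_tensor([1, 2, 3])                 # from a Python list
array_tsr = tf.convert_to_tensor(np.array([[1., 2.],
                                           [3., 4.]]))     # from a numpy array
passthrough = tf.convert_to_tensor(array_tsr)              # tensors pass through as-is
```

Passing a tensor through unchanged is what lets a function written around convert_to_tensor() accept lists, arrays, and tensors interchangeably.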