
Machine Learning Using TensorFlow Cookbook

By: Luca Massaron, Alexia Audevart, Konrad Banachewicz

Overview of this book

The independent recipes in Machine Learning Using TensorFlow Cookbook will teach you how to perform complex data computations and gain valuable insights into your data. Dive into recipes on training models, model evaluation, sentiment analysis, regression analysis, artificial neural networks, and deep learning - each using Google’s machine learning library, TensorFlow. This cookbook covers the fundamentals of the TensorFlow library, including variables, matrices, and various data sources. You’ll discover real-world implementations of Keras and TensorFlow and learn how to use estimators to train linear models and boosted trees, both for classification and regression. Explore the practical applications of a variety of deep learning architectures, such as recurrent neural networks and Transformers, and see how they can be used to solve computer vision and natural language processing (NLP) problems. With the help of this book, you will be proficient in using TensorFlow, understand deep learning from the basics, and be able to implement machine learning algorithms in real-world scenarios.
Table of Contents (15 chapters)

5. Boosted Trees
11. Reinforcement Learning with TensorFlow and TF-Agents
13. Other Books You May Enjoy
14. Index

Creating custom activations for tabular data

Compared with images and text, backpropagating errors in DNNs working on tabular data is more difficult because the data is sparse. While the ReLU activation function is widely used, newer activation functions have been found to work better in such cases and can improve network performance. These activation functions are SeLU, GeLU, and Mish. Since SeLU is already present in Keras and TensorFlow (see https://www.tensorflow.org/api_docs/python/tf/keras/activations/selu and https://www.tensorflow.org/api_docs/python/tf/nn/selu), in this recipe we'll use the GeLU and Mish activation functions.
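Since SeLU ships with TensorFlow, no custom code is needed for it; as a quick aside, this is a minimal sketch of how you could use the built-in version directly or by name in a layer (the layer size here is arbitrary, purely for illustration):

```python
import tensorflow as tf

# Pass the built-in SELU by name to any Keras layer
layer = tf.keras.layers.Dense(8, activation='selu')

# Or call the activation function directly on a tensor
x = tf.constant([-1.0, 0.0, 1.0])
print(tf.keras.activations.selu(x).numpy())
```

For positive inputs, SELU scales the input by roughly 1.0507, so selu(1.0) is about 1.0507 rather than 1.0.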

Getting ready

You need the usual imports:

from tensorflow import keras
import numpy as np
import matplotlib.pyplot as plt

We've added matplotlib, so we can plot how these new activation functions work and get an idea of the reason for their efficacy.

How to do it…

GeLU and Mish are defined by their mathematics...
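Both activations have simple closed forms: GeLU is commonly computed with the tanh approximation 0.5·x·(1 + tanh(√(2/π)·(x + 0.044715·x³))), and Mish is x·tanh(softplus(x)). As a sketch of where this recipe is heading (the registration step and the names 'gelu' and 'mish' are one common convention, not the only option), you could define them as custom Keras activations like this:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

def gelu(x):
    # tanh approximation of the Gaussian Error Linear Unit
    return 0.5 * x * (1.0 + tf.math.tanh(
        np.sqrt(2.0 / np.pi) * (x + 0.044715 * tf.math.pow(x, 3))))

def mish(x):
    # Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + exp(x))
    return x * tf.math.tanh(tf.math.softplus(x))

# Register the functions so layers can refer to them by string name
keras.utils.get_custom_objects().update(
    {'gelu': keras.layers.Activation(gelu),
     'mish': keras.layers.Activation(mish)})

# Example: use them like any built-in activation
model = keras.Sequential([
    keras.layers.Dense(16, activation='gelu', input_shape=(4,)),
    keras.layers.Dense(16, activation='mish'),
    keras.layers.Dense(1),
])
```

Both functions pass zero through unchanged and are smooth and non-monotonic for negative inputs, which is part of the intuition for why they can ease gradient flow on sparse tabular data.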