
Hands-On Machine Learning on Google Cloud Platform

By: Giuseppe Ciaburro, V Kishore Ayyadevara, Alexis Perrier

Overview of this book

Google Cloud Machine Learning Engine combines the services of Google Cloud Platform with the power and flexibility of TensorFlow. With this book, you will not only learn to build and train machine learning models of varying complexity at scale but also host them in the cloud to make predictions. This book focuses on making the most of the Google Machine Learning Platform for large datasets and complex problems. You will learn from scratch how to create powerful machine learning-based applications for a wide variety of problems by leveraging different data services from the Google Cloud Platform. Applications include NLP, speech-to-text, reinforcement learning, time series, recommender systems, image classification, video content inference, and many others. We will implement a wide variety of deep learning use cases and also make extensive use of data-related services in the Google Cloud Platform ecosystem, such as Firebase, Storage APIs, Datalab, and so forth. This will enable you to integrate machine learning and data processing features into your web and mobile applications. By the end of this book, you will know the main difficulties you may encounter and the appropriate strategies to overcome them and build efficient systems.

Intuition of over/underfitting

Before we look at how the preceding techniques are useful, let's build a scenario that illustrates the phenomenon of overfitting.

Scenario 1: A case of not generalizing on an unseen dataset

In this scenario, we will create a dataset for which there is a clear, linearly separable mapping between input and output. For example, whenever the independent variables are positive, the output is [1,0], and when the input variables are negative, the output is [0,1]:
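As a minimal sketch of this step, the following NumPy code builds such a dataset. The sample count (1,000), the two-feature input, and the value ranges are illustrative assumptions, not the book's exact values:

```python
import numpy as np

np.random.seed(0)

n = 1000  # assumed number of clean samples

# Positive inputs in the first half, negative inputs in the second half
x_pos = np.random.uniform(0.1, 1.0, size=(n // 2, 2))
x_neg = np.random.uniform(-1.0, -0.1, size=(n // 2, 2))
X = np.vstack([x_pos, x_neg])

# One-hot outputs: [1, 0] for positive inputs, [0, 1] for negative inputs
y = np.vstack([np.tile([1, 0], (n // 2, 1)),
               np.tile([0, 1], (n // 2, 1))])
```

Because the sign of the input fully determines the label, a linear classifier could separate this data perfectly.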

To that dataset, we will add a small amount of noise (10% of the dataset created in the preceding step) by adding some data points that follow the opposite pattern; that is, when the input variables are positive, the output is [0,1], and the output is [1,0] when the input variables are negative:
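A sketch of the noise step, under the same illustrative assumptions as before (a clean set of 1,000 two-feature samples, so the noise set holds 100 points with flipped labels):

```python
import numpy as np

np.random.seed(1)

n = 1000            # assumed size of the clean dataset
n_noise = n // 10   # 10% of the clean dataset

# Same input ranges as the clean data...
x_pos_noise = np.random.uniform(0.1, 1.0, size=(n_noise // 2, 2))
x_neg_noise = np.random.uniform(-1.0, -0.1, size=(n_noise // 2, 2))
X_noise = np.vstack([x_pos_noise, x_neg_noise])

# ...but with the labels deliberately reversed
y_noise = np.vstack([np.tile([0, 1], (n_noise // 2, 1)),   # positive -> [0, 1]
                     np.tile([1, 0], (n_noise // 2, 1))])  # negative -> [1, 0]
```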

Appending the datasets obtained in the preceding two steps gives us the...
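The whole pipeline can be sketched end to end as follows; the sizes, ranges, and the shuffle at the end are illustrative assumptions rather than the book's exact code:

```python
import numpy as np

np.random.seed(0)

def make_block(n, flip):
    """Build n samples; flip=True reverses the label pattern (noise)."""
    x_pos = np.random.uniform(0.1, 1.0, size=(n // 2, 2))
    x_neg = np.random.uniform(-1.0, -0.1, size=(n // 2, 2))
    pos_label, neg_label = ([0, 1], [1, 0]) if flip else ([1, 0], [0, 1])
    X = np.vstack([x_pos, x_neg])
    y = np.vstack([np.tile(pos_label, (n // 2, 1)),
                   np.tile(neg_label, (n // 2, 1))])
    return X, y

X_clean, y_clean = make_block(1000, flip=False)  # clean, separable data
X_noise, y_noise = make_block(100, flip=True)    # 10% contradictory noise

# Append the two datasets, then shuffle so the noisy points are interleaved
X = np.vstack([X_clean, X_noise])
y = np.vstack([y_clean, y_noise])
idx = np.random.permutation(len(X))
X, y = X[idx], y[idx]
```

A model that fits this combined set perfectly must memorize the 100 contradictory points, which is exactly the overfitting behavior the scenario is designed to expose.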