Mobile Deep Learning with TensorFlow Lite, ML Kit and Flutter

By: Anubhav Singh, Rimjhim Bhadani

Overview of this book

Deep learning is rapidly becoming one of the most popular topics in the mobile app industry. This book introduces trending deep learning concepts and their use cases with an industrial, application-focused approach. You will work through a range of projects covering tasks such as mobile vision, facial recognition, smart AI assistants, augmented reality, and more. With the help of eight projects, you will learn how to integrate deep learning into the iOS and Android mobile platforms, helping you transform deep learning features into robust mobile apps efficiently. You'll get hands-on experience of selecting the right deep learning architectures and optimizing mobile deep learning models while following an application-oriented approach to deep learning on native mobile apps. We will then cover various pre-trained and custom-built deep learning model-based APIs, such as ML Kit, accessed through Firebase. Further on, the book will take you through examples of creating custom deep learning models with TensorFlow Lite. Each project demonstrates how to integrate deep learning libraries into your mobile apps, from preparing the model through to deployment. By the end of this book, you'll have mastered the skills to build and deploy deep learning mobile applications on both iOS and Android.
Table of Contents (13 chapters)

Developing RNN-based models for music generation

In this section, we'll be developing a music generation model. We'll build it as a recurrent neural network (RNN) based on LSTM cells. An RNN differs from a simple artificial neural network (ANN) in one significant way: it allows outputs to be fed back in as inputs, so information is reused across time steps.

In an ANN, input values move strictly forward through the network, and error-based feedback is incorporated only into the network weights during training. In an RNN, by contrast, part of each neuron's output is looped back and combined with the input at the next time step, so the network retains a memory of what it has seen so far.
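This recurrence can be made concrete with a tiny vanilla-RNN step in plain NumPy. This is an illustrative sketch, not the book's code: the layer sizes, weight names, and random inputs below are all assumptions chosen only to show the hidden state being fed back in at every step.

```python
import numpy as np

# Minimal vanilla-RNN cell sketch (hypothetical sizes). The key point is
# that the hidden state h produced at one time step is reused as an input
# at the next time step -- the "loop" that distinguishes an RNN from an ANN.
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the recurrence)
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One time step: combine the new input with the previous hidden state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)                     # initial hidden state
sequence = rng.normal(size=(5, input_size))   # 5 time steps of input
for x in sequence:
    h = rnn_step(x, h)                        # h is fed back: this is the loop

print(h.shape)  # (3,)
```

Because `h` is carried from step to step, the value after the fifth input depends on all five inputs, not just the last one.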

The following diagram represents an RNN neuron:

From the preceding diagram, we can see that the output of the neuron's activation function splits into two parts. One part moves forward in the network toward the next layer or output, while the other part is fed back...
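The LSTM cell mentioned above refines this idea by carrying a separate cell state alongside the hidden state, with gates controlling what is forgotten, written, and exposed. The following is a hedged NumPy sketch of a single LSTM step under assumed sizes and weight names; it is for intuition only and is not the model the book builds with TensorFlow.

```python
import numpy as np

# Illustrative LSTM-cell step (all names and sizes are assumptions).
# The cell state c carries long-term memory; three gates decide what to
# forget, what to write, and what to expose as the hidden output h.
rng = np.random.default_rng(1)
n_in, n_hid = 4, 3

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, acting on [input, previous hidden] concatenated.
W_f, W_i, W_c, W_o = (rng.normal(scale=0.1, size=(n_hid, n_in + n_hid))
                      for _ in range(4))

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([x, h_prev])
    f = sigmoid(W_f @ z)            # forget gate: how much old memory to keep
    i = sigmoid(W_i @ z)            # input gate: how much new info to write
    c_tilde = np.tanh(W_c @ z)      # candidate cell update
    c = f * c_prev + i * c_tilde    # new cell state
    o = sigmoid(W_o @ z)            # output gate: how much state to expose
    h = o * np.tanh(c)              # new hidden state (the part fed forward)
    return h, c

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x, h, c)       # both h and c loop back each step

print(h.shape, c.shape)
```

The additive update `c = f * c_prev + i * c_tilde` is what lets LSTMs remember patterns over long sequences, which is why they suit tasks like music generation.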