Apache Spark Deep Learning Cookbook

By: Ahmed Sherif, Amrith Ravindra

Overview of this book

Organizations these days need to integrate popular big data tools such as Apache Spark with highly efficient deep learning libraries if they're looking to gain faster and more powerful insights from their data. With this book, you'll discover over 80 recipes to help you train fast, enterprise-grade deep learning models on Apache Spark. Each recipe addresses a specific problem and offers a proven, best-practice solution to difficulties encountered while implementing various deep learning algorithms in a distributed environment. The book follows a systematic approach, featuring a balance of theory and tips with best-practice solutions to assist you with training different types of neural networks, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs). You'll also have access to code written in TensorFlow and Keras that you can run on Spark to solve a variety of deep learning problems in computer vision and natural language processing (NLP), or tweak to tackle other problems encountered in deep learning. By the end of this book, you'll have the skills you need to train and deploy state-of-the-art deep learning models on Apache Spark.

Introduction


Recurrent neural networks have proven to be remarkably effective at learning and predicting sequential data. With natural language, however, the question of long-term dependencies comes into play: the network must remember the context of a particular conversation, paragraph, or sentence in order to make better predictions later on. For example, consider a sentence that says:

Last year, I happened to visit China. Not only was Chinese food different from the Chinese food available everywhere else in the world, but the people were extremely warm and hospitable too. In my three years of stay in this beautiful country, I managed to pick up and speak very good....

If the preceding sentence were fed into a recurrent neural network to predict the next word (in this case, Chinese), the network would struggle, since it retains no memory of the context established earlier in the passage. This is what we mean by long-term dependencies. In order to predict...
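
To make this concrete, here is a minimal sketch, in the Keras/TensorFlow stack the book works with, of an LSTM-based next-word predictor. This is an illustrative example, not code from the book; the vocabulary size, sequence length, and dummy training data are assumptions chosen purely to show the shapes involved. The LSTM layer's internal memory cell is what lets context such as China influence a prediction many words later, which a plain recurrent layer struggles to do.

# Illustrative sketch only: an LSTM next-word predictor in Keras.
# All sizes and data below are assumptions, not values from the book.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 5000   # assumed vocabulary size
seq_length = 30     # assumed number of context words per example

model = Sequential([
    # Map each word index to a dense 64-dimensional vector
    Embedding(input_dim=vocab_size, output_dim=64, input_length=seq_length),
    # The LSTM's memory cell carries context (e.g. "China") across
    # many intervening words, addressing long-term dependencies
    LSTM(128),
    # Output a probability distribution over the next word
    Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy arrays purely to show the expected input/target shapes:
# X holds sequences of word indices; y holds the index of the next word.
X = np.random.randint(0, vocab_size, size=(100, seq_length))
y = np.random.randint(0, vocab_size, size=(100,))
model.fit(X, y, epochs=1, verbose=0)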