Python Deep Learning

By: Valentino Zocca, Gianmario Spacagna, Daniel Slater, Peter Roelants

Overview of this book

With increasing interest in AI around the world, deep learning has attracted a great deal of public attention, and deep learning algorithms are now used broadly across many industries. This book gives you practical information on the subject, including best practices and real-world use cases. You will learn to recognize and extract relevant information to increase predictive accuracy and optimize results. Starting with a quick recap of important machine learning concepts, the book delves straight into deep learning principles using scikit-learn. Moving ahead, you will learn to use the latest open source libraries such as Theano, Keras, Google's TensorFlow, and H2O. Use this guide to explore the challenges of pattern recognition, scale data with greater accuracy, and work through deep learning algorithms and techniques. Whether you want to dive deeper into deep learning or investigate how to get more out of this powerful technology, you'll find everything you need inside.

Summary


In this chapter, we introduced neural networks in detail and mentioned their success over other competing algorithms. Neural networks are made up of units, or neurons; connections, or weights, which characterize the strength of the communication between different neurons; and activity functions, which determine how each neuron processes its input. We discussed how different architectures can be created, how a neural network can have many layers, and why the inner (hidden) layers are important. We explained how information flows from the input to the output, passing from each layer to the next according to the weights and the activity functions defined, and finally we showed how a method called back-propagation can be used to "tune" the weights and reach the desired level of accuracy. We also mentioned many of the areas in which neural networks are, and have been, employed.
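To make these ideas concrete, the following is a minimal sketch (not the book's own code) of a one-hidden-layer network trained with back-propagation on the XOR problem, using only NumPy. The layer sizes, learning rate, and number of epochs are arbitrary choices for illustration; the forward pass mirrors the layer-by-layer flow of information described above, and the backward pass shows how the output error is propagated back to tune the weights.

```python
# A minimal feed-forward network with one hidden layer, trained by
# back-propagation on the XOR problem. Sketch for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for input->hidden and hidden->output layers
# (2 inputs, 4 hidden units, 1 output; sizes chosen arbitrarily)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

def sigmoid(z):
    # Activation (activity) function applied element-wise
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5
for epoch in range(10000):
    # Forward pass: information flows layer by layer through the weights
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: propagate the output error back through the network
    error = output - y                              # derivative of squared-error loss
    delta_out = error * output * (1 - output)       # sigmoid derivative at the output
    delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent update: "tune" the weights to reduce the error
    W2 -= learning_rate * hidden.T @ delta_out
    b2 -= learning_rate * delta_out.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ delta_hidden
    b1 -= learning_rate * delta_hidden.sum(axis=0, keepdims=True)

# After training, the predictions should approach [0, 1, 1, 0]
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

The same structure scales to deeper networks: each additional hidden layer adds one more forward step and one more error-propagation step in the backward pass.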

In the next chapter, we will continue discussing...