Deep Learning with Theano

By: Bourez

Overview of this book

This book offers a complete overview of deep learning with Theano, a Python-based library that makes optimizing numerical expressions and deep learning models easy on CPU or GPU. It provides practical code examples that help beginners understand how easy it is to build complex neural networks, while more experienced data scientists will appreciate its reach, addressing supervised and unsupervised learning, generative models, and reinforcement learning in the fields of image recognition, natural language processing, and game strategy. The image recognition tasks range from simple digit recognition through image classification, object localization, and image segmentation to image captioning. The natural language processing examples include text generation, chatbots, machine translation, and question answering. The last examples deal with generating random data that looks real and solving games such as those in the OpenAI Gym. At the end, this book sums up the best-performing nets for each task. While early research results were based on deep stacks of neural layers, in particular convolutional layers, the book presents the principles that improved the efficiency of these architectures, in order to help the reader build new custom nets.
Table of Contents (15 chapters)

Cost function and errors

Given the probabilities predicted by the model, the negative log-likelihood cost function is as follows:

cost = -T.mean(T.log(model)[T.arange(y.shape[0]), y])
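The indexing expression `T.log(model)[T.arange(y.shape[0]), y]` picks, for each example in the batch, the log-probability assigned to its true class. The same computation can be sketched in plain NumPy, with hypothetical values for the probability matrix and the targets:

```python
import numpy as np

# Hypothetical batch: 3 examples, 4 classes; each row sums to 1.
model = np.array([[0.7, 0.1, 0.1, 0.1],
                  [0.2, 0.5, 0.2, 0.1],
                  [0.1, 0.1, 0.1, 0.7]])
y = np.array([0, 1, 3])  # true class indices

# Advanced integer indexing selects log p(true class) per row,
# mirroring T.log(model)[T.arange(y.shape[0]), y].
log_true = np.log(model)[np.arange(y.shape[0]), y]
cost = -log_true.mean()  # negative log-likelihood
```

The advanced-indexing pair `[arange(n), y]` avoids building a one-hot matrix, which is why the Theano expression stays compact.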

The error is the number of predictions that differ from the true class, divided by the total number of predictions, which can be written as a mean:

error = T.mean(T.neq(y_pred, y))

Conversely, accuracy corresponds to the number of correct predictions divided by the total number of predictions. The sum of error and accuracy is one.
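These definitions can be checked with plain NumPy on a toy batch (values hypothetical):

```python
import numpy as np

y_pred = np.array([0, 2, 1, 1])  # predicted classes
y      = np.array([0, 1, 1, 3])  # true classes

# Error: fraction of mismatches, as in T.mean(T.neq(y_pred, y)).
error = np.mean(y_pred != y)      # 2 mismatches out of 4
# Accuracy: fraction of matches.
accuracy = np.mean(y_pred == y)
# By construction, error + accuracy == 1.
```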

For other types of problems, here are a few other loss functions and implementations:

Categorical cross entropy

An implementation equivalent to ours

T.nnet.categorical_crossentropy(model, y_true).mean()
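The equivalence holds because categorical cross entropy, −Σ one_hot(y) · log p per example, reduces to the negative log-probability of the true class. A NumPy check on a hypothetical batch:

```python
import numpy as np

model = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])  # predicted probabilities
y_true = np.array([0, 1])            # true class indices

# Our implementation: pick log p(true class) per row, negate, average.
nll = -np.mean(np.log(model)[np.arange(len(y_true)), y_true])

# Categorical cross entropy: -sum(one_hot * log p) per row, then average.
one_hot = np.eye(model.shape[1])[y_true]
cce = np.mean(-np.sum(one_hot * np.log(model), axis=1))
```

The one-hot terms for the wrong classes multiply to zero, so both expressions compute the same loss.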

Binary cross entropy

For cases where the output can take only two values, {0, 1}

Typically used after a sigmoid activation predicting the probability, p

T.nnet.binary_crossentropy(model, y_true).mean()
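Binary cross entropy is −mean(t · log p + (1 − t) · log(1 − p)), where p is the sigmoid output and t the binary target. A NumPy sketch with hypothetical values:

```python
import numpy as np

p = np.array([0.9, 0.2, 0.7])       # sigmoid outputs, P(y = 1)
y_true = np.array([1.0, 0.0, 1.0])  # binary targets

# -[t*log(p) + (1-t)*log(1-p)], averaged over the batch,
# mirroring T.nnet.binary_crossentropy(model, y_true).mean()
bce = -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
```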

Mean squared error

L2 norm for regression problems

T.sqr(model - y_true).mean()
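Mean squared error simply averages the squared residuals between predictions and targets. A NumPy sketch for a toy regression batch (values hypothetical):

```python
import numpy as np

model  = np.array([2.0, 0.0, 1.0])  # predictions
y_true = np.array([1.0, 0.0, 3.0])  # regression targets

# (model - y_true)^2 averaged over the batch,
# as in T.sqr(model - y_true).mean()
mse = np.mean(np.square(model - y_true))  # (1 + 0 + 4) / 3
```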

Mean absolute...
