Deep Learning with TensorFlow and Keras, Third Edition

By Amita Kapoor, Antonio Gulli, Sujit Pal
Rating: 4.5 (44)

Overview of this book

Deep Learning with TensorFlow and Keras teaches you neural networks and deep learning techniques using TensorFlow (TF) and Keras. You'll learn how to write deep learning applications in the most powerful, popular, and scalable machine learning stack available. TensorFlow 2.x focuses on simplicity and ease of use, with updates such as eager execution, intuitive higher-level APIs based on Keras, and flexible model building on any platform.

This book uses the latest TF 2.x features and libraries to present an overview of supervised and unsupervised machine learning models, and it provides a comprehensive analysis of deep learning and reinforcement learning models using practical examples for the cloud, mobile, and large-scale production environments.

The book also shows you how to create neural networks with TensorFlow, works through popular models and techniques (regression, convolutional neural networks (CNNs), transformers, generative adversarial networks (GANs), recurrent neural networks (RNNs), natural language processing (NLP), and graph neural networks (GNNs)), covers working example apps, and then dives into TF in production, TF mobile, and TensorFlow with AutoML.
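As a quick illustration of the eager, Keras-first workflow the overview refers to, here is a minimal sketch (not taken from the book) that builds and trains a small classifier with the tf.keras Sequential API; the MNIST dataset, layer sizes, and training settings are illustrative assumptions chosen for brevity.

# Minimal tf.keras sketch (illustrative, not from the book): the dataset,
# layer sizes, and hyperparameters below are assumptions chosen for brevity.
import tensorflow as tf

# MNIST ships with tf.keras; scale the pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small feed-forward classifier built with the high-level Sequential API.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Compile, train, and evaluate: the whole supervised loop in three calls,
# running eagerly by default in TF 2.x.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
model.evaluate(x_test, y_test)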