Hands-On Neural Networks

By: Leonardo De Marchi, Laura Mitchell
3.5 (2)
Overview of this book

Neural networks play a very important role in deep learning and artificial intelligence (AI), with applications in a wide variety of domains, from medical diagnosis and financial forecasting to machine diagnostics. Hands-On Neural Networks is designed to guide you through learning about neural networks in a practical way. The book starts with a brief introduction to perceptron networks. You will then gain insights into machine learning and understand what the future of AI could look like. Next, you will study how embeddings can be used to process textual data, and the role that long short-term memory networks (LSTMs) play in solving common natural language processing (NLP) problems. The later chapters demonstrate how to implement advanced concepts, including transfer learning, generative adversarial networks (GANs), autoencoders, and reinforcement learning. Finally, you can look forward to further content on the latest advancements in the field of neural networks. By the end of this book, you will have the skills you need to build, train, and optimize your own neural network models and use them to provide reliable predictions.
Table of Contents (16 chapters)

Section 1: Getting Started
Section 2: Deep Learning Applications
Section 3: Advanced Applications

Implementing MTL

Now, we will look in more detail at what we need to do to implement a multi-task learning (MTL) task.

There are different ways to implement MTL. Two commonly used methods are as follows:

  • Hard parameter sharing: This is the most common way to implement MTL. It consists of sharing some of the hidden layers across all tasks, while other layers are kept specific to each individual task.

The main advantage of this method is that it reduces the risk of overfitting. Overfitting is a particular problem for NNs, but in this case, the more tasks we train on, the lower the danger of overfitting. The reason is quite clear: overfitting means producing a solution that is too specific to the dataset we provide, whereas here, by design, we have a more generic objective and a more varied dataset.

  • Soft parameter sharing: With soft parameter sharing, we have one model, but each task has its own parameters. In...
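To make hard parameter sharing concrete, here is a minimal NumPy sketch of the idea: two tasks flow through the same shared hidden layer, then split into task-specific output heads. The layer sizes and the two-task setup (one regression head, one three-class head) are illustrative choices, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared hidden layer (hard parameter sharing): one weight matrix
# that every task reuses.
W_shared = rng.normal(size=(10, 32))

# Task-specific heads: each task keeps its own output weights.
W_task_a = rng.normal(size=(32, 1))   # e.g. a regression head
W_task_b = rng.normal(size=(32, 3))   # e.g. a 3-class classification head

def forward(x):
    # Both tasks pass through the same shared representation.
    h = np.maximum(0, x @ W_shared)   # shared ReLU layer
    out_a = h @ W_task_a              # task A output
    out_b = h @ W_task_b              # task B logits
    return out_a, out_b

x = rng.normal(size=(4, 10))
out_a, out_b = forward(x)
print(out_a.shape, out_b.shape)  # (4, 1) (4, 3)
```

During training, gradients from both task losses would update `W_shared`, which is what regularizes the shared representation, while each head is updated only by its own task's loss.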