Hands-On Unsupervised Learning with Python

By: Giuseppe Bonaccorso
3.7 (3)
Overview of this book

Unsupervised learning is about applying learning algorithms to raw, untagged data so that a machine can discover structure in it. With this book, you will use Python to explore unsupervised learning for clustering large datasets and analyzing them iteratively until the desired outcome is found. The book starts with the key differences between supervised, unsupervised, and semi-supervised learning. You will be introduced to the most widely used libraries and frameworks from the Python ecosystem and will address unsupervised learning in both the machine learning and deep learning domains. You will explore the various algorithms and techniques used to implement unsupervised learning in real-world use cases, covering approaches including randomized optimization, clustering, feature selection and transformation, and information theory. You will get hands-on experience with how neural networks can be employed in unsupervised scenarios, and you will also explore the steps involved in building and training a GAN to process images. By the end of this book, you will have learned the art of unsupervised learning for different real-world challenges.
Table of Contents (12 chapters)
Summary

In this chapter, we presented different techniques that can be employed for both dimensionality reduction and dictionary learning. PCA is a very well-known method that finds the most important components of a dataset, associated with the directions where the variance is largest. This method has the double effect of diagonalizing the covariance matrix and providing an immediate measure of the importance of each feature, so as to simplify the selection and maximize the residual explained variance (the amount of variance that can be explained with a smaller number of components). As PCA is intrinsically a linear method, it often cannot be employed with non-linear datasets. For this reason, a kernel-based variant has been developed. In our example, you saw how an RBF kernel is able to project a non-linearly separable dataset onto a subspace, where PCA can...
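As a minimal sketch of the contrast described above, the following snippet compares linear PCA with RBF-kernel PCA on a synthetic non-linearly separable dataset (concentric circles). The dataset and the gamma value are illustrative assumptions, not values taken from the chapter:

```python
# Illustrative sketch: linear PCA vs. RBF-kernel PCA on concentric circles.
# make_circles and gamma=10.0 are assumptions chosen for demonstration only.
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

# A dataset that is not linearly separable in its original space
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

# Linear PCA: diagonalizes the covariance matrix;
# explained_variance_ratio_ measures each component's share of the variance
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)
print(pca.explained_variance_ratio_)

# Kernel PCA with an RBF kernel: projects the data onto a subspace
# where the two circles become far easier to separate linearly
kpca = KernelPCA(n_components=2, kernel='rbf', gamma=10.0)
X_kpca = kpca.fit_transform(X)
```

Because linear PCA only rotates the axes, the two circles remain entangled in `X_pca`, whereas the kernel projection in `X_kpca` disentangles them, which is the behavior the chapter's example demonstrated.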
