scikit-learn Cookbook - Third Edition

By: John Sukup
Overview of this book

Trusted by data scientists, ML engineers, and software developers alike, scikit-learn offers a versatile, user-friendly framework for implementing a wide range of ML algorithms, enabling the efficient development and deployment of predictive models in real-world applications.

This third edition of scikit-learn Cookbook will help you master ML with real-world examples and scikit-learn 1.5 features. This updated edition takes you on a journey from understanding the fundamentals of ML and data preprocessing, through implementing advanced algorithms and techniques, to deploying and optimizing ML models in production. Along the way, you'll explore practical, step-by-step recipes that cover everything from feature engineering and model selection to hyperparameter tuning and model evaluation, all using scikit-learn.

By the end of this book, you'll have gained the knowledge and skills needed to confidently build, evaluate, and deploy sophisticated ML models using scikit-learn, ready to tackle a wide range of data-driven challenges.
Table of Contents (17 chapters)
Random Forests and Bagging

While building a single decision tree model is intuitive, most real-world applications use decision trees only as components of an ensemble method because of their shortcomings – especially their tendency to overfit. As the saying goes, "two heads (or, in this case, trees) are better than one!"
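To see the overfitting problem concretely, here is a minimal sketch (on synthetic data, not a dataset from this recipe) showing how an unconstrained decision tree memorizes its training set while generalizing less well:

```python
# Illustrative only: an unpruned decision tree fits the training data
# almost perfectly but scores lower on held-out data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(random_state=0)  # no depth limit or pruning
tree.fit(X_train, y_train)

print(f"Train accuracy: {tree.score(X_train, y_train):.2f}")  # near-perfect
print(f"Test accuracy:  {tree.score(X_test, y_test):.2f}")   # noticeably lower
```

The gap between the two scores is the overfitting that ensembles are designed to shrink.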

Cleverly named random forests are robust ensemble models that combine multiple decision trees to improve accuracy and reduce overfitting. They achieve this by employing a method known as bagging (Bootstrap Aggregating), where multiple trees are trained on different subsets of the data sampled with replacement. Each tree contributes a prediction vote, and the final prediction is based on the majority vote or average of predictions from all trees. Random forests handle large datasets and complex feature interactions better than a single decision tree alone. This recipe will introduce you to ensemble methods.
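The bagging idea above can be sketched with scikit-learn's RandomForestClassifier on the same kind of synthetic data (an illustration, not this recipe's actual code):

```python
# Sketch of bagging with a random forest: each tree is trained on a
# bootstrap sample (drawn with replacement) of the training data, and the
# forest combines the trees' predictions (scikit-learn averages the trees'
# class probabilities rather than taking a hard majority vote).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(
    n_estimators=100,   # number of trees in the ensemble
    bootstrap=True,     # sample with replacement for each tree (bagging)
    random_state=0,
)
forest.fit(X_train, y_train)
print(f"Test accuracy: {forest.score(X_test, y_test):.2f}")
```

Because each tree sees a different bootstrap sample (and a random subset of features at each split), their individual errors partly cancel out when aggregated, which is why the forest typically generalizes better than any single tree.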

Getting ready

We will utilize scikit-learn to demonstrate...
