Smarter Decisions - The Intersection of Internet of Things and Decision Science

By : Jojo Moolayil

Overview of this book

With an increasing number of devices getting connected to the Internet, massive amounts of data are being generated that can be used for analysis. This book helps you understand the Internet of Things and decision science in depth and solve business use cases. With IoT, both the frequency and the impact of problems are huge, and addressing a problem of such scale requires a very structured approach. The entire journey of addressing a problem — defining it, designing the solution, and executing it using decision science — is articulated in this book through engaging and easy-to-understand business use cases. You will get a detailed understanding of IoT, decision science, and the art of solving a business problem in IoT through decision science. By the end of this book, you'll understand the complex aspects of decision making in IoT and will be able to take that knowledge with you onto whatever project calls for it.

Ensemble modeling - random forest


Random forest is an extremely popular machine learning technique, used mainly for classification and regression. Because the algorithm builds on decision trees, we have already covered a substantial part of the foundation it requires. Let's quickly walk through the algorithm and then use it to improve on our earlier solution.

What is random forest?

Random forest is a machine learning technique built on the principle of ensemble modeling. It builds an ensemble of decision trees, each grown with a randomly chosen subset of the features; hence the name: Random + Forest. Random forest is essentially an extension of the bagging algorithm. In bagging, we build multiple decision trees, each trained on a bootstrap sample drawn with replacement from the full training set. Random forest takes this injection of randomness one step further: from the entire list of features, only a predefined number of features are chosen at random for each tree (in Breiman's original formulation, a fresh random subset is drawn at each split)...
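The two ingredients described above — bootstrap sampling with replacement, plus a random feature subset per tree — can be sketched in a few lines of Python. This is a toy illustration, not the book's code: it uses one-split "decision stumps" as the ensemble members and a made-up dataset, whereas a production implementation (such as scikit-learn's `RandomForestClassifier`) grows full trees and re-samples features at every split.

```python
import random
from collections import Counter

def fit_stump(X, y, feats):
    """Fit a one-split decision tree ("stump"), considering only `feats`."""
    best, best_err = None, float("inf")
    for f in feats:
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            lmaj = Counter(left).most_common(1)[0][0]   # majority class, left side
            rmaj = Counter(right).most_common(1)[0][0]  # majority class, right side
            err = sum(v != lmaj for v in left) + sum(v != rmaj for v in right)
            if err < best_err:
                best, best_err = (f, t, lmaj, rmaj), err
    return best

def fit_forest(X, y, n_trees=31, n_feats=1, seed=7):
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        # 1) Bagging: a bootstrap sample drawn with replacement
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        # 2) The random-forest twist: a random subset of features for this tree
        feats = rng.sample(range(len(X[0])), n_feats)
        stump = fit_stump([X[i] for i in idx], [y[i] for i in idx], feats)
        if stump is not None:
            forest.append(stump)
    return forest

def predict(forest, row):
    """Aggregate the ensemble by majority vote."""
    votes = [lmaj if row[f] <= t else rmaj for f, t, lmaj, rmaj in forest]
    return Counter(votes).most_common(1)[0][0]

# Toy data: two informative features, two well-separated classes
X = [[1, 2], [2, 1], [3, 3], [4, 2], [7, 8], [8, 9], [9, 7], [10, 8]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
forest = fit_forest(X, y)
print(predict(forest, [2, 2]), predict(forest, [9, 8]))  # → 0 1
```

Note the design choice: any single tree here is weak (a stump sees one random feature of one noisy bootstrap sample), but the majority vote over many decorrelated trees is robust — which is exactly the argument for ensembling over a single decision tree.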