Machine Learning Quick Reference

By: Rahul Kumar
Overview of this book

Machine learning makes it possible to learn about the unknowns and gain hidden insights into your datasets by mastering many tools and techniques. This book guides you to do just that in a very compact manner. After giving a quick overview of what machine learning is all about, Machine Learning Quick Reference jumps right into its core algorithms and demonstrates how they can be applied to real-world scenarios. From model evaluation to performance optimization, this book introduces you to the best practices in machine learning. Furthermore, you will look at more advanced aspects, such as training neural networks and working with different kinds of data, including text, time-series, and sequential data. Advanced methods and techniques, such as causal inference and deep Gaussian processes, are also covered. By the end of this book, you will be able to train fast, accurate machine learning models and use this book as a handy point of reference.

Graphical causal models


This model was covered in detail in Chapter 8, Probabilistic Graphical Models; we will look at it briefly here as well.

Bayesian networks are directed acyclic graphs (DAGs) whose nodes represent variables of interest (for example, the temperature of a device, the gender of a patient, a feature of an object, or the occurrence of an event). Causal influences among the variables are represented by links, and the strength of an influence is portrayed by the conditional probabilities attached to each parent-child cluster of nodes in the network. The following diagram shows a causal model consisting of nodes and edges:

The nodes represent the variables and the edges stand for the conditional relationships between the variables. What we are looking for is the full joint probability distribution, which the graph factorizes into these conditional dependencies. For example, rain causes the ground to be wet, whereas winning the lottery has nothing to do with the other variables.
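This factorization can be sketched in plain Python. The variable names and probability values below are illustrative assumptions (they are not taken from the book's figure): Rain influences WetGround, while Lottery is independent of both, so the full joint factorizes as P(R, W, L) = P(R) · P(W | R) · P(L).

```python
# Minimal Bayesian-network sketch: Rain -> WetGround, Lottery independent.
# All probabilities here are made-up illustrative values.

# Prior probabilities for the root nodes (no parents)
P_rain = {True: 0.2, False: 0.8}
P_lottery = {True: 0.001, False: 0.999}

# Conditional probability table: P(WetGround | Rain)
P_wet_given_rain = {
    True:  {True: 0.9,  False: 0.1},   # it rained
    False: {True: 0.05, False: 0.95},  # it did not rain
}

def joint(rain, wet, lottery):
    """Full joint distribution via the chain-rule factorization
    implied by the DAG: P(R, W, L) = P(R) * P(W | R) * P(L)."""
    return P_rain[rain] * P_wet_given_rain[rain][wet] * P_lottery[lottery]

# Sanity check: summing over every assignment must give 1
total = sum(joint(r, w, l)
            for r in (True, False)
            for w in (True, False)
            for l in (True, False))
print(round(total, 10))  # → 1.0
```

Note how the DAG keeps the table small: instead of storing 2^3 = 8 joint entries directly, we only need the two priors and one small conditional table, because Lottery's independence and WetGround's single parent are encoded in the graph structure.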