Mastering Probabilistic Graphical Models with Python

By: Ankur Ankan

Using a Markov chain


So far, we have been discussing how to construct Markov chains. In this section, we will see how to apply these concepts to our graphical models. In the case of probabilistic models, we usually want to compute the posterior probability P(Y|E = e), and to sample from this posterior distribution, we have to construct a Markov chain whose stationary distribution is P(Y|E = e). So, the states of this Markov chain should be instantiations x of the variables, and the chain should converge to the stationary distribution π(x) = P(x|E = e).

So, for a state (x_{-i}, x_i) of the Markov chain, we define the kernel as follows:

T((x_{-i}, x_i) → (x_{-i}, x'_i)) = P(x'_i | x_{-i})

We can see that this transition probability doesn't depend on the current value x_i of X_i, but only on the remaining state x_{-i}. Now, it is easy to show that the posterior distribution is a stationary distribution of this process.
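As a concrete illustration of this kernel, the following is a minimal sketch (not the book's implementation) of a systematic-scan Gibbs sampler for a hypothetical joint distribution over two binary variables; each sweep resamples every variable from its conditional given the rest, which is exactly the transition probability defined above, and the empirical distribution of the samples approaches the target joint.

import numpy as np

# A hypothetical joint distribution P(X0, X1) over two binary variables, used
# only to illustrate the Gibbs kernel; any strictly positive table would do.
joint = np.array([[0.30, 0.20],
                  [0.10, 0.40]])   # joint[x0, x1]

def gibbs_sweep(state, rng):
    """One systematic scan: resample each X_i from P(X_i | x_{-i})."""
    state = list(state)
    for i in range(2):
        # P(X_i | x_{-i}) is proportional to the joint with x_{-i} held fixed.
        cond = joint[:, state[1]] if i == 0 else joint[state[0], :]
        cond = cond / cond.sum()
        state[i] = rng.choice(2, p=cond)
    return tuple(state)

rng = np.random.default_rng(0)
state = (0, 0)
counts = np.zeros((2, 2))
for t in range(20000):
    state = gibbs_sweep(state, rng)
    if t >= 1000:                     # discard burn-in samples
        counts[state] += 1

# The normalized counts approach the joint table above, which shows that the
# target distribution is indeed a stationary distribution of this chain.
print(counts / counts.sum())

A random-scan variant, which picks the variable to resample uniformly at random at each step instead of sweeping in order, has the same stationary distribution.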

In graphical models, Gibbs sampling can be implemented very easily in cases where we can compute this transition probability efficiently. We already know the following:

P(x_i | x_{-i}) = P(x_i, x_{-i}) / P(x_{-i}) = P(x_i, x_{-i}) / Σ_{x'_i} P(x'_i, x_{-i})

Let x_{-i} denote the assignment...
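The reason this conditional is cheap to compute in a graphical model is that every factor whose scope does not contain X_i cancels between the numerator and the denominator above, so only the factors over X_i and its Markov blanket are needed. The following is a minimal sketch of this computation, assuming a hypothetical representation of factors as (scope, table) pairs rather than pgmpy's factor classes, and a hypothetical chain A - B - C as the example model.

import numpy as np

def local_conditional(i, assignment, factors, card):
    """
    Compute P(X_i | x_{-i}) using only the factors whose scope contains X_i.

    factors:    list of (scope, table) pairs; scope is a tuple of variable
                indices and table is an ndarray indexed in that order.
    assignment: current full assignment as a dict {variable index: value}.
    card:       number of states of X_i.
    """
    unnorm = np.ones(card)
    for value in range(card):
        local = {**assignment, i: value}
        for scope, table in factors:
            if i in scope:                       # factors without X_i cancel
                unnorm[value] *= table[tuple(local[v] for v in scope)]
    return unnorm / unnorm.sum()

# Hypothetical chain A - B - C (variables 0, 1, 2) with pairwise factors.
phi_ab = np.array([[3.0, 1.0], [1.0, 3.0]])
phi_bc = np.array([[2.0, 1.0], [1.0, 2.0]])
factors = [((0, 1), phi_ab), ((1, 2), phi_bc)]

# P(B | A = 0, C = 0) needs both factors, since both of them mention B.
assignment = {0: 0, 1: 1, 2: 0}
print(local_conditional(1, assignment, factors, card=2))   # ~[0.857, 0.143]

In pgmpy itself, the factors would be factor objects attached to the model rather than raw arrays, but the cancellation argument and the resulting update are the same.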