So far, we have been discussing how to construct Markov chains. In this section, we will see how to apply these concepts to our graphical models. In probabilistic models, we usually want to compute the posterior probability P(Y|E = e), and to sample from this posterior we will have to construct a Markov chain whose stationary distribution is P(Y|E = e). So, the states of this Markov chain should be instantiations x of the unobserved variables, and the chain should converge to P(Y|E = e).
So, for a state x = (x_i, x_{-i}) in the Markov chain, where x_{-i} denotes the values of all variables other than X_i, we define the kernel for resampling X_i as follows: T((x_i, x_{-i}) → (x_i', x_{-i})) = P(x_i' | x_{-i}, e).
We can see that this transition probability doesn't depend on the current value x_i but only on the remaining state x_{-i}. Now, it's easy to show that the posterior distribution is a stationary distribution of this process: if (x_i, x_{-i}) is distributed according to P(· | e), then summing out the old value x_i and resampling x_i' from P(x_i' | x_{-i}, e) yields P(x_{-i} | e) P(x_i' | x_{-i}, e) = P(x_i', x_{-i} | e), so the state after the step is still distributed according to the posterior.
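To make this concrete, here is a minimal sketch of Gibbs sampling on a hypothetical three-variable chain X1 → X2 → X3 with binary variables and evidence X3 = 1 (the model and all CPT values are made up for illustration, not taken from the text). Each step resamples one variable from its conditional given the rest of the state; because the model is tiny, we can also compute the exact posterior by brute force and check that the empirical sample frequencies converge to it.

```python
import itertools
import random

# Hypothetical CPTs for the chain X1 -> X2 -> X3 (values chosen arbitrarily).
p_x1 = {0: 0.6, 1: 0.4}                      # P(X1)
p_x2_given_x1 = {0: {0: 0.7, 1: 0.3},        # P(X2 | X1)
                 1: {0: 0.2, 1: 0.8}}
p_x3_given_x2 = {0: {0: 0.9, 1: 0.1},        # P(X3 | X2)
                 1: {0: 0.4, 1: 0.6}}

EVIDENCE_X3 = 1  # observed value e of X3

def unnorm(x1, x2):
    """Unnormalized posterior weight P(x1, x2, X3 = e)."""
    return p_x1[x1] * p_x2_given_x1[x1][x2] * p_x3_given_x2[x2][EVIDENCE_X3]

# Exact posterior P(X1, X2 | X3 = e) by brute-force normalization.
states = list(itertools.product([0, 1], repeat=2))
z = sum(unnorm(a, b) for a, b in states)
exact = {(a, b): unnorm(a, b) / z for a, b in states}

def gibbs(n_samples, burn_in=1000, seed=0):
    rng = random.Random(seed)
    x1, x2 = 0, 0  # arbitrary initial state
    counts = {s: 0 for s in states}
    for t in range(burn_in + n_samples):
        # Resample X1 from P(X1 | x2, e): depends only on the rest of the state.
        w0, w1 = unnorm(0, x2), unnorm(1, x2)
        x1 = 0 if rng.random() < w0 / (w0 + w1) else 1
        # Resample X2 from P(X2 | x1, e).
        w0, w1 = unnorm(x1, 0), unnorm(x1, 1)
        x2 = 0 if rng.random() < w0 / (w0 + w1) else 1
        if t >= burn_in:
            counts[(x1, x2)] += 1
    return {s: c / n_samples for s, c in counts.items()}

empirical = gibbs(100_000)
for s in states:
    print(s, round(exact[s], 3), round(empirical[s], 3))
```

Note that each conditional is computed by normalizing the unnormalized joint over the single variable being resampled, which is exactly why the kernel only needs the remaining state x_{-i}.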
In graphical models, Gibbs sampling can be implemented very easily whenever we can compute this transition probability efficiently. We already know the following: