In this chapter we worked with simple yet powerful Bayesian models, each of which has a representation as a probabilistic graphical model. We saw a Bayesian treatment of the overfitting problem through the use of priors, as in the Dirichlet-multinomial and the well-known Beta-Binomial models.
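As a reminder of how such a prior regularizes, here is a minimal sketch of the Beta-Binomial conjugate update; the hyperparameters and observed counts are illustrative assumptions, not values from the chapter:

```python
# Beta-Binomial conjugate update: a Beta(alpha, beta) prior on a coin's bias,
# combined with Binomial data, yields another Beta distribution as posterior.

def beta_binomial_posterior(alpha, beta, heads, tails):
    """Return the posterior hyperparameters after observing the flips."""
    return alpha + heads, beta + tails

# Weak uniform prior Beta(1, 1); suppose we observe 7 heads and 3 tails.
a_post, b_post = beta_binomial_posterior(1.0, 1.0, heads=7, tails=3)

# The posterior mean pulls the raw estimate 0.7 toward the prior mean 0.5 --
# this shrinkage is the regularizing, anti-overfitting effect of the prior.
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8.0 4.0 0.666...
```

With more data the likelihood dominates and the posterior mean approaches the empirical frequency, so the prior matters most exactly when data is scarce and overfitting is most dangerous.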
The last section introduced another graphical model, the Gaussian mixture, which was around before the invention of probabilistic graphical models. It is an important model for capturing data that comes from different subpopulations within the same dataset. Finally, we saw another application of the EM algorithm: learning such models by estimating the parameters of each Gaussian component.
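The EM recipe for a Gaussian mixture alternates between computing responsibilities (E-step) and re-estimating each component's parameters (M-step). The sketch below shows this for a one-dimensional, two-component mixture; the synthetic data and initial guesses are assumptions chosen for illustration:

```python
import numpy as np

# Synthetic 1-D data drawn from two Gaussians (assumed values for illustration).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 1.0, 300)])

# Initial guesses for mixture weights, means, and standard deviations.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

def gauss_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2) at the points x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: responsibilities r[n, k] = p(component k | x_n).
    dens = np.stack([p * gauss_pdf(x, m, s)
                     for p, m, s in zip(pi, mu, sigma)], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)

    # M-step: responsibility-weighted updates of weight, mean, and spread.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.sort(mu))  # recovered means, close to the true values -2 and 3
```

Each iteration cannot decrease the data log-likelihood, which is why this simple alternation converges to a (local) maximum of the mixture likelihood.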
Of course, the Gaussian mixture is not the only latent variable model; latent variables appear throughout Bayesian models and the probabilistic graphical model framework.
In the next chapter, we will continue our study of inference algorithms for Bayesian models and probabilistic graphical models with the introduction of a...