In this first chapter, we learned the basic concepts of probability.
We saw how and why probabilities are used to represent uncertainty about data and knowledge, and we introduced Bayes' formula. This is the most important formula for computing posterior probabilities—that is, for updating our beliefs and knowledge about a fact when new data is available.
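As a quick refresher, the belief update that Bayes' formula performs can be sketched in a few lines of R. The numbers below are hypothetical and chosen only for illustration (they do not come from the chapter): a rare condition with a 1% prior, a test with 95% sensitivity and a 5% false-positive rate.

```r
# Hedged sketch of Bayes' formula with made-up numbers (assumptions, not from the text)
prior     <- 0.01  # P(disease): prior belief before seeing any test result
sens      <- 0.95  # P(positive | disease): likelihood of a true positive
false_pos <- 0.05  # P(positive | no disease): false-positive rate

# Marginal probability of observing a positive test, P(positive)
evidence <- sens * prior + false_pos * (1 - prior)

# Bayes' formula: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
posterior <- sens * prior / evidence
print(posterior)  # roughly 0.161: the new data raises the belief from 1% to about 16%
```

The point of the sketch is the mechanics of the update: the prior is reweighted by the likelihood and renormalized by the evidence, which is exactly what posterior computation means.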
We saw what a joint probability distribution is and learned that such distributions can quickly become too complex and intractable to deal with. We learned the basics of probabilistic graphical models as a generic framework for tractable, efficient, and easy modeling with probabilistic models. Finally, we introduced the different types of probabilistic graphical models and learned how to use R packages to write our first models.
In the next chapter, we will learn the first set of algorithms for performing Bayesian inference with probabilistic graphical models—that is, for posing questions and queries to our models. We will introduce new features of the R packages and, at the same time, learn how these algorithms work and how to use them efficiently.