In this chapter, we learned how conditional independence properties allow a joint distribution to be represented compactly as a Bayes network. We then took a tour of the types of reasoning and understood how influence can flow through a Bayes network, and we explored the same concepts using libpgm. Finally, we used a simple Bayes network (Naive Bayes) to solve the real-world problem of text classification.
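As a quick recap of the Naive Bayes idea, the following is a minimal, self-contained sketch (not the chapter's exact code) of a multinomial Naive Bayes text classifier with add-one smoothing; the document names, labels, and helper functions here are illustrative assumptions:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(docs):
    """Estimate class log-priors and per-class word log-likelihoods
    (with add-one smoothing) from (text, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in docs:
        class_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    total_docs = sum(class_counts.values())
    model = {}
    for label in class_counts:
        log_prior = math.log(class_counts[label] / total_docs)
        total_words = sum(word_counts[label].values())
        # Smoothed log P(word | class); also keep a log-prob for unseen words.
        log_likelihood = {
            w: math.log((word_counts[label][w] + 1) / (total_words + len(vocab)))
            for w in vocab
        }
        unseen = math.log(1 / (total_words + len(vocab)))
        model[label] = (log_prior, log_likelihood, unseen)
    return model

def classify(model, text):
    """Pick the class maximizing log P(class) + sum of log P(word | class)."""
    scores = {}
    for label, (log_prior, log_likelihood, unseen) in model.items():
        scores[label] = log_prior + sum(
            log_likelihood.get(w, unseen) for w in text.lower().split())
    return max(scores, key=scores.get)

# Toy example (hypothetical data, for illustration only):
docs = [
    ("free money now", "spam"),
    ("win prize free", "spam"),
    ("project meeting today", "ham"),
    ("lunch meeting schedule", "ham"),
]
model = train_naive_bayes(docs)
print(classify(model, "free prize"))  # → spam
```

The "naive" assumption is exactly the conditional independence structure discussed in this chapter: given the class, every word is independent of every other word, so the joint likelihood factorizes into a product of per-word terms.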
In the next chapter, we shall learn about undirected graphical models, also known as Markov networks.