Bayesian Analysis with Python - Second Edition

By: Osvaldo Martin
Overview of this book

The second edition of Bayesian Analysis with Python is an introduction to the main concepts of applied Bayesian inference and its practical implementation in Python using PyMC3, a state-of-the-art probabilistic programming library, and ArviZ, a new library for exploratory analysis of Bayesian models. The main concepts of Bayesian statistics are covered using a practical and computational approach. Synthetic and real data sets are used to introduce several types of models, such as generalized linear models for regression and classification, mixture models, hierarchical models, and Gaussian processes, among others. By the end of the book, you will have a working knowledge of probabilistic modeling and you will be able to design and implement Bayesian models for your own data science problems. After reading the book you will be better prepared to delve into more advanced material or specialized statistical modeling if you need to.
Table of Contents (11 chapters)

Summary

A simple linear regression is a model that can be used to predict or explain one variable from another. In machine learning terms, this is a case of supervised learning. From a probabilistic perspective, a linear regression model is an extension of the Gaussian model in which the mean is not estimated directly but computed as a linear function of a predictor variable and some additional parameters. While the Gaussian distribution is the most common choice for the dependent variable, we are free to choose other distributions. One alternative, which is especially useful when dealing with potential outliers, is the Student's t-distribution. In the next chapter, we will explore other alternatives.

In this chapter, we also discussed the Pearson correlation coefficient, the most common measure of linear correlation between two variables, and we learned...
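As a quick refresher, the Pearson coefficient can be computed with NumPy; the paired measurements below are made up for illustration.

```python
import numpy as np

# Hypothetical paired measurements with a near-perfect linear relationship
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

# Pearson r: covariance of x and y divided by the product of
# their standard deviations; np.corrcoef returns the full matrix
r = np.corrcoef(x, y)[0, 1]
```

Values close to 1 or -1 indicate a strong linear relationship; values near 0 indicate little or no linear correlation.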