Index
A
- ANalysis Of VAriance (ANOVA)
- about / Generalized linear models
- automatic differentiation variational inference (ADVI) / Variational methods
B
- bandwidth
- about / The Gaussian kernel
- Bayes factors
- about / Bayes factors
- analogy, with information criteria / Analogy with information criteria
- computing / Computing Bayes factors
- common problems, when computing / Common problems computing Bayes factors
- and information criteria / Bayes factors and information criteria
- Bayesian analysis
- communicating / Communicating a Bayesian analysis
- Bayesian information criterion (BIC) / Bayesian information criterion
- Bayesian maximum a posteriori (MAP) estimation / Akaike information criterion
- Bayes theorem
- and statistical inference / Bayes' theorem and statistical inference
- beta-binomial
C
- central limit theorem (CLT)
- centroids
- about / Kernelized linear regression
- Cholesky decomposition
- about / Making predictions from a GP
- Cohen's d
- about / The tips dataset, Cohen's d
- reference / Cohen's d
- coin-flipping problem
- about / The coin-flipping problem
- general model / The general model
- likelihood, selecting / Choosing the likelihood
- prior, selecting / Choosing the prior
- posterior, getting / Getting the posterior
- posterior, computing / Computing and plotting the posterior
- posterior, plotting / Computing and plotting the posterior
- prior, influence / Influence of the prior and how to choose one
- confounding variables
- continuous mixtures
- about / Continuous mixtures
- beta-binomial / Beta-binomial and negative binomial
- negative binomial / Beta-binomial and negative binomial
- Student's t-distribution / The Student's t-distribution
- continuous variables / Probability distributions
- correlated variables
- reference link / Correlation, causation, and the messiness of life
- count data
- modeling, with Poisson distribution / The Poisson distribution
- modeling, with Zero-Inflated Poisson (ZIP) model / The Zero-Inflated Poisson model
- Poisson regression / Poisson regression and ZIP regression
- ZIP regression / Poisson regression and ZIP regression
- covariance function
- about / Gaussian processes
- covariance matrix
- about / Pearson coefficient from a multivariate Gaussian
- building / Building the covariance matrix
- sampling, from GP prior / Sampling from a GP prior
- parameterized kernel, using / Using a parameterized kernel
- Cromwell's rule
- URL / Exercises
- cross-validation
- about / Cross-validation
D
- degree of freedom / The Student's t-distribution
- detailed balance condition
- about / Markov chain
- deviance / The log-likelihood and the deviance
- deviance information criterion (DIC) / Deviance information criterion
- Dirichlet distribution
- about / How to build mixture models
- Dirichlet process
- reference link / Non-fixed component clustering
- discrete variables / Probability distributions
- discriminative model
- Dutch book
- URL / Exercises
E
- effect size / Comparing groups
- Evidence Lower Bound (ELBO) / Variational methods
- experimental design / Exploratory data analysis
- Exploratory Data Analysis (EDA) / Exploratory data analysis
F
- fixed component clustering
- about / Fixed component clustering
- non-fixed component clustering / Non-fixed component clustering
- Function-space view
- about / Kernelized linear regression
G
- gamma-Poisson continuous mixture model
- Gaussian inferences
- about / Gaussian inferences
- Gaussian kernel
- about / The Gaussian kernel
- Gaussian mixture model
- about / Mixture models
- Gaussian Process (GP)
- about / Gaussian processes
- predictions, determining / Making predictions from a GP
- implementing, with PyMC3 / Implementing a GP using PyMC3
- posterior predictive checks, performing / Posterior predictive checks
- periodic kernel / Periodic kernel
- Gaussian processes
- about / Gaussian processes
- covariance matrix, building / Building the covariance matrix
- Gaussians
- Gedanken experiment / Hamiltonian Monte Carlo/NUTS
- generalized linear models (GLMs)
- about / Generalized linear models
- generalized linear model (GLM)
- about / The GLM module
- generative classifier
- generative model
- GLM module
- about / The GLM module
- grid computing / Grid computing
- groups
- comparing / Comparing groups
- tips dataset / The tips dataset
- Cohen's d / Cohen's d
- probability of superiority / Probability of superiority
H
- Hamiltonian Monte Carlo/NUTS / Hamiltonian Monte Carlo/NUTS
- hard-clustering
- about / Model-based clustering
- hierarchical linear regression
- about / Hierarchical linear regression
- correlation, predicting / Correlation, causation, and the messiness of life
- causation, predicting / Correlation, causation, and the messiness of life
- hierarchical models
- about / Hierarchical models, Regularizing priors and hierarchical models
- shrinkage / Shrinkage
- Highest Posterior Density (HPD)
- Hybrid Monte Carlo / Hamiltonian Monte Carlo/NUTS
- hyper-parameters
- about / Hierarchical models, Using a parameterized kernel
- hyper-priors
- about / Hierarchical models
I
- inference button
- pushing / Pushing the inference button
- inference engines
- about / Inference engines
- Non-Markovian methods / Non-Markovian methods
- Markovian methods / Markovian methods
- inferential statistics / Inferential statistics
- information criteria
- about / Information criteria
- log-likelihood / The log-likelihood and the deviance
- deviance / The log-likelihood and the deviance
- Akaike information criterion / Akaike information criterion
- deviance information criterion (DIC) / Deviance information criterion
- widely available information criterion (WAIC) / Widely available information criterion
- Pareto smoothed importance sampling (PSIS) / Pareto smoothed importance sampling leave-one-out cross-validation
- Bayesian information criterion (BIC) / Bayesian information criterion
- computing, PyMC3 used / Computing information criteria with PyMC3
- Information Theory
- about / Information criteria
- iris dataset
- about / The iris dataset
K
- K-fold cross-validation
- about / Cross-validation
- kernel-based models
- about / Kernel-based models
- Gaussian kernel / The Gaussian kernel
- kernelized linear regression / Kernelized linear regression
- overfitting / Overfitting and priors
- priors / Overfitting and priors
- kernel density estimation (KDE)
- about / Mixture models, Convergence
- kernelized linear regression
- about / Kernelized linear regression
- knots
- about / Kernelized linear regression
- Kronecker delta
- about / Using a parameterized kernel
L
- Laplace method / Quadratic method
- Large Hadron Collider (LHC) / Exploratory data analysis
- latent variable
- about / How to build mixture models
- leave-one-out cross-validation (LOOCV)
- about / Cross-validation
- linear discriminant analysis (LDA)
- logistic model
- about / The logistic model
- logistic regression
- about / Logistic regression
- logistic model / The logistic model
- iris dataset / The iris dataset
- logistic model, applied to iris dataset / The logistic model applied to the iris dataset
- predictions, making / Making predictions
M
- machine learning (ML)
- and simple linear regression / The machine learning connection
- magnetic resonance imaging (MRI) / Nuisance parameters and marginalized distributions
- marginalized distributions
- marginalized Gaussian mixture model
- Markov Chain
- about / Markov chain
- Markov Chain Monte Carlo (MCMC) methods / Inference engines
- Markovian methods
- about / Markovian methods
- Monte Carlo / Monte Carlo
- Markov Chain / Markov chain
- Metropolis-Hastings / Metropolis-Hastings
- Hamiltonian Monte Carlo/NUTS / Hamiltonian Monte Carlo/NUTS
- other MCMC methods / Other MCMC methods
- mean function
- about / Gaussian processes
- Metropolis-Hastings algorithm / Markov chain, Metropolis-Hastings
- Metropolis-coupled MCMC / Other MCMC methods
- mixture models
- about / Mixture models
- building / How to build mixture models
- marginalized Gaussian mixture model / Marginalized Gaussian mixture model
- count data / Mixture models and count data
- robust logistic regression / Robust logistic regression
- model
- notation and visualization / Model notation and visualization
- model-based clustering
- about / Model-based clustering
- fixed component clustering / Fixed component clustering
- model averaging / Interpreting and using information criteria measures
- model selection / Interpreting and using information criteria measures
- Monte Carlo
- about / Monte Carlo
- multiple linear regression
- about / Multiple linear regression
- confounding variables / Confounding variables and redundant variables, Multicollinearity or when the correlation is too high
- redundant variables / Confounding variables and redundant variables, Multicollinearity or when the correlation is too high
- effect variables, masking / Masking effect variables
- interactions, adding / Adding interactions
- multiple logistic regression
- about / Multiple logistic regression
- boundary decision / The boundary decision
- model, implementing / Implementing the model
- correlated variables, dealing with / Dealing with correlated variables
- unbalanced classes, dealing with / Dealing with unbalanced classes
- problem, solving / How do we solve this problem?
- coefficients, interpreting / Interpreting the coefficients of a logistic regression
- generalized linear models / Generalized linear models
- softmax regression / Softmax regression or multinomial logistic regression
- multinomial logistic regression / Softmax regression or multinomial logistic regression
- multivariate Gaussian
- about / Gaussian processes
- Pearson correlation coefficient, computing / Pearson coefficient from a multivariate Gaussian
N
- negative binomial
- No-U-Turn Sampler (NUTS) / Hamiltonian Monte Carlo/NUTS
- non-fixed component clustering
- about / Non-fixed component clustering
- Non-Markovian methods
- about / Non-Markovian methods
- grid computing / Grid computing
- quadratic method / Quadratic method
- variational methods / Variational methods
- non-parametric statistics
- about / Non-parametric statistics
- Normal distribution / The Student's t-distribution
- normality parameter / The Student's t-distribution
- nuisance parameters
O
- Occam's razor
- about / Occam's razor – simplicity and accuracy
- simplicity / Occam's razor – simplicity and accuracy
- accuracy / Occam's razor – simplicity and accuracy
- too many parameters, leading to overfitting / Too many parameters leads to overfitting
- too few parameters, leading to underfitting / Too few parameters leads to underfitting
- simplicity and accuracy, balancing / The balance between simplicity and accuracy
- odds
- out-of-sample accuracy
- over-dispersion
- overfitting
P
- parallel tempering / Other MCMC methods
- parameterized kernel
- using / Using a parameterized kernel
- Pareto smoothed importance sampling (PSIS) / Pareto smoothed importance sampling leave-one-out cross-validation
- Pearson correlation coefficient
- about / Pearson correlation coefficient
- reference link / Pearson correlation coefficient
- computing, from multivariate Gaussian / Pearson coefficient from a multivariate Gaussian
- periodic kernel
- about / Periodic kernel
- phylogenetics
- about / Model-based clustering
- Poisson distribution
- about / The Poisson distribution
- Poisson regression
- polynomial regression
- about / Polynomial regression
- parameters, interpreting / Interpreting the parameters of a polynomial regression
- comparison / Polynomial regression – the ultimate model?
- posterior
- summarizing / Summarizing the posterior, Summarizing the posterior
- predictive checks / Posterior predictive checks
- posterior-based decisions / Posterior-based decisions
- Region Of Practical Equivalence (ROPE) / ROPE
- loss functions / Loss functions
- posterior distribution / Bayes' theorem and statistical inference
- posterior predictive checks
- about / Posterior predictive checks
- predictive accuracy measures
- about / Predictive accuracy measures
- cross-validation / Predictive accuracy measures, Cross-validation
- information criteria / Predictive accuracy measures, Information criteria, The log-likelihood and the deviance
- information criteria, computing with PyMC3 / Computing information criteria with PyMC3
- information criteria measures, interpreting / Interpreting and using information criteria measures
- information criteria measures, using / Interpreting and using information criteria measures
- posterior predictive checks / Posterior predictive checks
- priors
- regularizing / Regularizing priors, Regularizing priors and hierarchical models
- about / Overfitting and priors
- probabilistic models / Inferential statistics
- probabilistic programming
- about / Probabilistic programming
- probabilistic programming languages (PPL) / Probabilistic programming
- probabilities
- and uncertainty / Probabilities and uncertainty
- distributions / Probability distributions
- probability of superiority
- PyMC3
- about / PyMC3 introduction
- coin-flipping problem / Coin-flipping, the computational approach
- model specification / Model specification
- inference button, pushing / Pushing the inference button
- samples, diagnosing / Diagnosing the sampling process
- convergence / Convergence
- autocorrelation / Autocorrelation
- effective size / Effective size
- Gaussian Process (GP), implementing / Implementing a GP using PyMC3
- Python packages
- installing / Installing the necessary Python packages
Q
- quadratic discriminant analysis (QDA)
- quadratic method / Quadratic method
R
- random variable / Probability distributions
- redundant variables
- Region Of Practical Equivalence (ROPE) / ROPE
- regularizing priors
- ridge regression
- about / Regularizing priors
- robust estimation
- about / Student's t-distribution
- robust inferences
- about / Robust inferences
- Student's t-distribution / Student's t-distribution
- robust linear regression
- about / Robust linear regression
- robust logistic regression
- about / Robust logistic regression
S
- shrinkage
- about / Shrinkage
- sigmoid function
- about / Logistic regression
- simple linear regression
- about / Simple linear regression
- machine learning (ML) / The machine learning connection
- building / The core of linear regression models
- linear models / Linear models and high autocorrelation
- autocorrelation / Linear models and high autocorrelation
- data, modifying before execution / Modifying the data before running
- sampling method, modifying / Changing the sampling method
- posterior, interpreting / Interpreting and visualizing the posterior
- posterior, visualizing / Interpreting and visualizing the posterior
- Pearson correlation coefficient / Pearson correlation coefficient
- single parameter inference
- about / Single parameter inference
- coin-flipping problem / The coin-flipping problem
- smooth functions
- about / Kernelized linear regression
- soft-clustering
- about / Model-based clustering
- softmax function
- softmax regression
- sopa seca
- about / Logistic regression
- squared Euclidean distance (SED)
- about / The Gaussian kernel
- statistical inference / Bayes' theorem and statistical inference
- statistics
- about / Statistics as a form of modeling
- exploratory data analysis / Exploratory data analysis
- inferential statistics / Inferential statistics
- Student's t-distribution
- support vector machine (SVM)
- about / Kernel-based models
T
- Theano tutorial
- URL / PyMC3 introduction
- Tikhonov regularization
- about / Regularizing priors
V
- variational methods / Variational methods
W
- WAIC and LOO computations
- reliability / A note on the reliability of WAIC and LOO computations
- Weight-space view
- about / Kernelized linear regression
- widely available information criterion (WAIC) / Widely available information criterion
- within-sample accuracy
- World Health Organization (WHO)
- about / Hierarchical models
Z
- Zero-Inflated Poisson (ZIP) model
- about / The Zero-Inflated Poisson model
- ZIP regression