Index
A
- Alternating Least Squares (ALS)
  - about / Matrix factorization
  - limitations / Limitations of ALS
- Anaconda
  - download link / Setting up the environment
B
- backpropagation / Neural networks and deep learning
- BaseSimpleEstimator interface / The BaseSimpleEstimator interface
- bias-variance dilemma / Error terms
- bias-variance trade-off / Error terms
- bias trade-off
  - about / The bias/variance trade-off
  - error terms / Error terms
  - error, due to / Error due to bias
C
- CARTClassifier / An example of supervised learning in action
- Classification and Regression Trees (CART) / Decision trees – an introduction
- clustering
  - about / What is clustering?
  - distance metrics / Distance metrics
  - methods / Various clustering methods
- coefficient / Parametric models
- cold-start problem / Limitations of content-based systems
- collaborative filtering
  - about / Recommended systems and an introduction to collaborative filtering
  - item-to-item / Item-to-item collaborative filtering
- content-based filtering
  - about / Content-based filtering
  - limitations / Limitations of content-based systems
D
- data
  - splitting / Model evaluation and data splitting
  - splitting, scikit-learn library used / Splitting made easy
- decision trees
  - about / Introduction to non-parametric models and decision trees, Decision trees – an introduction, Decision trees
  - example / An intuitive example – decision tree
  - decisions, making / How do decision trees make decisions?
  - tree, splitting by hand / Splitting a tree by hand
  - splitting, on x1 / If we split on x1
  - splitting, on x2 / If we split on x2
  - implementing, from scratch / Implementing a decision tree from scratch
  - classification tree / Classification tree
  - regression tree / Regression tree
- deep learning / Neural networks and deep learning
- descent / Hill climbing and descent
E
- environment
  - setting up / Setting up the environment
- Euclidean distance / Distance metrics, Recommended systems and an introduction to collaborative filtering
F
- finite-dimensional models / Finite-dimensional models
H
- high bias
  - handling, strategies for / Strategies for handling high bias
- high variance
  - handling, strategies for / Strategies for handling high variance
- hill climbing / Hill climbing and loss functions, Hill climbing and descent
  - about / Hill climbing and descent
- homophily / Recommended systems and an introduction to collaborative filtering
I
- in-sample evaluation
  - versus out-of-sample evaluation / Out-of-sample versus in-sample evaluation
K
- KDTree / A classic KNN algorithm
- KNN
  - about / KNN – introduction
  - considerations / KNN – considerations
  - classic KNN algorithm / A classic KNN algorithm
  - implementing, from scratch / Implementing KNNs from scratch
- KNN clustering / KNN clustering
L
- learning curves / Learning curves
- linear regression
  - implementing, from scratch / Implementing linear regression from scratch
- logistic function / Hill climbing and descent
- logistic regression
  - about / Logistic regression
  - implementing, from scratch / Implementing logistic regression from scratch
  - example / Example of logistic regression
- logistic regression algorithm
  - about / The algorithm
  - example / Example of logistic regression
- logistic regression models
  - about / Logistic regression models
  - concept / The concept
  - math / The math
  - logistic (sigmoid) transformation / The logistic (sigmoid) transformation
  - algorithm / The algorithm
  - predictions, creating / Creating predictions
- loss functions
  - about / Hill climbing and loss functions, Loss functions
  - slope, measuring of curve / Measuring the slope of a curve
  - slope, measuring of Nd-curve / Measuring the slope of an Nd-curve
  - slope, measuring of multiple functions / Measuring the slope of multiple functions
M
- machine learning (ML) / Recommended systems and an introduction to collaborative filtering
- Mahalanobis distance / Distance metrics
- Manhattan distance / Distance metrics
- matrix factorization
  - about / Matrix factorization
  - in Python / Matrix factorization in Python
- model
  - evaluation / Model evaluation and data splitting
- model parameters / Finite-dimensional models
N
- neural network
  - about / Neural networks and deep learning, Neural networks
  - training, tips for / Tips and tricks for training a neural network
- non-parametric / Parametric models
- non-parametric learning algorithms
  - characteristics / Characteristics of non-parametric learning algorithms
- non-parametric models
  - about / Introduction to non-parametric models and decision trees
  - learning / Non-parametric learning
  - parametric or not, determining / Is a model parametric or not?
  - pros / Non-parametric models – pros/cons, Pros of non-parametric models
  - cons / Non-parametric models – pros/cons, Cons of non-parametric models
  - considerations / Which model to use?
O
- out-of-sample evaluation
  - versus in-sample evaluation / Out-of-sample versus in-sample evaluation
P
- parametric learning algorithms
  - characteristics / The characteristics of parametric learning algorithms
- parametric models
  - about / Parametric models
  - finite-dimensional models / Finite-dimensional models
  - example / Parametric model example
  - pros / The pros and cons of parametric models
  - cons / The pros and cons of parametric models
- Pearson correlation / Recommended systems and an introduction to collaborative filtering
- perceptron / Neural networks and deep learning
- Python
  - matrix factorization / Matrix factorization in Python
R
- recommender systems / Recommended systems and an introduction to collaborative filtering
S
- scikit-learn library
  - used, for splitting data / Splitting made easy
- Spam dataset / An example of supervised learning in action
- Sum of Squared Error (SSE) / Loss functions
- supervised learning in action
  - example / An example of supervised learning in action
- supervised machine learning
  - about / Supervised learning
  - example / An example of supervised learning in action
T
- Training score / An example of supervised learning in action, Learning curves
- transfer learning
  - using / Using transfer learning
U
- underfitting / The pros and cons of parametric models
- unsupervised machine learning / Supervised learning
V
- Validation score / An example of supervised learning in action, Learning curves
- variance trade-off
  - about / The bias/variance trade-off
  - error terms / Error terms
  - error, due to / Error due to variance