Index
A
- ADADELTA algorithm / Random search
- auto-encoders
- working / How do auto-encoders work?
- undercomplete / How do auto-encoders work?
- overcomplete / How do auto-encoders work?
- regularized / Regularized auto-encoders
- penalized / Penalized auto-encoders
- denoising / Denoising auto-encoders
- training, in R / Training an auto-encoder in R
- data.table package, adding / Training an auto-encoder in R
- use case / Use case – building and applying an auto-encoder model
- building / Use case – building and applying an auto-encoder model
- applying / Use case – building and applying an auto-encoder model
- models, fine-tuning / Fine-tuning auto-encoder models
- automatic classification
- deep neural network, training for / Use case – training a deep neural network for automatic classification
B
- bagging
- about / Ensembles and model averaging
C
- caret package
- about / Neural networks in R
- Classification and Regression Training
- about / Neural networks in R
- Comprehensive R Archive Network (CRAN)
- Convolutional Neural Network (CNN)
- about / Deep neural networks
D
- darch package / The darch package
- datasets
- linking, to H2O cluster / Linking datasets to an H2O cluster
- Deep Belief Networks (DBNs)
- about / Deep neural networks
- deep feedforward neural networks / Getting started with deep feedforward neural networks
- deep learning
- about / What is deep learning?
- neural networks / Conceptual overview of neural networks
- deep neural network (DNN) / Deep neural networks
- deep learning, R packages
- about / R packages for deep learning
- reproducible results, setting up / Setting up reproducible results
- neural networks / Neural networks
- deepnet package / The deepnet package
- darch package / The darch package
- H2O package / The H2O package
- deepnet package / The deepnet package
- deep neural network (DNN)
- about / Deep neural networks
- new data, predicting / Training and predicting new data from a deep neural network
- new data, training / Training and predicting new data from a deep neural network
- training, for automatic classification / Use case – training a deep neural network for automatic classification
- URL / Use case – training a deep neural network for automatic classification
- model results, working with / Working with model results
- dropout
E
- Emacs Speaks Statistics (ESS)
- about / R packages for deep learning
- ensembles
- and model averaging / Ensembles and model averaging
- epochs / Training an auto-encoder in R
F
- feedforward neural networks / Getting started with deep feedforward neural networks
G
- ggplot2 package / Training an auto-encoder in R
- glmnet
- about / L1 penalty in action
- grandmother cell / Conceptual overview of neural networks
H
- H2O
- URL / Setting up reproducible results
- about / Connecting R and H2O
- initializing / Initializing H2O
- datasets, linking / Linking datasets to an H2O cluster
- H2O package / The H2O package
- hidden neurons / Conceptual overview of neural networks
- Hogwild!
- hyperbolic tangent / Common activation functions – rectifiers, hyperbolic tangent, and maxout
- hyperparameters
- picking / Picking hyperparameters
I
- inputs / Conceptual overview of neural networks
- integrated development environment (IDE)
- about / R packages for deep learning
- Iris dataset / Dealing with missing data
K
- K-means clustering
- Kaggle
- about / Building a neural network
L
- L1 penalty
- defining / L1 penalty
- working / L1 penalty in action
- L2 penalty
- defining / L2 penalty
- working / L2 penalty in action
- weight decay / Weight decay (L2 penalty in neural networks)
- learning rate
- about / Building a neural network
- Least Absolute Shrinkage and Selection Operator (lasso)
- about / L1 penalty
M
- maxout / Common activation functions – rectifiers, hyperbolic tangent, and maxout
- mean squared error (MSE) / Training an auto-encoder in R, Random search
- Million Song Dataset / Use case – training a deep neural network for automatic classification
- missing data
- dealing with / Dealing with missing data
- model averaging
- and ensembles / Ensembles and model averaging
- models
- with low accuracy, solutions / Solutions for models with low accuracy
- grid search / Grid search
- random search / Random search
- Modified National Institute of Standards and Technology (MNIST) / Training an auto-encoder in R, Picking hyperparameters
N
- neural network (NN)
- building / Building a neural network, Use case – build and apply a neural network
- predictions, generating from / Generating predictions from a neural network
- applying / Use case – build and apply a neural network
- neural networks, in R
- about / Neural networks in R
- nnet
- about / Neural networks in R
O
- Ordinary Least Squares (OLS)
- about / L1 penalty
- overfitting data problem
P
- prediction / Conceptual overview of neural networks
- predictions
- generating, from neural network / Generating predictions from a neural network
R
- R
- and H2O, connecting / Connecting R and H2O
- rectifier / Common activation functions – rectifiers, hyperbolic tangent, and maxout
- Recurrent Neural Network (RNN)
- about / Deep neural networks
- recurrent neural networks / Getting started with deep feedforward neural networks
- Restricted Boltzmann Machine (RBM)
- about / Deep neural networks
- R package checkpoint / Setting up reproducible results
- RSNNS / Neural networks
- about / Neural networks in R
- RStudio
- about / R packages for deep learning
S
- Spearmint library
- URL / Random search
- Stuttgart Neural Network Simulator (SNNS) / Neural networks
- about / Building a neural network
- supervised learning
U
- unsupervised learning
V
- Vincent Goulet