Tuning hyper-parameters using grid searches in H2O
The H2O package also allows you to perform hyper-parameter tuning using grid search (h2o.grid).
Getting ready
We first load and initialize the H2O package with the following code:
# Load the required packages
require(h2o)

# Initialize H2O instance (single node)
localH2O = h2o.init(ip = "localhost", port = 54321, startH2O = TRUE,
                    min_mem_size = "20G", nthreads = 8)
The occupancy dataset is loaded, converted to hex format, and named occupancy_train.hex.
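As a reference, a minimal sketch of this preparation step is shown below; the file name occupancy_train.csv and the response column Occupancy are assumptions based on the standard occupancy detection dataset, not part of the original recipe.

# Import the occupancy training data (file name assumed for illustration)
occupancy_train <- read.csv("occupancy_train.csv")

# Convert the outcome to a factor so H2O treats the task as classification
occupancy_train$Occupancy <- as.factor(occupancy_train$Occupancy)

# Push the data frame into the H2O cluster as occupancy_train.hex
occupancy_train.hex <- as.h2o(occupancy_train, destination_frame = "occupancy_train.hex")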
How to do it...
This section focuses on optimizing hyper-parameters in H2O using grid search.
- In our case, we will optimize for the activation function, the number of hidden layers (along with the number of neurons in each layer), the number of epochs, and the regularization lambda (l1 and l2):
# Perform hyper-parameter tuning
activation_opt <- c("Rectifier", "RectifierWithDropout", "Maxout", "MaxoutWithDropout")
hidden_opt <- list(5, c(5, 5))
epoch_opt <- c(10, 50, 100)
l1_opt <- c(0, 1e-3, 1e-4)
l2_opt <- c(0, 1e-3, 1e-4)   # l2 values truncated in the source; assumed to mirror l1_opt
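These option vectors then feed into h2o.grid. The following sketch illustrates one way to wire them together; the response column Occupancy, the grid ID, the search criteria, and the sort metric are illustrative assumptions rather than part of the original recipe.

# Predictor and response columns (assumed names, following the occupancy dataset)
y <- "Occupancy"
x <- setdiff(colnames(occupancy_train.hex), y)

# Collect the options into the hyper-parameter list expected by h2o.grid
hyper_params <- list(activation = activation_opt,
                     hidden = hidden_opt,
                     epochs = epoch_opt,
                     l1 = l1_opt,
                     l2 = l2_opt)

# Launch the grid search over deep learning models
dl_grid <- h2o.grid(algorithm = "deeplearning",
                    grid_id = "dl_grid",
                    x = x, y = y,
                    training_frame = occupancy_train.hex,
                    hyper_params = hyper_params,
                    search_criteria = list(strategy = "RandomDiscrete",
                                           max_models = 100, seed = 1234))

# Rank the fitted models by a metric of interest and retrieve the best one
grid_results <- h2o.getGrid(grid_id = "dl_grid", sort_by = "auc", decreasing = TRUE)
best_model <- h2o.getModel(grid_results@model_ids[[1]])

Using the "RandomDiscrete" strategy with a model cap keeps the search tractable when the Cartesian product of all option values is large; a full Cartesian search is obtained by omitting search_criteria.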