Tuning is a time-consuming and computationally expensive task. Throughout this book, we've paid limited attention to tuning hyperparameters; most values were chosen in advance. To find better values, we can rely on heuristics or on a systematic search of the parameter space. Grid search is a popular method for hyperparameter tuning in machine learning.
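To make the idea of grid search concrete, here is a minimal pure-Python sketch: it exhaustively evaluates every combination in a small parameter grid and keeps the best one. The `evaluate` function is a hypothetical stand-in for a real training-and-validation run, and the parameter names and values are illustrative assumptions, not taken from the recipe:

```python
import itertools

# Hypothetical stand-in for training a model and returning a validation score.
# Here the score is highest at learning_rate=0.01, batch_size=64 by construction.
def evaluate(learning_rate, batch_size):
    return -abs(learning_rate - 0.01) - abs(batch_size - 64) / 100

# The grid: every combination of these values will be tried.
grid = {
    'learning_rate': [0.001, 0.01, 0.1],
    'batch_size': [32, 64, 128],
}

keys = list(grid)
best_params, best_score = None, float('-inf')
# itertools.product enumerates the full Cartesian product of the grid.
for values in itertools.product(*(grid[k] for k in keys)):
    params = dict(zip(keys, values))
    score = evaluate(**params)
    if score > best_score:
        best_params, best_score = params, score

print(best_params)  # {'learning_rate': 0.01, 'batch_size': 64}
```

Note that the number of evaluations grows multiplicatively with each added parameter, which is why grid search becomes impractical for deep learning models with many hyperparameters.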
In the following recipe, we will demonstrate how to apply hyperparameter search when building a deep learning model. For this, we will use Hyperopt, a library that, instead of an exhaustive grid, explores the search space with the Tree-structured Parzen Estimator (TPE) algorithm.
- We start by importing the libraries used in this recipe:
import sys
import numpy as np
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.optimizers import Adam
from sklearn.model_selection import train_test_split
from keras.utils import to_categorical
from keras.callbacks import EarlyStopping, TensorBoard
...