The most popular technique to prevent overfitting in neural networks is adding dropout. In Chapter 2, Feed-Forward Neural Networks, we introduced dropout, and we've used it throughout the book. In the following recipe, just as in Chapter 2, Feed-Forward Neural Networks, we demonstrate the difference in performance when adding dropout. This time, we will be using the CIFAR-10 dataset.
- We start by importing all libraries as follows:
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.optimizers import Adam
from sklearn.model_selection import train_test_split
from keras.utils import to_categorical
from keras.callbacks import EarlyStopping, TensorBoard, ModelCheckpoint
from keras.datasets import cifar10
- Next, we load the CIFAR-10 dataset and preprocess it:
(X_train, y_train), (X_test, y_test) = cifar10.load_data()
validation_split = 0.1
X_train...
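The preprocessing code above is truncated. Based on the imports in the previous step, a minimal sketch of how the preprocessing and a dropout-regularized model could continue is shown below; the split ratio, network architecture, dropout rates, and training settings here are illustrative assumptions, not the recipe's exact values:

# Split off a validation set (split ratio assumed for illustration)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=validation_split, random_state=42)

# Normalize pixel values to the [0, 1] range
X_train = X_train.astype('float32') / 255.
X_val = X_val.astype('float32') / 255.
X_test = X_test.astype('float32') / 255.

# One-hot encode the labels (CIFAR-10 has 10 classes)
n_classes = 10
y_train = to_categorical(y_train, n_classes)
y_val = to_categorical(y_val, n_classes)
y_test = to_categorical(y_test, n_classes)

# A small convolutional network with dropout after the pooling and dense layers
model = Sequential()
model.add(Conv2D(32, (3, 3), padding='same', input_shape=X_train.shape[1:]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(n_classes))
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy', optimizer=Adam(), metrics=['accuracy'])

# Stop training when the validation loss stops improving
callbacks = [EarlyStopping(monitor='val_loss', patience=5)]
model.fit(X_train, y_train, batch_size=128, epochs=100, validation_data=(X_val, y_val), callbacks=callbacks)

To see the effect of dropout, the same architecture can also be trained with the Dropout layers removed and the training and validation curves of the two runs compared, for example with the TensorBoard callback imported earlier.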