Another approach to regularization involves combining multiple neural network models and averaging their outputs. The averaged model typically generalizes better than any of the individual models.
A neural network ensemble is a set of neural network models that reach a decision by averaging the results of the individual models. Ensembling is a simple way to improve generalization, especially when overfitting is caused by noisy data or a small dataset: we train multiple neural networks and average their outputs.
As an example, we train 20 neural networks on the same learning problem, varying the parameters of the training process, and then compare the mean squared error of each individual network with the mean squared error of their averaged output.
The steps are as follows:
- The dataset is loaded and divided into a train and test set. The percentage split can be varied for different neural net models.
- Multiple models are created with the different training sets and by adjusting the parameters in the `nnet()` function...
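The steps above can be sketched as follows. Note that the text uses R's `nnet()` function; the sketch below is an illustrative NumPy analogue (not the book's code), using a toy noisy-sine dataset and a minimal one-hidden-layer network so that the train/test split, the 20 differently initialized models, and the MSE comparison are all visible:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: a noisy sine wave (assumption for illustration).
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

# Train/test split; the percentage can be varied per model.
idx = rng.permutation(200)
train, test = idx[:150], idx[150:]

def train_net(X, y, hidden=8, lr=0.05, epochs=2000, seed=0):
    """Train a one-hidden-layer tanh network by plain gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = r.normal(scale=0.5, size=(hidden, 1))
    b2 = 0.0
    n = len(y)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)           # hidden activations
        pred = (H @ W2).ravel() + b2       # linear output
        err = pred - y                     # residuals
        # Backpropagate the squared-error gradient.
        gW2 = H.T @ err[:, None] / n
        gb2 = err.mean()
        dH = (err[:, None] @ W2.T) * (1 - H**2)
        gW1 = X.T @ dH / n
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: (np.tanh(Xq @ W1 + b1) @ W2).ravel() + b2

# Train 20 networks that differ in their random initialization.
nets = [train_net(X[train], y[train], seed=s) for s in range(20)]

# Compare each network's test MSE with the MSE of the averaged output.
preds = np.stack([f(X[test]) for f in nets])       # shape (20, n_test)
mse_each = ((preds - y[test]) ** 2).mean(axis=1)
mse_ens = ((preds.mean(axis=0) - y[test]) ** 2).mean()
print(f"mean individual MSE: {mse_each.mean():.4f}")
print(f"ensemble MSE:        {mse_ens:.4f}")
```

By Jensen's inequality, the MSE of the averaged prediction is never larger than the average of the individual MSEs, which is exactly the effect the comparison in this section is meant to show.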