The sklearn decision tree classifier has two hyperparameters that we can tune here: criterion and max_depth. Hyperparameters are often adjusted to see whether accuracy can be improved. The criterion is either "gini" or "entropy"; both measure the impurity of the child nodes produced by a split. The max_depth of the tree controls how complex the model can become, which affects over- and underfitting.
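A minimal sketch of tuning these two hyperparameters with a cross-validated grid search; the dataset (iris) and the candidate max_depth values are illustrative assumptions, not part of the original text.

```python
# Sketch: tune criterion and max_depth via grid search (illustrative values).
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

param_grid = {
    "criterion": ["gini", "entropy"],   # impurity measure for splits
    "max_depth": [2, 3, 4, 5, None],    # None = grow until leaves are pure
}

search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(round(search.best_score_, 3))
```

GridSearchCV simply tries every combination and keeps the one with the best mean cross-validation accuracy, which matches the "change the hyperparameters and see if accuracy improves" workflow described above.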
Underfitting versus overfitting
Models that underfit are inaccurate and poorly represent the data they were trained on.
Models that overfit are unable to generalize beyond the data they were trained on. They miss data similar to the training set because they only work well on exactly the data they were trained on.
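The contrast above can be sketched by comparing training and test accuracy at two extremes of max_depth; the dataset (breast cancer) and the train/test split are illustrative assumptions.

```python
# Sketch: a depth-1 stump tends to underfit (low accuracy everywhere),
# while an unlimited-depth tree tends to overfit (near-perfect on the
# training set, noticeably worse on held-out data).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for depth in (1, None):  # very shallow vs. unlimited depth
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
    tree.fit(X_tr, y_tr)
    print(depth, round(tree.score(X_tr, y_tr), 3), round(tree.score(X_te, y_te), 3))
```

The gap between training and test accuracy for the unlimited-depth tree is the overfitting signal; picking an intermediate max_depth is how the tuning above trades the two off.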