The network was created by François Chollet, the author of Keras, in Xception: Deep Learning with Depthwise Separable Convolutions (https://arxiv.org/abs/1610.02357). Xception is an extension of the Inception architecture in which the Inception modules are replaced with depthwise separable convolutions. In the previous recipe, we trained only the top layers while keeping the original Inception weights frozen. However, we can also choose to train all the weights with a smaller learning rate. This is called fine-tuning. This technique can give the model a small performance boost by adjusting weights from the original network that are biased toward the original task.
- First, we start by importing all the necessary libraries, as follows:
import numpy as np
from keras.models import Model
from keras.applications import Xception
from keras.layers import Dense, GlobalAveragePooling2D
from keras.optimizers import Adam
from keras.applications import imagenet_utils
from keras.utils import np_utils
...
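The fine-tuning setup described above can be sketched as follows. This is a minimal illustration, not the recipe's exact code: the number of classes (n_classes = 10) and the input size are hypothetical, and weights=None is used here to avoid downloading the pretrained weights — in practice you would pass weights='imagenet'.

```python
from keras.models import Model
from keras.applications import Xception
from keras.layers import Dense, GlobalAveragePooling2D
from keras.optimizers import Adam

# Hypothetical number of target classes for this sketch
n_classes = 10

# Load the Xception base without its classification head.
# weights=None avoids a download; use weights='imagenet' in practice.
base_model = Xception(weights=None, include_top=False,
                      input_shape=(71, 71, 3))

# Attach a new classification head for our task
x = GlobalAveragePooling2D()(base_model.output)
predictions = Dense(n_classes, activation='softmax')(x)
model = Model(inputs=base_model.input, outputs=predictions)

# Fine-tuning: keep ALL layers trainable (instead of freezing the base),
# and compensate with a small learning rate so the pretrained weights
# are only nudged, not destroyed
for layer in model.layers:
    layer.trainable = True
model.compile(optimizer=Adam(learning_rate=1e-5),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

The key difference from the previous recipe is that no layers are frozen; the small learning rate is what keeps the pretrained features from being overwritten too aggressively.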