Thank you for your reply, Bhack.
I'm sorry, but I can't upload the notebook.
I realized that the original Keras example produces the same result every time, even when I change the model from Xception to VGG16 — but my model does not.
My model is transfer-trained, based on VGG16.
I found that the cause of the randomness is the difference in models.
So I'd like to change my question to: "How can I get the same layer outputs every time?"
Details:
I found that [last_conv_layer_output] is what differs between runs:

def make_gradcam_heatmap(img_array, model, last_conv_layer_name, pred_index=None):
    ............
    last_conv_layer_output, preds = grad_model(img_array)

grad_model returns [last_conv_layer_output] and [preds].
In the Keras example, last_conv_layer_output is the same value every time, but mine is random.
Therefore, I concluded that I need to modify my model.
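To show what I mean, here is a minimal sketch of the check I am doing. The small stand-in model and the layer name "last_conv" are illustrative, not my real model; grad_model is built the same way as in the Keras Grad-CAM example:

```python
import numpy as np
from tensorflow import keras

# Small stand-in model (illustrative). A Dropout layer is included
# because layers like Dropout only behave randomly when training=True.
inputs = keras.Input(shape=(32, 32, 3))
x = keras.layers.Conv2D(8, 3, padding="same", name="last_conv")(inputs)
x = keras.layers.GlobalAveragePooling2D()(x)
x = keras.layers.Dropout(0.5)(x)
outputs = keras.layers.Dense(5)(x)
model = keras.Model(inputs, outputs)

# Same construction as in the Keras Grad-CAM example:
grad_model = keras.Model(
    model.inputs,
    [model.get_layer("last_conv").output, model.output],
)

img_array = np.random.rand(1, 32, 32, 3).astype("float32")

# Call twice with training=False so any random layers are inactive,
# then compare the conv-layer outputs between the two calls.
out1, _ = grad_model(img_array, training=False)
out2, _ = grad_model(img_array, training=False)
print(np.allclose(out1.numpy(), out2.numpy()))
```

In this toy case the two outputs match; in my real model they do not.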
My model was built with transfer learning: the base is VGG16, with some layers added on top.
I saved the model like this:

model.save("FruitsSorter.tf")

and load it like this:

from keras.models import load_model
model2 = load_model("FruitsSorter.tf")

The training machine and the inference machine are the same.
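To rule out the save/load step as the source of the randomness, a roundtrip check like the following sketch can be used (a toy model stands in for my real one; the ".keras" filename is illustrative):

```python
import os
import tempfile
import numpy as np
from tensorflow import keras

# Toy model standing in for the transfer-learned model (illustrative).
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(3),
])

x = np.random.rand(2, 4).astype("float32")
before = model.predict(x)

# Save and reload, then confirm the predictions are unchanged,
# which would rule out serialization as the source of randomness.
path = os.path.join(tempfile.mkdtemp(), "model.keras")
model.save(path)
model2 = keras.models.load_model(path)
after = model2.predict(x)

print(np.allclose(before, after))
```

If the outputs match before and after loading, the randomness must come from the model's own layers rather than from saving.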
The model summary is:
Model: "model"
_________________________________________________________________
Layer (type)                    Output Shape              Param #
=================================================================
input_2 (InputLayer)            [(None, 224, 224, 3)]     0
_________________________________________________________________
sequential (Sequential)         (None, 224, 224, 3)       0
_________________________________________________________________
tf.__operators__.getitem_1 (    (None, 224, 224, 3)       0
_________________________________________________________________
tf.math.add_1 (TFOpLambda)      (None, 224, 224, 3)       0
_________________________________________________________________
block1_conv1 (Conv2D)           (None, 224, 224, 64)      1792
_________________________________________________________________
block1_conv2 (Conv2D)           (None, 224, 224, 64)      36928
_________________________________________________________________
block1_pool (MaxPooling2D)      (None, 112, 112, 64)      0
_________________________________________________________________
block2_conv1 (Conv2D)           (None, 112, 112, 128)     73856
_________________________________________________________________
block2_conv2 (Conv2D)           (None, 112, 112, 128)     147584
_________________________________________________________________
block2_pool (MaxPooling2D)      (None, 56, 56, 128)       0
_________________________________________________________________
block3_conv1 (Conv2D)           (None, 56, 56, 256)       295168
_________________________________________________________________
block3_conv2 (Conv2D)           (None, 56, 56, 256)       590080
_________________________________________________________________
block3_conv3 (Conv2D)           (None, 56, 56, 256)       590080
_________________________________________________________________
block3_pool (MaxPooling2D)      (None, 28, 28, 256)       0
_________________________________________________________________
block4_conv1 (Conv2D)           (None, 28, 28, 512)       1180160
_________________________________________________________________
block4_conv2 (Conv2D)           (None, 28, 28, 512)       2359808
_________________________________________________________________
block4_conv3 (Conv2D)           (None, 28, 28, 512)       2359808
_________________________________________________________________
block4_pool (MaxPooling2D)      (None, 14, 14, 512)       0
_________________________________________________________________
block5_conv1 (Conv2D)           (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_conv2 (Conv2D)           (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_conv3 (Conv2D)           (None, 14, 14, 512)       2359808
_________________________________________________________________
block5_pool (MaxPooling2D)      (None, 7, 7, 512)         0
_________________________________________________________________
global_average_pooling2d_2 (    (None, 512)               0
_________________________________________________________________
dropout (Dropout)               (None, 512)               0
_________________________________________________________________
dense_1 (Dense)                 (None, 512)               262656
_________________________________________________________________
dense (Dense)                   (None, 5)                 2565
=================================================================
Total params: 14,979,909
Trainable params: 265,221
Non-trainable params: 14,714,688
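For reference, the two layers in this summary that can behave randomly are `sequential` (if it contains data-augmentation layers such as RandomFlip) and `dropout` — but only when they are called with training=True. A small sketch demonstrating this with a Dropout layer (toy data, not my model):

```python
import numpy as np
from tensorflow import keras

# Dropout (and random-augmentation layers) only act when training=True.
layer = keras.layers.Dropout(0.5)
x = np.ones((1, 8), dtype="float32")

inference_out = layer(x, training=False).numpy()  # identity at inference
training_out = layer(x, training=True).numpy()    # randomly zeroes entries

print(np.array_equal(inference_out, x))
```

So if these layers still produce random outputs in my model at inference time, I suspect something is forcing them into training mode.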