Hi all,
I was going through the Matterport Mask R-CNN code. When adding L2 regularization, they skip the 'gamma' and 'beta' weights of the batch normalization layers:
# Add L2 Regularization
# Skip gamma and beta weights of batch normalization layers.
reg_losses = [
    keras.regularizers.l2(self.config.WEIGHT_DECAY)(w) / tf.cast(tf.size(w), tf.float32)
    for w in self.keras_model.trainable_weights
    if 'gamma' not in w.name and 'beta' not in w.name]
Is there any reason for this?
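
For context, here is a minimal standalone sketch of the same name filter, run on a hypothetical toy model (the layer names, WEIGHT_DECAY value, and imports are my own stand-ins, not from the repo), so you can see exactly which weights get regularized and which get skipped:

import tensorflow as tf
from tensorflow import keras

# Hypothetical toy model with one Dense and one BatchNormalization layer.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(4, name="dense"),
    keras.layers.BatchNormalization(name="bn"),
])

WEIGHT_DECAY = 1e-4  # stand-in for self.config.WEIGHT_DECAY

# Same filter as in the repo: L2 on every trainable weight except
# those whose variable name contains 'gamma' or 'beta'.
reg_losses = [
    keras.regularizers.l2(WEIGHT_DECAY)(w) / tf.cast(tf.size(w), tf.float32)
    for w in model.trainable_weights
    if 'gamma' not in w.name and 'beta' not in w.name]

# Show which weights the filter keeps vs. skips.
for w in model.trainable_weights:
    skipped = 'gamma' in w.name or 'beta' in w.name
    print(w.name, '-> skipped' if skipped else '-> L2 regularized')

With this model, dense/kernel and dense/bias end up regularized, while bn/gamma and bn/beta are skipped; note the filter matches on variable names, so only the BN scale/offset parameters are excluded, not biases.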