Each unit in the layer has four parameters: beta, gamma, and the moving mean and moving variance. This much is documented.
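For context, this is my understanding of how the four parameters are applied at inference time (a sketch using the standard batch-norm formula; Keras stores the moving *variance* rather than the standard deviation, and its default epsilon is 1e-3):

```python
import numpy as np

def batchnorm_inference(x, gamma, beta, mean, var, eps=1e-3):
    """Inference-mode batch norm: normalize x with the stored moving
    statistics, then scale by gamma and shift by beta."""
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```

So gamma scales the normalized activation and beta shifts it, which is why a negative value looks odd to me for gamma but would be unremarkable for beta.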
get_weights is not documented for BatchNormalization, but it returns a list of four arrays, each sized to the number of units. Presumably .get_weights()[i][j] returns the ith parameter for the jth unit. But which parameter is which?
This was asked on StackOverflow:
The answer there suggests they are in the order I listed above, and looking at the Keras source, add_weight is called in that order.
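One way to check the order without relying on the documentation is to inspect the layer's variable names directly (a sketch; assumes TensorFlow's Keras, and the model here is just a throwaway example):

```python
from tensorflow import keras

# Minimal model containing only a BatchNormalization layer.
model = keras.Sequential([
    keras.layers.Input(shape=(3,)),
    keras.layers.BatchNormalization(),
])
bn = model.layers[0]

# Each variable's name (gamma, beta, moving_mean, moving_variance)
# reveals which position it occupies in get_weights().
for var, arr in zip(bn.weights, bn.get_weights()):
    print(var.name, arr.shape)
```

If this prints gamma before beta, that would also resolve my puzzle below: index 1 would be beta (the offset), which can legitimately be negative.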
Looking at my network, get_weights()[1] frequently contains negative values. A negative gamma (scaling factor) makes no sense to me. Am I wrong about the order of the parameters? Is there a good reason for a negative gamma? Is this a bug in Keras? Could the documentation be updated to include more information about this?