Constant weight initialisation giving different weights after training

I have read in the notes on this website, as well as in other resources, that if a network is initialised with constant, identical weights, then each neuron learns the same weights.
However, when I tried this in TensorFlow, I am getting different weights for each neuron after training.
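Here is roughly what I expected to see, based on that claim (a minimal sketch with a made-up toy network; the constant value 0.5, the layer sizes, and the random data are just for illustration):

import numpy as np
import tensorflow as tf

# Toy network whose layers start with constant (identical) weights.
init = tf.keras.initializers.Constant(0.5)
toy = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4, activation="relu",
                          kernel_initializer=init, bias_initializer="zeros"),
    tf.keras.layers.Dense(1, kernel_initializer=init, bias_initializer="zeros"),
])
toy.compile(optimizer="sgd", loss="mse")

x = np.random.rand(64, 3).astype("float32")
y = np.random.rand(64, 1).astype("float32")
toy.fit(x, y, epochs=5, verbose=0)

# Every column of the hidden kernel (one column per neuron) is still
# identical: the neurons received identical gradients, so they all
# learned the same weights.
print(toy.layers[0].get_weights()[0])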
I am following this tutorial. Here is the relevant part of the code:

multi_linear_model = tf.keras.Sequential([
    # Take only the last time step: [batch, time, features] -> [batch, 1, features]
    tf.keras.layers.Lambda(lambda x: x[:, -1:, :]),
    # Dense layer with kernel and bias both initialised to zero
    tf.keras.layers.Dense(OUT_STEPS * num_features,
                          kernel_initializer=tf.initializers.zeros(),
                          bias_initializer=tf.initializers.zeros()),
    # Reshape the flat output back to [batch, OUT_STEPS, num_features]
    tf.keras.layers.Reshape([OUT_STEPS, num_features])
])

I am basically working with time-series data. The first Lambda layer just extracts the last time step. The second layer is a Dense layer that operates on this last time step, with the number of units determined by the constants OUT_STEPS and num_features.
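Just to illustrate what the Lambda slice does (the shapes here are made up for this example):

import tensorflow as tf

x = tf.random.normal([4, 10, 7])   # [batch, time, features]
last = x[:, -1:, :]                # keep only the last time step
print(last.shape)                  # (4, 1, 7)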

The important part is that I have set the weights and biases to zero, but after training I am getting different weights for each neuron in this layer.
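This is roughly how I am checking it (a sketch: the random arrays here just stand in for the tutorial's windowed dataset, and OUT_STEPS and num_features are assumed to be defined earlier, as in my actual code):

import numpy as np

multi_linear_model.compile(optimizer="adam", loss="mse")

x = np.random.rand(8, 10, num_features).astype("float32")         # [batch, time, features]
y = np.random.rand(8, OUT_STEPS, num_features).astype("float32")  # dummy targets
multi_linear_model.fit(x, y, epochs=2, verbose=0)

kernel = multi_linear_model.layers[1].get_weights()[0]  # the Dense layer's kernel
print(kernel.shape)   # (num_features, OUT_STEPS * num_features)
print(kernel[:, :3])  # the columns (one per neuron) come out different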

Can someone explain why? Should this happen or not? Is this a TensorFlow thing? Please help…