How to make initializers i.i.d.

I need to build an MLP and initialize its weights (W and b) with Gaussian values that are independent and identically distributed across all layers (all W are Gaussian i.i.d., and all b are i.i.d. with a different distribution from that of W). Now, if I specify a keras.initializers.RandomNormal(...), I can do this only for one layer, right? If I want to apply it across all layers, is building a custom initializer the only way?

Hi @Fabiano_Veglianti ,

  • Using keras.initializers.RandomNormal(...): You’re correct that an initializer passed to a layer applies only to that layer, so you have to specify one for every layer. If each layer gets its own RandomNormal instance with the same mean and stddev (and no shared fixed seed), the draws are independent samples from the same distribution, i.e. i.i.d. across layers; reusing a single seeded instance would instead tend to reproduce the same values in every layer rather than independent ones. (See the first sketch after this list.)
  • Custom initializer: Creating a custom initializer is indeed a good way to ensure consistent initialization across all layers; you define the distribution parameters once and pass the initializer to every layer. (See the second sketch after this list.)
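
For the first approach, here is a minimal sketch, assuming Keras 3; the layer sizes, input shape, and distribution parameters are illustrative, not prescribed. Each layer receives a fresh unseeded RandomNormal configured the same way, so the kernels are i.i.d. Gaussian across layers and the biases follow their own distribution:

```python
import keras
from keras import layers, initializers

# Each call returns a fresh, unseeded initializer, so every layer
# draws its own independent sample from the same distribution.
def kernel_init():
    return initializers.RandomNormal(mean=0.0, stddev=0.05)  # params for W (illustrative)

def bias_init():
    return initializers.RandomNormal(mean=0.0, stddev=0.01)  # different distribution for b

model = keras.Sequential([
    keras.Input(shape=(32,)),  # input size is illustrative
    layers.Dense(64, activation="relu",
                 kernel_initializer=kernel_init(), bias_initializer=bias_init()),
    layers.Dense(64, activation="relu",
                 kernel_initializer=kernel_init(), bias_initializer=bias_init()),
    layers.Dense(1,
                 kernel_initializer=kernel_init(), bias_initializer=bias_init()),
])
```

And a sketch of the custom-initializer route: subclassing keras.initializers.Initializer so the distribution parameters live in one place. The class name and parameter defaults here are hypothetical, and keras.random.normal assumes Keras 3 (in tf.keras you would use tf.random.normal instead):

```python
import keras

class IIDNormal(keras.initializers.Initializer):
    """Initializes every tensor with i.i.d. draws from N(mean, stddev**2)."""

    def __init__(self, mean=0.0, stddev=0.05):
        self.mean = mean
        self.stddev = stddev

    def __call__(self, shape, dtype=None):
        # Samples fresh values on each call, so each layer's
        # weights are independent draws from the same distribution.
        return keras.random.normal(shape, mean=self.mean,
                                   stddev=self.stddev, dtype=dtype)

    def get_config(self):
        # Makes the initializer serializable with the model.
        return {"mean": self.mean, "stddev": self.stddev}

# Usage: one distribution for W, another for b, shared by all layers.
dense = keras.layers.Dense(
    64,
    kernel_initializer=IIDNormal(0.0, 0.05),
    bias_initializer=IIDNormal(0.0, 0.01),
)
```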

I’m attaching the reference Gist; kindly have a look at it.

Thank you.