Check/visualize gradients/losses getting applied to each layer

My network uses 3 losses that are applied to different layers. This is what I am doing currently:
trainable_vars = self.trainable_variables
gradients = tape.gradient([loss1, loss2, loss3], trainable_vars)
I am hoping that the gradients get applied according to the layer outputs that were involved in computing each loss. How do I verify this?
Is there any way to check which loss/gradient affects each layer in the network during backpropagation, without needing custom layers?
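For context, here is a minimal sketch of the kind of custom train_step I mean; the three losses and the intermediate outputs are placeholders for what my real network computes:

```python
import tensorflow as tf

# Sketch only: loss1/loss2/loss3 and the three outputs stand in for the
# real network's layer outputs and loss functions.
class MultiLossModel(tf.keras.Model):
    def train_step(self, data):
        x, y = data
        with tf.GradientTape() as tape:
            out1, out2, out3 = self(x, training=True)  # outputs of different layers
            loss1 = tf.reduce_mean(tf.square(out1))
            loss2 = tf.reduce_mean(tf.square(out2))
            loss3 = tf.reduce_mean(tf.square(out3 - y))
        trainable_vars = self.trainable_variables
        # Passing a list of targets differentiates their sum
        gradients = tape.gradient([loss1, loss2, loss3], trainable_vars)
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))
        return {"loss": loss1 + loss2 + loss3}
```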

Hi @anxious_learner,

The example below will help you visualize and analyze how each loss affects the different layers of your network. One thing to note first: when you pass a list of losses to tape.gradient, you get back the gradients of their sum for each variable, so to see each loss's individual effect you have to compute the gradients separately (for example, with a persistent tape).

Key features of this implementation include (a sketch follows the list):

  1. A custom TensorFlow model with four dense layers.
  2. Separate calculation of gradients for each loss function.
  3. Visualization of gradient distributions using histograms.
  4. Computation and display of gradient statistics (mean and standard deviation) for each layer.

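Here is a minimal sketch of that example. The model architecture, layer names, and dummy losses below are illustrative placeholders, not your actual network:

```python
import tensorflow as tf
import matplotlib.pyplot as plt

# Dummy model with four dense layers; intermediate outputs are exposed so
# that separate losses can be attached to different layers.
inputs = tf.keras.Input(shape=(16,))
h1 = tf.keras.layers.Dense(32, activation="relu", name="dense_1")(inputs)
h2 = tf.keras.layers.Dense(32, activation="relu", name="dense_2")(h1)
h3 = tf.keras.layers.Dense(16, activation="relu", name="dense_3")(h2)
out = tf.keras.layers.Dense(1, name="dense_4")(h3)
model = tf.keras.Model(inputs, [h2, h3, out])

x = tf.random.normal((64, 16))
y = tf.random.normal((64, 1))

# A persistent tape lets us call tape.gradient() once per loss, so each
# loss's contribution to each layer can be inspected separately.
with tf.GradientTape(persistent=True) as tape:
    a2, a3, pred = model(x, training=True)
    loss1 = tf.reduce_mean(tf.square(a2))        # depends on dense_1/dense_2 only
    loss2 = tf.reduce_mean(tf.square(a3))        # depends on dense_1..dense_3
    loss3 = tf.reduce_mean(tf.square(pred - y))  # depends on all four layers

per_loss_grads = {
    name: tape.gradient(loss, model.trainable_variables)
    for name, loss in [("loss1", loss1), ("loss2", loss2), ("loss3", loss3)]
}
del tape  # release the persistent tape's resources

# Gradient statistics (mean and standard deviation) per variable per loss.
# A gradient of None means the variable is not on that loss's path at all.
for name, grads in per_loss_grads.items():
    print(f"=== {name} ===")
    for var, g in zip(model.trainable_variables, grads):
        if g is None:
            print(f"{var.name:30s} no gradient (layer not involved in this loss)")
        else:
            g = g.numpy()
            print(f"{var.name:30s} mean={g.mean():+.3e}  std={g.std():.3e}")

# Histogram of the gradient distribution for one loss/variable pair,
# e.g. loss3 on the kernel of dense_4.
g = per_loss_grads["loss3"][-2].numpy().ravel()
plt.hist(g, bins=50)
plt.title("loss3 gradients on dense_4/kernel")
plt.xlabel("gradient value")
plt.ylabel("count")
plt.show()
```

In the printed statistics, None for a variable under a given loss means that layer is not on that loss's computation path, which is exactly the per-layer check you are asking about; for the variables that do receive gradients, the mean/std values and the histograms show how strongly each loss affects each layer.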
I executed it with a dummy model and am attaching a gist for the same; kindly refer to it for the full version.

Hope this helps,

Thank you!