I am looking at some code from Google (google-research/social_rl/multiagent_tfagents/joint_attention/attention_networks.py at c56b47713b08c95ad427d5f93ee0dbb9ad008964 · google-research/google-research · GitHub). I am new to TensorFlow and I don't really understand what line 491 is doing. It's passing some normalized kwargs to the __call__ method of tf.keras.layers.Layer, which gives back those three outputs. Is this just a sequence of dense layers?
Thanks
Hi @Samuele_Bolotta,
Sorry for the delay in response.
outputs, new_state, attention_weights = tf.keras.layers.Layer.__call__(
self, **normalized_kwargs)
The line above invokes the __call__ method of tf.keras.layers.Layer directly, which lets the AttentionNetwork reuse Keras's standard call machinery when it passes in arguments like the input data and the network state. Keras's __call__ handles shared bookkeeping (building the layer, input checks, etc.) and then dispatches to the network's own call method, whose return value is handed straight back: the outputs, an updated network state, and attention weights showing how much attention was given to each input.

This structure allows for complex data processing, not just a sequence of dense layers; the network's call can include various layer types, such as convolutional layers (for images) or recurrent layers (LSTMs/GRUs).
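To make the dispatch concrete, here is a minimal, framework-free sketch of the same pattern. It is not TensorFlow code; the class and variable names (BaseLayer, AttentionNetworkSketch, and the placeholder values) are made up purely to illustrate how the base class's __call__ delegates to the subclass's call and returns its three-value tuple:

```python
# Sketch (plain Python, not TensorFlow) of the dispatch pattern behind
# tf.keras.layers.Layer.__call__: the base class's __call__ performs shared
# bookkeeping, then delegates to the subclass-defined call(), whose return
# value is passed straight back to the caller.

class BaseLayer:
    def __call__(self, *args, **kwargs):
        # Keras does its bookkeeping here (build(), input checks, ...)
        # before dispatching to the subclass's call().
        return self.call(*args, **kwargs)

    def call(self, *args, **kwargs):
        raise NotImplementedError


class AttentionNetworkSketch(BaseLayer):
    def call(self, observation, network_state=()):
        # A real network would run conv/recurrent/attention layers here;
        # these are placeholder values showing only the return structure.
        outputs = [x * 2 for x in observation]            # stand-in for outputs
        new_state = network_state + (len(observation),)   # stand-in for state
        attention_weights = [1.0 / len(observation)] * len(observation)
        return outputs, new_state, attention_weights


# Calling the instance goes through BaseLayer.__call__, which returns the
# tuple from call(), so it unpacks into three values just like line 491.
net = AttentionNetworkSketch()
outputs, new_state, attention_weights = net([1, 2, 3])
print(outputs)    # [2, 4, 6]
print(new_state)  # (3,)
```

The key point is that the three returned values come from the subclass's call method; tf.keras.layers.Layer.__call__ just wraps that call with Keras's standard processing.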
Please let us know if this clears things up. Thank you.