We want to re-use an existing keras layer, but return an intermediate value of the call function.
To be precise, we aim to re-use the ResNet class of SimCLR, which is equivalent to:
class Resnet(tf.keras.layers.Layer):
    def call(self, inputs, training):
        for layer in self.initial_conv_relu_max_pool:
            inputs = layer(inputs, training=training)
        for i, layer in enumerate(self.block_groups):
            inputs = layer(inputs, training=training)
        inputs = tf.reduce_mean(inputs, [1, 2])
        inputs = tf.identity(inputs, 'final_avg_pool')
        return inputs
We want to obtain the value of inputs just before the reduce_mean call.
If this were a Keras Model, we could do something like model.get_layer(index=X).output.
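For context, a minimal sketch of what we mean for the Model case (the layer names and shapes here are made up for illustration):

import tensorflow as tf

# Hypothetical functional model, just to show the Model-based approach.
x_in = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(8, 3, name='conv_a')(x_in)
x = tf.keras.layers.GlobalAveragePooling2D(name='pool')(x)
model = tf.keras.Model(x_in, x)

# With a functional Model we can build a second Model that exposes
# an intermediate tensor by name or index.
intermediate = tf.keras.Model(inputs=model.input,
                              outputs=model.get_layer('conv_a').output)
features = intermediate(tf.random.normal((1, 32, 32, 3)))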
Keras Layers do have submodules, and we can identify the correct submodule (resnet_model.submodules[8].name returns block_group4 as expected). However, resnet_model.submodules[8].output yields an AttributeError: Layer block_group4 has no inbound nodes.
Is the only option to subclass and redefine call? Or is there another way to get the output of a submodule / an intermediate value of the layer?
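For reference, this is a minimal sketch of the subclass-and-override approach we would like to avoid, since it duplicates the whole call body (attribute names are taken from the snippet above; everything else is assumed):

import tensorflow as tf

class ResnetWithFeatures(Resnet):
    def call(self, inputs, training):
        for layer in self.initial_conv_relu_max_pool:
            inputs = layer(inputs, training=training)
        for layer in self.block_groups:
            inputs = layer(inputs, training=training)
        # Keep the pre-pooling feature map in addition to the pooled output.
        features = inputs
        pooled = tf.reduce_mean(inputs, [1, 2])
        pooled = tf.identity(pooled, 'final_avg_pool')
        return pooled, features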