Output of my model cannot be used as a loss value for the gradient?

Hello,

I am trying to build a custom training loop for a (very) complex model:

x = Input(shape=(2,*input_shape))
total_new_loss = Lambda(self.totals_)(x)

… (plus some other parts of the model; see from here to here)

net = Model(x, [y_aae, total_new_loss]) 
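For context, here is a minimal runnable stand-in for the construction above (TF 1.x / tf.keras); input_shape, totals_, and the y_aae head here are hypothetical placeholders for the real ones:

import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

input_shape = (4,)  # hypothetical; the real one is larger

def totals_(t):
    # hypothetical stand-in: anything that returns a scalar tensor,
    # matching the Abs op shown in the p.s. below
    return tf.reduce_mean(tf.abs(t))

x = Input(shape=(2, *input_shape))
total_new_loss = Lambda(totals_)(x)       # scalar "loss" output
y_aae = Lambda(lambda t: t)(x)            # hypothetical placeholder prediction head
net = Model(x, [y_aae, total_new_loss])

And then my training loop: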

for train_batch_cache in batches:
    with tf.GradientTape() as tape:
        preds, loss_value = net(tf.convert_to_tensor(train_batch_cache))
    gradients = tape.gradient(loss_value, net.trainable_weights)

But then I get:

AttributeError: 'RefVariable' object has no attribute '_id'

(full error log here)

The only solution I have found so far suggests enabling eager mode, but I don't want to do that, for several reasons.
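(For reference, that suggestion amounts to running this before building anything, which is exactly what I want to avoid:)

import tensorflow as tf
tf.enable_eager_execution()  # has to run before any graph ops are created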

Any idea how to solve this?
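Would something along these lines, with tf.gradients in place of the tape, be the right graph-mode pattern? A minimal sketch of what I mean (shapes, layers, and the Lambda body are hypothetical stand-ins):

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Lambda
from tensorflow.keras.models import Model

x = Input(shape=(2, 4))                  # hypothetical shape
h = Dense(3)(x)                          # gives the model trainable weights
loss_out = Lambda(lambda t: tf.reduce_mean(tf.abs(t)))(h)  # scalar loss tensor
net = Model(x, [h, loss_out])

preds, loss_value = net.outputs
grads = tf.gradients(loss_value, net.trainable_weights)    # graph-mode gradients
train_op = tf.train.AdamOptimizer(1e-3).apply_gradients(
    zip(grads, net.trainable_weights))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(8, 2, 4).astype("float32")
    _, current_loss = sess.run([train_op, loss_value], feed_dict={x: batch})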

Many thanks!

Aymeric

P.S.: when I print "loss_value", it shows: Tensor("model_1/lambda_1/Abs:0", shape=(), dtype=float32)