Multiple optimizers in Keras

I'm trying to use two optimizers for different sets of variables, and I am using Keras. The commented-out parts are what I am trying to do, but with a modern TensorFlow/Keras version:

import tensorflow as tf
from tensorflow.keras import optimizers
from tensorflow.keras.models import Model

# var_D = [v for v in tf.global_variables() if v.name.startswith('d')]
# var_G = [v for v in tf.global_variables() if v.name.startswith('g') or v.name.startswith('h')]


optimizer_D = optimizers.Adam(learning_rate=0.0004, beta_1=0.5,
                              beta_2=0.9)  # .minimize(Loss_D, var_list=var_D)
optimizer_G = optimizers.Adam(learning_rate=0.0001, beta_1=0.5,
                              beta_2=0.9)  # .minimize(Loss_G, var_list=var_G)

model = Model(inputs=[X, Y, MASK], outputs=result)
model.compile()
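
Concretely, the behaviour I'm after would look something like this custom training loop (untested sketch: loss_fn_D and loss_fn_G stand in for my real losses, dataset for my input pipeline, and the d/g/h prefixes come from my layer names):

var_D = [v for v in model.trainable_variables if v.name.startswith('d')]
var_G = [v for v in model.trainable_variables
         if v.name.startswith('g') or v.name.startswith('h')]

for x_batch, y_batch, mask_batch in dataset:
    # A persistent tape allows two gradient computations from one forward pass.
    with tf.GradientTape(persistent=True) as tape:
        result = model([x_batch, y_batch, mask_batch], training=True)
        loss_D = loss_fn_D(result)  # placeholder for the real discriminator loss
        loss_G = loss_fn_G(result)  # placeholder for the real generator loss
    optimizer_D.apply_gradients(zip(tape.gradient(loss_D, var_D), var_D))
    optimizer_G.apply_gradients(zip(tape.gradient(loss_G, var_G), var_G))
    del tape  # release the persistent tape once both gradients are taken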

Hi @DerFrederikHD, could you please elaborate more on the issue and let us know if you are facing any errors? Thank you.

TensorFlow v1 code

var_D = [v for v in tf.global_variables() if v.name.startswith('d')]
optimizer_D = tf.train.AdamOptimizer(learning_rate=0.0004, beta1=0.5, beta2=0.9).minimize(Loss_D, var_list=var_D)

TensorFlow v2 code

loss_D = Lambda(lambda t: tf.reduce_mean(...))
optimizer_D = optimizers.Adam(learning_rate=0.0001, beta_1=0.5,
                              beta_2=0.9)
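
While digging, I did find that TF2 optimizers still expose minimize() with a var_list argument, but the loss has to be a zero-argument callable (or a tensor passed together with the GradientTape that recorded it). An untested sketch, where compute_loss_D and the *_batch tensors are placeholders:

var_D = [v for v in model.trainable_variables if v.name.startswith('d')]
# The callable recomputes the loss so minimize() can record gradients itself.
optimizer_D.minimize(
    lambda: compute_loss_D(model([x_batch, y_batch, mask_batch], training=True)),
    var_list=var_D)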

My question is: how can I get the right variables in TensorFlow 2? How can I tell the optimizers that they are part of the model? How can I set the loss? I found that the optimizer has build(var_D) and scale_loss(loss_D); are these the correct functions to use to reproduce the TensorFlow v1 behavior? And how do I get the two optimizers into the model?
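
For what it's worth, as far as I can tell build(var_list) only creates the optimizer's internal slot variables, and scale_loss() is for loss scaling (e.g. under mixed precision), so neither seems to replace v1's minimize(var_list=...). My current best guess for getting both optimizers into the model is to subclass Model and override train_step (untested sketch; compute_loss_D and compute_loss_G stand in for my real losses):

import tensorflow as tf
from tensorflow.keras import Model, optimizers

class TwoOptimizerModel(Model):
    # Untested sketch: both optimizers live on the model, so they are
    # tracked and checkpointed with it, and fit() drives train_step().
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.optimizer_D = optimizers.Adam(learning_rate=0.0004, beta_1=0.5, beta_2=0.9)
        self.optimizer_G = optimizers.Adam(learning_rate=0.0001, beta_1=0.5, beta_2=0.9)

    def train_step(self, data):
        inputs, targets = data  # adjust unpacking to however fit() is fed
        var_D = [v for v in self.trainable_variables if v.name.startswith('d')]
        var_G = [v for v in self.trainable_variables
                 if v.name.startswith('g') or v.name.startswith('h')]
        with tf.GradientTape(persistent=True) as tape:
            pred = self(inputs, training=True)
            loss_D = compute_loss_D(targets, pred)  # placeholder
            loss_G = compute_loss_G(targets, pred)  # placeholder
        self.optimizer_D.apply_gradients(zip(tape.gradient(loss_D, var_D), var_D))
        self.optimizer_G.apply_gradients(zip(tape.gradient(loss_G, var_G), var_G))
        del tape  # release the persistent tape
        return {"loss_D": loss_D, "loss_G": loss_G}

If that is right, model = TwoOptimizerModel(inputs=[X, Y, MASK], outputs=result) followed by an argument-free model.compile() should let model.fit() drive both optimizers, but I'd like to confirm this is the intended approach.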