Model asks me to provide a loss in `model.compile` even though it is handled in a custom training loop

I am trying to train a variational autoencoder on the MNIST dataset. I've implemented a custom training loop for it, but the `compile` method is asking me to pass the `loss` parameter even though I've handled the loss inside the custom training loop.

Here's the link to the code: Variational Autoencoder on MNIST Dataset · GitHub

Error: ValueError: No loss to compute. Provide a "loss" argument in "compile()".

However, if I provide `loss="binary_crossentropy"`, it throws the following error:

```
/usr/local/lib/python3.10/dist-packages/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
    120             # To get the full stack trace, call:
    121             # `keras.config.disable_traceback_filtering()`
--> 122             raise e.with_traceback(filtered_tb) from None
    123         finally:
    124             del filtered_tb

/usr/local/lib/python3.10/dist-packages/optree/ops.py in tree_map(func, tree, is_leaf, none_is_leaf, namespace, *rests)
    745     leaves, treespec = _C.flatten(tree, is_leaf, none_is_leaf, namespace)
    746     flat_args = [leaves] + [treespec.flatten_up_to(r) for r in rests]
--> 747     return treespec.unflatten(map(func, *flat_args))
    748
    749

ValueError: None values not supported.
```
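This second error most likely occurs because `fit` is called with only `x` and no targets, so Keras tries to apply the compiled loss to `y=None`. One common pattern that keeps `fit` working without a `loss` argument is to override `train_step` on a `keras.Model` subclass. Below is a minimal sketch of that pattern (not the code from the linked notebook): the tiny `AutoEncoder` class, its layer sizes, and the random batch are placeholders, and a Keras 3 / TensorFlow backend is assumed.

```python
import tensorflow as tf
import keras


class AutoEncoder(keras.Model):
    """Placeholder model; only the overridden train_step matters here."""

    def __init__(self):
        super().__init__()
        self.net = keras.Sequential([
            keras.layers.Dense(8, activation="relu"),
            keras.layers.Dense(784, activation="sigmoid"),
        ])

    def call(self, x):
        return self.net(x)

    def train_step(self, data):
        # `data` is just x, since fit() is given no targets.
        with tf.GradientTape() as tape:
            recon = self(data, training=True)
            # A real VAE would add the KL-divergence term here.
            loss = tf.reduce_mean(keras.losses.binary_crossentropy(data, recon))
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        return {"loss": loss}


model = AutoEncoder()
model.compile(optimizer="adam")  # no loss argument needed now
x = tf.random.uniform((64, 784))  # stand-in for flattened MNIST images
history = model.fit(x, epochs=1, batch_size=32, verbose=0)
```

Because the compiled loss is never used, `compile(optimizer="adam")` no longer raises either error.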

Hi @Preet_Sojitra, you have defined a custom training loop but are also using the `fit` method to train the model, which is not the suggested way. When you compile a model, it expects a loss function to be passed; if none is given, you get `ValueError: No loss to compute. Provide a "loss" argument in "compile()".` If you handle the loss and optimizer in a custom training loop, you don't need to pass those parameters to `compile` again; in fact, you don't need `compile` or `fit` at all. Please refer to this document for training a model using a custom training loop. Thank you.
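The plain custom-loop approach described above can be sketched as follows. This is a minimal illustration, not the poster's notebook: the tiny `Dense` autoencoder and the random batch stand in for the actual VAE and MNIST data, and a real VAE loss would also include a KL-divergence term.

```python
import tensorflow as tf
import keras

# Placeholder model standing in for the actual VAE.
model = keras.Sequential([
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(784, activation="sigmoid"),
])
optimizer = keras.optimizers.Adam()
bce = keras.losses.BinaryCrossentropy()


@tf.function
def train_step(x):
    # No compile()/fit(): the loss and optimizer live here instead.
    with tf.GradientTape() as tape:
        recon = model(x, training=True)
        loss = bce(x, recon)  # a real VAE adds the KL term here
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss


x = tf.random.uniform((32, 784))  # stand-in for one MNIST batch
for epoch in range(2):
    loss = train_step(x)
```

Since `compile` is never called, there is nothing to raise the `No loss to compute` error in the first place.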