In the meantime, a (relatively inefficient) workaround is to remove trees from the model one at a time (using the model inspector and model builder) and then to evaluate each intermediate model with metrics=[tf.keras.metrics.AUC()].
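The workaround above amounts to computing an AUC curve over growing prefixes of the ensemble. Below is a minimal, self-contained sketch of that idea; the synthetic per-tree votes and the hand-rolled auc() helper are stand-ins for what the real workflow would get from the TF-DF inspector/builder and from tf.keras.metrics.AUC, so treat it as an illustration of the loop, not as the actual TF-DF API.

```python
# Toy illustration: compute AUC for each "prefix" of an ensemble, i.e. the
# model obtained by stripping all trees after the first k. The per-tree votes
# below are synthetic stand-ins for a real TF-DF random forest.

def auc(labels, scores):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    positive example scores above a negative one, ties counted as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(pos) * len(neg))

labels = [0, 0, 0, 1, 1, 1]
# One row per tree: each tree's 0/1 vote for every example (synthetic data).
tree_votes = [
    [0, 1, 0, 1, 1, 0],
    [0, 0, 1, 1, 0, 1],
    [1, 0, 0, 1, 1, 1],
]

# Score of the k-tree model = mean vote of its first k trees.
for k in range(1, len(tree_votes) + 1):
    scores = [sum(votes[i] for votes in tree_votes[:k]) / k
              for i in range(len(labels))]
    print(f"trees={k} auc={auc(labels, scores):.3f}")
```

In the real setting, each iteration would instead rebuild a truncated model from the first k extracted trees, compile it with metrics=[tf.keras.metrics.AUC()], and call evaluate() on the validation dataset, which is why this route is relatively expensive.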
Thanks for the info. I'm trying to find a way to make the model log the AUC metric when calling model.make_inspector().training_logs(). Currently, the output for a classification task looks like this:
TL;DR: With the current code, it is not possible to output AUC in model.make_inspector().training_logs(); doing so would require adding a new parameter and implementing the corresponding logic.