Issue with GradientTape for a custom loss function, TF2.6

I am trying to use a custom loss function in my Keras sequential model (TensorFlow 2.6.0). Ideally, this custom loss would compute the data loss plus the residual of a physical equation (say, the diffusion equation, Navier-Stokes, etc.). The residual depends on the derivative of the model's output with respect to its inputs, so I want to compute it with tf.GradientTape.

In this MWE, I removed the data loss term and the other equation losses, and kept only the derivative of the output with respect to one of its inputs. The dataset can be found here.

from numpy import loadtxt
from keras.models import Sequential
from keras.layers import Dense
import tensorflow as tf #tf.__version__ = '2.6.0'
# load the dataset
dataset = loadtxt('pima-indians-diabetes.csv', delimiter=',')
# split into input (X) and output (y) variables
X = dataset[:,0:8] #X.shape = (768, 8)
y = dataset[:,8]

def customLoss(y_true, y_pred):
    x_tensor = tf.convert_to_tensor(model.input, dtype=tf.float32)  # model.input is a symbolic KerasTensor -> raises the ValueError below
#     x_tensor = tf.cast(x_tensor, tf.float32)
    with tf.GradientTape() as t:
        t.watch(x_tensor)
        output = model(x_tensor)
    DyDX = t.gradient(output, x_tensor)
    dy_t = DyDX[:, 5:6]  # derivative w.r.t. the 6th input feature
    R_pred = dy_t
    # loss_data = tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)
    loss_PDE = tf.reduce_mean(tf.square(R_pred))
    return loss_PDE

model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(12, activation='relu'))
model.add(Dense(12, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss=customLoss, optimizer='adam', metrics=['accuracy'])

model.fit(X, y, epochs=15)

After execution, I get this ValueError:
ValueError: Passed in object of type <class 'keras.engine.keras_tensor.KerasTensor'>, not tf.Tensor

When I change loss=customLoss to loss='mse', the model trains fine, but using customLoss is the whole point. Any ideas?

Have you tried organizing your code by customizing train_step? The problem is that model.input is a symbolic KerasTensor, not a concrete tensor, so tf.convert_to_tensor (and GradientTape) cannot operate on it. Inside a custom train_step you work with the actual batch tensor instead.

Check this example:
https://keras.io/guides/customizing_what_happens_in_fit/

You are actually getting at my point. I was trying to get the job done without going low-level. Is that possible, or should I modify the train_step method? Thanks!

Do you want something like this:

Thank you! Yeah, that was it! Much appreciated!