How to get logits from a trained TensorFlow model

I have a trained TensorFlow classification model (52 classes).
The last 3 layers are:

model.add(LSTM(70, return_sequences=False, unroll=True))
model.add(Dense(50))
model.add(Dense(52, activation='softmax'))

I want to get the logits (the values before the softmax) in order to calibrate the model (with softmax temperature scaling).

How can I get those logits?
If I try it this way:

probabilities = model.predict(input_data)
logits = tf.math.log(probabilities)

I get the wrong logits, because softmax divides the exponential of each logit by the sum of the exponentials of all logits, so the log of the probabilities is not the original pre-softmax values.
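
As a standalone illustration (toy values, not from the model above) of why the log of the softmax output differs from the raw logits:

import tensorflow as tf

z = tf.constant([[2.0, 1.0, 0.1]])   # hypothetical logits
probs = tf.nn.softmax(z)             # what a softmax output layer would return
recovered = tf.math.log(probs)       # equals z - log(sum(exp(z))), not z
print(z.numpy())                     # [[2.  1.  0.1]]
print(recovered.numpy())             # roughly [[-0.417 -1.417 -2.317]]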

If I try it this way:

logits_layer_model = tf.keras.Model(inputs=model.input, outputs=model.layers[-2].output)
logits = logits_layer_model.predict(input_data)

This way, the weights of the last layer are not taken into account: it returns the output of the Dense(50) layer, not the pre-softmax values of the Dense(52) layer.

How should I get the logits?

Hi @SAL, logits are the unnormalized final scores of your model; if you apply softmax to them you get a probability distribution over your classes. Since you used a softmax activation in your last Dense layer, the output of the model is probabilities. If you want the model to output logits directly, you can use a linear activation in the last layer.
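
For example, if retraining is an option, this is a sketch of a model that outputs logits directly (it reuses the layer sizes from your question; the input shape, optimizer, and loss are assumptions you would adapt to your setup):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20, 16)),   # hypothetical (timesteps, features)
    # ... any earlier layers of your network ...
    tf.keras.layers.LSTM(70, return_sequences=False, unroll=True),
    tf.keras.layers.Dense(50),
    tf.keras.layers.Dense(52),        # no activation, so the model outputs logits
])
model.compile(
    optimizer='adam',                 # assumed optimizer, use whatever you trained with
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)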

If you want to get the logits from the already trained model, you can take the model's layers up to the final softmax and pass the test data through them to get the logits. Please refer to this gist for a code example. Thank you.
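
The gist is not reproduced here, but a minimal sketch of the idea, assuming `model` and `input_data` from the question, is to rebuild the final Dense layer with a linear activation and copy over its trained weights, so the trained last layer is still taken into account:

import tensorflow as tf

# New head: same shape as the trained Dense(52, activation='softmax') layer,
# but with a linear activation so it outputs logits.
logits_layer = tf.keras.layers.Dense(52, activation=None, name='logits')
logits_output = logits_layer(model.layers[-2].output)
logits_model = tf.keras.Model(inputs=model.input, outputs=logits_output)

# Reuse the trained kernel and bias of the original softmax layer.
logits_layer.set_weights(model.layers[-1].get_weights())

logits = logits_model.predict(input_data)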