Feature importance and class importance

I have the following trained TensorFlow model for time series classification:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM, Dense

model = Sequential()
model.add(Masking(mask_value=0.0, input_shape=(90, 8)))
model.add(LSTM(100, return_sequences=True))
model.add(LSTM(70, return_sequences=True))
model.add(LSTM(70, return_sequences=False))
model.add(Dense(20, activation='relu'))
model.add(Dense(22, activation='softmax'))
  • The input to the model has the shape (Batch_size, 90, 8)
  • 90 - time series length
  • 8 - number of features
  • The output of the model has 22 classes
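As a quick sanity check on these shapes, the model above can be rebuilt and run on a dummy batch (a sketch; the random input just stands in for real data):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM, Dense

model = Sequential([
    Masking(mask_value=0.0, input_shape=(90, 8)),
    LSTM(100, return_sequences=True),
    LSTM(70, return_sequences=True),
    LSTM(70, return_sequences=False),
    Dense(20, activation='relu'),
    Dense(22, activation='softmax'),
])

# Dummy batch of 4 sequences: (Batch_size, 90, 8)
x = np.random.rand(4, 90, 8).astype('float32')
probs = model.predict(x, verbose=0)
print(probs.shape)  # (4, 22): one probability per class per sample
```

Each row of `probs` sums to 1 because of the final softmax.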

I want to get the importance of the features, so I'm using the following code:

first_layer_weights = model.layers[1].get_weights()[0]  # first LSTM kernel, shape (8, 4*100)
feature_importance  = np.abs(first_layer_weights).sum(axis=1)  # one score per input feature, shape (8,)
  1. Am I right that higher values in feature_importance mean the corresponding feature is more important than the others?

  2. I want to check if the model gives more attention (higher probability) to some classes.
    I'm using this code to get the bias vector of the last layer:

last_layer_weights = model.layers[-1].get_weights()[1]  # biases of the final Dense layer, shape (22,)
feature_importance = np.abs(last_layer_weights)
  • Does my interpretation for the features also hold for the last layer - do higher values in feature_importance mean more importance, and thus that some classes may get a higher probability of being selected by the model?
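For reference, the array shapes behind both snippets can be inspected directly (a sketch using the untrained model above; the variable names are mine):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Masking, LSTM, Dense

model = Sequential([
    Masking(mask_value=0.0, input_shape=(90, 8)),
    LSTM(100, return_sequences=True),
    LSTM(70, return_sequences=True),
    LSTM(70, return_sequences=False),
    Dense(20, activation='relu'),
    Dense(22, activation='softmax'),
])

# First LSTM input kernel: one row per input feature; the columns
# cover the 4 LSTM gates x 100 units.
kernel = model.layers[1].get_weights()[0]
print(kernel.shape)  # (8, 400)

feature_importance = np.abs(kernel).sum(axis=1)
print(feature_importance.shape)  # (8,): one score per input feature

# Final Dense layer: get_weights()[0] is the kernel, get_weights()[1] the biases.
class_biases = model.layers[-1].get_weights()[1]
print(class_biases.shape)  # (22,): one bias per class
```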


Hi @Svl ,

Your interpretation generally holds for the first LSTM layer: summing the absolute kernel weights per input feature gives a rough proxy for how strongly each feature drives the layer's activations. It is only a proxy, though - weight magnitude ignores the scale of the inputs and everything the later layers do, so it is worth validating with a held-out check such as permutation importance.

For the last layer, be careful: get_weights()[1] returns the biases of the output Dense layer, not its weights. A large positive bias raises that class's logit and hence its softmax probability, while a large negative bias lowers it, so taking the absolute value mixes the two directions. Compare the signed biases instead.
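For example, a simple permutation-importance check (a sketch; `X_val` and `y_val` are hypothetical validation arrays of shape `(n, 90, 8)` and `(n,)` - a bigger accuracy drop after shuffling a feature means that feature matters more):

```python
import numpy as np

def permutation_importance(model, X_val, y_val, n_repeats=3, rng=None):
    """Mean drop in accuracy when each input feature is shuffled across samples."""
    rng = rng or np.random.default_rng(0)
    base = (model.predict(X_val, verbose=0).argmax(axis=1) == y_val).mean()
    n_features = X_val.shape[-1]
    drops = np.zeros(n_features)
    for f in range(n_features):
        for _ in range(n_repeats):
            Xp = X_val.copy()
            perm = rng.permutation(len(Xp))
            Xp[:, :, f] = Xp[perm, :, f]  # shuffle feature f across samples
            acc = (model.predict(Xp, verbose=0).argmax(axis=1) == y_val).mean()
            drops[f] += (base - acc) / n_repeats
    return drops  # higher drop -> more important feature
```

Unlike raw weight magnitudes, this measures importance through the full model, on actual data.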

Thank you!