Embedding layer error message

Hello, maybe there is someone who can help me. I’m trying to build a Bidirectional model and got this error:
indices[22,5] = -1 is not in [0, 184)
[[{{node sequential_28/embedding_20/embedding_lookup}}]] [Op:__inference_train_function_233341]

Here is my code:
model = keras.Sequential([
keras.layers.Embedding(input_dim=len(X_train), output_dim=len(X_train), input_length=1),
keras.layers.Bidirectional(keras.layers.LSTM(64, return_sequences=True)),
keras.layers.Bidirectional(keras.layers.LSTM(64)),
keras.layers.Dense(128, activation='relu'),
keras.layers.Dense(1, activation='sigmoid')
])

I’ve already checked X_train and y_train; they have the same length, and neither contains a value of -1.

Here are my X_train and X_test:


Hi @winter.
Unfortunately the images you added at the end of your message don’t render on the forum.
Anyway, if you look at the documentation, you’ll see in the arguments section:

  • input_dim: Integer. Size of the vocabulary, i.e. maximum integer index + 1.
  • output_dim: Integer. Dimension of the dense embedding.

Meanwhile, in your code you set both of these keras.layers.Embedding arguments to len(X_train), which is the number of training samples rather than the vocabulary size or the embedding dimension.

Please make the appropriate changes; see the sketch below.
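For example, something along these lines (a minimal sketch, assuming X_train contains non-negative integer token indices and you are doing binary classification; vocab_size and embedding_dim are names introduced here just for illustration):

import numpy as np
from tensorflow import keras

# Hypothetical setup: derive the vocabulary size from the data itself.
vocab_size = int(np.max(X_train)) + 1   # size of the vocabulary, i.e. maximum integer index + 1
embedding_dim = 64                      # dimension of the dense embedding (a tunable choice)

model = keras.Sequential([
    keras.layers.Embedding(input_dim=vocab_size, output_dim=embedding_dim, input_length=1),
    keras.layers.Bidirectional(keras.layers.LSTM(64, return_sequences=True)),
    keras.layers.Bidirectional(keras.layers.LSTM(64)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid')
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

The key point is that input_dim must be at least max(index) + 1 so every index in your data falls inside [0, input_dim), and output_dim is simply the size of the learned embedding vectors, independent of how many training samples you have.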
Thank you.