LSTM Keras API: Handling the two cases y<t> = x<t+1> and y<t> = x<t>

I’m learning how to implement an LSTM using the Keras API, and have a question regarding how to handle two different cases: y<t> = x<t> and y<t> = x<t+1>.

For example, the following code builds a simple LSTM in Keras:

import tensorflow as tf

inputs = tf.random.normal([32, 10, 8])
lstm = tf.keras.layers.LSTM(4, return_sequences=True, return_state=True)
whole_seq_output, final_memory_state, final_carry_state = lstm(inputs)

When training for a named entity recognition task, y<t> = x<t>, but when doing inference (next-step prediction), y<t> = x<t+1>. However, there seems to be no argument/parameter for specifying these cases when initializing an LSTM in Keras.
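To make the two cases concrete, here is a toy sketch (made-up token ids and labels, just my own illustration) of how I would construct the targets in each case:

import numpy as np

# Toy sequence of token ids, shape (batch=1, time=5); values are made up.
x = np.array([[10, 20, 30, 40, 50]])

# Case 1: sequence labelling (e.g. named entity recognition), y<t> = x<t>.
# Targets are aligned one-to-one with the inputs (here, made-up label ids).
y_tagging = np.array([[1, 0, 0, 2, 0]])   # one label per input token

# Case 2: next-step prediction, y<t> = x<t+1>.
# The target is simply the input shifted left by one time step.
x_lm = x[:, :-1]   # [[10, 20, 30, 40]]
y_lm = x[:, 1:]    # [[20, 30, 40, 50]]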

Are these cases handled internally by the Keras API itself, or is there something I’m missing?

Thanks.


Hi,

During training, an LSTM processes a long sequence (x(1)…x(n)) through its internal state, which maintains a history of the sequence. At each time step t, it predicts an output based on the current input and the previous state.
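For example, a per-time-step setup could look roughly like this (a minimal sketch with assumed shapes, not the only way to do it):

import tensorflow as tf

# Minimal sketch: the LSTM returns an output at every time step, and a Dense
# layer maps each of those outputs to a prediction, so y<t> lines up with x<t>.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),                   # 10 time steps, 8 features (assumed shapes)
    tf.keras.layers.LSTM(4, return_sequences=True),  # output at every time step
    tf.keras.layers.Dense(1),                        # one prediction per time step
])
model.compile(optimizer="adam", loss="mse")

x = tf.random.normal([32, 10, 8])   # inputs
y = tf.random.normal([32, 10, 1])   # per-step targets aligned with x
model.fit(x, y, epochs=1, verbose=0)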

During inference, we feed the model an input sequence and get a prediction. When the next input arrives, it is appended to the previous inputs and the first element is removed to form the next sequence, i.e. a sliding window.
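A rough sketch of that sliding-window loop (using a hypothetical, untrained model that predicts one next value from a window of 10 past values) might look like:

import numpy as np
import tensorflow as tf

# Hypothetical model: predicts the next value from the 10 most recent ones.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 1)),
    tf.keras.layers.LSTM(4),     # final hidden state only
    tf.keras.layers.Dense(1),    # prediction for the next value
])

window = np.zeros((1, 10, 1), dtype="float32")   # the 10 most recent observations
new_observations = [0.1, 0.2, 0.3]               # made-up values arriving one at a time

for value in new_observations:
    # Slide the window: drop the oldest element, append the newly arrived value.
    new_step = np.array(value, dtype="float32").reshape(1, 1, 1)
    window = np.concatenate([window[:, 1:, :], new_step], axis=1)
    next_prediction = model.predict(window, verbose=0)   # shape (1, 1)
    print(float(next_prediction[0, 0]))

When the true next value is not available, the model’s own prediction is often fed back into the window in the same way.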

Here you can find more about sequence-to-sequence modelling in Keras.