How is the forward pass for SimpleRNN computed? There are 3 kernels: an input kernel of shape (input_dim, units), a recurrent kernel of shape (units, units), and a bias of shape (units, 1). I was not able to manually reproduce the final_state obtained from calling the layer:
simple_rnn = tf.keras.layers.SimpleRNN(4, return_sequences=True, return_state=True)
whole_sequence_output, final_state = simple_rnn(inputs)
Hi @Lu_Bin_Liu
Welcome to the TensorFlow Forum!
SimpleRNN processes the input sequence one time step at a time. Starting from an initial hidden state of zeros, each step combines the current input with the previous hidden state using the layer's kernel, recurrent kernel, and bias, then applies the activation (tanh by default): h_t = tanh(x_t @ kernel + h_(t-1) @ recurrent_kernel + bias). Note that the bias actually has shape (units,) rather than (units, 1). With return_sequences=True the layer returns every h_t, and final_state is simply the last hidden state. Please refer to this link for more details.
import numpy as np
import tensorflow as tf

inputs = np.random.random([32, 10, 8]).astype(np.float32)  # (batch, timesteps, features)
simple_rnn = tf.keras.layers.SimpleRNN(4, return_sequences=True, return_state=True)
whole_sequence_output, final_state = simple_rnn(inputs)
print(whole_sequence_output.shape)  # hidden state at every time step
print(final_state.shape)            # hidden state at the last time step only
Output:
(32, 10, 4)
(32, 4)
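To reproduce final_state by hand, you can continue from the snippet above and unroll the recurrence in NumPy. This is a minimal sketch assuming the layer's defaults (tanh activation, zero initial state, use_bias=True); get_weights() returns the input kernel, recurrent kernel, and bias in that order:

# kernel: (8, 4), recurrent_kernel: (4, 4), bias: (4,)
kernel, recurrent_kernel, bias = simple_rnn.get_weights()

h = np.zeros((inputs.shape[0], 4), dtype=np.float32)  # initial hidden state is all zeros
states = []
for t in range(inputs.shape[1]):
    # h_t = tanh(x_t @ kernel + h_(t-1) @ recurrent_kernel + bias)
    h = np.tanh(inputs[:, t, :] @ kernel + h @ recurrent_kernel + bias)
    states.append(h)
manual_sequence = np.stack(states, axis=1)

print(np.allclose(manual_sequence, whole_sequence_output.numpy(), atol=1e-5))  # True
print(np.allclose(h, final_state.numpy(), atol=1e-5))                          # True

The last h computed in the loop matches final_state, and stacking the per-step states reproduces whole_sequence_output.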