On-Device Training for LSTM or GRU Models

Hi, I’m new to TensorFlow and I’m trying to build an LSTM or GRU model that can be re-trained on-device (Android) with tabular data (mostly customer interactions).

I’m referencing this example:
On-Device Training

That example uses a CNN, and I’m not able to understand how to enable on-device training for an LSTM or GRU model.
Are there any examples for reference? Thanks

Hi, @yycheng

I apologize for the delayed response. LiteRT supports converting TensorFlow RNN models to LiteRT’s fused LSTM operations. Fused operations exist to maximize the performance of their underlying kernel implementations, as well as to provide a higher-level interface for defining complex transformations like quantization.
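For instance, a standard Keras LSTM goes through the normal converter path and gets fused automatically. The snippet below is only a minimal sketch with made-up shapes (10 timesteps × 8 features) to illustrate that path; it is not code taken from any tutorial:

```python
import tensorflow as tf

# Minimal sketch with made-up shapes (10 timesteps x 8 features); the point
# is only that a standard Keras LSTM goes through the normal converter path.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# The converter maps tf.keras.layers.LSTM to the fused
# UnidirectionalSequenceLSTM op where it can.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("lstm_model.tflite", "wb") as f:
    f.write(tflite_model)
```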

Since there are many variants of RNN APIs in TensorFlow, our approach has been twofold:

  1. Provide native support for standard TensorFlow RNN APIs like Keras LSTM. This is the recommended option (see the on-device training sketch after this list).
  2. Provide an interface into the conversion infrastructure for user-defined RNN implementations to plug in and get converted to LiteRT. We provide a couple of out-of-the-box examples of such conversion using lingvo’s LSTMCellSimple and LayerNormalizedLSTMCellSimple RNN interfaces. Please refer to the official documentation.
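For your on-device training question specifically, the structure of the CNN tutorial carries over: wrap the Keras LSTM in a tf.Module, expose `train` and `infer` as tf.function signatures, and convert with resource variables and Select TF ops enabled. The sketch below is my own adaptation, not an official LSTM sample; the shapes, layer sizes, and the assumption that the training graph converts through SELECT_TF_OPS (rather than the fused kernel) are mine and haven’t been benchmarked on a device:

```python
import tensorflow as tf

# Hypothetical tabular-sequence shapes: WINDOW timesteps of NUM_FEATURES
# features per example, NUM_CLASSES output classes -- adjust to your data.
WINDOW, NUM_FEATURES, NUM_CLASSES = 10, 8, 4


class TrainableLSTM(tf.Module):
    """Keras LSTM wrapped with explicit train/infer signatures, mirroring
    the structure of the CNN on-device training example."""

    def __init__(self):
        self.model = tf.keras.Sequential([
            tf.keras.Input(shape=(WINDOW, NUM_FEATURES)),
            tf.keras.layers.LSTM(32),  # swap in tf.keras.layers.GRU(32) if preferred
            tf.keras.layers.Dense(NUM_CLASSES),
        ])
        self.model.compile(
            optimizer="sgd",
            loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True))

    @tf.function(input_signature=[
        tf.TensorSpec([None, WINDOW, NUM_FEATURES], tf.float32),
        tf.TensorSpec([None, NUM_CLASSES], tf.float32),
    ])
    def train(self, x, y):
        # One gradient step, exported as its own LiteRT signature.
        with tf.GradientTape() as tape:
            logits = self.model(x)
            loss = self.model.loss(y, logits)
        grads = tape.gradient(loss, self.model.trainable_variables)
        self.model.optimizer.apply_gradients(
            zip(grads, self.model.trainable_variables))
        return {"loss": loss}

    @tf.function(input_signature=[
        tf.TensorSpec([None, WINDOW, NUM_FEATURES], tf.float32),
    ])
    def infer(self, x):
        return {"output": tf.nn.softmax(self.model(x))}


m = TrainableLSTM()
tf.saved_model.save(
    m, "trainable_lstm_saved_model",  # hypothetical export path
    signatures={
        "train": m.train.get_concrete_function(),
        "infer": m.infer.get_concrete_function(),
    })

converter = tf.lite.TFLiteConverter.from_saved_model("trainable_lstm_saved_model")
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # built-in ops, including the fused LSTM kernel
    tf.lite.OpsSet.SELECT_TF_OPS,    # TF fallback ops needed by the training graph
]
converter.experimental_enable_resource_variables = True

with open("trainable_lstm.tflite", "wb") as f:
    f.write(converter.convert())
```

Keeping the optimizer step inside the exported `train` signature is what lets the on-device interpreter update the weights without any Python. The CNN example additionally exports `save` and `restore` signatures (via tf.raw_ops.Save/Restore) so retrained weights persist across app restarts; the same pattern applies here unchanged.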

NOTE: TFLite has been renamed to LiteRT; please refer to this blog post, and also to this Google Colab notebook, which may help you solve your issue.
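Once the model converts (in that Colab or locally), you can sanity-check the exported signatures with the Python interpreter before wiring up the Android side. The file name and shapes below simply follow the sketch above:

```python
import numpy as np
import tensorflow as tf

# Load the converted model from the sketch above and run its signatures
# directly from Python as a quick sanity check.
interpreter = tf.lite.Interpreter(model_path="trainable_lstm.tflite")
interpreter.allocate_tensors()

train = interpreter.get_signature_runner("train")
infer = interpreter.get_signature_runner("infer")

x = np.random.rand(2, 10, 8).astype(np.float32)   # batch of 2 sequences
y = np.eye(4, dtype=np.float32)[[0, 1]]           # one-hot labels

print(train(x=x, y=y)["loss"])   # one training step
print(infer(x=x)["output"])      # predictions after that step
```

On the Android side, the same `train` and `infer` signatures are then invoked through the Interpreter’s runSignature API, the same way the CNN example invokes its signatures.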

Thank you for your cooperation and patience.