TFLM model with TFLite

Hello,

I am working on a project that has been using TensorFlow Lite Micro (TFLM) for the past two years. The target is an ARMv7 CPU with NEON. While digging into optimizing TFLM, it appears to me that we should actually be using TensorFlow Lite, NOT Micro, so that we can utilize XNNPACK and its NEON optimizations. There is some hesitancy to move to TensorFlow Lite because of the nature of our team's workflow.
Although we will move to TensorFlow Lite at some point, could I in the meantime use the model generated for TFLM with the TensorFlow Lite library?
Thank you

Hi @nvinyard,

You might have a solution by this time. Here are some pointers.
Usually, TFLM models are intended for microcontrollers with very limited resources, and they may not be fully compatible with the TFLite library due to differences in supported operators and data formats.
Check whether your TFLM model uses TFLite-compatible operators and data types. Then check for compatibility issues using the TFLite Converter. If compatibility is high, load the TFLM model directly into TFLite and check its performance.
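Since a TFLM model is a standard `.tflite` flatbuffer, a quick compatibility check is to load it with `tf.lite.Interpreter` and run a dummy inference; if an op is unsupported, `allocate_tensors()` fails with a descriptive error. A minimal sketch, assuming TensorFlow is installed on your dev machine; it builds a tiny stand-in model because your actual file path is project-specific:

```python
import numpy as np
import tensorflow as tf

# Stand-in for your existing TFLM flatbuffer: build and convert a tiny model.
# In practice, point model_content (or model_path) at the same .tflite file
# you already deploy with TFLM -- the flatbuffer format is identical.
keras_model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(keras_model).convert()

# Load the flatbuffer with the full TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()  # raises here if an op is unsupported

# Run one dummy inference to confirm the model executes end to end.
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.ones(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

If this loads and runs, the model file itself needs no conversion step to move from TFLM to TFLite.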
If your TFLM model is incompatible with TFLite, retrain it with a TFLite-compatible configuration throughout the training phase, so that the model uses only TFLite-supported operations and data types.
Convert the retrained model with the TFLite Converter, and enable XNNPACK and NEON optimizations at inference time for better performance. Please let us know if you run into any issues while converting the model.
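A sketch of the convert-then-run flow, assuming TensorFlow is installed and using a hypothetical stand-in for your retrained model. Note that XNNPACK is a runtime delegate rather than a converter option: recent prebuilt TensorFlow releases apply it by default to float models in the Python interpreter, and on your ARMv7 target you would build the TFLite runtime with XNNPACK (which uses NEON kernels) enabled.

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for your retrained model; replace with your own.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optional: default optimizations (dynamic-range weight quantization here,
# since no representative dataset is provided).
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()

# Recent TF releases apply the XNNPACK delegate to float models by default;
# num_threads lets the delegate use multiple cores.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes, num_threads=4)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

For the embedded build itself, the equivalent switch is enabling XNNPACK when compiling the TFLite C++ runtime for your target.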

Thank You