I have a TFLite model designed for on-device training, with restore, train, and save signatures. I have successfully run it in my own app using the Java API.
However, when I try to run it using the TensorFlow Lite shared library that I downloaded from Maven Central, I encounter an error.
For the restore and train signatures, I followed these steps: set input tensors → TfLiteSignatureRunnerInvoke → read output tensors. However, when TfLiteSignatureRunnerInvoke runs, the error log reports that Select TensorFlow op(s) are not supported by this interpreter.
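For reference, here is a simplified sketch of the C API calls I am making for the train signature (restore follows the same pattern); the tensor names, shapes, and buffer sizes below are placeholders for my real model:

```c
/* Minimal sketch of the TFLite C API calls (placeholder names/sizes). */
#include <stdio.h>

#include "tensorflow/lite/c/c_api.h"
/* Depending on the TF Lite version, the signature-runner functions may be
 * declared in c_api_experimental.h instead of c_api.h. */
#include "tensorflow/lite/c/c_api_experimental.h"

int main(void) {
  TfLiteModel* model = TfLiteModelCreateFromFile("model.tflite");
  TfLiteInterpreterOptions* options = TfLiteInterpreterOptionsCreate();
  TfLiteInterpreter* interpreter = TfLiteInterpreterCreate(model, options);

  TfLiteSignatureRunner* train =
      TfLiteInterpreterGetSignatureRunner(interpreter, "train");
  TfLiteSignatureRunnerAllocateTensors(train);

  /* Step 1: set input tensors (placeholder input names "x" and "y"). */
  float x[784] = {0};
  float y[10] = {0};
  TfLiteTensorCopyFromBuffer(
      TfLiteSignatureRunnerGetInputTensor(train, "x"), x, sizeof(x));
  TfLiteTensorCopyFromBuffer(
      TfLiteSignatureRunnerGetInputTensor(train, "y"), y, sizeof(y));

  /* Step 2: invoke. This is where the "Select TensorFlow op(s) ... not
   * supported by this interpreter" error is reported. */
  TfLiteStatus status = TfLiteSignatureRunnerInvoke(train);

  /* Step 3: read output tensors (placeholder output name "loss"). */
  float loss = 0.f;
  TfLiteTensorCopyToBuffer(
      TfLiteSignatureRunnerGetOutputTensor(train, "loss"), &loss, sizeof(loss));
  printf("status=%d loss=%f\n", status, loss);

  TfLiteSignatureRunnerDelete(train);
  TfLiteInterpreterDelete(interpreter);
  TfLiteInterpreterOptionsDelete(options);
  TfLiteModelDelete(model);
  return 0;
}
```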
Here is the error message:
“Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding the ‘org.tensorflow:tensorflow-lite-select-tf-ops’ dependency. See instructions: Select TensorFlow operators | TensorFlow Lite. Node number 151 (FlexRestore) failed to prepare.”
I would like to know whether there is any other way to address this issue. Does the TensorFlow Lite C API support on-device training, i.e., restoring weights and training a new model?
Thank you very much in advance.