I am facing an issue while trying to invoke the interpreter:
File "/home/ubuntu/Documents/pythonProject1/tensorflow_lite.py", line 48, in
    interpreter.invoke()
File "/home/ubuntu/Documents/pythonProject1/venv/lib/python3.8/site-packages/tflite_runtime/interpreter.py", line 917, in invoke
    self._interpreter.Invoke()
RuntimeError: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
Node number 5 (FlexTensorListReserve) failed to prepare.
Can you share the code you used to convert your model to .tflite?
It seems you have not included SELECT_TF_OPS while converting.
Take a look at this.
If I am not using
converter.target_spec.supported_ops = [
tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
then I get the error below:
/home/hasher/.local/lib/python3.8/site-packages/tensorflow/python/saved_model/save.py:1276:0: error: failed to legalize operation 'tf.TensorListReserve' that was explicitly marked illegal
<unknown>:0: note: loc(fused["StatefulPartitionedCall:", "StatefulPartitionedCall"]): called from
<unknown>:0: error: Lowering tensor list ops is failed. Please consider using Select TF ops and disabling `_experimental_lower_tensor_list_ops` flag in the TFLite converter object. For example, converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
converter._experimental_lower_tensor_list_ops = False
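For reference, the converter's own suggestion can be sketched as a small helper. This is a minimal sketch, assuming a Keras .h5 model; the path and function name are illustrative, not from the original post:

```python
import tensorflow as tf

def convert_with_select_tf_ops(h5_path: str) -> bytes:
    """Convert a Keras .h5 model to TFLite, allowing Select TF ops."""
    model = tf.keras.models.load_model(h5_path)
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite builtin ops
        tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF ops (e.g. TensorListReserve)
    ]
    # Keep tensor-list ops as TF ops instead of trying (and failing) to lower them,
    # as the converter error message suggests.
    converter._experimental_lower_tensor_list_ops = False
    return converter.convert()
```

Note that a model converted this way will only run on interpreters that link the Flex delegate.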
Could you please help me with converting the .h5 model without using Select TF ops?
There is documentation here, but I am not sure whether you can build a wheel for tflite-runtime that includes Select TF ops.
Let’s tag @khanhlvg to shed some light on whether the tflite-runtime wheel build can also include these ops.
The issue arises because your TensorFlow Lite model uses TensorFlow operations that are not natively supported by the standard TensorFlow Lite interpreter. To resolve this, you must enable the TensorFlow Lite Flex delegate, which allows the interpreter to execute the unsupported TensorFlow operations.
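One way to get a Flex-capable interpreter without building a custom tflite-runtime wheel is to use the interpreter bundled with the full `tensorflow` pip package, which links the Flex delegate. A minimal sketch, assuming the full package is installed; the model path and function name are illustrative:

```python
import numpy as np
import tensorflow as tf

def run_tflite_model(tflite_path: str, input_data: np.ndarray) -> np.ndarray:
    """Run one inference using the interpreter from the full tensorflow package."""
    interpreter = tf.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    interpreter.set_tensor(input_details[0]["index"], input_data)
    interpreter.invoke()  # Flex ops are dispatched to TensorFlow kernels here
    return interpreter.get_tensor(output_details[0]["index"])
```

With the standalone tflite_runtime wheel, by contrast, the Flex delegate is not included by default, which is exactly why `interpreter.invoke()` raised the RuntimeError above.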