I have trained my own model, saved it in the .tflite format on my MacBook, and tested it for inference on the same machine using TensorFlow and Keras. It worked fine.
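For reference, the macOS test that worked was roughly along these lines (the model path, dummy input shape, and dtype below are placeholders for my actual setup, not the exact code):

import numpy as np
import tensorflow as tf

TF_MODEL_FILE_PATH = "model.tflite"  # placeholder path

interpreter = tf.lite.Interpreter(model_path=TF_MODEL_FILE_PATH)
classify = interpreter.get_signature_runner("serving_default")

# Look up the signature's input name so the test call matches the model.
sig = interpreter.get_signature_list()["serving_default"]
input_name = sig["inputs"][0]

# Dummy input; the shape/dtype stand in for my real preprocessing.
dummy = np.zeros((1, 180, 180, 3), dtype=np.float32)
predictions = classify(**{input_name: dummy})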
However, when I copy the model to a Raspberry Pi and try to run inference with tflite_runtime using the following code:
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path=TF_MODEL_FILE_PATH)
signatures = interpreter.get_signature_list()
classify_lite = interpreter.get_signature_runner('serving_default')
interpreter.allocate_tensors()
I am getting the following error:
RuntimeError: There is at least 1 reference to internal data
in the interpreter in the form of a numpy array or slice. Be sure to
only hold the function returned from tensor() if you are using raw
data access.
Could you please advise how to get around this problem?