I converted a PyTorch model to TFLite via ONNX and deploy it in Flutter with the package `tflite_flutter: ^0.9.0`. However, the same TFLite model produces different output on Colab than in Flutter (the Colab output is the correct one). On Colab, the image must be transposed to [1, 3, 256, 256] before it is fed to the model, but I don't know how to transpose the image to that shape in Flutter.
After I run `interpreter.run(tensorImage.buffer, probabilityBuffer.buffer)`, the output differs from the Colab output. The Flutter output is a 1D array of length 196608 (which is 3 × 256 × 256, i.e. the flattened [1, 3, 256, 256] tensor).
How do I feed an image to the model with input shape [1, 3, 256, 256] in Flutter?
I don't have a good answer to the question, but out of curiosity, what does this model do?