TensorFlow model.predict gives instant results in the Android Emulator but gets stuck on iPhone/iPad after deployment to the Expo app

I’m totally new to TensorFlow. I’ve just trained a custom model in Teachable Machine and successfully made correct predictions after loading the model in my React Native app.

let model = await tf.loadLayersModel(bundleResourceIO(modelJson, modelWeights));
let prediction = model.predict(inputTensor);
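
More fully, the flow looks roughly like this (the asset paths, variable names, and the 224x224 input shape are placeholders for my actual Teachable Machine assets):

import * as tf from '@tensorflow/tfjs';
import { bundleResourceIO } from '@tensorflow/tfjs-react-native';

// Model files bundled with the app (placeholder paths)
const modelJson = require('./assets/model/model.json');
const modelWeights = require('./assets/model/weights.bin');

await tf.ready(); // make sure the tfjs backend has initialized
const model = await tf.loadLayersModel(bundleResourceIO(modelJson, modelWeights));

// Placeholder input; in the real app this tensor is built from a camera image
const inputTensor = tf.zeros([1, 224, 224, 3]);
const prediction = model.predict(inputTensor);
console.log(await prediction.data());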

These predictions are really fast and accurate in the Android Emulator. But after the app is deployed, the prediction step gets stuck without any error and returns no result (on iPhone/iPad).

I’ve also tried to use TFLite instead, but had no success loading the .tflite file locally.

The TensorFlow.js backend in use is rn-webgl.
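
For reference, this is how I check the backend at runtime (the setBackend('cpu') line is only a debugging idea I have not yet tried on the device):

await tf.ready();
console.log(tf.getBackend()); // prints 'rn-webgl' in my case
// Possible debugging step: force the CPU backend to rule out a WebGL issue
// await tf.setBackend('cpu');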

What could cause this huge performance difference between the two environments?

Hi @can777 ,

I apologize for the delay in my response. There could be several reasons why the Expo app is getting stuck on your iOS devices. The most likely one is that memory management on iOS is handled differently than on Android. I found this blog that might help explain the differences in memory management between iOS and Android.

If the model size or the prediction input size is too large for your iOS device’s available memory, or if the device’s processing thread is blocked by another process, it could cause the app to hang.
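
If memory pressure is the cause, one thing worth trying (just a rough sketch; inputTensor stands for whatever input you pass to predict) is wrapping prediction in tf.tidy() and disposing tensors once you have read the values, so intermediate tensors do not accumulate between predictions:

const result = tf.tidy(() => model.predict(inputTensor)); // intermediates are freed, result is kept
const values = await result.data(); // copy the values out of the tensor
result.dispose(); // free the result tensor itself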

Could you please share the error you’re getting when loading the TFLite model locally? You can try loading the model directly using the following function:

tflite.loadTFLiteModel('url/to/your/model.tflite');
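
For example, something along these lines (a minimal sketch; the zero tensor is just a placeholder for whatever input shape your model expects):

import * as tf from '@tensorflow/tfjs';
import * as tflite from '@tensorflow/tfjs-tflite';

const tfliteModel = await tflite.loadTFLiteModel('url/to/your/model.tflite');
const output = tfliteModel.predict(tf.zeros([1, 224, 224, 3])); // placeholder input
console.log(await output.data());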

Thank you!