Hi,
I’ve been trying to use EfficientDet-Lite0 for object detection with TFLite. I tried it on Android (Samsung A32) with both the NNAPI delegate and the CPU. Inference time on the CPU is <100 ms, whereas with NNAPI it’s >1000 ms. I’m hoping someone can confirm whether this is expected and, if so, why.
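
For context, here is a minimal sketch of how I’m setting up and timing the two paths (assuming the standard TFLite Java/Kotlin `Interpreter` API; the model buffer and the input/output containers are placeholders, not my exact code):

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.nio.MappedByteBuffer

// Builds one interpreter per execution path for the same EfficientDet-Lite0 model.
// `model` is the .tflite file mapped into memory (e.g. via FileUtil.loadMappedFile).
fun buildInterpreters(model: MappedByteBuffer): Pair<Interpreter, Interpreter> {
    // CPU path: plain CPU kernels on a fixed thread count.
    val cpuInterpreter = Interpreter(model, Interpreter.Options().setNumThreads(4))

    // NNAPI path: hand supported ops to whatever accelerator the vendor driver exposes.
    val nnapiInterpreter = Interpreter(
        model,
        Interpreter.Options().addDelegate(NnApiDelegate())
    )
    return Pair(cpuInterpreter, nnapiInterpreter)
}

// Average latency in ms over `runs` invocations, after one warm-up call so that
// one-time costs (delegate compilation, tensor allocation) don't skew the result.
// EfficientDet-Lite0 has multiple detection outputs, hence runForMultipleInputsOutputs.
fun averageLatencyMs(
    interpreter: Interpreter,
    inputs: Array<Any>,
    outputs: Map<Int, Any>,
    runs: Int = 50
): Double {
    interpreter.runForMultipleInputsOutputs(inputs, outputs) // warm-up
    val start = System.nanoTime()
    repeat(runs) { interpreter.runForMultipleInputsOutputs(inputs, outputs) }
    return (System.nanoTime() - start) / 1e6 / runs
}
```

(In the real app the `NnApiDelegate` instance is kept around and closed when the interpreter is released; it’s inlined here only to keep the sketch short.)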
Thanks