I’m working with TFLite on Android. My model is a MobileNetV2 backbone followed by a GRU RNN, trained with CTC loss, and I want to use CTC beam search decoding at inference time.
I’ve successfully converted the model to .tflite and verified that I can load it and get the expected behaviour on my local machine.
However, it doesn’t work in my Android project; I get this error:
Could not build model from the provided pre-loaded flatbuffer: Unsupported custom op: FlexCTCBeamSearchDecoder, version: 1
The error is raised when I try to instantiate this class:
package com.stampfree.validation.tflite;

import android.app.Activity;
import java.io.IOException;

/** This TensorFlowLite classifier works with the float MobileNet model. */
public class DigicodeMobileNet extends Classifier {

    /**
     * Initializes a {@code DigicodeMobileNet}.
     *
     * @param activity the Activity used to load the model from assets
     * @param device a {@link Device} object to configure the hardware accelerator
     * @param numThreads the number of threads during the inference
     * @throws IOException if the model is not loaded correctly
     */
    public DigicodeMobileNet(Activity activity, Device device, int numThreads)
            throws IOException {
        super(activity, device, numThreads);
    }

    @Override
    protected String getModelPath() {
        return "tf_mobilenetv2_128x768_ctc_post_epoch08.tflite";
    }
}
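For context, the Interpreter itself is built in the Classifier base class (not shown). A stripped-down sketch of that loading path is below; the explicit FlexDelegate registration is my own addition for illustration (it assumes org.tensorflow.lite.flex.FlexDelegate from the select-tf-ops AAR is on the classpath), since my understanding is that merely including the select-tf-ops dependency should register the Flex ops automatically:

import android.app.Activity;
import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.flex.FlexDelegate;

public class InterpreterLoadingSketch {

    // Memory-map the .tflite file from the app's assets.
    private static MappedByteBuffer loadModelFile(Activity activity, String modelPath)
            throws IOException {
        AssetFileDescriptor fd = activity.getAssets().openFd(modelPath);
        try (FileInputStream inputStream = new FileInputStream(fd.getFileDescriptor());
                FileChannel fileChannel = inputStream.getChannel()) {
            return fileChannel.map(
                    FileChannel.MapMode.READ_ONLY, fd.getStartOffset(), fd.getDeclaredLength());
        }
    }

    // Build the Interpreter, explicitly registering the Flex delegate so that
    // select TF ops such as CTCBeamSearchDecoder can be resolved. FlexDelegate
    // is only available when the tensorflow-lite-select-tf-ops AAR is packaged.
    public static Interpreter buildInterpreter(Activity activity, String modelPath, int numThreads)
            throws IOException {
        Interpreter.Options options = new Interpreter.Options();
        options.setNumThreads(numThreads);
        options.addDelegate(new FlexDelegate());
        return new Interpreter(loadModelFile(activity, modelPath), options);
    }
}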
I have tried this using the default dependencies:
dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}
and by building the AAR in a Docker container.
Both approaches gave the same result.
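One sanity check I can still add (sketched below, assuming the standard org.tensorflow.lite.TensorFlowLite helper class) is logging which TFLite runtime actually ends up in the APK, to rule out a stale or mismatched dependency:

import android.util.Log;
import org.tensorflow.lite.TensorFlowLite;

public class RuntimeVersionCheck {
    private static final String TAG = "RuntimeVersionCheck";

    // Log the TFLite runtime and schema versions that are actually packaged,
    // to confirm the nightly AARs are the ones being loaded at runtime.
    public static void logRuntimeInfo() {
        Log.d(TAG, "TFLite runtime version: " + TensorFlowLite.runtimeVersion());
        Log.d(TAG, "TFLite schema version: " + TensorFlowLite.schemaVersion());
    }
}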
Any tips? Happy to share more context on request.