Run a custom model on the Coral Micro device

We followed the guidelines described on the page linked below to carry out transfer learning with the YAMNet model.

We completed the task successfully and our model achieved excellent accuracy.

After that, we converted the TF model to TFLite and successfully tested it on the computer. We also compiled it for the Edge TPU without any problems.
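For reference, the conversion step can be sketched roughly as below. This is a minimal, hedged example: the `saved_model` directory name and the 15600-sample input shape (about 0.975 s of 16 kHz mono audio, YAMNet-style) are assumptions, and the calibration data here is random where real audio samples should be used.

```python
import numpy as np
import tensorflow as tf

def representative_dataset():
    """Calibration samples for full-integer quantization.

    Random data is used here only as a placeholder; feed real audio
    waveforms from your training set for accurate quantization ranges.
    """
    for _ in range(100):
        # YAMNet-style input shape (1, 15600) is an assumption.
        yield [np.random.uniform(-1.0, 1.0, (1, 15600)).astype(np.float32)]

def convert_full_int8(saved_model_dir="saved_model"):
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # Restrict to int8 builtins so the Edge TPU compiler can map the ops.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()
```

The resulting bytes can be written to a `.tflite` file and passed to the Edge TPU compiler as usual.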

The problem occurs when we try to deploy the model on the Coral Micro device; we couldn't get it to work. The Arduino IDE compiles the sketch without problems, but at runtime it starts complaining about several missing OPCODEs. We kept adding these OPCODEs to the "resolver" definition until we reached one that does not exist in the Coral C++ libraries.

How do we resolve this?

The direct question would be: how can we run the newly trained model (built with transfer learning from the link above) on the Coral Micro?

Hi @Frederico_Coelho,

Sorry for the delayed response. First, check for unsupported ops in your model by running `tensorflow/lite/tools/visualize.py --input_model model.tflite`. If any op is unsupported, implement it as a custom op. Also make sure you used strict int8 (full-integer) quantization only, since that is what the Coral Edge TPU requires.
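If running `visualize.py` is inconvenient, another way to list every operator the model contains (and therefore every OPCODE the resolver must register) is TensorFlow's built-in model analyzer. A small sketch, assuming TF 2.7 or newer and that `model.tflite` is your converted file:

```python
import tensorflow as tf

def list_model_ops(path="model.tflite"):
    """Print a per-operator breakdown of a .tflite file.

    Serves a similar purpose to visualize.py; the "model.tflite"
    filename is an assumption, substitute your own file.
    """
    tf.lite.experimental.Analyzer.analyze(model_path=path)

# Usage: list_model_ops("model.tflite")
```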

Thank You