Quick question: why do certain softmax input shapes fail to convert to TensorFlow Lite? For example, (10, 26, 26) converts fine but (10, 26, 26, 1) does not. I am getting the following error:
```
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: https://www.tensorflow.org/lite/guide/ops_select
TF Select ops: Softmax
Details:
    tf.Softmax(tensor<?x10x26x26x1xf32>) -> (tensor<?x10x26x26x1xf32>) : {device = ""}
```
I am trying to figure out how to solve this while staying on the native TFLite runtime, i.e. without falling back to TF Select ops.
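For reference, here is a minimal sketch of roughly what I'm doing; the layer setup and converter calls below are a simplified stand-in for my actual model, not the exact code:

```python
import tensorflow as tf

# Hypothetical minimal reproduction: a softmax whose input has a trailing
# dimension of 1, which becomes a 5-D tensor (?, 10, 26, 26, 1) once the
# batch dimension is added.
inputs = tf.keras.Input(shape=(10, 26, 26, 1))   # this shape fails to convert
# inputs = tf.keras.Input(shape=(10, 26, 26))    # this shape converts fine
outputs = tf.keras.layers.Softmax()(inputs)
model = tf.keras.Model(inputs, outputs)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# No converter.target_spec.supported_ops override here, since I want to
# stay on the builtin TFLite ops rather than enabling TF Select.
tflite_model = converter.convert()  # fails with the "TF Select ops: Softmax" error above
```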