I have been consistently running the BERT Neuspell tokenizer graph as a SavedModelBundle using TensorFlow core platform 0.4.1 in a Scala app. For some bizarre reason, in the last day or so, without making any change to the code that generates the tensor output, I keep getting the error below:
org.tensorflow.exceptions.TFInvalidArgumentException: ConcatOp : Ranks of all input tensors should match: shape[0] = [1,1] vs. shape[1] = [3]
[[{{function_node __inference_serve_549812}}{{node RaggedConcat/concat}}]]
Interestingly, this looks like one of the internal nodes. I am just passing a single input as a TString to the input function 'serving_default_text', and then fetching the outputs from 'StatefulPartitionedCall_2'. Here is the signature def dump of this graph:
{serving_default=inputs {
key: "text"
value {
name: "serving_default_text:0"
dtype: DT_STRING
tensor_shape {
unknown_rank: true
}
}
}
outputs {
key: "output_0"
value {
name: "StatefulPartitionedCall_2:0"
dtype: DT_INT64
tensor_shape {
dim {
size: 1
}
dim {
size: -1
}
}
}
}
outputs {
key: "output_1"
value {
name: "StatefulPartitionedCall_2:1"
dtype: DT_INT64
tensor_shape {
dim {
size: 1
}
dim {
size: -1
}
}
}
}
outputs {
key: "output_2"
value {
name: "StatefulPartitionedCall_2:2"
dtype: DT_INT64
tensor_shape {
dim {
size: -1
}
dim {
size: -1
}
}
}
}
method_name: "tensorflow/serving/predict"
, __saved_model_init_op=outputs {
key: "__saved_model_init_op"
value {
name: "NoOp"
tensor_shape {
unknown_rank: true
}
}
}
I load the graph from a resource directory containing saved_model.pb and the variables folder. I can step through the debugger and see that the graph is loaded, and I can also print out all the signature defs, but when I invoke the session as below, I keep getting the error above from an intermediate node operation.
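The load itself is the standard SavedModelBundle.load call; the path below is just a placeholder for wherever the saved_model.pb and variables/ directory are extracted to:

```scala
import org.tensorflow.SavedModelBundle

object LoadTokenizer {
  // Loads the export directory (containing saved_model.pb and variables/)
  // with the usual "serve" tag. The path is a placeholder.
  def load(exportDir: String): SavedModelBundle =
    SavedModelBundle.load(exportDir, "serve")
}
```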
The invocation goes through the SavedModelBundle instance:
a) create a session runner: val tokenizerSession = savedModelBundle.session().runner()
b) then invoke .feed once and .fetch (multiple times) to pull all outputs.
val testQuery = "psycologist" // misspelled on purpose
val input = TString.tensorOfBytes(NdArrays.vectorOfObjects(testQuery.getBytes(StandardCharsets.UTF_8)))
val tensors = tokenizerSession
  .feed("serving_default_text", input)
  .fetch("StatefulPartitionedCall_2", 0)
  .fetch("StatefulPartitionedCall_2", 1)
  .fetch("StatefulPartitionedCall_2", 2)
  .run()
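Since the signature declares the input with unknown_rank, the rank of the tensor actually fed may matter here: the tensor built above is rank 1 (shape [1]), whereas TString.scalarOf produces a rank-0 scalar. A minimal sketch contrasting the two (assuming TF Java 0.4.x on the classpath; object and value names are mine):

```scala
import java.nio.charset.StandardCharsets
import org.tensorflow.ndarray.NdArrays
import org.tensorflow.types.TString

object RankCheck {
  def main(args: Array[String]): Unit = {
    val testQuery = "psycologist"
    // Rank-1 tensor holding one byte string, i.e. shape [1]
    val vectorInput = TString.tensorOfBytes(
      NdArrays.vectorOfObjects(testQuery.getBytes(StandardCharsets.UTF_8)))
    // Rank-0 (scalar) string tensor, i.e. shape []
    val scalarInput = TString.scalarOf(testQuery)
    println(s"vector rank: ${vectorInput.shape().numDimensions()}") // 1
    println(s"scalar rank: ${scalarInput.shape().numDimensions()}") // 0
  }
}
```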
This command generates the following stack trace:
Exception in thread "zio-fiber-65" org.tensorflow.exceptions.TFInvalidArgumentException: ConcatOp : Ranks of all input tensors should match: shape[0] = [1,1] vs. shape[1] = [3]
[[{{function_node __inference_serve_549812}}{{node RaggedConcat/concat}}]]
at org.tensorflow.internal.c_api.AbstractTF_Status.throwExceptionIfNotOK(AbstractTF_Status.java:87)
at org.tensorflow.Session.run(Session.java:850)
at org.tensorflow.Session.access$300(Session.java:82)
at org.tensorflow.Session$Runner.runHelper(Session.java:552)
at org.tensorflow.Session$Runner.runNoInit(Session.java:499)
at org.tensorflow.Session$Runner.run(Session.java:495)