I am following the blog post below to run TFX’s RunInference API on images:
While I was able to export the model so that it can directly consume TFRecord strings, I am unable to parse the Beam outputs, which look like this:
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:tensorflow:Restoring parameters from tfx-inference-exported/model/image-demo/v1/variables/variables
INFO:tensorflow:Restoring parameters from tfx-inference-exported/model/image-demo/v1/variables/variables
predict_log {
  request {
    model_spec {
      signature_name: "serving_default"
    }
    inputs {
      key: "bytes_inputs"
      value {
        dtype: DT_STRING
        tensor_shape {
          dim {
            size: 1
          }
        }
        string_val: "..."
      }
    }
  }
  response {
    outputs {
      key: "probabilities"
      value {
        dtype: DT_FLOAT
        tensor_shape {
          dim {
            size: 1
          }
          dim {
            size: 10
          }
        }
        tensor_content: "\000\000\000\000\000\000\200?\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000\000"
      }
    }
    model_spec {
      signature_name: "serving_default"
    }
  }
}
(A second predict_log record with identical contents follows.)
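For context, this is roughly how I am invoking RunInference (a sketch reconstructed from my notebook; tfrecord_path is a placeholder for my actual TFRecord file):

import apache_beam as beam
import tensorflow as tf
from tfx_bsl.public.beam import RunInference
from tfx_bsl.public.proto import model_spec_pb2

with beam.Pipeline() as pipeline:
    _ = (
        pipeline
        # Read serialized tf.train.Examples from the TFRecord file.
        | "ReadTFRecords" >> beam.io.ReadFromTFRecord(tfrecord_path)
        | "ToExamples" >> beam.Map(tf.train.Example.FromString)
        # Run the exported SavedModel over each example.
        | "RunInference" >> RunInference(
            model_spec_pb2.InferenceSpecType(
                saved_model_spec=model_spec_pb2.SavedModelSpec(
                    model_path=save_model_dir_image_new)))
        | "Print" >> beam.Map(print))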
With the help of the blog post, I am able to get as far as the tensor_content key:
import apache_beam as beam
from tensorflow_serving.apis import prediction_log_pb2

class PredictionProcessor(beam.DoFn):
    def process(self, element: prediction_log_pb2.PredictionLog):
        predict_log = element.predict_log
        output_value = predict_log.response.outputs
        # Earlier attempts:
        # yield tf.train.Example.FromString(output_value['output_0'].tensor_content)
        # yield f"output is {output_value['output_0'].tensor_content[0]}"
        yield f"output is {str(output_value['probabilities'].tensor_content, 'utf-8')}"
But this does not decode the content in the expected manner.
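Since the probabilities tensor is DT_FLOAT with shape [1, 10], I suspect tensor_content holds the tensor's raw little-endian float32 bytes rather than UTF-8 text, so something along these lines (untested) should recover the values:

import numpy as np
import tensorflow as tf

probs_proto = output_value["probabilities"]
# Convert the TensorProto back to a numpy array of shape (1, 10).
probs = tf.make_ndarray(probs_proto)
# Equivalently, reinterpret the raw bytes directly:
probs = np.frombuffer(probs_proto.tensor_content, dtype=np.float32).reshape(1, 10)

This is the underlying model: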
def serialize_model():
    image_inputs = tf.keras.Input((224, 224, 3), name="image")
    x = tf.keras.layers.Conv2D(32, 9)(image_inputs)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(10, activation="softmax", dtype="float32")(x)
    image_model = tf.keras.Model(image_inputs, outputs)
    tf.keras.models.save_model(image_model, save_model_dir_image)
    return image_model
And this is how I have exported it (following this book):
_CONCRETE_INPUT = "numpy_inputs"

def _preprocess(bytes_input):
    # Decode and resize a single encoded image.
    decoded = tf.io.decode_jpeg(bytes_input, channels=3)
    resized = tf.image.resize(decoded, size=(224, 224))
    return resized

@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def preprocess_fn(bytes_inputs):
    decoded_images = tf.map_fn(
        _preprocess, bytes_inputs, dtype=tf.float32, back_prop=False
    )
    return {_CONCRETE_INPUT: decoded_images}

def _model_exporter(model: tf.keras.Model):
    m_call = tf.function(model.call).get_concrete_function(
        [
            tf.TensorSpec(
                shape=[None, 224, 224, 3], dtype=tf.float32, name=_CONCRETE_INPUT
            )
        ]
    )

    @tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
    def serving_fn(bytes_inputs):
        # image_feature_description (the feature spec for the TFRecords)
        # is defined in the notebook.
        features = tf.io.parse_example(bytes_inputs, image_feature_description)
        images = preprocess_fn(features["image_raw"])
        probs = m_call(**images)
        return {"probabilities": probs}

    return serving_fn

save_model_dir_image_new = 'tfx-inference-exported/model/image-demo/v1/'
tf.keras.models.save_model(image_model,
                           save_model_dir_image_new,
                           signatures={"serving_default": _model_exporter(image_model)})
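As a quick sanity check of the export (a sketch; serialized_example stands in for one serialized tf.train.Example string), the signature can be loaded back and called directly:

import tensorflow as tf

loaded = tf.saved_model.load(save_model_dir_image_new)
infer = loaded.signatures["serving_default"]
outputs = infer(bytes_inputs=tf.constant([serialized_example]))
print(outputs["probabilities"].shape)  # expected: (1, 10)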
Here’s the Colab notebook for full details: Google Colab.
The above-mentioned blog post exports the model in a simpler manner. However, I couldn’t follow that process, since the images need to be decoded properly from the TFRecord strings. The following serving function does not work:
@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def serving_fn(bytes_inputs):
    features = tf.io.parse_example(bytes_inputs, image_feature_description)
    image = _preprocess(features["image_raw"])
    features.update({"image": image})
    return model(features, training=False)
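I suspect it fails because _preprocess calls tf.io.decode_jpeg, which operates on a single encoded image, while features["image_raw"] is a batch of strings. Something like this untested sketch is the kind of simplification I have in mind:

@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def serving_fn(bytes_inputs):
    features = tf.io.parse_example(bytes_inputs, image_feature_description)
    # Map the per-image _preprocess over the batch of encoded strings.
    images = tf.map_fn(_preprocess, features["image_raw"], dtype=tf.float32)
    return {"probabilities": model(images, training=False)}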
Is it possible to simplify the model export method I am following and align it with what that example shows?
Cc'ing @Robert_Crowe @zhitaoli @martin_gorner_tf