Thanks in advance for any guidance on this issue, and apologies if I am missing something in the docs. Despite attempting related solutions (e.g., #1108, #1885, #1906), I have failed to create two export signature defs: one that allows raw-text predictions via AI Platform's prediction service, and one that allows the ExampleGen component's example output (`example_gen.outputs['examples']`) to be used for model evaluation in the Evaluator component.
For reference, I am following the taxi example closely; the main difference is that I pull my own data from BigQuery via `BigQueryExampleGen`.
Dependencies:
- tfx[kfp]==0.30.0
- python 3.7
My latest attempt follows this solution, which adds a separate export signature for raw data via a `MyModule(tf.Module)` class. Like the author, @jason-brian-anderson, I avoided `tf.reshape` because our `preprocessing_fn` uses `_fill_in_missing`, which expects and parses SparseTensors. Below is the code, embedded within the scope of `run_fn`.
```python
class MyModule(tf.Module):

    def __init__(self, model, tf_transform_output):
        self.model = model
        self.tf_transform_output = tf_transform_output
        self.model.tft_layer = self.tf_transform_output.transform_features_layer()

    @tf.function(input_signature=[
        tf.TensorSpec(shape=[None], dtype=tf.string, name='examples')])
    def serve_tf_examples_fn(self, serialized_tf_examples):
        """Serving signature for serialized tf.Example protos."""
        feature_spec = self.tf_transform_output.raw_feature_spec()
        feature_spec.pop(features.LABEL_KEY)
        parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
        transformed_features = self.model.tft_layer(parsed_features)
        return self.model(transformed_features)

    @tf.function(input_signature=[
        tf.TensorSpec(shape=(None), dtype=tf.string, name='raw_data')])
    def tf_serving_raw_input_fn(self, raw_data):
        """Serving signature intended for raw (untransformed) text input."""
        raw_data_sp_tensor = tf.sparse.SparseTensor(
            indices=[[0, 0]],
            values=raw_data,
            dense_shape=(1, 1),
        )
        parsed_features = {'raw_data': raw_data_sp_tensor}
        transformed_features = self.model.tft_layer(parsed_features)
        return self.model(transformed_features)


module = MyModule(model, tf_transform_output)
signatures = {
    'serving_default': module.serve_tf_examples_fn,
    'serving_raw_input': module.tf_serving_raw_input_fn,
}
tf.saved_model.save(
    module,
    export_dir=fn_args.serving_model_dir,
    signatures=signatures,
    options=None,
)
```
The Keras model expects 3 inputs: one DENSE_FLOAT_FEATURE_KEY and two VOCAB_FEATURE_KEYS.
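One thing I noticed while debugging: the hard-coded `indices=[[0, 0]]` and `dense_shape=(1, 1)` in `tf_serving_raw_input_fn` only describe a batch of exactly one example. A batch-size-agnostic sketch of wrapping a rank-1 string tensor into the `[batch, 1]` SparseTensor layout that `_fill_in_missing`-style preprocessing expects might look like this (untested against the actual pipeline; `to_sparse_column` is a hypothetical helper name):

```python
import tensorflow as tf

def to_sparse_column(raw_values):
    """Wrap a [batch] string tensor as a [batch, 1] SparseTensor,
    the rank-2 layout that _fill_in_missing-style preprocessing expects."""
    n = tf.cast(tf.size(raw_values), tf.int64)
    rows = tf.range(n, dtype=tf.int64)          # one row index per example
    cols = tf.zeros([n], dtype=tf.int64)        # single column per row
    return tf.sparse.SparseTensor(
        indices=tf.stack([rows, cols], axis=1),
        values=raw_values,
        dense_shape=tf.stack([n, tf.constant(1, tf.int64)]),
    )

sp = to_sparse_column(tf.constant(["a", "b", "c"]))
dense = tf.sparse.to_dense(sp)  # dense shape (3, 1)
```

This would not fix a feature-name mismatch on its own, but it avoids silently truncating or mis-shaping batches larger than one.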
The error I am currently hitting is `can only concatenate str (not "SparseTensor") to str`, raised at `parsed_features = {'raw_data': raw_data_sp_tensor}`.
I also attempted to manually create the feature spec (re: test by copybara-service[bot] · Pull Request #1906 · tensorflow/tfx · GitHub), but I ran into naming-convention issues and was unable to expose `schema2tensorspec` to ensure the expected input names/dtypes matched.
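In case it clarifies what I was trying to do there: a rough sketch of deriving per-input TensorSpecs from a parse-example feature spec dict. The `raw_feature_spec` dict below is hypothetical (in the real pipeline it would come from `tf_transform_output.raw_feature_spec()`), and `specs_from_feature_spec` is just an illustrative helper, not a TFX API:

```python
import tensorflow as tf

# Hypothetical stand-in for tf_transform_output.raw_feature_spec();
# the real one is derived from the schema.
raw_feature_spec = {
    "dense_feature": tf.io.FixedLenFeature([], tf.float32),
    "vocab_feature_a": tf.io.VarLenFeature(tf.string),
    "vocab_feature_b": tf.io.VarLenFeature(tf.string),
}

def specs_from_feature_spec(feature_spec):
    """Map a parse-example feature spec to per-input (Sparse)TensorSpecs."""
    specs = {}
    for name, spec in feature_spec.items():
        if isinstance(spec, tf.io.FixedLenFeature):
            # Prepend a batch dimension to the per-example shape.
            specs[name] = tf.TensorSpec(
                shape=[None] + list(spec.shape), dtype=spec.dtype, name=name)
        elif isinstance(spec, tf.io.VarLenFeature):
            # VarLen features arrive as SparseTensors after parsing.
            specs[name] = tf.SparseTensorSpec(shape=[None, None], dtype=spec.dtype)
        else:
            raise TypeError(f"Unhandled spec type for {name}: {type(spec)}")
    return specs

specs = specs_from_feature_spec(raw_feature_spec)
```

The idea was to make the raw-input signature's names and dtypes line up exactly with what `tft_layer` expects, which is where I got stuck.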
Any and all help is welcome.