Whenever I try to convert a trained checkpoint to a SavedModel using ‘exporter_main_v2.py’, I get a warning that says: WARNING:tensorflow:Skipping full serialization of Keras layer.
This is causing trouble for later operations that I want to perform with the trained model.
Any thoughts on why this might be happening? Are there any specific arguments that I need to pass to this script for it to convert properly?
It’s impossible to answer this without more details.
Can you figure out which layer it’s complaining about? It might be a custom layer missing the .get_config/.from_config methods.
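To illustrate what that usually looks like: this is a minimal sketch of a hypothetical custom layer (ScaleLayer is an invented name, not from your model) that implements get_config/from_config so Keras can serialize its constructor arguments. A custom layer that omits get_config is a common cause of the "Skipping full serialization" warning.

```python
import tensorflow as tf

# Hypothetical custom layer for illustration. Without get_config,
# Keras cannot record the constructor arguments ("scale" here),
# so full serialization of the layer is skipped on export.
class ScaleLayer(tf.keras.layers.Layer):
    def __init__(self, scale=2.0, **kwargs):
        super().__init__(**kwargs)
        self.scale = scale

    def call(self, inputs):
        return inputs * self.scale

    def get_config(self):
        # Include the parent config plus our own constructor args.
        config = super().get_config()
        config.update({"scale": self.scale})
        return config

# Round-trip check: rebuild the layer from its own config.
layer = ScaleLayer(scale=3.0)
restored = ScaleLayer.from_config(layer.get_config())
print(restored.scale)  # 3.0
```

If the offending layer is one of yours, adding a get_config like this (from_config is inherited and usually doesn't need overriding) should make the warning go away for that layer.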
If your model can’t be loaded as a Keras model (tf.keras.models.load_model), it can probably still be loaded as a SavedModel with tf.saved_model.load. That doesn’t give you access to the layers, but it will let you run inference with the reloaded model, if that’s what you’re trying to do.
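As a minimal sketch of that second path (the module and path names here are invented for illustration): tf.saved_model.load returns a generic object exposing the saved tf.functions/signatures rather than Keras layers, which is why it works even when Keras-level deserialization fails.

```python
import tensorflow as tf

# Toy module standing in for an exported model.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

# Save, then reload without going through Keras at all.
tf.saved_model.save(Doubler(), "/tmp/doubler_savedmodel")
reloaded = tf.saved_model.load("/tmp/doubler_savedmodel")

# The reloaded object is not a Keras model: no .layers, no .summary(),
# but its saved concrete functions are callable for inference.
out = reloaded(tf.constant([1.0, 2.0]))
print(out.numpy())  # [2. 4.]
```

For a model exported by exporter_main_v2.py you would call the appropriate signature (e.g. via reloaded.signatures) instead of __call__, but the loading step is the same.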
The command I am using is the following:
python ~/models/research/object_detection/exporter_main_v2.py --input_type image_tensor --pipeline_config_path pipeline.config --trained_checkpoint_dir ./checkpoint/ --output_directory exported_model/
The attached screenshot shows the warning message that I get:
Thank you for the reference. I am trying to find a solution through the link you sent, but there seem to be no responses yet. Am I missing anything?
I have previously looked at this, but it is a different error from the warning I am dealing with. I am able to successfully export a SavedModel from a trained checkpoint using ‘exporter_main_v2.py’; the trouble I’m facing is working with the exported SavedModel after that. I need to convert it to another format using a command that invokes ‘meta_optimizer.cc’, which then returns an empty graph with 0 nodes and 0 edges after ‘optimizing’ it. I have posted a question about that specifically here: https://tensorflow-prod.ospodiscourse.com/t/function-optimizer-py-returns-empty-graph/2724