Exporter_main_v2.py 'skipping full serialization of Keras layer' error

Hello,

Whenever I try to convert a trained checkpoint to a SavedModel using ‘exporter_main_v2.py’, I get a warning that says: WARNING:tensorflow:Skipping full serialization of Keras layer

This is causing trouble for later operations that I want to perform with the trained model.

Any thoughts on why this might be happening? Are there any particular arguments that I need to pass to the script for it to convert properly?

Thanks.

It’s impossible to answer this without more details.

Can you figure out which layer it’s complaining about? It might be a custom layer missing the .get_config/.from_config methods.
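For illustration, this is roughly the pattern a custom layer needs so Keras can serialize it fully (ScaledDense is a made-up example, not something from your model):

import tensorflow as tf

# Illustrative custom layer: without get_config/from_config, Keras has to
# skip full serialization of it when saving.
class ScaledDense(tf.keras.layers.Dense):
    def __init__(self, units, scale=1.0, **kwargs):
        super().__init__(units, **kwargs)
        self.scale = scale

    def call(self, inputs):
        return super().call(inputs) * self.scale

    def get_config(self):
        # Include every constructor argument so the layer can be rebuilt.
        config = super().get_config()
        config.update({"scale": self.scale})
        return config

    @classmethod
    def from_config(cls, config):
        return cls(**config)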

If your model can’t be loaded as a Keras model (tf.keras.models.load_model), it can probably still be loaded as a SavedModel with tf.saved_model.load. That doesn’t give you access to the layers, but it will let you use the reloaded model, if that’s what you’re trying to do.
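A minimal sketch of what I mean (the path is a placeholder for your export directory, and I’m assuming the usual serving_default signature is present):

import tensorflow as tf

# Load the export directory as a plain SavedModel rather than a Keras model.
reloaded = tf.saved_model.load("exported_model/saved_model")

# No .layers here, but the serving signature is still callable.
infer = reloaded.signatures["serving_default"]
print(infer.structured_input_signature)  # expected input tensors
print(infer.structured_outputs)          # output tensor names and shapes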

Hello,

I am trying to export the trained checkpoint of the SSD MobileNet V2 320x320 model from the TF2 Detection Model Zoo: http://download.tensorflow.org/models/object_detection/tf2/20200711/ssd_mobilenet_v2_320x320_coco17_tpu-8.tar.gz.

The command I am using is the following:
python ~/models/research/object_detection/exporter_main_v2.py --input_type image_tensor --pipeline_config_path pipeline.config --trained_checkpoint_dir ./checkpoint/ --output_directory exported_model/

The following attached screenshot shows the warning message that I get:

I am using tf-nightly.

https://github.com/tensorflow/models/issues/9940

Thank you for the reference. I am trying to find a solution through the link you sent, but there seem to be no responses yet. Am I missing anything?

It is just a reference to the same topic on the GitHub repository.

Have you tried with something like:

https://github.com/tensorflow/models/issues/8841#issuecomment-657647648


I have previously looked at this, but it is a different error from the warning I am working with. I am able to successfully export a SavedModel from a trained checkpoint using ‘exporter_main_v2.py’; the trouble I’m facing is working with the exported SavedModel after that. I need to convert the exported SavedModel to another format using a command that invokes meta_optimizer.cc, which then returns an empty graph with 0 nodes and 0 edges after ‘optimizing’ it. I have posted a question about that specifically here: https://tensorflow-prod.ospodiscourse.com/t/function-optimizer-py-returns-empty-graph/2724

In fact, I am also getting this error. Because of it, I am unable to export my model to TFLite.
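For reference, this is roughly the standard TF2 conversion step I would try once a TFLite-friendly SavedModel is available (paths are placeholders; as far as I know, Object Detection API models are normally exported for this with export_tflite_graph_tf2.py rather than exporter_main_v2.py):

import tensorflow as tf

# Convert a SavedModel directory to a .tflite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model("exported_tflite_model/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional post-training optimization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)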