Failed to execute inference in C++


I have created and trained a model to recognize cartoon characters. I'm using TensorFlow 2.18, and the model trained with the TF Python API is accurate enough. After training, the model is saved with

tensorflow.saved_model.save()

When I test the model with the Python API, everything works fine.

Now I’m trying to test the model with the C++ API.

  1. Loading the model: OK
  2. Converting the image to Tensor: OK
  3. Executing the inference: Error as follows:
2025-02-10 21:07:16.084677: I tensorflow/core/framework/local_rendezvous.cc:405] Local rendezvous is aborting with status: FAILED_PRECONDITION: Could not find variable sequential/conv2d_3/kernel. This could mean that the variable has been deleted. In TF1, it can also mean the variable is uninitialized. Debug info: container=localhost, status error message=Resource localhost/sequential/conv2d_3/kernel/N10tensorflow3VarE does not exist.

Note that the variable reported as missing may vary from one execution to another.

I have tried everything I could; the C++ API is not well documented.

For loading the model, I’m using the following code snippet:

// Attempt to load the model
m_status = tensorflow::LoadSavedModel(m_sessionOptions,
                                      m_runOptions,
                                      filePath,
                                      {tensorflow::kSavedModelTagServe},
                                      &m_bundle);

For executing the inference:

// Execute inference
tensorflow::Status status = m_bundle.GetSession()->Run(
    {{"serving_default_inputs:0", inputTensor}},  // feeds
    {"StatefulPartitionedCall:0"},                // fetch names
    {},                                           // target node names
    &outputs);
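When Run() fails like this, the first thing worth confirming from Python is the serving signature the SavedModel actually exposes. Here is a minimal sketch that saves a stand-in module and prints its signature specs; the Demo class and the /tmp path are illustrative, not the original model:

```python
# Sketch: inspect a SavedModel's serving signature from Python to confirm
# the input/output keys that the C++ feed/fetch names are derived from.
# The Demo module and the /tmp path are illustrative stand-ins.
import tensorflow as tf

class Demo(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([65, 8]))

    @tf.function(input_signature=[tf.TensorSpec([None, 120, 65], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

demo = Demo()
tf.saved_model.save(demo, "/tmp/demo_model",
                    signatures={"serving_default": demo.__call__})

# Reload and print the serving signature's argument and output specs.
sig = tf.saved_model.load("/tmp/demo_model").signatures["serving_default"]
print(sig.structured_input_signature)  # argument specs (feeds)
print(sig.structured_outputs)          # output specs (fetches)
```

The saved_model_cli tool (shipped with TensorFlow) prints the same information, including the exact SignatureDef tensor names (e.g. serving_default_inputs:0, StatefulPartitionedCall:0) that Session::Run must use: saved_model_cli show --dir <model_dir> --all.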

How can I resolve this issue?

I'm desperately looking for a rational explanation.

Cheers.

I was a lost soul like you until I found this guy.

This legend found that you have to export the model using Keras's ExportArchive instead:

import tensorflow as tf
from tensorflow.keras.export import ExportArchive

export_archive = ExportArchive()
export_archive.track(model)  # model is the trained Keras model
export_archive.add_endpoint(
    name="serving_default",
    fn=model.call,
    input_signature=[tf.TensorSpec(shape=(None, 120, 65), dtype=tf.float32)],
)
export_archive.write_out(f'model/{paramSave}')

Well, it worked - using tensorflow.keras.export.ExportArchive appears to be the magic incantation:

  • model.save() with a directory name throws a deprecation warning
  • tf.saved_model.save() writes a directory, but inference from the C++ API fails with the missing-variable error above
  • keras.export.ExportArchive writes a directory that actually works from C++
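For completeness, here is a minimal end-to-end sketch of the fix: export with ExportArchive, reload the way the C++ LoadSavedModel call would, and run the endpoint. The tiny Sequential model, the class count (4), and the /tmp path are illustrative stand-ins; only the (None, 120, 65) float32 input signature matches the thread:

```python
# Sketch of the ExportArchive fix, end to end. The model architecture,
# output size, and save path below are illustrative stand-ins.
import numpy as np
import tensorflow as tf
from tensorflow.keras.export import ExportArchive

# Tiny stand-in for the trained cartoon-character classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(120, 65)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4, activation="softmax"),
])

# Export exactly as in the answer above.
export_archive = ExportArchive()
export_archive.track(model)
export_archive.add_endpoint(
    name="serving_default",
    fn=model.call,
    input_signature=[tf.TensorSpec(shape=(None, 120, 65), dtype=tf.float32)],
)
export_archive.write_out("/tmp/cartoon_model")

# Reload the directory and call the exported endpoint.
reloaded = tf.saved_model.load("/tmp/cartoon_model")
probs = reloaded.serving_default(
    tf.constant(np.zeros((1, 120, 65), np.float32)))
print(probs.shape)  # one row of class probabilities: (1, 4)
```

The same directory can then be handed to tensorflow::LoadSavedModel on the C++ side.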