Hi folks. When using mixed precision to do transfer learning with any TF Hub model, I run into the following error:
ValueError: Could not find matching function to call loaded from the SavedModel. Got:
  Positional arguments (2 total):
    * Tensor("x:0", shape=(None, 224, 224, 3), dtype=float16)
    * False
  Keyword arguments: {}

Expected these arguments to match one of the following 4 option(s):

Option 1:
  Positional arguments (2 total):
    * TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='input_1')
    * True
  Keyword arguments: {}

Option 2:
  Positional arguments (2 total):
    * TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='x')
    * False
  Keyword arguments: {}

Option 3:
  Positional arguments (2 total):
    * TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='input_1')
    * False
  Keyword arguments: {}

Option 4:
  Positional arguments (2 total):
    * TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name='x')
    * True
  Keyword arguments: {}
Is this a known issue? To reproduce it, take the official example, Transfer learning with TensorFlow Hub, and add the following lines right after the library imports:
from tensorflow.keras import mixed_precision
policy = mixed_precision.Policy('mixed_float16')
mixed_precision.set_global_policy(policy)
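For reference, here is a minimal sketch of the failing setup, assuming the MobileNetV2 feature-vector handle used in that tutorial (any tfhub.dev SavedModel should behave the same way):

```python
import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import mixed_precision

policy = mixed_precision.Policy('mixed_float16')
mixed_precision.set_global_policy(policy)

# Handle taken from the tutorial; treat it as illustrative.
feature_extractor_url = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4"

model = tf.keras.Sequential([
    hub.KerasLayer(feature_extractor_url, input_shape=(224, 224, 3)),
    tf.keras.layers.Dense(5),
])

# Under mixed_float16 the hub layer casts its input to float16, but the
# SavedModel's concrete functions were traced with tf.float32 inputs
# (the four options listed above), so this call raises the ValueError.
model(tf.zeros([1, 224, 224, 3]))
```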
The data type of the classifier's last layer should be float32, to prevent numeric instabilities in the outputs. Also, to benefit from mixed precision, your GPU's compute capability should be 7.0 or higher; a V100 is a good example.
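To illustrate both points, a minimal sketch (assuming TF 2.4+ for the device-details API; the 5-class head is a placeholder):

```python
import tensorflow as tf

# Check whether the GPU can actually benefit from mixed precision:
# compute capability should be 7.0 or higher, e.g. (7, 0) on a V100.
gpus = tf.config.list_physical_devices('GPU')
if gpus:
    details = tf.config.experimental.get_device_details(gpus[0])
    print(details.get('compute_capability'))

# Keep the classifier head in float32: a dtype passed to a layer's
# constructor overrides the global mixed_float16 policy for that layer,
# so the model's outputs stay numerically stable.
head = tf.keras.layers.Dense(5, activation='softmax', dtype='float32')
```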