I’m looking for a working Python sample that, given an image dataset with one folder per class, trains a custom model using transfer learning on EfficientNetV2 XL downloaded from TF Hub and loaded with

```
model = tf.saved_model.load(path)
```

Thanks.

Here you go: Retraining an Image Classifier | TensorFlow Hub

I think this can give you a very good start.

Thanks for the answer, but the problem is that the `model` obtained from the TF Hub downloaded archive with

`model = tf.saved_model.load(extraction_path)`

is a different object than the `model` obtained with

```
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=IMAGE_SIZE + (3,)),
    hub.KerasLayer(model_handle, trainable=do_fine_tuning),
    tf.keras.layers.Dropout(rate=0.2),
    tf.keras.layers.Dense(len(class_names),
                          kernel_regularizer=tf.keras.regularizers.l2(0.0001))
])
model.build((None,) + IMAGE_SIZE + (3,))
```

as in the example from the link (and other examples I have seen),

and I cannot call methods such as `model.fit` or `model.predict` on `model = tf.saved_model.load(extraction_path)`.

How should I proceed if I want to use the saved models?

Sorry, it’s not clear what you want to do.

A model from TF Hub should ideally be used in one of two ways:

1. `hub.load(model_path)`
2. `hub.KerasLayer(model_path)`

For 1, you will get a model that can do inference directly (hub.load | TensorFlow Hub). This is equivalent to `tf.saved_model.load`, and the result is a low-level TensorFlow 2 module. It is not a Keras object, so it doesn’t have the `fit` and `predict` methods.
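
Even without the Keras API, that low-level module can still be called through its saved signatures. A minimal self-contained illustration, where a toy `tf.Module` stands in for the downloaded TF Hub archive (the class, path, and signature output name are my own choices for the sketch):

```python
import tensorflow as tf

# Toy stand-in for a TF Hub archive: a tf.Module saved with an explicit
# serving signature, then reloaded with tf.saved_model.load.
class Toy(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 3], tf.float32)])
    def __call__(self, x):
        return {"logits": 2.0 * x}

path = "/tmp/toy_saved_model"
m = Toy()
tf.saved_model.save(m, path,
                    signatures=m.__call__.get_concrete_function())

loaded = tf.saved_model.load(path)
# `loaded` has no .fit or .predict, but its signature is callable:
infer = loaded.signatures["serving_default"]
out = infer(tf.constant([[1.0, 2.0, 3.0]]))
# out["logits"] -> [[2., 4., 6.]]
```

Because `__call__` was a `tf.function` with an input signature, `loaded(tf.constant(...))` also works directly; that is how you would run inference on a hub archive loaded this way.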

For 2, you will get a layer that can be used to compose a model (hub.KerasLayer | TensorFlow Hub). You can still use it for inference and it will work, but it’s not optimal. The returned object is wrapped so that it can be used as a Keras layer.

So 1 and 2 are the same model, just presented in different ways.

For what you want to do, you can follow the colab I shared: fine-tune your model, save it, and then load it any way you want and just run inference on it.
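
The save-then-reload step looks roughly like this. A tiny stand-in network replaces the fine-tuned classifier here so the sketch is self-contained; the path and layer sizes are placeholders (on older TF versions without the `.keras` format, saving to a SavedModel directory works similarly):

```python
import tensorflow as tf

# Tiny stand-in for the fine-tuned classifier; any Keras model
# saves and reloads the same way.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(4,)),
    tf.keras.layers.Dense(2),
])

# Save in the native Keras format to keep the full Keras API on reload.
model.save("/tmp/finetuned_classifier.keras")
reloaded = tf.keras.models.load_model("/tmp/finetuned_classifier.keras")

# The reloaded object is a Keras model again, so predict works:
preds = reloaded.predict(tf.zeros((1, 4)), verbose=0)
print(preds.shape)  # (1, 2)
```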

Does it make sense?

In both cases, what you have is the same underlying SavedModel, just exposed through a different interface.