Hello, I am relatively new to deep learning and wanted to know if the number of layers of the pre-trained models in the TensorFlow model zoo can be reduced to improve inference time. If so how can I do this?
Hi @courtney_ann ,
Yes, you can reduce the number of layers in pre-trained models from the TensorFlow Model Zoo to improve inference time. As an example, I loaded the MobileNetV3Small pre-trained model.
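For context, this is roughly how the base model can be loaded (a minimal sketch; the explicit input shape and the ImageNet weights argument are assumptions on my part, not necessarily what the gist uses):

```python
import tensorflow as tf

# Load the full pre-trained MobileNetV3Small (classification head included)
base_model = tf.keras.applications.MobileNetV3Small(
    input_shape=(224, 224, 3),
    weights="imagenet",
)

# Inspect the full architecture and its parameter counts
base_model.summary()
```

Its summary reports the following parameter counts: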
Total params: 2,554,968 (9.75 MB)
Trainable params: 2,542,856 (9.70 MB)
Non-trainable params: 12,112 (47.31 KB)
I then took an intermediate layer's output from the existing architecture to create a new, reduced model (new_model). After creating the new model with fewer layers, its summary reports:
Total params: 7,908 (30.89 KB)
Trainable params: 7,524 (29.39 KB)
Non-trainable params: 384 (1.50 KB)
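For illustration, the truncation step can look like the sketch below. The layer index used as the cut point is arbitrary here (my assumption, not the exact layer from the gist); the 7,908-parameter result above depends on which intermediate layer you choose.

```python
# Choose an intermediate layer as the cut point. The index is only
# illustrative; pick the layer that gives the speed/accuracy trade-off
# you need (inspect base_model.summary() to find candidate layers).
cut_layer = base_model.layers[20]

# Build a new, shallower model that reuses the pre-trained weights up to
# the chosen layer and discards everything after it.
new_model = tf.keras.Model(
    inputs=base_model.input,
    outputs=cut_layer.output,
    name="mobilenetv3small_truncated",
)

new_model.summary()  # far fewer parameters than the full model
```

Keep in mind that cutting the network at an intermediate layer removes the original classification head, so you will usually need to add and fine-tune a small head of your own for your task, and the accuracy/latency trade-off should be verified on your data.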
I have attached the reference gist for the same.
Thanks.