I have developed an LSTM model with TensorFlow and Keras.
The model is trained with the Ray framework on a distributed dataset.
I can save the model from the chief node.
But what about the Keras Tokenizer? It is created on every worker node, each fit on that worker's shard of the data.
If I save the tokenizer from the chief node only, I lose part of the vocabulary, and that breaks serving.
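To illustrate the problem, here is a minimal sketch (the shard texts are made up for illustration): two Tokenizer instances fit on different data shards end up with different vocabularies, so the chief's tokenizer does not know words that only appeared on a worker's shard.

```python
from tensorflow.keras.preprocessing.text import Tokenizer

# Hypothetical data shards, as each Ray worker would see them.
shard_chief = ["the cat sat on the mat"]
shard_worker = ["a dog ran in the park"]

# Each node builds its own tokenizer from its own shard.
tok_chief = Tokenizer()
tok_chief.fit_on_texts(shard_chief)

tok_worker = Tokenizer()
tok_worker.fit_on_texts(shard_worker)

# "dog" and "park" exist only in the worker's vocabulary.
# Saving only tok_chief would map them to out-of-vocabulary at serving time.
print(sorted(tok_chief.word_index))
print(sorted(tok_worker.word_index))
```

So the question is how to end up with one consistent tokenizer across all nodes.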
Please help.