Recently, I've been trying to use BERT from TF Hub for some contrastive learning. As you know, the dropout rate is an important parameter in contrastive learning (for NLP tasks), but I can't find any way to set the dropout rate in the TF Hub API. Is there a way to set dropout in TF Hub?
I don't think you can change the dropout rate as a parameter on the models from the Hub.
What you could do instead is build the model yourself (e.g., following the Classify text with BERT tutorial on tensorflow.org) based on the BERT encoders on TF Hub and define the dropout rate you want. That way you have full control over the parameter.
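Here's a minimal sketch of that pattern, assuming the preprocessing and encoder handles from that tutorial (a Small BERT encoder; substitute whichever encoder you're actually using). The dropout here sits on top of the encoder's pooled output, so its rate is an ordinary hyperparameter you choose:

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers ops the BERT preprocessing model needs

# Handles taken from the "Classify text with BERT" tutorial; swap in your own.
PREPROCESS_HANDLE = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_HANDLE = "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/2"

def build_encoder(dropout_rate: float = 0.1) -> tf.keras.Model:
    """BERT sentence encoder with a dropout rate you fully control."""
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    preprocessed = hub.KerasLayer(PREPROCESS_HANDLE, name="preprocessing")(text_input)
    outputs = hub.KerasLayer(ENCODER_HANDLE, trainable=True, name="BERT_encoder")(preprocessed)
    # Dropout applied to the pooled sentence embedding; set whatever rate
    # your contrastive objective calls for.
    embeddings = tf.keras.layers.Dropout(dropout_rate)(outputs["pooled_output"])
    return tf.keras.Model(text_input, embeddings)

# One common contrastive pattern (SimCSE-style): run the same batch through
# the model twice with training=True, so each pass samples a different
# dropout mask, and treat the two outputs as a positive pair.
model = build_encoder(dropout_rate=0.3)
texts = tf.constant(["contrastive learning with BERT"])
z1 = model(texts, training=True)
z2 = model(texts, training=True)
```

Note that this controls the dropout you add on top of the encoder; the dropout baked inside the Hub SavedModel itself stays at its exported rate.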
Does that work for you?