Port numbers to use in distributed training?

I’m reading the documentation for distributed training in TensorFlow.

Suppose I have three machines, and start two workers on each machine. What port number can I give each worker in TF_CONFIG? Can I choose any arbitrary port number that isn’t already taken by some standard program and the worker will use that port?

Hi @Cheerful_Squirrel, as far as I know, yes, you can choose any arbitrary port number that isn’t already in use. Just make sure the port given to each worker is unique on its host and doesn’t conflict with other services running on that machine. Thank you.
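For illustration, here is a minimal sketch of a TF_CONFIG for the three-machine, two-workers-per-machine setup you describe. The hostnames (`machine-a`, `machine-b`, `machine-c`) and ports (12345, 12346) are placeholders, not anything prescribed by TensorFlow; substitute your own hosts and any free ports:

```python
import json
import os

# Placeholder hostnames and arbitrary unused ports; replace with your own.
# Two workers on the same machine need different ports, but workers on
# different machines can reuse the same port numbers.
cluster = {
    "worker": [
        "machine-a:12345", "machine-a:12346",  # two workers on machine A
        "machine-b:12345", "machine-b:12346",  # two workers on machine B
        "machine-c:12345", "machine-c:12346",  # two workers on machine C
    ]
}

# Each worker process sets "index" to its own position in the worker list above.
os.environ["TF_CONFIG"] = json.dumps({
    "cluster": cluster,
    "task": {"type": "worker", "index": 0},  # e.g. the first worker on machine A
})
```

The cluster spec must be identical on every machine; only the `task.index` changes per worker process.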