Keras-Tuner parallel processing using multi-core CPU

Hello all,
I am currently running a hyperparameter search using GridSearch. I wanted to get some clarification on whether or not Keras-Tuner can utilize multiple CPU cores/threads to speed up the process of Hyperparameter tuning. I know there is documentation online about utilizing multiple GPUs via tf.distribute, but I am specifically interested in utilizing my Threadripper PRO 3975WX and NVIDIA RTX A6000.

Thanks

Hi,

Yes, distributed hyperparameter search is supported by KerasTuner. In this mode, one "chief" process runs the search oracle that assigns trials, and one or more worker processes execute those trials in parallel. The workers can be separate processes on a single multi-core machine (which fits your Threadripper setup) or spread across multiple machines, and each worker can use one or more GPUs. Because the setup is driven by environment variables, the same tuning script can run single-process or with many parallel workers. See the KerasTuner distributed tuning guide for more details.
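As a rough illustration, here is a minimal sketch of the environment-variable setup described above. The variable names (`KERASTUNER_TUNER_ID`, `KERASTUNER_ORACLE_IP`, `KERASTUNER_ORACLE_PORT`) come from the KerasTuner distributed tuning guide; the script name `tune.py`, the port, and the worker count are placeholders you would replace with your own:

```shell
# Chief process: runs the oracle service that assigns trials; it does not train.
export KERASTUNER_TUNER_ID="chief"
export KERASTUNER_ORACLE_IP="127.0.0.1"   # address workers use to reach the chief
export KERASTUNER_ORACLE_PORT="8000"      # any free port
# python tune.py &                        # tune.py = your tuning script (placeholder)

# Worker process: requests hyperparameters from the chief and runs the training.
# Launch as many workers as your CPU cores / GPU memory allow, each with a
# unique tuner ID (tuner0, tuner1, ...).
export KERASTUNER_TUNER_ID="tuner0"
export KERASTUNER_ORACLE_IP="127.0.0.1"
export KERASTUNER_ORACLE_PORT="8000"
# python tune.py &
```

Every process runs the same tuning script; KerasTuner reads these variables at startup to decide whether a given process acts as the chief or as a worker.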