I’m using Google Colab with a GPU. Are there any reasons why a specific model wouldn’t be able to use a GPU for training?
In my case, I’ve worked through the TensorFlow Recommenders example, and it trained perfectly on the GPU. With my own model, however, training only runs on the CPU, which is significantly slower.
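For context, this is the check I'd expect to run first: if TensorFlow doesn't report any visible GPUs, everything falls back to CPU silently. A minimal sketch using standard TensorFlow APIs:

```python
import tensorflow as tf

# List the GPUs TensorFlow can actually see. An empty list means every
# op will be placed on the CPU, even if the Colab runtime has a GPU.
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

# Optionally log device placement, so subsequent ops print which
# device (CPU:0 / GPU:0) they actually execute on.
tf.debugging.set_log_device_placement(True)
```

If the list is empty even with a GPU runtime selected, the problem is the environment rather than the model itself.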
Here’s my training notebook - Google Colab
Maybe loading the data from SQLite during training is the problem. Beyond that, my setup differs only slightly from the example notebook: my inputs are based on embeddings from GPT-3.
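If SQLite is the bottleneck, one common pattern is to read the whole table into memory once before training, so the input pipeline never touches the database per step. A minimal sketch (table name, column names, and the 4-dim embeddings are made up for illustration; a real GPT-3 embedding table would have far more columns/dimensions):

```python
import sqlite3
import numpy as np

# Build a tiny in-memory stand-in for the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, embedding BLOB)")

# Insert a few fake float32 embedding vectors stored as raw bytes.
rng = np.random.default_rng(0)
for i in range(3):
    vec = rng.standard_normal(4).astype(np.float32)
    conn.execute("INSERT INTO items (id, embedding) VALUES (?, ?)",
                 (i, vec.tobytes()))
conn.commit()

# Fetch everything ONCE and stack into a single NumPy array; a
# tf.data pipeline built from this array feeds from RAM, not SQLite.
rows = conn.execute("SELECT embedding FROM items ORDER BY id").fetchall()
embeddings = np.stack([np.frombuffer(r[0], dtype=np.float32) for r in rows])
print(embeddings.shape)  # (3, 4)
```

From there, something like `tf.data.Dataset.from_tensor_slices(embeddings)` keeps the training loop off the database entirely.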