I apologize in advance, as this question must have been asked before, but I couldn’t find a resource. The input of my model has dimension 130, and I create the dataset from a list of lists. The dataset is large, but building the list itself is fast. However, the subsequent call to tf.data.Dataset.from_tensor_slices takes far too long. What can I do?
Hi @over_dose ,
can you try converting the list of lists to an np.array() before creating the dataset? Somehow that is faster than passing lists directly (see the timing sketch below).
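Since the screenshot from the original reply isn’t reproduced here, here is a minimal timing sketch of the same comparison. The sample count and dtype are placeholder assumptions; only the 130-wide rows come from the question.

```python
import time
import numpy as np
import tensorflow as tf

# Hypothetical data: 100k samples of 130 floats each
# (the 130 matches the question; the 100k is an assumption).
data = [[float(j) for j in range(130)] for _ in range(100_000)]

# Passing the list of lists directly: TensorFlow has to infer
# shape and dtype element by element, which is slow.
start = time.perf_counter()
ds_from_list = tf.data.Dataset.from_tensor_slices(data)
print(f"from list of lists: {time.perf_counter() - start:.2f}s")

# Converting to a single contiguous np.array first lets
# from_tensor_slices wrap one buffer in a single step.
arr = np.asarray(data, dtype=np.float32)
start = time.perf_counter()
ds_from_array = tf.data.Dataset.from_tensor_slices(arr)
print(f"from np.array:      {time.perf_counter() - start:.2f}s")
```

The np.array path is typically much faster, because the conversion happens once over a contiguous buffer instead of per nested Python list.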
Please feel free to provide some exact timings and your input data size.
Looking forward,
Dennis