Hey! I’ve detailed my question in this Stack Overflow post:
https://stackoverflow.com/questions/73050659/python-generator-gives-error-in-keras-fit-but-trains-fine-without-generator
My use case involves multiple files that together represent a large dataset.
Each file corresponds to a predetermined batch - the batches are fixed and cannot change.
The files contain encodings, and each observation has a variable number of encodings.
I want to load the data through a generator, or perhaps via tf.data.Dataset if possible.
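Roughly what I have in mind for the loading side is sketched below. This is just a minimal sketch: the file names, the .npz array keys ("encodings", "labels"), and the encoding dimension are placeholders, not my actual data layout.

```python
import numpy as np
import tensorflow as tf

# Hypothetical file list: one predetermined batch per file (placeholder names).
batch_files = ["batch_000.npz", "batch_001.npz"]

ENC_DIM = 128  # assumed (padded) encoding length

def batch_generator():
    # Yield one whole, predetermined batch per file; Keras will not re-batch.
    for path in batch_files:
        with np.load(path) as data:
            # Assumption: each file stores a 2-D float array "encodings"
            # (observations x ENC_DIM) and a 1-D int array "labels".
            yield data["encodings"].astype("float32"), data["labels"].astype("int32")

# Wrap the generator; the output_signature must match what it yields.
# The leading None allows a variable number of observations per batch.
dataset = tf.data.Dataset.from_generator(
    batch_generator,
    output_signature=(
        tf.TensorSpec(shape=(None, ENC_DIM), dtype=tf.float32),
        tf.TensorSpec(shape=(None,), dtype=tf.int32),
    ),
).prefetch(tf.data.AUTOTUNE)
```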
The final fitting uses the triplet semi-hard loss. More details are in the post above.
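For the fitting step, something like the following is what I'm aiming for, assuming TensorFlow Addons' TripletSemiHardLoss and a placeholder embedding model (layer sizes are not my real architecture):

```python
import tensorflow as tf
import tensorflow_addons as tfa  # assumption: using TFA's TripletSemiHardLoss

ENC_DIM = 128  # same assumed encoding length as in the loading sketch

# Minimal embedding model; layer sizes are placeholders.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(ENC_DIM,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(64, activation=None),
    # TripletSemiHardLoss expects L2-normalized embeddings.
    tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
])

model.compile(optimizer="adam", loss=tfa.losses.TripletSemiHardLoss())

# The dataset already yields whole batches, so no batch_size is passed to fit().
model.fit(dataset, epochs=10)
```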
Would really appreciate help with solving this! Thank you!