I’m working on a custom training loop in TensorFlow 2.x where I need to apply data augmentation (rotation, zoom, and horizontal flip) to my image batches during training. My current setup uses tf.data.Dataset for the input pipeline.
Here’s my current code:
```python
import tensorflow as tf

batch_size = 32
input_shape = (224, 224, 3)

# Input pipeline; x_train / y_train are in-memory arrays.
dataset = tf.data.Dataset.from_tensor_slices((x_train, y_train))
dataset = dataset.shuffle(buffer_size=1024).batch(batch_size)

# create_model is my factory that returns a tf.keras.Model for this input shape.
model = create_model(input_shape)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)
```
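The loop itself follows the standard GradientTape pattern; here's a simplified sketch (`loss_fn` and `num_epochs` are placeholders for my actual loss and schedule):

```python
# Simplified sketch of the current loop; loss_fn and num_epochs are placeholders.
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
num_epochs = 10

@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images, training=True)
        loss = loss_fn(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss

for epoch in range(num_epochs):
    for images, labels in dataset:
        loss = train_step(images, labels)
```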
I want to:
- Apply fresh random augmentations to each batch during training (see the sketch after this list)
- Keep the augmentation logic inside the custom training loop so the loop structure stays intact
- Run the augmentations on the GPU for better performance
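The approach I've been sketching composes the Keras preprocessing layers (`tf.keras.layers.RandomFlip` / `RandomRotation` / `RandomZoom` in TF ≥ 2.6; older releases keep them under `tf.keras.layers.experimental.preprocessing`) and calls them inside the train step, on the assumption that tensor ops inside the step execute on the GPU with the rest of the graph. The 0.1 factors are arbitrary values I picked:

```python
# Sketch: compose the three augmentations as Keras preprocessing layers.
# These layers only randomize when called with training=True.
augmenter = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),  # up to ±10% of a full turn
    tf.keras.layers.RandomZoom(0.1),      # up to ±10% zoom
])

@tf.function
def train_step(images, labels):
    images = augmenter(images, training=True)  # augment per batch, on-device
    with tf.GradientTape() as tape:
        predictions = model(images, training=True)
        loss = loss_fn(labels, predictions)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss
```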
What’s the most efficient way to implement this in TensorFlow 2.x while maintaining the custom training loop structure?
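For completeness, the alternative I'm aware of is mapping the augmentation over the dataset. My understanding is that tf.data transformations run on the CPU by default, which seems to work against the GPU goal, but I may be wrong about that:

```python
# Alternative I've considered: augment in the input pipeline instead of
# the train step. tf.data.map work runs on the CPU by default, I believe.
dataset = dataset.map(
    lambda images, labels: (augmenter(images, training=True), labels),
    num_parallel_calls=tf.data.AUTOTUNE,
).prefetch(tf.data.AUTOTUNE)
```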