I’m tidying up some code so that it can be called from a module. The original version runs through fine when the uncompiled model is defined explicitly next to the learning-rate test. Please have a look at the error and the code below and let me know where I’m going wrong.
When I call the code as a function, passing a similarly compiled model, I get:
ValueError: `y` argument is not supported when using dataset as input.
This is raised at:
lr_history = compiled_model.fit(training_dataset, num_epochs,
                                callbacks=[lr_schedule])
The original code:
import tensorflow as tf

def create_uncompiled_model():
    model = tf.keras.models.Sequential([
        tf.keras.layers.Conv1D(filters=64, kernel_size=3, ...),
        tf.keras.layers.LSTM(64, return_sequences=True),
        tf.keras.layers.Dense(10, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    return model
def adjust_learning_rate(dataset):
    model = create_uncompiled_model()

    # Increase the learning rate by a factor of 10 every 20 epochs
    lr_schedule = tf.keras.callbacks.LearningRateScheduler(
        lambda epoch: 1e-4 * 10**(epoch / 20))

    optimizer = tf.keras.optimizers.SGD(momentum=0.9)

    model.compile(loss=tf.keras.losses.Huber(),
                  optimizer=optimizer,
                  metrics=["mae"])

    history = model.fit(dataset, epochs=100, callbacks=[lr_schedule])
    return history

# Run the training with dynamic LR
lr_history = adjust_learning_rate(train_set)
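For reference, the scheduler sweeps the learning rate upward by a factor of 10 every 20 epochs, so over the 100 epochs it covers roughly 1e-4 up to about 1e+1. A quick sanity check of the schedule (just the lambda evaluated at a few epochs, not part of the training code):

schedule = lambda epoch: 1e-4 * 10**(epoch / 20)
for epoch in (0, 20, 40, 60, 80, 99):
    print(epoch, schedule(epoch))
# 0 -> 1e-04, 20 -> 1e-03, 40 -> 1e-02, 60 -> 1e-01, 80 -> 1e+00, 99 -> ~8.9e+00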
My function code:
def learning_rate_history(compiled_model, train_set):
    lr_schedule = tf.keras.callbacks.LearningRateScheduler(
        lambda epoch: 1e-4 * 10**(epoch / 20))

    # This is the line that raises the ValueError
    lr_history = compiled_model.fit(train_set, num_epochs,
                                    callbacks=[lr_schedule])
    return lr_history
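For completeness, this is roughly how I call the function. num_epochs and the compile step here are placeholders mirroring the original settings, not the exact module code:

num_epochs = 100  # placeholder; mirrors the epochs=100 used in the original code

model = create_uncompiled_model()
model.compile(loss=tf.keras.losses.Huber(),
              optimizer=tf.keras.optimizers.SGD(momentum=0.9),
              metrics=["mae"])

# This call is what raises the ValueError quoted above
lr_history = learning_rate_history(model, train_set)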