This is the code:
learning_rate = get_learning_rate(batch)
tf.summary.scalar('learning_rate', learning_rate)

if OPTIMIZER == 'momentum':
    optimizer = tf.keras.optimizers.SGD(learning_rate,
                                        momentum=MOMENTUM)
elif OPTIMIZER == 'adam':
    optimizer = tf.keras.optimizers.Adam(learning_rate)

train_op = optimizer.minimize(loss, global_step=batch)

# Add ops to save and restore all the variables.
saver = tf.train.Saver()
This is the error I get:
train_op = optimizer.minimize(loss, global_step=batch)
TypeError: minimize() got an unexpected keyword argument 'global_step'
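For reference, here is a minimal sketch of the signature mismatch as I understand it, assuming TF 2.x (w and loss_fn below are toy placeholders, not my real model or loss):

import tensorflow as tf

# Toy variable and loss, only to show the call signatures.
w = tf.Variable(1.0)

def loss_fn():
    return (w - 3.0) ** 2

opt = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)

# Keras optimizers have no global_step argument; they keep their own
# step counter in opt.iterations and expect an explicit variable list.
opt.minimize(loss_fn, var_list=[w])
print(int(opt.iterations))  # incremented automatically by the optimizer

# The TF1-style optimizer is the one whose minimize() accepts global_step:
# step = tf.compat.v1.train.get_or_create_global_step()
# train_op = tf.compat.v1.train.MomentumOptimizer(0.1, 0.9).minimize(
#     loss, global_step=step)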