Model input incompatible with the Layer

I have created a custom tuner that explores multiple hyperparameters to optimize my model. The custom tuner calls a preprocess function that creates time windows (the length of which is a hyperparameter); these windows are then passed to a model-building function that takes several other hyperparameters (learning rate, number of layers, dropout, etc.).
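For reference, the windowing step works along these lines (a minimal sketch, not my exact code: the slicing scheme and the assumption that each window predicts the output immediately after it are illustrative):

```python
import numpy as np

def create_window_dataset(features, output, window_size):
    # Slide a window of `window_size` consecutive rows over the
    # features; each window is paired with the output value that
    # immediately follows it.
    X, Y = [], []
    for i in range(len(features) - window_size):
        X.append(features[i:i + window_size])
        Y.append(output[i + window_size])
    # X has shape (n_windows, window_size, n_features),
    # Y has shape (n_windows, ...) matching the output.
    return np.array(X), np.array(Y)
```

So the model's input shape depends directly on the sampled window_size.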

The model is created according to the chosen hyperparameters, including the window length. However, when tuner.search is called, it reports that the model's input doesn't match the expected input shape.

Here is my CustomTuner class:

class CustomTuner(kt.tuners.BayesianOptimization):
    def run_trial(self, trial, *args, **kwargs):
        return super().run_trial(trial, *args, **kwargs)

Here is the tuner that is created:

tuner = CustomTuner(
    hypermodel=lambda hp: create_model_and_preprocess_data(hp, train_features, train_output),
    objective=SumOfMSEsObjective(),
    max_trials=200,
    directory='logs',
    overwrite=True
)

Here is the preprocessing and model-building function:

def create_model_and_preprocess_data(hp, train_features, train_output):
    # Use hyperparameters to determine window size
    window_size = hp.Int('window_size', min_value=1, max_value=100)

    # Window the data based on the determined window size
    X_windowed, Y_windowed = create_window_dataset(train_features, train_output, window_size)

    # Build the model with the windowed data
    model = windowed_model_build(hp, X_windowed.shape[1:], Y_windowed.shape[1:], X_windowed)

    return model

I can confirm that the model is built with the window size selected in this function. However, when training starts, the data fed to the model doesn't have the shape the model expects.
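To make the mismatch concrete, here is a minimal numpy-only illustration (hypothetical shapes, no Keras required) of what I think is going wrong: the raw arrays keep their 2-D shape when handed to tuner.search, while each trial's model is built for the 3-D windowed shape.

```python
import numpy as np

# Hypothetical raw data: 500 timesteps, 8 features.
train_features = np.zeros((500, 8))

window_size = 10  # one trial's sampled 'window_size' hyperparameter

# Per-sample shape the model was built for: (window_size, n_features).
expected = (window_size, train_features.shape[1])

# Per-sample shape the raw array has if passed straight to search().
provided = train_features.shape[1:]

print(expected, provided)  # (10, 8) vs (8,) -> the shapes disagree
```

Is the problem that the data passed to tuner.search is never re-windowed per trial, and if so, where should the per-trial windowing happen?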