Fine-tune VGG16 + TimeDistributed

Hi,
I would like to fine-tune VGG16 by adding a sigmoid activation to its last fully connected layer, and then feed the output into an LSTM through a TimeDistributed wrapper.

The code below does not work for me:

vgg_model = VGG16(weights='imagenet', include_top=True, input_shape=(224, 224, 3), pooling=None)

model = Sequential()

# add all layers except the last two layers [FC + Prediction]

for layer in vgg_model.layers[:-2]:
    model.add(layer)

for layer in model.layers:
    layer.trainable = False

# Add FC with sigmoid

model.add(Dense(4096, activation='sigmoid'))

# add a TimeDistributed layer

model.add(TimeDistributed(model, input_shape=(10, 224, 224, 3)))

model.add(LSTM(256, activation='relu', return_sequences=False))

model.add(Dense(64, activation='relu'))

model.add(Dropout(0.5))

model.add(Dense(5, activation='linear'))

model.summary()

Thank you

Hi @youb ,

Could you please try the modified version of your code below and let us know if it works for you?

import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Dropout, LSTM, TimeDistributed
from tensorflow.keras.models import Sequential

# Load the VGG16 model
vgg_model = VGG16(weights='imagenet', include_top=True, input_shape=(224, 224, 3))

# Freeze the weights of the VGG16 model
for layer in vgg_model.layers:
    layer.trainable = False

# Replace the activation of the last fully connected (prediction) layer with sigmoid;
# the activation must be a callable, not the string 'sigmoid'
vgg_model.layers[-1].activation = tf.keras.activations.sigmoid
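If reassigning the activation in place does not take effect in your Keras version, an alternative is to rebuild the top of the network explicitly, which also matches your original goal of a sigmoid 4096-unit FC layer (this is just a sketch; frame_model and fc_sigmoid are placeholder names):

# Take VGG16 up to its second fully connected layer ('fc2'), then add a sigmoid FC layer on top
base = VGG16(weights='imagenet', include_top=True, input_shape=(224, 224, 3))
features = base.get_layer('fc2').output
features = Dense(4096, activation='sigmoid', name='fc_sigmoid')(features)
frame_model = tf.keras.Model(inputs=base.input, outputs=features)
frame_model.trainable = False

frame_model can then be wrapped in TimeDistributed below in place of vgg_model.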

# Wrap the frozen VGG16 model in a TimeDistributed layer so it runs on every frame
# (10 frames per sample, as in your original code)
time_distributed = TimeDistributed(vgg_model, input_shape=(10, 224, 224, 3))

# LSTM layer that processes the sequence of per-frame features
lstm = LSTM(256, activation='relu')

# Dense layer on top of the LSTM output
dense1 = Dense(64, activation='relu')

# Dropout layer for regularization
dropout = Dropout(0.5)

# Output layer with 5 units
dense2 = Dense(5, activation='linear')

# Assemble and compile the model
model = Sequential()
model.add(time_distributed)
model.add(lstm)
model.add(dense1)
model.add(dropout)
model.add(dense2)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
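One note: the last Dense layer uses a linear activation, which usually points to a regression target. If your 5 outputs are continuous values, a mean-squared-error loss may be a better match than categorical cross-entropy, for example:

model.compile(loss='mse', optimizer='adam', metrics=['mae'])

If the 5 outputs are class scores instead, the usual pairing is a 'softmax' activation on dense2 together with categorical_crossentropy.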

# Train the model
model.fit(x_train, y_train, epochs=10)

# Evaluate the model
loss, accuracy = model.evaluate(x_test, y_test)
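Before training on real data, you can sanity-check that the model builds and the shapes line up by running it on random dummy input (a minimal sketch, assuming 2 clips of 10 frames each and 5 output values per clip, matching the shapes above):

import numpy as np

# 2 dummy clips, each with 10 frames of 224x224 RGB images
x_dummy = np.random.rand(2, 10, 224, 224, 3).astype('float32')

model.summary()
pred = model.predict(x_dummy)
print(pred.shape)  # expected: (2, 5)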

Please let me know if it helps you.

Thanks.