import tensorflow as tf
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.preprocessing.image import ImageDataGenerator
The value of data_dir below will depend on how you upload your data to Google Colab; see the upload notes further down.
# Set the parameters for data preprocessing and augmentation
batch_size = 32
image_size = (224, 224)
# Create data generators for the training and validation data
train_datagen = ImageDataGenerator(
    rescale=1.0/255.0,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    validation_split=0.2
)
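With validation_split=0.2 set here, roughly 20% of the images are held out for validation when you later pass subset='validation' to flow_from_directory. As a rough sketch of the split arithmetic only (plain Python with hypothetical file names, not Keras's exact selection logic):

```python
# Sketch: how a 0.2 validation split partitions a list of images.
# File names below are hypothetical placeholders.
files = [f"img_{i}.jpg" for i in range(100)]

validation_split = 0.2
n_valid = int(len(files) * validation_split)

valid_files = files[:n_valid]   # 20 images held out for validation
train_files = files[n_valid:]   # 80 images left for training

print(len(train_files), len(valid_files))  # 80 20
```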
In Google Colab, you'll need to upload your data to the Colab environment or mount Google Drive to access your data.
You can use the following code to upload a zip file to Google Colab and extract it:
from google.colab import files
uploaded = files.upload()
!unzip data.zip
Since the path used below lives under /content/drive, you can instead mount Google Drive:
from google.colab import drive
drive.mount('/content/drive')
Replace data_dir with the path where you uploaded or extracted your data in Google Colab.
data_dir = '/content/drive/MyDrive/CP III/DATASET'
train_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=image_size,
    batch_size=batch_size,
    class_mode='categorical',
    subset='training'
)
valid_generator = train_datagen.flow_from_directory(
    data_dir,
    target_size=image_size,
    batch_size=batch_size,
    class_mode='categorical',
    subset='validation'
)
# Build the ResNet-50 model
base_model = ResNet50(include_top=False, weights='imagenet', input_shape=(224, 224, 3))
model = Sequential()
model.add(base_model)
model.add(Flatten())
model.add(Dense(50, activation='softmax'))  # 50 output classes; adjust to match your dataset
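The final Dense layer with a softmax activation turns the flattened features into a probability distribution over the (here, 50) classes. A small self-contained sketch of what softmax does to a vector of logits:

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Three hypothetical logits -> three probabilities that sum to 1,
# with the largest logit getting the largest probability.
probs = softmax([2.0, 1.0, 0.1])
print(probs)
print(sum(probs))  # ~1.0
```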
# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
# Train the model
epochs = 100
steps_per_epoch = train_generator.n // train_generator.batch_size
validation_steps = valid_generator.n // valid_generator.batch_size
model.fit(
    train_generator,
    epochs=epochs,
    steps_per_epoch=steps_per_epoch,
    validation_data=valid_generator,
    validation_steps=validation_steps,
)
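steps_per_epoch and validation_steps above are simply the number of complete batches in one pass over each generator; integer division drops any final partial batch. A quick check of that arithmetic with the batch size used here and hypothetical image counts:

```python
# Integer division gives the number of *complete* batches per epoch;
# any remainder of fewer than batch_size images is dropped by this count.
# The image counts below are hypothetical.
batch_size = 32
n_train = 1000   # hypothetical number of training images
n_valid = 250    # hypothetical number of validation images

steps_per_epoch = n_train // batch_size
validation_steps = n_valid // batch_size

print(steps_per_epoch, validation_steps)  # 31 7
```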
# Evaluate the model on the test data
# Note: this reuses data_dir, which is the training directory, so the numbers
# below reflect training data; point it at a separate held-out test directory
# for a meaningful evaluation.
test_datagen = ImageDataGenerator(rescale=1.0/255.0)
test_generator = test_datagen.flow_from_directory(
    data_dir,
    target_size=image_size,
    batch_size=batch_size,
    class_mode='categorical',
    shuffle=False
)
test_loss, test_accuracy = model.evaluate(test_generator)
print(f'Test Loss: {test_loss:.4f}')
print(f'Test Accuracy: {test_accuracy:.4f}')
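model.evaluate reports categorical accuracy: the fraction of samples whose argmax prediction matches the true class index. The same metric computed by hand on a few hypothetical prediction vectors:

```python
def accuracy(y_true, y_pred_probs):
    # Compare the argmax of each probability vector to the true class index.
    correct = 0
    for true_idx, probs in zip(y_true, y_pred_probs):
        pred_idx = probs.index(max(probs))
        if pred_idx == true_idx:
            correct += 1
    return correct / len(y_true)

# Hypothetical predictions for 4 samples over 3 classes.
y_true = [0, 1, 2, 1]
y_pred = [
    [0.7, 0.2, 0.1],  # predicts 0 (correct)
    [0.1, 0.8, 0.1],  # predicts 1 (correct)
    [0.5, 0.3, 0.2],  # predicts 0 (wrong, true class is 2)
    [0.2, 0.6, 0.2],  # predicts 1 (correct)
]
print(accuracy(y_true, y_pred))  # 0.75
```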