Link log files from a Google Drive folder to Google Colab and run TensorBoard to visualize the model

Hi, I am running a dozen models with different architectures in Google Colab and saving the log files to my Google Drive folder. Now I would like to use TensorBoard to visualize the accuracy and loss of the models, but I can't call it from my Google Drive folder.

According to the sample provided by TensorFlow (Google Colab), the code is shown below:
%tensorboard --logdir logs

But I saved the logs to my Google Drive so that I don't lose them if Colab crashes or is interrupted.

I tried the following code, but it is still not working:
%tensorboard --logdir content/drive/MyDrive/CNN/logs
%tensorboard --logdir 'content/drive/MyDrive/CNN/logs'

Hope someone can help me with this issue.


Just to make sure, can you access the logs from a cell, like !ls on the dir?

Even with the logs on Drive, if you were disconnected from Colab at some point and came back, you might need to re-mount the Drive folder for it to be accessible.
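For example, a minimal sketch of that check, assuming the log folder from the original post (/content/drive/MyDrive/CNN/logs):

from google.colab import drive

# Re-mount Google Drive after a disconnect
drive.mount('/content/drive')

# Check that the log directory is actually visible from this runtime
# (path taken from the original post; adjust to your own layout)
!ls /content/drive/MyDrive/CNN/logs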


Hello, I'm having trouble running TensorBoard as well:

Is there a way to pause/unpause execution of code cells in Google Colab? I know we can interrupt and stop code cells, but can we pause/unpause?

Maybe I need to give some background on what I’m trying to do:
I have TensorBoard watching a folder on Google Drive (my log dir) for changes to my model's metrics. It runs in a separate Colab notebook, since I can't run two code cells at the same time in the same notebook (i.e. TensorBoard in one cell in parallel with another cell running my training and eval script). Having TensorBoard in a separate instance sort of works, but when TensorFlow updates the log files, they don't seem to update in real time across Google Drive endpoints. The training code has to stop before the latest log files are written and synced across Drive, and only then does TensorBoard detect the change and see the files. That's why I was wondering whether I could "pause" and "resume" code cells: just long enough for the log files to be written and synced, then unpause and continue training. Or is there another way to do this?

I know that on a local machine this isn't a problem: I would just run all the Python scripts in parallel. I'm just wondering how to do it in Google Colab. Thanks for any advice!

As far as I know, you can't pause cell execution on Colab.

There might be some customization possible here: TensorBoard

But if you are using this callback already, it writes the logs every epoch by default. If the logs aren't being flushed at the pace you'd like, I don't know if there's much that can be done.
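For reference, a rough sketch of that kind of customization; the update_freq value and the Drive path here are assumptions, not something stated in the posts above:

import tensorflow as tf

# Write scalar summaries every 100 training batches instead of only once per epoch
# (update_freq also accepts 'epoch', which is the default)
tb_callback = tf.keras.callbacks.TensorBoard(
    log_dir='/content/drive/MyDrive/CNN/logs',  # assumed Drive path
    histogram_freq=1,
    update_freq=100,
)

Even then, as described above, Drive may not sync the event files to the other Colab instance until the writing process pauses, so more frequent writes may not fully solve the syncing problem.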

I created a code example: AIU_CS512_DL/000_Lecture_Project/week06/02_Code at main · peterhchen/AIU_CS512_DL · GitHub

from google.colab import drive

# Mount Google Drive into the Colab runtime
drive.mount('/content/drive/')

# List the mount point, the MyDrive folder, and the log folder to confirm access
path_name = '/content/drive/'
print(f'!ls {path_name}:')
!ls {path_name}
print()
mydrive_name = path_name + 'MyDrive'
print(f'!ls {mydrive_name}:')
!ls {mydrive_name}
print()
log_folder = path_name + 'MyDrive/' + 'log/'
print('log_folder:', log_folder)
print(f'!ls {log_folder}:')
!ls {log_folder}

import tensorflow as tf
from tensorflow import keras

# X_train / y_train are assumed to be 28x28 images with integer labels in 0-9
# (e.g. MNIST or Fashion-MNIST), matching the input shape and output size below.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(100, activation='relu'),
    keras.layers.Dense(10, activation='sigmoid')
])

model.compile(optimizer='SGD',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Write TensorBoard logs to the Drive folder set up above
#tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs/", histogram_freq=1)
tb_callback = tf.keras.callbacks.TensorBoard(log_dir=log_folder, histogram_freq=1)
model.fit(X_train, y_train, epochs=5, callbacks=[tb_callback])
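
After training, TensorBoard can be pointed at the same Drive folder from a notebook cell. This assumes the log_folder built above (/content/drive/MyDrive/log/); note the leading slash in the absolute path:

# Load the TensorBoard notebook extension and point it at the absolute Drive path
%load_ext tensorboard
%tensorboard --logdir /content/drive/MyDrive/log/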