TensorBoard helps you visualize the data that comes out of TensorFlow, such as loss, accuracy, the model graph, embedding projections, and much more.
This quickstart shows how to get started with TensorBoard in Google Colab. Google Colab is a service from Google Research that makes it easy to get started with TensorFlow and Keras.
We can display TensorBoard directly within Colab. Notice that the way we start TensorBoard here is exactly the same as on the command line: it is the same command, just with the %tensorboard line magic in front of it. The same approach also works in Jupyter notebooks.
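To make the parallel concrete, here is the same invocation in both environments; the log directory name below is just a placeholder:

tensorboard --logdir logs      # from a terminal
%tensorboard --logdir logs     # from a Colab or Jupyter cell, once the extension is loaded (shown below)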
We’re going to use the Fashion-MNIST dataset, and we’re going to train a really simple Keras Sequential model, one that has just a few layers, including Dense and Dropout layers.
import tensorflow as tf

# Load the Fashion-MNIST dataset and scale pixel values to [0, 1].
fashion_mnist = tf.keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def create_model():
  # A small Sequential model: flatten the 28x28 image, one hidden
  # Dense layer with dropout, and a 10-class softmax output.
  return tf.keras.models.Sequential([
      tf.keras.layers.Flatten(input_shape=(28, 28)),
      tf.keras.layers.Dense(512, activation='relu'),
      tf.keras.layers.Dropout(0.2),
      tf.keras.layers.Dense(10, activation='softmax')
  ])
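As a quick sanity check, you can instantiate the model and print its layer summary; this step is optional and not part of the training flow:

create_model().summary()  # prints each layer with its output shape and parameter count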
Create Log Directory
TensorBoard requires a logdir to read logs from. Create the logs in a timestamped subdirectory so you can easily select between different training runs.
import os
import datetime

# Timestamped subdirectory, e.g. logs/20240101-123456
logdir = os.path.join("logs", datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
print(logdir)
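Each run gets its own subdirectory under logs, and each one shows up as a separate run in TensorBoard’s run selector. Once a few runs have accumulated, you can list them with a one-liner like this (assuming the logs directory already exists):

print(sorted(os.listdir("logs")))  # one timestamped entry per training run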
Run TensorBoard in Google Colab
We’re going to start TensorBoard before training because, in most cases, training takes longer than a minute, and you want to watch TensorBoard while your model is training to follow its progress.
%tensorboard --port=5036 --logdir $logdir
UsageError: Line magic function %tensorboard not found
Before you run TensorBoard you need to load the TensorBoard extension. To load it, use:
%load_ext tensorboard
Now, to start TensorBoard, specify the root log directory you used above.
%tensorboard --port=5036 --logdir $logdir
No dashboards are active for the current data set.
We started TensorBoard, but it has no data yet. Once a couple of epochs of training have finished, we can refresh it and the dashboards will start to populate. Also allow a few seconds for TensorBoard’s UI to spin up.
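If the embedded UI never appears, you can inspect which TensorBoard instances are running from the notebook. This sketch uses the tensorboard.notebook helper module, with the same port we passed above:

from tensorboard import notebook
notebook.list()  # list currently running TensorBoard instances
notebook.display(port=5036, height=800)  # re-display the instance serving on our port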
Train Model
You’re now ready to train and evaluate your model. We’re going to train it with the Keras Model.fit() API and pass it the TensorBoard callback to make sure we log the right data to visualize in TensorBoard.
Create the Keras TensorBoard callback and specify a log directory. To enable histogram computation every epoch, specify histogram_freq=1 (it is off by default).
# Build and compile the model.
model = create_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Log to the timestamped directory; histogram_freq=1 records
# weight histograms every epoch.
tensorboard_callback = tf.keras.callbacks.TensorBoard(logdir, histogram_freq=1)
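The callback accepts further options if you want more control over what gets logged. The configuration below is an illustrative sketch, not something this tutorial requires:

tensorboard_callback = tf.keras.callbacks.TensorBoard(
    log_dir=logdir,
    histogram_freq=1,     # weight histograms every epoch
    write_graph=True,     # log the model graph (the default)
    update_freq='epoch')  # write metric summaries once per epoch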
Pass the TensorBoard callback to Keras’ Model.fit() to ensure that logs are created and stored.
# Train for 20 epochs, logging train and validation metrics
# through the TensorBoard callback.
model.fit(x=x_train,
          y=y_train,
          epochs=20,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])
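After training, you can evaluate the model on the test set to get final numbers to compare against the curves in TensorBoard; a minimal sketch:

test_loss, test_acc = model.evaluate(x_test, y_test, verbose=2)
print(f"Test accuracy: {test_acc:.4f}")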
So what’s different? First of all, when using the Keras callback, train and validation metrics show up on the same charts, which makes it much easier to compare accuracy, loss, and other metrics across the two splits, and to spot problems like overfitting.
