TensorFlow 1.0 vs 2.0, Part 4: TensorBoard

Why TensorBoard?

Yusup
AI³ | Theory, Practice, Business


In my previous three posts, I covered tf.keras and other features of TensorFlow 1.0 and 2.0 such as computational graphs, Eager execution, and AutoGraph. With this post, I’ll introduce you to the super useful tool, TensorBoard!

After reading, you should understand:

  • Why TensorBoard is useful;
  • How TensorBoard works and what it can do;
  • The basics of TensorBoard Graphs, Scalars, and Histograms.

Why is TensorBoard useful?

"In God we trust, all others must bring data." —W. Edwards Deming.


Although the human brain is an amazing machine, it struggles to make sense of raw numbers at scale. This is why we need data visualizations such as charts and graphs to help us take in and interpret massive amounts of data.

Building and training computational graphs is not an easy task. Your model might underfit, overfit, or get stuck at a local optimum. You might even have a vanishing gradient problem.

How can you debug when you have all these issues?

TensorBoard, to the rescue!

Often, to improve something, you need to be able to measure it. This can be done with TensorBoard, an auxiliary tool that provides the measurements and visualizations needed during the machine-learning workflow.

How Does TensorBoard Work?

Under the hood, TensorBoard relies on tf.summary operations to serialize data in protobuf format, then writes it to log files.

Here is the full list of tf.summary operations:

  • tf.summary.scalar
  • tf.summary.image
  • tf.summary.audio
  • tf.summary.text
  • tf.summary.histogram

When we run TensorBoard, it reads the serialized data from the log directory, deserializes it, and renders the results.
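As a minimal sketch of the writing half of this cycle, here is how you can serialize scalar summaries yourself in TensorFlow 2.0 (the "logs/demo" path and the metric name are arbitrary choices for illustration):

```python
import tensorflow as tf

# Create a writer that serializes summaries as event files into ./logs/demo
writer = tf.summary.create_file_writer("logs/demo")

with writer.as_default():
    for step in range(3):
        # Each call serializes one protobuf record and appends it to the log file
        tf.summary.scalar("my_metric", 0.1 * step, step=step)
    writer.flush()
```

Pointing TensorBoard at logs/demo would then render these three values as a curve in the Scalars tab.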

On another note: compared to TensorFlow 1.0, the summary API changed substantially in TensorFlow 2.0. But I won’t go into detail on that in this post.

Instead, from here on out, we will go full TensorFlow 2.0 and tf.keras. In our example code, you won’t notice those itsy-bitsy tf.summary-related details, thanks to the clean encapsulation and abstraction done by tf.keras.

Note: All code samples in this post are for TensorFlow 2.0 only.

What Can TensorBoard Do?

Whether you are using Jupyter, Colab, or any of their variants, you can load the TensorBoard extension with this magic command:

# Load the TensorBoard notebook extension
%load_ext tensorboard

In the TensorBoard UI, Scalars, Graphs, Distributions, Histograms, and Profile are the menus at the top.

The Distributions and Profile menus are self-explanatory; I’ll explain the others one by one. Note that not all menus show up every time you run TensorBoard. For example, if your code runs in pure Eager mode (no graphs), you won’t see a Graphs menu.

Here is the sample code we’ll use to demonstrate TensorBoard:

# Load the TensorBoard notebook extension
%load_ext tensorboard

import tensorflow as tf
import datetime

# Clear any logs from previous runs
!rm -rf ./logs/

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def create_model():
    return tf.keras.models.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(512, activation='relu'),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(10, activation='softmax')
    ])

model = create_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir,
                                                      histogram_freq=1)

model.fit(x=x_train,
          y=y_train,
          epochs=5,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])

Of course, TensorBoard can also be run in standalone mode—you don’t have to embed it into the Notebooks.
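In standalone mode, you point the tensorboard command-line tool at the same log directory the callback wrote to (the port is an arbitrary choice; 6006 is the default):

```shell
# Start TensorBoard against the log directory used above,
# then open http://localhost:6006 in a browser
tensorboard --logdir logs/fit --port 6006
```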

Graphs

If you aren’t running everything in pure Eager mode (i.e., no AutoGraph), you’ll have to deal with graphs one way or another. TensorBoard can help you visualize in detail what your actual underlying computational graph looks like.

Depending on the complexity of your problem, your model might have thousands of nodes, which makes it hard to get a bird’s-eye view of a large model.

Luckily, tf.keras takes care of all the names and scopes, which makes it easy to navigate the whole graph. (If you want to go down the model hierarchy, all you have to do is click the plus sign to expand the current node.)
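If you are building graphs yourself with tf.function rather than tf.keras, you can still get them into the Graphs tab by tracing a call. Here is a hedged sketch; the function, the log path, and the trace name are all illustrative:

```python
import tensorflow as tf

@tf.function
def my_func(x, y):
    # AutoGraph compiles this Python function into a computational graph
    return tf.nn.relu(tf.matmul(x, y))

writer = tf.summary.create_file_writer("logs/graph_demo")

# Record the graph while tracing the first call
tf.summary.trace_on(graph=True)
z = my_func(tf.ones((2, 3)), tf.ones((3, 2)))
with writer.as_default():
    tf.summary.trace_export(name="my_func_trace", step=0)
    writer.flush()
```

After running this, the Graphs menu shows the matmul and relu nodes that my_func was compiled into.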

Scalars

The Scalars tab shows how scalar values, such as loss and accuracy, change over epochs. As I mentioned before, tf.keras hides away unnecessary complexity.

In the code sample used above to demonstrate TensorBoard, tf.keras does this with the tf.keras.callbacks.TensorBoard callback.

Under the hood, tf.keras logs accuracy and loss at the end of each epoch by default. This is also open for customization if you want to change the default behavior or add custom metrics.
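One way to add a custom scalar is to pair the TensorBoard callback with a LambdaCallback that writes through a tf.summary file writer. The tiny model, the "logs/custom" path, and the metric name below are all invented for illustration:

```python
import numpy as np
import tensorflow as tf

# Writer for our extra metric, alongside the defaults tf.keras logs
file_writer = tf.summary.create_file_writer("logs/custom")

model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

def log_extra(epoch, logs):
    # `logs` holds the built-in Keras metrics for this epoch;
    # write one of them (or anything else) as a custom scalar
    with file_writer.as_default():
        tf.summary.scalar("my_custom_loss", logs["loss"], step=epoch)

x, y = np.random.rand(32, 4), np.random.rand(32, 1)
model.fit(x, y, epochs=2, verbose=0,
          callbacks=[tf.keras.callbacks.LambdaCallback(on_epoch_end=log_extra)])
file_writer.flush()
```

The custom curve then shows up in the Scalars tab next to the built-in ones.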

Histograms

A histogram is an accurate representation of the distribution of numerical data.

The TensorBoard Histogram dashboard displays how the distribution of some tensors in your TensorFlow graph has changed over time. It does this by showing many histogram visualizations of your tensors at different points in time.
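With the tf.keras callback, setting histogram_freq=1 (as in the sample code above) is what populates this dashboard with layer weights. You can also write histograms directly; this sketch logs a drifting normal distribution, with the path and tag invented for illustration:

```python
import tensorflow as tf

writer = tf.summary.create_file_writer("logs/hist_demo")

with writer.as_default():
    for step in range(3):
        # Sample a normal distribution whose mean drifts with each step;
        # TensorBoard renders one histogram slice per step
        values = tf.random.normal(shape=(1000,), mean=float(step))
        tf.summary.histogram("drifting_normal", values, step=step)
    writer.flush()
```

In the Histograms tab you would see the bell curve sliding to the right over the three steps.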

Summary

“To get rich, you need measurement and leverage.” — Paul Graham

I’ve quoted Paul Graham in my previous posts and can’t resist doing so once more, to reiterate the importance of measurement.

While developing machine learning algorithms with TensorFlow or tf.keras, TensorBoard is the tool you can rely on to measure and debug your algorithms.

Now that we’ve covered the what, why, and how of TensorBoard, we can take a step back and tick one more box in this series.

Thanks for reading! If you enjoyed this article, please hit the clap button as many times as you can. It would mean a lot and encourage me to keep sharing my knowledge.

Feel free to share your questions and comments here, and follow me so you don’t miss the latest content!
