How do I use the Tensorboard callback of Keras?

Posted 2019-01-12 16:22

I have built a neural network with Keras. I would like to visualize its data with TensorBoard, so I have used:

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,
                            write_graph=True, write_images=True)

as explained on keras.io. When I run the callback I just get <keras.callbacks.TensorBoard at 0x7f9abb3898>, but no files appear in my "Graph" folder. Is there something wrong with how I have used this callback?

8 answers
家丑人穷心不美
Answer 2 · 2019-01-12 16:43

If you are working with the Keras library and want to use TensorBoard to plot graphs of accuracy and other metrics, follow the steps below.

Step 1: Import the TensorBoard callback from Keras with the command below.

from keras.callbacks import TensorBoard

Step 2: Create the callback in your program just before the "model.fit()" call.

tensor_board = TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)

Note: use "./Graph" (a relative path), which creates the Graph folder in your current working directory; avoid "/Graph", which points to the root of the filesystem.

Step 3: Pass the TensorBoard callback to "model.fit()". A sample is given below.

model.fit(X_train,y_train, batch_size=batch_size, epochs=nb_epoch, verbose=1, validation_split=0.2,callbacks=[tensor_board])

Step 4: Run your code and check whether the Graph folder has appeared in your working directory. If the code above ran correctly, you will have a "Graph" folder in your working directory.

Step 5: Open a terminal in your working directory and type the command below.

tensorboard --logdir ./Graph

Step 6: Now open your web browser and go to the address below.

http://localhost:6006

Once the page loads, the TensorBoard dashboard will show the graphs of your different metrics.
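For reference, here is a minimal end-to-end sketch that puts the steps above together; the tiny model and the random data are made up purely for illustration.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import TensorBoard

# Toy data: 1000 samples with 20 features and binary labels.
X_train = np.random.random((1000, 20))
y_train = np.random.randint(2, size=(1000, 1))

# A small model just to have something to train.
model = Sequential()
model.add(Dense(64, activation='relu', input_dim=20))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])

# Step 2: relative log_dir, so the Graph folder lands in the working directory.
tensor_board = TensorBoard(log_dir='./Graph', histogram_freq=0,
                           write_graph=True, write_images=True)

# Step 3: pass the callback to model.fit().
model.fit(X_train, y_train, batch_size=32, epochs=10,
          verbose=1, validation_split=0.2, callbacks=[tensor_board])

# Steps 5 and 6: run "tensorboard --logdir ./Graph" and open http://localhost:6006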

神经病院院长
Answer 3 · 2019-01-12 16:52

Here is some code:

import keras
from keras import backend as K

# Put the backend in training mode and use channels-last image format.
K.set_learning_phase(1)
K.set_image_data_format('channels_last')

# Write histogram summaries every 2 epochs to limit the number of files on disk.
tb_callback = keras.callbacks.TensorBoard(
    log_dir=log_path,
    histogram_freq=2,
    write_graph=True
)
tb_callback.set_model(model)
callbacks = []
callbacks.append(tb_callback)

# Train net:
history = model.fit(
    [x_train],
    [y_train, y_train_c],
    batch_size=int(hype_space['batch_size']),
    epochs=EPOCHS,
    shuffle=True,
    verbose=1,
    callbacks=callbacks,
    validation_data=([x_test], [y_test, y_test_coarse])
).history

# Test net:
K.set_learning_phase(0)
score = model.evaluate([x_test], [y_test, y_test_coarse], verbose=0)

Basically, histogram_freq=2 is the most important parameter to tune when using this callback: it makes the callback compute and write histogram summaries only every 2 epochs, so that fewer files end up on disk.

So here is an example of how the values of the last convolution evolve throughout training, as seen in TensorBoard under the "Histograms" tab (I found the "Distributions" tab to contain very similar charts, just flipped on their side):

[Image: TensorBoard weights monitoring]

In case you would like to see a full example in context, you can refer to this open-source project: https://github.com/Vooban/Hyperopt-Keras-CNN-CIFAR-100

甜甜的少女心
Answer 4 · 2019-01-12 16:54

Change

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

to

tbCallBack = keras.callbacks.TensorBoard(log_dir='Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

and set your model

tbCallBack.set_model(model)

Run in your terminal

tensorboard  --logdir Graph/
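
Alternatively, instead of calling set_model yourself, you can simply hand the callback to fit, which registers the model on all of its callbacks. A minimal sketch, assuming model, X_train and y_train already exist:

tbCallBack = keras.callbacks.TensorBoard(log_dir='Graph', histogram_freq=0,
                                         write_graph=True, write_images=True)

# fit() calls set_model() on every callback it receives, so no manual
# registration is needed here.
model.fit(X_train, y_train, epochs=10, callbacks=[tbCallBack])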
Answer 5 · 2019-01-12 16:56

You should check out Losswise (https://losswise.com): it has a plugin for Keras that's easier to use than TensorBoard and has some nice extra features. With Losswise you'd just use from losswise.libs import LosswiseKerasCallback and then callback = LosswiseKerasCallback(tag='my fancy convnet 1'), and you're good to go (see https://docs.losswise.com/#keras-plugin).
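
Based only on the calls mentioned above, here is a rough sketch of how the plugin would slot into a training script; the model and data names are placeholders, and you would still need to configure your Losswise API key as described in their docs:

from losswise.libs import LosswiseKerasCallback

# Placeholder model and data; any compiled Keras model will do.
callback = LosswiseKerasCallback(tag='my fancy convnet 1')
model.fit(X_train, y_train,
          epochs=10,
          validation_data=(X_val, y_val),
          callbacks=[callback])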

Evening l夕情丶
Answer 6 · 2019-01-12 17:04

There are a few things to watch out for.

First, use ./Graph, not /Graph.

Second, when you use the TensorBoard callback, always pass validation data, because without it the callback won't start writing histograms.

Third, if you want to use anything other than scalar summaries, you should use only the fit method, because fit_generator will not work. Alternatively, you can rewrite the callback to work with fit_generator.

To add callbacks, just pass them in model.fit(..., callbacks=your_list_of_callbacks), as in the sketch below.
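
A minimal sketch putting those points together; model, x_train, y_train, x_val and y_val are placeholder names:

from keras.callbacks import TensorBoard

# Relative path, so the logs end up next to your script.
tb = TensorBoard(log_dir='./Graph', histogram_freq=1, write_graph=True)

# Validation data is needed for the histogram summaries, and the callback
# goes into the callbacks list of fit().
model.fit(x_train, y_train,
          epochs=10,
          validation_data=(x_val, y_val),
          callbacks=[tb])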

Animai°情兽
Answer 7 · 2019-01-12 17:07
keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

This line creates a TensorBoard callback object; you should capture that object and pass it to the fit function of your model.

tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
...
model.fit(...inputs and parameters..., callbacks=[tbCallBack])

This way you pass your callback object to the function. It will be run during training and will output files that can be used with TensorBoard.

If you want to visualize the files created during training, run in your terminal:

tensorboard --logdir path_to_current_dir/Graph 

Hope this helps!
