TensorBoard Explained
The TensorBoard class is stored in the keras.callbacks module. It has many parameters; the main ones are listed below:
1. log_dir: the location where TensorBoard's log files and other output are saved.
2. histogram_freq: frequency (in epochs) at which activation and weight histograms are computed for each layer of the model; 0 disables histogram computation.
3. write_graph: whether to visualize the model graph in TensorBoard.
4. write_grads: whether to visualize gradient histograms in TensorBoard.
5. batch_size: size of the batch of inputs fed to the network when computing histograms.
6. write_images: whether to visualize the model weights as images in TensorBoard.
7. update_freq: 'batch', 'epoch', or an integer. With 'batch', losses and metrics are written to TensorBoard after each batch; with 'epoch' they are written after each epoch. If an integer is used, they are written after that many samples have been seen.
The default values are as follows:
log_dir='./logs',      # saved by default under the logs folder in the current directory
histogram_freq=0,
batch_size=32,
write_graph=True,      # defaults to True, so the graph is displayed by default
write_grads=False,
write_images=False,
update_freq='epoch'
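As a quick illustration of how these parameters fit together, the callback can be constructed and handed to model.fit() roughly as in the minimal sketch below; the directory name './logs/run1' and the update_freq value of 1000 samples are illustrative choices, not taken from the example that follows.

from keras.callbacks import TensorBoard

# Illustrative settings: compute weight/gradient histograms every epoch and
# write loss/metric values every 1000 samples (both values are assumptions).
tensorboard = TensorBoard(log_dir='./logs/run1',
                          histogram_freq=1,
                          batch_size=32,
                          write_graph=True,
                          write_grads=True,
                          write_images=False,
                          update_freq=1000)

# The callback only takes effect when passed to fit(), e.g.:
# model.fit(x_train, y_train, epochs=10, callbacks=[tensorboard])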
Example of use
Taking MNIST handwritten digit recognition as an example, we enable histogram_freq and write_grads, i.e. we save the weight histograms and the gradient histograms to TensorBoard.
Open a command prompt and launch the TensorBoard page with tensorboard --logdir=model (matching the log_dir passed to the callback in the code below). The page shows:
1. loss and acc curves
2. weight histograms
3. gradient histograms
Implementation Code
import numpy as np
from keras.layers import Input, Dense, Dropout, Activation, Conv2D, MaxPool2D, Flatten
from keras.datasets import mnist
from keras.models import Model
from keras.utils import to_categorical
from keras.callbacks import TensorBoard

if __name__ == "__main__":
    # Load MNIST and add a channel dimension: (samples, 28, 28, 1)
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    x_train = np.expand_dims(x_train, axis=-1)
    x_test = np.expand_dims(x_test, axis=-1)

    # One-hot encode the labels
    y_train = to_categorical(y_train, num_classes=10)
    y_test = to_categorical(y_test, num_classes=10)

    batch_size = 128
    epochs = 10

    # Simple convolutional network
    inputs = Input([28, 28, 1])
    x = Conv2D(32, (5, 5), activation='relu')(inputs)
    x = Conv2D(64, (5, 5), activation='relu')(x)
    x = MaxPool2D(pool_size=(2, 2))(x)
    x = Flatten()(x)
    x = Dense(128, activation='relu')(x)
    x = Dropout(0.5)(x)
    x = Dense(10, activation='softmax')(x)
    model = Model(inputs, x)

    model.compile(loss='categorical_crossentropy', optimizer="adam", metrics=['acc'])

    # Save weight and gradient histograms every epoch under ./model
    tensorboard = TensorBoard(log_dir="./model", histogram_freq=1, write_grads=True)

    history = model.fit(x_train, y_train,
                        batch_size=batch_size,
                        epochs=epochs,
                        shuffle=True,
                        validation_split=0.2,
                        callbacks=[tensorboard])
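As an optional follow-up (not part of the original example), the test set that the script loads but never uses could be evaluated once training finishes, for instance:

# Optional follow-up, not in the original example: evaluate on the held-out test set.
test_loss, test_acc = model.evaluate(x_test, y_test, batch_size=batch_size)
print("test loss: %.4f, test acc: %.4f" % (test_loss, test_acc))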
The above is a detailed example of using TensorBoard to visualize a Keras (TensorFlow) model. For more on TensorBoard visualization with Keras, please see my other related articles!