Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


model’s graph (not quite the same as the dynamic computation graph we drew using make_dot(), though), and, of course, both scalars: training and validation losses.

add_graph

Let’s start with add_graph(): unfortunately, its documentation seems to be absent (as at the time of writing), and its default values for arguments lead you to believe you don’t need to provide any inputs (input_to_model=None). What happens if we try it?

writer.add_graph(model)

We’ll get an enormous error message that ends with:

Output

...
TypeError: 'NoneType' object is not iterable

So, we do need to send it some inputs together with our model. Let’s fetch a mini-batch of data points from our train_loader and then pass it as input to add_graph():

Adding the Model’s Graph

# Fetching a tuple of feature (dummy_x) and label (dummy_y)
dummy_x, dummy_y = next(iter(train_loader))

# Since our model was sent to device, we need to do the same
# with the data.
# Even here, both model and data need to be on the same device!
writer.add_graph(model, dummy_x.to(device))

If you open (or refresh) your browser (or re-run the cell containing the magic %tensorboard --logdir runs inside a notebook) to look at TensorBoard, it should look like this:

TensorBoard | 157
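If you want to try add_graph() outside the book’s training setup, here is a minimal, self-contained sketch. It assumes torch and tensorboard are installed; the toy Sequential model, the dummy batch, and the log directory name runs/graph_demo are all stand-ins for the book’s model and train_loader:

```python
import torch
import torch.nn as nn
from torch.utils.tensorboard import SummaryWriter

# Toy stand-ins for the book's model and device
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = nn.Sequential(nn.Linear(1, 1)).to(device)

writer = SummaryWriter('runs/graph_demo')

# A dummy mini-batch standing in for next(iter(train_loader));
# 16 data points, one feature each
dummy_x = torch.rand(16, 1)

# Model and inputs must live on the same device
writer.add_graph(model, dummy_x.to(device))
writer.close()
```

After running it, launching TensorBoard with --logdir runs should show the model’s graph under the GRAPHS tab.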


Figure 2.4 - Dynamic computation graph on TensorBoard

add_scalars

What about sending the loss values to TensorBoard? I’m on it! We can use the add_scalars() method to send multiple scalar values at once; it needs three arguments:

• main_tag: the parent name of the tags, or the "group tag," if you will
• tag_scalar_dict: the dictionary containing the key: value pairs for the scalars you want to keep track of (in our case, training and validation losses)
• global_step: step value; that is, the index you’re associating with the values you’re sending in the dictionary; the epoch comes to mind in our case, as losses are computed for each epoch

How does it translate into code? Let’s check it out:

Adding Losses

writer.add_scalars(
    main_tag='loss',
    tag_scalar_dict={'training': loss,
                     'validation': val_loss},
    global_step=epoch
)

If you run the code above after performing the model training, it will just send both loss values computed for the last epoch (199). Your TensorBoard will look like this (don’t forget to refresh it—it may take a while if you’re running it on Google Colab):

158 | Chapter 2: Rethinking the Training Loop
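Since a single call only logs one point per tag, calling add_scalars() once per epoch inside the training loop is what produces the full loss curves. A minimal, self-contained sketch (the log directory runs/scalars_demo is made up, and the decaying losses are synthetic stand-ins for the real training and validation losses):

```python
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('runs/scalars_demo')

n_epochs = 200
for epoch in range(n_epochs):
    # In the real loop, these come from the training and
    # validation steps; here they are just made-up numbers
    loss = 1.0 / (epoch + 1)
    val_loss = 1.2 / (epoch + 1)

    # One call per epoch: both curves grouped under the 'loss' tag
    writer.add_scalars(
        main_tag='loss',
        tag_scalar_dict={'training': loss,
                         'validation': val_loss},
        global_step=epoch
    )

writer.close()
```

Refreshing TensorBoard then shows two curves, loss_training and loss_validation, plotted against the epoch on the x-axis.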
