Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner’s Guide (Leanpub)


browser, you’ll likely see something like this:

Figure 2.3 - Empty TensorBoard

It doesn’t show you anything yet because it cannot find any data inside the runs folder, as we haven’t sent anything there yet. It will be automatically updated when we send some data to it, so let’s send some data to TensorBoard!

Running It Separately (Binder)

If you chose to follow this book using Binder, you’ll need to run TensorBoard separately. But you won’t have to actually do much. Configuring TensorBoard for running inside Binder’s environment is a bit tricky (it involves Jupyter’s server extensions), so I took care of that for you :-)

Moreover, I’ve provided an automatically generated link that will open a new tab pointing to the TensorBoard instance running in your Binder environment. The link looks like this (the actual URL is generated on the spot, this one is just a dummy):

Click here to open TensorBoard

The only downside is that the folder where TensorBoard will look for logs is fixed: runs.
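If you are following along locally instead of on Binder, you can also start TensorBoard yourself from a terminal, pointing it at that same folder. The command below is just a sketch of the standard TensorBoard CLI, assuming TensorBoard is installed and that you run it from the directory containing runs; by default it serves the dashboard at http://localhost:6006:

    # watch the "runs" folder for new logs
    tensorboard --logdir runs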

SummaryWriter

It all starts with the creation of a SummaryWriter:

    from torch.utils.tensorboard import SummaryWriter

    writer = SummaryWriter('runs/test')

Since we told TensorBoard to look for logs inside the runs folder, it makes sense to actually log to that folder. Moreover, to be able to distinguish between different experiments or models, we should also specify a sub-folder: test.

If we do not specify any folder, the SummaryWriter will default to runs/CURRENT_DATETIME_HOSTNAME, which is not such a great name if you’ll be looking for your experiment results in the future.

So, it is recommended to name it in a more meaningful way, like runs/test or runs/simple_linear_regression. It will then create a sub-folder inside runs (the folder we specified when we started TensorBoard).

Even better, you should name it in a meaningful way and add a datetime or a sequential number as a suffix, like runs/test_001 or runs/test_20200502172130, to avoid writing data from multiple runs into the same folder (we’ll see why this is bad in the "add_scalars" section below).

The summary writer implements several methods to allow us to send information to TensorBoard:

add_graph()           add_scalars()       add_scalar()
add_histogram()       add_images()        add_image()
add_figure()          add_video()         add_audio()
add_text()            add_embedding()     add_pr_curve()
add_custom_scalars()  add_mesh()          add_hparams()

It also implements two other methods for effectively writing data to disk:

• flush()
• close()

We’ll be using the first two methods (add_graph() and add_scalars()) to send our model’s graph and our losses to TensorBoard.
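To tie the naming advice and the writer’s methods together, here is a minimal sketch (not the book’s code): the timestamped run name follows the runs/test_20200502172130 pattern suggested above, and the "loss" tag and the loss values are made up purely for illustration.

    from datetime import datetime

    from torch.utils.tensorboard import SummaryWriter

    # meaningful name plus a datetime suffix, e.g. runs/test_20200502172130,
    # so every run gets its own sub-folder under "runs"
    run_name = f"runs/test_{datetime.now():%Y%m%d%H%M%S}"
    writer = SummaryWriter(run_name)

    # add_scalars() groups related scalars under one main tag ("loss");
    # the third argument is the step the values are plotted against
    for step, (train_loss, val_loss) in enumerate([(0.9, 1.0), (0.5, 0.7), (0.3, 0.6)]):
        writer.add_scalars('loss', {'training': train_loss, 'validation': val_loss}, step)

    writer.flush()  # make sure everything has been written to disk
    writer.close()  # release the writer once this run is finished

After refreshing, TensorBoard should pick up the new sub-folder and plot the two curves in a chart named after the main tag.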

