Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)
browser, you'll likely see something like this:

Figure 2.3 - Empty TensorBoard

It doesn't show you anything yet because it cannot find any data inside the runs
folder, as we haven't sent anything there yet. It will be automatically updated once
we send some data to it, so let's send some data to TensorBoard!

Running It Separately (Binder)

If you chose to follow this book using Binder, you'll need to run TensorBoard
separately. But you won't have to actually do much: configuring TensorBoard to run
inside Binder's environment is a bit tricky (it involves Jupyter's server extensions),
so I took care of that for you :-)

Moreover, I've provided an automatically generated link that will open a new tab
pointing to the TensorBoard instance running in your Binder environment. The link
looks like this (the actual URL is generated on the spot; this one is just a dummy):

Click here to open TensorBoard

The only downside is that the folder where TensorBoard will look for logs is
fixed: runs.

SummaryWriter

It all starts with the creation of a SummaryWriter:

writer = SummaryWriter('runs/test')

Since we told TensorBoard to look for logs inside the runs folder, it makes sense to
actually log to that folder. Moreover, to be able to distinguish between different
experiments or models, we should also specify a sub-folder: test.

If we do not specify any folder, TensorBoard will default to
runs/CURRENT_DATETIME_HOSTNAME, which is not such a great name if you'll be
looking for your experiment results in the future.

So, it is recommended to name it in a more meaningful way, like runs/test or
runs/simple_linear_regression. It will then create a sub-folder inside runs (the
folder we specified when we started TensorBoard).

Even better, you should name it in a meaningful way and add a datetime or a
sequential number as a suffix, like runs/test_001 or runs/test_20200502172130, to
avoid writing the data of multiple runs into the same folder (we'll see why this is
bad in the "add_scalars" section below).

The summary writer implements several methods that allow us to send information
to TensorBoard:

add_graph()           add_scalars()    add_scalar()
add_histogram()       add_images()     add_image()
add_figure()          add_video()      add_audio()
add_text()            add_embedding()  add_pr_curve()
add_custom_scalars()  add_mesh()       add_hparams()

It also implements two other methods for effectively writing data to disk:

• flush()
• close()

We'll be using the first two methods (add_graph() and add_scalars()) to send our