Daniel Voigt Godoy, Deep Learning with PyTorch Step-by-Step: A Beginner’s Guide (Leanpub)
    # These attributes are defined here, but since they are
    # not available at the moment of creation, we keep them None
    self.train_loader = None
    self.val_loader = None
    self.writer = None

The train data loader is obviously required. How could we possibly train a model without it?

"Why don’t we make the train data loader an argument then?"

Conceptually speaking, the data loader (and the dataset it contains) is not part of the model. It is the input we use to train the model. Since we can specify a model without it, it shouldn’t be made an argument of our class.

In other words, our StepByStep class is defined by a particular combination of arguments (model, loss function, and optimizer), which can then be used to perform model training on any (compatible) dataset.

The validation data loader is not required (although it is recommended), and the summary writer is definitely optional.

The class should implement methods to allow the user to supply those at a later time (both methods should be placed inside the StepByStep class, after the constructor method):

    def set_loaders(self, train_loader, val_loader=None):
        # This method allows the user to define which train_loader
        # (and val_loader, optionally) to use
        # Both loaders are then assigned to attributes of the class
        # so they can be referred to later
        self.train_loader = train_loader
        self.val_loader = val_loader

    def set_tensorboard(self, name, folder='runs'):
        # This method allows the user to create a SummaryWriter to
        # interface with TensorBoard
        suffix = datetime.datetime.now().strftime('%Y%m%d%H%M%S')
        self.writer = SummaryWriter(f'{folder}/{name}_{suffix}')
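To see how these pieces fit together from the user’s side, here is a minimal usage sketch. The model, data, and loader below are made up purely for illustration, and it assumes the StepByStep constructor takes the model, loss function, and optimizer, as described above:

    import datetime                   # needed by set_tensorboard
    import torch
    import torch.nn as nn
    import torch.optim as optim
    from torch.utils.data import TensorDataset, DataLoader
    from torch.utils.tensorboard import SummaryWriter  # also needed by set_tensorboard

    # Made-up data, just to have something to feed the loader
    x = torch.rand(100, 1)
    y = 2 * x + 1 + 0.1 * torch.randn(100, 1)
    train_loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

    # The three arguments that define an instance of the class
    model = nn.Sequential(nn.Linear(1, 1))
    loss_fn = nn.MSELoss(reduction='mean')
    optimizer = optim.SGD(model.parameters(), lr=0.1)

    sbs = StepByStep(model, loss_fn, optimizer)
    # Loaders (and the writer) are supplied later, not at construction time
    sbs.set_loaders(train_loader)    # val_loader is optional
    sbs.set_tensorboard('classy')    # writes to runs/classy_<timestamp>

Notice that nothing about the data appears in the constructor call; that is exactly the separation between model and input argued for above.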
"Why do we need to specify a default value to the val_loader? Itsplaceholder value is already None."Since the validation loader is optional, setting a default value for a particularargument in the method’s definition frees the user from having to provide thatargument when calling the method. The best default value, in our case, is the samevalue we chose when specifying the placeholder for the validation loader: None.VariablesThen, there are variables we may want to keep track of. Typical examples are thenumber of epochs, and the training and validation losses. These variables are likelyto be computed and updated internally by the class.We need to append the following code to the constructor method (like we didwith the placeholders):# These attributes are going to be computed internallyself.losses = []self.val_losses = []self.total_epochs = 0"Can’t we just set these variables whenever we use them for the firsttime?"Yes, we could, and we would probably get away with it just fine since our class isquite simple. As classes grow more complex, though, it may lead to problems. So, itis best practice to define all attributes of a class in the constructor method.FunctionsFor convenience, sometimes it is useful to create attributes that are functions,which will be called somewhere else inside the class. In our case, we can createboth train_step_fn() and val_step_fn() using the higher-order functions wedefined in Chapter 2 (Helper Functions #1 and #3, respectively). Both of them takea model, a loss function, and an optimizer as arguments, and all of those are alreadyknown attributes of our StepByStep class at construction time.The code below will be the last addition to our constructor method (once again, aswe did with the placeholders):180 | Chapter 2.1: Going Classy