Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide - leanpub
setattr

The setattr function sets the value of the specified attribute of a given object. But methods are also attributes, so we can use this function to "attach" a method to an existing class and to all its existing instances in one go!

Yes, this is a hack! No, you should not use it in your regular code! Using setattr to build a class by appending methods to it incrementally serves educational purposes only.

To illustrate how it works, and why it may be dangerous, I will show you a little example. Let's create a simple Dog class, which takes only the dog's name as argument:

class Dog(object):
    def __init__(self, name):
        self.name = name

Next, let's instantiate our class; that is, we are creating a dog. Let's call it Rex. Its name is going to be stored in the name attribute:

rex = Dog('Rex')
print(rex.name)

Output
Rex

Then, let's create a bark() function that takes an instance of Dog as argument:

def bark(dog):
    print('{} barks: "Woof!"'.format(dog.name))

Sure enough, we can call this function to make Rex bark:

Going Classy | 185
bark(rex)
Output
Rex barks: "Woof!"
But that’s not what we want. We want our dogs to be able to bark out of the
box! So we will use setattr to give dogs the ability to bark. There is one
thing we need to change, though, and that’s the function’s argument. Since
we want the bark function to be a method of the Dog class itself, the
argument needs to be the method’s own instance: self.
def bark(self):
    print('{} barks: "Woof!"'.format(self.name))

setattr(Dog, 'bark', bark)
Does it work? Let’s create a new dog:
fido = Dog('Fido')
fido.bark()
Output
Fido barks: "Woof!"
Of course it works! Not only can new dogs bark now, but all existing dogs can bark too:
rex.bark()
Output
Rex barks: "Woof!"
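The warning above deserves a concrete demonstration. The sketch below shows why patching a class with setattr is risky, and a slightly safer per-instance alternative using types.MethodType from the standard library. The Cat class and its meow and purr methods are hypothetical names, invented just for this illustration:

```python
import types

class Cat:
    def __init__(self, name):
        self.name = name

def meow(self):
    print('{} meows: "Meow!"'.format(self.name))

# Class-level patch: EVERY instance, existing or future, gets meow()
setattr(Cat, 'meow', meow)

felix = Cat('Felix')
whiskers = Cat('Whiskers')
felix.meow()

# Danger: setattr silently OVERWRITES an existing method, changing the
# behavior of every instance at once
setattr(Cat, 'meow',
        lambda self: print('{} stays silent...'.format(self.name)))
whiskers.meow()  # behavior changed under our feet

# If you really must patch, types.MethodType binds a function to a
# SINGLE instance, leaving the class and its other instances untouched
felix.purr = types.MethodType(
    lambda self: print('{} purrs'.format(self.name)), felix
)
felix.purr()                      # only Felix purrs
print(hasattr(whiskers, 'purr'))  # other instances are unaffected
```

Note that the per-instance patch only lives on that one object: it never touches the class, so it cannot accidentally change the behavior of code elsewhere that uses other instances.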
186 | Chapter 2.1: Going Classy