Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub
dummy_model = nn.Linear(1, 1)
dummy_list = []

def dummy_hook(layer, inputs, outputs):
    dummy_list.append((layer, inputs, outputs))

The (forward) hook function takes three arguments:
• a model (or layer)
• a tensor representing the inputs taken by that model (or layer)
• a tensor representing the outputs generated by that model (or layer)

So, any function that takes three arguments, regardless of their names, can work as a hook. In our case (and in many other cases too), we would like to keep the information that goes through the hook function.

You should use a variable (or variables) defined outside the hook function to store values.

That’s the role of the dummy_list variable in the snippet above. Our dummy_hook() function is as basic as it gets: It simply appends a tuple of its three arguments to the dummy_list variable defined outside the hook function.

"How do you hook the hook function to the model?"

There is a method for it, register_forward_hook(), which takes the hook function and returns a handle, so we can keep track of the hooks attached to our model.

dummy_handle = dummy_model.register_forward_hook(dummy_hook)
dummy_handle

Output
<torch.utils.hooks.RemovableHandle at 0x7fc9a003e190>

Simple enough, right? Let’s see it in action:

dummy_x = torch.tensor([0.3])
dummy_model.forward(dummy_x)

Output
tensor([-0.7514], grad_fn=<AddBackward0>)

It should add a new tuple to the dummy list, one containing a linear layer, an input tensor (0.3), and an output tensor (-0.7514). By the way, your values are going to be different than mine, since we didn’t bother to use a seed here.

dummy_list

Output
[]

"Empty?! So it is not working?"

GOTCHA! I deliberately used the model’s forward() method here to illustrate something we’ve discussed much earlier, in Chapter 1:

You should NOT call the forward(x) method! You should call the whole model instead, as in model(x), to perform a forward pass. Otherwise, your hooks won’t work.

Let’s do it right this time:

dummy_model(dummy_x)

Output
tensor([-0.7514], grad_fn=<AddBackward0>)
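The whole pattern fits in one self-contained sketch: register a hook, show that a direct forward() call bypasses it while model(x) triggers it, and detach it through the handle. The names (model, captured, hook_fn) are mine, not from the book's code; note also that the inputs argument arrives as a tuple of the positional arguments, not a bare tensor.

```python
import torch
import torch.nn as nn

torch.manual_seed(42)  # seed so the captured values are reproducible

model = nn.Linear(1, 1)
captured = []  # storage defined OUTSIDE the hook function

def hook_fn(layer, inputs, outputs):
    # inputs is a tuple of positional args; outputs is the layer's result
    captured.append((layer, inputs, outputs))

handle = model.register_forward_hook(hook_fn)

x = torch.tensor([0.3])

model.forward(x)                # bypasses __call__, so the hook never fires
assert len(captured) == 0

model(x)                        # the full call runs the registered hooks
assert len(captured) == 1
layer, inputs, outputs = captured[0]
assert layer is model           # first element: the hooked layer itself
assert torch.equal(inputs[0], x)  # inputs is a one-element tuple

handle.remove()                 # detach the hook via its handle
model(x)                        # no longer captured
assert len(captured) == 1
```

Calling handle.remove() is the clean way to stop capturing; the handle returned by register_forward_hook() exists precisely so you don't have to dig the hook back out of the module yourself.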