Deep Learning with PyTorch Step-by-Step: A Beginner's Guide - Daniel Voigt Godoy (Leanpub)


Run - Model Training V4

%run -i model_training/v4.py

After updating all parts, in sequence, our current state of development is:

• Data Preparation V2
• Model Configuration V2
• Model Training V4

Let’s inspect the model’s state:

# Checks model's parameters
print(model.state_dict())

Output

OrderedDict([('0.weight', tensor([[1.9419]], device='cuda:0')),
             ('0.bias', tensor([1.0244], device='cuda:0'))])

Plotting Losses

Let’s take a look at both losses, training and validation.

Figure 2.1 - Training and validation losses during training
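The plotting code itself is not reproduced on this page. A minimal sketch is shown below, assuming the training loop appended one value per epoch to two plain Python lists, here called losses and val_losses; adjust the names if your loop stores them differently.

import matplotlib.pyplot as plt

# Assumes `losses` and `val_losses` each hold one value per epoch,
# appended by the training loop (rename if yours differ)
fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(losses, label='Training Loss', c='b')
ax.plot(val_losses, label='Validation Loss', c='r')
ax.set_xlabel('Epochs')
ax.set_ylabel('Loss')
ax.legend()
fig.tight_layout()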

Does your plot look different? Try running the whole pipelineagain:Full Pipeline%run -i data_preparation/v2.py%run -i model_configuration/v2.py%run -i model_training/v4.pyAnd then plot the resulting losses one more time.Cool, right? But, remember in the training step function, when I mentioned thatadding losses to a list was not very cutting-edge? Time to fix that! To bettervisualize the training process, we can make use of…TensorBoardYes, TensorBoard is that good! So good that we’ll be using a tool from thecompeting framework, TensorFlow :-) Jokes aside, TensorBoard is a very usefultool, and PyTorch provides classes and methods so that we can integrate it withour model.Running It Inside a NotebookThis section applies to both Google Colab and local installation.If you are using a local installation, you can either runTensorBoard inside a notebook or separately (check the nextsection for instructions).If you chose to follow this book using Google Colab, you’ll need to run TensorBoardinside a notebook. Luckily, this is easily accomplished using some Jupyter magics.If you are using Binder, this Jupyter magic will not work, forreasons that are beyond the scope of this section. More details onhow to use TensorBoard with Binder can be found in thecorresponding section below.First, we need to load TensorBoard’s extension for Jupyter:152 | Chapter 2: Rethinking the Training Loop
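The extension is loaded with a standard Jupyter magic; a quick sketch of the two magics involved follows (the runs folder is just an assumed log directory, not necessarily the one used later):

# Loads TensorBoard's notebook extension (standard Jupyter magic)
%load_ext tensorboard

# Starts TensorBoard inside the notebook, pointing it at a log directory;
# 'runs' is an assumed folder name here
%tensorboard --logdir runs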

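Before going further, here is a minimal sketch of what the PyTorch side of the integration looks like, using the SummaryWriter class from torch.utils.tensorboard. The log directory, the tag names, and the loss, val_loss, and epoch variables are illustrative assumptions standing in for whatever the training loop produces.

from torch.utils.tensorboard import SummaryWriter

# Creates a writer that logs to the given folder (assumed name)
writer = SummaryWriter('runs/simple_linear_regression')

# Inside the training loop, instead of only appending losses to lists,
# send both values to TensorBoard as scalars under a common tag
writer.add_scalars(
    main_tag='loss',
    tag_scalar_dict={'training': loss, 'validation': val_loss},
    global_step=epoch
)

# Makes sure everything is written to disk when training is done
writer.close()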
