
Model Configuration

from torch import optim

# inception_loss knows how to combine the main and auxiliary outputs
optimizer_model = optim.Adam(model.parameters(), lr=3e-4)
sbs_incep = StepByStep(model, inception_loss, optimizer_model)
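
Both model and inception_loss were created in the preceding pages. In case you are reading this section in isolation, here is a minimal sketch of a compatible setup; it is hypothetical, not the book's exact code, and the weights argument assumes torchvision 0.13 or later:

import torch
import torch.nn as nn
from torchvision import models

# Load a pretrained Inception V3 and freeze all of its parameters
model = models.inception_v3(weights=models.Inception_V3_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

torch.manual_seed(42)
# Replace BOTH classifier heads, the main one and the auxiliary one,
# so each outputs three logits (rock, paper, scissors)
model.fc = nn.Linear(model.fc.in_features, 3)
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 3)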

"Wait, aren’t we pre-processing the dataset this time?"

Unfortunately, no. The preprocessed_dataset() function cannot handle multiple

outputs. Instead of making the process convoluted in order to handle the

peculiarities of the Inception model, I am sticking with the simpler (yet slower) way

of training the last layer while it is still attached to the rest of the model.
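
Those two sets of logits are also why the loss function passed to StepByStep must know how to combine them. The actual inception_loss was defined earlier in the book; here is a minimal sketch of what such a loss could look like (the 0.4 weight for the auxiliary loss follows the original Inception paper):

import torch.nn as nn

def inception_loss(outputs, labels):
    # In training mode, torchvision's Inception V3 returns a namedtuple
    # with two sets of logits (main and auxiliary); in eval mode, only
    # the main logits are returned
    if isinstance(outputs, tuple):
        main_logits, aux_logits = outputs
    else:
        main_logits, aux_logits = outputs, None

    multi_loss_fn = nn.CrossEntropyLoss(reduction='mean')
    loss = multi_loss_fn(main_logits, labels)
    if aux_logits is not None:
        # Weigh the auxiliary loss down, as in the original paper
        loss = loss + 0.4 * multi_loss_fn(aux_logits, labels)
    return loss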

The Inception model also differs from the others in its expected input size: 299x299 pixels instead of 224x224. So, we need to recreate the data loaders accordingly:

Data Preparation

from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder
from torchvision.transforms import Compose, ToTensor, Normalize, Resize

# ImageNet statistics, since the model was pretrained on ImageNet
normalizer = Normalize(mean=[0.485, 0.456, 0.406],
                       std=[0.229, 0.224, 0.225])

# Resizes to 299 (instead of 224) to match Inception's expected input
composer = Compose([Resize(299),
                    ToTensor(),
                    normalizer])

train_data = ImageFolder(root='rps', transform=composer)
val_data = ImageFolder(root='rps-test-set', transform=composer)

# Builds a loader for each set
train_loader = DataLoader(train_data, batch_size=16, shuffle=True)
val_loader = DataLoader(val_data, batch_size=16)
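
As a quick sanity check (not in the original text), we can draw one mini-batch and confirm the new input size; since the Rock Paper Scissors images are square, Resize(299) yields 299x299 tensors:

# Grab a single mini-batch and inspect its shape
images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([16, 3, 299, 299])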

We’re ready, so let’s train our model for a single epoch and evaluate the result:

Model Training

sbs_incep.set_loaders(train_loader, val_loader)
sbs_incep.train(1)

StepByStep.loader_apply(val_loader, sbs_incep.correct)
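
As a hypothetical follow-up, assuming loader_apply aggregates the per-class [n_correct, n_total] counts produced by sbs_incep.correct into a single tensor, as in earlier chapters, the overall accuracy is simply the ratio of the column sums:

# Overall validation accuracy from the per-class counts
results = StepByStep.loader_apply(val_loader, sbs_incep.correct)
accuracy = (results[:, 0].sum() / results[:, 1].sum()).item()
print(f'Validation accuracy: {accuracy:.2%}')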
