
    ax.set_xlabel('Learning Rate')
    ax.set_ylabel('Loss')
    fig.tight_layout()
    return tracking, fig

setattr(StepByStep, 'lr_range_test', lr_range_test)
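The listing above shows only the tail of the method. For reference, here is a minimal, self-contained sketch of the kind of loop an LR range test runs; the exponential schedule, the helper name lr_range_test_sketch, and the layout of the tracking dictionary are assumptions for illustration, not the book's exact implementation:

import matplotlib.pyplot as plt

def lr_range_test_sketch(model, loss_fn, optimizer, data_loader,
                         end_lr, num_iter=100):
    # Grow the learning rate exponentially from the optimizer's
    # initial value up to end_lr over num_iter mini-batches
    start_lr = optimizer.param_groups[0]['lr']
    factor = (end_lr / start_lr) ** (1 / num_iter)
    tracking = {'loss': [], 'lr': []}
    iterator = iter(data_loader)
    for _ in range(num_iter):
        try:
            x_batch, y_batch = next(iterator)
        except StopIteration:  # restart the loader if it runs out
            iterator = iter(data_loader)
            x_batch, y_batch = next(iterator)
        optimizer.zero_grad()
        loss = loss_fn(model(x_batch), y_batch)
        loss.backward()
        optimizer.step()
        # Record the learning rate and loss for this mini-batch
        tracking['lr'].append(optimizer.param_groups[0]['lr'])
        tracking['loss'].append(loss.item())
        for group in optimizer.param_groups:
            group['lr'] *= factor  # bump the LR for the next batch
    fig, ax = plt.subplots(1, 1, figsize=(6, 4))
    ax.plot(tracking['lr'], tracking['loss'])
    ax.set_xscale('log')
    ax.set_xlabel('Learning Rate')
    ax.set_ylabel('Loss')
    fig.tight_layout()
    return tracking, fig

Since each step updates the model's weights with an ever-growing learning rate, the test itself "uses up" the model, which is why a fresh one is created next.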

Since the technique is supposed to be applied to an untrained model, we create a new model (and optimizer) here:

Model Configuration

torch.manual_seed(13)
new_model = CNN2(n_feature=5, p=0.3)
multi_loss_fn = nn.CrossEntropyLoss(reduction='mean')
new_optimizer = optim.Adam(new_model.parameters(), lr=3e-4)

Next, we create an instance of StepByStep and call the new method using the training data loader, the upper range for the learning rate (end_lr), and how many iterations we'd like it to try:

Learning Rate Range Test

sbs_new = StepByStep(new_model, multi_loss_fn, new_optimizer)
tracking, fig = sbs_new.lr_range_test(
    train_loader, end_lr=1e-1, num_iter=100)

Figure 6.14 - Learning rate finder

There we go: a "U"-shaped curve. Apparently, the Karpathy Constant (3e-4) is too low for our model. The descending part of the curve is the region we should aim for: something around 0.01.
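To act on that reading programmatically instead of eyeballing the plot, you can pick a learning rate from the returned tracking dictionary and plug it into a fresh optimizer. The smoothing-and-steepest-descent heuristic below is one common choice, not the book's method, and it assumes tracking holds parallel 'lr' and 'loss' lists as in the sketch above:

import numpy as np

lrs = np.array(tracking['lr'])
losses = np.array(tracking['loss'])
# Smooth the loss curve, then pick the point where it drops fastest
smoothed = np.convolve(losses, np.ones(5) / 5, mode='valid')
steepest = np.gradient(smoothed).argmin()
chosen_lr = lrs[steepest + 2]  # +2 recenters the length-5 window
print(f'Suggested learning rate: {chosen_lr:.4f}')

# Rebuild the optimizer so training starts from the chosen value
new_optimizer = optim.Adam(new_model.parameters(), lr=chosen_lr)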

