
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)


Let’s recap what we’ve already seen:

A Note on Encoded Distances

• The encoded distance is defined by the Euclidean distance between two vectors or, in other words, it is the norm (size) of the difference between two encoding vectors.

• The encoded distance between positions zero and two (T=2) should be exactly the same as the encoded distance between positions one and three, two and four, and so on.

In other words, the encoded distance between any two positions T steps apart remains constant. Let's illustrate this by computing the encoded distances among the first five positions (by the way, we're using the encoding with eight dimensions now):

import numpy as np

# Pairwise Euclidean distances among the first five positions
distances = np.zeros((5, 5))
for i, v1 in enumerate(encoding[:5]):
    for j, v2 in enumerate(encoding[:5]):
        distances[i, j] = np.linalg.norm(v1 - v2)

The resulting matrix is full of pretty diagonals, each diagonal containing a constant value for the encoded distance corresponding to positions T steps apart.
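One way to see why the diagonals are constant (a short sketch, assuming the standard sinusoidal scheme in which each pair of dimensions holds sin and cos at some angular frequency w_i): the squared distance between the encodings of positions p and q is

$$\|PE(p) - PE(q)\|^2 = \sum_i \left[\left(\sin w_i p - \sin w_i q\right)^2 + \left(\cos w_i p - \cos w_i q\right)^2\right] = \sum_i \left(2 - 2\cos\left(w_i (p - q)\right)\right)$$

which depends only on the gap p - q = T, not on the positions themselves.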

For example, for positions next to each other (T=1), our encoded distance is always 0.96. That's an amazing property of this encoding scheme.
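A self-contained sketch reproduces both the matrix and the 0.96 value. It assumes the standard sinusoidal encoding (sine on even dimensions, cosine on odd ones, angular frequencies 1/10000^(2i/d)); the variable names here are illustrative, not the book's:

```python
import numpy as np

# Build a sinusoidal positional encoding: 5 positions, 8 dimensions
max_len, d_model = 5, 8
position = np.arange(max_len)[:, None]                        # shape (5, 1)
angular_speed = 1.0 / 10000 ** (np.arange(0, d_model, 2) / d_model)
encoding = np.zeros((max_len, d_model))
encoding[:, 0::2] = np.sin(position * angular_speed)          # even dims
encoding[:, 1::2] = np.cos(position * angular_speed)          # odd dims

# Pairwise Euclidean (encoded) distances among the five positions
distances = np.zeros((max_len, max_len))
for i, v1 in enumerate(encoding):
    for j, v2 in enumerate(encoding):
        distances[i, j] = np.linalg.norm(v1 - v2)

# Each diagonal holds a constant value: positions T steps apart
# are always the same encoded distance from each other
print(np.round(distances, 2))
```

Running it, the first off-diagonal (T=1) comes out at roughly 0.96 in every entry, matching the value quoted above.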

Positional Encoding (PE) | 771
