
Chapter 5

```python
def calc_loss(layer_activations):
    total_loss = 0
    for act in layer_activations:
        # In gradient ascent, we'll want to maximize this value
        # so our image increasingly "excites" the layer
        loss = tf.math.reduce_mean(act)
        # Normalize by the number of units in the layer
        loss /= np.prod(act.shape)
        total_loss += loss
    return total_loss
```
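As a sanity check on the normalization, here is what a single per-layer term works out to on a dummy activation tensor. The sketch uses NumPy arrays in place of TensorFlow tensors, but the arithmetic is identical:

```python
import numpy as np

# A dummy activation map: batch of 1, 4x4 spatial, 8 channels, all ones
act = np.ones((1, 4, 4, 8), dtype=np.float32)

# Mean activation of the layer (the tf.math.reduce_mean step)
loss = act.mean()

# Normalize by the number of units in the layer
loss /= np.prod(act.shape)

print(loss)  # 1.0 / 128 = 0.0078125
```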

Now let's run the gradient ascent:

```python
img = tf.Variable(img)
steps = 400

for step in range(steps):
    with tf.GradientTape() as tape:
        activations = forward(img)
        loss = calc_loss(activations)
    gradients = tape.gradient(loss, img)

    # Normalize the gradients
    gradients /= gradients.numpy().std() + 1e-8

    # Update our image by directly adding the gradients
    img.assign_add(gradients)

    if step % 50 == 0:
        clear_output()
        print("Step %d, loss %f" % (step, loss))
        show(deprocess(img.numpy()))
        plt.show()
```
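Dividing by the standard deviation keeps the update magnitude roughly constant no matter how large or small the raw gradients are: after normalization the step has unit scale, and the `1e-8` term only guards against division by zero. A quick NumPy illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Raw gradients at an arbitrary, possibly tiny, scale
gradients = rng.normal(scale=1e-4, size=(1, 224, 224, 3))

# The same normalization used in the loop above
normalized = gradients / (gradients.std() + 1e-8)

print(normalized.std())  # ~1.0, regardless of the original scale
```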

```python
# Let's see the result
clear_output()
show(deprocess(img.numpy()))
```
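`deprocess` was defined earlier in the chapter. If you are following along without it, here is a minimal sketch, assuming the image was preprocessed into the [-1, 1] range that Inception-style networks expect; the exact inverse depends on the preprocessing actually used:

```python
import numpy as np

def deprocess(img):
    """Map a float image in [-1, 1] back to displayable uint8 pixels.

    Assumes [-1, 1] preprocessing; adjust if your pipeline differs.
    """
    img = 255 * (img + 1.0) / 2.0        # [-1, 1] -> [0, 255]
    return np.clip(img, 0, 255).astype(np.uint8)
```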
