
# Pass this RBM's hidden-layer output on as the input to the next RBM in the stack
inpX = rbm.rbm_output(inpX)

Our DBN is ready. The three stacked RBMs have now been trained using unsupervised learning. DBNs can also be trained in a supervised fashion: to do so, we need to fine-tune the weights of the trained RBMs and add a fully connected layer at the end.
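As a minimal sketch of this idea (not the chapter's code), the trained RBM stack could be wrapped in a Keras model and topped with a fully connected layer for supervised fine-tuning. The helper name build_finetune_model, its parameters, and the assumption that each trained RBM exposes its weight matrix as rbm.w and its hidden bias as rbm.hb are illustrative assumptions:

import numpy as np
import tensorflow as tf

def build_finetune_model(rbm_models, input_dim, num_classes):
    # Turn the stack of trained RBMs into a feed-forward classifier
    model = tf.keras.Sequential()
    for i, rbm in enumerate(rbm_models):
        # Assumed attributes: rbm.w is the (visible x hidden) weight matrix,
        # rbm.hb is the hidden bias vector learned during unsupervised training
        units = np.array(rbm.w).shape[1]
        kwargs = {'input_shape': (input_dim,)} if i == 0 else {}
        model.add(tf.keras.layers.Dense(units, activation='sigmoid', **kwargs))
    # Fully connected layer added at the end for supervised classification
    model.add(tf.keras.layers.Dense(num_classes, activation='softmax'))
    # Initialize the hidden layers with the RBM weights, then fine-tune end to end
    for layer, rbm in zip(model.layers[:-1], rbm_models):
        layer.set_weights([np.array(rbm.w), np.array(rbm.hb)])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

Calling fit() on labeled data then adjusts the pretrained RBM weights together with the new output layer.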

Variational Autoencoders

Like DBNs and GANs, variational autoencoders are also generative models. Variational Autoencoders (VAEs) combine the best of neural networks and Bayesian inference. They are among the most interesting neural networks and have emerged as one of the most popular approaches to unsupervised learning. They are autoencoders with a twist: along with the conventional encoder and decoder networks of autoencoders (see Chapter 8, Autoencoders), they have additional stochastic layers. The stochastic layer after the encoder network samples the data using a Gaussian distribution, and the one after the decoder network samples the data using a Bernoulli distribution. Like GANs, VAEs can be used to generate images and figures based on the distribution they have been trained on. VAEs allow one to set complex priors on the latent variables and thus learn powerful latent representations. The following diagram describes a VAE:

The encoder network $q_\Phi(z|x)$ approximates the true but intractable posterior distribution $p(z|x)$, where $x$ is the input to the VAE and $z$ is the latent representation. The decoder network $p_\Theta(x|z)$ takes the $d$-dimensional latent variable $z$ (also called the latent space) as its input and generates new images following the same distribution as $P(x)$. As you can see from the preceding diagram, the latent representation is sampled from $z|x \sim \mathcal{N}(\mu_{z|x}, \Sigma_{z|x})$, and the output of the decoder network samples $x|z$ from $x|z \sim \mathcal{N}(\mu_{x|z}, \Sigma_{x|z})$.
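To make this structure concrete, the following is a minimal Keras sketch of the encoder, the stochastic sampling layer, and the decoder. The layer sizes, the 784-dimensional flattened input, and the variable names are illustrative assumptions rather than the chapter's implementation:

import tensorflow as tf

latent_dim = 2      # assumed dimensionality d of the latent variable z
input_dim = 784     # assumed flattened input size (for example, 28x28 images)

# Encoder: produces the parameters of the Gaussian q_Phi(z|x)
enc_in = tf.keras.layers.Input(shape=(input_dim,))
h = tf.keras.layers.Dense(256, activation='relu')(enc_in)
z_mean = tf.keras.layers.Dense(latent_dim)(h)
z_log_var = tf.keras.layers.Dense(latent_dim)(h)

# Stochastic layer: sample z|x ~ N(mu_{z|x}, Sigma_{z|x}) using the
# reparameterization trick so gradients can flow through the sampling step
def sample_z(args):
    mu, log_var = args
    eps = tf.random.normal(shape=tf.shape(mu))
    return mu + tf.exp(0.5 * log_var) * eps

z = tf.keras.layers.Lambda(sample_z)([z_mean, z_log_var])

# Decoder: maps the latent variable z back to x-space; the sigmoid outputs
# act as the Bernoulli parameters of p_Theta(x|z) for binary pixel data
dec_h = tf.keras.layers.Dense(256, activation='relu')(z)
dec_out = tf.keras.layers.Dense(input_dim, activation='sigmoid')(dec_h)

vae = tf.keras.Model(enc_in, dec_out)

Training such a model would add a loss that combines the reconstruction error with the KL divergence between $q_\Phi(z|x)$ and the prior over $z$.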
