
Unsupervised Learning

This chapter delves into unsupervised learning models. In the previous chapter we explored autoencoders, novel neural networks that learn via unsupervised learning. Here we will go deeper into some other unsupervised learning models.

In contrast to supervised learning, where the training dataset consists of both the input and the desired labels, unsupervised learning deals with the case where the model is provided only the input. The model learns the inherent input distribution by itself, without any desired label guiding it. Clustering and dimensionality reduction are the two most commonly used unsupervised learning techniques.

In this chapter we will learn about different machine learning and neural network (NN) techniques for both. We will cover the techniques required for clustering and dimensionality reduction, go into the details of Boltzmann machines, and finally cover the implementation of these techniques using TensorFlow. The concepts covered will be extended to build Restricted Boltzmann Machines (RBMs).

The chapter will include:

• Principal component analysis

• K-Means clustering

• Self-organizing maps

• Boltzmann machines

• RBMs

Principal component analysis

Principal component analysis (PCA) is the most popular multivariate statistical technique for dimensionality reduction. It analyzes the training data, consisting of several dependent variables that are, in general, inter-correlated, and extracts the important information from the training data in the form of a set of new orthogonal variables called principal components. PCA can be performed using one of two methods: eigen decomposition or singular value decomposition (SVD).
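
As a quick illustration before the detailed treatment later in the chapter, here is a minimal sketch of PCA via SVD in TensorFlow. The function name pca_svd and the random example data are illustrative assumptions, not taken from the chapter's own code:

```python
import tensorflow as tf

def pca_svd(data, n_components=2):
    """Project data onto its first n_components principal components."""
    data = tf.cast(data, tf.float32)
    # Center the data: PCA operates on mean-subtracted features.
    centered = data - tf.reduce_mean(data, axis=0)
    # SVD of the centered data matrix; the columns of v are the
    # principal directions (eigenvectors of the covariance matrix).
    s, u, v = tf.linalg.svd(centered)
    # Project the centered data onto the leading principal directions.
    return tf.matmul(centered, v[:, :n_components])

# Hypothetical usage on random data: 100 samples with 5 features each.
x = tf.random.normal([100, 5])
reduced = pca_svd(x, n_components=2)
print(reduced.shape)  # (100, 2)
```

Centering before the decomposition matters: the singular vectors of the uncentered matrix would mix the mean of the data into the principal directions, whereas PCA is defined on the covariance of the mean-subtracted features.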
