
Colour mapping using SOM

Some of the interesting properties of the feature map of the input space generated by a SOM are:

• The feature map provides a good representation of the input space. This property can be used to perform vector quantization, so that even though the input space is continuous, we can represent it in a discrete output space using the SOM.
• The feature map is topologically ordered, that is, the spatial location of a neuron in the output lattice corresponds to a particular feature of the input.
• The feature map also reflects the statistical distribution of the input space; the domain that has the largest number of input samples gets a wider area in the feature map.

These features of SOMs make them a natural choice for many interesting applications. Here we use a SOM to cluster a range of given R, G, and B pixel values onto a corresponding colour map. We start by importing the required modules:

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt

The main component of the code is the WTU class. Its __init__ function initializes the various hyperparameters of our SOM: the dimensions of the 2D lattice (m, n), the number of features in the input (dim), the neighbourhood radius (sigma), the initial weights, and the topographic information:

# Define the Winner Take All units
class WTU(object):
    #_learned = False

    def __init__(self, m, n, dim, num_iterations, eta=0.5, sigma=None):
        """
        m x n: dimensions of the 2D lattice in which the neurons are arranged
        dim: dimension of the input training data
        num_iterations: total number of training iterations
        eta: learning rate
        sigma: radius of the neighbourhood function
        """
        self._m = m
        self._n = n
        self._neighbourhood = []
        self._topography = []
        self._num_iterations = int(num_iterations)
        self._learned = False
        self.dim = dim
        self.eta = float(eta)

        if sigma is None:
            sigma = max(m, n) / 2.0    # Constant radius
        else:
            sigma = float(sigma)
        self.sigma = sigma

        print('Network created with dimensions', m, n)

        # Weight matrix and the topography of the neurons
        self._W = tf.random.normal([m * n, dim], seed=0)
        self._topography = np.array(list(self._neuron_location(m, n)))
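Note that __init__ calls a _neuron_location() helper method of WTU that is not reproduced in this excerpt. A minimal sketch of such a generator, assuming it simply yields the (row, column) grid coordinates of every neuron in the m x n lattice (so that self._topography[k] holds the location of unit k), could look like this:

def _neuron_location(self, m, n):
    # Yield the 2D lattice position of each neuron in row-major order
    for i in range(m):
        for j in range(n):
            yield np.array([i, j])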

The most important function of the class is its training() function, where we use the Kohonen algorithm, as discussed before, to find the winning unit and then update the weights based on the neighbourhood function:

def training(self, x, i):
    m = self._m
    n = self._n

    # Finding the winner and its location
    d = tf.sqrt(tf.reduce_sum(
        tf.pow(self._W - tf.stack([x for _ in range(m * n)]), 2), 1))
    self.WTU_idx = tf.argmin(d, 0)

    slice_start = tf.pad(tf.reshape(self.WTU_idx, [1]),
                         np.array([[0, 1]]))
    self.WTU_loc = tf.reshape(
        tf.slice(self._topography, slice_start, [1, 2]), [2])

    # Change learning rate and radius as a function of the iteration
    learning_rate = 1 - i / self._num_iterations
    _eta_new = self.eta * learning_rate
    _sigma_new = self.sigma * learning_rate
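The excerpt stops after computing the decayed learning rate and neighbourhood radius; the weight update itself is not shown on this page. As a rough, self-contained illustration of the standard Kohonen update (an assumption, not the book's exact continuation), each weight vector is pulled toward the input, scaled by a Gaussian of its lattice distance from the winner:

def kohonen_update(W, topography, winner_loc, x, eta, sigma):
    # Illustrative sketch of one Kohonen weight-update step.
    # W:          (m*n, dim) weight matrix
    # topography: (m*n, 2) lattice coordinates of each unit
    # winner_loc: (2,) lattice coordinates of the winning unit
    # x:          (dim,) input vector
    # eta, sigma: decayed learning rate and neighbourhood radius

    # Squared lattice distance of every unit from the winner
    d_sq = tf.cast(tf.reduce_sum(tf.square(topography - winner_loc), 1),
                   tf.float32)
    # Gaussian neighbourhood: units close to the winner move the most
    h = tf.exp(-d_sq / (2.0 * sigma ** 2))
    # Move every weight vector a little toward x
    return W + tf.expand_dims(eta * h, -1) * (tf.cast(x, tf.float32) - W)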

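As a quick usage sketch (illustrative values, not taken from the book), the colour-mapping experiment only needs the training data as normalized (R, G, B) vectors and a WTU instance sized to the desired output lattice:

# Illustrative setup: 500 random RGB samples in [0, 1] and a 30 x 30 lattice
colors = np.random.rand(500, 3).astype(np.float32)
som = WTU(30, 30, dim=3, num_iterations=400, sigma=10.0)

# One step of the algorithm: find the winning unit for a single sample
som.training(colors[0], 0)
print(som.WTU_loc)    # lattice coordinates of the winner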
