Self-organizing incremental neural network and its application
Self-organizing incremental neural network and its application<br />
F. Shen (1), O. Hasegawa (2)<br />
(1) National Key Laboratory for Novel Software Technology, Nanjing University<br />
(2) Imaging Science and Engineering Lab, Tokyo Institute of Technology<br />
June 12, 2009<br />
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Contents of this tutorial<br />
1 What is SOINN<br />
2 Why SOINN<br />
3 Detailed algorithm of SOINN<br />
4 SOINN for machine learning<br />
5 SOINN for associative memory<br />
6 References<br />
What is SOINN<br />
SOINN: Self-organizing incremental neural network<br />
Represents the topological structure of the input data<br />
Realizes online incremental learning<br />
Why SOINN<br />
Background<br />
Characteristics of SOINN<br />
Background: Networks for topology representation<br />
SOM (Self-Organizing Map): the structure and size of the network must be predefined<br />
NG (Neural Gas): the network size must be predefined<br />
GNG (Growing Neural Gas): the network size must be predefined; a constant learning rate leads to non-stationary results<br />
Background: Networks for incremental learning<br />
Incremental learning: learning new knowledge without destroying previously learned knowledge (the stability-plasticity dilemma)<br />
ART (Adaptive Resonance Theory): needs a user-defined threshold<br />
Multilayer perceptrons: learning new knowledge destroys old knowledge<br />
Sub-network methods: need a large amount of storage<br />
Characteristics of SOINN<br />
Neurons self-organize with no predefined network structure or size<br />
Adaptively finds a suitable number of neurons for the network<br />
Realizes online incremental learning without any a priori conditions<br />
Finds typical prototypes for large-scale data sets<br />
Robust to noise<br />
Detailed algorithm of SOINN<br />
Architecture of SOINN<br />
Training process of SOINN<br />
Similarity threshold for judging input data<br />
Learning rate<br />
Simple version of SOINN<br />
Simulation results<br />
Structure: Two-layer competitive network<br />
First layer: competes for the input data<br />
Second layer: competes for the output of the first layer<br />
Output: the topology structure and weight vectors of the second layer<br />
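The two-layer architecture above can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names are assumptions, not taken from the original SOINN implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    weight: list                      # weight vector W_i
    threshold: float = float("inf")   # similarity threshold T_i (starts at +infinity)
    wins: int = 0                     # how many times this node has been the winner

@dataclass
class Layer:
    nodes: list = field(default_factory=list)
    edges: set = field(default_factory=set)  # undirected edges between node indices

# A SOINN consists of two such competitive layers: the first learns from
# raw input, the second learns from the first layer's output, and the
# second layer's topology and weight vectors form the network's output.
first_layer, second_layer = Layer(), Layer()
```

Each layer runs the same competitive-learning procedure; only its input source differs.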
Training flowchart of SOINN<br />
Adaptively update the similarity threshold<br />
Between-class insertion<br />
Update the weights of nodes<br />
Within-class insertion<br />
Remove noise nodes<br />
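The flowchart steps above can be sketched as one training step of a single competitive layer. This is a deliberately simplified sketch, not the full two-layer algorithm: threshold maintenance, edge management, within-class insertion, and noise-node removal are omitted, and the dictionary keys are illustrative.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two weight vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train_step(nodes, x):
    """Present one input vector x to a simplified SOINN layer.

    Each node is a dict with keys 'w' (weight vector), 'T' (similarity
    threshold), and 'wins' (winner count).
    """
    if len(nodes) < 2:
        # Bootstrap: adopt the first two inputs directly as nodes.
        nodes.append({"w": list(x), "T": float("inf"), "wins": 0})
        return
    # Find the winner s1 and second winner s2 (two nearest nodes).
    order = sorted(range(len(nodes)), key=lambda i: euclidean(nodes[i]["w"], x))
    s1, s2 = order[0], order[1]
    # Between-class insertion: x lies outside both winners' thresholds,
    # so it is treated as a new class and becomes a new node.
    if (euclidean(nodes[s1]["w"], x) > nodes[s1]["T"]
            or euclidean(nodes[s2]["w"], x) > nodes[s2]["T"]):
        nodes.append({"w": list(x), "T": float("inf"), "wins": 0})
        return
    # Otherwise move the winner toward x with a decreasing learning rate.
    nodes[s1]["wins"] += 1
    rate = 1.0 / nodes[s1]["wins"]
    nodes[s1]["w"] = [w + rate * (xi - w) for w, xi in zip(nodes[s1]["w"], x)]
```

The decreasing rate 1/wins means frequently winning nodes move less and less, which is what keeps the learned prototypes stable over time.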
First layer: adaptively updating the threshold Ti<br />
Basic idea: within-class distance ≤ Ti ≤ between-class distance<br />
1 Initialize: Ti = +∞ when node i is a new node.<br />
2 When i is the winner or second winner, update Ti as follows:<br />
If i has neighbors, Ti is updated to the maximum distance between i and all of its neighbors:<br />
Ti = max_{c∈Ni} ||Wi − Wc|| (1)<br />
If i has no neighbors, Ti is updated to the minimum distance between i and all other nodes in network A:<br />
Ti = min_{c∈A\{i}} ||Wi − Wc|| (2)<br />
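Equations (1) and (2) translate directly into code. This is a minimal sketch assuming Euclidean distance and a network of at least two nodes; the function names are illustrative.

```python
import math

def euclidean(a, b):
    """Euclidean distance ||Wi - Wc|| between two weight vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update_threshold(i, weights, neighbors):
    """Similarity threshold Ti for node i.

    weights   -- weight vectors Wc of all nodes in network A
    neighbors -- indices of i's topological neighbors Ni (may be empty)
    Assumes the network contains at least two nodes.
    """
    if neighbors:
        # Eq. (1): maximum distance from i to any of its neighbors.
        return max(euclidean(weights[i], weights[c]) for c in neighbors)
    # Eq. (2): minimum distance from i to any other node in A.
    return min(euclidean(weights[i], weights[c])
               for c in range(len(weights)) if c != i)
```

Note the asymmetry: a connected node's threshold grows to cover its whole neighborhood (within-class distance), while an isolated node's threshold stays as tight as its nearest rival (between-class distance).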
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Contents<br />
What is SOINN<br />
Why SOINN<br />
Detail algorithm of SOINN<br />
SOINN for machine learning<br />
SOINN for associative memory<br />
References<br />
Architecture of SOINN<br />
Training process of SOINN<br />
Similarity threshold for judging input data<br />
Learning rate<br />
Simple version of SOINN<br />
Simulation results<br />
First layer: adaptively updating threshold Ti<br />
Basic idea: within-class distance ≤ T ≤ between-class distance<br />
1 Initialize: Ti = +∞ when node i is a new node.<br />
2 When i is winner or second winner, update Ti by<br />
If i has neighbors, Ti is updated as the maximum distance<br />
between i <strong>and</strong> all of <strong>its</strong> neighbors.<br />
Ti = max ||Wi − Wc|| (1)<br />
c∈Ni<br />
If i has no neighbors, Ti is updated as the minimum distance<br />
of i <strong>and</strong> all other nodes in <strong>network</strong> A.<br />
Ti = min<br />
c∈A\{i} ||Wi − Wc|| (2)<br />
Second layer: constant threshold Tc<br />
Basic idea 1: within-class distance ≤ Tc ≤ between-class distance<br />
Basic idea 2: we already have some knowledge of the input data<br />
from the results of the first layer.<br />
Within-class distance:<br />
dw = (1/NC) Σ_{(i,j) ∈ C} ||Wi − Wj|| (3)<br />
Between-class distance of two classes Ci and Cj:<br />
db(Ci, Cj) = min_{i ∈ Ci, j ∈ Cj} ||Wi − Wj|| (4)<br />
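The two distances in Eqs. (3) and (4) can be sketched as follows. This is a minimal sketch with illustrative names; in particular, NC is taken here to be the number of node pairs summed over, which is an assumption of this sketch:

```python
import numpy as np
from itertools import combinations

def within_class_distance(weights):
    """Eq. (3): mean pairwise distance inside one cluster.
    Assumes N_C is the number of node pairs (i, j) in the sum."""
    pairs = list(combinations(range(len(weights)), 2))
    return sum(np.linalg.norm(weights[i] - weights[j]) for i, j in pairs) / len(pairs)

def between_class_distance(class_a, class_b):
    """Eq. (4): minimum distance between nodes of two clusters."""
    return min(np.linalg.norm(wi - wj) for wi in class_a for wj in class_b)
```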
Second layer: constant threshold Tc (continued)<br />
1 Set Tc as the minimum between-class distance:<br />
Tc = db(Ci1, Cj1) = min_{k,l = 1,...,Q, k ≠ l} db(Ck, Cl) (5)<br />
2 If Tc is less than the within-class distance dw, set Tc as the next<br />
minimum between-class distance:<br />
Tc = db(Ci2, Cj2) = min_{k,l = 1,...,Q, k ≠ l, k ≠ i1, l ≠ j1} db(Ck, Cl) (6)<br />
3 Repeat step 2 until Tc is greater than dw.<br />
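The selection loop above amounts to taking the smallest between-class distance that still exceeds dw. A compact sketch, assuming the pairwise distances db(Ck, Cl) have already been computed into a dictionary (the names `db_matrix` and `choose_tc` are illustrative):

```python
def choose_tc(db_matrix, d_w):
    """Pick the constant second-layer threshold Tc: walk the between-class
    distances from smallest to largest and return the first one greater
    than the within-class distance d_w.

    `db_matrix` maps a class pair (k, l) -> d_b(C_k, C_l)."""
    for tc in sorted(db_matrix.values()):
        if tc > d_w:
            return tc
    return None  # no between-class distance exceeds d_w
```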
Updating learning rate ǫ1(t) <strong>and</strong> ǫ2(t)<br />
Update of weight vector<br />
∆Ws1 = ǫ1(t)(ξ − Ws1) (8)<br />
∆Wi = ǫ2(t)(ξ − Wi) (∀i ∈ Ns1) (9)<br />
After the size of the network becomes stable, fine-tune the network by<br />
stochastic approximation: a number of adaptation steps with a<br />
strength ǫ(t) decaying slowly, but not too slowly, i.e.,<br />
Σ_{t=1}^{∞} ǫ(t) = ∞ and Σ_{t=1}^{∞} ǫ²(t) < ∞.<br />
The harmonic series satisfies these conditions:<br />
ǫ1(t) = 1/t, ǫ2(t) = 1/(100t) (10)<br />
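Eqs. (8)–(10) amount to the following one-step update of the winner s1 and its topological neighbors; the function signature and variable names are illustrative:

```python
import numpy as np

def adapt(xi, w_winner, neighbor_ws, t):
    """One adaptation step toward input xi with the decaying rates of
    Eq. (10): eps1(t) = 1/t for the winner, eps2(t) = 1/(100 t) for
    its neighbors."""
    eps1, eps2 = 1.0 / t, 1.0 / (100.0 * t)
    w_winner = w_winner + eps1 * (xi - w_winner)              # Eq. (8)
    neighbor_ws = [w + eps2 * (xi - w) for w in neighbor_ws]  # Eq. (9)
    return w_winner, neighbor_ws
```

Because ǫ1(t) and ǫ2(t) are harmonic, early inputs move nodes strongly while later inputs only fine-tune them, which is what makes the node positions converge.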
Single-layer SOINN<br />
For topology representation, the first layer alone is sufficient.<br />
Within-class insertion rarely occurs in the first layer.<br />
Subclass and density information is used to judge whether a connection is needed.<br />
Artificial data set: topology representation<br />
Stationary and non-stationary environments:<br />
Stationary: all training data obey the same distribution.<br />
Non-stationary: the next training sample may obey a different<br />
distribution from the previous one.<br />
[Figure: original data, stationary result, non-stationary result]<br />
Artificial data set: topology representation (continued)<br />
[Figure: original data, two-layer SOINN result, single-layer SOINN result]<br />
Conclusion of the experiments: SOINN is able to<br />
represent the topology structure of the input data,<br />
realize incremental learning, and<br />
automatically learn the number of nodes, remove noise, etc.<br />
1 What is SOINN<br />
2 Why SOINN<br />
3 Detail algorithm of SOINN<br />
4 SOINN for machine learning<br />
Unsupervised learning<br />
Supervised learning<br />
Semi-supervised learning<br />
Active learning<br />
5 SOINN for associative memory<br />
6 References<br />
Some objectives of unsupervised learning<br />
Automatically learn the number of classes in the input data.<br />
Clustering with no prior knowledge.<br />
Topology representation.<br />
Realize real-time incremental learning.<br />
Separate classes whose overlapping areas have low density.<br />
SOINN for unsupervised learning: if two nodes are connected by a path, they belong to the same class.<br />
1 Run SOINN on the input data; output a topology representation of the nodes.<br />
2 Initialize all nodes as unclassified.<br />
3 Randomly choose one unclassified node i from node set A. Mark node i as classified and label it as class Ci.<br />
4 Search A to find all unclassified nodes that are connected to node i by a path. Mark these nodes as classified and label them with the same class as node i.<br />
5 Go to Step 3 and continue the classification process until all nodes are classified.<br />
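Steps 2–5 amount to labeling the connected components of the learned node graph. A minimal sketch, assuming the SOINN output is given as a node list and an undirected edge set (the function and variable names here are illustrative, not from the original implementation):<br />

```python
from collections import deque

def label_classes(nodes, edges):
    """Assign a class id to each node: nodes connected by a path
    share one class (Steps 2-5 of the labeling procedure)."""
    # Build an adjacency list from the undirected SOINN edge set.
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)

    label = {}           # node -> class id; absence means unclassified
    next_class = 0
    for start in nodes:
        if start in label:
            continue     # already classified via an earlier component
        # Breadth-first search marks every node reachable from `start`.
        label[start] = next_class
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in label:
                    label[v] = next_class
                    queue.append(v)
        next_class += 1
    return label

# Two connected components -> two classes.
labels = label_classes([1, 2, 3, 4, 5], [(1, 2), (2, 3), (4, 5)])
```

The number of distinct labels produced is the number of classes SOINN reports.<br />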
Artificial data set: 5 classes with 10% noise<br />
Figure: original data and clustering result.<br />
Conclusion of experiments<br />
Automatically reports the number of classes.<br />
Perfectly clusters data of different shapes and distributions.<br />
Finds typical prototypes; supports incremental learning; removes noise; etc.<br />
Face recognition: AT&T face data set<br />
Experiment results<br />
Automatically reports that there are 10 classes.<br />
Prototypes of every class are reported.<br />
With these prototypes, the recognition rate (1-NN rule) is 90%.<br />
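The 1-NN rule used here can be sketched in a few lines: a query is assigned the label of its nearest prototype. The prototypes and labels below are hypothetical 2-D stand-ins for the face feature vectors in the actual experiment.<br />

```python
import numpy as np

def predict_1nn(x, prototypes, proto_labels):
    """Classify x by the label of its nearest prototype (1-NN rule)."""
    dists = np.linalg.norm(prototypes - x, axis=1)  # Euclidean distances
    return proto_labels[int(np.argmin(dists))]

# Hypothetical prototypes: two for class "A", one for class "B".
prototypes = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0]])
proto_labels = ["A", "A", "B"]

pred = predict_1nn(np.array([0.1, 0.1]), prototypes, proto_labels)
```

In the face-recognition setting, `prototypes` would be the nodes SOINN learned for each subject rather than the raw training images.<br />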
Prototype-based classifiers: based on the 1-NN or k-NN rule<br />
Nearest Neighbor Classifier (NNC): all training data as prototypes.<br />
Nearest Mean Classifier (NMC): the mean of each class as its prototype.<br />
k-means classifier (KMC), Learning Vector Quantization (LVQ), and others: predefine the number of prototypes for every class.<br />
Main difficulties<br />
1 How to find enough prototypes without overfitting.<br />
2 How to realize incremental learning:<br />
Incremental learning of new data inside one class (non-stationary or concept drift);<br />
Incremental learning of new classes.<br />
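To make the design space concrete, here is a minimal sketch of the Nearest Mean Classifier mentioned above: exactly one prototype per class, the class mean. The class name and data are illustrative, not from the tutorial; NNC and KMC differ only in how many prototypes they keep per class.<br />

```python
import numpy as np

class NearestMeanClassifier:
    """NMC: one prototype per class, namely the class mean."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        # Class means act as the prototype set.
        self.means_ = np.array(
            [X[np.array(y) == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Distance from every sample to every class mean; pick the nearest.
        d = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return [self.labels_[i] for i in np.argmin(d, axis=1)]

X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y = ["a", "a", "b", "b"]
pred = NearestMeanClassifier().fit(X, y).predict(
    np.array([[0.2, 0.5], [5.5, 5.1]]))
```

A single mean per class underfits multi-modal classes, while keeping all training data (NNC) risks overfitting and grows without bound — which is exactly the gap SOINN's learned prototype set is meant to fill.<br />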
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Contents<br />
What is SOINN<br />
Why SOINN<br />
Detail algorithm of SOINN<br />
SOINN for machine learning<br />
SOINN for associative memory<br />
References<br />
Unsupervised learning<br />
Supervised learning<br />
Semi-supervised learning<br />
Active learning<br />
Prototype-based classifier: based on 1-NN or k-NN rule<br />
Nearest Neighbor Classifier (NNC): all training data as<br />
prototypes<br />
Nearest Mean Classifier (NMC): mean of each class as<br />
prototypes<br />
k-means classifier (KMC), Learning Vector Quantization<br />
(LVQ), <strong>and</strong> others: predefine number of prototypes for every<br />
class.<br />
Main difficulty<br />
1 How to find enough prototypes without overfitting<br />
2 How to realize Incremental learning<br />
Incremental of new data inside one class (non-stationary or<br />
concept drift);<br />
Incremental of new classes.<br />
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Contents<br />
What is SOINN<br />
Why SOINN<br />
Detail algorithm of SOINN<br />
SOINN for machine learning<br />
SOINN for associative memory<br />
References<br />
Unsupervised learning<br />
Supervised learning<br />
Semi-supervised learning<br />
Active learning<br />
Prototype-based classifier: based on 1-NN or k-NN rule<br />
Nearest Neighbor Classifier (NNC): all training data as<br />
prototypes<br />
Nearest Mean Classifier (NMC): mean of each class as<br />
prototypes<br />
k-means classifier (KMC), Learning Vector Quantization<br />
(LVQ), <strong>and</strong> others: predefine number of prototypes for every<br />
class.<br />
Main difficulty<br />
1 How to find enough prototypes without overfitting<br />
2 How to realize Incremental learning<br />
Incremental of new data inside one class (non-stationary or<br />
concept drift);<br />
Incremental of new classes.<br />
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Contents<br />
What is SOINN<br />
Why SOINN<br />
Detail algorithm of SOINN<br />
SOINN for machine learning<br />
SOINN for associative memory<br />
References<br />
Unsupervised learning<br />
Supervised learning<br />
Semi-supervised learning<br />
Active learning<br />
Prototype-based classifier: based on 1-NN or k-NN rule<br />
Nearest Neighbor Classifier (NNC): all training data as<br />
prototypes<br />
Nearest Mean Classifier (NMC): mean of each class as<br />
prototypes<br />
k-means classifier (KMC), Learning Vector Quantization<br />
(LVQ), <strong>and</strong> others: predefine number of prototypes for every<br />
class.<br />
Main difficulty<br />
1 How to find enough prototypes without overfitting<br />
2 How to realize Incremental learning<br />
Incremental of new data inside one class (non-stationary or<br />
concept drift);<br />
Incremental of new classes.<br />
SOINN for supervised learning: targets<br />
Automatically learn the number of prototypes needed to<br />
represent each class.<br />
Retain only the prototypes that determine the decision<br />
boundary.<br />
Realize both types of incremental learning.<br />
Be robust to noise.<br />
Adjusted SOINN Classifier (ASC)<br />
SOINN learns the value of k for k-means.<br />
Noise reduction removes noisy prototypes.<br />
Center cleaning removes prototypes that are not useful for the<br />
decision boundary.<br />
ASC: noise reduction and center cleaning<br />
Noise reduction<br />
If the label of a node differs from the majority-vote label of its<br />
k nearest neighbors, the node is considered an outlier and removed.<br />
Center cleaning<br />
If a prototype of class i has never been the nearest prototype to<br />
samples of other classes, it lies far from the decision boundary<br />
and is removed.<br />
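Both rules can be sketched directly from these definitions. This is a simplified illustration, not the ASC implementation (the helper names and the default k are my own): nodes are (vector, label) pairs, noise reduction votes over the k nearest neighbors, and center cleaning keeps only prototypes that are the nearest other-class prototype to at least one node:<br />

```python
import math
from collections import Counter

def noise_reduction(nodes, k=3):
    """Remove nodes whose label disagrees with the majority label
    of their k nearest neighbors."""
    keep = []
    for i, (v, label) in enumerate(nodes):
        dists = sorted((math.dist(v, w), j)
                       for j, (w, _) in enumerate(nodes) if j != i)
        votes = Counter(nodes[j][1] for _, j in dists[:k])
        if votes.most_common(1)[0][0] == label:
            keep.append((v, label))
    return keep

def center_cleaning(nodes):
    """Keep only prototypes that are the nearest prototype to at least
    one node of another class, i.e. those shaping the decision boundary."""
    used = set()
    for v, label in nodes:
        others = [(math.dist(v, w), j)
                  for j, (w, l) in enumerate(nodes) if l != label]
        if others:
            used.add(min(others)[1])
    return [nodes[j] for j in sorted(used)]

# Toy example: two clusters plus one mislabeled point between them.
nodes = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((0.5, 0.5), "B"),                  # outlier inside class A
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]
clean = noise_reduction(nodes)    # drops the mislabeled point
pruned = center_cleaning(clean)   # keeps only boundary prototypes
```

In ASC itself these steps run on the prototype set produced by SOINN, not on raw data as here.<br />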
Experiment results: artificial data (I)<br />
[Figure: original data, SOINN results, and ASC results]<br />
Test results of ASC<br />
No. of prototypes = 6; recognition ratio = 100%.<br />
Experiment results: artificial data (II)<br />
[Figure: original data, SOINN results, and ASC results]<br />
Test results of ASC<br />
No. of prototypes = 86; recognition ratio = 98%.<br />
Experiment results: artificial data (III)<br />
[Figure: original data, SOINN results, and ASC results]<br />
Test results of ASC<br />
No. of prototypes = 87; recognition ratio = 97.8%.<br />
Experiment results: optdigits<br />
ASC with different parameter sets (ad, λ); values are the mean ±<br />
standard deviation over 10 training runs.<br />
Parameter set (ad, λ): (50, 50) | (25, 25) | (10, 10)<br />
Recognition ratio (%): 97.7 ± 0.2 | 97.4 ± 0.2 | 97.0 ± 0.2<br />
No. of prototypes: 377 ± 12 | 258 ± 7 | 112 ± 7<br />
Compression ratio (%): 9.9 ± 0.3 | 6.8 ± 0.2 | 2.9 ± 0.2<br />
Comparison with SVM and 1-NN<br />
LibSVM: 1197 support vectors; recognition ratio = 96.6%.<br />
1-NN: best classifier (98%), but keeps all 3823 samples as prototypes.<br />
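The compression-ratio row is simply the prototype count divided by the 3823 optdigits training samples; recomputing it reproduces the table values to within rounding of the per-run averages:<br />

```python
n_train = 3823  # optdigits training samples

# Mean prototype counts for the three (ad, lambda) settings in the table.
for n_prototypes in (377, 258, 112):
    ratio = 100.0 * n_prototypes / n_train
    print(f"{n_prototypes} prototypes -> {ratio:.1f}% compression")
```

By the same measure, LibSVM's 1197 support vectors correspond to roughly 31% of the training set, and 1-NN to 100%.<br />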
Experiment results: UCI repository data sets<br />
Comparison of ASC and other classifiers: recognition ratio (%)<br />
Data set | ASC (ad, λ) | NSC (σ²max) | KMC (M) | NNC (k) | LVQ (M)<br />
Iris | 97.4 ± 0.86 | 96.3 ± 0.4 | 96.2 ± 0.8 | 96.7 ± 0.6 | 96.1 ± 0.6<br />
Breast cancer | 97.4 ± 0.38 | 97.2 ± 0.2 | 95.9 ± 0.3 | 97.0 ± 0.2 | 96.3 ± 0.4<br />
Ionosphere | 90.4 ± 0.64 | 91.9 ± 0.8 | 87.4 ± 0.6 | 86.1 ± 0.7 | 86.4 ± 0.8<br />
Glass | 73.5 ± 1.6 | 70.2 ± 1.5 | 68.8 ± 1.1 | 72.3 ± 1.2 | 68.3 ± 2.0<br />
Liver disorders | 62.6 ± 0.83 | 62.9 ± 2.3 | 59.3 ± 2.3 | 67.3 ± 1.6 | 66.3 ± 1.9<br />
Pima Indians | 72.0 ± 0.63 | 68.6 ± 1.6 | 68.7 ± 0.9 | 74.7 ± 0.7 | 73.5 ± 0.9<br />
Wine | 82.6 ± 1.55 | 75.3 ± 1.7 | 71.9 ± 1.9 | 73.9 ± 1.9 | 72.3 ± 1.5<br />
Average | 82.3 ± 0.93 | 80.4 ± 1.2 | 78.3 ± 1.1 | 81.1 ± 0.99 | 79.9 ± 1.2<br />
On average, ASC has the best recognition performance.<br />
Experiment results: UCI repository data sets (continued)
Comparison of ASC and other classifiers: compression ratio

Data set | ASC (a_d*, λ*) | NSC (σ²_max*) | KMC (M*) | NNC (k*) | LVQ (M*)
Iris | 5.2 (6, 6) | 7.3 (0.25) | 8.0 (4) | 100 (14) | 15 (22)
Breast cancer | 1.4 (8, 8) | 1.8 (35.0) | 0.29 (1) | 100 (5) | 5.9 (40)
Ionosphere | 3.4 (15, 15) | 31 (1.25) | 4.0 (7) | 100 (2) | 6.8 (24)
Glass | 13.7 (15, 15) | 97 (0.005) | 17 (6) | 100 (1) | 45 (97)
Liver disorders | 4.6 (6, 6) | 4.9 (600) | 11 (19) | 100 (14) | 8.4 (29)
Pima Indians | 0.6 (6, 6) | 1.7 (2600) | 1.0 (4) | 100 (17) | 3.4 (26)
Wine | 3.2 (6, 6) | 96 (4.0) | 29 (17) | 100 (1) | 32 (57)
Average | 4.6 | 34.2 | 10.0 | 100 | 16.6

On average, ASC has the best compression ratio.
Requirements for semi-supervised learning
Labeled instances are difficult, expensive, or time-consuming to obtain.
How can a system use a large amount of unlabeled data, together with limited labeled data, to build good classifiers?
New data are continually added to an already huge database.
How can a system learn new knowledge without forgetting previously learned knowledge?
SOINN used for semi-supervised learning
1 SOINN: represent the topology; learn incrementally.
2 Labeled data: label the winner nodes.
3 Division of a cluster.

Condition of division:
R_{c-1} \le R_c \quad \text{and} \quad R_c > R_{c+1} \qquad (11)
R_c = \sum_{a \in N_c} \mathrm{dis}(w_a, w_c) \qquad (12)
where c − 1 is the former node and c + 1 are the unlabeled neighbors.
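The division test of Eqs. (11)–(12) can be sketched as follows. The graph representation (weight vectors plus a neighbor map) and the Euclidean dis() are assumptions for illustration, not the authors' implementation:

```python
import math

def dis(wa, wc):
    """Euclidean distance between two weight vectors."""
    return math.sqrt(sum((a - c) ** 2 for a, c in zip(wa, wc)))

def R(c, weights, neighbors):
    """Eq. (12): summed distance from node c to its topological neighbors."""
    return sum(dis(weights[a], weights[c]) for a in neighbors[c])

def should_divide(c_prev, c, c_next, weights, neighbors):
    """Eq. (11): divide when R peaks at node c relative to the former
    node c-1 and the unlabeled neighbor c+1."""
    r = R(c, weights, neighbors)
    return R(c_prev, weights, neighbors) <= r and r > R(c_next, weights, neighbors)
```

For example, on a 1-D chain of nodes at 0, 1, 5, 6 connected in sequence, R peaks at the two middle nodes that bridge the groups, which is where the division condition fires.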
Experiment: original data
5%, 15%, or 40% overlap
500 training samples, 5,000 validation samples, and 5,000 test samples.
Labeled samples: 10% and 20%.
(Figure: light blue marks unlabeled data; other colors mark labeled data; the dashed line is the ideal decision boundary.)
Experiment results
Separates the classes with only a few labeled samples.
On the UCI data sets, it works better than other typical methods.
SOINN used for active learning
Target: actively ask for the labels of a few samples so that all classes are labeled.
Idea:
1 Use SOINN to learn the topology structure of the input data.
2 Actively label the vertex nodes of every class.
3 Use the vertex nodes to label all nodes.
4 Actively label the nodes lying in the overlapping area.
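Step 3 above, spreading the actively obtained labels from the vertex nodes to the rest of the network, can be sketched as breadth-first propagation over the SOINN topology graph. The data structures and function name here are illustrative assumptions, not the authors' code:

```python
from collections import deque

def spread_labels(neighbors, seed_labels):
    """Propagate the seed labels over topology edges, breadth-first.

    neighbors   -- dict: node -> list of connected nodes
    seed_labels -- dict: actively labeled node -> class label
    Returns a labeling of every node reachable from some seed.
    """
    labels = dict(seed_labels)
    queue = deque(seed_labels)
    while queue:
        node = queue.popleft()
        for nb in neighbors[node]:
            if nb not in labels:      # first label to arrive wins
                labels[nb] = labels[node]
                queue.append(nb)
    return labels

# Two disconnected chains: labeling one vertex node per chain labels both
# chains completely.
graph = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
print(spread_labels(graph, {0: "A", 3: "B"}))
```

Nodes where labels from different seeds meet are exactly the overlap-area candidates that step 4 would send back to the oracle.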
Experiment: artificial data set under a stationary environment
Original data: four classes in all, with 10% noise.
Results: under a stationary environment, 10 teacher vectors are requested.
Experiment: artificial data set under a non-stationary environment
16 teacher vectors are requested.
Contents of this tutorial
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
   Background
   SOINN-AM
   Experiments
   General Associative Memory
6 References
Background: typical associative memory systems
Distributed-learning associative memory:
  Hopfield network: the most famous network, used for auto-associative memory.
  Bidirectional Associative Memory (BAM): used for hetero-associative memory.
Competitive-learning associative memory:
  KFMAM: Kohonen feature map associative memory.
Difficulties:
  Forgetting previously learned knowledge when new knowledge is learned incrementally.
  Storage limitation.
  Memorizing real-valued data.
  Many-to-many association.
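To make the listed difficulties concrete, here is a textbook-style Hopfield auto-associative memory; this is a generic sketch, not the SOINN-AM code. It accepts only binary (±1) patterns, and recall is known to degrade once roughly 0.14·N patterns are stored in an N-unit network:

```python
def train(patterns):
    """Hebbian weight matrix (no self-connections) for +/-1 pattern lists."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, probe, steps=10):
    """Synchronous sign updates until the state stops changing."""
    s = list(probe)
    for _ in range(steps):
        new = [1 if sum(wij * sj for wij, sj in zip(row, s)) >= 0 else -1
               for row in w]
        if new == s:
            break
        s = new
    return s

# Store one 8-unit pattern, corrupt one bit, and recover the original.
p = [1, -1, 1, -1, 1, -1, 1, -1]
w = train([p])
probe = list(p)
probe[0] = -probe[0]
print(recall(w, probe))
```

Retraining such a network on new patterns overwrites the weight matrix that encodes the old ones, which is exactly the incremental-learning difficulty SOINN-AM targets.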
F. Shen, O. Hasegawa <strong>Self</strong>-<strong>organizing</strong> <strong>incremental</strong> <strong>neural</strong> <strong>network</strong> <strong>and</strong> <strong>its</strong> <strong>application</strong>
Objectives of SOINN-AM

- Incremental learning of memory pairs.
- Robustness to noisy data.
- Dealing with real-valued data.
- Many-to-many association.
Architecture of SOINN-AM

[Figure: architecture of SOINN-AM]
Algorithms of SOINN-AM

Basic idea of the memory phase:
1 Combine the key vector and the associate vector into one input vector.
2 Use SOINN to learn such input data.

Basic idea of the recall phase:
1 Using the key parts of the nodes, find the winner node for the key vector; let the distance be d.
2 If d ≤ ε, output the associate part of the winner as the recall result.
3 If d > ε, report "unknown" for the key vector.
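The recall phase above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the node layout (each node as a concatenated [key | associate] vector), the function name `recall`, and the ε parameter name are assumptions for clarity.

```python
import numpy as np

def recall(nodes, key, eps):
    """Recall sketch for SOINN-AM (illustrative, not the authors' code).

    Each node is a concatenated [key | associate] vector learned by
    SOINN in the memory phase; len(key) splits the two parts.
    """
    key_dim = len(key)
    # Winner: the node whose key part is nearest to the input key vector.
    dists = [np.linalg.norm(n[:key_dim] - key) for n in nodes]
    winner = int(np.argmin(dists))
    d = dists[winner]
    if d <= eps:
        # Close enough: output the associate part of the winner node.
        return nodes[winner][key_dim:]
    return None  # d > eps: report the key vector as unknown
```

Because the nodes are SOINN prototypes rather than raw training pairs, the same rule tolerates noisy keys: a perturbed key still falls within ε of the winner's key part.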
Experiments

[Figure: original data, shown as binary data and real-valued data]
Comparison with typical AM systems
Robustness to noise
Many-to-many association testing

SOINN-AM recalls all patterns perfectly.
Architecture and basic idea of GAM

- Input layer: receives the key vector and the associate vector.
- Memory layer: memorizes patterns together with their classes.
- Associate layer: builds associations between classes.
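The three-layer idea can be shown as a minimal data-structure sketch. Everything here is an assumption made for illustration: the class name `GAM`, the container choices, and the methods `memorize`, `associate`, and `recall_classes` stand in for the published algorithm, which stores each layer's patterns with SOINN rather than plain lists.

```python
class GAM:
    """Structural sketch of GAM's three layers (names are assumptions).

    - Input layer: receives key / associate vectors.
    - Memory layer: patterns stored per class (plain lists stand in
      for the SOINN-based storage of the real system).
    - Associate layer: directed links between classes.
    """

    def __init__(self):
        self.memory = {}    # class label -> list of stored patterns
        self.links = set()  # (key class, associate class) pairs

    def memorize(self, label, pattern):
        self.memory.setdefault(label, []).append(pattern)

    def associate(self, key_label, assoc_label):
        self.links.add((key_label, assoc_label))

    def recall_classes(self, key_label):
        # Many-to-many: one key class may link to several associate classes.
        return [a for k, a in self.links if k == key_label]
```

Keeping associations at the class level, rather than between individual patterns, is what allows many-to-many recall: any pattern of a key class retrieves every class linked to it.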
References about SOINN

SOINN for unsupervised learning:
- Furao Shen and Osamu Hasegawa, "An Incremental Network for On-line Unsupervised Classification and Topology Learning", Neural Networks, Vol. 19, No. 1, pp. 90-106, 2006.
- Furao Shen, Tomotaka Ogura and Osamu Hasegawa, "An Enhanced Self-organizing Incremental Neural Network for Online Unsupervised Learning", Neural Networks, Vol. 20, No. 8, pp. 893-903, 2007.

SOINN for supervised learning:
- Furao Shen and Osamu Hasegawa, "A Fast Nearest Neighbor Classifier Based on Self-organizing Incremental Neural Network", Neural Networks, Vol. 21, No. 10, pp. 1537-1547, 2008.
Contents<br />
What is SOINN<br />
Why SOINN<br />
Detail algorithm of SOINN<br />
SOINN for machine learning<br />
SOINN for associative memory<br />
References<br />
References about SOINN
SOINN for semi-supervised and active learning:
Youki Kamiya, Toshiaki Ishii, Furao Shen and Osamu Hasegawa, "An Online Semi-Supervised Clustering Algorithm Based on a Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
Furao Shen, Keisuke Sakurai, Youki Kamiya and Osamu Hasegawa, "An Online Semi-supervised Active Learning Algorithm with Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
SOINN for associative memory:
Akihito Sudo, Akihiro Sato and Osamu Hasegawa, "Associative Memory for Online Learning in Noisy Environments Using Self-organizing Incremental Neural Network", IEEE Transactions on Neural Networks, 2009, in press
Download papers and program of SOINN:
http://www.isl.titech.ac.jp/~hasegawalab/soinn.html