
Contents

What is SOINN
Why SOINN
Detailed algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Self-organizing incremental neural network and its application

F. Shen [1], O. Hasegawa [2]

[1] National Key Laboratory for Novel Software Technology, Nanjing University
[2] Imaging Science and Engineering Lab, Tokyo Institute of Technology

June 12, 2009


What is SOINN

SOINN: Self-Organizing Incremental Neural Network. It
- represents the topological structure of the input data, and
- realizes online incremental learning.

Why SOINN


Background: Networks for topology representation

- SOM (Self-Organizing Map): the structure and size of the network must be predefined.
- NG (Neural Gas): the network size must be predefined.
- GNG (Growing Neural Gas): the network size must be predefined; a constant learning rate leads to non-stationary results.


Background: Networks for incremental learning

Incremental learning: learning new knowledge without destroying previously learned knowledge (the stability-plasticity dilemma).

- ART (Adaptive Resonance Theory): needs a user-defined threshold.
- Multilayer perceptrons: learning new knowledge destroys old knowledge.
- Sub-network methods: need plenty of storage.


Characteristics of SOINN

- Neurons are self-organized, with no predefined network structure or size.
- Adaptively finds a suitable number of neurons for the network.
- Realizes online incremental learning without any a priori conditions.
- Finds typical prototypes for large-scale data sets.
- Robust to noise.

Detailed algorithm of SOINN


Architecture of SOINN
Training process of SOINN
Similarity threshold for judging input data
Learning rate
Simple version of SOINN
Simulation results


Structure: Two-layer competitive network


- Two-layer competitive network.
- First layer: competes for the input data.
- Second layer: competes for the output of the first layer.
- The topology structure and the weight vectors of the second layer are the output.


Training flowchart of SOINN

- Adaptively update the similarity threshold.
- Between-class insertion.
- Update the weights of nodes.
- Within-class insertion.
- Remove noise nodes.
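The flowchart above can be sketched as a heavily simplified first-layer step. The function and variable names, and the reduction to only winner search, between-class insertion, edge creation, and the winner update, are illustrative assumptions, not the full SOINN algorithm (within-class insertion and noise-node removal are omitted):

```python
import numpy as np

def train_step(xi, W, T, edges, t):
    """One simplified pass of the flowchart for input xi.
    W: (n, d) node weights, T: (n,) similarity thresholds,
    edges: set of frozenset node pairs, t: time step."""
    d = np.linalg.norm(W - xi, axis=1)
    s1, s2 = np.argsort(d)[:2]                # winner and second winner
    if d[s1] > T[s1] or d[s2] > T[s2]:        # between-class insertion:
        W = np.vstack([W, xi])                # xi becomes a new node
        T = np.append(T, np.inf)              # new node starts with T = +inf
        return W, T, edges
    edges.add(frozenset((int(s1), int(s2))))  # connect the two winners
    W[s1] += (xi - W[s1]) / t                 # update weight of the winner
    return W, T, edges
```

An input far beyond both winners' thresholds spawns a new node; an input inside them moves the winner and connects the two winners.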


First layer: adaptively updating the threshold T_i

Basic idea: within-class distance ≤ T_i ≤ between-class distance.

1. Initialize: T_i = +∞ when node i is a new node.
2. When i is the winner or the second winner, update T_i as follows.
   If i has neighbors, T_i is updated as the maximum distance between i and all of its neighbors:

       T_i = max_{c ∈ N_i} ||W_i − W_c||                (1)

   If i has no neighbors, T_i is updated as the minimum distance between i and all other nodes in the network A:

       T_i = min_{c ∈ A \ {i}} ||W_i − W_c||            (2)
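As a minimal sketch, the two update rules can be written as one function; the array layout and the `neighbors` mapping are assumptions for illustration:

```python
import numpy as np

def similarity_threshold(i, W, neighbors):
    """Similarity threshold T_i of node i, per Eqs. (1)-(2).
    W: (n, d) weight vectors; neighbors: dict node -> set of neighbors."""
    dists = np.linalg.norm(W - W[i], axis=1)   # distances from node i to all nodes
    if neighbors.get(i):
        # Eq. (1): maximum distance from i to its topological neighbors
        return max(dists[c] for c in neighbors[i])
    dists[i] = np.inf                          # exclude i itself
    # Eq. (2): minimum distance from i to any other node in the network
    return dists.min()
```

For example, with nodes at (0,0), (1,0), (3,0), node 1 with neighbors {0, 2} gets threshold 2.0 via Eq. (1), while an isolated node 2 gets 2.0, its distance to the nearest other node, via Eq. (2).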


Second layer: constant threshold T_c

Basic idea 1: within-class distance ≤ T_c ≤ between-class distance.
Basic idea 2: we already have some knowledge of the input data from the results of the first layer.

Within-class distance of a cluster C:

    d_w = (1 / N_C) Σ_{(i,j) ∈ C} ||W_i − W_j||              (3)

Between-class distance of two classes C_i and C_j:

    d_b(C_i, C_j) = min_{i ∈ C_i, j ∈ C_j} ||W_i − W_j||     (4)
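A sketch of Eqs. (3) and (4), assuming clusters are given as lists of node indices; the slides do not pin down N_C, which is taken here as the number of pairs in the cluster:

```python
import numpy as np
from itertools import combinations

def within_class_distance(C, W):
    # Eq. (3): mean pairwise distance over the pairs (i, j) in cluster C
    pairs = list(combinations(C, 2))
    return sum(np.linalg.norm(W[i] - W[j]) for i, j in pairs) / len(pairs)

def between_class_distance(Ci, Cj, W):
    # Eq. (4): smallest distance between any node of Ci and any node of Cj
    return min(np.linalg.norm(W[i] - W[j]) for i in Ci for j in Cj)
```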


Second layer: constant threshold T_c (continued)

1. Set T_c as the minimum between-class distance:

       T_c = d_b(C_{i1}, C_{j1}) = min_{k,l = 1,...,Q, k ≠ l} d_b(C_k, C_l)                    (5)

2. If T_c is less than the within-class distance d_w, set T_c as the next minimum between-class distance:

       T_c = d_b(C_{i2}, C_{j2}) = min_{k,l = 1,...,Q, k ≠ l, k ≠ i1, l ≠ j1} d_b(C_k, C_l)    (6)

3. Go to step 2 to update T_c until T_c is greater than d_w.
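Steps 1-3 amount to scanning the sorted between-class distances for the smallest one that exceeds d_w. A sketch under that reading (function name and the fallback behavior are assumptions):

```python
import numpy as np

def constant_threshold(clusters, W, d_w):
    """T_c per Eqs. (5)-(6): smallest between-class distance above d_w.
    clusters: list of index lists from the first layer; W: weight vectors."""
    db = sorted(
        min(np.linalg.norm(W[i] - W[j]) for i in Ck for j in Cl)
        for a, Ck in enumerate(clusters)
        for Cl in clusters[a + 1:]
    )
    for d in db:          # steps 2-3: advance to the next minimum
        if d > d_w:       # stop once T_c exceeds the within-class distance
            return d
    return db[-1]         # fallback when no candidate exceeds d_w
```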



Updating the learning rates η1(t) and η2(t)

Update of the weight vectors (ξ is the input vector, s1 the winner, N_{s1} its neighbors):

    ΔW_{s1} = η1(t) (ξ − W_{s1})                   (8)
    ΔW_i    = η2(t) (ξ − W_i),    i ∈ N_{s1}       (9)

After the size of the network becomes stable, fine-tune the network by stochastic approximation: a number of adaptation steps with a strength η(t) decaying slowly, but not too slowly, i.e.

    Σ_{t=1}^{∞} η(t) = ∞   and   Σ_{t=1}^{∞} η(t)^2 < ∞.

The harmonic series satisfies these conditions:

    η1(t) = 1/t,    η2(t) = 1/(100 t)              (10)

Self-organizing incremental neural network and its application
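A minimal sketch of one adaptation step under Eqs. (8)–(10), assuming ξ is the current input vector and N_s1 holds the winner's topological neighbors (the function and variable names here are illustrative):

```python
import numpy as np

def adapt(w_s1, neighbors, xi, t):
    """One adaptation step: the winner s1 moves toward the input xi
    with rate eps1(t) = 1/t (Eq. 8); its topological neighbors move
    with the much smaller rate eps2(t) = 1/(100 t) (Eq. 9). Both
    rates satisfy the stochastic-approximation conditions."""
    eps1 = 1.0 / t
    eps2 = 1.0 / (100.0 * t)
    w_s1 = w_s1 + eps1 * (xi - w_s1)
    neighbors = [w + eps2 * (xi - w) for w in neighbors]
    return w_s1, neighbors

w, nb = adapt(np.zeros(2), [np.ones(2)], np.array([2.0, 2.0]), t=1)
print(w, nb[0])  # [2. 2.] [1.01 1.01]
```

At t = 1 the winner jumps all the way to the input (ε1 = 1), while the neighbor barely moves (ε2 = 0.01); as t grows, both rates decay harmonically and the network settles.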


Single-layer SOINN

- For topology representation, the first layer is enough.
- Within-class insertion seldom happens in the first layer.
- Subclass and density information is used to judge whether a connection is needed.


Artificial data set: topology representation

Stationary vs. non-stationary environments:
- Stationary: all training data obey the same distribution.
- Non-stationary: the next training sample may obey a different distribution from the previous one.

[Figures: original data; stationary result; non-stationary result]


Artificial data set: topology representation (continue)

[Figures: original data; two-layer SOINN result; single-layer SOINN result]

Conclusion of the experiments: SOINN is able to
- represent the topology structure of the input data;
- realize incremental learning;
- automatically learn the number of nodes, de-noise, etc.


What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
- Unsupervised learning
- Supervised learning
- Semi-supervised learning
- Active learning
SOINN for associative memory
References


Some objectives of unsupervised learning

- Automatically learn the number of classes in the input data
- Clustering with no a priori knowledge
- Topology representation
- Realize real-time incremental learning
- Separate classes with low-density overlapped areas


SOINN for unsupervised learning: if two nodes are connected by a path, they belong to one class.

1. Run SOINN on the input data; output the topology representation of the nodes.
2. Initialize all nodes as unclassified.
3. Randomly choose one unclassified node i from node set A. Mark node i as classified and label it as class Ci.
4. Search A to find all unclassified nodes that are connected to node i by a path. Mark these nodes as classified and label them with the same class as node i.
5. Go to step 3 to continue the classification process until all nodes are classified.
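The five classification steps above amount to connected-component labeling on the SOINN node graph; a compact sketch (node IDs and the edge list are illustrative):

```python
from collections import deque

def label_classes(nodes, edges):
    """Label SOINN nodes so that nodes joined by a path share one class."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    label, next_class = {}, 0
    for n in nodes:                # step 3: pick an unclassified node
        if n in label:
            continue
        label[n] = next_class      # give it a fresh class label
        q = deque([n])
        while q:                   # step 4: flood along connections
            for m in adj[q.popleft()]:
                if m not in label:
                    label[m] = label[n]
                    q.append(m)
        next_class += 1            # step 5: repeat until all labeled
    return label

print(label_classes([1, 2, 3, 4], [(1, 2), (3, 4)]))
```

Here nodes 1–2 and 3–4 form two paths, so the procedure reports two classes, which is how SOINN turns a learned topology into a clustering with no a priori class count.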


Artificial data set: 5 classes with 10% noise

[Figures: original data; clustering result]

Conclusion of the experiments:
- Automatically reports the number of classes.
- Perfectly clusters data with different shapes and distributions.
- Finds typical prototypes; incremental learning; de-noising; etc.


Face recognition: AT&T face data set

Experiment results:
- Automatically reports that there are 10 classes.
- Prototypes of every class are reported.
- With such prototypes, the recognition ratio (1-NN rule) is 90%.


Prototype-based classifier: based on the 1-NN or k-NN rule

- Nearest Neighbor Classifier (NNC): all training data as prototypes.
- Nearest Mean Classifier (NMC): the mean of each class as prototypes.
- k-means classifier (KMC), Learning Vector Quantization (LVQ), and others: predefine the number of prototypes for every class.

Main difficulties:
1. How to find enough prototypes without overfitting.
2. How to realize incremental learning:
   - incremental learning of new data inside one class (non-stationary, or concept drift);
   - incremental learning of new classes.
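The 1-NN rule that all of these prototype-based classifiers share can be sketched as follows (the prototype vectors and labels are illustrative; with SOINN, the prototypes would be the learned node weights):

```python
import numpy as np

def predict_1nn(x, prototypes, labels):
    """1-NN rule: assign x the label of its nearest prototype."""
    dists = [float(np.linalg.norm(x - p)) for p in prototypes]
    return labels[int(np.argmin(dists))]

protos = [np.array([0.0, 0.0]), np.array([10.0, 10.0])]
print(predict_1nn(np.array([1.0, 1.0]), protos, ["A", "B"]))  # A
```

NNC, NMC, KMC, and LVQ differ only in how `prototypes` is built (all training data, class means, or a predefined number per class); the prediction step is the same nearest-prototype lookup.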

1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

Prototype-based classifier: based on 1-NN or k-NN rule


Nearest Neighbor Classifier (NNC): all training data as
prototypes
Nearest Mean Classifier (NMC): mean of each class as
prototypes
k-means classifier (KMC), Learning Vector Quantization
(LVQ), and others: predefine number of prototypes for every
class.
Main difficulty
1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

Prototype-based classifier: based on 1-NN or k-NN rule


Nearest Neighbor Classifier (NNC): all training data as
prototypes
Nearest Mean Classifier (NMC): mean of each class as
prototypes
k-means classifier (KMC), Learning Vector Quantization
(LVQ), and others: predefine number of prototypes for every
class.
Main difficulty
1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

Prototype-based classifier: based on 1-NN or k-NN rule


Nearest Neighbor Classifier (NNC): all training data as
prototypes
Nearest Mean Classifier (NMC): mean of each class as
prototypes
k-means classifier (KMC), Learning Vector Quantization
(LVQ), and others: predefine number of prototypes for every
class.
Main difficulty
1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

Prototype-based classifier: based on 1-NN or k-NN rule


Nearest Neighbor Classifier (NNC): all training data as
prototypes
Nearest Mean Classifier (NMC): mean of each class as
prototypes
k-means classifier (KMC), Learning Vector Quantization
(LVQ), and others: predefine number of prototypes for every
class.
Main difficulty
1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

Prototype-based classifier: based on 1-NN or k-NN rule


Nearest Neighbor Classifier (NNC): all training data as
prototypes
Nearest Mean Classifier (NMC): mean of each class as
prototypes
k-means classifier (KMC), Learning Vector Quantization
(LVQ), and others: predefine number of prototypes for every
class.
Main difficulty
1
2

How to find enough prototypes without overfitting


How to realize Incremental learning
Incremental of new data inside one class (non-stationary or
concept drift);
Incremental of new classes.
F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Unsupervised learning
Supervised learning
Semi-supervised learning
Active learning

SOINN for supervised learning: Targets

Automatically learn the number of prototypes needed to
represent every class
Retain only the prototypes used to determine the decision
boundary
Realize both types of incremental learning
Be robust to noise


Adjusted SOINN Classifier (ASC)

SOINN learns k for k-means.
Noise reduction removes noisy prototypes.
Center cleaning removes prototypes that are not useful for the
decision.


ASC: noise reduction & center cleaning

Noise reduction
If the label of a node differs from the majority-vote label of its
k nearest neighbors, the node is considered an outlier.
Center cleaning
If a prototype of class i has never been the nearest prototype for
samples of other classes, remove that prototype.
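A hedged NumPy sketch of the two pruning steps (a placeholder implementation, not the authors' code; `center_cleaning` follows one reading of the rule above, keeping a class-i prototype only if it is the nearest class-i prototype to some sample of another class):

```python
import numpy as np

def noise_reduction(protos, labels, k=3):
    """Drop prototypes whose label disagrees with the majority label
    of their k nearest neighboring prototypes (self excluded)."""
    keep = np.zeros(len(protos), dtype=bool)
    for i in range(len(protos)):
        d = np.linalg.norm(protos - protos[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]            # skip self (distance 0)
        votes = np.bincount(labels[nbrs], minlength=labels.max() + 1)
        keep[i] = votes.argmax() == labels[i]
    return protos[keep], labels[keep]

def center_cleaning(protos, labels, X, y):
    """Keep a class-i prototype only if it is the nearest class-i
    prototype to at least one sample of a different class, i.e. it
    actually shapes the decision boundary."""
    keep = np.zeros(len(protos), dtype=bool)
    for x, c in zip(X, y):
        for cls in np.unique(labels):
            if cls == c:
                continue
            idx = np.where(labels == cls)[0]
            d = np.linalg.norm(protos[idx] - x, axis=1)
            keep[idx[d.argmin()]] = True
    return protos[keep], labels[keep]
```

Run in this order, noise reduction first removes mislabeled outliers so that center cleaning then prunes only interior prototypes of otherwise clean clusters.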


Experiment results: artificial data (I)

(Figures: original data; SOINN results; ASC results)

Test results of ASC: no. of prototypes = 6; recognition ratio = 100%.

Experiment results: artificial data (II)

(Figures: original data; SOINN results; ASC results)

Test results of ASC: no. of prototypes = 86; recognition ratio = 98%.

Experiment results: artificial data (III)

(Figures: original data; SOINN results; ASC results)

Test results of ASC: no. of prototypes = 87; recognition ratio = 97.8%.

Experiment results: optdigits

ASC with different parameter sets (ad, λ), reported as mean ± standard
deviation over 10 training runs:

Parameter set {ad, λ}   Recognition ratio (%)   No. of prototypes   Compression ratio (%)
(50, 50)                97.7 ± 0.2              377 ± 12            9.9 ± 0.3
(25, 25)                97.4 ± 0.2              258 ± 7             6.8 ± 0.2
(10, 10)                97.0 ± 0.2              112 ± 7             2.9 ± 0.2

Comparison with SVM and 1-NN:

LibSVM: 1197 support vectors; recognition ratio = 96.6%.
1-NN: best classifier (98%), but uses all 3823 training samples as prototypes.
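The compression ratio here is simply the prototype count over the size of the training set (3823 optdigits samples, as the 1-NN row states), so the table's last column approximately follows from its mean prototype counts:

```python
# Compression ratio (%) = no. of prototypes / no. of training samples * 100.
# With 3823 optdigits training samples this approximately reproduces the
# table's compression-ratio column from its mean prototype counts.
n_train = 3823
for n_protos in (377, 258, 112):
    print(f"{100 * n_protos / n_train:.1f}")
```

Small differences from the tabulated values come from rounding of the averaged prototype counts.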

Experiment results: UCI repository data sets

Comparison of ASC with other classifiers: recognition ratio (%), mean ± standard deviation

Data set          ASC (ad, λ)   NSC (σ²max)   KMC (M)      NNC (k)       LVQ (M)
Iris              97.4 ± 0.86   96.3 ± 0.4    96.2 ± 0.8   96.7 ± 0.6    96.1 ± 0.6
Breast cancer     97.4 ± 0.38   97.2 ± 0.2    95.9 ± 0.3   97.0 ± 0.2    96.3 ± 0.4
Ionosphere        90.4 ± 0.64   91.9 ± 0.8    87.4 ± 0.6   86.1 ± 0.7    86.4 ± 0.8
Glass             73.5 ± 1.6    70.2 ± 1.5    68.8 ± 1.1   72.3 ± 1.2    68.3 ± 2.0
Liver disorders   62.6 ± 0.83   62.9 ± 2.3    59.3 ± 2.3   67.3 ± 1.6    66.3 ± 1.9
Pima Indians      72.0 ± 0.63   68.6 ± 1.6    68.7 ± 0.9   74.7 ± 0.7    73.5 ± 0.9
Wine              82.6 ± 1.55   75.3 ± 1.7    71.9 ± 1.9   73.9 ± 1.9    72.3 ± 1.5
Average           82.3 ± 0.93   80.4 ± 1.2    78.3 ± 1.1   81.1 ± 0.99   79.9 ± 1.2

On average, ASC has the best recognition performance.



Experiment results: UCI repository data sets (continued)

Comparison of ASC with other classifiers: compression ratio (%), with the
chosen parameter values in parentheses

Data set          ASC (ad, λ)     NSC (σ²max)   KMC (M)    NNC (k)    LVQ (M)
Iris              5.2 (6, 6)      7.3 (0.25)    8.0 (4)    100 (14)   15 (22)
Breast cancer     1.4 (8, 8)      1.8 (35.0)    0.29 (1)   100 (5)    5.9 (40)
Ionosphere        3.4 (15, 15)    31 (1.25)     4.0 (7)    100 (2)    6.8 (24)
Glass             13.7 (15, 15)   97 (0.005)    17 (6)     100 (1)    45 (97)
Liver disorders   4.6 (6, 6)      4.9 (600)     11 (19)    100 (14)   8.4 (29)
Pima Indians      0.6 (6, 6)      1.7 (2600)    1.0 (4)    100 (17)   3.4 (26)
Wine              3.2 (6, 6)      96 (4.0)      29 (17)    100 (1)    32 (57)
Average           4.6             34.2          10.0       100        16.6

On average, ASC has the best compression ratio.



Requirements of semi-supervised learning

Labeled instances are difficult, expensive, or time-consuming
to obtain.
How can a system use a large amount of unlabeled data together
with limited labeled data to build good classifiers?
New data are continually added to an already huge database.
How can a system learn new knowledge without forgetting
previously learned knowledge?


SOINN used for Semi-supervised learning

1 SOINN: represent the topology, with incremental learning;
2 Labeled data: label the nodes (winners);
3 Division of a cluster.

Condition of division:

    R_{c-1} <= R_c  and  R_c > R_{c+1}            (11)
    R_c = sum over a in N_c of dis(w_a, w_c)      (12)

c-1: the former node; c+1: the unlabeled neighbors.
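A hedged sketch of the division test of Eqs. (11)-(12) on a SOINN graph. The node weights `weights` and adjacency `neighbors` are assumed inputs from an already-trained SOINN, and the comparator lost from Eq. (11) in extraction is taken to be "<=", so that division happens where c is a local maximum of R (a sparse region) along the examined path:

```python
import numpy as np

def R(c, weights, neighbors):
    """Eq. (12): R_c = sum of dis(w_a, w_c) over neighbors a in N_c."""
    return sum(float(np.linalg.norm(weights[a] - weights[c]))
               for a in neighbors[c])

def should_divide(prev_c, c, next_c, weights, neighbors):
    """Eq. (11): divide the cluster at c when R_{c-1} <= R_c > R_{c+1}."""
    r_prev = R(prev_c, weights, neighbors)
    r_c = R(c, weights, neighbors)
    r_next = R(next_c, weights, neighbors)
    return r_prev <= r_c and r_c > r_next

# Chain 0-1-2 where node 1 sits in a sparse region (large R_1)
weights = np.array([[0.0], [2.0], [2.5]])
neighbors = {0: [1], 1: [0, 2], 2: [1]}
print(should_divide(0, 1, 2, weights, neighbors))  # True: R = 2.0, 2.5, 0.5
```

Since R_c sums distances to topological neighbors, a large R_c marks a low-density node, which is exactly where a cluster containing two differently labeled regions should be split.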

Experiment: original data

5%, 15%, or 40% overlap between classes
training samples: 500; validation samples: 5,000; test samples: 5,000
labeled samples: 10% and 20%
light blue: unlabeled data; other colors: labeled data
dashed line: ideal decision boundary

Experiment results

Separates the classes with few labeled samples.
On the UCI data sets, works better than other typical methods.


SOINN used for active learning

Target: actively ask for the labels of some samples so that all classes get labeled.
Idea:
1 Use SOINN to learn the topology structure of the input data.
2 Actively label the vertex nodes of every class.
3 Use the vertex nodes to label all nodes.
4 Actively label the nodes lying in the overlapped area.
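Steps 1-3 above can be sketched as a small querying loop. This is a hedged sketch under stated assumptions: the vertex-node criterion used here (the node farthest from its cluster centroid) and the query_label callback standing in for a human teacher are illustrative choices, not the authors' exact procedure.

```python
# Active labeling sketch: query one "vertex node" per SOINN cluster,
# then propagate its label to every node of that cluster.
from math import dist

def cluster_centroid(weights, cluster):
    pts = [weights[n] for n in cluster]
    dims = len(pts[0])
    return tuple(sum(p[d] for p in pts) / len(pts) for d in range(dims))

def active_label(clusters, weights, query_label):
    labels = {}
    for cluster in clusters:                      # step 1: clusters from the SOINN topology
        centroid = cluster_centroid(weights, cluster)
        # assumed criterion: the vertex node is the farthest node from the centroid
        vertex = max(cluster, key=lambda n: dist(weights[n], centroid))
        lab = query_label(vertex)                 # step 2: actively query the vertex node
        for n in cluster:                         # step 3: use it to label all cluster nodes
            labels[n] = lab
    return labels
```

Step 4 (querying nodes in overlapped areas) would add further queries only where neighboring clusters disagree.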


Experiment: artificial data set under a stationary environment

Original data: four classes in all, with 10% noise.
Results: under a stationary environment; 10 teacher vectors.


Experiment: artificial data set under a non-stationary environment

16 teacher vectors are requested.


Background
SOINN-AM
Experiments
General Associative Memory

Background: typical associative memory systems

Distributed-learning associative memory:
Hopfield network: the most famous network, for auto-associative memory
Bidirectional Associative Memory (BAM): for hetero-associative memory

Competitive-learning associative memory:
KFMAM: Kohonen feature map associative memory

Difficulties:
Forgetting previously learned knowledge when learning new knowledge incrementally.
Storage limitation.
Memorizing real-valued data.
Many-to-many association.
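For concreteness, the classical Hopfield storage and recall procedure mentioned above can be sketched as follows. This is a textbook sketch (Hebbian weights, synchronous sign updates on bipolar patterns), not part of SOINN-AM; its well-known capacity limit of roughly 0.14N patterns is one instance of the storage limitation listed among the difficulties.

```python
# Minimal Hopfield network: Hebbian training and iterative recall
# of bipolar (+1/-1) patterns.
def train_hopfield(patterns):
    # W[i][j] = sum over patterns of x_i * x_j, with zero diagonal
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    # synchronous updates: s_i <- sign(sum_j W[i][j] * s_j)
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s
```

Starting recall from a corrupted pattern converges to the nearest stored pattern, which is exactly the auto-associative behavior the slide refers to.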

Objectives of SOINN-AM

Incremental learning of memory pairs.
Robustness to noisy data.
Dealing with real-valued data.
Many-to-many association.


Architecture of SOINN-AM


Algorithms of SOINN-AM

Basic idea of the memory phase:
1 Combine the key vector and the associative vector into one input vector.
2 Use SOINN to learn such input data.

Basic idea of the recall phase:
1 Using the key part of the nodes, find the winner node for the key vector; let the distance be d.
2 If d ≤ θ (θ is a recall threshold), output the associative part of the winner as the recall result.
3 If d > θ, report "unknown" for the key vector.
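The memory and recall phases above can be sketched directly. This is an illustrative sketch, not the authors' code: nodes are stored as concatenated (key, associative) vectors, and theta names the recall threshold whose exact symbol was lost in this copy of the slides.

```python
# Sketch of SOINN-AM memory and recall over concatenated vectors.
from math import dist

def memorize(nodes, key, assoc):
    # memory phase: store the key and associative parts as one vector
    # (in the full system, SOINN learns these vectors incrementally)
    nodes.append(tuple(key) + tuple(assoc))

def recall(nodes, key, key_dim, theta):
    # recall phase: winner = node whose key part is closest to the query key
    winner = min(nodes, key=lambda v: dist(v[:key_dim], key))
    d = dist(winner[:key_dim], key)
    # if d <= theta, answer with the associative part; otherwise "unknown"
    return winner[key_dim:] if d <= theta else "unknown"
```

The threshold keeps the system from hallucinating an answer for key vectors far from everything it has memorized.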



Original data: binary data and real-valued data.


Comparison with typical AM systems


Robustness to noise


Many-to-many association testing

SOINN-AM recalls all patterns perfectly.


Architecture and basic idea of GAM

Input layer: key vector and associative vector.
Memory layer: memory patterns with classes.
Associative layer: builds associations between classes.
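The three-layer structure above can be summarized in a small structural sketch. The class name, the dict-based memory store, and the set of directed class pairs are illustrative assumptions, not the authors' implementation:

```python
# Structural sketch of GAM: a memory layer of class-grouped patterns
# and an associative layer of directed links between classes.
class GAM:
    def __init__(self):
        self.memory = {}    # memory layer: class id -> stored patterns
        self.assoc = set()  # associative layer: (key class, associated class) pairs

    def memorize(self, key_class, pattern):
        self.memory.setdefault(key_class, []).append(pattern)

    def associate(self, key_class, assoc_class):
        self.assoc.add((key_class, assoc_class))

    def recall(self, key_class):
        # return the patterns of every class associated with key_class
        return [p for (k, a) in self.assoc if k == key_class
                for p in self.memory.get(a, [])]
```

Because associations are stored between classes rather than between individual patterns, one key class can recall several associated classes, which is what enables many-to-many association.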

F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Background
SOINN-AM
Experiments
General Associative Memory

Architecture and basic idea of GAM

Input layer: key vector


and associate vector.
Memory layer: Memory
patterns with classes.
Associate layer: Build
association between
classes.

F. Shen, O. Hasegawa

Self-organizing incremental neural network and its application

Contents
What is SOINN
Why SOINN
Detail algorithm of SOINN
SOINN for machine learning
SOINN for associative memory
References

Background
SOINN-AM
Experiments
General Associative Memory




References about SOINN


SOINN for unsupervised learning:
Furao Shen and Osamu Hasegawa, An Incremental Network for On-line
Unsupervised Classification and Topology Learning, Neural Networks,
Vol.19, No.1, pp.90-106, (2005)
Furao Shen, Tomotaka Ogura and Osamu Hasegawa, An enhanced
self-organizing incremental neural network for online unsupervised
learning, Neural Networks, Vol.20, No.8, pp.893-903, (2007)
SOINN for Supervised learning:
Furao Shen and Osamu Hasegawa, A Fast Nearest Neighbor Classifier
Based on Self-organizing Incremental Neural Network, Neural Networks,
Vol.21, No.10, pp.1537-1547, (2008)


SOINN for Semi-supervised and active learning:
Youki Kamiya, Toshiaki Ishii, Furao Shen and Osamu Hasegawa: An
Online Semi-Supervised Clustering Algorithm Based on a Self-organizing
Incremental Neural Network, IJCNN 2007, Orlando, FL, USA, August
2007
Furao Shen, Keisuke Sakurai, Youki Kamiya and Osamu Hasegawa: An
Online Semi-supervised Active Learning Algorithm with Self-organizing
Incremental Neural Network, IJCNN 2007, Orlando, FL, USA, August
2007
SOINN for Associative Memory:
Akihito Sudo, Akihiro Sato and Osamu Hasegawa, Associative Memory for
Online Learning in Noisy Environments Using Self-organizing Incremental
Neural Network, IEEE Transactions on Neural Networks, (2009), in press


Download papers and program of SOINN


http://www.isl.titech.ac.jp/hasegawalab/soinn.html
