Neural Networks: A Comprehensive Foundation
Lecturer: Yiping Duan
QQ: 499151150
Email: yipingduan@126.com
Textbook and references
Textbook:
Simon Haykin, Neural Networks, a comprehensive foundation,
Prentice-Hall, Inc., 1999, Second Edition
References:
Robert J. Schalkoff, Pattern Recognition: Statistical, Structural and Neural Approaches, Wiley, 1992
C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995
T. Kohonen, Self-Organizing Maps, Springer, 2001
Bart Kosko, Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence, Prentice-Hall, 1992
Outline
Chapter I to Chapter VII
Chapter 1: Introduction
Applications
Comparison between brain and computer

Brain                                   Computer
-----                                   --------
Biological                              Integrated circuit
Composed of a huge number of neurons    Composed of a CPU
Parallel processing                     Sequential processing
Fault tolerant                          Not fault tolerant
Fast but not precise computation        Fast and precise computation
Learnable or trainable                  Not learnable or trainable
Adaptable to environment                Not adaptable to environment
Introduction
What is neural computing/neural networks?
Most impressive of all, the brain learns (without any explicit instructions) to create the internal representations that make such skills possible.
Introduction
(Artificial) Neural networks are computational models which
mimic the brain's learning processes.
They have the essential features of neurons and their interconnections as
found in the brain.
Typically, a computer is programmed to simulate these features.
Another definition [S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd edition, 1999, p. 2]:
"A neural network is a massively parallel distributed processor made up of simple processing units, which has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:
1. Knowledge is acquired by the network from its environment through a learning process.
2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge."
Artificial Neural Networks
Artificial neural network models (or simply neural networks) are typically composed of
interconnected units or artificial neurons. How the neurons are connected depends on some
specific task that the neural network performs.
Two key features of neural networks distinguish them from any other sort of computing
developed to date:
Unsupervised learning
Self-organizing ANNs
  Kohonen topographic maps: SOM (self-organizing map)
Recurrent neural networks
  Hopfield neural networks
Supervised Learning
Typically: backpropagation of errors, with error = y_i - t_i.
Training set: {(x_i, t_i); i = 1, 2, ..., N}, where x_i is the input, y_i the network output, and t_i the desired output provided by the supervisor.
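A minimal sketch of supervised learning in this sense, assuming a single perceptron with the simple error-correction rule rather than full backpropagation (the dataset and function names are illustrative):

```python
# Supervised learning sketch: a single perceptron trained on the AND function.
# Each update nudges the weights in proportion to the error t_i - y_i
# supplied by the supervisor (the known desired output).
def train_perceptron(samples, epochs=20, eta=0.1):
    """samples: list of (inputs, target) pairs with target in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n            # synaptic weights
    b = 0.0                  # bias
    for _ in range(epochs):
        for x, t in samples:
            v = sum(wj * xj for wj, xj in zip(w, x)) + b   # induced local field
            y = 1 if v >= 0 else 0                          # threshold activation
            err = t - y                                     # supervisor's correction
            w = [wj + eta * err * xj for wj, xj in zip(w, x)]
            b += eta * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else 0

# Training set {(x_i, t_i)} for logical AND (linearly separable, so the
# perceptron rule converges)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After training, `predict` reproduces the target for every example in the training set.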
Unsupervised Learning
Typically: minimizing a system energy, energy = f(y_i).
Training set: {x_i; i = 1, 2, ..., N} (inputs only; no desired outputs).
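A minimal sketch of unsupervised learning, assuming winner-take-all competitive learning (the function name and data are illustrative): prototypes adapt to the input distribution with no targets, each update reducing the squared distance, an "energy", between the winning prototype and the input:

```python
import random

def competitive_learning(data, k=2, epochs=30, eta=0.1, seed=0):
    """Winner-take-all competitive learning: inputs only, no targets."""
    rng = random.Random(seed)
    prototypes = [list(p) for p in rng.sample(data, k)]  # init from the data
    for _ in range(epochs):
        for x in data:
            # winner = prototype with the lowest squared distance to x
            d = [sum((pi - xi) ** 2 for pi, xi in zip(p, x)) for p in prototypes]
            j = d.index(min(d))
            # move only the winner toward the input
            prototypes[j] = [pi + eta * (xi - pi)
                             for pi, xi in zip(prototypes[j], x)]
    return prototypes

# Two well-separated clusters; the prototypes settle near the cluster centres.
points = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5),
          (5.0, 5.0), (5.5, 5.0), (5.0, 5.5), (5.5, 5.5)]
protos = sorted(competitive_learning(points), key=lambda p: p[0])
```

This is the mechanism underlying Kohonen's SOM, minus the neighbourhood function that gives the map its topographic ordering.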
Which for What?
Three techniques are combined in practice: ANN (artificial neural networks), EC (evolutionary computation), and FL (fuzzy logic, valued for its representing capability). Every combination is possible and used; the goal is to realize processing systems with greater intelligence.
ANN is a good choice if:
[Figure: scatter plot of two-dimensional samples X(i) = [x1(i), x2(i)] in the (x1, x2) plane]
The output of neuron k is

    y_k = \varphi\Big(\sum_{j=1}^{m} w_{kj} x_j + b_k\Big) = \varphi\Big(\sum_{j=0}^{m} w_{kj} x_j\Big),

where w_{k0} = b_k and x_0 = +1.

Activation function of a perceptron (threshold):

    \varphi(v) = \begin{cases} 1, & v \ge 0 \\ 0, & v < 0 \end{cases}

Other common activation functions:

Piecewise linear:
    \varphi(v) = \begin{cases} 1, & v \ge 1/2 \\ v, & -1/2 < v < 1/2 \\ 0, & v \le -1/2 \end{cases}

Signum:
    \varphi(v) = \begin{cases} 1, & v > 0 \\ 0, & v = 0 \\ -1, & v < 0 \end{cases}

Sigmoid (logistic), with slope parameter a:
    \varphi(v) = \frac{1}{1 + \exp(-a v)}
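These activation functions can be sketched directly in code (an illustrative sketch; the function names are my own):

```python
import math

def threshold(v):
    # Heaviside step: 1 if v >= 0, else 0
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    # Saturates at 0 and 1 outside [-1/2, 1/2]; linear in between
    if v >= 0.5:
        return 1.0
    if v <= -0.5:
        return 0.0
    return v

def signum(v):
    # Symmetric hard limiter with outputs in {-1, 0, 1}
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def sigmoid(v, a=1.0):
    # Logistic function with slope parameter a
    return 1.0 / (1.0 + math.exp(-a * v))
```

The sigmoid is the usual choice for gradient-based training, since unlike the threshold and signum functions it is differentiable everywhere.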
[Figure: nonlinear model of a neuron: inputs x_1, ..., x_n with synaptic weights w_{k1}, ..., w_{kn}, bias b_k, induced local field v_k, activation function \varphi(\cdot), and output y_k. A second diagram shows the same model used as a binary classifier.]
Regression and Classification
Problem statement
Given: a set of examples {x_n}, n = 1, ..., N, with x in R^d, and corresponding targets {t_n}, n = 1, ..., N.
Aim: to learn the functional relationship between the targets and the inputs, so as to make accurate predictions on unseen data x:

    t = f(x, w)

where w denotes the adjustable model parameters.
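As a worked instance of learning t = f(x, w), here is a sketch of the simplest case: a straight line t = w0 + w1*x fitted by least squares in closed form (the function name is illustrative):

```python
def fit_line(xs, ts):
    """Closed-form least squares for the model t = w0 + w1 * x."""
    n = len(xs)
    mx = sum(xs) / n
    mt = sum(ts) / n
    # Slope: covariance of (x, t) divided by variance of x
    w1 = sum((x - mx) * (t - mt) for x, t in zip(xs, ts)) \
         / sum((x - mx) ** 2 for x in xs)
    w0 = mt - w1 * mx       # intercept passes through the means
    return w0, w1

# Noise-free example: targets generated by t = 1 + 2x are recovered exactly.
w0, w1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

A neural network generalizes this idea: f becomes a nonlinear function of x, and w the set of synaptic weights adjusted by learning rather than by a closed-form solution.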
Applications of NNs: image enhancement
[Figure: original and noisy images, with restoration results from a conventional method and from the NN method, shown for several example images]
Q&A
Thank you for your
attention