Contents
1. History of Neural Networks
3. Biological Neuron
4. Artificial Neuron
7. Learning Rules
8. Activation Functions
The first step toward harnessing the thinking power of the human brain was taken in 1943,
when Warren McCulloch, a neurophysiologist, and Walter Pitts, a young mathematician,
wrote a paper on how neurons might work and modeled a simple neuron using electrical
circuits.
In 1949, in the book The Organization of Behavior, Donald Hebb pointed out that neural
pathways are strengthened each time they are used.
In the 1950s, IBM laboratories made the first attempt to simulate a neural network, but the attempt failed.
Frank Rosenblatt, a neurobiologist, then began work on the perceptron, the oldest neural
network architecture still in use.
In 1959, Bernard Widrow and Marcian Hoff developed the first neural network to be
applied to a real-world problem.
After that, neural network research saw stunted growth until 1982, partly due to fear
of the effects of "thinking machines" on the human race.
Biological Neuron
A biological neuron has three parts:
Soma (cell body): where the cell nucleus is located
Dendrites: nerve fibers connected to the cell body that receive signals
Axon: carries the impulses of the neuron away from the cell body
The end of the axon splits into fine strands, and each strand terminates in a bulb-like organ called a synapse.
Electric impulses are passed between the synapse and the dendrites of neighboring neurons.
Synapses are of two types:
Inhibitory: impulses that hinder the firing of the receiving cell
Excitatory: impulses that cause the firing of the receiving cell
Information transmission happens at the synapses.
Artificial Neuron
Artificial Neural Networks (ANNs) are electronic models based on the neural structure
of the brain.
Each link is associated with a weight, which carries information about the
input signal.
Each neuron (node) has an internal state of its own, which is a function of the
inputs the neuron receives.
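As an illustrative sketch of this idea (function names here are ours, not from the notes), a single artificial neuron computes a weighted sum of its inputs and passes it through an activation function:

```python
import math

def sigmoid(x):
    # Logistic activation: squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # Internal state: weighted sum of the incoming signals
    total = sum(i * w for i, w in zip(inputs, weights))
    # The activation function maps the internal state to the output
    return sigmoid(total)

print(neuron_output([1, 1], [0.8, 0.2]))  # sigmoid(1.0), about 0.731
```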
Basic Models of Artificial Neural Network
ANN models are classified on the basis of three entities:
Interconnections
Learning rules
Activation function
Classification Based on Interconnections
Single-layer networks
Multilayer networks
Learning Rules
A neural network repeats forward and back propagation until its weights are
calibrated to accurately predict the output.
EX: Exclusive OR (XOR)
Input 1  Input 2  Output
0        0        0
0        1        1
1        0        1
1        1        0
FORWARD PROPAGATION:
Using the last row of data, (1, 1) → 0, to construct the first forward propagation.
The weights for the first forward propagation are chosen randomly:
input-to-hidden weights 0.8, 0.4, 0.3 (from the first input) and 0.2, 0.9, 0.5 (from the second input);
hidden-to-output weights 0.3, 0.5, 0.9.
Hidden layer sums:
1 × 0.8 + 1 × 0.2 = 1
1 × 0.4 + 1 × 0.9 = 1.3
1 × 0.3 + 1 × 0.5 = 0.8
Application of the activation function (sigmoid, S(x) = 1 / (1 + e^(-x))) to the hidden layer sums:
S(1) = 1 / (1 + e^(-1)) = 0.73105857863
S(1.3) = 1 / (1 + e^(-1.3)) = 0.78583498304
S(0.8) = 1 / (1 + e^(-0.8)) = 0.68997448112
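These values can be verified with a short Python snippet (a quick check, separate from the MATLAB code later in the notes):

```python
import math

def S(x):
    # Sigmoid activation, S(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

for x in (1.0, 1.3, 0.8):
    print(f"S({x}) = {S(x)}")
```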
Adding these values to the neural network as the hidden layer results (rounded to 0.73, 0.79, 0.69), the output sum is:
0.73 × 0.3 + 0.79 × 0.5 + 0.69 × 0.9 = 1.235
S(1.235) = 1 / (1 + e^(-1.235)) = 0.7746924929
Full diagram: inputs (1, 1) → hidden layer results (0.73, 0.79, 0.69) → output 0.77
Target = 0
Calculated = 0.77
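The entire forward pass can be reproduced in a few lines of Python (a sketch using the same weights; hidden results are rounded to two decimals, as in the calculation above):

```python
import math

def S(x):
    # Sigmoid activation function
    return 1.0 / (1.0 + math.exp(-x))

x = [1, 1]                                   # input row (1, 1), target 0
w_in = [[0.8, 0.2], [0.4, 0.9], [0.3, 0.5]]  # weights into each hidden neuron
w_out = [0.3, 0.5, 0.9]                      # hidden-to-output weights

# Hidden layer: weighted sums, then sigmoid, rounded as in the notes
hidden_sums = [sum(xi * wi for xi, wi in zip(x, w)) for w in w_in]
hidden_out = [round(S(h), 2) for h in hidden_sums]   # [0.73, 0.79, 0.69]

# Output layer: weighted sum of the hidden results, then sigmoid
output_sum = sum(h * w for h, w in zip(hidden_out, w_out))
output = S(output_sum)
print(round(output_sum, 3), round(output, 2))  # 1.235 0.77
```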
BACK PROPAGATION
Finding the margin of error of the output result to back out the necessary
change in the output sum (delta output sum):
output sum margin of error = target − calculated = 0 − 0.77 = −0.77
The change in the output sum is the sigmoid prime of the output sum times the margin of error, where S'(x) = S(x) × (1 − S(x)):
Delta output sum = S'(output sum) × (output sum margin of error)
Delta output sum = S'(1.235) × (−0.77)
Delta output sum = 0.1745 × (−0.77) = −0.1344
Calculation of the new change in weights:
Delta hidden sum = (delta output sum / hidden-to-output weights) × S'(hidden sums)
= (−0.1344 / [0.3, 0.5, 0.9]) × S'([1, 1.3, 0.8])
= [−0.448, −0.2688, −0.1493] × [0.1966, 0.1683, 0.2139]
= [−0.088, −0.0452, −0.0319]
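The delta calculations can be checked numerically with the same simplified rule (a Python sketch; `S_prime` is the sigmoid derivative S(x)(1 − S(x)), and the results agree with the values above to about three decimal places):

```python
import math

def S(x):
    return 1.0 / (1.0 + math.exp(-x))

def S_prime(x):
    # Sigmoid derivative: S'(x) = S(x) * (1 - S(x))
    return S(x) * (1.0 - S(x))

target, calculated = 0.0, 0.77
output_sum = 1.235
hidden_sums = [1.0, 1.3, 0.8]
w_out = [0.3, 0.5, 0.9]            # hidden-to-output weights

# Margin of error, backed out through the output activation
delta_output_sum = S_prime(output_sum) * (target - calculated)
print(round(delta_output_sum, 4))  # -0.1344

# Error pushed back through each hidden-to-output weight,
# scaled by the sigmoid derivative at each hidden sum
delta_hidden_sum = [delta_output_sum / w * S_prime(h)
                    for w, h in zip(w_out, hidden_sums)]
print([round(d, 4) for d in delta_hidden_sum])
```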
After one iteration, the calculated value changed from 0.77 to 0.69, which is
closer to the desired value of 0.
This training of the network (forward and back propagation) is repeated thousands of times
to get the desired results.
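This iterated loop can be sketched in Python for the single training pair (1, 1) → 0 from the worked example. This is an illustrative sketch: the backward step here multiplies by the hidden-to-output weight (the standard chain-rule form) rather than dividing by it as the simplified hand calculation does, so intermediate numbers differ slightly, but the output is driven toward the target either way.

```python
import math

def S(x):
    # Sigmoid activation
    return 1.0 / (1.0 + math.exp(-x))

x = [1, 1]                                   # training input (1, 1)
target = 0.0                                 # desired output
w_in = [[0.8, 0.2], [0.4, 0.9], [0.3, 0.5]]  # input -> hidden weights
w_out = [0.3, 0.5, 0.9]                      # hidden -> output weights

for _ in range(1000):
    # forward propagation
    h_sum = [sum(xi * wi for xi, wi in zip(x, w)) for w in w_in]
    h_out = [S(h) for h in h_sum]
    o_sum = sum(h * w for h, w in zip(h_out, w_out))
    out = S(o_sum)
    # back propagation
    delta_out = S(o_sum) * (1 - S(o_sum)) * (target - out)
    delta_hid = [delta_out * w * S(h) * (1 - S(h))
                 for w, h in zip(w_out, h_sum)]
    # weight updates
    w_out = [w + delta_out * h for w, h in zip(w_out, h_out)]
    w_in = [[w + d * xi for w, xi in zip(ws, x)]
            for ws, d in zip(w_in, delta_hid)]

print(round(out, 2))  # far closer to the target 0 than the initial 0.77
```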
MATLAB Code to train for XOR
net = feedforwardnet([3]);
%Repeat 4 times
P = [0 0; 0 1; 1 0; 1 1; 0 0; 0 1; 1 0; 1 1; 0 0; 0 1; 1 0; 1 1; 0 0; 0 1; 1 0; 1 1]';
T = [0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 0]; % desired output
net = configure(net, P, T);
net.trainParam.goal = 1e-8;
net.trainParam.epochs = 1000;
net = train(net, P, T);
sim(net, P)
ans =
Columns 1 through 8
0.0000 1.0000 1.0000 0.0000 0.0000 1.0000 1.0000 0.0000
Columns 9 through 16
0.0000 1.0000 1.0000 0.0000 0.0000 1.0000 1.0000 0.0000