
machineLearning/unsupervisedLearning/neuralNetworks/digitRecognitio... https://github.com/quinnliu/machineLearning/tree/master/unsupervisedL...


machineLearning / unsupervisedLearning / neuralNetworks / digitRecognition

Latest commit c327232 by quinnliu on Feb 14, 2014: "added comments and MIT license"

Files in this folder (README.md was last touched by that commit; every other file dates to the commit "added new folder for showing the backpropagation neural network learn", 3 years ago):

- README.md
- Theta1AndTheta2.mat
- costFunctionRegularized.m
- displayData.m
- fmincg.m
- inputTrainingSetOf5000Handwritten
- oneVsAll.m
- predict.m
- predictOneVsAll.m
- runMultiClassLogisticRegressionNe
- runMultiClassNeuralNetworkWith3L
- sigmoid.m

README.md

Recognizing Digits with Neural Networks

How to use this code:

1. Install Octave or Matlab.

2. Fork this repo and clone it locally!

3. Navigate into the folder with the above files.

4. Type runMultiClassLogisticRegressionNeuralNetwork at the Octave or Matlab command line to see a trained 2-layer neural network recognize MNIST handwritten digits with a 95% success rate.

5. Type runMultiClassNeuralNetworkWith3Layers at the Octave or Matlab command line to see a trained 3-layer neural network recognize MNIST handwritten digits with a 97% success rate.

Neural Network Review


Why do we need new non-linear hypotheses?

The problem with linear & logistic regression is that when we have a large # of features, say 100, the # of feature combinations that must be computed gets RIDICULOUSLY large. For example, when we have 100 features and we decide to create a cubic decision boundary, the # of feature combinations is about 170,000: starting with x_1 * x_2 * x_3, (x_1)^2 * x_2, ... you can imagine there being about 170,000 combinations of the terms x_1, x_2, ..., x_100.
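The "about 170,000" figure can be sanity-checked with a one-line count (a Python check, not part of the repo's Octave code): the number of distinct monomials of degree exactly 3 in n variables is C(n + 2, 3).

```python
from math import comb

n = 100  # number of raw features x_1 ... x_100

# Distinct degree-3 monomials in n variables, e.g. x_1*x_2*x_3,
# (x_1)^2*x_2, (x_5)^3, ... A "stars and bars" argument gives C(n + 2, 3):
# choose a multiset of size 3 from the n variables.
cubic_terms = comb(n + 2, 3)
print(cubic_terms)  # 171700, i.e. about 170,000
```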

For problems like computer vision, the # of features is very large. For example, a 50 x 50 pixel greyscale image has 2500 pixels. And if we trained linear or logistic regression with all pairwise pixel features, we would have on the order of 2500 x 2500 / 2, i.e. 3 million plus features, which is WAY TOO MUCH.
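The 3-million figure can be reproduced the same way (again a Python check, not the repo's code): the number of distinct products x_i * x_j with i < j is C(2500, 2).

```python
from math import comb

pixels = 50 * 50            # 2500 pixels in a 50 x 50 greyscale image
pairwise = comb(pixels, 2)  # distinct pixel pairs x_i * x_j with i < j
print(pixels)    # 2500
print(pairwise)  # 3123750, i.e. roughly 3 million quadratic features
```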

Representing a Neuron

Here is a neuron in your brain:

To represent this as an artificial neuron we have:

where x = [x_0; x_1; x_2; x_3] with x_0 always equal to 1, and theta = [theta_0; theta_1; theta_2; theta_3]. The neuron outputs a = g(theta^T * x), where g is the sigmoid function.
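A single artificial neuron can be sketched in a few lines of Python (the repo's actual code is Octave; the example weights here are made up for illustration, and the sigmoid mirrors the repo's sigmoid.m):

```python
import math

def sigmoid(z):
    """Logistic activation g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(theta, x):
    """Output of a single artificial neuron: a = g(theta^T * x).
    x[0] is the bias input and must always be 1."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return sigmoid(z)

theta = [-1.0, 2.0, 0.5, 0.5]  # [theta_0; theta_1; theta_2; theta_3], arbitrary example weights
x = [1.0, 0.2, 0.4, 0.6]       # [x_0; x_1; x_2; x_3] with x_0 = 1
print(neuron(theta, x))        # ~0.475, always a value in (0, 1)
```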

Now if we have a whole bunch of artificial neurons in a few layers we get a neural network like:

where (a_i)^(j) = the activation of unit i in layer j, and BigTheta^(j) = the matrix of weights controlling the function mapping from layer j to layer j + 1.

If a neural network has s_j units in layer j and s_(j+1) units in layer j + 1, then the dimension of BigTheta^(j) is s_(j+1) x (s_j + 1); the extra +1 column accounts for the bias unit.

This is the set of equations that describes the above neural network configuration during forward propagation:
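In the notation just defined, forward propagation for a 3-layer network computes: a^(1) = x (with bias x_0 = 1 prepended); z^(2) = BigTheta^(1) a^(1); a^(2) = g(z^(2)) (with bias (a_0)^(2) = 1 prepended); z^(3) = BigTheta^(2) a^(2); and h(x) = a^(3) = g(z^(3)). A minimal numpy sketch, with layer sizes made up for illustration (the repo's own implementation is the Octave predict.m):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up layer sizes: s_1 = 3 input units, s_2 = 5 hidden units, s_3 = 4 output units.
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((5, 3 + 1))  # BigTheta^(1): s_2 x (s_1 + 1)
Theta2 = rng.standard_normal((4, 5 + 1))  # BigTheta^(2): s_3 x (s_2 + 1)

x = np.array([0.5, -0.2, 0.1])

a1 = np.concatenate(([1.0], x))            # a^(1) = x with bias unit x_0 = 1
z2 = Theta1 @ a1                           # z^(2) = BigTheta^(1) a^(1)
a2 = np.concatenate(([1.0], sigmoid(z2)))  # a^(2) = g(z^(2)) plus bias (a_0)^(2) = 1
z3 = Theta2 @ a2                           # z^(3) = BigTheta^(2) a^(2)
h = sigmoid(z3)                            # h(x) = a^(3) = g(z^(3))

print(h.shape)  # (4,): one activation per output unit
```

Note how the matrix shapes enforce the dimension rule above: Theta1 maps the 4-element a^(1) (3 inputs plus bias) to the 5 hidden units.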


Now one or more of the neurons in a neural network can simulate logical operations, including AND, OR, NOT, and XNOR. Here is an artificial neuron simulating the logical AND operation:
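One standard choice of weights for the AND neuron (from Andrew Ng's machine learning course, which this material follows) is theta = [-30; 20; 20]. A quick Python check, not the repo's Octave code:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# With theta = [-30; 20; 20], z = -30 + 20*x_1 + 20*x_2 is +10 only when
# x_1 = x_2 = 1, and -10 or -30 otherwise; sigmoid saturates to ~1 or ~0
# at those magnitudes, so the neuron computes AND.
def and_neuron(x1, x2):
    z = -30 + 20 * x1 + 20 * x2
    return sigmoid(z)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_neuron(x1, x2)))
# 0 0 0
# 0 1 0
# 1 0 0
# 1 1 1
```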
