[Figure: architecture of the back-propagation neural network, with weighted connections between layers]
input units. As a rule of thumb, the number of hidden layer nodes should …

The system was trained on the characters ([A..Z, a..z]) of Times Roman font, size 14. Each character was captured once and its … are stored in an array. These were fed to a back-propagation neural network and training was performed. After the training, the system was tested with the training set and testing set characters. The table below shows the results of the system.

Experiment 1:
  Training font: Times New Roman
  Testing font: Times New Roman
  Network configuration: 6…
  Total number of training characters: … (A..Z) and (a..z)
  Total number of testing characters: …
  Note: P=>F means P is mis-classified as F

Batch training results:

  Error     Time (hours)   Misclassifications
  3.88961   13
  0.08526   …
  0.04008   14             h=>O, u=>n
  0.03282   …
  0.01281   14             U=>H
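Batch training of the kind reported above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the network size, learning rate, epoch count, and the synthetic noisy "character" bitmaps are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-in for captured character bitmaps: 3 classes of 16-pixel
# binary prototypes, each seen 20 times with 5% of pixels flipped.
prototypes = rng.random((3, 16)) > 0.5
X = np.vstack([p ^ (rng.random((20, 16)) < 0.05)
               for p in prototypes]).astype(float)
y = np.repeat(np.eye(3), 20, axis=0)  # one-hot class targets

# One hidden layer; small random initial weights (sizes are assumptions).
W1 = rng.normal(0.0, 0.5, (16, 8))
W2 = rng.normal(0.0, 0.5, (8, 3))
lr = 0.5

err_init = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2)

for epoch in range(2000):
    # Forward pass over the entire batch at once.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass; the sigmoid derivative is s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Batch update: gradients are averaged over all training samples
    # before the weights change, which is what "batch training" means.
    W2 -= lr * h.T @ d_out / len(X)
    W1 -= lr * X.T @ d_h / len(X)

err_final = np.mean((out - y) ** 2)
acc = np.mean(out.argmax(axis=1) == y.argmax(axis=1))
print(f"MSE {err_init:.5f} -> {err_final:.5f}, accuracy {acc:.2%}")
```

The mean squared error printed at the end plays the role of the "Error" column in the table; one full pass over all samples per weight update is the batch scheme the experiment used.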
[3]. Hussain, B. and Kabuka, M. R., "A novel feature recognition neural network and its application to character recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 16, No. 1, 1994, pp. 98-106.
[4]. Avi-Itzhak, H. I., Diep, T. A. and Garland, H., "High accuracy optical character recognition using neural networks with centroid dithering", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 17, No. 2, 1995, pp. 218-224.
[5]. Rumelhart, D. E., Hinton, G. E. and Williams, R. J., "Learning Internal Representations by Error Propagation", in Parallel Distributed Processing, Vol. 1, MIT Press, Cambridge, Chapter 8, 1986, pp. 318-362.
[6]. Jones, W. P. and Hoskins, J., "Back Propagation: A generalised learning rule", Byte, 12, 1987, pp. 155-158.