
FACULTY OF ELECTRICAL AND ELECTRONIC ENGINEERING

NEURAL ASSIGNMENT (MEE 10403)

NEURAL NETWORK FOR FACE RECOGNITION

Code of Course: MEE 10403
Name of Course: COMPUTATIONAL INTELLIGENCE
Name 1: DINESHWARAN GUNALAN (HE120104)
Name 2: TUAN MOHD MUSTAQIM (HE120092)
Lecturer Name: PROF. MADYA DR. JIWA ABDULLAH

Faculty of Electrical and Electronic Engineering
Topic: NEURAL NETWORK FOR FACE RECOGNITION
Assignment Date: 13/04/2013

Table of Contents

Table of Contents
List of Figures
List of Tables
Abstract
Aim and Objective
Chapters
1 INTRODUCTION
  1.0 Project Overview
2 LITERATURE REVIEW
  2.0 Neural Network Basic
  2.1 Historical Perspective
  2.2 History of Face Recognition
3 NEURAL NETWORK ARCHITECTURE
  3.0 Classification of Neural Networks
  3.1 Backpropagation Networks
  3.2 Network Design
  3.3 Network Architecture in Simulink 6
4 NEURAL NETWORK TRAINING
  4.0 Type of Training
  4.1 Backpropagation Algorithm
  4.2 Resilient Backpropagation Algorithm (RPROP)
  4.3 Conjugate Gradient Algorithms
  4.4 Scaled Conjugate Gradient Algorithm (SCG)
  4.5 Transfer Function
  4.6 Give Training Images to Network
  4.7 Setting Training Parameters
  4.8 Training with More Faces
5 NEURAL NETWORK TESTING
  5.0 Network Simulation
  5.1 Post-Training Analysis
6 SOFTWARE IMPLEMENTATION
  6.0 MATLAB Neural Network Toolbox
  6.1 Visualization Technique
  6.2 Select Training Window
  6.3 Training Window
  6.4 Flow Chart
7 RESULTS
8 CONCLUSION AND RECOMMENDATION
References

List of Figures

Figure 3.1: Architecture of Backpropagation network
Figure 3.2: Log-sigmoid transfer function
Figure 3.3: Retro-propagation of error
Figure 3.4: Error algorithm
Figure 3.5: Architecture of the network
Figure 3.6: Neural network diagrams
Figure 3.7: Neural network layers
Figure 3.8: First layer of the network
Figure 3.9: Second layer of the network
Figure 3.10: First and second layer weights of the network
Figure 4.1: Neural network diagrams
Figure 4.2: Log-sigmoid transfer function
Figure 4.3: Training images
Figure 4.4: Pixel values in an intensity image define gray levels
Figure 5.1: Post-training analysis
Figure 6.1: Face Recognition select training window
Figure 6.2: Trainrp window
Figure 6.3: Flowchart
Figure 7.1: Architecture of the network
Figure 7.2: Person A original image and graph
Figure 7.3: Person A train 1 time graphs
Figure 7.4: Person A train 5 times graphs
Figure 7.5: Person A train 10 times graphs
Figure 7.6: Person A train 20 times graphs
Figure 7.7: Original image
Figure 7.8: Reconstructed image
Figure 7.9: Reconstructed image
Figure 7.10: Reconstructed image
Figure 7.11: Reconstructed image
Figure 7.12: Person B original image and graph
Figure 7.13: Person B train 1 time graphs
Figure 7.14: Person B train 5 times graphs
Figure 7.15: Person B train 10 times graphs
Figure 7.16: Person B train 20 times graphs
Figure 7.17: Original image
Figure 7.18: Reconstructed image
Figure 7.19: Reconstructed image
Figure 7.20: Reconstructed image
Figure 7.21: Reconstructed image
Figure 7.22: Person A train 1 time graphs
Figure 7.23: Person A train 5 times graphs
Figure 7.24: Person A train 10 times graphs
Figure 7.25: Person A train 20 times graphs
Figure 7.26: Original image
Figure 7.27: Reconstructed image
Figure 7.28: Reconstructed image
Figure 7.29: Reconstructed image
Figure 7.30: Reconstructed image
Figure 7.31: Person B train 1 time graphs
Figure 7.32: Person B train 5 times graphs
Figure 7.33: Person B train 10 times graphs
Figure 7.34: Person B train 20 times graphs
Figure 7.35: Original image
Figure 7.36: Reconstructed image
Figure 7.37: Reconstructed image
Figure 7.38: Reconstructed image
Figure 7.39: Reconstructed image

List of Tables

Table 2.1: Historical notes of neural networks
Table 7.1: Target value compared with training value
Table 7.2: Mean of target value compared with mean training value
Table 7.3: Target value compared with training value
Table 7.4: Mean of target value compared with mean training value
Table 7.5: Target value compared with training value
Table 7.6: Mean of target value compared with mean training value
Table 7.7: Target value compared with training value
Table 7.8: Mean of target value compared with mean training value
Table 7.9: Training times taken by training function
Table 7.10: Training times taken by training function

Abstract
A neural network is a system composed of many simple processing elements operating in parallel, whose function is determined by the network structure, the connection strengths, and the processing performed at the computing elements or nodes [2]. People in computer vision and pattern recognition have been working on automatic recognition of human faces for the last 20 years [6]. Face recognition has established itself as an important sub-branch of pattern recognition within the field of computer science. The basic goal of this project was to train a neural network to implement a simple face recognition system. It demonstrates the data-recognition concept of a complete system that can be employed for face recognition. Here the faces were vertically oriented frontal views with wide expression changes, and the images were in intensity values, or gray scale. A standard Backpropagation feedforward network was designed for this project; it was a two-layer log-sigmoid/log-sigmoid network. Three types of training function were implemented in this project: Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf). The results obtained using these training functions indicated that they were suitable for face recognition, in terms of the training time consumed and the percentage of recognition achieved by the network. In this project the MATLAB Neural Network Toolbox was used to train the network and to recognize the faces through learning the correct classification of the image data. MATLAB was chosen because it combines the numerical computation, visualization and neural network functions that the project requires.

Aim
The aims of this project are listed below:
1. To carry out a literature survey on using Neural Networks to design and test a face recognition system.
2. To understand and examine the problem from a Neural Networks point of view by presenting work on face recognition.
3. To study and apply face recognition techniques.

Objectives
The objectives of this project are mainly:
1. To introduce the MATLAB Neural Network Toolbox for training and recognizing faces.
2. To be able to describe the principal stages of face recognition with a Neural Network using a good design methodology.
3. To address the benefits of Neural Networks.
4. To check and corroborate the reliability of the designed network.

CHAPTER 1 INTRODUCTION

1.0 Project Overview
Artificial Neural Networks (ANN), or simply Neural Networks, can be loosely defined as large sets of interconnected simple units which execute in parallel to perform a common global task. These units usually undergo a learning process, which automatically updates the network parameters in response to a possibly evolving input environment. The units are often highly simplified models of the biological neurons found in the animal brain [8]. Face recognition has rapidly gained importance within the field of pattern recognition, with a variety of interesting applications in areas such as security or the indexing of image and video databases. The MATLAB Neural Network Toolbox was used for this project. A standard Backpropagation feedforward two-layer log-sigmoid/log-sigmoid network was designed to train a set of face images. These images are of two persons, person A and person B, and each is a two-dimensional array of intensity values, or grayscale, in vertically oriented frontal view. Normal expression variation was allowed and the images were prepared under roughly constant illumination. The size of the images was 30 x 30 pixels, converted to 30 x 30 matrix vectors. Three types of training function were implemented in this project: Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf). The network was trained repeatedly one, five, ten and twenty times. After training, a new image was given for testing. The performance of the network on that image is then shown by plotting the target output versus the simulation output and by displaying the image after simulation. Besides that, the correlation between the network outputs and the target is also shown.

CHAPTER 2 LITERATURE REVIEW


2.0 Neural Network Basic
Human beings, as well as many other living creatures, tackle practical problems almost effortlessly in comparison with a serial digital computer. The motivation for Artificial Neural Network (ANN) research was the belief that a human's capabilities, particularly in real-time visual perception, speech understanding and sensory information processing, and in adaptive as well as intelligent decision making in general, come from the organizational and computational principles exhibited in the highly complex neural network of the human brain. Expectations of faster and better solutions provide us with the challenge of building machines using the same computational and organizational principles, simplified and abstracted from neurobiological studies of the brain [1].

2.1 Historical Perspective


The history of neural networks is usually presented as a series of "waves of research". This has probably led cynics to discount neural network research as highly vulnerable to "herd mentality" and to brand the field as nothing more than "hype" and "fashion" [8]. Table 2.1 gives historical notes on neural networks.

Table 2.1: Historical notes of neural networks

1943: McCulloch and Pitts (start of the modern era of neural networks): the logical calculus of neural networks. A network consisting of a sufficient number of neurons (using a simple model) with properly set synaptic connections can compute any computable function.

1949: Hebb's book "The Organization of Behavior": an explicit statement of a physiological learning rule for synaptic modification was presented for the first time. Hebb proposed that the connectivity of the brain is continually changing as an organism learns differing functional tasks, and that neural assemblies are created by such changes. Hebb's work was immensely influential among psychologists.

1958: Rosenblatt introduced the Perceptron, a novel method of supervised learning: the perceptron convergence theorem and the least mean-square (LMS) algorithm.

1969: Minsky and Papert showed that there are fundamental limits on what single-layer perceptrons can compute. They speculated that the limits could not be overcome for the multi-layer version.

1982: Hopfield's networks: Hopfield showed how to use an "Ising spin glass" type of model to store information in dynamically stable networks. His work paved the way for physicists to enter neural modeling, thereby transforming the field of neural networks.

1982: Kohonen's self-organizing maps (SOM): capable of reproducing important aspects of the structure of biological neural nets, using data representation by topographic maps (which are common in nervous systems). SOM also has a wide range of applications, and it shows how the output layer can pick up the correlational structure of the inputs in the form of the spatial arrangement of units.

1985: Ackley, Hinton and Sejnowski developed the Boltzmann machine, the first successful realization of a multilayer neural network.

1986: Rumelhart, Hinton and Williams developed the back-propagation algorithm, the most popular learning algorithm for the training of multilayer perceptrons. It has been the workhorse for many neural network applications.

2.2 History of Face Recognition


The subject of face recognition is as old as computer vision, both because of the practical importance of the topic and because of theoretical interest from cognitive scientists. Perhaps the most famous early example of a face recognition system is due to Kohonen [2], who demonstrated that a simple neural net could perform face recognition for aligned and normalized face images. The type of network he employed computed a face description by approximating the eigenvectors of the face image's autocorrelation matrix; these eigenvectors are now known as 'eigenfaces.' Kirby and Sirovich (1989) [4] later introduced an algebraic manipulation which made it easy to directly calculate the eigenfaces, and showed that fewer than 100 were required to accurately code carefully aligned and normalized face images. Turk and Pentland (1991) [4] then demonstrated that the residual error when coding using the eigenfaces could be used both to detect faces in cluttered natural imagery and to determine the precise location and scale of faces in an image. They then demonstrated that by coupling this method for detecting and localizing faces with the eigenface recognition method, one could achieve reliable, real-time recognition of faces in a minimally constrained environment. This demonstration that simple, real-time pattern recognition techniques could be combined to create a useful system sparked an explosion of interest in the topic of face recognition. Currently, several face-recognition products are commercially available; Visionics, Viisage and Miros seem to be the current market leaders in face recognition. MATLAB was used in this project. MATLAB, developed by MathWorks Inc, is software for high-performance numerical computation and visualization. The combination of analysis capabilities, flexibility and powerful graphics makes MATLAB a premier software package for electrical engineers.


CHAPTER 3 NEURAL NETWORK ARCHITECTURE


3.0 Classification of Neural Networks
Artificial Neural Networks (ANNs), also called parallel distributed processing systems (PDPs) and connectionist systems, are intended for modeling the organizational principles of the central nervous system, with the hope that the biologically inspired computing capabilities of the ANN will allow cognitive and sensory tasks to be performed more easily and more satisfactorily than with conventional serial processors [1]. Neural Network models can be classified in a number of ways. Using the network architecture as the basis, there are three major types of neural networks [8]:

Recurrent networks - the units are usually laid out in a two-dimensional array and are regularly connected. Typically, each unit sends its output to every other unit of the network and receives input from these same units. Recurrent networks are also called feedback networks. Such networks are "clamped" to some initial configuration by setting the activation values of each of the units. The network then goes through a stabilization process in which the units change their activation values and slowly evolve and converge toward a final configuration of "low energy". The final configuration of the network after stabilization constitutes the output or response of the network. This is the architecture of the Hopfield Model.


Feedforward networks - these networks distinguish between three types of units: input units, hidden units and output units. The activity of this type of network propagates forward from one layer to the next, starting from the input layer up to the output layer. Sometimes called multilayered networks, feedforward networks are very popular because this is the inherent architecture of the Backpropagation Model.

Competitive networks - these networks are characterized by lateral inhibitory connections between units within a layer, such that the competition process between units causes the initially most active unit to be the only unit to remain active, while all the other units in the cluster are slowly deactivated. This is referred to as a "winner-takes-all" mechanism. Self-Organizing Maps, Adaptive Resonance Theory, and Rumelhart and Zipser's Competitive Learning Model are the best examples of these types of networks.

3.1 Backpropagation Networks


Backpropagation networks, and multilayered perceptrons in general, are feedforward networks with distinct input, output and hidden layers. The units function basically like perceptrons, except that the transition (output) rule and the weight update (learning) mechanism are more complex [8]. Figure 3.1 presents the architecture of backpropagation networks. There may be any number of hidden layers, and any number of hidden units in any given hidden layer. In general, every unit in a given layer is connected to every unit in the preceding layer. There are no interconnections among units in the same layer. Activation of the network proceeds from the input units, layer by layer, up to the output units. Input and output units can be binary {0, 1}, bi-polar {-1, +1}, or may have real values within a specific range such as [-1, 1].


Figure 3.1: Architecture of Backpropagation network
In feedforward activation, units of hidden layer 1 compute their activation and output values and pass these on to the next layer, and so on, until the output units have produced the network's actual response to the current input. The activation value a_k of unit k is computed as follows.
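The equation referenced here is the usual weighted sum over the incoming connections (the symbols x_i and w_ki are defined in the next paragraph):

a_k = \sum_i w_{ki} \, x_i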

As illustrated above, x_i is the input signal coming from unit i at the other end of the incoming connection, and w_ki is the weight of the connection between unit k and unit i. Unlike in the linear threshold unit, the output of a unit in a backpropagation network is no longer based on a threshold. The output y_k of unit k is computed as follows:
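Consistent with the sigmoid output function described next, this output is:

y_k = f(a_k) = \frac{1}{1 + e^{-a_k}}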


The function f(x) is referred to as the output function. It is a continuously increasing function of the sigmoid type, asymptotically approaching 0 as x decreases and asymptotically approaching 1 as x increases. At x = 0, f(x) is equal to 0.5.

Figure 3.2: Log-sigmoid transfer function
Once activation is fed forward all the way to the output units, the network's response is compared to the desired output y_dk which accompanies the training pattern. There are two types of error. The first is the error at the output layer. This can be directly computed as follows:
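Writing y_dk for the desired output of output unit k, this output-layer error is:

e_k = y_{dk} - y_k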

The second type of error is the error at the hidden layers. This cannot be computed directly, since there is no available information on the desired outputs of the hidden layers. This is where the retro-propagation of error is called for. Essentially, the error at the output layer is used to compute the error at the hidden layer immediately preceding the output layer. Once this is computed, it is used in turn to compute the error of the next hidden layer immediately preceding the last hidden layer. This is done sequentially until the error at the very first hidden layer is computed. The retro-propagation of error is illustrated in Figure 3.3:


Figure 3.3: Retro-propagation of error
Computation of the error e_i at a hidden layer is done as follows:
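A standard form of this retro-propagated error, assuming e_k are the previously computed errors of the units fed by hidden unit i and w_ki the corresponding outgoing connection weights, is:

e_i = \sum_k w_{ki} \, e_k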

The errors e_k at the other end of the outgoing connections of the hidden unit have been computed earlier. These could be error values at the output layer or at a hidden layer. These error signals are multiplied by their corresponding outgoing connection weights, and the sum of these is taken.


Figure 3.4: Error algorithm
After computing the error for each unit, whether it is at a hidden unit or at an output unit, the network then fine-tunes its connection weights w_ki(t+1). The weight update rule is uniform for all connection weights.
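A standard statement of this rule, with \alpha the learning rate (called 'a' in the next paragraph) and e_k, f'(a_k), x_i as defined earlier, is:

w_{ki}(t+1) = w_{ki}(t) + \alpha \, e_k \, f'(a_k) \, x_i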

The learning rate 'a' is typically a small value between 0 and 1. It controls the size of the weight adjustments and has some bearing on the speed of the learning process, as well as on the precision with which the network can possibly operate. f'(x) also controls the size of the weight adjustments, depending on the actual output f(x). In the case of the sigmoid function above, its first derivative (slope) f'(x) is easily computed as follows:
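For the sigmoid output function this derivative takes the familiar form:

f'(x) = f(x)\,(1 - f(x))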

Note that the change in weight is directly proportional to the error term computed for the unit at the output end of the incoming connection. However, this weight change is also controlled by the output signal coming from the input end of the incoming connection. It can be inferred that very little weight change (learning) occurs when this input signal is almost zero. The weight change is further controlled by the term f'(a_k). Because this term measures the slope of the function, and knowing the shape of the function, we can infer that there will likewise be little weight change when the output of the unit at the other end of the connection is close to 0 or 1. Thus, learning takes place mainly at those connections with high pre-synaptic signals and non-committed (hovering around 0.5) post-synaptic signals.

3.2 Network Design


A standard Backpropagation feedforward network was designed for this project. The first step in designing a feedforward network was to create the network object. The function newff creates a feedforward network. It requires four inputs and returns the network object. The first input was an R by 2 matrix of minimum and maximum values for each of the R elements of the input vector. The second input was an array containing the sizes of each layer. The third input was a cell array containing the names of the transfer functions to be used in each layer. The final input contains the name of the training function to be used. The following commands create a two-layer network. There was one input vector, with element values p ranging between 0 and 1. There were thirty neurons (S1) in the first layer and thirty neurons (S2) in the second (output) layer. The transfer function in the first layer was log-sigmoid, and the output layer transfer function was also log-sigmoid. The training function is trainrp.

p = im2double(imread('face1.jpg'));
[S2,S1] = size(p);
net = newff(minmax(p),[S1 S2],{'logsig' 'logsig'},'trainrp');

The network was trained to recognize the images of the two persons. All images were in gray, or intensity, scale with a resolution of 30 by 30. The range of values for the images was between 0 and 1. The network was a two-layer log-sigmoid/log-sigmoid network; the log-sigmoid transfer function was picked because its output range (0 to 1) is perfect for learning to output such values. The log-sigmoid function generates outputs between 0 and 1 as the neuron's net input goes from negative to positive infinity.

Figure 3.5: Architecture of the network
The hidden (first) layer has 30 neurons. This number was picked because the image size was 30 by 30, which gives enough neurons for the network to learn. The output (second) layer also has 30 neurons, so that it can be trained to give back a result of exactly the same size as the input matrix.

3.3 Network Architecture in Simulink 6


After designing the architecture of the network, it was reproduced in Simulink 6 with MATLAB 6 to check whether the design was exactly the same as the original design. Below is the Simulink 6 architecture of the neural network.
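The report does not name the command used to produce these diagrams; a plausible route, assuming the standard toolbox workflow, is gensim, which builds a Simulink model from a network object:

% Sketch (assumed workflow): generate a Simulink model of the network object
% 'net' created with newff in Section 3.2.
gensim(net)   % opens a Simulink diagram equivalent to Figures 3.6 to 3.11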

Figure 3.6: Neural network diagrams


Figure 3.7: Neural network layers

Figure 3.8: First layer of the network

Figure 3.9: Second layer of the network
Referring to Figure 3.6, the main block diagram of the network, p{1} and y{1} are the network input and output. The neural network box consists of the neural network layers shown in Figure 3.7, and it is a two-layer log-sigmoid/log-sigmoid network as illustrated in Figure 3.8 and Figure 3.9. Finally, Figure 3.10 and Figure 3.11 point out the structure of the weights, or neurons, in the network. Both layer 1 and layer 2 have 30 weights, which gives enough neurons for the network to learn.


Figure 3.10: First and Second layer weights of the network



Figure 3.11: Second layer weights of the network


CHAPTER 4 NEURAL NETWORK TRAINING


4.0 Type of Training
Commonly, neural networks are adjusted, or trained, so that a particular input leads to a specific target output. Such a situation is shown in Figure 4.1. There, the network is adjusted, based on a comparison of the output and the target, until the network output matches the target. Typically many such input/target pairs are used, in this supervised learning, to train a network [7].

Figure 4.1: Neural network diagrams
The two main kinds of learning algorithms among the most well-known methods are supervised and unsupervised. In supervised learning, the correct results (target values, desired outputs) are known and are given to the neural network during training so that the network can adjust its weights to try to match its outputs to the target values. After training, the network is tested by giving it only input values, not target values, and seeing how close it comes to outputting the correct target values. In unsupervised learning, the neural network is not provided with the correct results during training. Unsupervised neural networks usually perform some kind of data compression, such as dimensionality reduction or clustering [9].

Practical applications of neural networks most often employ supervised learning. For supervised learning, training data must be provided that includes both the input and the desired result (the target value). After successful training, input data alone can be presented to the neural network (that is, input data without the desired result), and the network will compute an output value that approximates the desired result. However, for training to be successful, a lot of training data and a lot of computer time may be needed. In many applications, such as image and text processing, considerable work is also required to select appropriate input data and to code the data as numeric values [9].

4.1 Backpropagation Algorithm


As mentioned before, three types of training function were implemented in this project: Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf). There are many variations of the backpropagation algorithm. The simplest implementation of backpropagation learning updates the network weights and biases in the direction in which the performance function decreases most rapidly, the negative of the gradient. One iteration of this algorithm can be written [7]:
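In the standard notation (the symbols are defined in the sentence that follows):

x_{k+1} = x_k - \alpha_k \, g_k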

where x_k is a vector of current weights and biases, g_k is the current gradient, and \alpha_k is the learning rate.

4.2 Resilient Backpropagation Algorithm (RPROP)


Multilayer networks typically use sigmoid transfer functions in the hidden layers. These functions are often called "squashing" functions, since they compress an infinite input range into a finite output range. Sigmoid functions are characterized by the fact that their slope must approach zero as the input gets large. This causes a problem when using steepest descent to train a multilayer network with sigmoid functions, since the gradient can have a very small magnitude and, therefore, cause only small changes in the weights and biases, even though the weights and biases are far from their optimal values. The purpose of the resilient backpropagation (Rprop) training algorithm is to eliminate these harmful effects of the magnitudes of the partial derivatives. Only the sign of the derivative is used to determine the direction of the weight update; the magnitude of the derivative has no effect on the weight update. The size of the weight change is determined by a separate update value. The update value for each weight and bias is increased by a factor delt_inc whenever the derivative of the performance function with respect to that weight has the same sign for two successive iterations. The update value is decreased by a factor delt_dec whenever the derivative with respect to that weight changes sign from the previous iteration. If the derivative is zero, then the update value remains the same. Whenever the weights are oscillating, the weight change is reduced. If the weight continues to change in the same direction for several iterations, then the magnitude of the weight change is increased. Resilient Backpropagation (trainrp) is a network training function that updates weight and bias values according to the resilient backpropagation algorithm (RPROP). Trainrp can train any network as long as its weight, net input and transfer functions have derivative functions. Backpropagation is used to calculate derivatives of the performance perf with respect to the weight and bias variables X. Each variable is adjusted according to the following:
dX = deltaX.*sign(gX)
where the elements of deltaX are all initialized to delta0 and gX is the gradient. At each iteration the elements of deltaX are modified: if an element of gX changes sign from one iteration to the next, then the corresponding element of deltaX is decreased by delt_dec; if an element of gX maintains the same sign from one iteration to the next, then the corresponding element of deltaX is increased by delt_inc [7].
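As a sketch only (these parameter names and default values come from the toolbox's trainrp implementation, not from this report), the RPROP update values can be tuned through the training-parameter structure before calling train:

net.trainFcn = 'trainrp';         % resilient backpropagation
net.trainParam.delta0   = 0.07;   % initial update value deltaX for every weight
net.trainParam.delt_inc = 1.2;    % increase factor when the gradient keeps its sign
net.trainParam.delt_dec = 0.5;    % decrease factor when the gradient changes sign
net.trainParam.deltamax = 50;     % upper limit on any update value
[net,tr] = train(net,p,t);        % p and t as prepared in Section 4.6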


4.3 Conjugate Gradient Algorithms


The basic backpropagation algorithm adjusts the weights in the steepest descent direction (the negative of the gradient). This is the direction in which the performance function is decreasing most rapidly. It turns out that, although the function decreases most rapidly along the negative of the gradient, this does not necessarily produce the fastest convergence. In the conjugate gradient algorithms a search is performed along conjugate directions, which generally produces faster convergence than steepest descent directions [7]. In most training algorithms, a learning rate is used to determine the length of the weight update (the step size). In most of the conjugate gradient algorithms, the step size is adjusted at each iteration. A search is made along the conjugate gradient direction to determine the step size which minimizes the performance function along that line. All of the conjugate gradient algorithms start out by searching in the steepest descent direction (negative of the gradient) on the first iteration.
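Expressed in the usual notation (p_k the search direction, g_k the gradient), this initial direction is simply:

p_0 = -g_0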

A line search is then performed to determine the optimal distance to move along the current search direction:
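In standard form, with \alpha_k the step size found by the line search:

x_{k+1} = x_k + \alpha_k \, p_k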

Then the next search direction is determined so that it is conjugate to the previous search directions. The general procedure for determining the new search direction is to combine the new steepest descent direction with the previous search direction:
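A standard statement of this combination, with \beta_k the constant discussed next, is:

p_k = -g_k + \beta_k \, p_{k-1}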

The various versions of conjugate gradient are distinguished by the manner in which the constant \beta_k is computed. For the Fletcher-Reeves update the procedure is:
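The Fletcher-Reeves choice of this constant, matching the ratio described in the next sentence, is:

\beta_k = \frac{g_k^{T} g_k}{g_{k-1}^{T} g_{k-1}}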


This is the ratio of the norm squared of the current gradient to the norm squared of the previous gradient. Traincgf is a network training function that updates weight and bias values according to conjugate gradient backpropagation with Fletcher-Reeves updates. Traincgf can train any network as long as its weight, net input and transfer functions have derivative functions. Backpropagation is used to calculate derivatives of the performance perf with respect to the weight and bias variables X. Each variable is adjusted according to the following:
X = X + a*dX;
where dX is the search direction. The parameter 'a' is selected to minimize the performance along the search direction. The line search function searchFcn is used to locate the minimum point. The first search direction is the negative of the gradient of performance. In succeeding iterations the search direction is computed from the new gradient and the previous search direction, according to the formula:
dX = -gX + dX_old*Z;
where gX is the gradient. The parameter Z can be computed in several different ways. For the Fletcher-Reeves variation of conjugate gradient it is computed according to
Z = normnew_sqr/norm_sqr
where norm_sqr is the norm square of the previous gradient and normnew_sqr is the norm square of the current gradient.

4.4 Scaled Conjugate Gradient Algorithm (SCG)


Each of the conjugate gradient algorithms discussed so far requires a line search at each iteration. This line search is computationally expensive, since it requires that the network response to all training inputs be computed several times for each search. The scaled conjugate gradient algorithm (SCG), developed by Moller [Moll93], was designed to avoid the time-consuming line search [7]. Trainscg is a network training function that updates weight and bias values according to the scaled conjugate gradient method. The scaled conjugate gradient algorithm is based on conjugate directions. Trainscg can train any network as long as its weight, net input and transfer functions have derivative functions. Backpropagation is used to calculate derivatives of the performance perf with respect to the weight and bias variables X.

4.5 Transfer Function


A standard Backpropagation feedforward network was designed for this project; it was a two-layer log-sigmoid/log-sigmoid network. The sigmoid transfer function shown in Figure 4.2 takes the input, which may have any value between plus and minus infinity, and squashes the output into the range 0 to 1.

Figure 4.2: Log-sigmoid transfer function
This transfer function is commonly used in backpropagation networks, in part because it is differentiable. The symbol in the square to the right of each transfer function graph shown in Figure 4.2 represents the associated transfer function. These icons replace the general f in the boxes of network diagrams to show the particular transfer function being used. The log-sigmoid transfer function is computed according to [7], where n is the net input:
logsig(n) = 1 / (1 + exp(-n))
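A small sketch of this behaviour, using the toolbox function logsig directly:

n = -5:0.1:5;        % net input values from strongly negative to strongly positive
a = logsig(n);       % every output is squashed into the range 0 to 1
plot(n,a), grid on   % reproduces the S-shaped curve of Figure 4.2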


4.6 Give Training Images to Network


Supervised learning was used in this project. After the neural network structure was set, the most important task was to prepare the training examples. At the beginning of training, a number of face images were selected from each person that were well-aligned frontal views, each of which can clearly represent its owner; refer to Figure 4.3.

Figure 4.3: Training images
All training images were in intensity values. An intensity image is a data matrix whose values represent intensities within some range. MATLAB 6 stores an intensity image as a single matrix, with each element of the matrix corresponding to one image pixel. The elements in the intensity matrix represent various intensities, or gray levels, where the intensity 0 represents black and the intensity 1 represents full intensity, or white [7]. Figure 4.4 below depicts an intensity image.


Figure 4.4: Pixel values in an intensity image define gray levels
The statements below show the network reading the image data files. The symbols p through p4 represent the inputs, one for each face of the same person, and they are used to train the network. The symbol t represents the target for the network to be trained on. Each face consists of a 30 by 30 matrix in gray scale, or intensity values, with values between 0 and 1.
p = im2double(imread('face1.jpg'));
p1 = im2double(imread('face2.jpg'));
p2 = im2double(imread('face3.jpg'));
p3 = im2double(imread('face4.jpg'));
p4 = im2double(imread('face5.jpg'));
t = im2double(imread('face11.jpg'));
Here newff was used to create a two-layer log-sigmoid/log-sigmoid feedforward network. The network has an input (ranging from 0 to 1), followed by a layer of 30 log-sigmoid neurons, followed by a layer with 30 log-sigmoid neurons. Trainrp backpropagation was used.
net = newff(minmax(p),[30 30],{'logsig' 'logsig'},'trainrp');


Three training functions were implemented in this project, which means trainrp is also replaced by trainscg and traincgf to fulfil the requirement of this project.
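For example, keeping the same architecture, only the last argument of the newff call above changes (a sketch; the variable names are illustrative):

net_scg = newff(minmax(p),[30 30],{'logsig' 'logsig'},'trainscg');  % Scaled Conjugate Gradient
net_cgf = newff(minmax(p),[30 30],{'logsig' 'logsig'},'traincgf');  % Fletcher-Reeves update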

4.7 Setting Training Parameters


The following code sets the parameters of the training function being used; these parameters were set to make the network perform well.
net.performFcn = 'sse';
net.trainParam.show = 4;
net.trainParam.epochs = 200;
net.trainParam.goal = 4;
The first statement sets sse, a network performance function that measures performance according to the sum of squared errors. The second statement makes the training status display every 4 iterations of the algorithm. The third parameter determines when the training stops: training stops at 200 epochs. Typically, one epoch of training is defined as a single presentation of all input vectors to the network; the network is then updated according to the results of all those presentations. The last parameter is the error goal of the network. At this point the network is ready to train.
[net,tr] = train(net,p,t);
Training occurs until the maximum number of epochs occurs, the performance goal is met, or any other stopping condition of the function net.trainFcn occurs. The training record 'tr' contains information about the progress of the training.

4.8 Training with More Faces


Once the training parameters were set, the network was trained repeatedly one, five, ten and twenty times. The network is given more examples to train on because the more training, the more accurately the network recognizes; network recognition errors are corrected and the total performance is improved. To train with more examples, the maximum number of epochs and the error goal were increased to 400 and 10 respectively, reflecting that a higher error was expected because more faces were being trained.

net.trainParam.show = 4;
net.trainParam.epochs = 400;
net.trainParam.goal = 10;
After changing the training parameters, new face examples are given for training. T represents the targets of the network, and P represents the input faces for training. This time the network trains 4 times.
T = [t t t t];
for pass = 1:4
    P = [p1, p2, (p3 + randn(S1,S2)*0.2), (p4 + randn(S1,S2)*0.1)];
    [net,tr] = train(net,P,T);
end
The next chapter discusses how to test the neural network after training.

CHAPTER 5 NEURAL NETWORK TESTING


5.0 Network Simulation


Once the network has been trained, it is tested to determine its performance. If the performance is not satisfactory, the network is trained again until the desired performance is met. The function sim simulates a network; sim takes the network input p and the network object net, and returns the network output a [7]. The statements below show the simulation of the network for this project.
p9 = im2double(imread('face10.jpg'));
a = sim(net,p9);
Here p9 is the new input image of person A, the sim command is used to test the network performance, and 'a' is the network output. To show the network output 'a' versus the corresponding target 't' in an easier way, a few data points are taken for comparison. In this project, all data were 30 by 30 matrices. Using the diagonal command, 30 data points are selected from both the network output and the target; this means all data points are chosen along the same slope. The following statements define the method used.
an = diag(a);
tn = diag(t);
plot(tn,'-')
hold on
plot(an,':-')
The terms 'an' and 'tn' are the diagonal values of the network output 'a' and the target 't'. The results are then plotted in one graph using the hold on command, with different line styles for each.

5.1 Post-Training Analysis


The performance of a trained network can be measured to some extent by the errors on the training, validation and test sets, but it is often useful to investigate the network response in more detail. One option is to perform a regression analysis between the network response and the corresponding targets. The routine postreg was designed to perform this analysis. The following commands illustrate how a regression analysis can be performed on the network that was previously trained.
a = mean(a);
t = mean(t);
[m,b,r] = postreg(a,t);
This takes the mean value of the network output 'a' and the target 't', and then passes them to postreg. It returns three parameters. The first two, m and b, correspond to the slope and the y-intercept of the best linear regression relating targets to network outputs. For a perfect fit (outputs exactly equal to targets), the slope would be 1 and the y-intercept would be 0. The third variable returned by postreg is the correlation coefficient (R-value) between the outputs and targets. It is a measure of how well the variation in the output is explained by the targets. If this number is equal to 1, then there is perfect correlation between targets and outputs [7]. Figure 5.1 illustrates the graphical output provided by postreg. The network outputs are plotted versus the targets as open circles. The best linear fit is indicated by a dashed line. The perfect fit (output equal to targets) is indicated by the solid line. In this example, it is difficult to distinguish the best linear fit line from the perfect fit line, because the fit is so good [7].


Figure 5.1: Post-training analysis


CHAPTER 6 SOFTWARE IMPLEMENTATION


6.0 MATLAB Neural Network Toolbox
The MATLAB Neural Network Toolbox was used in this project to design the face recognition program. The name MATLAB stands for matrix laboratory. MATLAB was originally written to provide easy access to matrix software developed by the LINPACK and EISPACK projects. MATLAB 6 is a high-performance language for technical computing. It integrates computation, visualization and programming in an easy-to-use environment where problems and solutions are expressed in familiar mathematical notation. Typical uses include:

- Math and computation
- Algorithm development
- Modeling, simulation and prototyping
- Data analysis, exploration and visualization
- Scientific and engineering graphics
- Application development, including graphical user interface building

The Neural Network Toolbox is a powerful collection of MATLAB functions for the design, training and simulation of neural networks. It supports a wide range of network architectures with an unlimited number of processing elements and interconnections (up to operating system constraints). Supported architectures and training methods include: supervised training of feedforward networks using the perceptron learning rule, the Widrow-Hoff rule, several variations on backpropagation (including the fast Levenberg-Marquardt algorithm) and radial basis networks; supervised training of recurrent Elman networks; unsupervised training of associative networks including competitive and feature map layers; and Kohonen networks, self-organizing maps and learning vector quantization.


6.1 Visualization Technique


In this project two kinds of window exist. The first is the select training window, which allows the user to select which training function to use; there are three types of training function, trainrp, trainscg and traincgf. The second is the training window, which allows the user to select who will be trained, person A or person B, and the number of training passes, ranging over 1, 5, 10 and 20 times for each person.

6.2 Select Training Window


Figure 6.1 shows the select training window; this window offers the three types of training function used in this project.

Figure 6.1: Face Recognition select training window
The select training window was created using the MATLAB uicontrol command. The following statements create the push buttons and explain how the select training window was built. First, a uicontrol object is created by specifying a push button named Train (1).
Train1 = uicontrol('Style', 'pushbutton',...
    'String', 'Train (1)',...
    'Callback', 'trainrp');
The Callback, in this case trainrp, is the name of a subprogram that trains based on the trainrp function. When the user clicks on this push button, a Trainrp Window figure appears. Then a uicontrol object is created by specifying a push button named Train (2).
Train2 = uicontrol('Style', 'pushbutton',...
    'String', 'Train (2)',...
    'Callback', 'trainscg');
The Callback, in this case trainscg, is the name of a subprogram that trains based on the trainscg function. When the user clicks on this push button, a Trainscg Window figure appears. Next, a uicontrol object is created by specifying a push button named Train (3).
Train3 = uicontrol('Style', 'pushbutton',...
    'String', 'Train (3)',...
    'Callback', 'traincgf');
The Callback, in this case traincgf, is the name of a subprogram that trains based on the traincgf function. When the user clicks on this push button, a Traincgf Window figure appears. Finally, the user can quit or close the select training window by simply clicking on the "Close" push button.
Close = uicontrol('Style', 'pushbutton',...
    'String', 'Close',...
    'Callback', 'close(gcf)');

6.3 Training Window


Figure 6.2 shows the Trainrp Window, which contains the different persons and training times.


Figure 6.2: Trainrp window
Figure 6.2 illustrates the plotted graphs for person A trained 1 time. The graphs use the subplot command to place them all in one figure: the Train 1 Time Graph, the Network Performance, the Train 1 Time Image and the Recognition Percentage. Using this window, the user can test how the Neural Network performs in recognizing a face with various expressions.

6.4 Flow Chart


Figure 6.3: Flowchart
Figure 6.3 shows the flowchart of this project. It starts with the face recognition main window and then goes to the select training function window. This window offers a selection of three types of training function, Trainrp, Trainscg and Traincgf. The user can choose any one of the training functions; if Trainrp is chosen, a Trainrp window appears. This window offers two people to select for training, Person A and Person B. Each person has the same training options of one, five, ten and twenty times. After the user chooses the number of training times, the network starts training and the training result is plotted in the window. The plotted graphs show the performance of the network on that image by plotting the target output versus the simulation output and the image after simulation. Besides that, the correlation between the network outputs and the target is also shown.


CHAPTER 7 RESULTS
Neural networks have been applied in many other fields since the 1988 DARPA Neural Network Study [DARP88] report was written. Some applications mentioned in the literature include aerospace, automotive, banking, credit card activity checking, defense, electronics, entertainment, finance, industry, insurance, manufacturing, medicine, oil and gas, robotics, speech, securities, telecommunications and transportation. Face recognition has established itself as an important sub-branch of pattern recognition within the field of computer science. This project trained a neural network to implement a simple face recognition system based on a neural network. A standard Backpropagation feedforward network was designed for this project; it was a two-layer log-sigmoid/log-sigmoid network. The network input was a 30 by 30 matrix with input vector values ranging between 0 and 1. The hidden (first) layer has 30 neurons and the output (second) layer also has 30 neurons to train and learn the network.

Figure 7.1: Architecture of the network


Three types of training function were implemented in this project: Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf). This chapter explains in detail the results obtained from testing various experiments. All graphs below represent two people, person A and person B. The networks shown here were trained with two training functions, Trainrp and Traincgf.

Figure 7.2: Person A original image and graph
Figure 7.2 shows the target graph and the original image of person A. The Person A graph consists of 30 data points taken from the image using the diagonal method, with values between 0 and 1.


Figure 7.3: Person A train 1 time graphs
Figure 7.3 illustrates the Person A train 1 time graphs; the difference between the target line and the network output line was high. The train 1 time output image was blurred, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.94. The Trainrp training function was used.


Figure 7.4: Person A train 5 times graphs
Figure 7.4 illustrates the Person A train 5 times graphs; the target line and the network output line were closer. The train 5 times output image was clearer compared with the train 1 time image, and the sum squared error was around 24 at the beginning but was later corrected by the network. The recognition percentage was 0.984. The Trainrp training function was used.


Figure 7.5: Person A train 10 times graphs
Figure 7.5 illustrates the Person A train 10 times graphs; the target line and the network output line were more accurate. The train 10 times output image was clearer compared with the train 1 and 5 times images, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.99. The Trainrp training function was used.


Figure 7.6: Person A train 20 times graphs
Figure 7.6 illustrates the Person A train 20 times graphs; the target line and the network output line were more accurate. The train 20 times output image was clearer compared with the train 1, 5 and 10 times images, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.994. The Trainrp training function was used.

Figure 7.7: Original image (Person: A)



Figure 7.8: Reconstructed image (Training Function: Trainrp, Person: A)
Figure 7.9: Reconstructed image (Training Function: Trainrp, Person: A)
Figure 7.10: Reconstructed image (Training Function: Trainrp, Person: A)
Figure 7.11: Reconstructed image (Training Function: Trainrp, Person: A)

The results indicate that giving the network more training examples produces more accurate results. As shown above, when the network was given one training example the output image was blurred. Improved results can be obtained by increasing the number of training examples given to the network. Given five, ten and twenty training examples, the performance of the network was better.

Below is the data comparing the target values with the values obtained after training.


Target (Person A)
*#**** *#*4"* *#*862 *#"*4) *#2"47 *#22)2 *#"882 *#2627 *#"2)4 *#"842 *#27*6 *#2882 *#4"27 *#4724 *#4882 *#6*** *#6*** *#6*78 *#2244 *#"8*4 *#76*8 *#)2)4 *#7647 *#67*6 *#6824 *#744" *#6*** *#**** *#*488 *#"44"

Train 1
*#**44 *#**"* *#**44 *#"476 *#2*44 *#")26 *#2462 *#2444 *#2242 *#27)* *#2624 *#2)*4 *#42)4 *#4474 *#42*2 *#48"7 *#4762 *#4*24 *#4")) *#2)6) *#46)4 *#7*") *#7*44 *#4)62 *#47"7 *#4264 *#4")" *#44)4 *#224) *#**)2

Train 5
*#*2)7 *#*648 *#*268 *#"448 *#224) *#2424 *#22*2 *#227* *#2""8 *#2")2 *#24") *#4227 *#4272 *#4787 *#6248 *#72*4 *#7*4) *#442" *#42)4 *#46"7 *#4)86 *#7724 *#6))2 *#4246 *#447* *#628* *#44"6 *#262) *#**74 *#**2"

Train 10
*#**** *#*482 *#*7)2 *#"8*4 *#2"** *#2*82 *#2448 *#2282 *#"7"7 *#2"22 *#2772 *#2)"7 *#4"2" *#46*4 *#62"7 *#4)2* *#6"62 *#4442 *#288* *#2684 *#7*44 *#828) *#72)* *#7284 *#7*48 *#7*42 *#6228 *#486" *#*"22 *#**"*

Train 20
*#*2)2 *#*247 *#*66) *#""62 *#2"88 *#")*) *#24)" *#2247 *#2276 *#2864 *#2"82 *#4"82 *#4"8" *#4274 *#4)22 *#644* *#4)24 *#4622 *#2262 *#4**8 *#6462 *#82"6 *#764* *#6224 *#4))* *#7*8) *#464) *#""7" *#**** *#*472

Table 7#"@ Target value com!are +it training value

Below is the data comparing the mean of the target values with the mean of the training values.


Mean (Person A)
*#**76 *#"2*2 *#"248 *#"6*4 *#2)4) *#24)" *#262* *#2872 *#4*42 *#4**8 *#4*24 *#2)78 *#2782 *#2676 *#4**4 *#447" *#4*26 *#4*2" *#474) *#4446 *#4)") *#4)44 *#4242 *#4262 *#44)8 *#4742 *#46** *#4"2" *#44)* *#*"42

Train 1
*#*276 *#*62* *#*78" *#*8)8 *#"**" *#2486 *#2*44 *#2""8 *#28*2 *#26)" *#2647 *#28*2 *#267" *#2")7 *#2""4 *#2627 *#2748 *#2))8 *#2)4) *#2676 *#482* *#4847 *#4)*2 *#48"7 *#4822 *#4866 *#4*28 *#4646 *#4*44 *#*246

Train 5
*#**64 *#"244 *#""64 *#"486 *#2"68 *#242* *#2847 *#2)*8 *#4248 *#4247 *#422) *#42)6 *#42") *#4*8" *#428) *#4646 *#4"24 *#42*2 *#4)*2 *#478* *#478* *#48)7 *#4*6* *#4"4* *#4"6" *#4444 *#4472 *#44)) *#4862 *#**46

Train 10
*#**2) *#"42* *#"222 *#"87* *#2444 *#242" *#2867 *#272) *#4*67 *#4*46 *#4*44 *#4*62 *#4*6" *#2)6* *#4"** *#4488 *#482* *#4727 *#44)) *#4*44 *#4222 *#4*42 *#4222 *#44)4 *#4244 *#468" *#4447 *#4474 *#46"8 *#*242

Train 20
*#*"64 *#"228 *#"227 *#"62) *#2622 *#2482 *#28*8 *#2)*7 *#4"78 *#4"62 *#4*8* *#4"2* *#4""7 *#2826 *#2)7* *#44** *#4877 *#4)74 *#4647 *#4782 *#4)76 *#4")4 *#42"2 *#4246 *#4282 *#44)4 *#44*4 *#4442 *#472" *#*"44

Table 7.2: Mean of target values compared with mean training values
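The summary statistics used in Tables 7.1 to 7.8 can be reproduced with a few lines; a sketch with stand-in vectors (the report's actual profiles are 30-point rows like the ones tabulated above) is:

% Sketch: compare a target profile with the profile recovered after training,
% using the mean of each vector and the sum squared error between them.
target  = rand(1, 30);                                 % stand-in target profile
trained = max(0, min(1, target + 0.02*randn(1, 30)));  % stand-in trained output
fprintf('mean target  = %.4f\n', mean(target));
fprintf('mean trained = %.4f\n', mean(trained));
fprintf('SSE          = %.4f\n', sum((target - trained).^2));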


Figure 7#"2@ Person 0 original image and gra!

Figure 7#"2 re!resented t e target gra! and original image of !erson 0# Person 0 gra! consists of 2* data ta/en from t e image b8 using diagonal met od and its value +as bet+een * and "#


Figure 7#"2@ Person 0 train " time gra! s Figure 7#"2 illustrates Person 0 train " time gra! sA t e difference bet+een target line and net+or/ out!ut line +as ig # T e train " time out!ut image +as blurred and t e sum s?uared error +as ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#7)2# Trainr! training function +as used#


Figure 7#"4@ Person 0 train 4 times gra! s Figure 7#"4 illustrates Person 0 train 4 times gra! sA target line and net+or/ out!ut line +ere far# T e train 4 times out!ut image +as clearer com!are +it train " time image and t e sum s?uared error at t e beginning +as ig but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#824# Trainr! training function +as used#


Figure 7#"4@ Person 0 train "* times gra! s Figure 7#"4 illustrates Person 0 train "* times gra! sA target line and net+or/ out!ut line +as more accurate# T e train "* times out!ut image +as clearer com!are +it train " and 4 times image and t e sum s?uared error +as around ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#)22# Trainr! training function +as used#


Figure 7#"6@ Person 0 train 2* times gra! s Figure 7#"6 illustrates Person 0 train 2* times gra! sA target line and net+or/ out!ut line +as more accurate# T e train 2* times out!ut image +as clearer com!are +it train "A4 and "* times image and t e sum s?uared error +as around ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#)84# Trainr! training function +as used#


Figure 7#"7@ Original image Person@ 0 Figure 7#"8@ &econstructed image Training Function@ Trainr! Person@ 0 Figure 7#")@ &econstructed image Training Function@ Trainr! Person@ 0 Figure 7#2*@ &econstructed image Training Function@ Trainr! Person@ 0 Figure 7#2"@ &econstructed image Training Function@ Trainr! Person@ 0

The results indicate that giving the network more training examples produces more accurate results. As shown above, when the network was given only one training example the output image was blurred. Improved results can be obtained by increasing the number of training examples given to the network; with five, ten and twenty training examples the performance of the network was better.


Below is the data comparing the target values with the values obtained after training.

Target (Person B)
*#*627 *#**** *#4*** *#6224 *#6)4" *#44"2 *#4224 *#2244 *#2*4) *#2*2* *#2244 *#4744 *#644) *#6824 *#6667 *#72)4 *#6)4" *#4""8 *#2*78 *#2784 *#7764 *#67*6 *#7"27 *#6744 *#7*)8 *#4)6" *#6224 *#*44) *#*2)2 *#*47"

Train 1
*#*444 *#"*87 *#2628 *#4"64 *#4468 *#222) *#2"4) *#2428 *#2746 *#2846 *#2"82 *#4262 *#6)86 *#7644 *#7"*8 *#748* *#6444 *#4**7 *#42*2 *#4442 *#66"7 *#6)2" *#6)64 *#6)7) *#6848 *#48"7 *#)*2" *#4)*4 *#"4)* *#*"4*

Train 5
*#*"4" *#*""8 *#42** *#4))8 *#64*4 *#422) *#47)4 *#4864 *#4)42 *#44)4 *#4)8) *#4)84 *#6)"" *#744" *#8228 *#84*4 *#7444 *#62)7 *#4627 *#6*") *#64"7 *#6284 *#6442 *#6224 *#64"" *#62)2 *#848" *#4464 *#244) *#*246

Train 10
*#*444 *#*"62 *#422" *#6)48 *#6824 *#4"2* *#48"" *#4*4* *#2))2 *#2224 *#278* *#4)2* *#6248 *#6)2* *#686* *#8*)* *#667) *#424* *#"442 *#2827 *#6)42 *#7"22 *#6486 *#6728 *#628* *#6264 *#8*4* *#72*7 *#**** *#*"84

Train 20
*#*"48 *#"8"2 *#24") *#647* *#6876 *#4624 *#4742 *#4"74 *#2284 *#22*2 *#2428 *#4442 *#6*)" *#72"" *#7466 *#78"7 *#6446 *#2622 *#2228 *#262" *#678" *#7272 *#67") *#66"8 *#6*)7 *#6768 *#772" *#*"2) *#**** *#*274

Table 7.3: Target values compared with training values

Below is the data comparing the mean of the target values with the mean of the training values.


Mean (Person B)
*#*447 *#42"4 *#6*44 *#628) *#4822 *#4)"" *#46)* *#4624 *#4476 *#4248 *#4"42 *#4*72 *#4)7) *#47*4 *#46*" *#48"6 *#42*) *#44"8 *#4)"4 *#4822 *#4"47 *#4*48 *#4282 *#4827 *#6*64 *#622* *#6"4) *#62*7 *#487" *#*6"2

Train 1
*#*682 *#4*88 *#4427 *#4*27 *#4)68 *#466* *#4472 *#4744 *#464" *#474" *#4677 *#4644 *#4748 *#446* *#4"22 *#4)4) *#4*27 *#4"22 *#4"62 *#42"4 *#4482 *#446) *#4424 *#48"2 *#4872 *#4)4* *#6"24 *#64)* *#6"48 *#4242

Train 5
*#2824 *#2)4) *#46)4 *#448) *#48*) *#6*6) *#4726 *#6*4* *#486) *#4))4 *#4))2 *#4877 *#4)42 *#6*68 *#4)87 *#4866 *#477) *#48"2 *#4)42 *#6267 *#6*)" *#474" *#467" *#4842 *#4))* *#4)8" *#6*42 *#647* *#482) *#2822

Train 10
*#*82) *#4467 *#6464 *#64") *#4)72 *#6*62 *#6"44 *#478) *#4442 *#446" *#4244 *#4")7 *#4*44 *#4*26 *#4*22 *#48)2 *#4*88 *#4444 *#4)2) *#4764 *#4"*" *#4"88 *#42*4 *#4)22 *#4)84 *#6"4) *#66*7 *#6*42 *#47*) *#24)4

Train 20
*#*)26 *#26)4 *#48"2 *#622* *#47)4 *#4678 *#472) *#47*2 *#4444 *#4442 *#4242 *#4*** *#484) *#4822 *#47)* *#4"*2 *#4224 *#4247 *#4774 *#484" *#4*62 *#4*)7 *#42*2 *#4)44 *#4))" *#6228 *#67*7 *#6)2* *#46)) *#*"62

Table 7.4: Mean of target values compared with mean training values


Figure 7.22: Person A train 1 time graphs

Figure 7.22 illustrates the graphs for Person A after training one time; the difference between the target line and the network output line was high. The output image after one training pass was blurred, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.762. The Traincgf training function was used.


Figure 7.23: Person A train 5 times graphs

Figure 7.23 illustrates the graphs for Person A after training five times; the difference between the target line and the network output line was still high. The output image was still blurred, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.927. The Traincgf training function was used.


Figure 7#24@ Person A train "* times gra! s Figure 7#24 illustrates Person A train "* times gra! sA target line and net+or/ out!ut line +as more accurate# T e train "* times out!ut image +as clearer com!are +it train " and 4 times image and t e sum s?uared error +as ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#)4)# Traincgf training function +as used#


Figure 7.25: Person A train 20 times graphs

Figure 7.25 illustrates the graphs for Person A after training twenty times; the network output line followed the target line more accurately. The output image was clearer than those obtained after one, five and ten training passes, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.982. The Traincgf training function was used.

Figure 7.26: Original image (Person: A)

Figure 7.27: Reconstructed image (Training function: Traincgf, Person: A)

Figure 7.28: Reconstructed image (Training function: Traincgf, Person: A)

Figure 7.29: Reconstructed image (Training function: Traincgf, Person: A)

Figure 7.30: Reconstructed image (Training function: Traincgf, Person: A)

The results indicate that giving the network more training examples produces more accurate results. As shown above, when the network was given only one training example the output image was blurred. Improved results can be obtained by increasing the number of training examples given to the network for training.


Given five, ten and twenty training examples, the performance of the network was better.

Below is the data comparing the target values with the values obtained after training.

Table 7.5: Target values compared with training values

Target (Person A)
*#**** *#*4"* *#*862 *#"*4) *#2"47 *#22)2 *#"882 *#2627 *#"2)4 *#"842 *#27*6 *#2882 *#4"27 *#4724 *#4882 *#6*** *#6*** *#6*78 *#2244 *#"8*4 *#76*8 *#)2)4 *#7647 *#67*6 *#6824 *#744" *#6*** *#**** *#*488 *#"44"

Train 1
*#**6* *#*24) *#*24) *#*67) *#*27) *#"677 *#2*4* *#267" *#2)** *#4826 *#4*44 *#42*4 *#6)28 *#62"2 *#4*44 *#4*)6 *#4"7* *#27") *#22"* *#2266 *#4744 *#86*4 *#76** *#44"" *#4222 *#4482 *#2277 *#4*2" *#"224 *#*224

Train 5
*#**88 *#*2"4 *#*288 *#*87" *#"48" *#"784 *#2"2" *#2274 *#2728 *#24"8 *#4274 *#4*"8 *#44)" *#4))" *#462) *#4477 *#4242 *#428) *#27*) *#"62* *#2)82 *#6668 *#4)*2 *#4444 *#4782 *#2244 *#44** *#6"*8 *#2446 *#*"46

Train 10
*#*244 *#*744 *#*8"8 *#""*4 *#"244 *#"482 *#2268 *#2262 *#")28 *#2722 *#2248 *#4"67 *#42)2 *#464* *#44*" *#48*8 *#44)" *#4276 *#2478 *#262* *#76)4 *#842* *#7422 *#46"* *#6"*" *#6776 *#6222 *#448* *#**") *#*422

Train 20
*#*24" *#*2"8 *#*44) *#"224 *#2"28 *#2"*7 *#22** *#22"4 *#"744 *#24** *#2"64 *#4*2) *#4862 *#4447 *#64*" *#6""* *#4468 *#6*44 *#24)6 *#2)77 *#6768 *#8427 *#7742 *#6277 *#66** *#6222 *#6*42 *#*224 *#**22 *#*447


Below is the data comparing the mean of the target values with the mean of the training values.

Mean (Person A)
*#**76 *#"2*2 *#"248 *#"6*4 *#2)4) *#24)" *#262* *#2872 *#4*42 *#4**8 *#4*24 *#2)78 *#2782 *#2676 *#4**4 *#447" *#4*26 *#4*2" *#474) *#4446 *#4)") *#4)44 *#4242 *#4262 *#44)8 *#4742 *#46** *#4"2" *#44)* *#*"42

Train 1
*#"442 *#""6" *#*8)* *#*8*7 *#*786 *#2*2) *#2486 *#2624 *#4728 *#4667 *#4226 *#4246 *#4728 *#4286 *#2642 *#2)"6 *#2""" *#2248 *#426) *#4"62 *#4882 *#482) *#4**4 *#4*7) *#4*)8 *#4"2* *#4*82 *#4*66 *#22") *#2676

Train 5
*#*668 *#*)"" *#*)24 *#"*42 *#"628 *#2766 *#28)4 *#4*"6 *#4*62 *#4"62 *#4224 *#424" *#4288 *#4"6* *#4*42 *#2724 *#2)2* *#4244 *#4474 *#4)42 *#464" *#4)*2 *#4*"4 *#4842 *#4)64 *#4742 *#48)8 *#4*)7 *#447* *#"*7"

Train 10
*#*488 *#"28) *#"488 *#"6"* *#"646 *#278* *#2678 *#2724 *#4222 *#4246 *#4"2) *#4226 *#44*2 *#2))2 *#287* *#2622 *#4464 *#4824 *#48*" *#474) *#48)4 *#4)84 *#4)22 *#4882 *#4)44 *#4478 *#4682 *#4624 *#4**8 *#**))

Train 20
*#*62* *#""27 *#"2"2 *#"487 *#22** *#22") *#2424 *#2484 *#4*66 *#4*6" *#4*42 *#4*)* *#4*78 *#2))2 *#42)4 *#46)2 *#48)7 *#4264 *#4)47 *#422) *#4226 *#4274 *#4244 *#44)2 *#442* *#4676 *#4467 *#4224 *#4444 *#*424

Table 7.6: Mean of target values compared with mean training values


Figure 7#2"@ Person 0 train " time gra! s Figure 7#2" illustrates Person 0 train " time gra! sA t e difference bet+een target line and net+or/ out!ut line +as ig # T e train " time out!ut image +as blurred and t e sum s?uared error +as ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#8)2# Traincgf training function +as used#


Figure 7.32: Person B train 5 times graphs

Figure 7.32 illustrates the graphs for Person B after training five times; the difference between the target line and the network output line was still high. The output image was still blurred, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.924. The Traincgf training function was used.


Figure 7#22@ Person 0 train "* times gra! s Figure 7#22 illustrates Person A train "* times gra! sA target line and net+or/ out!ut line +as more accurate# T e train "* times out!ut image +as clearer com!are +it train " and 4 times image and t e sum s?uared error +as ig at t e beginning but later it been corrected b8 t e net+or/# &ecognition !ercentage +as *#)4)# Traincgf training function +as used#


Figure 7.34: Person B train 20 times graphs

Figure 7.34 illustrates the graphs for Person B after training twenty times; the network output line followed the target line more accurately. The output image was clearer than those obtained after one, five and ten training passes, and the sum squared error was high at the beginning but was later corrected by the network. The recognition percentage was 0.96. The Traincgf training function was used.


Figure 7.35: Original image (Person: B)

Figure 7.36: Reconstructed image (Training function: Traincgf, Person: B)
Figure 7.37: Reconstructed image (Training function: Traincgf, Person: B)
Figure 7.38: Reconstructed image (Training function: Traincgf, Person: B)
Figure 7.39: Reconstructed image (Training function: Traincgf, Person: B)

The results indicate that giving the network more training examples produces more accurate results. As shown above, when the network was given only one training example the output image was blurred. Improved results can be obtained by increasing the number of training examples given to the network; with five, ten and twenty training examples the performance of the network was better.

Below is the data comparing the target values with the values obtained after training.

Target (Person B)
*#*627 *#**** *#4*** *#6224 *#6)4" *#44"2 *#4224 *#2244 *#2*4) *#2*2* *#2244 *#4744 *#644) *#6824 *#6667 *#72)4 *#6)4" *#4""8 *#2*78 *#2784 *#7764 *#67*6 *#7"27 *#6744 *#7*)8 *#4)6" *#6224 *#*44) *#*2)2 *#*47"

Train 1
*#*444 *#*784 *#4422 *#6"4* *#4262 *#4*"8 *#4*4" *#248) *#"62* *#24") *#2)44 *#48*7 *#7*)2 *#7424 *#7282 *#72)4 *#6)"" *#6847 *#6*46 *#6*68 *#64)* *#66*4 *#4468 *#48"* *#4744 *#4*46 *#78"* *#46*2 *#"4*2 *#**82

Train 5
*#*"4* *#2764 *#6262 *#4424 *#442" *#4784 *#442* *#44") *#22"2 *#26"" *#4"87 *#48)8 *#776" *#8"7) *#7867 *#744* *#6447 *#44)7 *#442* *#6*)) *#724" *#7*24 *#7*72 *#684" *#6472 *#6424 *#77"2 *#42*) *#"7)4 *#**44

Train 10
*#*284 *#*"4* *#2284 *#72)7 *#7"24 *#4"*4 *#426" *#2)6" *#27*8 *#2"22 *#26"4 *#4"84 *#6)*2 *#74)7 *#7627 *#84** *#7224 *#4*4* *#227) *#28*2 *#7*4) *#6844 *#6448 *#64*) *#6"77 *#7*82 *#8842 *#"228 *#"82" *#*2*8

Train 20
*#*2*" *#""72 *#48)4 *#64"2 *#6848 *#444) *#4762 *#2)26 *#2"62 *#2))2 *#22"" *#4482 *#6424 *#6)28 *#7462 *#8"44 *#6824 *#2474 *#26"6 *#4*4* *#64)8 *#7"2) *#6784 *#682* *#6287 *#644* *#7246 *#*)2" *#""2) *#*272

Table 7.7: Target values compared with training values

Below is the data comparing the mean of the target values with the mean of the training values.


Mean (Person B)
*#*447 *#42"4 *#6*44 *#628) *#4822 *#4)"" *#46)* *#4624 *#4476 *#4248 *#4"42 *#4*72 *#4)7) *#47*4 *#46*" *#48"6 *#42*) *#44"8 *#4)"4 *#4822 *#4"47 *#4*48 *#4282 *#4827 *#6*64 *#622* *#6"4) *#62*7 *#487" *#*6"2

Train 1
*#*724 *#4848 *#4)"* *#4477 *#44)2 *#4274 *#444" *#482) *#464* *#4684 *#444* *#442) *#468" *#46)* *#4""8 *#44)) *#4)4* *#47)) *#4282 *#4)"4 *#4442 *#46)4 *#47*2 *#4"74 *#4272 *#4287 *#622" *#6668 *#622) *#""4*

Train 5
*#22"8 *#478" *#48*2 *#4442 *#44*6 *#4874 *#4)*4 *#4827 *#476* *#476* *#4744 *#4784 *#4842 *#4722 *#4446 *#4242 *#4227 *#428) *#426" *#428) *#4642 *#4674 *#46)4 *#4)*2 *#4)"" *#4)"2 *#6*"2 *#62"6 *#6*77 *#2222

Train 10
*#*427 *#277) *#6"44 *#6776 *#6244 *#46*) *#4662 *#42)" *#4272 *#4284 *#4227 *#4227 *#422) *#422" *#447" *#4*62 *#4274 *#44"* *#4))) *#46)2 *#4222 *#4462 *#4424 *#6*76 *#62*2 *#64)4 *#6464 *#6487 *#44"" *#2*7*

Train 20
*#2222 *#47"2 *#64)2 *#6*26 *#4824 *#4846 *#4762 *#4477 *#44** *#44)* *#422" *#4*)4 *#4*78 *#4)74 *#47)* *#4*74 *#4276 *#42)4 *#48)7 *#4)4) *#4"28 *#4"28 *#4227 *#4)88 *#6""4 *#6242 *#6*)4 *#666) *#2884 *#"6*7

Table 7.8: Mean of target values compared with mean training values

Comparing training times (Person A)


Table 7.9: Training times taken by each training function

Train network for     Trainrp               Traincgf
1 time                3.1%1$ seconds        3.'7$% seconds
5 times               $.'337 seconds        7.$'7% seconds
10 times              1).)$71 seconds       )7.1$1: seconds
20 times              14.%$4$ seconds       33.'1$9 seconds

Comparing training times (Person B)


Table 7#"*@ Training times ta/en b8 training function 9rain network for 1 time $ times 1: times ): times 9rainin, 6unction 9rainrp 3.)%') seconds 4.7':3 seconds 9.%)'' seconds 1$.1:74 seconds 9rainc,f 3.1))4 seconds $.)7)7 seconds 1$.''%9 seconds )).19)% seconds

The results indicate that giving the network more training examples produces more accurate results. As shown in the results, when the network was given only one training example the network output graph did not match the target graph; besides that, the output image was blurred.


The correlation between the network output images and the corresponding target images was also low. Improved results can be obtained by increasing the number of training examples given to the network; with five, ten and twenty training examples the performance of the network was better. Furthermore, the results are affected if the training function is not chosen properly. As shown in the results, the Trainrp training function produced a high correlation between the network outputs and the targets. The results also indicate that training with Trainrp is much faster than training with Traincgf. Table 7.9 and Table 7.10 show the time consumed for training with the Trainrp and Traincgf training functions.
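A sketch of how such a timing comparison could be reproduced, assuming the same newff setup as earlier and interpreting "train n times" as n repeated calls to train (an assumption; the report does not state this), is:

% Sketch (assumptions noted above): time trainrp against traincgf for an
% increasing number of training passes, as in Tables 7.9 and 7.10.
P = rand(900, 20);  T = P;                  % stand-in face data
passes = [1 5 10 20];
fcns   = {'trainrp', 'traincgf'};
for f = 1:numel(fcns)
    for p = passes
        net = newff(minmax(P), [30 size(P,1)], {'logsig', 'logsig'}, fcns{f});
        tic;
        for k = 1:p
            net = train(net, P, T);         % one training pass per call
        end
        fprintf('%-9s  %2d pass(es): %.4f seconds\n', fcns{f}, p, toc);
    end
end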

CHAPTER 8 CONCLUSION AND RECOMMENDATION


8.0 Conclusion
Before accepting this project, I knew nothing about neural networks or what they can and cannot do, but with the help of Internet technology I was able to find a great deal of information about them.


Face recognition is a very popular problem that researchers have been investigating intensively for the last two decades. In this project a backpropagation network was used to demonstrate the neural network technique. Based on the results in Chapter 7, it can be concluded that the neural network I trained did a good job of recognizing the faces of two people with various expressions. All face images of the two people were represented as two-dimensional intensity values. Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf) are suitable for pattern recognition; this can be determined from the training time consumed and the recognition percentage obtained when using these training functions. MATLAB was used in this project; it is powerful engineering software for designing neural network programs. Finally, I have learned a lot about neural networks by doing this project, and I hope this knowledge will benefit me in the future.

8#" &ecommendation
1. Use images of more people

This project used images of only two people, with various facial expressions, but it can be upgraded by increasing the number of people's faces. Training the neural network on more faces would allow it to recognize more people.

2. Increase the size of the image

This project used 30 by 30 pixel images. This size is small, and it is hard to see the differences in the network output with the naked eye. Increasing the image size would produce better results, but the network would be slower at processing the output.

3. Use more training functions


Three types of training functions were used in this project: Resilient Backpropagation (trainrp), Scaled Conjugate Gradient (trainscg) and Fletcher-Reeves Update (traincgf). This face recognition program could be trained with more training functions so that the differences between them can be determined (a sketch follows below).

4. Increase the hidden layer

A two-layer log-sigmoid/log-sigmoid network was designed for this project; the neural network adjusts its weights to try to match its outputs to the target values. Increasing the hidden layer size may allow the neurons to learn more accurately.
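As a rough sketch of recommendations 3 and 4, assuming the same legacy newff interface used above (the extra training functions and the 60-neuron hidden layer are illustrative choices, not part of the original project):

% Sketch: retrain the reconstruction task with additional training functions
% and a larger hidden layer, then compare the resulting errors.
P = rand(900, 10);  T = P;                  % stand-in face data
extraFcns = {'trainrp', 'traincgf', 'trainscg', 'traingdx', 'trainoss'};
for k = 1:numel(extraFcns)
    % 60 hidden neurons instead of 30 (recommendation 4, illustrative value)
    net = newff(minmax(P), [60 size(P,1)], {'logsig', 'logsig'}, extraFcns{k});
    net.trainParam.epochs = 500;
    net = train(net, P, T);
    Y = sim(net, P);
    fprintf('%-9s  mean SSE per example = %.4f\n', extraFcns{k}, mean(sum((T - Y).^2)));
end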
