

International Journal Of Engineering, Education And Technology (ARDIJEET) ISSN 2320-883X, VOLUME 3, ISSUE 2, 01/04/2015


Dokhe Anita Dattatraya¹
¹M.E. Student, SRES College of Engineering, Kopargaon

Abstract: For increasing or reducing the speed of a ceiling fan, or for switching lights ON or OFF, we usually need to get up and move to the switch board. For people who are physically handicapped, this is difficult to do. This system is targeted mainly at helping people who are physically handicapped and unable to operate the switch board for controlling the fan speed and turning the lights ON or OFF. It is also possible to control such electronic equipment using remote controls. However, the variety of physical shapes and functional commands that each remote control features raises numerous problems: the difficulty of locating the required remote control, confusion over the button layout, the replacement issue, and so on. The proposed model for AC motor speed control of a ceiling fan based on image processing is an innovative user interface that resolves the complications of using numerous remote controls for domestic appliances. Based on one unified set of hand gestures, the system interprets the user's hand gestures as pre-defined commands to control one or many devices simultaneously. The experimental results are very encouraging, as the system produces real-time responses and highly accurate recognition of various gestures.

Keywords: Gesture Recognition, Image Processing Techniques, PIC Microcontroller, Firing Angle.

1. Introduction

Generally, hand gesture recognition technology is implemented using data gloves or colour pins, which leads to additional cost and lack of availability among the majority of users; such devices also require maintenance. A webcam, on the other hand, is an easily available device today. In our project, we implement a hand gesture recognizer, shown in Fig. 1, which is capable of detecting a moving hand and its gesture in webcam frames. In the future, human beings, who until now have communicated with machines through regulators and remote controls, may prefer the more natural and comfortable option of using their bare hands to interact with machines without any mediator. Given the considerations above, recognition of hand gestures and postures is a satisfactory first step towards such solutions, replacing regulators, mechanical switches and remote controls as input devices. Only low processing power is required to process the gestures, which makes the approach useful for simple consumer control devices; it is also very robust to lighting variations and operates in real time [2].

Fig. 1 Hand Gesture Recognition System

In an image processing based hand gesture recognition system, the movement of the hand is recorded by one or more video cameras. The input video is decomposed into a set of features, taking individual frames into account. The hands are isolated from other body parts as well as from other background objects. The isolated hands are then recognized as different postures. Since gestures are nothing but a sequence of hand postures connected by continuous motion, a recognizer can be trained against a possible grammar. With this, hand gestures can be specified as being built up from a group of hand postures in various ways of composition, just as phrases are built up from words. The recognized gestures can be used to drive a variety of applications. The image processing based gesture recognition process can be decomposed into three main stages [3]:

1. Single-frame preprocessing and segmentation,
2. Hand feature extraction and pose classification, and
3. Hand tracking and pose sequence (gesture) recognition.
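The three-stage decomposition above can be sketched as a minimal pipeline. The function names and the toy per-stage logic below are illustrative assumptions, not the paper's implementation:

```python
# Minimal sketch of the three-stage gesture recognition pipeline.
# All names and the toy per-stage logic are illustrative assumptions.

def preprocess_and_segment(frame):
    """Stage 1: keep only 'skin-like' pixels (here: values above a toy threshold)."""
    return [[1 if px > 127 else 0 for px in row] for row in frame]

def extract_features_and_classify_pose(mask):
    """Stage 2: reduce the binary mask to a feature (here: pixel count) and name a pose."""
    area = sum(sum(row) for row in mask)
    return "open_hand" if area >= 3 else "fist"

def recognize_gesture(pose_sequence):
    """Stage 3: map a sequence of poses to a gesture, like words forming a phrase."""
    if pose_sequence == ["fist", "open_hand"]:
        return "SPEED_UP"
    return "UNKNOWN"

frames = [
    [[0, 30], [10, 20]],      # dark frame -> few skin pixels -> "fist"
    [[200, 210], [190, 0]],   # bright frame -> many skin pixels -> "open_hand"
]
poses = [extract_features_and_classify_pose(preprocess_and_segment(f)) for f in frames]
print(poses)                      # ['fist', 'open_hand']
print(recognize_gesture(poses))   # SPEED_UP
```

A real system replaces each toy stage with the techniques of Section 3, but the staged structure is the same.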


2. System Overview

The system uses a web camera, a gesture processing unit (PC), and a hardware interface comprising a microcontroller, a firing angle control system and a TRIAC for controlling the fan speed, as shown in Fig. 5. The webcam is used to capture the hand gestures, which are then registered, normalized and feature-extracted (as in Fig. 6) for eventual classification to actuate the devices. The setup of the basic components is shown in the figure below. MATLAB is used throughout the project for real-time data processing and classification. Once the user's hand gesture matches a pre-defined command, that command is issued to the corresponding control device. If an unknown gesture is issued, the system rejects it and notifies the user. Based on the command sensed, the fan speed is controlled by adjusting the firing angle of the TRIAC.

Fig. 2 Detailed System Overview

The block diagram above shows the system architecture of the proposed hand gesture recognition system for varying the speed of the AC motor. The details of the individual steps are described below.

3. Implementation Details for Gesture Recognition

Fig. 3 Detailed Gesture Recognition System

3.1 Gesture Registration
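The matching of a recognized gesture against pre-defined commands, with rejection of unknown gestures, can be sketched as a simple dispatch table. The gesture labels and command strings below are assumptions for illustration, not the paper's actual command set:

```python
# Sketch of gesture-to-command dispatch with rejection of unknown gestures.
# Gesture labels and command strings are illustrative assumptions.

COMMANDS = {
    "one_finger":  "FAN_SPEED_LOW",
    "two_fingers": "FAN_SPEED_MEDIUM",
    "open_palm":   "FAN_SPEED_HIGH",
    "fist":        "LIGHTS_TOGGLE",
}

def dispatch(gesture):
    """Return the command for a recognized gesture, or reject it."""
    command = COMMANDS.get(gesture)
    if command is None:
        return "REJECTED: unknown gesture"   # system notifies the user
    return command                            # forwarded to the microcontroller

print(dispatch("two_fingers"))   # FAN_SPEED_MEDIUM
print(dispatch("thumbs_up"))     # REJECTED: unknown gesture
```

In the actual system the returned command would be sent serially to the PIC microcontroller rather than printed.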

The captured hand gestures from a real-time video stream need to be processed before they can be interpreted by a computer [11]. It is extremely important that the captured image is registered as a hand gesture using skin segmentation after removing the background of the image. The skin segmentation technique used in this research involves converting the image from RGB format to YCbCr format. A threshold filter is then applied to remove non-skin components. The major advantage of this approach is that the influence of luminosity can be removed during the conversion process. This makes the segmentation less dependent on the lighting condition, which has always been a critical obstacle for image recognition. The threshold values were obtained using our own data set.

To find the characteristics of the pixels representing the skin region, a random image is selected out of many such images. The pixel properties are categorized to determine the threshold limits for the filter. The converted picture in YCbCr format is then viewed in the imtool of MATLAB so that every pixel and its associated values, such as its x and y coordinates and intensity, can be determined accurately. A number of sample points that represent skin patches and non-skin patches are obtained. According to Fig. 4, both the skin pixels and the non-skin pixels have luminosity values spreading over the full range; that is, the Y component of a skin patch is from 110 to 165, whereas that of a non-skin patch is from 16 to 208. This property of the Y component implies that we are not filtering on the luminance of the image, but on the two remaining components, Cb and Cr.

Fig. 4 Luminosity values of skin and non-skin pixels

For better visualization, the Cb and Cr values of both skin and non-skin patches are plotted in a graph to find the region in which they are likely to fall. As can be seen from Fig. 4, the skin pixels clearly distinguish themselves from the non-skin ones. The thresholds of the filter can hence be determined easily. A similar approach is repeated on another image under fluorescent lighting conditions. The threshold values, as expected, are quite different, as depicted in Table 1.

Table 1: Threshold values for incandescent and fluorescent lighting conditions

A number of images are used to evaluate the effectiveness of the classification system. Both the original image and the skin-segmented image are used to observe and verify the accuracy of the segmentation algorithm. Images with low lighting conditions are also tested. It is quite apparent from Fig. 5 that there are always a number of noisy spots in the filtered images, regardless of the lighting condition. This distortion, as expected, becomes more pronounced in low lighting conditions. As a result, the skin-segmented image is noisy and distorted and is likely to result in incorrect recognition at the subsequent stages. These distortions, however, can be removed during the gesture normalization stage.

Fig. 5 Noisy spots in the filtered images

3.2 Gesture normalization

Gesture normalization is done by the well-known morphological filtering technique, erosion combined with dilation. The output of this stage is a smooth region of the hand figure, which is stored in a logical bitmap image as shown in Fig. 6.

Fig. 6 Smooth region output of the morphological filtering technique

The experiments are carried out on an average computer with a 1.6 GHz processor and 256 MB RAM. This is mainly to determine whether the system can operate from a set-top box with limited processing power. The observed execution time of 0.2 s is acceptable, as it consumes only one-fifth of the available processing time (1 s). Shorter execution times can be obtained on a computer with a better specification. Above all, when the system is implemented in a single integrated circuit (IC), hardware-based processing will be swifter.
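The skin segmentation of Section 3.1 and the erosion-plus-dilation normalization of Section 3.2 can be sketched as follows. The RGB-to-YCbCr conversion uses the standard JPEG/BT.601 full-range formulas; the Cb/Cr thresholds shown are commonly cited literature values, not the values of Table 1, and should be read as placeholder assumptions:

```python
# Sketch of skin segmentation (RGB -> YCbCr + Cb/Cr threshold) and the
# erosion-followed-by-dilation normalization step.  The conversion uses the
# standard JPEG/BT.601 full-range formulas; the Cb/Cr thresholds are common
# literature values, NOT the paper's Table 1 values (an assumption).

def rgb_to_ycbcr(r, g, b):
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Threshold only Cb and Cr, ignoring the luminance Y as in Section 3.1."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]

def erode(mask):
    """A pixel survives only if it and its 4-neighbours are all set."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if (mask[i][j] and mask[i-1][j] and mask[i+1][j]
                    and mask[i][j-1] and mask[i][j+1]):
                out[i][j] = 1
    return out

def dilate(mask):
    """A pixel is set if it or any 4-neighbour is set."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            nbrs = [mask[i][j]]
            if i > 0:     nbrs.append(mask[i-1][j])
            if i < h - 1: nbrs.append(mask[i+1][j])
            if j > 0:     nbrs.append(mask[i][j-1])
            if j < w - 1: nbrs.append(mask[i][j+1])
            out[i][j] = 1 if any(nbrs) else 0
    return out

print(is_skin(200, 140, 120))   # True  (a skin-like tone)
print(is_skin(0, 0, 255))       # False (pure blue)
```

Applying `erode` then `dilate` (a morphological opening) removes isolated noisy spots while largely restoring the shape of the hand blob, which is the normalization behaviour described above.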

3.3 Skin segmentation testing

This test was mainly aimed at evaluating the performance of the skin segmentation and normalization modules of the control system. A number of hand gesture images were taken, and the skin segmentation and the subsequent normalization are shown in Fig. 7. As seen from Fig. 7, the filter successfully segmented the skin regions out of all the tested images. It was also noticeable that the shadows of the hand and the body did not have any effect on the filtering process. The remaining noise and unfilled pixels were removed by the normalization filter, which resulted in a smooth and clear hand region.

Fig. 7 Hand gesture images, their skin segmentation and the subsequent normalization

Non-uniform background images were also tested, and some of the results are shown in Fig. 8. It was quite apparent from Fig. 8 that non-uniform background images produced many scattered patterns and noisy spots during the skin segmentation process. In particular, in the first two images, taken under incandescent light, the hand was segmented along with some parts of the guitar and the edges of the wardrobe, forming a fairly distorted result. However, after being passed through the normalization filter, the resultant images consisted only of the largest region found in the filtered images, in this case effectively the hand region. This also implies that a larger region of skin-like objects might result in incorrect segmentation and should be carefully considered. The last two images, taken under fluorescent light, on the other hand, showed significantly less noise than the first two. This can be explained in terms of the difference in physical characteristics of the two light sources. In particular, incandescent light generates a yellowish glow which modifies the appearance of the captured objects; the resultant effect is that an object may be recognized as a skin region because its color has been modified. Fluorescent light, conversely, generates white light and thus retains the original color of the object. This results in fewer skin-like regions in the image, and the hand can be extracted more accurately.

Fig. 8 Results for the non-uniform background images

In conclusion, the performance of the skin segmentation and normalization filters was firmly robust against variance in backgrounds and lighting conditions. Nevertheless, the user should consider the negative effect created by skin-like regions in the working environment under incandescent light.

3.4 Feature extraction

It is not too difficult to realize that effective real-time classification cannot be achieved using approaches such as template matching. Template matching itself is very much prone to error when a user cannot exactly reproduce a hand gesture that is already stored in the library. It also fails because of variance in scaling, as the distance to the camera may produce a scaled version of the gesture. The gesture variations due to rotation, scaling and translation can be circumvented using a set of features that are invariant to these operations. Moment invariants offer a set of features that encapsulate these invariance properties.
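As a sketch of such invariant features, the first two Hu moment invariants can be computed from a binary hand mask; the formulas are standard, and treating them as the feature set here is an illustrative assumption in the spirit of the moment invariants of [11]. They are exactly unchanged by translation and approximately unchanged by scaling:

```python
# Sketch: the first two Hu moment invariants of a binary mask.
# Standard formulas; using them as the feature set here is an illustrative
# assumption consistent with the moment invariants of [11].

def hu12(mask):
    pts = [(i, j) for i, row in enumerate(mask)
                  for j, v in enumerate(row) if v]
    n = len(pts)
    xc = sum(i for i, _ in pts) / n          # centroid: removes translation
    yc = sum(j for _, j in pts) / n
    mu20 = sum((i - xc) ** 2 for i, _ in pts)
    mu02 = sum((j - yc) ** 2 for _, j in pts)
    mu11 = sum((i - xc) * (j - yc) for i, j in pts)
    # normalized central moments (order 2): eta = mu / mu00^2, removes scale
    eta20, eta02, eta11 = (m / n ** 2 for m in (mu20, mu02, mu11))
    phi1 = eta20 + eta02
    phi2 = (eta20 - eta02) ** 2 + 4 * eta11 ** 2
    return phi1, phi2

def square(n, offset=0, size=60):
    """A solid n x n square in a size x size binary image (test shape)."""
    m = [[0] * size for _ in range(size)]
    for i in range(offset, offset + n):
        for j in range(offset, offset + n):
            m[i][j] = 1
    return m

a = hu12(square(20))             # 20x20 solid square
b = hu12(square(20, offset=30))  # same square, translated
c = hu12(square(40))             # same shape at twice the scale
print(abs(a[0] - b[0]) < 1e-12)          # True: exact translation invariance
print(abs(a[0] - c[0]) / a[0] < 0.01)    # True: approximate scale invariance
```

Rotation invariance also holds for the full Hu set, which is what makes these features preferable to raw template matching for hands seen at varying distances and orientations.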

4. Implementation Details for AC Motor Speed Control

The voltage regulator supplies the required voltage for the circuit; in this case, a 5 V supply is required. According to the gesture recognized by the gesture recognition system, a signal is fed to the PIC microcontroller to actuate the respective task. The resulting signal is then used to generate pulses with a varying time period, producing different trigger signals for the TRIAC with a variable firing angle. With the firing angle controlled, the AC motor speed can be controlled at three different speeds, viz. Low, Medium and High.

4.1 Phase Control Method of a PSC Induction Motor

Basically, the phase control method is one of the simplest ways to control a PSC induction motor. The TRIAC acts as a single switch in series with the AC line. The on-time duration of the TRIAC controls the AC voltage supplied to the motor by chopping the AC waveform, which causes the power to shut off during a portion of the AC cycle. Fig. 9 shows the schematic diagram of a TRIAC-controlled drive. The motor in the figure is a PSC motor, which has two windings and a capacitor for phase shift.
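The effect of the firing angle on the voltage applied to the motor can be illustrated with the standard RMS expression for a symmetrically phase-chopped sine wave; the particular firing angles mapped to the Low/Medium/High speeds below are assumptions for illustration, not the paper's values:

```python
# Sketch: RMS output of a TRIAC phase-chopped sine wave as a fraction of the
# full supply RMS: V_rms/V_supply = sqrt((pi - a + sin(2a)/2) / pi) for firing
# angle a (radians), applied symmetrically to both half-cycles.  The firing
# angles mapped to the three speeds are illustrative assumptions.
import math

def rms_fraction(alpha):
    # max(0, ...) guards against a tiny negative rounding residue at a = pi
    return math.sqrt(max(0.0, (math.pi - alpha + math.sin(2 * alpha) / 2) / math.pi))

SPEED_FIRING_ANGLE = {          # assumed values for illustration
    "HIGH":   math.radians(30),
    "MEDIUM": math.radians(90),
    "LOW":    math.radians(120),
}

for speed, alpha in SPEED_FIRING_ANGLE.items():
    print(f"{speed:6s} firing angle {math.degrees(alpha):5.1f} deg -> "
          f"{100 * rms_fraction(alpha):5.1f}% of supply RMS")
```

A firing angle of 0 passes the full waveform (100% of supply RMS) and 180 degrees passes nothing, so delaying the trigger monotonically reduces the RMS voltage and hence the motor speed, as described above.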

Fig. 9 AC Chopper/TRIAC Control of a 1-Phase AC Motor

Generally, the AC motor acts as a low-pass filter, causing the resulting current waveform to be sinusoidal at the same operating frequency with a slight lag. The slip of the motor is increased by the lower root mean square (RMS) voltage created by the chopper. The slip is defined as the difference between the synchronous speed and the base speed. It is expressed as a percentage and can be determined with the following equation:

Percentage Slip = [(Ns − Nb)/Ns] × 100

where Ns is the synchronous speed in revolutions per minute and Nb is the base speed in RPM. As a result, the motor speed is reduced as the slip of the motor increases. Chopping starves the motor of power and the motor slows down; eventually the motor stops when there is not enough energy to maintain its rotation. The speed of the motor can thus be controlled through the conducting period of the TRIAC, where the conducting period depends on the firing angle set by the circuit at the gate of the TRIAC.

Fig. 10 Line Voltage vs. Motor Voltage

5. Experimental Results

A simple web camera was used for capturing the hand gestures. The background used here was both uniform and non-uniform; with the non-uniform background, erroneous results were encountered. The image displayed the gesture as shown in Fig. 11. According to the gesture performed, the software algorithm produced the expected gesture recognition. This result was forwarded serially to the PIC microcontroller, and the motor speed was varied as required, viz. low speed, medium speed or high speed.

Fig. 11 Hand Gesture Set for each respective task
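As a concrete check of the slip formula given in Section 4.1, using motor ratings assumed purely for illustration (a 4-pole, 50 Hz induction motor with Ns = 120f/P):

```python
# Slip calculation for an assumed 4-pole, 50 Hz induction motor.
# Ns = 120 * f / P (synchronous speed); slip % = (Ns - Nb) / Ns * 100.

def synchronous_speed(freq_hz, poles):
    return 120 * freq_hz / poles

def percentage_slip(ns, nb):
    return (ns - nb) / ns * 100

ns = synchronous_speed(50, 4)      # 1500.0 RPM
print(ns)                          # 1500.0
print(percentage_slip(ns, 1440))   # 4.0
```

A base speed of 1440 RPM against a 1500 RPM synchronous speed thus corresponds to 4% slip; lowering the RMS voltage via the TRIAC increases this slip and reduces the delivered speed.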

6. Conclusion
The system is developed to reject unintentional and erratic hand gestures (such as children's irrational movements) and to supply visual feedback on the gestures registered.
This work defines a set of gestures that are distinct from each other yet easy for the system to recognize. This set of hand gestures is adequate for any consumer electronic control system. The software interface provides a unique key-mapping ability, such that, for example, the speed function of a ceiling fan can be mapped to the volume gesture in TV mode. The main aim of this project is to make a system which controls the fan speed and lights remotely, for physically impaired people, without the use of complex forms of input.

7. References

[1] Chee-Hoe Pang, Jer-Vui Lee, Yea-Dat Chuah, Yong-Chai Tan and N. Debnach, "Design of a Microcontroller based Fan Motor Controller for Smart Home Environment", International Journal of Smart Home, Vol. 7, No. 4, July 2013.
[2] D. Vishnu Vardhan and P. Penchala Prasad, "Hand Gesture Recognition Application for Physically Disabled People", International Journal of Science and Research (IJSR), Andhra Pradesh, India, Volume 3, Issue 8, August 2014.
[3] Radhika Bhatt, Nikita Fernandes and Archana Dhage, "Vision Based Hand Gesture Recognition for Human Computer Interaction", International Journal of Engineering Science and Innovative Technology (IJESIT), Volume 2, Issue 3, May 2013.
[4] P. Nagasekhara Reddy, "Microcontroller Based Speed Control of Induction Motor using Wireless Technology", International Journal of Emerging Science and Engineering (IJESE), ISSN: 2319-6378, Volume 1, Issue 9, July 2013.
[5] K. C. Shriharipriya and K. Arthy, "Flex Sensor Based Hand Gesture Recognition System", International Journal of Innovative Research and Studies (IJIRS), Vellore, India, May 2013.
[6] A. Alice Linsie and J. Mangaiyarkarasi, "Hand Gesture Recognition Using MEMS for Specially Challenged People", International Journal of VLSI and Embedded Systems (IJVES), Madurai, Tamilnadu, India, Vol. 04, Issue 02, March–April 2013.
[7] Wlodzimierz Kasprzak, Artur Wilkowski and Karol Czapnik, "Hand Gesture Recognition based on Free-Form Contours and Probabilistic Inference", Int. J. Appl. Math. Comput. Sci., Vol. 22, No. 2, pp. 437–448, 2012.
[8] Piotr Dalka and Andrzej Czyzewski, "Human-Computer Interface based on Visual Lip Movement and Gesture Recognition", International Journal of Computer Science and Applications, Technomathematics Research Foundation, Vol. 7, No. 3, pp. 124–139, 2010.
[9] S. F. Ahmed et al., "Electronic Speaking Glove for Speechless Patients", IEEE Conference on Sustainable Utilization and Development in Engineering and Technology, Petaling Jaya, Malaysia, 2010, pp. 56–60.
[10] S. Zhou, Z. Dong, W. J. Li and C. P. Kwong, "Hand-written Character Recognition Using MEMS Motion Sensing Technology", Proc. IEEE/ASME Int. Conf. Advanced Intelligent Mechatronics, 2008, pp. 1418–1423.
[11] P. Premaratne and Q. Nguyen, "Consumer Electronics Control System Based On Hand Gesture Moment Invariants", IET Computer Vision, 1(1), 2007, pp. 35–41.
[12] H. Je, J. Kim and D. Kim, "Hand Gesture Recognition to Understand Musical Conducting Action", IEEE Int. Conf. Robot & Human Interactive Communication, 2007.
[13] Zhixin Shi, Srirangaraj Setlur and Venu Govindaraju, "Digital Image Enhancement using Normalization Techniques and their Application to Palm Leaf Manuscripts", Center of Excellence for Document Analysis and Recognition (CEDAR), State University of New York at Buffalo, Buffalo, NY, U.S.A., February 21, 2005.
[14] W. T. Freeman and C. D. Weissman, "TV Control by Hand Gestures", IEEE Int. Workshop on Automatic Face and Gesture Recognition, Zurich, Switzerland, 1995.
[15] T. Yang and Y. Xu, "Hidden Markov Model for Gesture Recognition", Robotics Institute, Carnegie Mellon Univ., Pittsburgh, PA, 1994.
[16] S. S. Fels and G. E. Hinton, "Glove-Talk: A Neural Network Interface between a Data Glove and a Speech Synthesizer", IEEE Trans. Neural Netw., Vol. 4, No. 1, 1993, pp. 2–8.
[17] J. S. Lipscomb, "A Trainable Gesture Recognizer", Pattern Recognition, Vol. 24, No. 9, 1991, pp. 895–907.
[18] D. H. Rubine, "The Automatic Recognition of Gestures", Ph.D. dissertation, Computer Science Dept., Carnegie Mellon Univ., Pittsburgh, PA, 1991.
[19] W. M. Newman and R. F. Sproull, Principles of Interactive Computer Graphics, New York: McGraw-Hill, 1979.