
A

Project Report
On
Real-Time Hand Gesture Recognition
In partial fulfillment of requirements for the diploma in
Computer Technology
SUBMITTED BY:
1. Mr. Bhutadiya Kiran
2. Mr. Kulkarni Devavrat
3. Mr. Dhadve Nitin
4. Mr. Warang Harmit
Under the Guidance of
Prof. Sushil Andhale

DEPARTMENT OF COMPUTER TECHNOLOGY


BABASAHEB GAWDE INSTITUTE OF TECHNOLOGY
Dr. A. B. Nair Road, Mumbai Central
Mumbai 400 008
2014 - 2015

Side view of the Report

Diploma CM
Real-Time Hand Gesture Recognition
2014 - 2015


CERTIFICATE
This is to certify that the project entitled REAL-TIME
HAND GESTURE RECOGNITION has been
carried out by the team under my guidance in partial fulfillment
of the Diploma in Computer Technology of MSBTE during the
academic year 2014-2015 (Semester V and Semester VI).

Team:

1. Mr. Bhutadiya Kiran


2. Mr. Kulkarni Devavrat
3. Mr. Dhadve Nitin
4. Mr. Warang Harmit

Date:
Place: Mumbai

Project guide Head of Department


Prof. Sushil Andhale Mr. Ajit Parab

Principal External Examiner


Mr. S.C. Nawle


PROJECT APPROVAL SHEET

The following team has carried out the work on
Real-Time Hand Gesture Recognition in partial
fulfillment of the Diploma in Computer Technology of
MSBTE, and the project is being submitted to
Babasaheb Gawde Institute of Technology, Mumbai.

Team:
1. Mr. Bhutadiya Kiran
2. Mr. Kulkarni Devavrat
3. Mr. Dhadve Nitin
4. Mr. Warang Harmit

Guide: Prof. Sushil Andhale

External Examiner:

Date:

Place: Babasaheb Gawde Institute of Technology

ACKNOWLEDGEMENT

We impart special gratitude to our Principal Shri S.C. Nawale and Prof.
Ajit Parab, the H.O.D. of the Computer Technology Department, who were a
constant source of help and played an important role in the successful
execution of the project.

We also thank Prof. Sushil Andhale, Lecturer and our Project Guide, who
put in a lot of effort to give us the right guidance during the
development of the project. We appreciate his eagerness and
enthusiasm in encouraging us to develop our creative and technical ideas,
which ultimately led to the success of our project.

Our special thanks also go to the non-teaching staff for their great support
and kind cooperation in providing us with whatever we required for the
project. We also thank our families and friends for their support and good
wishes. Last but not least, we thank God for granting us success in our
efforts during the course of the project.

By-

1. Mr. Bhutadiya Kiran


2. Mr. Kulkarni Devavrat
3. Mr. Dhadve Nitin
4. Mr. Warang Harmit


INDEX
1 Introduction
1.1 Real-Time Hand Gesture Recognition
2 Market Survey
3 System Analysis
3.1 Static Hand Gesture
3.2 Dynamic Hand Gesture Recognition
3.3 Virtual Mouse
4 Coding
5 Testing
5.1 Formal Technical Review
5.2 Test Plan
5.3 Unit Testing
5.4 Integration Testing
5.5 System Testing
5.6 Functional Testing
5.7 Performance Testing
5.8 Stress Testing
5.9 Goals & Objectives
5.10 Test Cases
6 Modeling
6.1 Use Case Diagrams
6.2 Activity Diagrams
6.3 Sequence Diagrams
6.4 State Transition Diagrams
7 Snapshots
8 Appendix
8.1 Programming
8.2 Programming Principles
9 Bibliography


ABSTRACT

Real-time, static and dynamic hand gesture
recognition affords users the ability to interact with
computers in more natural and intuitive ways.
A gesture can communicate much more information
by itself than computer mice, joysticks and similar
devices, allowing a greater number of possibilities
for computer interaction.
The purpose of this work is to design, develop and
study a practical framework for real-time gesture
recognition that can be used in a variety of human-
computer interaction applications.
The system comprises the processes of image
acquisition, skin-colour segmentation, feature
extraction and gesture classification.


Chapter 1

Introduction


CHAPTER -1
INTRODUCTION

With the massive influx of computers into society and the
increasing importance of the service sector in many
industrialized nations, the market for robots in
conventional manufacturing-automation applications is
reaching saturation, and robotics research is rapidly
expanding into the service industries. Service robots
operate in dynamic, unstructured environments and
interact with people who are not necessarily computer
literate. Gesture-based interfaces emerged as a new form
of human-computer interaction in the middle of the
seventies, and there has been growing interest in them
recently. As a special case of human-computer interaction,
human-robot interaction is subject to several constraints:
the background is complex and dynamic; the lighting
conditions are variable; the shape of the human hand is
deformable; the implementation is required to execute in
real time; and the system is expected to be user- and
device-independent.
A friendly and cooperative interface is therefore critical
for the development of service robots, and a gesture-based
interface holds the promise of making human-robot
interaction more natural and efficient. Numerous
techniques for gesture-based interaction have been
proposed, but hardly any published work fulfils all of
these requirements. One early work presented a real-time
gesture system that was used in place of the mouse to
move and resize windows; in that system, the hand was
segmented from the background using skin colour and the
hand pose was classified by a neural network.
A drawback of the system is that its hand tracking has
to be specifically adapted for each user.

In another system, a variety of features, such as
intensity, edges, motion, disparity and colour, were used
for gesture recognition; that system was implemented only
in a restricted indoor environment. In one gesture-based
human-robot interaction system, a combination of motion,
colour and stereo cues was used to track and locate the
human hand, and hand posture recognition was based on
elastic graph matching. That system is prone to noise and
sensitive to changes in illumination.


Chapter 2

Market Survey


CHAPTER -2
MARKET SURVEY

The Systems Presently Available in the Market

The current hand recognition and detection systems are
not advanced enough to highlight the hand properly in a
captured image. The available techniques do not give
good results separately, but integrating them presents a
possibility of more accurate results.
Current methods use skin detection, super-pixelation of
skin, and hand-shape detection using concave and convex
skin regions on still images. The market has state-of-the-
art hand detection for detecting hands in static images.

Hand detection systems that work in real time, however,
are not yet available, and those which do exist are neither
robust nor satisfactory.

Future Prospects:
Hand detection in still images will be extended to detect
both the left and right hands, as well as the index, middle,
ring and little fingers and the thumb. This application will
be useful in creating a platform for gesture recognition.

MUSIC PLAYER:
ActiveX is a Microsoft Windows protocol for component
integration. With the help of ActiveX, it is possible to
integrate MATLAB and Microsoft Windows Media Player:
han = actxcontrol('WMPlayer.OCX.7');
This creates a WMPlayer object. By changing the path
name and file name we can change the song, and we
assign a different operation to each hand gesture.


Chapter 3

System Analysis


CHAPTER -3
SYSTEM ANALYSIS

This section describes the implementation details of the
real-time static and dynamic hand gesture recognition
system. The system is subdivided into three independent
subsystems:
1. Static hand gesture recognition system.
2. Dynamic hand gesture recognition system.
3. Virtual mouse system.
The three subsystems work independently and are
selected through a graphical user interface, as in Fig. 1.
The static hand gesture recognition system detects the
number of fingers shown using the k-curvature algorithm.
The dynamic hand gesture recognition system tracks the
hand motion, that is, it identifies whether the hand moves
right, left, up or down.
The virtual mouse system moves the mouse pointer as
the hand moves. Both the dynamic gesture and virtual
mouse systems are implemented using the centroid
tracking method.

Fig. 1. Flow chart of the overall system



Fig. 2. Flow chart of the static gesture system


1) Static Hand Gesture

Static hand gesture recognition is used to find static hand
poses, such as the number of fingers shown, and to
perform an application action accordingly. The peaks
(fingertips) and valleys are extracted using the k-curvature
method, explained below. Using the coordinate values of
the tips and valleys we plot them over the captured image,
and from the number of peaks and valleys we can identify
the number of fingers in the current hand gesture.

I) K-Curvature Method

The k-curvature method is used to recognise the static
hand gesture, that is, the count of fingers, as in Fig. 2. The
algorithm (flow chart in Fig. 3) identifies the peaks and
valleys along the contour of the binary hand image; from
the number of these peaks and valleys we can identify the
gesture.

Fig. 3. Flow chart of the k-curvature algorithm
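The idea can be sketched in a few lines of MATLAB. This is a simplified vector-angle variant of the slope-based computation used in Chapter 4; the function name and its arguments are illustrative assumptions. At a fingertip or valley, the vectors from a boundary point to the points k steps behind and ahead enclose a small angle:

% Sketch of the k-curvature test (illustrative; kCurvaturePeaks and
% angThresh are assumed names, and this vector-angle form is a
% simplified variant of the slope-based version in Chapter 4).
function peaks = kCurvaturePeaks(Q, k, angThresh)
% Q: N-by-2 boundary points ([row col]) from bwtraceboundary
% k: sampling offset; angThresh: angle threshold in degrees
n = size(Q, 1);
peaks = zeros(0, 2);
for i = k+1 : n-k
    v1 = Q(i-k,:) - Q(i,:);            % vector to the trailing point
    v2 = Q(i+k,:) - Q(i,:);            % vector to the leading point
    c  = dot(v1, v2) / (norm(v1) * norm(v2));
    ang = acosd(max(min(c, 1), -1));   % angle at the contour point
    if ang < angThresh                 % sharp bend: fingertip or valley
        peaks(end+1, :) = Q(i, :);     %#ok<AGROW>
    end
end
end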


2) Dynamic Hand Gesture Recognition

Dynamic hand gesture recognition is used to detect
moving hand gestures, such as the waving of hands. The
system tracks the hand motion, that is, it identifies
whether the hand moves right, left, up or down.

Fig. 4. Flow chart of the dynamic gesture system


Fig. 5. Flow chart of the dynamic gesture system (direction decision)

II) Centroid Measurement and Tracking

The centroid of the binary image can be calculated using
the MATLAB function regionprops(BW, properties),
where the property requested is 'Centroid'. The result is
the horizontal and vertical (x, y) coordinates of the
centroid, as in Fig. 4.
Since the input binary image BW contains only one
white region (the extracted hand), the result of
regionprops(BW, properties) is the centroid of the
hand [1], [2].
By passing this centroid value to java.awt.Robot we
can change the position of the mouse pointer. If the x
coordinate of the centroid crosses a threshold value then
the dynamic gesture 'right' is detected, else 'left'; similarly,
if the y coordinate crosses a threshold then 'up' is detected,
else 'down', as in Fig. 5.
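A minimal sketch of this step follows. It assumes a binary mask BW containing a single white hand region; the 350/250 pixel thresholds and the direction labels follow the values used in the Chapter 4 code:

% Sketch of centroid tracking (assumes BW is a logical mask with
% one white hand region; thresholds follow the Chapter 4 code).
import java.awt.Robot;
robot = Robot;
s = regionprops(BW, 'Centroid');       % one struct per white region
c = s(1).Centroid;                     % [x y] centroid of the hand
if c(1) > 350, horiz = 'left'; else horiz = 'right'; end
if c(2) > 250, vert  = 'down'; else vert  = 'up';    end
fprintf('%s-%s\n', horiz, vert);       % e.g. 'left-up'
robot.mouseMove(round(c(1)), round(c(2)));   % drive the pointer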

3) Virtual Mouse System

The application of dynamic hand gesture recognition is
the virtual mouse, in which the mouse pointer moves
according to our hand movements, as in Fig. 6.

Fig. 6. Flow chart of the virtual mouse
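One refinement worth noting: the Chapter 4 code captures the screen size with get(0, 'screensize') but then moves the pointer using raw camera coordinates. A sketch of scaling the 640x480 camera frame to the full screen is given below; the scaling itself is an assumed refinement, not taken from the report's code (mouse is the java.awt.Robot instance from Chapter 4 and c the centroid from the previous sketch):

% Sketch: map the camera-space centroid to screen coordinates
% before moving the pointer. The 640x480 frame size matches the
% video format in Chapter 4; the scaling is an assumed refinement.
scr = get(0, 'ScreenSize');                % [left bottom width height]
sx  = scr(3) / 640;   sy = scr(4) / 480;   % camera -> screen scale
mouse.mouseMove(round(c(1)*sx), round(c(2)*sy));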


Chapter 4

Coding


CHAPTER - 4
CODING

clear all;
%vid = videoinput('winvideo', 1,'YUY2_640x480');
%vid = videoinput('winvideo', 1,'RGB24_640x480');
%vid = videoinput('winvideo', 1, 'YUY2_160x120');
%vid = videoinput('winvideo', 1, 'YUY2_176x144');
vid = videoinput('winvideo', 1, 'YUY2_640x480');
set(vid, 'FramesPerTrigger', Inf);
set(vid, 'ReturnedColorspace', 'rgb');
vid.FrameGrabInterval = 5;
num=0;
count=[];   % finger-count votes accumulated across frames
%start the video aquisition here
cnt(1)=100;
cnt(2)=100;
import java.awt.Robot;
mouse = Robot;
t=0;
mouse.mouseMove(0, 0);
screenSize = get(0, 'screensize');

while(num<10)

%dialogue box
t=0;
str={'static gesture','dynamic gesture','virtual mouse','exit'} ;
[s,v] = listdlg('PromptString','SELECT THE OPERATION:',...
    'name','HAND GESTURE RECOGNITION SYSTEM','SelectionMode','single',...
    'ListSize',[500 500],'ListString',str,'uh',30);

if(v==0)
break;


end

switch s

case 1
%-------------------------------------------------------------------------
%static gesture
%-------------------------------------------------------------------------
prompt = {'Enter k value','enter the angle threshold'};
dlg_title = 'threshold values';
num_lines = 1;
def = {'33','60'};
th = inputdlg(prompt,dlg_title,num_lines,def);

while(t<10)
start(vid);

CH=0;
while(vid.FramesAvailable<=30)

% Get the snapshot of the current frame


data = getsnapshot(vid);
%SKIN COLOUR EXTRACTION
J=rgb2ycbcr(data);
L=graythresh(J(:,:,2));
BW=im2bw(J(:,:,2),L);
BW1=~BW;
M=graythresh(J(:,:,3));
BW2=im2bw(J(:,:,3),M);
o=BW1.*BW2;
roi=o;
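% Skin extraction rationale: skin pixels have comparatively low Cb and
% high Cr, so the complement of the Otsu-thresholded Cb mask multiplied
% by the thresholded Cr mask isolates the hand region.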
%
% se = strel('line',11,90);
se = strel('diamond', 4) ;
I2 = imdilate(roi,se);

Lw=bwlabel(I2);
stat = regionprops(Lw,'Area');

[cal,index] = max([stat.Area]);
tf = ismember(Lw, index);
stats1 = regionprops(tf, 'BoundingBox', 'Centroid');
%---clc--------------------------------------------------

grayFrame=tf;
%canny for hand coutour extraction
canny_op = edge(grayFrame,'canny');
L = bwlabel(canny_op,8);
stats = regionprops(L,'Area');
[cal,index] = max([stats.Area]);
%%% display(cal);
pix = ismember(L,index);

%k = ceil(sqrt(cal)/3);
k = str2num(th{1});

% k=33;
% imshow(L);
% figure,imshow(pix);

Q=pix;
[start_row,start_col]=find(Q,1);
hand_boundary_canny = bwtraceboundary(Q, [start_row,start_col], ...
    'ne', 8, Inf, 'clockwise');
Q=hand_boundary_canny;

len=size(Q);
l=len(1);
theta = zeros(1, l-k);   % pre-allocate so stale angles from a previous frame are cleared
% theta(i) measures the angle between the chords joining point i to the
% points k steps behind and ahead; values in the chosen band mark
% fingertip peaks and valleys.
for i=k:1:l-k
if (i==k)
continue;
end
X1=Q(i,1);
X2=Q(i+k,1);
X3=Q(i-k,1);

Y1=Q(i,2);
Y2=Q(i+k,2);
Y3=Q(i-k,2);
m1=(Y2-Y1)/(X2-X1);
m2=(Y3-Y1)/(X3-X1);
theta(i)=atand((m2-m1)/(1+m1*m2));

end
% imshow(qq);
% % qq=zeros(pp);
% L3 = zeros(size(Q,1),size(Q,2));
%
%
% yy=size(theta);
% t=1:yy(:,2);
%plot(t,theta,'r*');
[ww ee]=find(theta> str2num(th{2})& theta<90);
es=size(ee);
%imshow(pix);
%plot(Q(:,1),Q(:,2),'r.');
[pp bb]=size(L);
ab=zeros(pp,bb);
%ab=[];
%ab(Q)=1;
% figure,imshow(ab)
for j=1:es(1,2)
m=ee(j);
if(m>size(Q,1))
continue;
end
% ab(Q(m,1),Q(m,2))=250;
ab(Q(m,1)-1:Q(m,1)+1,Q(m,2)-1:Q(m,2)+1)=250;
%hold on;
% plot(Q(m,1),Q(m,2),'g*','LineWidth',20,...
% 'MarkerEdgeColor','y',...
% 'MarkerFaceColor','r',...
% 'MarkerSize',20);
end

m=ab|L;
% imshow(m);
se = strel('disk',14);
closeBW = imdilate(ab,se);

%imshow(~closeBW);
[L num]=bwlabel(closeBW);
%disp(num)
f=regionprops(L,'centroid','BoundingBox');
box=cat(1,f.BoundingBox);
centroids = cat(1, f.Centroid);
if(numel(centroids)==0)
centroids=[start_row,start_col];
end

imshow(data)
hold on
plot(centroids(:,1), centroids(:,2),'g.','LineWidth',5,...
'MarkerEdgeColor','y',...
'MarkerFaceColor','y',...
'MarkerSize',20);

%figure,imshow(ab)
% figure,plot(t,theta)
switch num

case 1
m=text(600,150,'1');
k=1;
%disp('1');
case 2
%disp('1');
m=text(600,150,'1');
k=1;

case 3
% disp('2');
m=text(600,150,'2');
k=2;
case 4
%disp('2');
m=text(600,150,'2');
k=2;
case 5
%disp('3');
m=text(600,150,'3');
k=3;
case 6
%disp('3');
m=text(600,150,'3');
k=3;
case 7
%disp('4');
m=text(600,150,'4');
k=4;
case 8
%disp('4');
m=text(600,150,'4');
k=4;
case 9
%disp('5');
m=text(600,150,'5');
k=5;
case 10
%disp('5');
m=text(600,150,'5');
k=5;
otherwise
%disp('5');
m=text(600,150,'5');
k=5;

end

%...................................................

%-----------------------------------------------------

%This is a loop to bound the skin colour in a rectangular box.


for object = 1:length(stats1)
bb = stats1(object).BoundingBox;
bc = stats1(object).Centroid;
rectangle('Position',bb,'EdgeColor','b','LineWidth',1)
plot(bc(1),bc(2), '-m+')

% if(bc(:,1)>700)
% a=text(200,50,'left');
% else
% a=text(200,50,'right');
% end
%
% if (bc(:,2)>350)
% b=text(600,50,'down');
% else
% b=text(600,50,'up');
% end
%
% set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 30, 'Color', 'red');
% set(b, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 30, 'Color', 'red');
% set(m, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 30, 'Color', 'red');
% set(text(60,500,'DYNAMIC GESTURE RECOGNITION'), 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'green');
% mouse.mouseMove(bc(:,1), bc(:,2));

% pause(0.00001);
if(vid.FramesAvailable>=15)
set(text(50,50,'.'), 'FontName', 'Arial', 'FontWeight', 'bold', ...
    'FontSize', 50, 'Color', 'red');
set(m, 'FontName', 'Arial', 'FontWeight', 'bold', ...
    'FontSize', 10, 'Color', 'black');
count=[count; k];   % vote with this frame's finger count (not reset here, so votes accumulate for mode)
end

end

end
hold off;

num=mode(count);
count=[];

disp(num);
%warndlg(num2str(num));
mm=num2str(num);
st=' is the count';
mm=[mm st];
% Construct a questdlg with three options
choice = questdlg(mm, ...
'count', ...
'I WANT TO CONTINUE','NO','NO');
% Handle response
if(strcmp(choice,'I WANT TO CONTINUE'))

CH = 1;
t=t+1;
stop(vid);
flushdata(vid);
end

if(strcmp(choice,'NO'))
stop(vid) ;
break;
end

close all;

end
stop(vid);
flushdata(vid);

case 2

%-------------------------------------------------------------------------
%dynamic gesture
%-------------------------------------------------------------------------
start(vid);
while(vid.FramesAvailable<=30)

% Get the snapshot of the current frame


data = getsnapshot(vid);

%SKIN COLOUR EXTRACTION

J=rgb2ycbcr(data);
L=graythresh(J(:,:,2));
BW=im2bw(J(:,:,2),L);
BW1=~BW;
M=graythresh(J(:,:,3));
BW2=im2bw(J(:,:,3),M);
o=BW1.*BW2;
roi=o;

% Here we do the image blob analysis.


% We get a set of properties for each labeled region.
stats = regionprops(roi, 'BoundingBox', 'Centroid');

% Display the image


imshow(data)

hold on

%This is a loop to bound the red objects in a rectangular box.


for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
%rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
plot(bc(1),bc(2), '-m+')
%a=text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), ' Y: ', num2str(round(bc(2)))));

if(bc(:,1)>350)
a=text(200,50,'left');
else
a=text(200,50,'right');
end

if (bc(:,2)>250)
b=text(600,50,'down');
else
b=text(600,50,'up');
end
% if(bc(:,1)>512)
% a=text(bc(:,1)+15,bc(:,2),'right');
% else
% a=text(bc(:,1)+15,bc(:,2),'left');
% end
%
% if (bc(:,2)>384)
% b=text(bc(:,1)+100,bc(:,2),'down');
% else
% b=text(bc(:,1)+100,bc(:,2),'up');
% end


set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 20, 'Color', 'red');
set(b, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 20, 'Color', 'red');
% set(text(60,500,'DYNAMIC GESTURE RECOGNITION'), 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'green');
% mouse.mouseMove(bc(:,1), bc(:,2));

%-------------------------------------------------------------------
end
end
stop(vid);
flushdata(vid);

case 3

%-------------------------------------------------------------------
%mouse
%-------------------------------------------------------------------

start(vid);
while(vid.FramesAvailable<=30)

% Get the snapshot of the current frame


data = getsnapshot(vid);

%SKIN COLOUR EXTRACTION

J=rgb2ycbcr(data);
L=graythresh(J(:,:,2));
BW=im2bw(J(:,:,2),L);
BW1=~BW;
M=graythresh(J(:,:,3));
BW2=im2bw(J(:,:,3),M);

o=BW1.*BW2;
roi=o;

% Here we do the image blob analysis.


% We get a set of properties for each labeled region.
stats = regionprops(roi, 'BoundingBox', 'Centroid');

% Display the image


imshow(data)

hold on

%This is a loop to bound the red objects in a rectangular box.


for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
bb1=[bc(1) bc(2) bb(3)/4 bb(4)/4];
rectangle('Position',bb1,'EdgeColor','r','LineWidth',2)
plot(bc(1),bc(2), '-m+')
%a=text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), ' Y: ', num2str(round(bc(2)))));

mouse.mouseMove(bc(:,1), bc(:,2));

end
end
stop(vid);
flushdata(vid);

case 4
close all;
break;
otherwise
break;


end
===============================================


Chapter 5

Testing


CHAPTER - 5
TESTING

Formal Technical Review

The FTR (Formal Technical Review) is a software quality
assurance activity whose objectives are to uncover errors in
function, logic or implementation for any representation of the
software; to verify that the software under review meets its
requirements; to ensure that the software has been represented
according to predefined standards; to achieve software that is
developed in a uniform manner; and to make the project more
manageable.
The FTR is also a learning ground for junior developers to know
more about different approaches to software analysis, design and
implementation. It also serves as backup and continuity for people
who have not yet been exposed to software development. FTR
activities include walkthroughs, inspections, round-robin reviews
and other technical assessments, which are the different FTR
formats.
We conducted walkthroughs among the group members Bhutadiya
Kiran, Devavrat Kulkarni, Dhadve Nitin and Warang Harmit. An
inspection was conducted during our internal presentation in front
of the CM department staff: HOD Mr. Ajit Parab, Mr. Anil Ahir,
Mr. Asif Ansari and Mr. Mukesh Phadtare.

Test Plan

Availability of necessary hardware and software
Configuration testing
Compatibility testing


Unit Testing

Unit testing is the most micro scale of testing, used to
test a particular function or code module. It is typically
done by the programmer and not by testers, as it requires
detailed knowledge of the internal program design and
code. It is not always easily done unless the application
has a well-designed architecture with tight code, and it
may require developing test driver modules or test
harnesses.
Unit testing focuses first on the modules, independently
of one another, to locate errors. This enables the tester to
detect errors in coding and logic that are contained within
each module alone. Errors resulting from the
interaction between modules are initially avoided.
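As an illustration, a minimal MATLAB unit check for this project's skin-segmentation step might look like the sketch below. The synthetic frame and the assertion are illustrative assumptions; the segmentation lines mirror the Chapter 4 code:

% Illustrative unit check (a sketch, not taken from the report):
% verify the skin-segmentation step returns a logical mask of the
% same size as the input frame.
frame = im2uint8(rand(480, 640, 3));        % synthetic RGB test frame
J  = rgb2ycbcr(frame);                      % segmentation as in Chapter 4
BW = ~im2bw(J(:,:,2), graythresh(J(:,:,2))) & ...
      im2bw(J(:,:,3), graythresh(J(:,:,3)));
assert(islogical(BW) && isequal(size(BW), [480 640]), ...
       'segmentation mask has wrong type or size');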

Integration Testing
It involves testing combined parts of the application to
determine whether they function together correctly. The
parts can be core modules, individual applications, or
client and server applications on a network.
This type of testing is especially relevant to client/server
and distributed systems.


System Testing

Most software products today are modular. System testing
is a phase of software testing in which the developer
checks for communication flaws, either not passing
information or passing incorrect information between two
modules. It is testing that attempts to discover defects that
are properties of the entire system rather than of its
individual components.

Functional Testing

Each part of the code was tested individually, and panels
were tested individually on all platforms to see whether
they work properly.

Performance Testing

These tests determine the amount of execution time spent
in various parts of a unit, and the resulting throughput and
response time of the module.

Stress Testing

A large number of test files were made to run at the same
time in order to check how much load the software could
take.

Goals & Objectives

Testing is the process of executing a program with the
intent of finding errors. A good test case is one that has a
high probability of finding an undiscovered error; a
successful test is one that discovers an as-yet-undiscovered
error.
Our objective is to design a testing process that
systematically uncovers different classes of errors in a
minimum amount of time.


Test Cases:

Test Case TC_1
Objective: Check the validity of the K-value.
Pre-requisites: The K-value field is available and not passive.
Steps: 1. Run the program. 2. Select static gesture. 3. Enter an integer K-value.
Input data: Integer K-value.
Expected result: The K-value should be entered as valid.
Actual result: Clicking OK makes the static gesture work.
Status: PASS

Test Case TC_2
Objective: Check the validity of the angle threshold value.
Pre-requisites: The threshold field is available and not passive.
Steps: 1. Run the program. 2. Select static gesture. 3. Enter an integer threshold value.
Input data: Integer threshold value.
Expected result: The threshold should be entered as valid.
Actual result: Clicking OK makes the static gesture work.
Status: PASS

Test Case TC_3
Objective: Check the hand gesture count.
Pre-requisites: The k-curvature algorithm is working.
Steps: 1. Run the program. 2. Select static gesture.
Input data: Real-time hand showing 5 fingers.
Expected result: 5 is the count.
Actual result: 5 is the count.
Status: PASS

Test Case TC_4
Objective: Check the position of the hand.
Pre-requisites: Centroid tracking is working.
Steps: 1. Run the program. 2. Select dynamic gesture.
Input data: Real-time hand moving in the left-up direction.
Expected result: Left-up is detected.
Actual result: Left-up is detected.
Status: PASS

Test Case TC_5
Objective: Check the validity of the virtual mouse.
Pre-requisites: Centroid tracking is working.
Steps: 1. Run the program. 2. Select virtual mouse.
Input data: A real hand is moving.
Expected result: The mouse pointer moves.
Actual result: The mouse pointer moves.
Status: PASS


Chapter 6

Modeling


CHAPTER - 6
MODELING

UML Diagram

Use Case Diagrams

A use case diagram at its simplest is a graphical
representation of a user's interaction with the system,
depicting the specification of a use case. A use case
diagram can portray the different types of users of a
system and the various ways they interact with it. This
type of diagram is typically used in conjunction with the
textual use case and is often accompanied by other
diagrams as well.

A use case diagram can help provide a higher-level
view of the system. It has been said that use case
diagrams are blueprints for your system: they provide
a simplified, graphical representation of what the
system must actually do.

Due to their simplistic nature, use case diagrams can
be a good communication tool for stakeholders. The
drawings attempt to mimic the real world and provide a
view for the stakeholder to understand how the system is
going to be designed.


Relationships in Use Case Diagrams:

Use cases share different kinds of relationships, as
follows:

Communicates
The participation of an actor in a use case is shown by
connecting the actor symbol to the use case symbol with a
solid path. The actor is said to 'communicate' with the use
case. This is the only relationship between an actor and
use cases.

Extends
An extends relationship between use cases indicates that
an instance of use case B may include the behaviour
specified by use case A. An extends relationship is
depicted with a directed, dotted arrow.



Use Case Diagram for Static Gesture


Use Case Diagram for Dynamic Gesture


Activity Diagram

An activity diagram is used for business process
modelling, for modelling the logic captured by a single
use case or usage scenario, or for modelling the detailed
logic of a business rule.
An activity diagram is a dynamic diagram which shows
the activities and the events that cause the object to be in
a particular state.
Elements of an activity diagram:
1) Initial Activity:
This shows the starting point or first activity of the flow,
denoted by a solid circle. There can be only one initial
state in a diagram.

Fig a. Initial Activity

2) Activity:
An activity is represented by a rectangle with rounded
edges.

Fig b. Activity

3) Transition:
When an activity state is completed, processing moves to
another activity state. Transitions are used to mark this
movement, and are modelled as:
Event [guard condition] / action

Fig c. Transition


4) Decisions:
As in flow charts, a point where a decision is to be made
is depicted by a diamond, with the options written beside
the arrows emerging from the diamond, within box
brackets.

Activity Diagram for Static Gesture


Activity Diagram for Dynamic Gesture



Sequence Diagram

When an object passes a message to another object, the
receiving object might in turn send a message to a third
object, which in turn sends a message to yet another
object, and so on. This stream of messages forms a
sequence diagram, which depicts a sequence of actions
that occur in a system. The invocation of methods on each
object, and the order in which the invocations occur, is
captured in the sequence diagram. This makes the
sequence diagram a very useful tool to easily represent
the dynamic behaviour of a system.
Sequence diagrams are typically used to model:
1) Usage scenarios.
2) The logic of methods.
Elements of sequence diagrams:
1) Class Roles:
Class roles describe the way an object will behave in
context. The UML object symbol (Object Name : Class
Name) illustrates a class role, but does not list the object
attributes.

Fig a. Object

2) Actor: An external entity that interacts with the system.

Fig b. Actor


Sequence Diagram for Static Gesture




Sequence Diagram for Dynamic Gesture


State Transition Diagram for Dynamic Gesture


State Transition Diagram for Static Gesture



Chapter 7

Snapshots


Chapter 8

Appendix


CHAPTER - 8
APPENDIX

Programming

Programming is not only a creative activity but also an
intellectually rigorous discipline. It is the part of
development where the design is actually translated into a
machine-readable form called a program.

Programming Principles

The main activity of the coding phase is to translate design
into code. If we translate the structure of the design
properly, we will have a structured program. A structured
program is the end product of a series of efforts to
understand the problem and develop a structured,
understandable solution plan. It is impossible to write a
good structured program based on an unstructured, poor
design. The coding phase affects both testing and
maintenance profoundly: the time spent in coding is a
small percentage of the total software cost, while testing
and maintenance consume a major percentage.

A good programming style is characterized by the following:
Simplicity
Readability
Good Documentation
Changeability
Predictability
Good Structure

Chapter 9

BIBLIOGRAPHY


CHAPTER -9
BIBLIOGRAPHY

[1] Burak Ozer, Tiehan Lu, and Wayne Wolf, Design of a Real-Time
Gesture Recognition System, IEEE Signal Processing Magazine, 2005.
[2] Chan Wa Ng, S. Ranganath, Real-time gesture recognition system
and application, Image and Vision Computing (20): 993-1007, 2002.
[3] Chen-Chiung Hsieh and Dung-Hua Liou, A Real-Time Hand Gesture
Recognition System Using Motion History Image, 2nd International
Conference on Signal Processing Systems (ICSPS), 2010.
[4] D. H. Liou, A real-time hand gesture recognition system by adaptive
skin-color detection and motion history image, Master's Thesis, Dept. of
Computer Science and Engineering, Tatung University, Taipei, Taiwan, 2009.
[5] Kwang-Ho Seok, Chang-Mug Lee, Oh-Young Kwon, Yoon Sang Kim,
A Robot Motion Authoring Using Finger-Robot Interaction, Fourth
International Conference on Computer Sciences and Convergence
Information Technology, 2009.
[6] Nguyen Dang Binh, Enokida Shuichi, Toshiaki Ejima, Real-Time Hand
Tracking and Gesture Recognition System, GVIP 05 Conference, CICC,
Cairo, Egypt, 19-21 December 2005.
[7] S. M. Hassan Ahmed, Todd C. Alexander, and Georgios C.
Anagnostopoulos, Real-time, Static and Dynamic Hand Gesture
Recognition for Human-Computer Interaction, IEEE, 2006.
[8] T. Maung, Real-time hand tracking and gesture recognition system
using neural networks, Proc. of World Academy of Science, Engineering
and Technology, vol. 38, pp. 470-474, Feb. 2009.
[9] Vladimir I. Pavlovic, Rajeev Sharma, Thomas S. Huang, Visual
Interpretation of Hand Gestures for Human-Computer Interaction: A
Review, IEEE Transactions on Pattern Analysis and Machine Intelligence,
Vol. 19, No. 7, July 1997.
[10] Xiaoming Yin, Hand Posture Recognition in Gesture-Based
Human-Robot Interaction, International Journal of Robotics and
Autonomous Systems, 2006.

