
Scientific Journal of Impact Factor (SJIF): 4.72
e-ISSN (O): 2348-4470
p-ISSN (P): 2348-6406

International Journal of Advance Engineering and Research Development
Volume 4, Issue 2, February 2017

User Interface using Accelerometer Sensor

Amara Dinesh Kumar
System Engineer, Tata Consultancy Services
Electronics and Communications Department, TKR Engineering College

Abstract — The objective of this work is two-fold. The first part is the design and implementation of a wireless data
glove based on an inertial sensor to measure hand motion, together with the software device driver needed to record the
sensor signals. We present an input method that captures the user's gestures through an accelerometer sensor, recognizes
them in software, and sends the extracted data to the Bluetooth module mounted on the glove. A MEMS-based
accelerometer tracks the hand movements in three dimensions; this information is fed to a microprocessor, which
processes the movements and converts them into text using the recognition algorithm, and the Bluetooth module then
transmits the resulting data. Considerable research is ongoing into recognition algorithms that achieve higher
efficiency than existing ones.

Keywords — Sensor, accelerometer, MEMS, Bluetooth, gestures

I. INTRODUCTION

Gesture-based interfaces have been proposed as an alternative modality for controlling stationary computers, e.g. to
navigate in office applications and to play immersive console games. In contrast to their classic interpretation as a
support for conversation, gestures are understood in this area as directed body movements, primarily of the arms and
hands, used to interact with computers. Gesture-based interfaces can enrich and diversify interaction options, as in
gaming. Moreover, they are vital for computer access by handicapped users, for example through sign language
interfaces, and for further applications and environments where traditional computer interaction methods are not
acceptable. Mobile systems and devices are a primary field of concern for such alternative interaction solutions. While
the stationary applications cited above demonstrate the applicability of directed gestures for interaction, future mobile
solutions in particular could profit from the deployment of gesture-based interfaces. Currently, mobile systems lack
solutions that minimize user attention or support access for users with specific interaction needs. Hence, gestures that
are directly sensed and recognized by a mobile or wearable device are of common interest for numerous applications.
In this work, we investigate gesture-based interaction using a wrist-worn watch device. We consider
gestures an intuitive modality, especially for watches, and potentially feasible for wearers who cannot operate tiny
watch buttons. To this end, it is essential to evaluate the wearer's performance and convenience while operating such an
interface. Moreover, the integration of gesture interfaces into a watch device has not been evaluated before. The
resource constraints of wrist-worn watch devices impose challenging restrictions on the processing complexity
available for embedding a gesture recognition solution.
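To make this processing-complexity constraint concrete, the sketch below shows one minimal way a watch-class device could segment candidate gestures from the raw acceleration stream before running any heavier recognition step. It is an illustrative assumption rather than the recognizer used in this work; the rest level, threshold and function name are hypothetical.

/* Minimal gesture-segmentation sketch (illustrative; rest level and
 * threshold are assumed, not taken from this work). Operates on raw
 * 8-bit ADC samples centred near 128 at 0 g. */
#include <stdint.h>
#include <stdlib.h>

#define REST_LEVEL        128  /* assumed per-axis ADC reading at rest */
#define MOTION_THRESHOLD   40  /* assumed deviation marking a gesture  */

/* Returns 1 while the wearer is moving, 0 at rest. The sum of
 * absolute per-axis deviations replaces a vector magnitude, keeping
 * the cost to a few integer operations per sample. */
static int motion_detected(uint8_t ax, uint8_t ay, uint8_t az)
{
    int dev = abs((int)ax - REST_LEVEL)
            + abs((int)ay - REST_LEVEL)
            + abs((int)az - REST_LEVEL);
    return dev > MOTION_THRESHOLD;
}

Gating the expensive recognition step on such a motion test is one common way to respect a watch's processing and battery budget.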

Figure 1: Various Modules of the Device


II. PROBLEM DEFINITION

Hand gestures have always been part of human communication. Gestures are used as non-verbal speech when acoustic
communication is not applicable or not desirable, for example over large distances or in noisy environments. Gestures
also serve as an additional information channel in verbal communication. It is therefore natural to use gestures as an
interface for human-computer interaction, which requires computers to be able to recognize these body movements.
Various advances have been made in this direction. One can identify two basic approaches to the task of machine-based
gesture recognition, namely external and internal systems, which are distinguished by the sensor set-up used. External
systems, like almost all vision-based solutions, rely on cameras installed in the environment of the user.
Internal systems incorporate body-worn sensors. While external systems are unobtrusive, since the user does not
have to carry any special devices, internal systems are independent of a special environment such as a sensor-equipped
room. They are even suitable for outdoor use, which is hard to accomplish with vision-based systems. Body-worn
systems normally utilize modern MEMS inertial sensors to detect motion; commonly used sensors are accelerometers,
gyroscopes and magnetometers. The body-worn sensor approach is extensively studied in the context of wearable
computing and mixed-reality applications. For such systems, the question of an ideal input device is still open. While
pressing keys on a small keyboard does not seem to be the optimal method, hand gesture recognition might permit the
design of an intuitive input device. For this purpose, any kind of predefined gesture set could be used, but it would be
appealing to use characters and digits, which can be seen as special gestures, since users would then not have to learn a
new gesture system.
Here, the idea of a new gesture-based interface for wearable computing is introduced. It consists of an inertial-
sensor-equipped, pen-like wireless device that enables the user to write in the air, as on an imaginary blackboard. The
written text is recognized and can be processed further. I call this type of input airwriting. Experiments on recognition,
up to the single-character level, show promising results.

III. ACCELEROMETERS AND GYROSCOPES

Both accelerometers and gyroscopes belong to the category of inertial sensors. They have been used for decades, mainly
for navigation purposes. While different technologies exist, from early mechanical systems to laser-based sensors, I
will make use of microelectromechanical system (MEMS) sensors. These sensors are based on silicon, and the same
techniques as in integrated-circuit production can be used to manufacture them. As a result, the sensors are small,
cheap, robust and lightweight, and their performance is good enough for gesture recognition, especially handwriting.
Accelerometers measure the linear acceleration of the sensor device along one axis; gyroscopes measure the angular
velocity around one axis. It should be emphasized that the gravitational force is always present and therefore always
acts on an accelerometer. As a result, every accelerometer measures a component of the earth's gravitational
acceleration in addition to any motion-induced acceleration, depending on the sensor's orientation.
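This gravity component is not only a disturbance; it also reveals the sensor's orientation. As a brief illustration, and not part of the glove firmware, the tilt angles of a resting three-axis accelerometer can be estimated as follows; the variable names and the assumption that readings are already scaled to units of g are mine.

/* Illustrative tilt estimation from a 3-axis accelerometer at rest
 * (assumed inputs in units of g; not part of the glove firmware). */
#include <math.h>

#define RAD_TO_DEG (180.0 / 3.14159265358979323846)

void tilt_from_gravity(double ax, double ay, double az,
                       double *pitch_deg, double *roll_deg)
{
    /* Standard tilt formulas; valid only while the device is
     * otherwise at rest, so that gravity dominates the reading. */
    *pitch_deg = atan2(-ax, sqrt(ay * ay + az * az)) * RAD_TO_DEG;
    *roll_deg  = atan2(ay, az) * RAD_TO_DEG;
}

Conversely, when recognizing dynamic gestures this gravity component has to be compensated or removed, which is why a short rest-pose calibration is common before recording.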
Figure 2: The data glove with all parts worn by a user.

Program code:

/*
 * userinterface_using_accelerometer.c
 * Author: Amara Dinesh Kumar
 *
 * Reads a two-axis accelerometer on ADC channels 0 and 1 and
 * transmits the readings, plus a direction letter, over the UART
 * (connected to the Bluetooth module).
 */
#define F_CPU 8000000UL

#include <avr/io.h>
#include <util/delay.h>
#include <avr/interrupt.h>
#include <stdlib.h>                     /* itoa() */

unsigned char x_val, y_val;
char x[10], y[10];                      /* ASCII buffers for itoa() */

/* Configure the ADC: AVcc reference, prescaler 128. */
void InitADC()
{
    ADMUX = (1 << REFS0);
    ADCSRA = (1 << ADEN) | (1 << ADPS0) | (1 << ADPS1) | (1 << ADPS2);
}

/* Start a single conversion on channel ch and wait for the result. */
int ReadADC(char ch)
{
    ch = ch & 0b00000111;               /* keep only the channel bits */
    ADMUX = ch | (1 << REFS0);
    ADCSRA |= (1 << ADSC);              /* start conversion           */
    while (!(ADCSRA & (1 << ADIF)));    /* wait until complete        */
    ADCSRA |= (1 << ADIF);              /* clear the flag             */
    return ADC;
}

/* Configure the UART: 9600 baud at 8 MHz, 8 data bits. */
void USART_Init()
{
    UBRRL = 0x33;
    UCSRB = (1 << RXEN) | (1 << TXEN);
    UCSRC = (3 << UCSZ0) | (1 << URSEL);
}

void USART_Transmit(unsigned char data)
{
    UDR = data;
    while (!(UCSRA & (1 << TXC)));
    _delay_ms(100);
}

unsigned char USART_Receive()
{
    while (!(UCSRA & (1 << RXC)));
    return UDR;
}

void Tx_String(char *str)
{
    while (*str)
    {
        USART_Transmit(*str);
        str++;
    }
}

/* Sample both accelerometer axes (only the low byte of the 10-bit
 * conversion result is kept). */
void READ_ACCELEROMETER()
{
    _delay_ms(10);
    x_val = ReadADC(0);
    _delay_ms(10);
    y_val = ReadADC(1);
}

/* Transmit the raw values and a direction letter derived from simple
 * thresholds: B = backward, F = forward, R = right, L = left,
 * N = neutral. */
void SEND_VALUES_OF_X_Y()
{
    itoa(x_val, x, 10);
    _delay_ms(10);
    itoa(y_val, y, 10);
    _delay_ms(10);

    Tx_String("X:");
    _delay_ms(20);
    Tx_String(x);

    if (x_val > 130)      Tx_String(" B");
    else if (x_val < 60)  Tx_String(" F");
    else if (y_val > 120) Tx_String(" R");
    else if (y_val < 60)  Tx_String(" L");
    else                  Tx_String(" N");

    USART_Transmit(0x0d);
    _delay_ms(100);

    Tx_String("Y:");
    _delay_ms(20);
    Tx_String(y);
    USART_Transmit(0x0d);
}

int main(void)
{
    DDRC = 0x00;        /* port C (ADC inputs) as inputs */
    DDRB = 0xff;        /* port B as outputs             */
    DDRD = 0xfe;        /* PD0 (RXD) input, rest outputs */

    InitADC();
    USART_Init();
    _delay_ms(20);

    while (1)
    {
        READ_ACCELEROMETER();
        SEND_VALUES_OF_X_Y();
    }
}
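For completeness, a program on the receiving side of the Bluetooth link could parse the stream produced by SEND_VALUES_OF_X_Y(). The sketch below is an assumption, not part of the original firmware; it reads the "X:<value> <letter>" and "Y:<value>" lines defined above from standard input.

/* Host-side sketch (assumed, not from this work): maps the direction
 * letter sent by the glove to an action. Input follows the format
 * emitted by SEND_VALUES_OF_X_Y() above. */
#include <stdio.h>

static void handle_direction(char dir)
{
    switch (dir) {
    case 'F': printf("move forward\n");  break;
    case 'B': printf("move backward\n"); break;
    case 'L': printf("move left\n");     break;
    case 'R': printf("move right\n");    break;
    default:  printf("neutral\n");       break;
    }
}

int main(void)
{
    int value;
    char dir;

    /* Each report is "X:<value> <letter>\r" followed by "Y:<value>\r". */
    while (scanf(" X:%d %c", &value, &dir) == 2) {
        handle_direction(dir);
        if (scanf(" Y:%d", &value) != 1)
            break;              /* stream ended mid-report */
    }
    return 0;
}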

IV. CONCLUSION

1. Interaction with mobile devices intended for everyday use is challenging, since such systems are continuously
optimized towards smaller form factors.
2. Particularly critical parameters, such as display size, processing capability and weight, impose tight constraints.
3. The final aim of this project is to create a new interactive user interface that replaces the traditional typing
interface and the existing touch interface.

V. REFERENCES

[1] Lyons, K., Starner, T., Plaisted, D., Fusia, J., Lyons, A., Drew, A., Looney, E.W.: Twiddler typing: one-handed
chording text entry for mobile phones. In: Proc. of the SIGCHI Conference on Human Factors in Computing Systems
(CHI '04) (2004)
[2] McGuire, R., Hernandez-Rebollar, J., Starner, T., Henderson, V., Brashear, H., Ross, D.: Towards a one-way
American Sign Language translator. In: Proc. Sixth IEEE International Conference on Automatic Face and Gesture
Recognition (FGR '04) (2004)
[3] Mistry, P., Maes, P., Chang, L.: WUW - Wear Ur World: a wearable gestural interface. In: Proc. of the 27th
International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '09) (2009)
[4] Rabiner, L.: A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the
IEEE 77(2), 257-286 (1989)
[5] Stiefmeier, T., Roggen, D., Tröster, G., Ogris, G., Lukowicz, P.: Wearable activity tracking in car manufacturing.
IEEE Pervasive Computing 7(2), 42 (2008)
[6] Stolcke, A.: SRILM - an extensible language modeling toolkit. In: International Conference on Spoken Language
Processing (2002)
[7] Tamaki, E., Miyaki, T., Rekimoto, J.: Brainy Hand: an ear-worn hand gesture interaction device. In: Proc. of the 27th
International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '09) (2009)
[8] Amft, O., Amstutz, R., Smailagic, A., Siewiorek, D., Tröster, G.: Gesture-controlled user input to complete
questionnaires on wrist-worn watches. In: Human-Computer Interaction. Novel Interaction Methods and Techniques,
Lecture Notes in Computer Science, vol. 5611, pp. 131-140. Springer Berlin/Heidelberg (2009)
