
Computer vision syndrome prevention system

Definition
Computer users commonly experience a combination of visual and ophthalmic symptoms, collectively called computer vision syndrome (CVS). Ocular symptoms associated with the syndrome include decreased vision, burning, stinging, and photophobia.

Overview
According to statistics from India Today magazine: 1) 98% of professionals in urban India show symptoms of CVS; 2) ophthalmologists treat 16 new CVS patients each month; 3) 40 million Indians surf the net and 180 million use cell phones every day; 4) 90% of urban Indians use computers for more than 4 hours a day. According to 2003 U.S. Census data (the most recent statistics available as of November 2007), 64% of adults and 86% of children use computers at school, at work, or at home. CVS affects the majority of computer users: about 88% of people who use computers every day suffer from eyestrain, and children are no exception. Printed words are easier to focus on than words on a computer screen. Letters on the screen may appear crisp and clear, but they are not; they are made of tiny pixels, which cause the eyes to make constant micro-movements, drifting to the resting point of accommodation and back while trying to refocus on the words. Computers are now a way of life, and as the number of people working on them grows each year, the number of people encountering temporary vision problems due to computer use has also increased. Computers do not harm the eyes, but long hours in front of a monitor can produce a range of systemic and ocular symptoms. If you get a headache or eye strain at the end of a day of prolonged computer use, or if you have difficulty focusing on distant objects, you may be suffering from computer vision syndrome. CVS is a serious problem for the millions of people who spend hours in front of a computer every day. Aside from the physical discomfort caused by the symptoms, computer vision syndrome can have a lasting effect on your vision, so frequent computer users should take several preventive steps.

Symptoms
If someone spends more than three hours a day in front of a computer, they may suffer from CVS. Common symptoms include headache, neck and body pain, inability to focus, redness of the eyes, blurred vision, and double vision.

Causes of CVS
Blinking rewets the eyes and helps avoid dryness and irritation, but it is normal to blink less frequently when working at a computer. Offices are air-conditioned, very dry environments, which further reduces tearing, so users should make a conscious effort to blink and re-hydrate their eyes. CVS is caused by the eyes and brain reacting differently to characters on a screen than to printed characters. Characters on a computer screen are built from pixels and lack the contrast and well-defined edges that printed characters have. Because the colour intensity of digital characters diminishes around the edges, it is difficult for the eyes to remain focused; having to continually refocus on digital text fatigues the eyes and can lead to burning or tired eyes. The normal blink rate of about 15 blinks per minute drops to roughly 7.5 blinks per minute during computer use, and in severe cases of CVS to as few as four blinks per minute.

A great deal of computer vision research is dedicated to systems that detect user movements and facial gestures [1, 2, 4, 5, 6, 15, 16], in many cases with the specific goal of giving people with disabilities or limited motor skills a way to use computer systems. The motivation for the system proposed here is to provide an inexpensive, unobtrusive system that detects eye blinks and alerts the user when the number of blinks per minute drops.

Blink detection and the analysis of blink duration are based solely on the correlation scores generated by the tracking step, using the online template of the user's eye. As the user's eye closes during a blink, its similarity to the open-eye template decreases; likewise, it regains its similarity to the template as the blink ends and the user's eye becomes fully open again, as illustrated in the sketch below.
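As a rough illustration of this idea only (this is not code from the original system, which uses motion analysis instead, as described later), the following sketch flags a blink purely from the template-matching score of the tracking step. With CV_TM_SQDIFF_NORMED the best-match value minval is near zero for an open eye and grows as the eye closes; the threshold and the minimum number of closed-eye frames below are assumed values.

#define BLINK_SCORE_THRESHOLD 0.3   /* assumed dissimilarity threshold          */
#define BLINK_MIN_FRAMES      2     /* assumed minimum number of closed frames  */

static int closed_frames = 0;

/* Hypothetical helper: returns 1 when a completed blink is detected.
   Call once per frame with minval from cvMinMaxLoc() at the tracking step. */
int blink_from_score(double minval)
{
    if (minval > BLINK_SCORE_THRESHOLD) {
        closed_frames++;            /* eye no longer matches the open-eye template */
    } else {
        int was_blink = (closed_frames >= BLINK_MIN_FRAMES);
        closed_frames = 0;          /* eye is open again: blink (if any) completed */
        return was_blink;
    }
    return 0;
}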

Initialization
In this stage, the system tries to locate the eyes by analyzing the blinking of the user. Given the current grayscale frame gray and the previously saved frame prev, we obtain the difference image diff. The difference image is then thresholded, resulting in a binary image that shows the regions of movement that occurred between the two frames.
cvSub(gray, prev, diff, NULL);                      /* subtract the previous frame from the current one */
cvThreshold(diff, diff, 5, 255, CV_THRESH_BINARY);  /* binarize: keep pixels that changed by more than 5 */

To remove noise and produce fewer, larger connected components, a 3x3 star-shaped convolution kernel is passed over the binary image in a morphological opening operation.
IplConvKernel* kernel;
kernel = cvCreateStructuringElementEx(3, 3, 1, 1, CV_SHAPE_CROSS, NULL);
cvMorphologyEx(diff, diff, NULL, kernel, CV_MOP_OPEN, 1);

Connected component labeling is applied next to obtain the number of connected components in the difference image.
CvSeq* comp;
int nc = cvFindContours(
    diff,                   /* the difference image               */
    storage,                /* created with cvCreateMemStorage()  */
    &comp,                  /* output: connected components       */
    sizeof(CvContour),
    CV_RETR_CCOMP,
    CV_CHAIN_APPROX_SIMPLE,
    cvPoint(0, 0)
);

The function above returns the connected components in comp, as well as the number of connected components nc. At this point, we have to determine whether the components are an eye pair or not. We'll use experimentally derived heuristics for this, based on the width, height, vertical distance, and horizontal distance of the components. To keep things simple, we only proceed if the number of connected components is exactly 2. The rules applied to decide whether the connected components form an eye pair are (see the sketch below):
1. The widths of the components are about the same.
2. The heights of the components are about the same.
3. The vertical distance between them is small.
4. The horizontal distance between them is reasonable, given the components' width.
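A minimal sketch of such a filter is shown below. It assumes the two bounding rectangles r1 and r2 have been obtained with cvBoundingRect() from the connected components; the numeric tolerances are illustrative assumptions, not the experimentally derived values used by the system.

#include <stdlib.h>      /* abs() */
#include <opencv/cv.h>   /* CvRect */

/* Hypothetical helper: returns 1 if the two component rectangles plausibly
   form an eye pair according to the four rules above. Tolerances are assumed. */
int is_eye_pair(CvRect r1, CvRect r2)
{
    int dx = abs((r1.x + r1.width / 2)  - (r2.x + r2.width / 2));
    int dy = abs((r1.y + r1.height / 2) - (r2.y + r2.height / 2));

    return abs(r1.width  - r2.width)  <= 5            /* 1: similar widths   */
        && abs(r1.height - r2.height) <= 5            /* 2: similar heights  */
        && dy <= 5                                    /* 3: nearly level     */
        && dx >= 2 * r1.width && dx <= 5 * r1.width;  /* 4: plausible gap    */
}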

If the components successfully pass the filter above, the system will continue with the online template creation.

Online Template Creation


After the connected components have passed the heuristics filter above, the system obtains the bounding rectangle of the first connected component, rect_eye. It is used to extract a portion of the current frame as the eye template.
cvWaitKey(250);                  /* short delay so the user's eyes are open again */
cvSetImageROI(gray, rect_eye);   /* restrict the frame to the detected eye region */
cvCopy(gray, tpl, NULL);         /* copy that region into the online template tpl */
cvResetImageROI(gray);

Note that we add a short delay before creating the template. That's because what we need is an open-eye template: since the user's eyes are still closed at the heuristics filtering step above, we wait a moment for the user to open their eyes.

Eye Tracking
Having the eye template and the live video feed from the camera, the system tries to locate the user's eye in the subsequent frames using template matching. The search is limited to a small search window, since searching the whole image would consume an excessive amount of CPU resources.
/* get the centroid of the eye */
point = cvPoint(rect_eye.x + rect_eye.width / 2,
                rect_eye.y + rect_eye.height / 2);

/* set up the search window */
window = cvRect(point.x - WIN_WIDTH / 2,
                point.y - WIN_HEIGHT / 2,
                WIN_WIDTH, WIN_HEIGHT);

/* locate the eye with template matching */
cvSetImageROI(gray, window);
cvMatchTemplate(gray, tpl, res, CV_TM_SQDIFF_NORMED);
cvMinMaxLoc(res, &minval, &maxval, &minloc, &maxloc, 0);
cvResetImageROI(gray);

The location of the best match is available in minloc (with CV_TM_SQDIFF_NORMED, the best match is the minimum). It is used to draw a rectangle in the displayed frame to label the object being tracked, as in the sketch below.
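A possible sketch of that labelling step follows. It assumes frame is the colour frame being displayed and TPL_WIDTH/TPL_HEIGHT are the template dimensions (these names are not taken from the original code). Because minloc is reported relative to the search-window ROI, the window offset is added back before drawing.

/* update the eye rectangle with the best-match position (frame coordinates) */
rect_eye = cvRect(window.x + minloc.x, window.y + minloc.y,
                  TPL_WIDTH, TPL_HEIGHT);

/* draw a rectangle around the tracked eye in the displayed frame */
cvRectangle(frame,
            cvPoint(rect_eye.x, rect_eye.y),
            cvPoint(rect_eye.x + rect_eye.width, rect_eye.y + rect_eye.height),
            CV_RGB(255, 0, 0), 1, 8, 0);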

Blink Detection
In addition to locating the user's eye, the system also detects the user's blinks. In the algorithm from the paper above, blinks are detected by analyzing the correlation score from the eye-tracking stage: in theory, when the user blinks, the similarity to the open-eye template decreases. While this is true in most cases, I have found that it is only reliable if the user does not make any significant head movements; if the user moves his head, the correlation score also decreases even if he does not blink. In this system I use motion analysis to detect eye blinks, just as in the initialization stage above, only this time the detection is limited to a small search window, the same window that is used to locate the user's eye.
/* motion analysis; cvSetImageROI has been applied to the images below */
cvSub(gray, prev, diff, NULL);
cvThreshold(diff, diff, 5, 255, CV_THRESH_BINARY);
cvMorphologyEx(diff, diff, NULL, kernel, CV_MOP_OPEN, 1);

/* detect eye blink */
nc = cvFindContours(diff, storage, &comp, sizeof(CvContour),
                    CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE, cvPoint(0, 0));

cvFindContours returns the connected components in comp and the number of connected components nc. To determine whether a motion is an eye blink or not, we apply the following rules to the connected component:

1. There is only one connected component.
2. The component is located at the centroid of the user's eye.

Note that we require only one connected component, while a blink would normally yield two connected components. That's because we perform the motion analysis in a small search window that fits only one eye. A check along these lines is sketched below.
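This is only a hedged sketch of the two rules, not the original code: rect is assumed to be the bounding rectangle of the single connected component (from cvBoundingRect), expressed in search-window coordinates, and the tolerance around the eye centroid is an assumed value.

/* Hypothetical helper: returns 1 if the motion in the search window looks
   like a blink of the tracked eye. Tolerances are illustrative assumptions.
   abs() comes from <stdlib.h>. */
int is_blink(int nc, CvRect rect, CvRect window, CvPoint eye_centre)
{
    int cx = window.x + rect.x + rect.width  / 2;   /* component centre,  */
    int cy = window.y + rect.y + rect.height / 2;   /* frame coordinates  */

    return nc == 1                                  /* only one component          */
        && abs(cx - eye_centre.x) <= rect.width     /* close to the eye centroid   */
        && abs(cy - eye_centre.y) <= rect.height;   /* in both axes                */
}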

The programme has been initialized.

The eye is detected using face detection and eye tracking. The left picture shows the difference between the frames taken.

A blink is detected and the text "blink" is displayed. The blink rate is shown on the display; if it drops, the system automatically gives a sound alert. A minimal sketch of this rate check is given below.
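The rate monitoring itself is not shown in the snippets above; the sketch below illustrates one way it could be done, counting blinks over a one-minute window with OpenCV's tick counter and printing a terminal-bell warning when the rate falls below an assumed threshold of 10 blinks per minute. The threshold, the function name, and the use of printf are illustrative assumptions, not the original implementation.

#include <stdio.h>

#define LOW_BLINK_RATE 10            /* assumed blinks-per-minute alert threshold */

static int    blink_count  = 0;
static double window_start = 0.0;    /* start of the current one-minute window, in seconds */

/* Hypothetical helper: call once per frame with blink_detected = 1 when a blink
   was found. Emits a bell and a warning if the per-minute rate is too low. */
void update_blink_rate(int blink_detected)
{
    double now = (double)cvGetTickCount() / (cvGetTickFrequency() * 1e6);

    if (window_start == 0.0)
        window_start = now;          /* first call: open the window */
    if (blink_detected)
        blink_count++;

    if (now - window_start >= 60.0) {                /* one minute has elapsed */
        if (blink_count < LOW_BLINK_RATE)
            printf("\aLow blink rate: %d blinks in the last minute\n", blink_count);
        blink_count  = 0;
        window_start = now;
    }
}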

References
[1] M. Betke, J. Gips, and P. Fleming. The camera mouse: Visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10(1), pages 1-10, March 2002.
[2] M. Betke, W. Mullally, and J. Magee. Active detection of eye scleras in real time. Proceedings of the IEEE CVPR Workshop on Human Modeling, Analysis and Synthesis (HMAS 2000), Hilton Head Island, SC, June 2000.
[3] T.N. Bhaskar, F.T. Keat, S. Ranganath, and Y.V. Venkatesh. Blink detection and eye tracking for eye localization. Proceedings of the Conference on Convergent Technologies for Asia-Pacific Region (TENCON 2003), pages 821-824, Bangalore, India, October 15-17, 2003.
[4] R.L. Cloud, M. Betke, and J. Gips. Experiments with a camera-based human-computer interface system. Proceedings of the 7th ERCIM Workshop, User Interfaces For All (UI4ALL 2002), pages 103-110, Paris, France, October 2002.
[5] S. Crampton and M. Betke. Counting fingers in real time: A webcam-based human-computer interface with game applications. Proceedings of the Conference on Universal Access in Human-Computer Interaction (affiliated with HCI International 2003), pages 1357-1361, Crete, Greece, June 2003.
[6] C. Fagiani, M. Betke, and J. Gips. Evaluation of tracking methods for human-computer interaction. Proceedings of the IEEE Workshop on Applications in Computer Vision (WACV 2002), pages 121-126, Orlando, Florida, December 2002.
[7] D.O. Gorodnichy. On importance of nose for face tracking. Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition (FG 2002), pages 188-196, Washington, D.C., May 20-21, 2002.
[8] D.O. Gorodnichy. Second order change detection, and its application to blink-controlled perceptual interfaces. Proceedings of the IASTED Conference on Visualization, Imaging and Image Processing (VIIP 2003), pages 140-145, Benalmadena, Spain, September 8-10, 2003.
[9] D.O. Gorodnichy. Towards automatic retrieval of blink-based lexicon for persons suffered from brain-stem injury using video cameras. Proceedings of the CVPR Workshop on Face Processing in Video (FPIV 2004), Washington, D.C., June 28, 2004.
[10] D.O. Gorodnichy and G. Roth. Nouse 'use your nose as a mouse' perceptual vision technology for hands-free games and interfaces. Proceedings of the International Conference on Vision Interface (VI 2002), Calgary, Canada, May 27-29, 2002.
[11] K. Grauman, M. Betke, J. Gips, and G. Bradski. Communication via eye blinks - detection and duration analysis in real time. Proceedings of the IEEE Computer Vision and Pattern Recognition Conference (CVPR 2001), Vol. 2, pages 1010-1017, Kauai, Hawaii, December 2001.
[12] K. Grauman, M. Betke, J. Lombardi, J. Gips, and G. Bradski. Communication via eye blinks and eyebrow raises: Video-based human-computer interfaces. Universal Access In The Information Society, 2(4), pages 359-373, November 2003.
[13] Intel Image Processing Library (IPL). http://developer.intel.com/software/products/perflib/ijl.
[14] R. Jain, R. Kasturi, and B.G. Schunck. Machine Vision. McGraw-Hill, New York, 1995.
[15] J. Lombardi and M. Betke. A camera-based eyebrow tracker for hands-free computer control via a binary switch. Proceedings of the 7th ERCIM Workshop, User Interfaces For All (UI4ALL 2002), pages 199-200, Paris, France, October 2002.
[16] J.J. Magee, M.R. Scott, B.N. Waber, and M. Betke. EyeKeys: A real-time vision interface based on gaze detection from a low-grade video camera. Proceedings of the IEEE Workshop on Real-Time Vision for Human-Computer Interaction (RTV4HCI), Washington, D.C., July 2004.
[17] OpenCV Library, for sharing and helping us to code the programme.
[18] M.A. Miglietta, G. Bochicchio, and T.M. Scalea. Computer-assisted communication for critically ill patients: a pilot study. The Journal of Trauma: Injury, Infection, and Critical Care, Vol. 57, pages 488-493, September 2004.
[19] T. Moriyama, T. Kanade, J.F. Cohn, J. Xiao, Z. Ambadar, J. Gao, and H. Imamura. Automatic recognition of eye blinking in spontaneously occurring behavior. Proceedings of the International Conference on Pattern Recognition (ICPR 2002), Vol. IV, pages 78-81, Quebec City, Canada, 2002.
[20] OpenCV library. http://sourceforge.net/projects/opencvlibrary.
[21] J. Rurainsky and P. Eisert. Eye center localization using adaptive templates. Proceedings of the CVPR Workshop on Face Processing in Video (FPIV 2004), Washington, D.C., June 28, 2004.
