Controlling Projected Computer System Using Hand Gesture Computer Science Essay


Lately, the IT industry has recognized the importance of the man-machine interface, and emphasis has been placed on Human-Computer Interaction (HCI). One way of achieving natural communication between humans and machines is through gestures. Existing developments in this field include electro-mechanical or magnetic sensing devices, such as data gloves [1], which are effective tools for capturing hand gestures. Vision-based hand gesture recognition [2] is a promising alternative to these techniques because of its potential to provide more natural, non-contact interaction.

In today's IT industry, business presentations and seminars are a daily routine. During presentations it is tedious to use pointing devices (lasers, etc.) or computer peripherals (keyboard, mouse, etc.). The traditional method of projecting the computer screen and controlling events with a keyboard and mouse wastes time, since the presenter has to walk back to the computer for every change. Another option is to rely on an assistant, which may not be possible every time.

Although many gesture recognition systems are available in the market, most of them are problem-specific and not cost-effective.

So, in this paper we present a solution to control an entire computer system, projected on a plane surface, without using any accessories such as data gloves, lasers or LED sticks. It makes use of hand gesture recognition techniques to control the computer; the user interacts with the system through various hand gestures.


I. Colour band based hand tracking system:

One of the existing technologies is the Colour band Hand Tracking System (CbHTS) [3]. It works slightly differently from a traditional mouse: the user wears bands of different colours on the fingers. CbHTS scans a specific zone for the colour bands; the dimensions of the scan zone are determined by applying fuzzy logic to the cursor speed in the previous image. One fingertip is treated as the mouse cursor, a single click is defined as the contact of two fingers, and a double click as two consecutive single clicks. CbHTS computes the centre of each finger band and then judges clicks by analysing the time between consecutive clicks. However, in this method the colours of the bands and of the background play a vital role: the camera and algorithm must differentiate band and background colours precisely, the surroundings must not contain the band colours, and the band colours must differ noticeably in their RGB levels. Also, only predefined colours can be used in this type of tracking system.
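The colour-matching step described above can be sketched as classifying each pixel by its distance to a predefined band colour in RGB space. The band colours and the distance threshold below are illustrative assumptions, not values taken from CbHTS:

```java
// Sketch: classify a pixel as belonging to a predefined colour band by
// its Euclidean distance in RGB space. Band colours and the threshold
// are illustrative values, not taken from CbHTS itself.
class BandClassifier {
    // Predefined band colours as {r, g, b} (hypothetical values).
    static final int[][] BAND_COLOURS = { {255, 0, 0}, {0, 0, 255} };
    static final int THRESHOLD = 80; // max RGB distance to count as a match

    /** Returns the index of the matching band colour, or -1 for background. */
    static int classify(int r, int g, int b) {
        for (int i = 0; i < BAND_COLOURS.length; i++) {
            int dr = r - BAND_COLOURS[i][0];
            int dg = g - BAND_COLOURS[i][1];
            int db = b - BAND_COLOURS[i][2];
            if (Math.sqrt(dr * dr + dg * dg + db * db) <= THRESHOLD) {
                return i;
            }
        }
        return -1; // no band colour matched
    }
}
```

This simple distance test also illustrates the limitation noted above: a background pixel whose colour falls within the threshold of a band colour would be misclassified.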

II. Computer control using LED stick:

This is another technology used for gesture recognition in front of a projector. A stick with an LED attached to its tip is moved in front of the projected image; the camera detects the glowing LED and moves the cursor to the corresponding location on the screen. The stick has two buttons, one for left-click and another for right-click; when these buttons are pressed, the respective actions are performed at the location pointed to by the stick. This technique is expensive to implement, since it requires a number of hardware components, and having to carry a stick at all times is inconvenient. The colour of the LED also matters in this vision-based tracking system, as in some cases it may not be detected against a similarly coloured background.
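The LED-detection step can be sketched as locating the brightest pixel in the camera frame. Representing the frame as a row-major array of grayscale values is an assumption for illustration:

```java
// Sketch: locate a glowing LED in a grayscale frame by finding the
// brightest pixel. The int[][] frame layout (0-255 values, row-major)
// is an illustrative assumption.
class LedFinder {
    /** Returns {row, col} of the brightest pixel in the frame. */
    static int[] brightest(int[][] frame) {
        int bestR = 0, bestC = 0;
        for (int r = 0; r < frame.length; r++) {
            for (int c = 0; c < frame[r].length; c++) {
                if (frame[r][c] > frame[bestR][bestC]) {
                    bestR = r;
                    bestC = c;
                }
            }
        }
        return new int[] { bestR, bestC };
    }
}
```

A brightest-pixel search like this fails exactly in the case mentioned above: when the background contains regions as bright as the LED, the maximum is no longer the LED tip.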

III. Kinect based hand tracking:

Kinect [4] is a 3-D camera with high depth-sensing capability, and it has given birth to various exciting systems in human-machine interaction. One application is playing games with hand motion, where a person can play computer games without physically touching a console. With Kinect, processing is fast and efficiency is high. However, it is too costly for general-purpose use, and its size may not fit the desired location or mounting.


The proposed system will handle computer events using hand gestures only. The application will be cost-effective, since it requires only a normal web camera to capture runtime changes in front of the projected screen. The camera is placed so that it captures the complete projected image. While the user interacts with the computer system through hand gestures, the web camera captures images at regular intervals, but only when it detects movement. The captured images are processed with various image processing algorithms to recognize the hand gestures. Finally, the hand gesture is mapped to mouse movements to invoke computer events (single click, double click, etc.) at the intended screen position.


Projected Screen Detection.

When the application starts, it first captures and detects the projected screen using web camera.

It then transforms the image into a perfect rectangle, if necessary.

Gesture Detection.

Hand is detected in the captured image.

After that, the application recognizes hand gestures, if any.

Invoking an event.

Intended position for the event is mapped on desktop.

According to the recognized hand gesture, corresponding computer event is invoked.

Fig. 1 System flowchart.


A. Projected Screen Detection

In this method, a completely white image (Fig. 2A) and a completely black image (Fig. 2B) are first projected and captured. The two images are then subtracted [5] to obtain the projected-screen region (Fig. 2C).



Fig. 2 (A) White projected screen

(B) Black projected screen

(C) Subtraction of (A) and (B)
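The subtraction step can be sketched as a per-pixel difference with a threshold; pixels that brighten noticeably between the black and white projections belong to the screen region. The grayscale int[][] representation and the threshold value are illustrative assumptions:

```java
// Sketch: subtract the captured black-screen image from the captured
// white-screen image; pixels whose difference exceeds a threshold are
// taken to lie inside the projected region. Threshold is illustrative.
class ScreenDetector {
    /** Returns a mask that is true where the projection is visible. */
    static boolean[][] projectedMask(int[][] white, int[][] black, int threshold) {
        boolean[][] mask = new boolean[white.length][white[0].length];
        for (int r = 0; r < white.length; r++) {
            for (int c = 0; c < white[0].length; c++) {
                // Outside the projection, the two captures look the same.
                mask[r][c] = Math.abs(white[r][c] - black[r][c]) > threshold;
            }
        }
        return mask;
    }
}
```

The bounding corners of the resulting mask then serve as the screen coordinates used in the next step.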

B. Perspective Transformation

Using the screen coordinates found previously, a perspective transformation [6] is applied to obtain a perfect rectangle, as shown in Fig. 3.


Fig. 3 Projected screen obtained after applying Perspective Transformation
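The core of the perspective transformation is applying a 3x3 homography matrix to each point. In the full system the matrix would be estimated from the four detected screen corners (for example with OpenCV's getPerspectiveTransform via JavaCV); the sketch below assumes the matrix is already known and only shows the point mapping:

```java
// Sketch: map a point (x, y) through a 3x3 homography H. Estimating H
// from the four screen corners is assumed to have been done already.
class Homography {
    /** Applies H to (x, y) and returns {x', y'}. */
    static double[] apply(double[][] h, double x, double y) {
        double xp = h[0][0] * x + h[0][1] * y + h[0][2];
        double yp = h[1][0] * x + h[1][1] * y + h[1][2];
        double w  = h[2][0] * x + h[2][1] * y + h[2][2];
        return new double[] { xp / w, yp / w }; // homogeneous divide
    }
}
```

Warping the whole captured image amounts to applying this mapping (or its inverse) to every pixel position.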

C. Hand Detection

The hand can be detected by tracking the moving object (the human body) in front of the projected screen. The hand itself is then detected using trained classifiers [8].
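The motion-tracking part can be sketched as frame differencing: comparing two consecutive grayscale frames and taking the bounding box of the changed pixels as the region in which to run the trained hand classifier. The frame representation and threshold are illustrative assumptions:

```java
// Sketch: find the bounding box of motion between two consecutive
// grayscale frames by per-pixel differencing. The resulting box is the
// region in which the hand classifier would be run. Threshold is
// illustrative.
class MotionDetector {
    /** Returns {minRow, minCol, maxRow, maxCol} of changed pixels, or null if none. */
    static int[] movingRegion(int[][] prev, int[][] curr, int threshold) {
        int minR = Integer.MAX_VALUE, minC = Integer.MAX_VALUE;
        int maxR = -1, maxC = -1;
        for (int r = 0; r < curr.length; r++) {
            for (int c = 0; c < curr[0].length; c++) {
                if (Math.abs(curr[r][c] - prev[r][c]) > threshold) {
                    minR = Math.min(minR, r); minC = Math.min(minC, c);
                    maxR = Math.max(maxR, r); maxC = Math.max(maxC, c);
                }
            }
        }
        return maxR < 0 ? null : new int[] { minR, minC, maxR, maxC };
    }
}
```

Returning null when no pixel changed matches the system behaviour described earlier: frames are only processed when movement is detected.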

D. Gesture Recognition

To identify gestures, we first find the fingertips (Fig. 4A); gestures are then recognized [9]. For example, the index finger and thumb are used to move the cursor, as shown in Fig. 4B.


Fig. 4 Gesture recognition.

A. Gestures are recognized from finger tips

B. Thumb and index finger are used to move the cursor
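Once fingertips are located, mapping their count to a gesture can be sketched as a simple lookup. The specific count-to-gesture assignment below is an illustrative assumption, not the paper's exact mapping:

```java
// Sketch: map the number of detected fingertips to a gesture. The
// particular assignments here are hypothetical examples.
class GestureMapper {
    enum Gesture { MOVE_CURSOR, SINGLE_CLICK, DOUBLE_CLICK, NONE }

    static Gesture fromFingertips(int fingertipCount) {
        switch (fingertipCount) {
            case 2:  return Gesture.MOVE_CURSOR;  // thumb + index finger
            case 1:  return Gesture.SINGLE_CLICK;
            case 5:  return Gesture.DOUBLE_CLICK; // open palm
            default: return Gesture.NONE;         // unrecognized pose
        }
    }
}
```

In practice the recognizer would also use fingertip positions and timing, not the count alone, to separate similar gestures.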

E. Invoking Computer Event

The final stage is to invoke the computer event corresponding to the hand gesture. Once the gesture is recognized, the hand position (which acts as the cursor) is mapped to the actual screen coordinates. The corresponding computer event, such as a single click, double click or drag, is then invoked at that position.
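The coordinate mapping can be sketched as scaling the hand position from the rectified camera image into desktop resolution; the event itself could then be fired with the standard java.awt.Robot class (mouseMove, then mousePress/mouseRelease with InputEvent.BUTTON1_MASK on JDK 6). The resolutions below are example values:

```java
// Sketch: scale a hand position from rectified-camera coordinates to
// desktop coordinates. Camera and screen resolutions are examples; the
// actual click would be issued separately via java.awt.Robot.
class CursorMapper {
    /** Scales (camX, camY) from camera space into screen space. */
    static int[] toScreen(int camX, int camY,
                          int camW, int camH,
                          int screenW, int screenH) {
        int sx = camX * screenW / camW;
        int sy = camY * screenH / camH;
        return new int[] { sx, sy };
    }
}
```

Because the perspective transformation already rectified the projection, a plain linear scale like this suffices; without that step the mapping would need the homography from the previous section.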


Operating requirements for the system are listed below:

A. User Interfaces

Frontend software : Developed GUI Application software.

System type : Stand-alone system.

B. Hardware Interfaces

RAM : 1 GB (min.)

Web Camera : 8 MP (min.)

Other : Projector (long throw)

C. Software Interfaces

Operating System : Windows XP/7/8, Linux

Framework : JavaCV (OpenCV wrapper for Java)

Programming Language : Java (JDK 1.6)

Database : XML, file system.

Others : Windows APIs


This software can be used to give presentations: one can change slides and play custom animations by merely standing beside the projected screen and making the corresponding gestures.

Using this system, the user can control the computer through gestures alone: clicking or double-clicking icons, scrolling windows, and much more.

The system enables the user to be more creative and expressive, since during presentations it effectively turns the projection into a touch screen.

Compared to other similar solutions, this system is cheaper and more general-purpose.