Mobile Eye Based Human Computer Interaction


Eye based human computer interaction techniques have thus far been limited to closed environments. However, advances in technology and the advent of mobile computing promise to bring eye based interaction to wearable and handheld mobile devices. This paper investigates recent methods for tracking eye movements in the mobile environment. The methods are evaluated by comparing the techniques they rely on, the type of work, and their use of theoretical proofs and simulations.

Mobile computing is the modern way of interacting with digital objects and content. Traditional computing devices such as desktop and laptop computers may soon be replaced by their more portable counterparts, smartphones and tablet computers. As mobile computing techniques evolve, users will be able to interact with their mobile and stationary devices in a variety of ways: through touch, eye movements, voice commands, physical gestures, and the emerging approaches based on bioelectrical signal detection, a few of which are already available in today's commercial products. These different modalities are collectively called Multimodal Human Computer Interaction (MMHCI) [10] techniques; they facilitate richer interaction, greater usability, and the ability to retrieve data remotely with little physical exertion.

Systems providing MMHCI techniques require advanced interactive visual displays, video cameras, microphones and other biometric sensors, providing input data from daily-life settings. These futuristic systems rely on powerful processors to compute data from multiple sources in real time while optimising battery usage. Current MMHCI research largely follows a differential approach, in which each interaction technique is treated as a separate modality; integrating these modalities would yield a highly sophisticated cross-platform device. A large number of applications and services can be built for such devices by combining them with the existing mobile telephone network infrastructure.

Amongst the different modalities, Eye based Human Computer Interaction is one of the more complex and interesting techniques; it has been part of HCI research for over two decades [7] and has made steady progress. Eye tracking involves recording and measuring the user's eye movements to determine the point of gaze, and to record the sequence of eye movements across different sources of information. Eye tracking data can be used effectively for interface evaluation, aiding the design of better interfaces, and as a control signal for interacting with interfaces [14] directly, with little or no use of traditional input devices. Research on eye tracking has also explored gaze gestures [5] alongside eye tracking to provide more accurate tracking results.

Eye movements have been shown to reflect cognitive processes [2], which has strongly motivated HCI researchers to devise mechanisms for gathering and investigating eye movement data. Eye based techniques also have the advantage of being an unobtrusive and natural way of interacting with machines, and hence form an important type of input mechanism.

In recent years, multimodal interaction techniques have increasingly been used in portable electronic devices to enhance the overall user experience. Visual interaction with digital objects is becoming more important with the ongoing convergence of phone, computer, television and Internet. This has led to a significant amount of research activity on Eye-based Mobile Interaction and interactive mobile phones [8, 11, 12, 13, 20, 26].

1.1 Problem definition

This paper presents a survey exploring the power and possibilities of Mobile Eye based Human Computer Interaction techniques, and studies the design issues and challenges in the area. A qualitative analysis is made of different Mobile Eye based Human Computer Interaction systems to identify their strengths and weaknesses, and a proposal is made on the possible future trends of these systems.

The main goals of this study are to explore recent research in Mobile Eye based Human Computer Interaction, to perform a feasibility study on recognising human activity through eye movement analysis, to gather best practices in the design and development of these innovative systems, and to establish a base for future research.

The main problem statements in this paper can be outlined as follows:

Identify Mobile Eye based Human Computer Interaction techniques.

Organize and analyze the approaches used for building Mobile Eye based Human Computer Interaction systems.

Define the strengths and weaknesses of each system.

Infer enhancements and enrichment that could be added to the systems.

1.2 Motivation

The main reason for working on the topic of Mobile Eye based Human Computer Interaction was to study an interaction technique that enables natural and comfortable interaction with machines, not only for able-bodied individuals but also for people with physical disabilities; it was also an opportunity to learn about some innovative applications in mobile computing and HCI. I was particularly inspired by the research project NETRA: Interactive Display for Estimating Refractive Errors and Focal Range [21], a truly innovative, low-cost system developed at MIT that measures the eye's focusing ability using a mobile phone. Though not a typical eye tracking system, it uses the eye as input for a mobile application with the help of a small hardware attachment and an ordinary touch screen mobile phone, an innovative approach indeed.

Eye tracking products are already available in the market. Popular commercial ones include Mobile Eye by Applied Science Laboratories [15], iView X HED by SensoMotoric Instruments [16], and Tobii Technology's eye tracking and eye control products [17]; there are also several products from open source projects, such as openEyes [18], Opengazer [19] and ITU Gaze Tracker [20]. They differ not only in architecture but also in sensing, tracking and usability. In this paper I closely examine several research efforts on Mobile Eye based Human Computer Interaction, giving me a chance to gain in-depth knowledge of an emerging technology. With the rapid advancement of technology, Eye based Human Computer Interaction may soon become part of everyday life.

1.3 Report organization

This paper analyzes recent Mobile Eye based Human Computer Interaction systems and conceives ideas for possible futuristic applications. The report is structured as follows:

Chapter 2 gives an overview of previous research on eye tracking. It defines the concept and introduces the terminology, components and challenges in this area.

Chapter 3 evaluates the techniques from the literature survey, describing their architecture, components and information flow, and analysing their strengths and weaknesses.

Chapter 4 addresses future trends in mobile eye based human computer interaction.

Finally, Chapter 5 provides conclusions and comments from the study.

Chapter 2

Previous research on Eye based HCI

Eye movement recognition and tracking is an interesting area of research in HCI, psychology and cognitive linguistics, as eye movements have been found to reflect not only attention but also cognitive processes [2]. Eye tracking has been used in reading research for over 100 years [3]. Rayner (1998) [27, 28] provides a useful classification of the history of eye tracking, dividing it into three eras spanning approximately 1879 to 1998. In the first era, basic facts about eye movements were discovered; in the second, behavioural aspects of eye movements were mapped to experimental psychology; and in the third (1970 onwards), recording and measurement of eye movements grew significantly. In recent years eye tracking has attracted significant interest in user interface design [29] and simulation; the growing body of research appears predominantly in the Eye Tracking Research and Applications (ETRA) symposium series sponsored by the Association for Computing Machinery (ACM).

I identified three approaches used for eye tracking to date: 1) invasive - wearable contact lenses with an attached metal coil that measures electromagnetic fluctuations [31]; 2) non-invasive - video based eye tracking and infrared reflection detection; 3) biometric sensors - electrodes placed around the eye pick up electro-oculography (EOG) signals, from which gaze direction can be estimated. The invasive approach has largely been abandoned, while the non-invasive and biometric sensor approaches have been extensively experimented with.

2.1 Terminologies

The eyes are always in motion, collecting data from a wide range of sources around us. Eye movements are described by standard terminology, each term interpreted differently depending on the duration of vision and the interface used for tracking. Some basic terms are listed below; a small classification sketch follows the list.

Gaze - Direction of vision.

Saccades - Simultaneous movement of both eyes in the same direction.

Fixation - A short pause in eye movement to observe information of interest.

Scanpath - The sequence of saccades and fixations (e.g. during straight-line reading).
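As an illustration of how these terms are applied in practice, the sketch below classifies raw gaze samples into fixations and saccades using a simple velocity threshold, in the style of the widely used I-VT algorithm. The algorithm choice, the threshold value and the function names are not from this paper; they are assumptions for illustration only.

```python
import numpy as np

# Velocity-threshold classification (I-VT style): samples moving
# slower than the threshold belong to fixations, faster movements
# are saccades, and the resulting sequence forms the scanpath.
VELOCITY_THRESHOLD = 100.0   # degrees/second, illustrative value

def classify_samples(gaze_deg, sample_rate_hz):
    """gaze_deg: (N, 2) array of gaze angles in degrees; returns one
    label ('fixation' or 'saccade') per inter-sample interval."""
    # Angular distance between consecutive samples, scaled to deg/s.
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * sample_rate_hz
    return ["saccade" if v > VELOCITY_THRESHOLD else "fixation"
            for v in velocity]

samples = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 2.0], [5.1, 2.0]])
print(classify_samples(samples, sample_rate_hz=60))
```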

2.2 Research Challenges

Eye tracking research faced multiple challenges in the past, when eye tracking systems were bulky, unreliable and time consuming. Advances in technology have eliminated some of these limitations and made eye tracking systems portable and stable. But a large number of challenges still exist, beyond the constant demand for lower cost, increased power and miniaturisation.

Eye tracking systems have some basic issues: users with eyeglasses or contact lenses block the normal path of reflection, an open issue for prototypes that depend on reflections from the eye, and systems have difficulty identifying users with large pupils or drooping eyelids. Eyes are sensitive organs, so any operation involving the eyes as input should be natural, intuitive and ergonomic.

Eye tracking systems also face more complex issues. The system architecture should be non-intrusive, never blocking the user's normal vision. Systems should incorporate accessory-free, wireless gesture recognition. Gestures should be grounded in concrete context and standardised across devices. Designing an eye tracking interface is a challenging task, and a system should follow some important guidelines: 1) it should not trigger events for normal eye movements (the "Midas touch" problem); 2) cultural issues must be considered when designing gestures; 3) operations should be properly sequenced; and 4) the system should be context aware [32]. Alongside these challenges there are also plenty of opportunities.

Chapter 3

Mobile Eye based HCI: Literature Survey

The dynamism of modern life demands devices that can perform complex operations yet be carried in a pocket. Eye tracking is highly challenging in stationary settings and even more so in mobile settings, because it involves constant monitoring of rapid eye movements. A variety of solutions have been proposed to track eye movements and cope with the unstable mobile environment, and these solutions have previously been classified into different categories. Andrew Duchowski [33] provides one such classification from a system-analysis point of view, dividing systems into interactive and diagnostic, and subdividing them by the type of interface used, such as screen-based or model-based.

I categorized seven recent research papers on mobile eye tracking from a hardware point of view, as wearable or handheld mobile device based approaches, and in the process determined the eye tracking technology best suited to the mobile environment. This section gives an overview of the different architectural approaches used to build eye tracking applications, with emphasis on research direction, technology, and results from theoretical proofs or simulations.

Figure 1: Organization of eye tracking applications. The seven mobile eye tracking systems surveyed divide into wearable systems (4) and handheld mobile device based systems (3).

3.1 Wearable systems

Wearable computing systems have primarily been used in medicine and the behavioural sciences; recent technological advances and the growing demand for portable communication and entertainment have produced multiple full-time wearable devices, such as audio players, watches and mobile phones with innovative features. Anywhere, any-time accessibility makes wearable computing systems attractive. Wearable gesture-identification systems could revolutionise the way we interact with everyday objects; going further, wearable eye tracking or eye gesture recognition could provide a more private and intuitive way of interacting with digital objects. Unlike some similar systems based on hand gestures or voice commands, these systems do not get in the way of others during use.

Wearable eye tracking solutions bring their own set of complexities. Common issues include: 1) positioning of the sensors - sensors should be neither bulky nor in the user's normal view; 2) changes in ambient light - the system should adjust to changing lighting conditions; 3) recalibration - the system should not require constant recalibration for changing environments and users; and 4) computational requirements should be minimal. Recent applications have overcome these issues to some extent with innovative solutions such as corneal limbus tracking, the corneal reflex method and electro-oculography (EOG). This section reviews recent articles dealing with such solutions.

3.1.1 Review of Aided Eyes: Eye Activity Sensing for Daily Life [21]

Yoshio Ishiguro and his colleagues have proposed a prototype named Aided Eyes [21], which performs real-time tracking and recording of eye movement to provide cognitive assistance to users. The system creates a lifelog (visual recording) of the wearer's daily activities, which can serve as an aid to human memory. The prototype consists of small phototransistors, infrared LEDs and a video camera mounted on wearable goggles. An innovative eye tracking technique called limbus tracking is used, which measures eye activity via the infrared light reflected by the eye. The authors' main objective was to develop a video lifelog system enriched with gaze information.

The prototype is composed of four phototransistors, two infrared LEDs and a small camera, all mounted on goggles. An amplifier and analog/digital (AD) converter handle signal conversion, and a micro-processing unit (a 16 MHz Atmel MPU) handles computation. The phototransistors capture infrared light reflected from the eye surface; these sensor values are passed to the AD converter and then to the MPU. The camera captures the surroundings as seen by the user; it is not used to track the eye itself. Before use, the system is calibrated by having the user focus on a target object on the display while sensor values are collected. Incoming sensor values are compared with this recorded data, and a centre of gravity is calculated to obtain an accurate gaze direction. Lifelog computing works by detecting gazed-at information in the video: the information corresponding to the user's eye movements is extracted and matched for relevance, and only the filtered data is logged to the lifelog database. Real-time object recognition is achieved using SURF [34] and face recognition with Haar-like features from the OpenCV library [34], each logged with time and location; text logging uses the OCR engine "tesseract-ocr" [36], which extracts text from images and logs it with time and location details.
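The centre-of-gravity gaze estimation described above can be illustrated with a small sketch. The sensor layout, function names and the linear weighting below are assumptions for illustration; the paper does not publish its exact algorithm.

```python
import numpy as np

# Hypothetical layout: four phototransistors at fixed positions
# around the eye (normalised x, y coordinates on the goggle frame).
SENSOR_POSITIONS = np.array([
    [-1.0,  0.0],   # left
    [ 1.0,  0.0],   # right
    [ 0.0,  1.0],   # top
    [ 0.0, -1.0],   # bottom
])

def calibrate(samples):
    """Record baseline reflection values while the user fixates a
    known target; 'samples' is an (N, 4) array of sensor readings."""
    return samples.mean(axis=0)

def estimate_gaze(reading, baseline):
    """Estimate gaze direction as the centre of gravity of each
    phototransistor's deviation from its calibrated baseline: as the
    limbus (iris/sclera boundary) moves toward a sensor, the
    reflected infrared intensity at that sensor changes."""
    deviation = np.abs(reading - baseline)
    total = deviation.sum()
    if total < 1e-9:              # no measurable change: centre gaze
        return np.zeros(2)
    weights = deviation / total
    return weights @ SENSOR_POSITIONS   # weighted centre of gravity

baseline = calibrate(np.random.normal(0.5, 0.01, size=(100, 4)))
print(estimate_gaze(np.array([0.7, 0.5, 0.5, 0.5]), baseline))
```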

Results from a case study reveal 99% accuracy in detecting eye blinks over a three-minute span; the system also observed that blink frequency changed with the task, i.e. blinks were slower when users concentrated and faster when they did not. Real-time object recognition was also tested, with users asked to observe each of 100 pre-registered posters in a room; results show the system accurately picked up details of the posters using gaze information. With highly positive experimental results, the authors further improved accuracy and wearability by adding a high-resolution strap-on camera alongside the existing camera on the goggles, recording data at better resolution. Aided Eyes has one drawback: the phototransistors integrated into the goggles block the user's view. This visual obstruction is eliminated with a transmissive sensor system, in which acrylic boards guide the infrared reflections from the eye to a phototransistor placed outside the user's view.

Aided Eyes is a commendable attempt at a wearable human memory enhancement system. The prototype is highly portable and can be carried everywhere. A complete user study would have provided more information, and the authors do not mention the system's power requirements. Most users might not be comfortable wearing goggles, which is a drawback; in the system's favour, environmental factors such as pollution might in future compel people to wear protective eye gear, a scenario in which systems like Aided Eyes could prove successful.

3.1.2 Review of Wireless Head Cap for EOG and Facial EMG Measurements [22]

Antti Vehkaoja and his colleagues propose a wireless head cap for EOG and facial EMG measurement, an interesting lightweight application for capturing eye movements and facial muscle activity. The system consists of a head cap with integrated electrodes embroidered from silver-coated thread and a radio circuit board; a signal receiver connected to a PC records and interprets the signals from the cap. The main objective in developing the system was its use as a tool for human emotion studies and as a controller for computer interfaces.

The system uses five electrodes, located on the front section of the cap, to measure biometric signals from the user's forehead. Adjacent electrode pairs measure EOG and facial electromyography (fEMG) signals, capturing vertical eye movement and facial muscle activity; a non-adjacent electrode pair provides a third channel of data, horizontal eye movement, computed using vector algebra.
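A minimal sketch of how such differential channels might be derived follows. The paper states only that adjacent pairs yield vertical EOG/fEMG and that a non-adjacent pair yields horizontal eye movement; the electrode index pairings below are assumptions for illustration.

```python
import numpy as np

def eog_channels(samples):
    """Derive three channels from five forehead electrodes.
    samples: (N, 5) array, one column per electrode, one row per
    time step. Pairings are illustrative, not from the paper."""
    e = np.asarray(samples, dtype=float)
    vertical_left  = e[:, 0] - e[:, 1]   # adjacent pair, left side
    vertical_right = e[:, 3] - e[:, 4]   # adjacent pair, right side
    horizontal     = e[:, 0] - e[:, 4]   # non-adjacent pair across forehead
    return vertical_left, vertical_right, horizontal
```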

The system was published in 2005, so the hardware used for collecting and transmitting data is relatively old. The authors have not evaluated the system in operation, so performance data is not available. Much of the methodology described in the paper concerns the frequency at which data is transferred to the receiver and the use of an amplifier to measure signals of a few tens of microvolts in amplitude. The authors performed an experimental evaluation but do not clearly explain their results on noise characteristics; the only positive result shown is the good quality of the electrode signals within a maximum transmitter-to-receiver distance of approximately 3 metres.

The wireless head cap prototype for EOG and facial EMG measurement is a novel idea. One of the most challenging aspects of developing personal computing devices is making them wearable; the authors have attempted one such approach and succeeded to a certain extent. They intend to develop the system further with additional measurement capabilities and a graphical interface for real-time use.

3.1.3 Review of Full-time Wearable Headphone-Type Gaze Detector [23]

Hiroyuki Manabe and Masaaki Fukumoto have proposed an eye tracker embedded in a commercial headphone. This approach uses electro-oculography (EOG) based gaze detectors: an array of electrodes attached to the headphone cushions measures the electric field near the wearer's eyes, and gaze direction is estimated using a Kalman filter. EOG signals have been shown to provide gaze accuracy within an angular range of ±50° horizontally and ±30° vertically [24], which motivated the authors to adopt EOG signal measurement for gaze detection.

The prototype uses four headphone electrodes on each side, plus two electrodes for ground and reference. The electrodes measure signals while the wearer's face is fixed on a reference target, detecting both horizontal and vertical components. The EOG value is the differential signal between electrodes; this measurement is modelled as the true signal plus drift and noise, and a Kalman filter is applied to recover an estimate closer to the true signal. Gaze position is obtained by computing the state transition matrix, driving matrix and plant noise.
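To make the signal model concrete, the sketch below runs a small linear Kalman filter over raw EOG samples, treating the measurement as true signal plus slowly drifting offset plus noise. All matrix values and the sample data are illustrative placeholders, not the authors' parameters.

```python
import numpy as np

# State x = [eog, drift]; measurement z = eog + drift + noise.
A = np.array([[1.0, 0.0],     # EOG assumed roughly constant per step
              [0.0, 1.0]])    # drift modelled as a slow random walk
H = np.array([[1.0, 1.0]])    # sensor observes signal plus drift
Q = np.diag([1e-2, 1e-5])     # plant (process) noise: drift moves slowly
R = np.array([[1e-1]])        # measurement noise

def kalman_step(x, P, z):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict
    x = A @ x
    P = A @ P @ A.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for z in [0.12, 0.15, 0.11, 0.35, 0.36]:   # made-up raw EOG samples
    x, P = kalman_step(x, P, np.array([z]))
print("estimated EOG:", x[0], "estimated drift:", x[1])
```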

A total of 16 horizontal and 36 vertical EOGs are calculated, and the system estimates gaze position every 32 microseconds. Measured EOG signals are compared with recorded signals, and gaze position is estimated in three experimental stages: 1) measurement of drift suppression, 2) robustness against electrode disconnection, and 3) accuracy of gaze estimation. Drift here is the difference between the reference and measured signal values from each electrode while the user gazes at a particular target. Experimental results show drift reduced by factors of 2 to 10, accurate estimates even when an electrode is detached from the array, and gaze estimation errors of 4.4° horizontally and 8.3° vertically. Alongside these positive results, the system has some issues, such as a low signal-to-noise ratio (SNR) and poor separation of horizontal and vertical components (because measurements are taken only from the sides). As future work the authors intend to make the system more robust and accurate, with reduced estimation errors.

The headphone-type gaze detector is an innovative form of eye tracking system. It improves on ordinary EOG detection systems but still has issues with blink detection. Unlike systems such as Aided Eyes [21], it does not obscure the wearer's vision, and the huge popularity of digital media players and mobile phones is an added advantage: users could buy an audio accessory that acts not only as an audio device but also as a gaze detection interface. The system could find a wide range of commercial applications, for example as a controller for other devices or as a gaze-driven data synchroniser.

3.1.4 Review of Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments [6]

Andreas Bulling and his colleagues demonstrate an eye tracker embedded in electro-oculography (EOG) goggles, a highly portable, self-contained system for real-time eye motion detection. The system consists of multiple electrodes integrated into the goggle frame and a pocket-sized microcontroller to process the signals. The prototype was developed with clear design goals: portability, unobtrusiveness and real-time signal processing.

The miniature hardware consists of five dry electrodes and a small analogue amplification circuit board integrated into the goggles, plus a processing unit. Four electrodes are arranged around the left eye frame and a fifth above the right eye; the setup also includes a light sensor on the frame between the eyes. The processing unit, called the wearable EOG processing unit (WEPU), is a digital signal processor (DSP) with a 24-bit analog-to-digital converter (ADC); Bluetooth and an MMC module provide data transmission and storage.

To correctly identify eye gestures and distinguish them from normal eye movements, three signal processing steps are performed. First, blinks are detected and removed from the vertical EOG signal. Second, saccades (simultaneous movements of both eyes in the same direction) are detected using the Continuous Wavelet Transform - Saccade Detection (CWT-SD) algorithm, which computes 1-D wavelet coefficients from the measured signals. Third, eye gestures are recognised: the basic left, right, up and down movements are encoded as the symbols L, R, U and D respectively, and diagonal eye movements, which cause saccades in both EOG components, as the symbols 1, 3, 7 and 9. To recognise gestures composed of multiple eye movements, a string matching approach is used in which the concatenated sequence of basic movements is compared with templates representing specific gestures (see the sketch below). The system was also evaluated in stationary settings; the results suggest accurate eye gesture recognition from EOG signals and indicate that EOG is robust enough for HCI applications.
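The string-matching stage can be sketched in a few lines. The symbol alphabet (L, R, U, D plus diagonals 1, 3, 7, 9) follows the paper's encoding; the template set and gesture-to-command mapping below are hypothetical, and the exact-match rule is a simplification of the authors' method.

```python
# Hypothetical gesture templates: each key is a concatenated saccade
# symbol sequence, each value a command it should trigger.
GESTURE_TEMPLATES = {
    "RLRL": "wake_up",
    "DRUL": "select",
    "URDL": "cancel",
}

def recognise(saccade_symbols):
    """Match an encoded saccade sequence against known templates;
    returns the mapped command, or None if nothing matches."""
    sequence = "".join(saccade_symbols)
    return GESTURE_TEMPLATES.get(sequence)

print(recognise(["D", "R", "U", "L"]))  # -> 'select'
```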

Wearable EOG Goggles is a state-of-the-art, highly portable system. Eye gestures are recognised effectively, and combined with its self-contained setup this makes it highly suitable for mobile settings. The authors did not present a comparative evaluation against other such systems, which would have indicated the prototype's superiority more clearly. The system was developed with clear design and system goals, which are reflected in its performance.

Table 1: Comparison of the wearable systems (Aided Eyes, Full-time Wearable Headphone, Wireless Head Cap, Wearable EOG Goggles) by research direction, technique, new concept or improvement of previous work, theoretical proof, and comparison.

3.2 Handheld Mobile device based systems

Handheld mobile devices, with their increasing computing ability and touch screen technology, have revolutionised the way we communicate and interact. Today's mobile devices also integrate advanced microphones, multiple cameras, an accelerometer and a gyroscope into a single device, making them well suited to the addition of new interaction modalities.

Issues faced when integrating eye tracking applications into mobile devices are: 1) self-sustainability - the device should not depend on bulky external hardware for eye tracking; 2) responsiveness - the software's reaction time should be fast; 3) a high-resolution camera with sophisticated image stabilisation, for accuracy; and 4) performance - the eye tracking application should not consume all of the battery and computational power. Mobile devices are going through a revolution: developing an eye tracking system on them today still requires external hardware, but given the technology's growth rate, the day is not far off when full-fledged eye tracking and gaze recognition systems will be available on them. Several ongoing research efforts focus on overcoming these limitations through innovative hardware and software. This section reviews recent articles on handheld mobile eye tracking systems.

3.2.1 Review of Eye-gaze interaction for mobile phones [8]

Heiko Drewes and his colleagues provide an analysis of two eye-gaze interaction techniques for mobile phones. The prototype combines an existing commercial eye tracker with a mobile phone (Nokia N80). Results from a user study show that users liked the new form of interaction. The main motivation for developing the prototype was to evaluate the suitability of the classical dwell-time technique for command activation and of the newer gaze gesture technique.

The hardware limitations of previous-generation mobile phones, which could not support new complex techniques, compelled the authors to look for a realistic alternative. Their experimental setup combines an existing commercial eye tracker (ERICA) with a mobile phone connected via wireless LAN. To keep the phone stable, it is attached to the ERICA screen with a wooden apparatus, with the eye tracking camera mounted below the screen; a chin rest in front of the screen stabilises the user's head. Gaze inputs are first detected by the eye tracker as tracker-screen coordinates, which custom software translates to mobile-screen coordinates and transfers to the phone, where a Java-based application interprets the data. This complex setup requires two-step calibration: the user is first calibrated to the eye tracker's screen using ERICA's standard mechanism, then to the mobile phone screen.

The eye gesture recognition algorithms are adapted from the popular mouse gesture plug-in for the Firefox browser and from EdgeWrite. The mouse gesture recogniser outputs a character representing the direction of movement in the x-y plane: the basic left, right, up and down movements are encoded as the symbols L, R, U and D respectively, and diagonal eye movements as 1, 3, 7 and 9. The EdgeWrite algorithm identifies the order in which the four corners of a square are reached by an input device. Both algorithms were modified for continuous recognition rather than waiting for a click event, and a timeout was added to eliminate false recognition of natural eye movements (a sketch of the direction encoding follows).
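A plausible form of the direction-encoding step is quantising each gaze displacement (dx, dy) into one of eight symbols. The sector boundaries, minimum stroke length and function name below are assumptions; only the symbol alphabet comes from the paper.

```python
import math

# Eight direction symbols, counter-clockwise from the +x axis:
# R (right), 9 (up-right), U (up), 7 (up-left), L (left),
# 1 (down-left), D (down), 3 (down-right).
SYMBOLS = ["R", "9", "U", "7", "L", "1", "D", "3"]

def direction_symbol(dx, dy, min_len=10.0):
    """Return a direction symbol for a gaze stroke, or None if the
    movement is too small to count as deliberate (which helps avoid
    natural eye jitter triggering gestures)."""
    if math.hypot(dx, dy) < min_len:
        return None
    angle = math.atan2(dy, dx)                 # y axis pointing up
    sector = round(angle / (math.pi / 4)) % 8  # nearest 45-degree sector
    return SYMBOLS[sector]

print(direction_symbol(30, 28))  # diagonal up-right -> '9'
```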

To identify the best-suited eye input technique, a two-phase evaluation was performed on the prototype. The first phase required users to perform two mobile phone tasks and answer a few questions; the second involved analysis of a user gaze dataset. The first-phase tasks had users interact with a phonebook application driven by dwell-time commands, executed according to the duration of gaze at particular screen regions: a gaze toward the top or bottom of the screen invoked a scroll command, and a gaze at a name in the centre of the screen invoked a call command (a dwell-time sketch follows). The user study was conducted with eight participants aged between 23 and 50. First-phase results indicate that subjects performed both tasks easily; two subjects preferred gestures while a majority of five preferred dwell time. In the second phase, the authors analysed 37 minutes of recorded data and found that only one gesture was falsely identified as a command while the user was not performing a gesture, a positive outcome for the system and its gesture commands.
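The dwell-time activation logic can be sketched as follows. The region boundaries, the dwell threshold and the class and function names are illustrative assumptions; the paper does not specify these values.

```python
import time

DWELL_SECONDS = 0.8   # illustrative dwell threshold

def region_of(y, screen_height):
    """Map a vertical gaze coordinate to a screen region/command."""
    if y < 0.2 * screen_height:
        return "scroll_up"
    if y > 0.8 * screen_height:
        return "scroll_down"
    return "call_selected"

class DwellDetector:
    def __init__(self):
        self.region, self.since = None, None

    def update(self, y, screen_height, now=None):
        """Feed one gaze sample; returns a command once the gaze has
        dwelt in the same region for DWELL_SECONDS, else None."""
        now = time.monotonic() if now is None else now
        region = region_of(y, screen_height)
        if region != self.region:
            self.region, self.since = region, now   # new region: restart timer
            return None
        if now - self.since >= DWELL_SECONDS:
            self.since = now                        # re-arm so the command can repeat
            return region
        return None
```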

Eye-gaze interaction for mobile phones is a very interesting article that explored the suitability of different eye-gaze interaction techniques on mobile phones despite hardware limitations, and its results showed users were comfortable with the new techniques. However, the setup returned the mobile device to a stationary position and used a chin rest to stabilise user movements, two major drawbacks.

3.2.2 Review of EyePhone: Activating Mobile Phones with Your Eyes [13]

Emiliano Miluzzo and his colleagues propose EyePhone, a system that tracks users' eye movements using the front-mounted camera of a mobile phone. Eye tracking and gaze position detection are achieved with lightweight machine learning algorithms, and eye blinks serve as input to trigger the target application. The system is built on a Nokia N810 tablet using only its built-in hardware; the aim was to provide eye tracking on a mobile phone with minimal use of external devices and battery power.

The eye tracking and blink detection approach continues similar work designed for a desktop machine with a USB camera [25], with the original algorithm modified to suit a mobile device with limited resources. The algorithm has four operational phases: 1) eye detection, 2) open-eye template creation, 3) eye tracking and 4) blink detection. Eyes are detected by motion analysis on consecutive image frames, and gaze position is obtained by eye contour recognition. An eye template is created the first time a user accesses the application; it is saved in persistent memory and fetched when the application is invoked. Eye tracking uses a template matching function, calculating a correlation score between the template and a search window; the search window looks for the right eye at nine equally divided positions on the phone's display, and a correlation coefficient of 0.4 indicates the eye at a particular location in the window. Blink detection thresholds the normalised correlation coefficient from template matching. Because the mobile camera captures lower-quality images, a blink can be falsely detected when the iris turns toward the corner of the eye even though the eye is open, so the authors mathematically derive four thresholds to distinguish eye movements from blinks.
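A minimal sketch of the template-matching stage, using OpenCV's normalised correlation, follows. The 0.4 presence threshold comes from the paper; the single blink threshold and the function names are assumptions (the authors actually derive four thresholds to separate movements from blinks).

```python
import cv2

EYE_PRESENT_THRESHOLD = 0.4   # from the paper
BLINK_THRESHOLD = 0.15        # assumption, not from the paper

def match_eye(frame_gray, template_gray):
    """Return (score, top_left) of the best open-eye template match
    within the frame, using normalised cross-correlation."""
    result = cv2.matchTemplate(frame_gray, template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_val, max_loc

def classify(score):
    """Read the correlation score: a high score means the open eye is
    present, a very low score is read as a blink (eye closed)."""
    if score >= EYE_PRESENT_THRESHOLD:
        return "open_eye"
    if score <= BLINK_THRESHOLD:
        return "blink"
    return "uncertain"
```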

The system was evaluated under different lighting conditions. Results showed 84% accuracy in daylight with a stationary subject; a drop in accuracy is anticipated if the subject is walking, and accuracy was lower under artificial lighting than in daylight. The results also show a drop in accuracy when the distance between the eye and the tablet is increased by 18-12cm. EyePhone is also shown to be lightweight in CPU terms, and it consumed 40% of the battery over 3 hours of use, which is acceptable. The authors also developed an application, EyeMenu, to test usability: an application is selected when the phone detects a gaze on one of the menu buttons, followed by a blink.

EyePhone is a very good attempt at using only the mobile phone's own resources for eye tracking. The authors made full use of the device's hardware and software capabilities, but the system could still be optimised on several fronts.

3.2.3 Review of MobiGaze: Development of a Gaze Interface for Handheld Mobile Devices [26]

Takashi Nagamatsu and his colleagues introduce an innovative prototype called MobiGaze, implemented on a touch screen mobile device. Operable by both gaze and touch, it is a unique mobile interface. The system is built from an iPod touch, two cameras with IR-LEDs and a notebook PC; gaze detection uses the two stereo cameras to detect the user's line of sight in 3D. The prototype addresses issues with recent touch screen mobile devices, such as single-handed operability and the thumb blocking display areas.

Each camera is equipped with a 1/3'' CMOS image sensor with a resolution of 752 × 480 pixels, a 25 mm lens and an IR filter near the nodal point. The cameras connect to the mobile device (iPod touch 3.1, Apple Inc.) via IEEE 1394. Since the mobile device lacks the computational power to process eye images, it connects to a Windows-based notebook PC via wireless LAN. The point of gaze is estimated mathematically using a stereo gaze tracking method [38], which identifies the centre of the cornea and the vector along the line of sight. Users are calibrated to the system with a one-point calibration method [37], which calculates the offset between the optical axis (the imaginary line through the geometric centre of the eye) and the visual axis (the imaginary line from the eye to the point of gaze); a sketch of this idea follows.
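The intuition behind one-point calibration can be sketched simply: the tracker measures the optical axis, the user looks along the visual axis, and one fixation on a known target estimates the fixed per-user offset between them. Representing the offset as an angular difference in two axes is an illustrative simplification of the cited 3D method; the function names are assumptions.

```python
import numpy as np

def estimate_offset(measured_optical_angles, target_angles):
    """Both arguments are (horizontal, vertical) angles in degrees,
    captured while the user fixates the single calibration point;
    the difference is the optical-to-visual axis offset."""
    return np.asarray(target_angles) - np.asarray(measured_optical_angles)

def corrected_gaze(measured_optical_angles, offset):
    """Apply the calibration offset to a later measurement."""
    return np.asarray(measured_optical_angles) + offset

offset = estimate_offset((2.1, -0.8), (0.0, 0.0))   # calibration fixation
print(corrected_gaze((10.0, 5.0), offset))          # later measurement
```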

The interface uses the eyes as a cursor, and the pointed-at object is selected only when the user touches the screen (anywhere); this approach avoids the "Midas touch" problem and enables easy single-handed use of touch screens. The authors demonstrate usability with a map browser application in which users zoom in by gazing at an area of the map and touching the screen. They also extended the prototype to dual screens by integrating an additional iPod touch adjacent to the first: the top screen serves as the gaze interface and the bottom one for touch, with the displays synchronised so that users can gaze at an option and select it by tapping the bottom screen. Usability of the extended prototype was tested with the map browser and with a new browser application in which users select a category on the top screen to be displayed on the bottom screen, much like a button-press simulation. Experimental results indicate gaze tracking accuracy of 0.5-1.0°.

MobiGaze is a very innovative prototype combining eye tracking technology with emerging touch screen mobile devices, and it achieves good accuracy. The authors have not performed formal usability testing, which could give more insight into how different people react to such a multimodal interface and which features could be improved. The bulky gear holding the dual cameras is also a drawback, and the variation of gaze detection accuracy with increasing eye-to-camera distance is yet to be evaluated. Increasing accuracy is clearly on the makers' agenda, and they are also working to make the system self-contained. An improved, portable MobiGaze could be a revolution in the mobile device market.

Table 2: Comparison of the handheld mobile device based systems (Eye-gaze interaction for mobile phones, EyePhone, MobiGaze) by research direction, technique, new concept or improvement of previous work, theoretical proof, and comparison.

Chapter 4

Future trends

The future of mobile eye based human computer interaction looks fascinating. It has great potential to provide easy access and remote control features for everyday applications in the very near future, and the technology may revolutionise the way we interact with electronic devices.

Future research in this area needs to address standardising eye tracking metrics and gestures: users should not be required to learn new gestures every time they switch to a new device. Another significant research problem is tracking eyes in complex scenarios involving multiple users coordinating and interacting. New eye tracking applications need to detect behavioural signals, identify emotional state and attempt to predict behavioural intentions. Error recovery and the security aspects of eye based interfaces should be addressed in parallel with development. The primary challenge in eye based human computer interaction is facilitating the transition from traditional interaction techniques to new, natural ones in a form that is easy to understand.

The race for a ubiquitous mobile device has just begun. Manufacturers will compete for market share with ever more innovative products offering multimodal interaction. Product designers should focus on choosing the right combination of modalities for a particular job, rather than forcing modalities onto unsuitable jobs.

Eyes are also a source of biometric identification, which eye tracking applications can leverage to provide customised interfaces, access to digital media, physical access to places and objects, and futuristic shopping facilities; the possibilities are endless.

Chapter 5

Conclusion

This paper has focused on exploring mobile eye based human computer interaction techniques, identifying their strengths and weaknesses, and proposing possible enhancements and future trends.

From the literature survey of eye based human computer interaction techniques, it can be concluded that these techniques offer new and efficient means of communicating information between computers and users, and that they can enhance immersion and make the entire process of human computer interaction more natural. The combined strengths of multiple modalities, such as touch plus eye gestures, were also evident. However, weaknesses were present as well: some interfaces were obstructive or bulky, eye blink detection was not always accurate, recalibration added overhead, interpretability was limited, and privacy and security aspects are yet to be implemented.

Some of the reviewed prototypes lacked modules important to a successful eye tracking system. The best of the systems was Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments [6], which defined very clearly the components needed to build the application and its interaction techniques. Its design is lightweight, non-intrusive and fulfils the application's demands. This conclusion is based on the system's usability and accuracy; further investigation and comparison against other parameters and prototypes could alter it.
