1 Introduction
1.1 Motivation
1.2 Eye Gaze and Human Communication
1.3 Eye Gaze for Computer Input
1.4 Methods and Approach
1.5 Thesis Outline
1.6 Contributions
2 Overview and Related Work
2.1 Definition of Eye Tracking
2.2 History of Eye Tracking
2.3 Application Domains for Eye Tracking
2.4 Technological Basics of Eye Tracking
2.4.1 Methods of Eye Tracking
2.4.2 Video-Based Eye Tracking
2.4.3 The Corneal Reflection Method
2.5 Available Video-Based Eye Tracker Systems
2.5.1 Types of Video-Based Eye Trackers
2.5.2 Low-Cost Open Source Eye Trackers for HCI
2.5.3 Commercial Eye Trackers for HCI
2.5.4 Criteria for the Quality of an Eye Tracker
2.6 The ERICA Eye Tracker
2.6.1 Specifications
2.6.2 Geometry of the Experimental Setup
2.6.3 The ERICA-API
2.7 Related Work on Interaction by Gaze
2.8 Current Challenges
3 The Eye and its Movements
3.1 Anatomy and Movements of the Eye
3.2 Accuracy, Calibration and Anatomy
3.3 Statistics on Saccades and Fixations
3.3.1 The Data
3.3.2 Saccade Lengths
3.3.3 Saccade Speed
3.3.4 Fixation Times
3.3.5 Summary of Statistics
3.4 Speed and Accuracy
3.4.1 Eye Speed Models
3.4.2 Fitts’ Law
3.4.3 The Debate on Fitts’ Law for Eye Movements
3.4.4 The Screen Key Experiment
3.4.5 The Circle Experiment
3.4.6 Ballistic or Feedback-Controlled Saccades
3.4.7 Conclusions on Fitts’ Law for the Eyes
4 Eye Gaze as Pointing Device
4.1 Overview on Pointing Devices
4.1.1 Properties of Pointing Devices
4.1.2 Problems with Traditional Pointing Devices
4.1.3 Problems with Eye Gaze as Pointing Device
4.2 Related Work for Eye Gaze as Pointing Device
4.3 MAGIC Pointing with a Touch-Sensitive Mouse Device
4.4 User Studies with the Touch-Sensitive Mouse Device
4.4.1 First User Study – Testing the Concept
4.4.2 Second User Study – Learning and Hand-Eye Coordination
4.4.3 Third User Study – Raw and Fine Positioning
4.5 A Deeper Understanding of MAGIC Touch
4.6 Summary on the Results for Eye Gaze as Pointing Device
4.7 Conclusions on Eye Gaze as Pointing Device
5 Gaze Gestures
5.1 Related Work on Gaze Gestures
5.2 The Concept of Gaze Gestures
5.2.1 The Firefox/Opera Mouse Gestures
5.2.2 The EdgeWrite Gestures
5.3 The Gaze Gesture Recognition Algorithm
5.4 User Studies and Experiments with Gaze Gestures
5.4.1 First User Study – Testing the Concept
5.4.2 Experiments – When to Use Gaze Gestures
5.4.3 Second User Study – Optimizing the Parameters
5.4.4 Mobile Phone User Study
5.4.5 PIN-Entry User Study
5.5 The Gaze Gesture Alphabet
5.6 Separation of Gaze Gestures from Natural Eye Movements
5.7 Summary of the Results for Gaze Gestures
5.8 Conclusions for Gaze Gestures
6 Eye Gaze as Context Information
6.1 Eye Gaze and Context Awareness
6.2 Related Work for Eye Gaze as Context Information
6.3 A Usability Tool to Record Eye Movements
6.3.1 Explanation of UsaProxy
6.3.2 Extending UsaProxy to Record Eye Movements
6.3.3 Discussion of UsaProxy’s Extension
6.4 Reading Detection
6.4.1 Analysis of the Gaze Path while Reading
6.4.2 An Algorithm for Reading Detection
6.4.3 User Study for Reading Detection
6.4.4 Values for the Quality of Reading
6.5 Gaze-Awareness in E-Learning Environments
6.6 Summary of the Results for Eye Gaze as Context Information
6.7 Conclusions on Eye Gaze as Context Information
7 Conclusions
7.1 Summary of the Results
7.2 Conclusions for Eye Gaze User Interfaces
7.3 How to go on with Eye-Tracking Research for Interaction
7.4 The Future of Gaze-Aware Systems
References
Web References
Acknowledgements
Eye Gaze Tracking for Human Computer Interaction
Heiko Drewes
Dissertation at the LFE Medien-Informatik, Ludwig-Maximilians-Universität München
Munich, 2010
First reviewer: Professor Dr. Heinrich Hußmann
Second reviewer: Professor Dr. Albrecht Schmidt
External reviewer: Professor Dr. Alan Dix
Date of the oral examination: 18.3.2010
Abstract

With a growing number of computer devices around us, and the increasing time we spend interacting with such devices, there is strong interest in finding new interaction methods which ease the use of computers or increase interaction efficiency. Eye tracking seems to be a promising technology to achieve this goal. This thesis investigates interaction methods based on eye-tracking technology. After a discussion of the limitations of the eyes regarding accuracy and speed, including a general discussion of Fitts' law, the thesis follows three different approaches to utilizing eye tracking for computer input. The first approach investigates eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people's ability to perform gestures with the eyes for computer input and the separation of gaze gestures from natural eye movements. The third approach deals with the information inherent in the movement of the eyes and its application to assist the user. The thesis presents a usability tool for recording interaction and gaze activity, and describes algorithms for reading detection. All approaches present results based on user studies conducted with prototypes developed for the purpose.
1 Introduction

With the invention of the computer in the middle of the last century came the need for a user interface. In the beginning, experts interfaced with the computer via teletype. Due to the tremendous progress in computer technology over the last decades, the capabilities of computers have increased enormously, and working with a computer has become a normal activity for nearly everybody. With all the possibilities a computer can offer, humans and their interaction with computers are now a limiting factor. This gave rise to a great deal of research in the field of HCI (human-computer interaction) aiming to make interaction easier, more intuitive, and more efficient.

Interaction with computers is no longer limited to keyboards and printers. Different kinds of pointing devices, touch-sensitive surfaces, high-resolution displays, microphones, and speakers are standard devices for computer interaction nowadays. There are also new modalities for computer interaction, such as speech interaction and input by gestures or by tangible objects with sensors. A further input modality is eye gaze, which currently finds its application in accessibility systems. Such systems typically use eye gaze as the sole input, but outside the field of accessibility, eye gaze can be combined with any other input modality. Eye gaze could therefore serve as an interaction method beyond the field of accessibility. The aim of this work is to find new forms of interaction that utilize eye gaze and are suitable for standard users.

1.1 Motivation

Most eye-tracking systems today rely on video-based pupil detection combined with the reflection of an infrared LED. Video cameras have become cheap in recent years, and the price of an LED is negligible. Many computer devices, such as mobile phones, laptops, and displays, already come with built-in cameras.
Processor power still increases steadily, and standard processors are powerful enough to process the video stream necessary for eye tracking, at least on desktop and laptop computers. Head-tracking systems, which are necessary to give users the freedom to move in front of their displays, are also video-based; such systems can be implemented unobtrusively with a second camera. If produced for the mass market, a future standard eye tracker should not cost much more than an optical mouse or a webcam does today. Some people interact with the computer all day long, both for work and in their leisure time. As most interaction is done with keyboard and mouse, both operated by the hands, some people suffer from overstressing particular parts of their hands, typically resulting in carpal tunnel syndrome. With the vision of ubiquitous computing, the amount of interaction with computers will increase further, and we need interaction techniques that do not cause physical problems. The eyes are a good candidate because they move anyway when we interact with computers. Using the information lying in the eye movements could save some interaction, in particular hand-based interaction.
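To illustrate why standard processors suffice for this task, consider the simplest building block of video-based pupil detection: under infrared illumination the pupil appears as a dark blob, whose center can be estimated by thresholding and taking a centroid. The sketch below is only a minimal illustration of this principle, not the algorithm of any particular tracker such as ERICA; real systems additionally track the corneal reflection (glint) and map the pupil–glint vector to screen coordinates via calibration. The threshold value and the synthetic test frame are assumptions chosen for demonstration.

```python
import numpy as np

def pupil_center(frame, threshold=40):
    """Estimate the pupil center in a grayscale frame.

    Dark-pupil principle: under infrared illumination the pupil is
    the darkest region, so threshold the image and return the
    centroid (x, y) of the dark pixels, or None if none are found.
    """
    mask = frame < threshold
    if not mask.any():
        return None  # no pupil candidate in this frame
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

# Synthetic 100x100 frame: bright background with a dark circular
# "pupil" of radius 8 centered at x=60, y=40.
frame = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(xx - 60) ** 2 + (yy - 40) ** 2 < 8 ** 2] = 10

cx, cy = pupil_center(frame)
print(round(cx), round(cy))  # → 60 40
```

Even this per-frame work is a single pass over the image, which is why commodity desktop processors can keep up with a video stream at camera frame rates.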