• eyetracking@ethz.ch

Category Archives: Gaze-based interaction

  • -

ETAVI – 2nd Call for Papers

A quick reminder for the “1st International Workshop on Eye-Tracking in Aviation (ETAVI)”, which will take place in March 2020 in Toulouse, France.

The submission deadlines are:

  • Abstracts: 9th September 2019
  • Papers: 30th September 2019

Feel free to also forward the Call for Papers to any interested colleagues.

We look forward to seeing you there!


  • -

Gaze-based interactions in the cockpit of the future: a survey

An article titled “Gaze-based interactions in the cockpit of the future: a survey” will appear in an upcoming issue of the Journal on Multimodal User Interfaces. It is now available online:

Abstract Flying an aircraft is a mentally demanding task where pilots must process a vast amount of visual, auditory and vestibular information. They have to control the aircraft by pulling, pushing and turning different knobs and levers, while knowing that mistakes in doing so can have fatal outcomes. Therefore, attempts to improve and optimize these interactions should not increase pilots’ mental workload. By utilizing pilots’ visual attention, gaze-based interactions provide an unobtrusive solution to this. This research is the first to actively involve pilots in the exploration of gaze-based interactions in the cockpit. By distributing a survey among 20 active commercial aviation pilots working for an internationally operating airline, the paper investigates pilots’ perception and needs concerning gaze-based interactions. The results build the foundation for future research, because they not only reflect pilots’ attitudes towards this novel technology, but also provide an overview of situations in which pilots need gaze-based interactions.


  • -

PEGGASUS in the news

Our research project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has attracted quite a bit of coverage in the media.

See for yourself in this little press review:


  • -

1st International Workshop on Eye Tracking in Aviation: Call for Papers

We’re glad to announce the 1st International Workshop on Eye Tracking in Aviation (ETAVI), which will take place on March 17, 2020, in Toulouse, France.

The workshop aims to bring together researchers and practitioners who share an interest in using eye tracking in the aviation domain, including, but not limited to, the cockpit and air traffic control/management.

The keynote will be given by Leonardo Di Stasi, an assistant professor at the University of Granada (Spain) with an extensive research background in aviation and eye tracking.

The Call for Papers is now available (Paper submission deadline: September 30, 2019; Abstract submission deadline: September 9, 2019).


  • -

ET4S and ETRA 2019: Impressions

Our group organized the “Eye Tracking for Spatial Research” event as a track at this year’s ETRA conference in Denver, Colorado. It featured four full paper presentations, one short paper presentation, and an invited talk (see program). A dominant topic at this year’s ET4S was augmented/mixed/virtual reality. As a particular highlight, our invited speaker Sophie Stellmach (Senior Scientist at Microsoft) showcased the fascinating opportunities of HoloLens 2, an upcoming mixed reality device that will include eye tracking capabilities.

The GeoGazeLab was further involved with Fabian’s talk on “POI-Track: Improving Map-Based Planning with Implicit POI Tracking” and Kuno presenting his work on “Space-Time Volume Visualization of Gaze and Stimulus” in the ETRA main program. A paper co-authored by Martin was presented by one of his co-authors (“Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges”).

 

The invited talk by Sophie Stellmach (Microsoft) …

… attracted a large audience.

Testing HoloLens 2 after ET4S.


  • -

Meet us at CHI 2019

We’ll present one full paper and two workshop position papers at CHI in Glasgow this year:

Workshop: Designing for Outdoor Play (4th May, Saturday – 08:00 – 14:00, Room: Alsh 1)

Kiefer, P. (2019) Gaze-guided narratives for location-based games. In CHI 2019 Workshop on “Designing for Outdoor Play”, Glasgow, U.K. DOI: https://doi.org/10.3929/ethz-b-000337913

Workshop: Challenges Using Head-Mounted Displays in Shared and Social Spaces (5th May, Sunday – 08:00 – 14:00, Room: Alsh 2)

Göbel, F., Kwok, T.C.K., and Rudi, D. (2019) Look There! Be Social and Share. In CHI 2019 Workshop on “Challenges Using Head-Mounted Displays in Shared and Social Spaces”, Glasgow, U.K. DOI: https://doi.org/10.3929/ethz-b-000331280

Paper Session: Audio Experiences (8th May, Wednesday – 14:00 – 15:20, Room: Alsh 1)

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4-9, Glasgow, U.K. [PDF]

We are looking forward to seeing you in Glasgow!
This research is part of the LAMETTA and IGAMaps projects.


  • -

FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps

An article titled “FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps” will appear in an upcoming issue of the Geoinformatica journal. It is now available online:

FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps

Abstract Map reading is a visual task that can strongly vary between individuals and maps of different characteristics. Aspects such as where, when, how long, and in which sequence information on a map is looked at can reveal valuable insights for both the map design process and a better understanding of the cognitive processes of the map user. Contrary to static maps, for which many eye tracking studies are reported in the literature, established methods for tracking and analyzing visual attention on interactive maps are still missing. In this paper, we present a framework called FeaturEyeTrack that automatically logs the cartographic features that have been inspected, as well as the mouse input, during the interaction with digital interactive maps. In particular, the novelty of FeaturEyeTrack lies in the matching of gaze with the vector model of the current map visualization, thereby enabling a very detailed analysis without the requirement for manual annotation. Furthermore, we demonstrate the benefits of this approach in terms of manual work, level of detail, and validity compared to state-of-the-art methods through a case study on an interactive cartographic web map.
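The core idea — matching each gaze sample against the vector model of the current map view — can be illustrated with a minimal sketch. Everything below (the feature structure, the names, and the simple ray-casting point-in-polygon matcher) is an illustrative assumption, not the authors’ implementation, which works on vector tiles and also logs mouse input:

```python
# Hypothetical sketch: match a gaze sample (screen coordinates) against the
# polygons of the currently rendered map features. Feature names and the
# ray-casting matcher are illustrative assumptions, not FeaturEyeTrack's code.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon given as (px, py) pairs?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge crosses the horizontal line through y?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def match_gaze_to_features(gaze_x, gaze_y, features):
    """Return the names of all vector features containing the gaze point."""
    return [f["name"] for f in features
            if point_in_polygon(gaze_x, gaze_y, f["polygon"])]

# Toy map: two overlapping rectangular features in screen coordinates.
features = [
    {"name": "lake", "polygon": [(0, 0), (100, 0), (100, 50), (0, 50)]},
    {"name": "park", "polygon": [(50, 25), (150, 25), (150, 100), (50, 100)]},
]
print(match_gaze_to_features(60, 30, features))  # -> ['lake', 'park']
```

Logging such matches per gaze sample, together with the interaction state, is what removes the need for manual annotation in the analysis.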


  • -

ETIZ Meeting March 2019

We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!

Date, time: 26th March 2019, 17:30
Place: ETH Zurich Hönggerberg, HIL D 53

 

17:30 – 17:35
Welcome

17:35 – 18:15
“Recent Advances Towards Pervasive Eye Tracking”
Prof. Dr. Andreas Bulling, Professor for Human-Computer Interaction and Cognitive Systems
University of Stuttgart, Germany

18:15 – 18:35
“Gaze-Guided Narratives”
Tiffany C.K. Kwok, Doctoral Student
Geoinformation Engineering, ETH Zurich

18:35 – 18:55
“Eye tracking in VR and AR displays: A mini-workshop”
Dr. Arzu Çöltekin, Assoc. Prof., Principal Investigator
Institute for Interactive Technologies IIT, University of Applied Sciences and Arts Northwestern Switzerland FHNW

18:55 – 19:00
Closing

19:00
Apéro, with demo of a gaze-adaptive interactive map by Fabian Göbel, Geoinformation Engineering


  • -

New aviation project: PEGGASUS

PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions)

We’re glad to announce the start of a new aviation project at the GeoGazeLab.

Check out our vision for pilot interactions in the cockpit of the future at the project page.


  • -

Full Paper accepted at CHI 2019

Our paper “Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments” has been accepted by ACM CHI 2019 as a full paper:

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), ACM (accepted)

This paper proposes Gaze-Guided Narratives as an implicit gaze-based interaction concept for guiding tourists through the hidden stories of a city panorama. It reports on the implementation and evaluation of this concept which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Victor R. Schinazi (Chair of Cognitive Science, ETH Zurich) and Benjamin Adams (Department of Geography, University of Canterbury).

Abstract. Exploring a city panorama from a vantage point is a popular tourist activity. Typical audio guides that support this activity are limited by their lack of responsiveness to user behavior and by the difficulty of matching audio descriptions to the panorama. These limitations can inhibit the acquisition of information and negatively affect user experience. This paper proposes Gaze-Guided Narratives as a novel interaction concept that helps tourists find specific features in the panorama (gaze guidance) while adapting the audio content to what has been previously looked at (content adaptation). Results from a controlled study in a virtual environment (n=60) revealed that a system featuring both gaze guidance and content adaptation obtained better user experience, lower cognitive load, and led to better performance in a mapping task compared to a classic audio guide. A second study with tourists situated at a vantage point (n=16) further demonstrated the feasibility of this approach in the real world.


  • -

Short Paper and Workshop Paper accepted at GIScience 2018

We are happily announcing that two of our papers have been accepted at GIScience 2018 and the Workshop on Spatial big data and machine learning in GIScience:

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos and Martin Raubal. 2018. Gaze Sequences and Map Task Complexity. GIScience 2018, Melbourne, Australia.

Fabian Göbel, Henry Martin. 2018. Unsupervised Clustering of Eye Tracking Data. Spatial big data and machine learning in GIScience, Workshop at GIScience 2018, Melbourne, Australia.

Both works are part of the IGAMaps project.


  • -

LAMETTA at GeoSummit: Visit by Federal Councillor Guy Parmelin

We have presented the LAMETTA project at the GeoSummit in Bern (6-7 June 2018), the largest congress for geoinformatics, surveying and planning in Switzerland.

Federal Councillor Guy Parmelin was one of the first visitors to our exhibit and was very interested in the innovative system. As he had to give the opening speech shortly afterwards, there was no time for him to try out the gaze-based tourist guide to Lake Lucerne himself, but even the short visit seemed to leave an impression.

A large number of visitors from both industry and academia visited our exhibit and tried out the system. In addition, our exhibit was part of the GeoSchoolDay, an event held in conjunction with GeoSummit that introduces high-school students to the applications and opportunities of geoinformation technologies. Approximately 500 pupils visited LAMETTA and learned about eye tracking and its application in interactive systems.


  • -

Papers accepted at ETRA and ETVIS

We are happy to announce that two of our papers have been accepted at ETRA and ETVIS.

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos, Andrew T. Duchowski, and Martin Raubal. 2018. Improving Map Reading with Gaze-Adaptive Legends. In ETRA ’18: 2018 Symposium on Eye Tracking Research & Applications

David Rudi, Peter Kiefer, and Martin Raubal. 2018. Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization

These papers are part of the IGAMaps and Awareness in Aviation projects.

Peter Kiefer has further been involved in ETRA as an Area Chair.


  • -

Science City March 2018 – Impressions

The LAMETTA project was demoed at this year’s “Treffpunkt Science City” event, an educational program of ETH Zurich for the general public that attracted more than 3,000 visitors.

Our panorama wall installation and the LAMETTA software let visitors experience exploring the view from a vantage point. Just by looking at an area of interest (such as lakes, mountains, and villages), users received related information from our system.


  • -

Meeting point Science City – March 2018

We’re excited to demonstrate the LAMETTA project at ETH Treffpunkt Science City, the educational program of ETH Zurich for the general public. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it using only your gaze (more details)!

You can find us Sunday, 25 March in ETH Hönggerberg HCI, Room E2.


  • -

Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018) Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors

This research is part of the LAMETTA project.


  • -

Tiffany Kwok joins the team

We welcome Tiffany as a new PhD student in the LAMETTA project!

[Current Team]


  • -

Full Paper accepted at CHI 2018

Andrew T. Duchowski, Krzysztof Krejtz, Izabela Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal (2018). The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2018), ACM (accepted)


  • -

3rd ET4S Workshop

Many thanks to all participating in this workshop and making it an inspiring event.

We have published the workshop proceedings through the ETH Research Collection.


  • -

Student Project Finished: A Public Gaze-Controlled Campus Map

Four Geomatics Master students (Nikolaos Bakogiannis, Katharina Henggeler, Roswita Tschümperlin, and Yang Xu) have developed a public gaze-controlled campus map as part of an interdisciplinary project this autumn semester.

The system prototype was tested in a one-week field study at the Campus Info Point at ETH Hönggerberg with 50 campus visitors.

The results of the project will be presented at a public event on Thursday, 14 December 2017, between 17:00 and 18:00 at HIL D 53. During the apéro afterwards, you are welcome to try the system yourself.

We’d like to thank the visitor and information management of ETH Zurich Services (in particular Stephanie Braunwalder) for supporting this project.


  • -

3rd ET4S Workshop: Keynote by Roman Bednarik

We are glad to announce that the ET4S workshop 2018 will be opened with a keynote given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland:

 

Predicting user states from gaze and other multimodal data

Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.

 


  • -

Scientifica 2017 – Impressions

We demoed the LAMETTA project at this year’s Scientifica, the Zurich science exhibition of ETH Zurich and the University of Zurich, which attracted more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.


  • -

GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!

You can find us Friday, 1 September to Sunday, 3 September at University of Zurich main building (West Foyer).

Check out our Scientifica video!


  • -

Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.


  • -

Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science, 2017. (accepted) [PDF]

The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations of up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
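The geometric core of this mapping step can be sketched in a few lines: once the computer vision pipeline has estimated a transformation between the scene-camera frame and the known reference view (for a distant panorama, a 3×3 homography is a reasonable model), the Point of Regard maps into the reference system with a single projective transform. The function and matrix below are illustrative assumptions, not the authors’ code:

```python
# Hypothetical sketch: map a Point of Regard from the eye tracker's scene
# camera into a known reference view of the panorama via a homography H.
# H is assumed to come from a computer vision matching step (not shown here).

def map_point_of_regard(H, x, y):
    """Apply a 3x3 homography (nested lists) to the pixel (x, y)."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Convert from homogeneous to Cartesian reference coordinates.
    return u / w, v / w

# Toy homography: pure translation by (100, 50) from camera to reference view.
H = [[1, 0, 100],
     [0, 1, 50],
     [0, 0, 1]]
print(map_point_of_regard(H, 320, 240))  # -> (420.0, 290.0)
```

In the reference view, the mapped point can then be tested against the pre-annotated objects of interest (buildings, lakes, mountains) to decide which audio information to play.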


  • -

Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)


  • -

Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted in response to an open Call for Submissions, of which seven were accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].


  • -

Controllability matters: The user experience of adaptive maps

An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the Geoinformatica journal. It is now available online:

Controllability matters: The user experience of adaptive maps

Abstract Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment, we study two adaptive map interfaces differing in their degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to yield a better user experience and lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should give their users control over when an adaptation is performed.


  • -

Article in Horizonte

The latest issue of Horizonte, the magazine published by the Swiss National Science Foundation, reports on our research.

Source: Horizonte 111, December 2016

German (PDF)

English (PDF)


  • -

INNOLEC Lecture

Martin Raubal was invited to give the INNOLEC Lecture at the Department of Geography of Masaryk University in Brno, Czech Republic.

The title of his talk was: Gaze-based assistance for wayfinders in the real world (slides as PDF, all our presentations).