• eyetracking@ethz.ch

Category Archives: Gaze-based interaction

  • -

3rd ET4S Workshop: Keynote by Roman Bednarik

We are glad to announce that the ET4S workshop 2018 will be opened with a keynote given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland:

 

Predicting user states from gaze and other multimodal data

Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.

 


  • -

Scientifica 2017 – Impressions

We demoed the LAMETTA project at this year’s Scientifica, the science exhibition of ETH Zurich and the University of Zurich, which attracted more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out our installation and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and its potential applications to tourist guides of the future.


  • -

GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!

You can find us from Friday, 1 September, to Sunday, 3 September, at the University of Zurich main building (West Foyer).

Check out our Scientifica video!


  • -

Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.


  • -

Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science. (accepted) [PDF]

The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head with respect to the surrounding scene, using computer vision methods that estimate the relative transformation between the camera and a known view of the scene in real time, without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system in which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations of up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, achieving a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
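The pipeline described in the abstract (back-project the gaze point through the scene camera, rotate it into the panorama’s reference frame, and look up the object of interest at the resulting direction) can be sketched roughly as follows. This is a minimal illustration only: all function names, camera intrinsics, and azimuth intervals below are invented for the example and are not taken from the paper, and the head orientation would in the real system come from image-based pose estimation rather than being given.

```python
import math

def gaze_to_ray(gx, gy, fx, fy, cx, cy):
    """Back-project a 2D gaze point (pixels) to a unit ray in camera coordinates,
    using a simple pinhole model with focal lengths (fx, fy) and principal point (cx, cy)."""
    x = (gx - cx) / fx
    y = (gy - cy) / fy
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

def rotate(R, v):
    """Apply a 3x3 rotation matrix (row-major nested lists) to a 3-vector."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def lookup_object(azimuth_deg, objects):
    """Return the panorama object whose azimuth interval contains the gaze azimuth."""
    for name, (lo, hi) in objects.items():
        if lo <= azimuth_deg <= hi:
            return name
    return None

# Head orientation w.r.t. the panorama reference frame (identity = looking
# straight along the reference axis); in the real system this transformation
# is estimated from the scene-camera image.
R_head = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Gaze point slightly right of the image center of a hypothetical 1280x720 camera.
ray_cam = gaze_to_ray(gx=700, gy=360, fx=600, fy=600, cx=640, cy=360)
ray_ref = rotate(R_head, ray_cam)
azimuth = math.degrees(math.atan2(ray_ref[0], ray_ref[2]))  # about 5.7 degrees

# Illustrative objects of interest, each covering an azimuth interval in the panorama.
objects = {"church tower": (-2.0, 8.0), "lake": (20.0, 45.0)}
print(lookup_object(azimuth, objects))  # prints: church tower
```

The actual system additionally handles head movement and real-time pose estimation; this sketch only shows the final mapping step from a gaze direction to a known object of interest.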


  • -

Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)


  • -

Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted in response to an open Call for Submissions, of which seven were accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].


  • -

Controllability matters: The user experience of adaptive maps

An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:

Controllability matters: The user experience of adaptive maps

Abstract: Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment, we study two adaptive map interfaces differing in their degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to yield a better user experience and lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should give their users control over when an adaptation is performed.


  • -

Article in Horizonte

The latest issue of the Horizonte magazine, published by the Swiss National Science Foundation, reports on our research.

Source: Horizonte 111, December 2016

German (PDF)

English (PDF)


  • -

INNOLEC Lecture

Martin Raubal was invited to give the INNOLEC Lecture at the Department of Geography of Masaryk University in Brno, Czech Republic.

The title of his talk was: Gaze-based assistance for wayfinders in the real world (slides as PDF; all our presentations).


  • -

GeoGazeLab at the Smarttention Workshop

The GeoGazeLab was represented at the Smarttention Workshop at MobileHCI, raising awareness of visual attention in adaptive interfaces.

Thanks to the organizers for their great work. We had inspiring discussions about the future of mobile UIs.


  • -

Paper accepted at PETMEI Workshop

Vasileios-Athanasios Anagnostopoulos and Peter Kiefer (2016). Towards gaze-based interaction with urban outdoor spaces. In 6th International Workshop on Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2016), UbiComp’16 Adjunct, New York, NY, USA, ACM. (accepted)


  • -


GeoGazeLab at GEOSummit 2016

This year at GEOSummit, our group presented a gaze-based geo-game. During the GEOSchool Day, pupils and adults could experience what it feels like to control maps just by using their gaze. We had great fun changing the world in the blink of an eye.



  • -

Full paper accepted at GIScience 2016

Peter Kiefer, Ioannis Giannopoulos, Andrew Duchowski, and Martin Raubal (2016). Measuring cognitive load for map tasks through pupil diameter. In Proceedings of the Ninth International Conference on Geographic Information Science (GIScience 2016). Springer.


  • -

Fabian Göbel joins the team

Fabian Göbel has started as a PhD student in the IGAMaps project (Intention-Aware Gaze-Based Assistance on Maps).

Great to have you on board, Fabian!

[Current Team]


  • -

BSc/MSc topics for spring 2016

We offer topics for student theses on Bachelor and Master level:

Bachelor (PDF, German)

Master (PDF, English)

You may also propose your own topic related to eye tracking, wayfinding, or gaze-based interaction. Contact us for more information!

The full lists of all topics (including non-eye tracking topics) can be found on the main page of the Chair of Geoinformation Engineering.


  • -

Open PhD position

We are looking for a PhD candidate (IGAMaps project).

More details and application on the ETH website, and as PDF.


  • -

New SNSF Project: IGAMaps

Exciting research project to be started soon!

Our project proposal on “Intention-Aware Gaze-Based Assistance on Maps” (IGAMaps) has been approved by the Swiss National Science Foundation (PI: Peter Kiefer, Co-PI: Martin Raubal, 1 PhD for 3 yrs).

The project envisions intention-aware gaze-based assistance on cartographic maps. A future intention-aware gaze-based assistive map could, for instance, recognize from the user’s gaze that he or she is planning a touristic round trip, and adapt to the user’s needs accordingly. The main objective of this project is to investigate methods for recognizing activities and intentions from gaze data collected from cartographic map users.



  • -

PETMEI 2015: Summary

The PETMEI 2015 workshop at UbiComp, which Peter Kiefer co-organized, took place on September 7 in Osaka, Japan. There were six presentations, a keynote by Ali Borji, a demo, and a group work session, all with very active participation and interesting discussions. The workshop ended with a dinner at a restaurant serving food from the Okinawa region, and some participants continued to a Japanese karaoke bar.

All in all, it has been a stimulating, fascinating and enjoyable event. Thanks to all participants, co-organizers, and sponsors!



  • -

Vasilis Anagnostopoulos joins the team

Vasilis Anagnostopoulos has started as a PhD student in the LAMETTA project (Location-Aware Mobile Eye Tracking for Tourist Assistance).

Welcome to our team, Vasilis!

[Current Team]


  • -

Short Paper accepted at Smarttention Workshop

Peter Kiefer and Ioannis Giannopoulos (2015). A Framework for Attention-Based Implicit Interaction on Mobile Screens. In Proceedings of the Workshop Smarttention, Intelligent Attention Management on Mobile Devices, in conjunction with MobileHCI 2015. ACM, New York, NY, USA. (accepted)


  • -

Full Paper accepted at MobileHCI 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal (2015). GazeNav: Gaze-Based Pedestrian Navigation. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, NY, USA.

GazeNav Talk

Leading mobile HCI researchers from all over the world meet in Copenhagen to present innovative research and gadgets. Our research group is present with four contributions.

 


  • -

Department Annual Report 2014

A summary of our research on “Gaze-Based Geographic Human Computer Interaction” (PDF) is included as a research highlight in the annual report 2014 of our department (D-BAUG, Department of Civil, Environmental and Geomatic Engineering).



  • -

PETMEI 2015

We are co-organizing a workshop at the UbiComp conference: the workshop on “Pervasive Eye Tracking and Mobile Eye-Based Interaction” (PETMEI 2015). The workshop is concerned with eye tracking and gaze-based interaction in mobile and everyday (“ubiquitous”) situations, such as pedestrian navigation.


  • -

Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.


  • -

Master theses started

Two students have started their Master theses in the GeoGazeLab:

Aikaterini Tsampazi, a Master student in Geomatics, will use eye tracking to measure the visual behavior of wayfinders in a virtual environment. Her Master thesis, titled “Pedestrian Navigation: The use of navigation aids under time pressure in virtual urban environments”, will investigate how pedestrian wayfinders behave under time pressure.

Yufan Miao, a Master student in Computational Science from Uppsala University, is visiting our group in spring and summer 2015. He will be working on his Master thesis on “Landmark detection for mobile eye tracking”, co-supervised by our group and the Chair of Photogrammetry and Remote Sensing (Prof. Schindler). The goal is to apply image processing techniques to outdoor eye tracking videos for the automatic computation of the object of regard.


  • -

Book chapter on gaze-based interaction for GIS (German)

Kiefer, P. (2015). Blickbasierte Mensch-Computer-Interaktion mit Geoinformationssystemen. In Thomas H. Kolbe, Ralf Bill, and Andreas Donaubauer, editors, Geoinformationssysteme 2015. Wichmann, Heidelberg.

[PDF]


  • -

Paper accepted at CHI Workshop 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. Watch What I Am Looking At! Eye Gaze and Head-Mounted Displays. In Mobile Collocated Interactions: From Smartphones to Wearables, Workshop at CHI 2015, Seoul, Korea, 2015.

[PDF]

 



  • -

Special Issue: Call for Submissions

We are planning a Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition and Computation: Call for Submissions (PDF).

Submission Deadline is May 27, 2015.
