Four Geomatics Master students (Nikolaos Bakogiannis, Katharina Henggeler, Roswita Tschümperlin, and Yang Xu) have developed a public gaze-controlled campus map as part of an interdisciplinary project this autumn semester.
The prototype was tested in a one-week field study with 50 campus visitors at the Campus Info Point at ETH Hönggerberg.
The results of the project will be presented at a public event on Thursday, 14 December 2017, from 17:00 to 18:00 in HIL D 53. During the apéro afterwards, you are welcome to try the system yourself.
We would like to thank the visitor and information management of ETH Zurich Services (in particular Stephanie Braunwalder) for supporting this project.
Predicting user states from gaze and other multimodal data
Abstract: In this talk I will present research conducted by our team at UEF on user state recognition during problem solving and in other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.
Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.
Visitors of all ages tried out the installation and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.
We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and the University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it using only your gaze (details in German)!
You can find us from Friday, 1 September, to Sunday, 3 September, at the University of Zurich main building (West Foyer).
Check out our Scientifica video!
Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):
Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science (accepted). PDF
The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich) and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).
Abstract: Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head with respect to the surrounding scene, using computer vision methods that allow estimating the relative transformation between the camera and a known view of the scene in real time, without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations of up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama; here the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
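At the core of such a service is the mapping of the gaze point from the eye tracker’s scene camera into a reference view in which the objects of interest are annotated. The following sketch illustrates this idea in Python with OpenCV, assuming a planar reference panorama, ORB feature matching, and a hypothetical region polygon ("Grossmuenster" is an invented example); it is a simplified illustration, not the implementation evaluated in the paper.

```python
import cv2
import numpy as np

# Hypothetical annotated objects of interest in the reference panorama (name -> polygon).
REGIONS = {
    "Grossmuenster": np.array([[1200, 300], [1400, 300], [1400, 600], [1200, 600]], dtype=np.float32),
}

def gaze_to_panorama(scene_frame, panorama, gaze_xy):
    """Map a gaze point from the scene-camera frame into the reference panorama."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(scene_frame, None)
    kp2, des2 = orb.detectAndCompute(panorama, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # scene frame -> panorama
    mapped = cv2.perspectiveTransform(np.float32([[gaze_xy]]), H)
    return float(mapped[0, 0, 0]), float(mapped[0, 0, 1])

def object_of_regard(panorama_xy):
    """Return the annotated object whose polygon contains the mapped gaze point, if any."""
    for name, polygon in REGIONS.items():
        if cv2.pointPolygonTest(polygon, panorama_xy, False) >= 0:
            return name
    return None
```

In the actual system the per-frame pose estimation would of course have to run in real time and be robust to illumination changes and fast head motion; the sketch only shows the geometric mapping and lookup step.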
Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands (accepted).
A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].
Nineteen manuscripts were submitted in response to an open Call for Submissions, of which seven were accepted after a rigorous review process.
An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:
Abstract: Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment we study two adaptive map interfaces differing in their degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to provide a better user experience and a lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should provide their users with control over when an adaptation is performed.
Vasileios-Athanasios Anagnostopoulos and Peter Kiefer (2016). Towards gaze-based interaction with urban outdoor spaces. In 6th International Workshop on Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2016), UbiComp’16 Adjunct, New York, NY, USA, ACM (accepted).
Fabian Göbel, Ioannis Giannopoulos and Martin Raubal (2016). The Importance of Visual Attention for Adaptive Interfaces. In Proceedings of the Workshop Smarttention, Please! Intelligent Attention Management on Mobile Devices, in conjunction with MobileHCI 2016. ACM, New York, NY, USA (accepted)
Peter Kiefer, Ioannis Giannopoulos, Andrew Duchowski and Martin Raubal (2016). Measuring cognitive load for map tasks through pupil diameter. In Proceedings of the Ninth International Conference on Geographic Information Science (GIScience 2016). Springer.
We offer topics for student theses on Bachelor and Master level:
Bachelor (PDF, German)
Master (PDF, English)
You may also propose your own topic related to eye tracking, wayfinding, or gaze-based interaction. Contact us for more information!
The full list of topics (including non-eye tracking topics) can be found on the main page of the Chair of Geoinformation Engineering.
Exciting research project to be started soon!
The project envisions intention-aware gaze-based assistance on cartographic maps. A future intention-aware gaze-based assistive map could, for instance, recognize from the user’s gaze that he or she is planning a touristic round trip, and adapt to the user’s needs accordingly. The main objective of this project is to investigate methods for recognizing activities and intentions from gaze data collected from cartographic map users.
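To make the idea concrete, a very simple baseline for activity recognition from map-viewing gaze data could aggregate fixation statistics over a time window and feed them to an off-the-shelf classifier. The sketch below is a hypothetical illustration: the feature set, the labels such as "route_planning", the placeholder variables `training_windows`/`training_labels`, and the use of scikit-learn are assumptions, not the project’s actual method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gaze_features(fixations):
    """Aggregate features from a window of (x, y, duration_ms) fixations on the map."""
    xy = np.array([(x, y) for x, y, _ in fixations], dtype=float)
    durations = np.array([d for _, _, d in fixations], dtype=float)
    saccades = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # fixation-to-fixation distances
    return [
        durations.mean(), durations.std(),  # fixation duration statistics
        saccades.mean(), saccades.std(),    # saccade amplitude statistics
        len(fixations),                     # number of fixations in the window
        xy[:, 0].std(), xy[:, 1].std(),     # spatial dispersion of attention
    ]

# Assuming labeled recordings are available, e.g. with labels "route_planning" or "search".
X = [gaze_features(window) for window in training_windows]
classifier = RandomForestClassifier(n_estimators=200).fit(X, training_labels)
predicted_activity = classifier.predict([gaze_features(new_window)])[0]
```

In the project itself, more elaborate sequence models and intention hierarchies are conceivable; the sketch only illustrates the general feature-based recognition setup.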
The PETMEI 2015 workshop at UbiComp, which Peter Kiefer co-organized, took place on September 7 in Osaka, Japan. There were six presentations, a keynote by Ali Borji, a demo, and a group work session, all with very active participation and interesting discussions. The workshop ended with a workshop dinner at a restaurant serving food from the Okinawa region, and some participants continued on to a Japanese karaoke bar.
All in all, it has been a stimulating, fascinating and enjoyable event. Thanks to all participants, co-organizers, and sponsors!
Peter Kiefer, and Ioannis Giannopoulos (2015). A Framework for Attention-Based Implicit Interaction on Mobile Screens. In Proceedings of the Workshop Smarttention, Intelligent Attention Management on Mobile Devices, in conjunction with MobileHCI 2015. ACM, New York, NY, USA (accepted)
Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal (2015). GazeNav: Gaze-Based Pedestrian Navigation. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, NY, USA.
Leading Mobile HCI researchers from all over the world meet in Copenhagen to present innovative research and gadgets. Our research group is present with four contributions. Read More
We are co-organizing a workshop at the UbiComp conference: the workshop on “Pervasive Eye Tracking and Mobile Eye-Based Interaction” (PETMEI 2015). The workshop is concerned with eye tracking and gaze-based interaction in mobile and everyday (“ubiquitous”) situations, such as pedestrian navigation.
Two students have started their Master theses in the GeoGazeLab:
In her Master thesis, titled “Pedestrian Navigation: The use of navigation aids under time pressure in virtual urban environments”, Aikaterini Tsampazi, a Master student in Geomatics, will use eye tracking to measure the visual behavior of wayfinders in a virtual environment. The goal is to investigate how pedestrian wayfinders behave under time pressure.
Yufan Miao, a Master student in Computational Science from Uppsala University, is visiting our group in spring and summer 2015. He will be working on his Master thesis on “Landmark detection for mobile eye tracking”, co-supervised by our group and the Chair of Photogrammetry and Remote Sensing (Prof. Schindler). The goal is to apply image processing techniques to outdoor eye tracking videos for the automatic computation of the object of regard.
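One simple way to approach the automatic computation of the object of regard, sketched below under assumed names and thresholds (not necessarily the method the thesis will adopt), is to crop a patch around the gaze point in each scene-video frame and match it against reference photos of known landmarks, assigning the frame to the landmark with the strongest feature match.

```python
import cv2

def match_score(patch, reference, min_matches=10):
    """Count strong ORB feature matches between a gaze-centered patch and a landmark photo."""
    orb = cv2.ORB_create(1000)
    _, des1 = orb.detectAndCompute(patch, None)
    _, des2 = orb.detectAndCompute(reference, None)
    if des1 is None or des2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    good = [m for m in matcher.match(des1, des2) if m.distance < 50]
    return len(good) if len(good) >= min_matches else 0

def object_of_regard(frame, gaze_xy, landmark_images, half_size=150):
    """Return the landmark whose reference image best matches the region around the gaze point."""
    x, y = int(gaze_xy[0]), int(gaze_xy[1])
    patch = frame[max(0, y - half_size):y + half_size, max(0, x - half_size):x + half_size]
    scores = {name: match_score(patch, image) for name, image in landmark_images.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

More robust variants could aggregate decisions over consecutive frames or combine the matching with the pose-estimation approach used in the LAMETTA project; the sketch only illustrates the basic per-frame idea.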