Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):
Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science (accepted). PDF
The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond the user’s location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head with respect to the surrounding scene, using computer vision methods that estimate the relative transformation between the camera and a known view of the scene in real time, without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system in which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations of up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama and the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
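The mapping step described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical Python illustration: it assumes the estimated camera-to-reference transformation has already been reduced to a homography `H` between the scene-camera frame and a reference panorama image (the paper estimates this transformation with markerless computer-vision methods; the object names and bounding boxes below are purely illustrative, not from the study):

```python
import numpy as np

def map_gaze_to_reference(gaze_px, H):
    """Project a 2D point of regard from the scene-camera frame into the
    reference panorama using a 3x3 homography H (homogeneous coordinates)."""
    x, y = gaze_px
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]

def lookup_object(point, objects):
    """Return the annotated object whose bounding box (in reference-image
    coordinates) contains the mapped gaze point, or None if there is none."""
    px, py = point
    for name, (x0, y0, x1, y1) in objects.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None

# Toy homography (a pure horizontal shift) and two annotated buildings.
H = np.array([[1.0, 0.0, 100.0],
              [0.0, 1.0,   0.0],
              [0.0, 0.0,   1.0]])
objects = {"Grossmünster": (550, 200, 700, 400),
           "Fraumünster":  (100, 250, 220, 420)}

# Gaze at (500, 300) in the camera frame maps to (600, 300) in the panorama.
print(lookup_object(map_gaze_to_reference((500, 300), H), objects))
```

In a real pipeline the homography would be re-estimated for every frame from feature matches between the camera image and the known panorama view, which is what makes the approach robust to head rotations.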
A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].
Nineteen manuscripts were submitted in response to an open Call for Submissions, seven of which were accepted after a rigorous review process.
Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal (2015). GazeNav: Gaze-Based Pedestrian Navigation. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, NY, USA.
Leading Mobile HCI researchers from all over the world are meeting in Copenhagen to present innovative research and gadgets. Our research group is present with four contributions.
We are co-organizing a workshop at the UbiComp conference: the workshop on “Pervasive Eye Tracking and Mobile Eye-Based Interaction” (PETMEI 2015). The workshop is concerned with eye tracking and gaze-based interaction in mobile and everyday (“ubiquitous”) situations, such as in pedestrian navigation.
Two students have started their Master theses in the GeoGazeLab:
Aikaterini Tsampazi, a Master student in Geomatics, will use eye tracking to measure the visual behavior of wayfinders in a virtual environment in her Master thesis titled “Pedestrian Navigation: The use of navigation aids under time pressure in virtual urban environments”. The goal will be to investigate how pedestrian wayfinders behave under time pressure.
Yufan Miao, a Master student in Computational Science from Uppsala University, is visiting our group in spring and summer 2015. He will be working on his Master thesis on “Landmark detection for mobile eye tracking”, co-supervised by our group and the Chair of Photogrammetry and Remote Sensing (Prof. Schindler). The goal is to apply image processing techniques to outdoor eye tracking videos for the automatic computation of the object of regard.
Great news right before the holiday season!
ETH Zurich will support our research on “Location-Aware Mobile Eye Tracking for Tourist Assistance” (LAMETTA) with an ETH Zurich Research Grant for a 3-year project, starting in 2015 (PI: Peter Kiefer).
The project will pioneer gaze-based interaction techniques for tourists in outdoor environments. It envisions mobile assistance systems that trigger information services based on the user’s gaze on touristic areas of interest. For instance, a gaze-based recommender system could notify the observer of a city panorama about buildings that match her interest, given the objects she has looked at before. The main objective of this project is to investigate novel gaze-based interaction methods for tourists exploring a city panorama.
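The envisioned recommender could work along these lines. This is a hypothetical sketch, not part of the project description: it builds an interest profile from the tags of objects the user has already looked at and ranks the remaining buildings by tag overlap (the catalog, tags, and building names are illustrative assumptions):

```python
from collections import Counter

def recommend(gazed, catalog, k=1):
    """Content-based sketch: score each unseen building by how many of its
    tags also occur among the tags of already-gazed objects."""
    profile = Counter(tag for name in gazed for tag in catalog[name])
    unseen = [name for name in catalog if name not in gazed]
    ranked = sorted(unseen,
                    key=lambda name: sum(profile[t] for t in catalog[name]),
                    reverse=True)
    return ranked[:k]

# Illustrative catalog of panorama objects with descriptive tags.
catalog = {
    "Grossmünster": {"church", "romanesque", "landmark"},
    "Fraumünster":  {"church", "gothic", "stained-glass"},
    "Opernhaus":    {"theatre", "neoclassical"},
    "St. Peter":    {"church", "clock-tower"},
}

# A user who has gazed at two churches is pointed to the third church.
print(recommend({"Grossmünster", "Fraumünster"}, catalog))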
Stay tuned for updates on LAMETTA!