LAMETTA is a research project on gaze-based interaction techniques in outdoor environments. Gaze estimation and tracking have attracted considerable research interest because of their many potential applications in areas such as human attention analysis and gaze-based user interfaces. The main question this project tries to answer is how to use the input from gaze-based devices in an intuitive, efficient, and privacy-preserving way to provide information services for touristic areas of interest, such as elements in a city panorama.
The goals of this project are to design and develop an outdoor gaze-based interaction platform and to investigate explicit and implicit gaze-based interaction methods for tourists visually exploring a city panorama. The project envisions a mobile assistance system that triggers information services based on the user’s gaze on touristic areas of interest.
To achieve these goals, the project will employ multimodal approaches combining gaze, audio, and vibro-tactile feedback methods for interaction with the user. Machine learning methods will be used to recognize touristic interest from gaze input. Data will be collected through both field-based studies in a real city and lab-based studies with different city panoramas projected in a virtual environment.
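To illustrate the kind of gaze-based interest recognition described above, the sketch below uses a simple dwell-time heuristic: an area of interest (AOI) is flagged as interesting once the user's cumulative gaze time on it exceeds a threshold. All names, data, and the threshold value are illustrative assumptions, not part of the LAMETTA platform.

```python
# Hypothetical sketch: detecting visual interest in panorama AOIs from raw
# gaze samples via a dwell-time heuristic (an assumed, simplified stand-in
# for the machine learning methods mentioned in the project description).
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    t: float             # timestamp in seconds
    aoi: Optional[str]   # AOI the gaze currently falls on, if any

def interested_aois(samples, min_dwell=1.5):
    """Return AOIs whose cumulative gaze dwell time is at least min_dwell seconds."""
    dwell = {}
    # Credit the interval between consecutive samples to the AOI gazed at first.
    for prev, cur in zip(samples, samples[1:]):
        if prev.aoi is not None:
            dwell[prev.aoi] = dwell.get(prev.aoi, 0.0) + (cur.t - prev.t)
    return {aoi for aoi, d in dwell.items() if d >= min_dwell}

# Example trace: 2.5 s total on "Grossmünster", 0.5 s on "Lake Zürich".
samples = [GazeSample(0.0, "Grossmünster"), GazeSample(1.0, "Grossmünster"),
           GazeSample(2.0, None), GazeSample(2.5, "Lake Zürich"),
           GazeSample(3.0, "Grossmünster"), GazeSample(3.5, None)]
print(interested_aois(samples))  # → {'Grossmünster'}
```

In a real system, such a detector would run continuously on the mobile eye tracker's gaze stream and trigger an audio or vibro-tactile information service when an AOI crosses the threshold.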
The insights and platform obtained from this project will also be helpful for other outdoor gaze-based interaction scenarios, such as pedestrian wayfinding assistance.
Check out our video for Scientifica 2017!
- Dr. Peter Kiefer (Principal Investigator)
- Prof. Dr. Martin Raubal (PhD Supervisor)
- Tiffany Kwok (PhD student, since February 2018)
- Vasilis Anagnostopoulos (PhD student, until April 2017)
Chair of Cognitive Science, ETH Zürich
- Prof. Dr. Christoph Hölscher
- Dr. Victor Schinazi
Chair of Photogrammetry and Remote Sensing, ETH Zürich
- Prof. Dr. Konrad Schindler
Chair of Computing in the Cultural Sciences, University of Bamberg
- Prof. Dr. Christoph Schlieder
- Start Date: 01.09.2015
- End Date: 30.09.2019
- Research: Gaze-Informed LBS