• eyetracking@ethz.ch

Category Archives: Outdoor eye tracking


Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.



Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science (accepted). PDF

The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time, without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations of up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
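The mapping step described in the abstract can be illustrated with a minimal, purely hypothetical sketch: assume the relative transformation between the scene camera and the known panorama view has already been estimated (in the paper this is done with real-time, marker-free computer vision methods) and is expressed as a 3×3 homography. The function names, the homography, and the object coordinates below are all illustrative assumptions, not the paper’s actual implementation.

```python
# Hypothetical sketch of the gaze-mapping step: the eye tracker yields a
# Point of Regard (PoR) in scene-camera pixels; a homography H (estimated
# elsewhere, e.g. by feature matching against a known panorama view) maps
# it into the reference frame, where objects of interest are stored as
# axis-aligned bounding boxes. All names and numbers are illustrative.

def map_por_to_reference(por, H):
    """Apply a 3x3 homography H (row-major nested lists) to a 2D point."""
    x, y = por
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def lookup_object(point, objects):
    """Return the name of the first object whose bounding box contains point."""
    px, py = point
    for name, (x0, y0, x1, y1) in objects.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None

# Toy example: a homography that is a pure translation by (100, 50)
# into the panorama frame, and one object of interest.
H = [[1, 0, 100], [0, 1, 50], [0, 0, 1]]
objects = {"clock_tower": (90, 40, 190, 140)}
ref_point = map_por_to_reference((20, 30), H)   # -> (120.0, 80.0)
print(lookup_object(ref_point, objects))        # prints "clock_tower"
```

In a real system the homography would be re-estimated for every video frame to keep up with head rotations, which is where the 63 deg/s figure from the evaluation becomes relevant.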



Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted in response to an open Call for Submissions, of which seven were accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].



Article in Horizonte

The latest issue of the Horizonte magazine, published by the Swiss National Science Foundation, reports on our research.

Source: Horizonte 111, December 2016

German (PDF)

English (PDF)



INNOLEC Lecture

Martin Raubal was invited to give the INNOLEC Lecture at the Department of Geography of Masaryk University in Brno, Czech Republic.

The title of his talk is: Gaze-based assistance for wayfinders in the real world (slides as PDF, all our presentations).



Vasilis Anagnostopoulos joins the team

Vasilis Anagnostopoulos has started as a PhD student in the LAMETTA project (Location-Aware Mobile Eye Tracking for Tourist Assistance).

Welcome to our team, Vasilis!

[Current Team]



Full Paper accepted at MobileHCI 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal (2015). GazeNav: Gaze-Based Pedestrian Navigation.  In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, NY, USA.

GazeNav Talk

Leading mobile HCI researchers from all over the world meet in Copenhagen to present innovative research and gadgets. Our research group is present with four contributions. Read More




Department Annual Report 2014

A summary of our research on “Gaze-Based Geographic Human Computer Interaction” (PDF) is included as a research highlight in the annual report 2014 of our department (D-BAUG, Department of Civil, Environmental and Geomatic Engineering).




PETMEI 2015

We are co-organizing a workshop at the UbiComp conference: the workshop on “Pervasive Eye Tracking and Mobile Eye-Based Interaction” (PETMEI 2015). The workshop is concerned with eye tracking and gaze-based interaction in mobile and everyday (“ubiquitous”) situations, such as in pedestrian navigation.



Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.



Master theses started

Two students have started their Master theses in the GeoGazeLab:

Aikaterini Tsampazi, a Master student in Geomatics, will use eye tracking to measure the visual behavior of wayfinders in a virtual environment in her Master thesis titled “Pedestrian Navigation: The use of navigation aids under time pressure in virtual urban environments”. The goal will be to investigate how pedestrian wayfinders behave under time pressure.

Yufan Miao, a Master student in Computational Science from Uppsala University, is visiting our group in spring and summer 2015. He will be working on his Master thesis on “Landmark detection for mobile eye tracking”, co-supervised by our group and the Chair of Photogrammetry and Remote Sensing (Prof. Schindler). The goal is to apply image processing techniques to outdoor eye tracking videos for the automatic computation of the object of regard.



Paper accepted at CHI Workshop 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. Watch What I Am Looking At! Eye Gaze and Head-Mounted Displays. In Mobile Collocated Interactions: From Smartphones to Wearables, Workshop at CHI 2015, Seoul, Korea, 2015.

[PDF]





ETH Zurich Research Grant

Great news right before the holiday season!

ETH Zurich will support our research on “Location-Aware Mobile Eye Tracking for Tourist Assistance” (LAMETTA) with an ETH Zurich Research Grant for a 3-year project, starting in 2015 (PI: Peter Kiefer).

The project will pioneer gaze-based interaction techniques for tourists in outdoor environments. It envisions mobile assistance systems that trigger information services based on the user’s gaze on touristic areas of interest. For instance, a gaze-based recommender system could notify the observer of a city panorama about buildings that match her interests, given the objects she has looked at before. The main objective of this project is to investigate novel gaze-based interaction methods for tourists exploring a city panorama.

Stay tuned for updates on LAMETTA!
