• eyetracking@ethz.ch

News

  • -

New project in aviation research

The European Commission will fund our project “Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions (PEGGASUS)” (January 2019 – January 2021) within the Clean Sky initiative, which is part of the H2020 program.

We will conduct eye tracking studies with professional pilots in flight simulators to analyze novel gaze-based interactions in the cockpit and their use in pilot training. Read more about our aviation research.

The project is a cooperation with SWISS International Air Lines, CSEM, SERMA and Thales.


  • -

Short Paper and Workshop Paper accepted at GIScience 2018

We are happy to announce that two of our papers have been accepted at GIScience 2018 and at the Workshop on Spatial Big Data and Machine Learning in GIScience:

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos and Martin Raubal. 2018. Gaze Sequences and Map Task Complexity. GIScience 2018, Melbourne, Australia.

Fabian Göbel and Henry Martin. 2018. Unsupervised Clustering of Eye Tracking Data. Spatial Big Data and Machine Learning in GIScience, Workshop at GIScience 2018, Melbourne, Australia.

Both works are part of the IGAMaps project.


  • -

Best Paper at ETVIS 2018

We are happy to announce that our paper received the Best Paper Award at ETVIS!

David Rudi, Peter Kiefer, and Martin Raubal. 2018. Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization.

The paper is part of the Awareness in Aviation project.


  • -

LAMETTA at GeoSummit: Visit by Federal Councillor Guy Parmelin

We presented the LAMETTA project at GeoSummit in Bern (6–7 June 2018), Switzerland's largest congress for geoinformatics, surveying, and planning.

Federal Councillor Guy Parmelin was one of the first visitors to our exhibit and showed great interest in the system. As he had to give the opening speech shortly afterwards, there was no time for him to try out the gaze-based tourist guide to Lake Lucerne himself, but he seemed impressed even by the short visit.

A large number of visitors from both industry and academia came to our exhibit and tried out the system. In addition, our exhibit was part of the GeoSchoolDay, an event held in conjunction with GeoSummit that introduces high-school students to the applications and opportunities of geoinformation technologies. Approximately 500 pupils visited LAMETTA and learned about eye tracking and its use in interactive systems.


  • -

Papers accepted at ETRA and ETVIS

We are happy to announce that two of our papers have been accepted at ETRA and ETVIS.

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos, Andrew T. Duchowski, and Martin Raubal. 2018. Improving Map Reading with Gaze-Adaptive Legends. In ETRA ’18: 2018 Symposium on Eye Tracking Research & Applications

David Rudi, Peter Kiefer, and Martin Raubal. 2018. Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization.

These papers are part of the IGAMaps and Awareness in Aviation projects.

Peter Kiefer has also been involved in ETRA as an Area Chair.


  • -

Ioannis Giannopoulos Professor at TU Vienna

We congratulate Ioannis Giannopoulos on his new professorship at TU Vienna, where he has been heading the Research Group Geoinformation at the Department of Geodesy and Geoinformation since January. Ioannis was a PhD student and postdoc with the GeoGazeLab between 2012 and 2017.


  • -

Science City March 2018 – Impressions

The LAMETTA project was demoed at this year’s “Treffpunkt Science City” event, an educational program of ETH Zurich for the general public that attracted more than 3,000 visitors.

Our panorama wall installation and the LAMETTA software let visitors experience the view as if they were standing at a vantage point. Just by looking at an area of interest (such as lakes, mountains, and villages), users received related information from the system.


  • -

Meeting point Science City – March 2018

We’re excited to demonstrate the LAMETTA project at ETH Treffpunkt Science City, ETH Zurich’s educational program for the general public. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (more details)!

You can find us on Sunday, 25 March at ETH Hönggerberg, HCI building, Room E2.


  • -

Call for Papers: Spatial Big Data and Machine Learning in GIScience

Have you ever thought of eye tracking data as spatial “big” data? Are you collecting large amounts of eye tracking data together with geospatial coordinates, or are you applying machine learning to such data? Then the workshop on “Spatial Big Data and Machine Learning in GIScience” at this year’s GIScience conference in Melbourne might be interesting for you.

The 1st Call for Papers is now available on the website.


  • -

Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018) Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors

This research is part of the LAMETTA project.


  • -

Tiffany Kwok joins the team

We welcome Tiffany as a new PhD student in the LAMETTA project!

[Current Team]


  • -

Full Paper accepted at CHI 2018

Andrew T. Duchowski, Krzysztof Krejtz, Izabela Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal (2018). The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2018), ACM (accepted)


  • -

3rd ET4S Workshop

Many thanks to all who participated in this workshop and made it an inspiring event.

We have published the workshop proceedings through the ETH Research Collection.


  • -

Student Project Finished: A Public Gaze-Controlled Campus Map

Four Geomatics Master’s students (Nikolaos Bakogiannis, Katharina Henggeler, Roswita Tschümperlin and Yang Xu) have developed a public gaze-controlled campus map as part of an interdisciplinary project this autumn semester.

The system prototype was tested in a one-week field study at the Campus Info Point at ETH Hönggerberg with 50 campus visitors.

The results of the project will be presented at a public event on Thursday, 14 December 2017, from 17:00 to 18:00 in HIL D 53. You are welcome to try out the system yourself during the apéro afterwards.

We’d like to thank the visitor and information management of ETH Zurich Services (in particular Stephanie Braunwalder) for supporting this project.


  • -

3rd ET4S Workshop: Program Online

The program of the ET4S workshop on 14 January 2018 is now available on the workshop website: http://spatialeyetracking.org/et4s-2018/program



  • -

3rd ET4S Workshop: Call for Demos

Do you have an interesting eye tracking research prototype to show? Would you like to join the ET4S workshop, but missed the deadline for regular papers? Do you have a regular paper accepted and would like to take the opportunity to gain even more visibility and get direct feedback on your system?

We’ll have a dedicated demo session at the ET4S workshop in January 2018 in Zurich. We hope you consider submitting a short abstract (1 page) with a demo proposal (by 8 November).

ET4S website: http://spatialeyetracking.org/et4s-2018/


  • -

3rd ET4S Workshop: Keynote by Roman Bednarik

We are glad to announce that the ET4S workshop 2018 will be opened with a keynote given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland:


Predicting user states from gaze and other multimodal data

Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.



  • -

3rd ET4S Workshop: Deadline Extension

Due to several requests, the submission deadline for the ET4S 2018 workshop has been extended to

11 October 2017

ET4S website


  • -

Scientifica 2017 – Impressions

We have demoed the LAMETTA project at this year’s Scientifica, the Zurich science exhibition of ETH Zurich and University of Zurich with more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.


  • -

3rd ET4S Workshop – Registration now open

Registration for the ET4S workshop and the LBS 2018 conference is now open. An early-bird discount is available until 28 November 2017.


We are looking forward to seeing you in Zurich.


  • -

GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!

You can find us Friday, 1 September to Sunday, 3 September at University of Zurich main building (West Foyer).

Check out our Scientifica video!


  • -

Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.


  • -

Culmann Award

Dr. Ioannis Giannopoulos received the ETH Zurich Culmann Award in 2017 for his outstanding doctoral thesis, “Supporting Wayfinding Through Mobile Gaze-Based Interaction”.



  • -

An inverse-linear logistic model of the main sequence

Peter Kiefer and Ioannis Giannopoulos have contributed to an article titled “An inverse-linear logistic model of the main sequence” (Journal of Eye Movement Research, JEMR). It is now available online:

http://dx.doi.org/10.16910/jemr.10.3.4

Abstract. A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid-amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation of duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data at the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and superposition of both. We are confident the model will suitably extend to the large-amplitude range of eye movements.
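For intuition, the kind of S-shaped fit described above can be sketched with a generic four-parameter logistic curve for peak saccade velocity as a function of amplitude. This is an illustrative sketch only: the function form, parameter names, and values below are hypothetical and are not the paper's exact inverse-linear logistic formulation.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_peak_velocity(amplitude, v_min, v_max, k, a_mid):
    """Generic logistic (S-shaped) curve: peak velocity (deg/s) vs.
    saccade amplitude (deg). v_min/v_max control the asymptotes at
    very small and very large amplitudes, k the slope over the
    mid-amplitude range, and a_mid the curve's midpoint."""
    return v_min + (v_max - v_min) / (1.0 + np.exp(-k * (amplitude - a_mid)))

# Synthetic "main sequence" data: a known curve plus measurement noise.
rng = np.random.default_rng(0)
amps = np.linspace(0.1, 40, 200)                          # amplitudes in degrees
true_v = logistic_peak_velocity(amps, 20, 500, 0.3, 12)   # ground-truth curve
data = true_v + rng.normal(0, 10, amps.size)              # noisy observations

# Fit the model to the aggregate data.
params, _ = curve_fit(logistic_peak_velocity, amps, data, p0=[10, 400, 0.1, 10])
print(params)  # recovered (v_min, v_max, k, a_mid), close to (20, 500, 0.3, 12)
```

Fitting a curve of this family to pooled microsaccade and saccade data is the kind of procedure the abstract describes; refer to the article itself for the actual model.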


  • -

3rd ET4S Workshop: Call for Papers

We’re glad to announce the 3rd International Workshop on Eye Tracking for Spatial Research (ET4S), which will take place January 14, 2018 in Zurich.

The workshop aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information. This is the 3rd edition of the workshop, after two successful ET4S workshops in Scarborough (2013) and Vienna (2014). This time the workshop will be co-located with the 14th International Conference on Location Based Services (LBS 2018).

The workshop will be opened with an invited talk given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland.

The Call for Papers is now available (Paper submission deadline: September 27, 2017).

ET4S 2018 is supported by Ergoneers.


  • -

Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-Informed Location Based Services. International Journal of Geographical Information Science (accepted). PDF

The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real-time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system, for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall that reaches over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
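One step in a pipeline like this can be illustrated in isolation: once vision-based localization has estimated a planar homography between the scene camera frame and a known reference panorama, mapping the Point of Regard into panorama coordinates is a single projective transform. The sketch below is a minimal illustration under that assumption; the function name and example homography are hypothetical, and the paper's actual real-time pipeline is considerably more involved.

```python
import numpy as np

def map_gaze_to_panorama(gaze_xy, H):
    """Project a Point of Regard from scene-camera pixel coordinates
    into reference-panorama coordinates using a 3x3 homography H."""
    x, y = gaze_xy
    p = H @ np.array([x, y, 1.0])  # apply transform in homogeneous coords
    return p[:2] / p[2]            # dehomogenize back to 2D

# Toy homography: scale by 2 and shift by (100, 50).
H = np.array([[2.0, 0.0, 100.0],
              [0.0, 2.0,  50.0],
              [0.0, 0.0,   1.0]])

print(map_gaze_to_panorama((320, 240), H))  # → [740. 530.]
```

Given the mapped panorama coordinates, the system can then look up which known object of interest (lake, mountain, building) the user is fixating.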


  • -

SWISS Eye Tracking project @ Lufthansa Digital Aviation Forum 2017

This year, together with our colleagues from SWISS International Air Lines Ltd., we presented our ongoing project at the Lufthansa Digital Aviation Forum 2017 in Frankfurt am Main. Visit http://releasd.com/3583 and http://newsroom.lufthansagroup.com/de/themen/digital-aviation.html for further information on the forum.


  • -

Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)


  • -

GeoGazeLab goes Facebook

Now you can find us on Facebook!