Winter School 2023

We are co-organizing an ETH Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis”, which will take place in Monte Verità (Ticino), Switzerland, from 8 to 13 January 2023. Download the first announcement as PDF.

The Winter School targets PhD students and early postdocs (from any research field) who are using, or planning to use, eye tracking in their research. Internationally recognized experts will provide lectures and hands-on sessions on eye tracking methodology, experimental design, and analysis.

The registration for the Winter School is now open.

Building on the successful 2016 Winter School, the 2023 School will be updated to focus on state-of-the-art software (licensed and open-source, e.g., PsychoPy and Pupil Labs) and hardware. Hands-on exercises will focus on table-mounted eye trackers.


Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022) that is taking place on September 5-9, 2022, in Kobe, Japan.

Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting other researchers and discussing exciting research!

More information: http://cosit2022.iniad.org



FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight. International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in one of the next issues of the International Journal of Human–Computer Interaction.

It has been published as Open Access and you can get the PDF here:


Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye tracking hardware and software developed in PEGGASUS has been published in the SPIE Digital Library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


Control Room Decision-Making Lab at Singapore-ETH Centre

We are happy to announce the opening of our new Control Room Decision-Making Lab at the Singapore-ETH Centre (SEC) for our project on Communicating Predicted Disruptions in Future Resilient Systems (FRS 2). The lab is equipped with sensors for measuring decision makers’ psycho-physiological state, including remote and head-mounted eye trackers, galvanic skin response sensors, and accelerometers. The new lab infrastructure will be used to study how different communication techniques can support decision-makers in control rooms. This research will inform the next generation of control rooms, thus enhancing the resilience of the monitored infrastructure in case of disruptions.

We want to thank all our collaborators and the SEC management for their help setting up the laboratory.

Control Room Decision-Making Lab


Successful finish of PEGGASUS and summary

We are happy to report that our aviation project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has been successfully completed.

We would like to thank all partners involved in the project for their extensive efforts to finish the project successfully and deliver the results despite all the Covid-related restrictions and hurdles.

You can find a summary of the project outcomes at the EU Cordis Portal:

“The PEGGASUS consortium has developed a multi-camera vision system for tracking eye-gaze and gestures of the pilots in real time in the cockpit environment. This system allows a leap towards a more comprehensive human-machine interface in the cockpit to reduce the stress and cognitive load of the pilots, while bringing forward future pilot training techniques. Better awareness of the instruments will help the flight management to optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.”

Two images showing the results of the algorithms including pupil detection and eye gaze estimation

Ultimate Prototype Hardware setup installed in the cockpit simulator

Excerpt and images taken from: https://cordis.europa.eu/project/id/821461/reporting

This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


Kevin Gonyop Kim joins the team

We’re excited to welcome Kevin Gonyop Kim, who has now started as a postdoctoral researcher in the 3D sketch maps project. Welcome to the team, Kevin!


Joint Talk at the Royal Aeronautical Society Flight Simulation Conference

We had the honor of giving a presentation at the Royal Aeronautical Society’s Flight Simulation Conference, which took place from October 26 to 27 in London.

In the joint talk with Gilad Scherpf of Lufthansa Group, we presented results from the PEGGASUS project and showcased how eye and gesture tracking can support the assessment of Evidence-Based Training (EBT) competencies.

We want to thank the hosts at the RAeS for the invitation and the attendees for the very positive feedback and interesting questions during the panel discussion. A special thank you also goes out to our partners at SWISS and CSEM.

The full programme of the conference can be found here.


The talk was given as part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


PhD graduation Fabian Göbel

Fabian Göbel has successfully completed his doctoral thesis on “Visual Attentive User Interfaces for Feature-Rich Environments”. The doctoral graduation was approved by the department conference at its last meeting. Congratulations, Fabian!

After his thesis defense, Fabian started a research internship at Microsoft on the topic of interaction with HoloLens 2. We wish him all the best and thank him for all the contributions he has made to our research!


Workshop Paper presented at the INTERACT 2021

Our paper “Improving resilience by communicating predicted disruptions in control rooms” has been presented at the INTERACT 2021 Workshop on Control Rooms in Safety Critical Contexts (CRiSCC): Design, Engineering and Evaluation Issues. The full-day workshop was held in hybrid form in Bari, Italy, with 13 interdisciplinary researchers. The vision paper outlines some of the ideas and challenges we are addressing in the FRS 2 project on “Optimizing strategies for communicating predicted disruptions to stakeholders”:

Chakraborty, S., Kiefer, P., & Raubal, M. (2021). Improving resilience by communicating predicted disruptions in control rooms, INTERACT 2021.

Abstract: Even though the importance of resilience for control rooms is generally acknowledged, cognitive resilience is often not taken into account properly during control room design. This vision paper aims at improving the cognitive resilience in control rooms through advancements in three key research areas: 1) automated detection of upcoming disruptions, 2) visualization of spatio-temporal uncertainty, 3) cognition-aware interaction design.


New project and open position: 3D Sketch Maps

We’re very much looking forward to the start of the 3D Sketch Maps project, for which we have now announced an open PhD position: Interaction with 3D Sketch Maps in Extended Reality.

The 3D Sketch Maps project, funded by the Swiss National Science Foundation in the scope of the Sinergia funding program, investigates 3D sketch maps from a theoretical, empirical, cognitive, as well as tool-related perspective, with a particular focus on Extended Reality (XR) technologies. Sketch mapping is an established research method in fields that study human spatial decision-making and information processing, such as navigation and wayfinding. Although space is naturally three-dimensional (3D), contemporary research has focused on assessing individuals’ spatial knowledge with two-dimensional (2D) sketches. For many domains though, such as aviation or the cognition of complex multilevel buildings, it is essential to study people’s 3D understanding of space, which is not possible with the current 2D methods. Eye tracking will be used for the analysis of people’s eye movements while using the sketch mapping tools.

The 4-year project will be carried out jointly by the Chair of Geoinformation Engineering, the Chair of Cognitive Science at ETH Zurich (Prof. Dr. Christoph Hölscher), and the Spatial Intelligence Lab at University of Münster (Prof. Dr. Angela Schwering).

Interested? Please check out the open PhD position on the ETH job board!


Adrian Sarbach joins the team

We’re glad that Adrian Sarbach has joined the geoGAZElab as a doctoral student on the project “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)”.

Adrian studied electrical engineering at EPFL (Bachelor) and ETH Zurich (Master). He wrote his MSc thesis in collaboration with Swiss International Air Lines on the topic of tail assignment optimization.


Full Paper presentation at ETRA 2021

Our accepted paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” will be presented at ACM ETRA 2021:

May 25, 2021, 11:00–12:00 and 18:00–19:00 in “Posters & Demos & Videos”
May 26, 2021, 14:45–16:15 in Track 1: “Full Papers V”

Join the virtual conference for a chat!


Kuno Kurzhals: Junior Research Group Lead

Kuno Kurzhals has left us for his new position as Junior Research Group Lead in the Cluster of Excellence Integrative Computational Design and Construction for Architecture (IntCDC) at the University of Stuttgart, Germany.

We wish him all the best and thank him for the contributions he has made to the geoGAZElab during his PostDoc time!


New project and open position: EFDISA

We are excited to have received funding from the Swiss Federal Office of Civil Aviation (BAZL) for a new project starting in July 2021: “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)”. The project aims at improving contemporary pre-flight and in-flight representations of weather data for pilots, allowing them to better perceive, understand, and anticipate meteorological hazards. The project will be carried out in close collaboration with industry partners and professional pilots (Swiss International Air Lines & Lufthansa Systems).

We are looking for a highly motivated doctoral student for this project. Applications are now open.


Results of the Interdisciplinary Project 2020

As an interdisciplinary project, the three Geomatics Master students Laura Schalbetter, Tianyu Wu, and Xavier Brunner have developed an indoor navigation system for Microsoft HoloLens 2. The system was implemented using ESRI CityEngine, Unity, and Microsoft Visual Studio.

Check out their video:


Workshop Paper published from ICCAS 2020

Our paper “Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning” has been published in the proceedings of the 1st International Conference on Cognitive Aircraft Systems:

Lutnyk, L., Rudi, D., Kiefer, P., & Raubal, M. (2020). Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning. ICCAS 2020.

Abstract. Moving towards the highly controversial single pilot cockpit, more and more automation capabilities are added to today’s airliners. However, to operate safely without a pilot monitoring, avionics systems in future cockpits will have to be able to intelligently assist the remaining pilot. One critical enabler for proper assistance is a reliable classification of the pilot’s state, both in normal conditions and more critically in abnormal situations like an equipment failure. Only with a good assessment of the pilot’s state, the cockpit can adapt to the pilot’s current needs, i.e. alert, adapt displays, take over tasks, monitor procedures, etc.

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


Suvodip Chakraborty starting in January

After a pandemic-related delay in the hiring process, we’re excited to announce that Suvodip Chakraborty will start as a PhD student in our Singapore-based project on Communicating Predicted Disruptions in the scope of the Future Resilient Systems 2 research program. Suvodip will start in January 2021.

Suvodip holds a Master of Science from the Indian Institute of Technology Kharagpur. His Master thesis was titled “Design of Electro-oculography based wearable systems for eye movement analysis”.


ETRA 2021 Call for Demos & Videos

As in the previous year, we are involved in the organization of ETRA 2021, the ACM Symposium on Eye Tracking Research & Applications. Peter is again co-chairing the Demo & Video Track, for which the Call is now available online. The submission deadline is 2 February 2021.


Workshop talk: Flight safety – Line of sight

Pilots not only have to make the right decisions, but they have to do it quickly and process a lot of information – especially visual information. In a unique project, ETH Zurich and Swiss International Air Lines have investigated what the eyes of pilots do in this process.

Martin Raubal, Professor of Geoinformation Engineering at ETH Zurich, appreciates the practical relevance of this research collaboration, which could contribute to increasing flight safety. Anyone who wants to develop it further should take off their blinders and think outside the box, says Christoph Ammann, captain and instructor at Swiss. And ETH Zurich is an ideal partner for this.

Watch the video on Vimeo!


Book Chapter on Outdoor HCI accepted

Kiefer, P., Adams, B., Kwok, T., Raubal, M. (2020) Modeling Gaze-Guided Narratives for Outdoor Tourism. In: McCrickard, S., Jones, M., and Stelter, T. (eds.): HCI Outdoors: Theory, Design, Methods and Applications. Springer International Publishing (in print)


ET4S and ETVIS proceedings

We’ve been involved in the organization of two co-located events at this year’s ETRA conference: Eye Tracking for Spatial Research (ET4S) and Eye Tracking and Visualization (ETVIS). Even though ETRA and all co-located events had to be canceled, the review process was finished regularly, and accepted papers are now available in the ETRA Adjunct Proceedings in the ACM Digital Library.

Accepted ET4S papers are also linked from the ET4S website.


New article on iAssyst

The instructor assistant system (iAssyst), which we developed as part of our research collaboration with Swiss International Air Lines, is featured in an article by innoFRAtor, the innovation portal of Fraport AG.

You may read more about the system in our related research article: Rudi D., Kiefer P., and Raubal M. (2020). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Commercial Aviation Training Purposes. Ergonomics, vol. 63, no. 1, pp. 61-79, London: Taylor & Francis, 2020. DOI: https://doi.org/10.1080/00140139.2019.1685132

Our project on Enhanced flight training program for monitoring aircraft automation with Swiss International Air Lines, NASA, and the University of Oregon was officially concluded at the end of last year.


Workshop Paper published from ETAVI 2020

Our paper “Towards Pilot-Aware Cockpits” has been published in the proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020):

Lutnyk L., Rudi D., and Raubal M. (2020). Towards Pilot-Aware Cockpits. In Proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020), ETH Zurich. DOI: https://doi.org/10.3929/ethz-b-000407661

Abstract. Eye tracking has a longstanding history in aviation research. Amongst others it has been employed to bring pilots back “in the loop”, i.e., create a better awareness of the flight situation. Interestingly, there exists only little research in this context that evaluates the application of machine learning algorithms to model pilots’ understanding of the aircraft’s state and their situation awareness. Machine learning models could be trained to differentiate between normal and abnormal patterns with regard to pilots’ eye movements, control inputs, and data from other psychophysiological sensors, such as heart rate or blood pressure. Moreover, when the system recognizes an abnormal pattern, it could provide situation specific assistance to bring pilots back in the loop. This paper discusses when pilots benefit from such a pilot-aware system, and explores the technical and user oriented requirements for implementing this system.

Edit. The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


GeoGazeLab involved in Future Resilient Systems II programme

The second phase of the FRS programme at the Singapore-ETH Centre officially started on April 1st with an online research kick-off meeting. It was launched in the midst of a global crisis – COVID-19, highlighting the need to better understand and foster resilience. Within FRS-II there is a particular emphasis on social resilience to enhance the understanding of how socio-technical systems perform before, during and after disruptions.

GeoGazeLab researchers will contribute within a research cluster focusing on distributed cognition (led by Martin Raubal). More specifically, we will develop a visualization, interaction, and notification framework for communicating predicted disruptions to stakeholders. Empirical studies utilizing eye tracking and gaze-based interaction methods will be part of this project, which is led by Martin Raubal and Peter Kiefer.


Full Paper accepted at ETRA 2020

Our paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” has been accepted at ACM ETRA 2020 as a full paper:

Göbel, F., Kurzhals K., Schinazi V. R., Kiefer, P., and Raubal, M. (2020). Gaze-Adaptive Lenses for Feature-Rich Information Spaces. In Proceedings of the 12th ACM Symposium on Eye Tracking Research & Applications (ETRA ’20), ACM. DOI: https://doi.org/10.1145/3379155.3391323


Workshop Paper accepted at CHI 2020

Our workshop contribution “Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking” has been accepted at the “Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI” at ACM CHI 2020:

Fabian Göbel, Kuno Kurzhals, Martin Raubal, and Victor R. Schinazi (2020). Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking. In CHI 2020 Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI (CHI 2020), ACM.