
News


Kevin Gonyop Kim: Professor at FHNW

Kevin has left us for his new position as Professor of Spatial Computing and 3D Technologies at the Institute of Interactive Technologies at the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) in Windisch, Switzerland.

We thank him for the contributions he made to the geoGAZElab during his PostDoc, in particular in the scope of the 3D Sketch Maps project. All the best, and let’s stay in touch!



Design++

We have joined the Design++ initiative. We’re looking forward to an interesting and fruitful interdisciplinary exchange with other members of Design++.

We’ll be using the Design++ infrastructure in the scope of the MSCA Doctoral Network Eyes4ICU. Controlled eye tracking experiments will be performed in the immersive Audiovisual Room of the Large-scale Virtualization and Modeling Lab.

Johanna Wörle joins the team

We’re happy to welcome Johanna Wörle, who has started as a postdoctoral researcher at the Singapore ETH Centre. In the scope of the Future Resilient Systems 2 research program, her research will focus on the effects of stress on human performance. Johanna holds a PhD in psychology from Ulm University, Germany.



Eyes4ICU at LBS 2023

In the scope of the MSCA Doctoral Network Eyes4ICU, our doctoral students Lin Che and Yiwei Wang are investigating novel ways of using eye tracking for the improvement of location-based services. They presented and discussed their research at the 18th Conference on Location Based Services in Ghent, Belgium, last week.

Congrats, Lin, on receiving the best short paper award!

Work-in-progress papers (DOI assignment pending):

  • Che, L., Raubal, M., and Kiefer, P. (2023) Towards Personalized Pedestrian Route Recommendation Based on Implicit Visual Preference. In: Huang, H., Van de Weghe, N., and Gartner, G. (editors), Proceedings of the 18th International Conference on Location Based Services, Ghent, Belgium (to appear) [PDF]
  • Wang, Y., Raubal, M., and Kiefer, P. (2023) Towards gaze-supported emotion-enhanced travel experience logging. In: Huang, H., Van de Weghe, N., and Gartner, G. (editors), Proceedings of the 18th International Conference on Location Based Services, Ghent, Belgium (to appear) [PDF]

“Do You Need Instructions Again? Predicting Wayfinding Instruction Demand”

In collaboration with colleagues from TU Vienna, we have published a full paper in the proceedings of this year’s GIScience conference, taking place next week in Leeds, U.K.:

The demand for instructions during wayfinding can be considered an important indicator of the wayfinder’s internal cognitive processes. In the paper, we predict instruction demand in a real-world wayfinding experiment with 45 participants, using environmental, user, instructional, and gaze-related features. Being able to predict instruction demand can, for instance, benefit navigation systems that adapt instructions in real time based on their users’ behavior.

Alinaghi, N., Kwok, T. C., Kiefer, P., & Giannopoulos, I. (2023). Do You Need Instructions Again? Predicting Wayfinding Instruction Demand. In 12th International Conference on Geographic Information Science (GIScience 2023). Schloss Dagstuhl-Leibniz-Zentrum für Informatik.

Download PDF
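
As a rough illustration of the prediction task described above, here is a minimal sketch of feature-based instruction-demand classification. The feature names, the synthetic data, and the random-forest model are our illustrative assumptions, not the setup used in the paper:

```python
# Minimal sketch: predicting instruction demand (requested again: yes/no)
# from environmental, user, instructional, and gaze-related features.
# All feature names and data are hypothetical and for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=0)
n = 450  # hypothetical number of decision points pooled over participants

X = np.column_stack([
    rng.integers(3, 7, n),   # environmental: branches at the intersection
    rng.integers(0, 2, n),   # user: familiarity with the area (0/1)
    rng.integers(1, 4, n),   # instructional: complexity of the last instruction
    rng.random(n),           # gaze-related: normalized fixation rate on environment
])
y = rng.integers(0, 2, n)    # 1 = participant requested the instruction again

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # ~0.5 on random data, as expected
```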



ETRA 2024: Call for Papers

Peter Kiefer will be serving as one of the Full Paper Chairs for the ETRA 2024 conference, taking place in Glasgow, U.K., 4-7 June 2024.

Please check out the Call for Papers!

The geoGAZElab has been part of the ETRA community for many years. We’re happy to continue contributing to this vibrant community, pushing forward excellent research in this exciting field.



Swiss Council visit to Singapore-ETH Centre

On the 4th of August, the Singapore-ETH Centre (SEC) had the privilege of hosting Swiss Federal Councillor Ignazio Cassis, along with Ambassador Heinrich Schellenberg and Ambassador Frank Grütter of the Embassy of Switzerland in Singapore. The day was filled with insightful discussions between the visiting dignitaries and SEC’s leadership, highlighting the importance of international collaboration in advancing scientific and technological research.

A doctoral student from the geoGAZElab, Suvodip Chakraborty, demonstrated “Improving Resilience in Control Rooms Through Eye Tracking”. By employing eye-tracking technology, the geoGAZElab’s work aims to enhance the effectiveness and efficiency of control room operators, thereby bolstering overall system resilience. This research has the potential to reshape control room dynamics and to contribute to the broader fields of human-computer interaction and cognitive science.

The demonstration garnered considerable interest from the visiting Swiss delegation and SEC’s academic community. Mr. Ignazio Cassis expressed his keen interest in the innovative research being conducted at the SEC. The delegation’s visit underscored the significance of this scientific exchange and the strong ties between Switzerland and Singapore in academic research.

In addition to Chakraborty’s demonstration, the visit included an exhibition tour showcasing other SEC research endeavours. The showcased projects spanned a spectrum of disciplines, from sustainable urban planning to future food supply.



Presentation & Publication on Aeronautical Charts at AGILE 2023

It was a great pleasure to attend this year’s AGILE conference on Geographic Information Science in Delft (The Netherlands), where Adrian Sarbach presented the results of our research on visualisation and perception of airspace structures on aeronautical charts.

Our paper presents a theoretical cartographic analysis of aeronautical charts used for flights under visual flight rules, together with the results of a user study, which confirmed the findings from the theoretical analysis.

This project was conducted together with Thierry Weber, Katharina Henggeler, Luis Lutnyk, and Martin Raubal.

If you are interested in reading the full open access paper, titled “Evaluating and Comparing Airspace Structure Visualisation and Perception on Digital Aeronautical Charts”, you can find it here: https://doi.org/10.5194/agile-giss-4-12-2023.

Adrian Sarbach presenting his work on aeronautical charts at AGILE 2023



Open position: Stress Detection from Physiological Sensors in Crisis Management

We have an open position (Postdoc/Senior Researcher/Research Engineer) at the Singapore-ETH Centre, based in Singapore.

In collaboration with a Singapore agency, we will study stress and stressors during crisis response in a virtual reality (VR) training environment. The project is part of the research programme “Future Resilient Systems”.



PhD graduation Luis Lutnyk

We’re glad to announce that our colleague Luis Lutnyk has successfully defended his dissertation on “Pilot Decision-Making Support through Intelligent Cockpit Technologies”. Congratulations, Luis!



New abstract at Resilience 2023 – Using eye tracking for enhancing resilience in control rooms

We are pleased to announce that our abstract titled “Using eye tracking for enhancing resilience in control rooms” has been accepted for presentation at the Resilience 2023 conference.

In this abstract, we highlight three principal ways in which eye tracking can support decision makers in control rooms: 1) evaluating information visualizations for decision support systems, 2) unobtrusively assessing decision-makers’ cognitive state, and 3) supporting distributed cognition through gaze-based information sharing.

In our talk, we’ll be presenting our current research progress in each of these directions. We look forward to seeing you at the Resilience 2023 conference in Mexico.

This project has received funding from the Future Resilient Systems programme at the Singapore-ETH Centre.



1st Eyes4ICU Winter School

What a great first event of the Eyes4ICU MSCA doctoral network! During a one-week Winter School at Reisensburg Castle (close to Ulm, Germany), Peter, Lin, and Yiwei met the other consortium members and the advisory board of Eyes4ICU.

The Winter School consisted of courses on eye tracking, computational modeling, and transferable skills. We kicked off the doctoral candidate projects by meeting the co-supervisors and partners, and the week was also a great opportunity for networking.

A perfect start for Lin and Yiwei, who have just started on the project this month!

Our participation in this MSCA doctoral network is funded by the State Secretariat for Education, Research and Innovation. We’re grateful to have Esri as a partner for our doctoral candidate projects.

Welcome, Lin Che and Yiwei Wang!

We’re happy that two new doctoral students have joined our team, Lin Che and Yiwei Wang. They’re both part of the MSCA doctoral network “Eyes for Interaction, Communication, and Understanding (Eyes4ICU)”. Lin will be working on Gaze-supported Trip Recommendation (DC6), Yiwei on Gaze-supported Travel Experience Logging (DC12). Welcome!

New article in Applied Ergonomics – The effect of flight phase on electrodermal activity and gaze behavior: A simulator study

Our article “The effect of flight phase on electrodermal activity and gaze behavior: A simulator study” has been accepted for publication in the journal Applied Ergonomics:

Luis Lutnyk, David Rudi, Victor R. Schinazi, Peter Kiefer, Martin Raubal (2022). The effect of flight phase on electrodermal activity and gaze behavior: A simulator study, Applied Ergonomics, Volume 109, DOI: 10.1016/j.apergo.2023.103989.

Highlights:

  • Unobtrusive technologies were used to record electrodermal activity and gaze behavior in an instrument failure scenario.
  • Participants’ electrodermal activity increased significantly during high workload phases of the failure scenario.
  • AOI-based & non-AOI eye tracking metrics show significant differences when a secondary task needs to be solved during flight.
  • The observed measures show great potential for future cockpits that can provide assistance based on the sensed pilot state.

Abstract. Current advances in airplane cockpit design and layout are often driven by a need to improve the pilot’s awareness of the aircraft’s state. This involves an improvement in the flow of information from aircraft to pilot. However, providing the aircraft with information on the pilot’s state remains an open challenge. This work takes a first step towards determining the pilot’s state based on biosensor data. We conducted a simulator study to record participants’ electrodermal activity and gaze behavior, indicating pilot state changes during three distinct flight phases in an instrument failure scenario. The results show a significant difference in these psychophysiological measures between a phase of regular flight, the incident phase, and a phase with an additional troubleshooting task after the failure. The differences in the observed measures suggest great potential for a pilot-aware cockpit that can provide assistance based on the sensed pilot state.

The article has been published as Open Access and you can get the PDF here:
https://www.sciencedirect.com/science/article/pii/S0003687023000273
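
For illustration, a phase-wise comparison of a psychophysiological measure can be as simple as a paired test between two flight phases. The following is a minimal sketch with made-up numbers, not the analysis pipeline from the article (which compares three phases):

```python
# Minimal sketch: paired comparison of mean electrodermal activity (EDA)
# between a regular flight phase and an incident phase.
# The data are synthetic; this is not the article's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n_participants = 30  # hypothetical sample size

eda_regular = rng.normal(loc=2.0, scale=0.5, size=n_participants)   # in microsiemens
eda_incident = eda_regular + rng.normal(loc=0.4, scale=0.3, size=n_participants)

t_stat, p_value = stats.ttest_rel(eda_incident, eda_regular)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```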

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.



New Article Published in Human-Computer Interaction

Our article “Unobtrusive interaction: a systematic literature review and expert survey” has been accepted for publication in the journal Human–Computer Interaction:

Tiffany C.K. Kwok, Peter Kiefer & Martin Raubal (2023). Unobtrusive interaction: a systematic literature review and expert survey, Human–Computer Interaction, DOI: 10.1080/07370024.2022.2162404

Abstract. Unobtrusiveness has been highlighted as an important design principle in Human-Computer Interaction (HCI). However, the understanding of unobtrusiveness in the literature varies. Researchers often claim unobtrusiveness for their interaction method based on their understanding of what unobtrusiveness means. This lack of a shared definition hinders effective communication in research and impedes comparability between approaches. In this article, we approach the question “What is unobtrusive interaction?” with a systematic and extensive literature review of 335 papers and an online survey with experts. We found that not a single definition of unobtrusiveness is universally agreed upon. Instead, we identify five working definitions from the literature and experts’ responses. We summarize the properties of unobtrusive interaction into a design framework with five dimensions and classify the reviewed papers with regard to these dimensions. The article aims to provide researchers with a more unified context to compare their work and identify opportunities for future research.

The article will appear in one of the next issues of Human–Computer Interaction. It has been published as Open Access and you can get the article here:
https://doi.org/10.1080/07370024.2022.2162404



Winter School 2023: Summary

What an exciting way of starting into the new year!

Our Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis” took place in the second week of January on Monte Verità in Ascona, Switzerland. A total of 36 participants attended, with a wide variety of research backgrounds.

Enkelejda Kasneci (TU Munich, Germany) opened the Winter School with a virtual keynote presenting her research “On opportunities and challenges of eye tracking and machine learning for adaptive educational interfaces and classroom research”. Over the five days of the Winter School, participants learned about the different steps involved in performing eye tracking experiments, from experimental design, through data collection and processing, to statistical analysis (speakers: Nina Gehrer, University of Tübingen, Germany; Andrew Duchowski, Clemson University, S.C., US; Izabela and Krzysztof Krejtz, SWPS University of Social Sciences and Humanities, Poland). In hands-on sessions, participants designed and performed their own small eye tracking experiments.

The Winter School also enabled exchange between participants through group work, poster presentations, and an excursion. Monte Verità offered the perfect atmosphere and surroundings for this.

Thanks to all who have made this possible, especially our speakers and all sponsors!

Full Paper published at ICMI 2022

Our paper “Two-Step Gaze Guidance” has been published in the proceedings of the International Conference on Multimodal Interaction (ICMI ’22) as a full paper.

Tiffany C.K. Kwok, Peter Kiefer, Martin Raubal (2022). Two-Step Gaze Guidance, International Conference on Multimodal Interaction (ICMI ’22), DOI: 10.1145/3536221.3556612

Abstract. One challenge of providing guidance for search tasks consists in guiding the user’s visual attention to certain objects in a potentially large search space. Previous work has tried to guide the user’s attention by providing visual, audio, or haptic cues. The state-of-the-art methods either provide hints pointing towards the approximate direction of the target location for a fast but less accurate search or require the user to perform a fine-grained search from the beginning for a precise yet less efficient search. To combine the advantage of both methods, we propose an interaction concept called Two-Step Gaze Guidance. The first-step guidance focuses on quick guidance toward the approximate direction, and the second-step guidance focuses on fine-grained guidance toward the exact location of the target. A between-subject study (N = 69) with five conditions was carried out to compare the two-step gaze guidance method with the single-step gaze guidance method. Results revealed that the proposed method outperformed the single-step gaze guidance method. More precisely, the introduction of Two-Step Gaze Guidance slightly improves the searching accuracy, and the use of spatial audio as the first-step guidance significantly helps in enhancing the searching efficiency. Our results also indicated several design suggestions for designing gaze guidance methods.



Eyes4ICU: 2 open positions in MSCA doctoral network

Exciting news! The geoGAZElab will be participating in the MSCA Doctoral Network “Eyes for Interaction, Communication, and Understanding (Eyes4ICU)” as an Associated Partner, funded by the Swiss State Secretariat for Education, Research and Innovation.

Eyes4ICU explores novel forms of gaze-based interaction that rely on current psychological theories and findings, computational modelling, as well as expertise in highly promising application domains. Its approach to developing inclusive technology by tracing gaze interaction back to its cognitive and affective foundations results in better models to predict user behaviour. By integrating insights in application fields, gaze-based interaction can be employed in the wild.

In the scope of Eyes4ICU, 12 doctoral candidates (DCs) will be working at 7 different host institutions across Europe. Of these, 2 DCs will be hosted at the geoGAZElab of ETH Zurich (PI: Peter Kiefer). They will be working on the topics Gaze-supported Trip Recommendation (DC6) and Gaze-supported Travel Experience Logging (DC12), respectively.

We are looking for two highly motivated doctoral candidates, starting at the earliest possible date: Position announcement.

PhD graduation Tiffany C.K. Kwok

We congratulate Tiffany C.K. Kwok on successfully completing her doctoral thesis on “Designing Unobtrusive Gaze-Based Interactions: Applications to Audio-Guided Panorama Viewing”. The doctoral graduation was approved by the Department Conference at its last meeting. The research was performed in the scope of the LAMETTA project.

Tiffany is staying with us for a PostDoc, continuing her research in the geoGAZElab. It’s great having you in our team, Tiffany!



Winter School Updates

We’re looking forward to our Winter School, taking place in January 2023.

Exciting updates to the program are now included in the updated announcement:

We’re glad that Prof. Dr. Enkelejda Kasneci (Technical University Munich) will be opening the Winter School with a keynote titled “On opportunities and challenges of eye tracking and machine learning for adaptive educational interfaces and classroom research”.

We’d like to thank our sponsors, whose generous support enables us to offer travel grants to several young researchers.

Application for travel grants is open until 15 October 2022.

The Winter School is a great opportunity for getting trained on eye tracking methodology, experimental design, and analysis. At the same time, it will facilitate networking with speakers, sponsor representatives, as well as among participants.



Tianyi Xiao joins the team

Our team is growing further: we’re so happy to have Tianyi Xiao on board, who is joining the 3D Sketch Maps project as a doctoral student. Welcome!



Winter School 2023

We are co-organizing an ETH Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis” which is going to take place in Monte Verità (Ticino), Switzerland, from 8 to 13 January 2023. Download the first announcement as PDF.

The Winter School targets PhD students and early PostDocs (from any research field) who are using, or planning to use, eye tracking in their research. Internationally recognized experts will provide lectures and hands-on sessions on eye tracking methodology, experimental design, and analysis.

The registration for the Winter School is now open.

Building on the successful 2016 Winter School, the 2023 School will be updated to focus on state-of-the-art software (licensed and open-source, e.g., PsychoPy and Pupil Labs) and hardware. Hands-on exercises will focus on table-mounted eye trackers.



Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022) that is taking place on September 5-9, 2022, in Kobe, Japan.

Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting other researchers and discussing exciting research!

More information: http://cosit2022.iniad.org

FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight, International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in one of the next issues of the International Journal of Human–Computer Interaction.

It has been published as Open Access and you can get the PDF here:
https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2075627



Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye tracking hardware and software developed in PEGGASUS has been published in the SPIE Digital Library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434
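
For readers wondering what the reported “point of regard distance error” amounts to: it is the distance between the estimated and the true 3D point of regard, averaged over gaze points. Here is a minimal sketch of that computation (our own illustration with made-up coordinates, not code from PEGGASUS):

```python
# Minimal sketch: mean point-of-regard (POR) distance error, i.e. the mean
# Euclidean distance between estimated and ground-truth 3D gaze points.
# Coordinates are made up; this is not PEGGASUS code.
import numpy as np

def mean_por_error(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    """Mean Euclidean distance, in the same unit as the inputs (e.g. cm)."""
    return float(np.linalg.norm(estimated - ground_truth, axis=1).mean())

# Three hypothetical gaze points on cockpit instruments, in cm
truth = np.array([[10.0, 5.0, 90.0], [-15.0, 2.0, 100.0], [0.0, -8.0, 110.0]])
estimated = truth + np.array([[0.5, -0.3, 0.8], [-0.6, 0.4, -0.9], [0.2, 0.7, -0.4]])
print(f"mean POR error: {mean_por_error(estimated, truth):.2f} cm")
```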

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.



Control Room Decision-Making Lab at Singapore-ETH Centre

We are happy to announce the opening of our new Control Room Decision-Making Lab at the Singapore-ETH Centre (SEC) for our project on Communicating Predicted Disruptions in Future Resilient Systems (FRS 2). The lab is equipped with sensors for measuring decision makers’ psycho-physiological state, including remote and head-mounted eye trackers, galvanic skin response sensors, and accelerometers. The new lab infrastructure will be used to study how different communication techniques can support decision-makers in control rooms. This research will improve the next generation of control rooms, thus enhancing the resilience of the monitored infrastructure in case of disruptions.

We want to thank all our collaborators and the SEC management for their help setting up the laboratory.

Control Room Decision-Making Lab



Successful finish of PEGGASUS and summary

We are happy to report that our aviation project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has finished successfully.

We would like to thank all partners involved in the project for their extensive efforts to finish the project successfully and deliver the results despite all the COVID-related restrictions and hurdles.

You can find a summary of the project outcomes at the EU Cordis Portal:

“The PEGGASUS consortium has developed a multi-camera vision system for tracking eye-gaze and gestures of the pilots in real time in the cockpit environment. This system allows a leap towards a more comprehensive human-machine interface in the cockpit to reduce the stress and cognitive load of the pilots, while bringing forward future pilot training techniques. Better awareness of the instruments will help the flight management to optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.”

Two images showing the results of the algorithms including pupil detection and eye gaze estimation

Ultimate Prototype Hardware setup installed in the cockpit simulator

Excerpt and images taken from: https://cordis.europa.eu/project/id/821461/reporting

This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.



Kevin Gonyop Kim joins the team

We’re excited to welcome Kevin Gonyop Kim, who has now started as a postdoctoral researcher in the 3D sketch maps project. Welcome to the team, Kevin!

