
Author Archives: Fabian Göbel


Full Paper presentation at ETRA 2021

Our accepted paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” will be presented at ACM ETRA 2021:

May 25, 2021, 11:00 – 12:00 and 18:00 – 19:00 in “Posters & Demos & Videos”
May 26, 2021, 14:45 – 16:15 in Track 1: “Full Papers V”

Join the virtual conference for a chat!
https://etra.acm.org/2021/schedule.html



Results of the Interdisciplinary Project 2020

In their interdisciplinary project, the three Geomatics Master students Laura Schalbetter, Tianyu Wu and Xavier Brunner developed an indoor navigation system for the Microsoft HoloLens 2. The system was implemented using ESRI CityEngine, Unity, and Microsoft Visual Studio.

Check out their video:



Full Paper accepted at ETRA 2020

Our paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” has been accepted at ACM ETRA 2020 as a full paper:

Göbel, F., Kurzhals, K., Schinazi, V. R., Kiefer, P., and Raubal, M. (2020). Gaze-Adaptive Lenses for Feature-Rich Information Spaces. In Proceedings of the 12th ACM Symposium on Eye Tracking Research & Applications (ETRA ’20), ACM. DOI: https://doi.org/10.1145/3379155.3391323



Workshop Paper accepted at CHI 2020

Our workshop contribution “Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking” has been accepted at the “Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI” at ACM CHI 2020:

Fabian Göbel, Kuno Kurzhals, Martin Raubal and Victor R. Schinazi (2020). Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking. In CHI 2020 Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI (CHI 2020), ACM.



Invited Talk by Sophie Stellmach on Mixed Reality on 10.10.2019

We are glad to announce an invited talk by Sophie Stellmach on Eye Tracking and Mixed Reality as part of the VERA Geomatics Seminar.

Dr. Sophie Stellmach is a Senior Scientist at Microsoft, where she explores entirely new ways to engage with and blend our virtual and physical realities in products such as Microsoft HoloLens. An avid eye tracking researcher for over a decade, she was heavily involved in the development of gaze-based interaction for HoloLens 2.

The talk will take place on Thursday, 10 October 2019, at 5:00 p.m. at ETH Hönggerberg, HIL D 53.
Title: Multimodal Gaze-supported Input in Mixed Reality and its Promises for Spatial Research



FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps

An article titled “FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps” will appear in an upcoming issue of the GeoInformatica journal and is now available online.

Abstract: Map reading is a visual task that can vary strongly between individuals and between maps with different characteristics. Aspects such as where, when, how long, and in which sequence information on a map is looked at can reveal valuable insights both for the map design process and for better understanding the cognitive processes of the map user. Contrary to static maps, for which many eye tracking studies are reported in the literature, established methods for tracking and analyzing visual attention on interactive maps are still missing. In this paper, we present a framework called FeaturEyeTrack that automatically logs the cartographic features that have been inspected, as well as the mouse input, during the interaction with digital interactive maps. In particular, the novelty of FeaturEyeTrack lies in matching gaze with the vector model of the current map visualization, thereby enabling a very detailed analysis without the requirement for manual annotation. Furthermore, we demonstrate the benefits of this approach in terms of manual work, level of detail and validity compared to state-of-the-art methods through a case study on an interactive cartographic web map.
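
To give a rough idea of what this gaze-to-feature matching involves, here is a minimal Python sketch. It is not the FeaturEyeTrack implementation; the data structures, names, and coordinates are invented for illustration. Each map feature is represented as a polygon in screen coordinates, and every gaze sample is hit-tested against these polygons to log which features were inspected:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class MapFeature:
    """A cartographic feature with its outline in screen coordinates (hypothetical)."""
    name: str
    outline: List[Point]  # polygon vertices, already projected to screen space

def point_in_polygon(p: Point, polygon: List[Point]) -> bool:
    """Ray-casting test: is point p inside the polygon?"""
    x, y = p
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def match_gaze_to_features(gaze_samples, features):
    """Log which features (if any) each gaze sample falls on."""
    log = []
    for timestamp, gaze_point in gaze_samples:
        hits = [f.name for f in features if point_in_polygon(gaze_point, f.outline)]
        log.append((timestamp, hits))
    return log

# Example: one rectangular "lake" feature and two gaze samples.
lake = MapFeature("lake", [(100, 100), (300, 100), (300, 200), (100, 200)])
samples = [(0.00, (150, 150)), (0.02, (400, 50))]
print(match_gaze_to_features(samples, [lake]))
# [(0.0, ['lake']), (0.02, [])]
```

In the actual framework, the inspected features come from the vector tiles of the interactive web map rather than from hand-defined polygons, which is what removes the need for manual annotation.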



ET4S goes ETRA 2019

We’re glad to announce the 4th edition of the “Eye Tracking for Spatial Research” (ET4S) event, which will take place June 25–28, 2019, in Denver, Colorado, USA.

After three successful ET4S workshops in Scarborough (2013), Vienna (2014) and Zurich (2018), the 4th edition of ET4S will be organized as a conference track at the ACM Symposium on Eye Tracking Research & Applications (ETRA 2019) in Denver.

ET4S aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information and spatial decision making.

The Call for Papers is now available (Paper abstracts due: December 14, 2018).



Short Paper and Workshop Paper accepted at GIScience 2018

We are happy to announce that two of our papers have been accepted, one at GIScience 2018 and one at the Workshop on Spatial Big Data and Machine Learning in GIScience:

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos and Martin Raubal. 2018. Gaze Sequences and Map Task Complexity. GIScience 2018, Melbourne, Australia.

Fabian Göbel and Henry Martin. 2018. Unsupervised Clustering of Eye Tracking Data. Workshop on Spatial Big Data and Machine Learning in GIScience at GIScience 2018, Melbourne, Australia.

Both works are part of the IGAMaps project.



3rd ET4S Workshop

Many thanks to everyone who participated in this workshop and made it an inspiring event.

We have published the workshop proceedings through the ETH Research Collection.



Student Project Finished: A Public Gaze-Controlled Campus Map

Four Geomatics Master students (Nikolaos Bakogiannis, Katharina Henggeler, Roswita Tschümperlin and Yang Xu) have developed a public gaze-controlled campus map as part of an interdisciplinary project this autumn semester.

The prototype was tested with 50 campus visitors in a one-week field study at the Campus Info Point at ETH Hönggerberg.

The results of the project will be presented at a public event on Thursday, 14 December 2017, between 17:00 and 18:00 at HIL D 53. During the apéro afterwards, you are welcome to try the system yourself.

We’d like to thank the visitor and information management of ETH Zurich Services (in particular Stephanie Braunwalder) for supporting this project.



Scientifica 2017 – Impressions

We demoed the LAMETTA project at this year’s Scientifica, the joint science exhibition of ETH Zurich and the University of Zurich, which attracted more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.
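
For those curious how such “query by looking” interaction can work in principle, the sketch below shows one common building block, dwell-time-based gaze selection. It is purely illustrative and not the LAMETTA implementation: the region boxes, coordinates, and dwell threshold are all invented for this example.

```python
# Minimal dwell-time gaze selection sketch (hypothetical; not the LAMETTA code).
# Regions of the panorama are axis-aligned boxes labeled with a feature name;
# once gaze stays inside the same region longer than DWELL_THRESHOLD_S,
# that feature's information would be displayed.

DWELL_THRESHOLD_S = 0.8  # assumed threshold, chosen for illustration only

REGIONS = {
    "Lake Zurich": (200, 400, 600, 550),   # (x_min, y_min, x_max, y_max) in panorama pixels
    "Uetliberg":   (700, 100, 900, 300),
}

def region_at(x, y):
    """Return the name of the region containing the gaze point, if any."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def detect_dwell_selections(gaze_samples):
    """Yield (feature, time) whenever gaze dwells on one region long enough."""
    current, entered = None, None
    for t, x, y in gaze_samples:
        name = region_at(x, y)
        if name != current:
            current, entered = name, t
        elif name is not None and t - entered >= DWELL_THRESHOLD_S:
            yield name, t
            entered = t  # reset so the same fixation does not fire continuously

samples = [(0.0, 300, 450), (0.5, 310, 460), (0.9, 305, 455), (1.2, 800, 200)]
print(list(detect_dwell_selections(samples)))  # [('Lake Zurich', 0.9)]
```

In this scheme a gaze point is simply mapped to the panorama region it falls into, and a feature counts as selected once gaze has rested on it longer than the threshold, so an information query or a treasure hunt can run entirely without manual input.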



3rd ET4S Workshop – Registration now open

Registration for the ET4S workshop and the LBS 2018 conference is now open.

An early bird discount is available until 28 November 2017.

 

We are looking forward to seeing you in Zurich.



3rd ET4S Workshop: Call for Papers

We’re glad to announce the 3rd International Workshop on Eye Tracking for Spatial Research (ET4S), which will take place January 14, 2018 in Zurich.

The workshop aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information. This is the 3rd edition of the workshop, after two successful ET4S workshops in Scarborough (2013) and Vienna (2014). This time the workshop will be co-located with the 14th International Conference on Location Based Services (LBS 2018).

The workshop will be opened with an invited talk given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland.

The Call for Papers is now available (Paper submission deadline: September 27, 2017).

ET4S 2018 is supported by Ergoneers.



Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)



GeoGazeLab goes Facebook

Now you can find us on Facebook!



GeoGazeLab at the Smarttention Workshop

GeoGazeLab was represented at the Smarttention Workshop at MobileHCI to raise awareness of visual attention in adaptive interfaces.

Thanks to the organizers for their great work. We had inspiring discussions about the future of mobile UIs.



GeoGazeLab at GEOSummit 2016

This year at GEOSummit, our group displayed a gaze-based Geo-Game. During the GEOSchool Day, pupils and adults could experience how it feels to control maps just by using their gaze.
We had great fun changing the world in the blink of an eye.
