• eyetracking@ethz.ch
  • +41 44 633 71 59

News

  • -

Meeting point Science City – March 2018

We’re excited to demonstrate the LAMETTA project at ETH Treffpunkt Science City, ETH Zurich’s educational program open to everyone. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it using only your gaze (more details)!

You can find us on Sunday, 25 March at ETH Hönggerberg, building HCI, Room E2.


  • -

Call for Papers: Spatial Big Data and Machine Learning in GIScience

Have you ever thought of eye tracking data as spatial “big” data? Are you collecting large amounts of eye tracking data together with geospatial coordinates, or are you applying machine learning to such data? Then the workshop on “Spatial Big Data and Machine Learning in GIScience” at this year’s GIScience conference in Melbourne might be interesting for you.

The 1st Call for Papers is now available on the website.


  • -

Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018). Gaze-Guided Narratives for Outdoor Tourism. In: HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors.

This research is part of the LAMETTA project.


  • -

Full Paper accepted at CHI 2018

Andrew T. Duchowski, Krzysztof Krejtz, Izabela Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal (2018). The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2018), ACM (accepted).


  • -

3rd ET4S Workshop

Many thanks to everyone who participated in this workshop and made it an inspiring event.

We have published the workshop proceedings through the ETH Research Collection.


  • -

Student Project Finished: A Public Gaze-Controlled Campus Map

Four Geomatics Master’s students (Nikolaos Bakogiannis, Katharina Henggeler, Roswita Tschümperlin, and Yang Xu) developed a public gaze-controlled campus map as part of an interdisciplinary project this autumn semester.

The prototype was tested in a one-week field study with 50 campus visitors at the Campus Info Point at ETH Hönggerberg.

The results of the project will be presented at a public event on Thursday, 14 December 2017, from 17:00 to 18:00 in HIL D 53. During the apéro afterwards, you are welcome to try the system yourself.

We’d like to thank the visitor and information management of ETH Zurich Services (in particular Stephanie Braunwalder) for supporting this project.


  • -

3rd ET4S Workshop: Call for Demos

Do you have an interesting eye tracking research prototype to show? Would you like to join the ET4S workshop but missed the deadline for regular papers? Do you have a regular paper accepted and want to gain even more visibility and get direct feedback on your system?

We’ll have a dedicated demo session at the ET4S workshop in January 2018 in Zurich. We hope you consider submitting a short abstract (1 page) with a demo proposal by 8 November.

ET4S website: http://spatialeyetracking.org/et4s-2018/


  • -

3rd ET4S Workshop: Keynote by Roman Bednarik

We are glad to announce that the ET4S workshop 2018 will be opened with a keynote given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland:


Predicting user states from gaze and other multimodal data

Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.



  • -

Scientifica 2017 – Impressions

We demoed the LAMETTA project at this year’s Scientifica, the joint science exhibition of ETH Zurich and the University of Zurich, which attracted more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out gaze-based interaction and learned more about mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and its potential applications to the tourist guides of the future.


  • -

3rd ET4S Workshop – Registration now open

Registration for the ET4S workshop and the LBS 2018 conference is now open.

An early-bird discount is available until 28 November 2017.


We are looking forward to seeing you in Zurich.


  • -

GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and the University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it using only your gaze (details in German)!

You can find us from Friday, 1 September to Sunday, 3 September at the University of Zurich main building (West Foyer).

Check out our Scientifica video!


  • -

Culmann Award

Dr. Ioannis Giannopoulos received the ETH Zurich Culmann Award in 2017 for his outstanding doctoral thesis, “Supporting Wayfinding Through Mobile Gaze-Based Interaction”.



  • -

An inverse-linear logistic model of the main sequence

Peter Kiefer and Ioannis Giannopoulos have contributed to an article titled “An inverse-linear logistic model of the main sequence” (Journal of Eye Movement Research, JEMR). It is now available online:

http://dx.doi.org/10.16910/jemr.10.3.4

Abstract. A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid-amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation of duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data at the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and superposition of both. We are confident the model will suitably extend to the large-amplitude range of eye movements.
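For readers who want to experiment with main-sequence fitting themselves, the sketch below fits a generic four-parameter logistic curve to peak-velocity/amplitude pairs. It is an illustration only: the functional form, parameter names, and toy data are assumptions for this example and not necessarily the inverse-linear logistic model proposed in the article (see the DOI above for the actual model).

    # Minimal sketch: fit a generic four-parameter logistic curve to the main
    # sequence (peak velocity vs. amplitude). Hypothetical form and toy data.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic_main_sequence(amplitude_deg, v_min, v_max, slope, midpoint):
        """S-shaped curve over log-amplitude with controllable asymptotes and slope."""
        x = np.log(amplitude_deg)
        return v_min + (v_max - v_min) / (1.0 + np.exp(-slope * (x - midpoint)))

    # Amplitudes (deg) and peak velocities (deg/s) would normally come from
    # detected microsaccades and saccades; these values are made up.
    amplitudes = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    peak_velocities = np.array([15.0, 35.0, 70.0, 130.0, 260.0, 400.0, 520.0])

    params, _ = curve_fit(logistic_main_sequence, amplitudes, peak_velocities,
                          p0=[10.0, 600.0, 1.0, np.log(2.0)], maxfev=10000)
    print("fitted v_min, v_max, slope, midpoint:", params)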


  • -

3rd ET4S Workshop: Call for Papers

We’re glad to announce the 3rd International Workshop on Eye Tracking for Spatial Research (ET4S), which will take place on January 14, 2018, in Zurich.

The workshop aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information. This is the 3rd edition of the workshop, after two successful ET4S workshops in Scarborough (2013) and Vienna (2014). This time the workshop will be co-located with the 14th International Conference on Location Based Services (LBS 2018).

The workshop will be opened with an invited talk given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland.

The Call for Papers is now available (Paper submission deadline: September 27, 2017).

ET4S 2018 is supported by Ergoneers.


  • -

Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science (accepted). PDF

The article introduces the concept of location based services that take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
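To make the core idea more concrete, here is a minimal sketch of mapping a gaze point from the eye tracker’s scene-camera frame into a known reference panorama and looking up which object of interest it falls on. It assumes a roughly planar panorama, OpenCV ORB feature matching with a RANSAC homography, and axis-aligned boxes as hypothetical object annotations; these are simplifying assumptions for illustration, not the exact GAIN-LBS pipeline.

    # Sketch: map a Point of Regard (pixels in the scene-camera frame) to
    # coordinates in a reference panorama, then look up the object of interest.
    import cv2
    import numpy as np

    def gaze_to_panorama(frame_gray, panorama_gray, gaze_xy):
        """Map a gaze point from the scene-camera frame to panorama coordinates."""
        orb = cv2.ORB_create(2000)
        kp_f, des_f = orb.detectAndCompute(frame_gray, None)
        kp_p, des_p = orb.detectAndCompute(panorama_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_f, des_p), key=lambda m: m.distance)[:200]
        src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_p[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # frame -> panorama
        point = np.float32([[gaze_xy]])                        # shape (1, 1, 2)
        return cv2.perspectiveTransform(point, H)[0, 0]        # (x, y) in panorama

    def object_of_interest(panorama_xy, objects):
        """Return the label of the first annotated region containing the mapped gaze."""
        x, y = panorama_xy
        for label, (x0, y0, x1, y1) in objects.items():        # hypothetical annotations
            if x0 <= x <= x1 and y0 <= y <= y1:
                return label
        return None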


  • -

SWISS Eye Tracking project @ Lufthansa Digital Aviation Forum 2017

This year, together with our colleagues from SWISS International Air Lines Ltd., we presented our ongoing project at the Lufthansa Digital Aviation Forum 2017 in Frankfurt am Main. Visit http://releasd.com/3583 and http://newsroom.lufthansagroup.com/de/themen/digital-aviation.html for further information on the forum.


  • -

Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands (accepted).


  • -

Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted to an open Call for Submissions, out of which seven were finally accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].


  • -

SWISS project presentation

On 28 November, Martin Raubal, Peter Kiefer, and David Rudi from the GeoGazeLab co-hosted a presentation of the “Awareness in Aviation” project, a collaboration with SWISS International Air Lines Ltd.

During that event, the project and its goals were presented to a wider audience consisting of guests from SWISS International Air Lines Ltd., Swiss Aviation Training, the Swiss Federal Office of Civil Aviation (BAZL), and journalists from several news outlets. One of them was the “My SWISS” magazine, which recently published an article about the event.


  • -

Controllability matters: The user experience of adaptive maps

An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:

Controllability matters: The user experience of adaptive maps

Abstract: Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment we study two adaptive map interfaces differing in the degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to yield a better user experience and lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should give their users control over when an adaptation is performed.


  • -

Article in Horizonte

The latest issue of Horizonte, the magazine published by the Swiss National Science Foundation, reports on our research.

Source: Horizonte 111, December 2016

German (PDF)

English (PDF)


  • -

INNOLEC Lecture

Martin Raubal was invited to give the INNOLEC Lecture at the Department of Geography of Masaryk University in Brno, Czech Republic.

The title of his talk is “Gaze-based assistance for wayfinders in the real world” (slides as PDF, all our presentations).


  • -

GeoGazeLab at the Smarttention Workshop

The GeoGazeLab was represented at the Smarttention workshop at MobileHCI to raise awareness of the role of visual attention in adaptive interfaces.

Thanks to the organizers for their great work. We had inspiring discussions about the future of mobile UIs.


  • -

3rd Workshop on Ubiquitous Technologies for Augmenting the Human Mind (UbiComp 2016)

Keynote by Dr. Lewis Chuang

Organizers: Andreas Dengel, Tilman Dingler, Ioannis Giannopoulos, Cathal Gurrin, Koichi Kise, Kai Kunze, Evangelos Niforatos

Website: http://recall-fet.eu/wahm16/



  • -

Full paper accepted at SUI 2016

David Rudi, Ioannis Giannopoulos, Peter Kiefer, Christian Peier, Martin Raubal (2016). Interacting with Maps on Optical Head-Mounted Displays. In Proceedings of the 4th ACM Symposium on Spatial User Interaction (SUI 2016). ACM.