
Victor Schinazi joins the team

We’re very happy to welcome Victor Schinazi as a new team member! He’ll be working with us for 4 months before joining the faculty of Psychology at Bond University in Australia.

[Current team]



PhD graduation David Rudi

David Rudi successfully defended his doctoral thesis (“Enhancing Spatial Awareness of Pilots in Commercial Aviation”) on 16 September. We cordially congratulate him, and are happy that he’ll stay with us as a PostDoc starting in November!



Visit by the Vice President for Research and Corporate Relations

On 11 September, Prof. Dr. Detlef Günther, the Vice President for Research and Corporate Relations of ETH Zurich, visited the D-BAUG department and learned about the exciting research activities of its institutes.

Our institute was represented by Peter Kiefer, who summarized the research of the GeoGazeLab. The slides provide an overview of our research interests and current projects.



ET4S and ETRA 2019: Impressions

Our group organized the “Eye Tracking for Spatial Research” event as a track at this year’s ETRA conference in Denver, Colorado. It featured four full paper presentations, one short paper presentation, and an invited talk (see program). A dominant topic at this year’s ET4S was augmented/mixed/virtual reality. As a particular highlight, our invited speaker Sophie Stellmach (Senior Scientist at Microsoft) showcased the fascinating opportunities of HoloLens 2, an upcoming mixed reality device with built-in eye tracking.

The GeoGazeLab was further represented in the ETRA main program by Fabian’s talk on “POI-Track: Improving Map-Based Planning with Implicit POI Tracking” and Kuno’s presentation of his work on “Space-Time Volume Visualization of Gaze and Stimulus”. A paper co-authored by Martin (“Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges”) was presented by one of his co-authors.

 

[Photo: The invited talk by Sophie Stellmach (Microsoft) attracted a sizable audience.]

[Photo: Testing HoloLens 2 after ET4S.]



ET4S 2019: Program and Invited Talk

The program of ET4S 2019, a track at ETRA, is now available on the website: http://et4s.ethz.ch/program/.

We’re excited to announce Sophie Stellmach (Senior Scientist @ Microsoft, HoloLens team) as this year’s invited speaker at ET4S. The title of her talk is “Eye Tracking in Mixed Reality and its Promises for Spatial Research”.

ET4S 2019 will take place on 26 June 2019, 15:30 – 18:00, at the ETRA conference in Denver, Colorado, USA. Attendance at ET4S is included in the ETRA registration.



Kuno Kurzhals joins the team

Kuno Kurzhals has started in our group as a PostDoc.

Welcome to the team!

[Current Team]



ETIZ Meeting March 2019 – Impressions

More than 30 participants attended the meeting of the Eye Tracking Interest Group Zurich (ETIZ) hosted by us on 26 March 2019. Our invited speaker Andreas Bulling (University of Stuttgart) provided insights into his current and past research on pervasive eye tracking. Tiffany Kwok (GeoGazeLab, LAMETTA project) presented her PhD research on gaze-guided narratives. In an interactive mini-workshop, moderated by Arzu Çöltekin (FHNW), attendees brainstormed about challenges of eye tracking in VR and AR displays. Discussions were continued during an apéro, and many took the opportunity to try out a gaze-adaptive map demo (Fabian Göbel, GeoGazeLab, IGAMaps project).



ETIZ Meeting March 2019

We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!

Date, time: 26th March 2019, 17:30
Place: ETH Zurich Hönggerberg, HIL D 53

 

17:30 – 17:35
Welcome

17:35 – 18:15
“Recent Advances Towards Pervasive Eye Tracking”
Prof. Dr. Andreas Bulling, Professor for Human-Computer Interaction and Cognitive Systems
University of Stuttgart, Germany

18:15 – 18:35
“Gaze-Guided Narratives”
Tiffany C.K. Kwok, Doctoral Student
Geoinformation Engineering, ETH Zurich

18:35 – 18:55
“Eye tracking in VR and AR displays: A mini-workshop”
Dr. Arzu Çöltekin, Assoc. Prof., Principal Investigator
Institute for Interactive Technologies IIT, University of Applied Sciences and Arts Northwestern Switzerland FHNW

18:55 – 19:00
Closing

19:00
Apéro, with demo of a gaze-adaptive interactive map by Fabian Göbel, Geoinformation Engineering



Luis Lutnyk joins the team

We are happy to welcome Luis Lutnyk as a new PhD student in the GeoGazeLab! His research will be about eye tracking in aviation.

[Current Team]



LAMETTA at GeoSummit: Visit by Federal Councillor Guy Parmelin

We have presented the LAMETTA project at the GeoSummit in Bern (6-7 June 2018), the largest congress for geoinformatics, surveying and planning in Switzerland.

Federal Councillor Guy Parmelin was one of the first visitors to our exhibit and was very interested in the innovative system. As he had to give the opening speech shortly afterwards, there was no time for him to try out the gaze-based tourist guide to Lake Lucerne himself, but he seemed impressed by the short visit.

A large number of visitors from both industry and academia visited our exhibit and tried out the system. In addition, our exhibit was part of the GeoSchoolDay, an event held in conjunction with GeoSummit that introduces high-school students to applications and opportunities of geoinformation technologies. Approximately 500 pupils visited LAMETTA and learned about eye tracking and its application in interactive systems.



Ioannis Giannopoulos Professor at TU Vienna

We congratulate Ioannis Giannopoulos on his new position as university professor at TU Vienna, where he has been heading the Research Group Geoinformation at the Department of Geodesy and Geoinformation since January. Ioannis was a PhD student and PostDoc with the GeoGazeLab between 2012 and 2017.



Call for Papers: Spatial Big Data and Machine Learning in GIScience

Have you ever thought of eye tracking data as spatial “big” data? Are you collecting large amounts of eye tracking data together with geo-spatial coordinates, or are you applying machine learning on such data? Then the workshop on “Spatial Big Data and Machine Learning in GIScience” at this year’s GIScience conference in Melbourne might be interesting for you.

The 1st Call for Papers is now available on the website.



Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018) Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors

This research is part of the LAMETTA project.



Tiffany Kwok joins the team

We welcome Tiffany as a new PhD student in the LAMETTA project!

[Current Team]



Full Paper accepted at CHI 2018

Andrew T. Duchowski, Krzysztof Krejtz, Izabela Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal (2018). The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2018), ACM (accepted).



3rd ET4S Workshop: Program Online

The program of the ET4S workshop on 14 January 2018 is now available on the workshop website: http://spatialeyetracking.org/et4s-2018/program

 



3rd ET4S Workshop: Call for Demos

Do you have an interesting eye tracking research prototype to show? Would you like to join the ET4S workshop, but missed the deadline for regular papers? Do you have a regular paper accepted and would like to take the opportunity to gain even more visibility and get direct feedback on your system?

We’ll have a dedicated demo session at the ET4S workshop in January 2018 in Zurich. We hope you will consider submitting a short abstract (1 page) with a demo proposal (by 8 November).

ET4S website: http://spatialeyetracking.org/et4s-2018/



3rd ET4S Workshop: Keynote by Roman Bednarik

We are glad to announce that the ET4S workshop 2018 will be opened with a keynote given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland:

 

Predicting user states from gaze and other multimodal data

Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.

 



3rd ET4S Workshop: Deadline Extension

Due to several requests, the submission deadline for the ET4S 2018 workshop has been extended to

11 October 2017

ET4S website



GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!

You can find us Friday, 1 September to Sunday, 3 September at University of Zurich main building (West Foyer).

Check out our Scientifica video!



Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.



Culmann Award

Dr. Ioannis Giannopoulos received the ETH Zurich Culmann Award in 2017 for his outstanding doctoral thesis “Supporting Wayfinding Through Mobile Gaze-Based Interaction”.

 



An inverse-linear logistic model of the main sequence

Peter Kiefer and Ioannis Giannopoulos have contributed to an article titled “An inverse-linear logistic model of the main sequence” (Journal of Eye Movement Research, JEMR). It is now available online:

http://dx.doi.org/10.16910/jemr.10.3.4

Abstract. A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid-amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation of duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data at the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and superposition of both. We are confident the model will suitably extend to the large-amplitude range of eye movements.
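As a rough illustration of the model family (a generic logistic form for the peak velocity-amplitude relation; the symbols and parameterization below are ours for illustration and not necessarily the article's exact formulation):

\[
V_p(A) = \frac{V_{\max}}{1 + e^{-s\,(\log A - \log A_0)}}, \qquad D(A) \approx a\,A + b
\]

Here $V_{\max}$ sets the upper asymptote of peak velocity, $A_0$ is the amplitude at the curve's inflection point, and $s$ controls the slope over the mid-amplitude range; the second relation expresses the approximately linear dependence of saccade duration on amplitude mentioned in the abstract.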



Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science (accepted). PDF

The article introduces the concept of location based services which take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real-time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system, for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall that reaches over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
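The core mapping step described in the abstract, projecting the user’s Point of Regard from the scene-camera image into a known reference view, can be sketched in a few lines once the relative transformation has been estimated as a 3x3 homography. This is a minimal illustration only: the function name and the example matrix are hypothetical, not part of the published system.

```python
def map_gaze_to_reference(gaze_xy, H):
    """Project a gaze point (pixel coordinates in the scene-camera
    frame) into the reference view using a 3x3 homography H,
    given as a nested list. Returns reference-view coordinates."""
    x, y = gaze_xy
    # Apply H to the point in homogeneous coordinates
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Perspective division back to 2D
    return (xh / w, yh / w)

# Hypothetical homography: a pure translation by (100, 50) pixels
H = [[1.0, 0.0, 100.0],
     [0.0, 1.0, 50.0],
     [0.0, 0.0, 1.0]]

print(map_gaze_to_reference((320.0, 240.0), H))  # -> (420.0, 290.0)
```

In the actual system, the transformation would be re-estimated continuously from the scene-camera video against known views of the panorama, and the mapped point matched against the objects of interest known in advance.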



Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted to an open Call for Submissions, out of which seven were finally accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].



Controllability matters: The user experience of adaptive maps

An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:

Controllability matters: The user experience of adaptive maps

Abstract. Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment we study two adaptive map interfaces differing in the degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to yield a better user experience and lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should give their users control over when an adaptation is performed.



Article in Horizonte

The latest issue of the Horizonte magazine, published by the Swiss National Science Foundation, reports on our research.

Source: Horizonte 111, December 2016

German (PDF)

English (PDF)



INNOLEC Lecture

Martin Raubal was invited to give the INNOLEC Lecture at the Department of Geography of Masaryk University in Brno, Czech Republic.

The title of his talk is: Gaze-based assistance for wayfinders in the real world (slides as PDF, all our presentations).



Full paper accepted at GIScience 2016

Peter Kiefer, Ioannis Giannopoulos, Andrew Duchowski, Martin Raubal (2016) Measuring cognitive load for map tasks through pupil diameter. In Proceedings of the Ninth International Conference on Geographic Information Science (GIScience 2016). Springer



ETIZ Meeting March 2016

We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!

Date, time: Wednesday, 23rd March 2016, 17:00-19:00
Place: ETH Zurich Hönggerberg, HIL G 22
Topic: Measuring Cognitive Load with Eye Tracking

Please sign up in the Doodle poll so that we can plan the coffee break: http://ethz.doodle.com/poll/6ti5qbqx23wvf53g (before 16 March)

 

17:00 – 17:05
Welcome

17:05 – 17:20
Cognitive Load: Introduction
Christoph Hölscher, Chair of Cognitive Science, ETH Zürich

17:20 – 17:45
Cognitive Load and Eye Tracking: Overview on Methods
Andrew Duchowski, School of Computing, Clemson University, S.C., USA

17:45 – 18:15
Break
with possibility to try out a mobile gaze-based interaction system
Vasilis Anagnostopoulos, LAMETTA project, Geoinformation Engineering ETH Zürich

18:15 – 18:45
Discussion: Cognitive Load

18:45 – 18:55
Discussion: Format of ETIZ meeting