Do you have an interesting eye tracking research prototype to show? Would you like to join the ET4S workshop, but missed the deadline for regular papers? Or do you have a regular paper accepted and would like to take the opportunity to gain even more visibility and get direct feedback on your system?
We’ll have a dedicated demo session at the ET4S workshop in January 2018 in Zurich. We hope you will consider submitting a short abstract (1 page) with a demo proposal (deadline: November 8).
ET4S website: http://spatialeyetracking.org/et4s-2018/
Predicting user states from gaze and other multimodal data
Abstract: In this talk I will present research conducted by our team at UEF related to user state recognition during problem solving and other interactive contexts. We adapt and apply machine learning techniques to model behavioral and mental states, including action prediction and problem-solving state prediction.
Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.
Visitors of all ages tried out and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.
We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!
You can find us from Friday, 1 September, to Sunday, 3 September, at the University of Zurich main building (West Foyer).
Check out our Scientifica video!
Dr. Ioannis Giannopoulos received the ETH Zurich Culmann Award in 2017 for his outstanding doctoral thesis, “Supporting Wayfinding Through Mobile Gaze-Based Interaction”.
Peter Kiefer and Ioannis Giannopoulos have contributed to an article titled “An inverse-linear logistic model of the main sequence” (Journal of Eye Movement Research, JEMR). It is now available online:
Abstract. A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation of duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data at the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and superposition of both. We are confident the model will suitably extend to the large-amplitude range of eye movements.
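The S-shaped peak velocity-amplitude relation described in the abstract can be illustrated with a generic logistic function of log-amplitude. This is only a sketch: the function name and parameter values below are illustrative assumptions, not the paper’s exact inverse-linear logistic parameterization.

```python
import numpy as np

def logistic_peak_velocity(amplitude_deg, v_min=50.0, v_max=700.0, c=1.0, s=0.5):
    """Generic logistic model of the main sequence: peak velocity (deg/s)
    as an S-shaped function of log saccade amplitude (deg).
    v_min / v_max are the asymptotes at very small / very large amplitudes;
    c and s control the midpoint and slope in log-amplitude space.
    All parameter values are illustrative, not fitted values from the paper."""
    x = np.log10(amplitude_deg)
    return v_min + (v_max - v_min) / (1.0 + np.exp(-(x - c) / s))

# Asymptotic behaviour: microsaccade amplitudes stay near v_min,
# large saccade amplitudes approach v_max.
v_micro = logistic_peak_velocity(0.01)   # microsaccade-scale amplitude
v_large = logistic_peak_velocity(100.0)  # very large amplitude
```

In a real fit, the four parameters would be estimated from aggregate velocity-amplitude data (e.g. with a least-squares optimizer), which is how the paper evaluates the model on microsaccades, saccades, and their superposition.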
We’re glad to announce the 3rd International Workshop on Eye Tracking for Spatial Research (ET4S), which will take place January 14, 2018 in Zurich.
The workshop aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information. This is the 3rd edition of the workshop, after two successful ET4S workshops in Scarborough (2013) and Vienna (2014). This time the workshop will be co-located with the 14th International Conference on Location Based Services (LBS 2018).
The workshop will be opened with an invited talk given by Roman Bednarik, an adjunct professor at the School of Computing, University of Eastern Finland.
The Call for Papers is now available (Paper submission deadline: September 27, 2017).
ET4S 2018 is supported by Ergoneers.
Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):
Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science, 2017. (accepted), PDF
The article introduces the concept of location based services which take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research was performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
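The core mapping step described in the abstract, rotating the eye tracker’s gaze ray from head coordinates into scene coordinates once the head pose has been estimated, can be sketched as below. The function name, coordinate frames, and example rotation are illustrative assumptions, not the project’s actual implementation, which additionally handles camera calibration and matching against a known panorama view.

```python
import numpy as np

def gaze_to_scene(gaze_dir_head, R_head_to_scene):
    """Rotate a gaze direction measured in the eye tracker's head frame
    into the scene frame, given a head-pose rotation matrix estimated
    from the tracker's forward-looking camera. Sketch only: in the real
    pipeline the pose comes from computer-vision matching against a
    known view of the scene."""
    v = np.asarray(gaze_dir_head, dtype=float)
    v = v / np.linalg.norm(v)      # normalize the gaze ray
    return R_head_to_scene @ v     # express the ray in scene coordinates

# Example: head rotated 90 degrees about the vertical (z) axis.
theta = np.deg2rad(90.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
scene_ray = gaze_to_scene([1.0, 0.0, 0.0], R)  # "straight ahead" in head frame
```

Intersecting the resulting scene ray with a georeferenced model of the panorama then yields the object the user is looking at, which is what the service annotates with information.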
This year, together with our colleagues from SWISS International Air Lines Ltd., we attended the Lufthansa Digital Aviation Forum 2017 in Frankfurt am Main to present our ongoing project. Visit http://releasd.com/3583 and http://newsroom.lufthansagroup.com/de/themen/digital-aviation.html for further information on the Lufthansa Digital Aviation Forum 2017.
Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)
A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].
Nineteen manuscripts were submitted to an open Call for Submissions, out of which seven were finally accepted after a rigorous review process.
On the 28th of November Martin Raubal, Peter Kiefer and David Rudi from the GeoGazeLab co-hosted a project presentation of the “Awareness in Aviation” project in collaboration with SWISS International Air Lines Ltd.
During that event, the project and its goals were first presented to a wider audience consisting of guests from SWISS International Air Lines Ltd., Swiss Aviation Training, the Swiss Federal Office of Civil Aviation (BAZL), and journalists from different news journals. One of the journals was the “My SWISS” magazine, which recently published an article about the event.
An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:
Abstract Adaptive map interfaces have the potential of increasing usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment we study two adaptive map interfaces differing in the degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to provide a better user experience and lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should provide their users with control over when an adaptation is performed.
The past years have seen a growing interest in augmenting human cognition – attention, engagement, memory, learning, etc. – through ubiquitous technologies. With the ongoing research and development of near-constant capture devices, unlimited storage, and algorithms for processing and retrieving captured recordings, the resulting personal “lifelogs” have opened the door to a vast range of applications. In the third edition of this workshop series, we focus on technologies and applications for capturing and integrating personal experiences into everyday use cases. With the question “What constitutes a modern lifelog?”, we would like to invite researchers, designers, and practitioners to envision and exchange ideas on how ubiquitous technologies and applications can help enhance human cognition in everyday life. For example, search requests may no longer purely retrieve information from online archives, but take personal experiences into account. In this one-day workshop, we would like to formulate visions and concrete application scenarios for making use of ubiquitous technologies in order to push personal data to an application layer where it is used to support and augment human cognition and the human mind.
Keynote by Dr. Lewis Chuang
Organizers: Andreas Dengel, Tilman Dingler, Ioannis Giannopoulos, Cathal Gurrin, Koichi Kise, Kai Kunze, Evangelos Niforatos
David Rudi, Ioannis Giannopoulos, Peter Kiefer, Christian Peier, Martin Raubal (2016) Interacting with Maps on Optical Head-Mounted Displays. In Proceedings of the 4th ACM Symposium on Spatial User Interaction (SUI 2016). ACM, 2016
Vasileios-Athanasios Anagnostopoulos and Peter Kiefer (2016). Towards gaze-based interaction with urban outdoor spaces. In 6th International Workshop on Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2016), UbiComp’16 Adjunct, New York, NY, USA, ACM. (accepted)
Fabian Göbel, Ioannis Giannopoulos and Martin Raubal (2016). The Importance of Visual Attention for Adaptive Interfaces. In Proceedings of the Workshop Smarttention, Please! Intelligent Attention Management on Mobile Devices, in conjunction with MobileHCI 2016. ACM, New York, NY, USA (accepted)
A research highlight on “Spatial Awareness in the cockpit” has been published in the annual report of the Department of Civil, Environmental and Geomatic Engineering!
David Rudi, Peter Kiefer, and Martin Raubal. Spatial awareness in the cockpit. Department of Civil, Environmental and Geomatic Engineering, ETH Zurich, Annual Report 2015, April 2016.
See also: Research on “Aviation Safety”
Peter Kiefer, Ioannis Giannopoulos, Andrew Duchowski, Martin Raubal (2016) Measuring cognitive load for map tasks through pupil diameter. In Proceedings of the Ninth International Conference on Geographic Information Science (GIScience 2016). Springer
We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!
Date, time: Wednesday, 23rd March 2016, 17:00-19:00
Place: ETH Zurich Hönggerberg, HIL G 22
Topic: Measuring Cognitive Load with Eye Tracking
Please sign up in the Doodle poll to help us plan the coffee break: http://ethz.doodle.com/poll/6ti5qbqx23wvf53g (before 16 March)
17:00 – 17:05
17:05 – 17:20
Cognitive Load: Introduction
Christoph Hölscher, Chair of Cognitive Science, ETH Zürich
17:20 – 17:45
Cognitive Load and Eye Tracking: Overview on Methods
Andrew Duchowski, School of Computing, Clemson University, S.C., USA
17:45 – 18:15
with the possibility to try out a mobile gaze-based interaction system
Vasilis Anagnostopoulos, LAMETTA project, Geoinformation Engineering ETH Zürich
18:15 – 18:45
Discussion: Cognitive Load
18:45 – 18:55
Discussion: Format of ETIZ meeting