2022
Alinaghi, Negar; Kattenbeck, Markus; Giannopoulos, Ioannis: I Can Tell by Your Eyes! Continuous Gaze-Based Turn-Activity Prediction Reveals Spatial Familiarity. Inproceedings, pp. 2:1–2:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2022, ISBN: 978-3-95977-257-0. Tags: eye tracking, human activity recognition, Machine Learning, Spatial Familiarity. Abstract: Spatial familiarity plays an essential role in the wayfinding decision-making process. Recent findings in the wayfinding activity recognition domain suggest that wayfinders' turning behavior at junctions is strongly influenced by their spatial familiarity. By continuously monitoring wayfinders' turning behavior as reflected in their eye movements during the decision-making period (i.e., from immediately after an instruction is received until the corresponding junction is reached), we provide evidence that familiar and unfamiliar wayfinders can be distinguished. Applying a pre-trained XGBoost turning-activity classifier to gaze data collected in a real-world wayfinding task with 33 participants, our results suggest that familiar and unfamiliar wayfinders show different onset and intensity of turning behavior. These variations are present not only between the two classes (familiar vs. unfamiliar) but also within each class. The within-class differences in turning behavior may stem from multiple sources, including different levels of familiarity with the environment.
Alinaghi, Negar; Giannopoulos, Ioannis: Consider the Head Movements! Saccade Computation in Mobile Eye-Tracking. Inproceedings, In: 2022 Symposium on Eye Tracking Research and Applications, Association for Computing Machinery, Seattle, WA, USA, 2022, ISBN: 9781450392525. Tags: AI, eye tracking, Saccades. Abstract: Saccadic eye movements are known to serve as a suitable proxy for task prediction. In mobile eye-tracking, saccadic events are strongly influenced by head movements. Common attempts to compensate for head-movement effects either neglect saccadic events altogether or fuse gaze and head-movement signals measured by IMUs in order to simulate the gaze signal at head level. Using image-processing techniques, we propose a solution for computing saccades based on frames of the scene-camera video. In this method, fixations are first detected based on gaze positions specified in the coordinate system of each frame, and the respective frames are then merged. Lastly, pairs of consecutive fixations (forming a saccade) are projected into the coordinate system of the stitched image using the homography matrices computed by the stitching algorithm. The results show a significant difference in length between projected and original saccades, with an error of approximately 37% introduced by using saccades without accounting for head movements.
2021
Alinaghi, Negar; Kattenbeck, Markus; Golab, Antonia; Giannopoulos, Ioannis: Will You Take This Turn? Gaze-Based Turning Activity Recognition During Navigation. Inproceedings, In: Janowicz, Krzysztof; Verstegen, Judith A. (Ed.): 11th International Conference on Geographic Information Science (GIScience 2021) – Part II, pp. 5:1–5:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik, Dagstuhl, Germany, 2021, ISSN: 1868-8969. Tags: eye tracking, human activity recognition, Machine Learning, Wayfinding. Abstract: Decision making is an integral part of wayfinding, and people increasingly use navigation systems to facilitate this task. The primary decision, which is also the main source of navigation error, concerns the turning activity, i.e., deciding whether to turn left, turn right, or continue straight ahead. Before applying any preventive approaches (e.g., providing more information) or compensatory solutions (e.g., pre-calculating alternative routes), the fundamental step in dealing with this error is to predict and recognize the potential turning activity. This paper addresses this step by predicting the turning decision of pedestrian wayfinders, before the actual action takes place, using primarily gaze-based features. Applying Machine Learning methods, the presented experiment demonstrates an overall accuracy of 91% within three seconds before arriving at a decision point. Beyond the application perspective, our findings also shed light on the cognitive processes of decision making as reflected in the wayfinder's gaze behaviour: incorporating environmental and user-related factors into the model results in a noticeable change in the importance of visual search features for turn activity recognition.
2018
Kiefer, Peter; Giannopoulos, Ioannis; Göbel, Fabian; Raubal, Martin; Duchowski, Andrew T. (Ed.): Eye Tracking for Spatial Research, Proceedings of the 3rd International Workshop (ET4S). Book, ETH Zürich, Zürich, 2018. Tags: eye tracking. Abstract: Proceedings of the 3rd International Workshop, held in conjunction with the 14th International Conference on Location Based Services (LBS 2018).
Göbel, Fabian; Kiefer, Peter; Giannopoulos, Ioannis; Duchowski, Andrew T.; Raubal, Martin: Improving Map Reading with Gaze-adaptive Legends. Inproceedings, In: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, pp. 29:1–29:9, Association for Computing Machinery (ACM), New York, 2018, ISBN: 978-1-4503-5706-7. Tags: adaptations, eye tracking, gaze-based interaction, legends, maps. Abstract: Complex information visualizations, such as thematic maps, encode information using a particular symbology that often requires a legend to explain its meaning. Traditional legends are placed at the edge of a visualization, which makes it difficult to maintain visual context while switching attention between content and legend. Moreover, an extensive search may be required to extract relevant information from the legend. In this paper we propose to consider the user's visual attention to improve interaction with a map legend by adapting both the legend's placement and content to the user's gaze. In a user study, we compared two novel adaptive legend behaviors to a traditional (non-adaptive) legend. We found that, with both of our approaches, participants spent significantly less task time looking at the legend than with the baseline approach. Furthermore, participants stated that they preferred the gaze-based approach of adapting the legend content (but not its placement).
Duchowski, Andrew T.; Krejtz, Krzysztof; Krejtz, Izabela; Biele, Cezary; Niedzielska, Anna; Kiefer, Peter; Raubal, Martin; Giannopoulos, Ioannis: The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. Inproceedings, In: CHI 2018, pp. 1–13, ACM, Paper No. 282, 2018, ISBN: 978-1-4503-5620-6 (Talk: CHI 2018 – Conference on Human Factors in Computing Systems, Montreal, Canada; 2018-04-21 – 2018-04-26). Tags: eye tracking, pupillometry, task difficulty. Abstract: A novel eye-tracked measure of the frequency of pupil diameter oscillation is proposed for capturing what is thought to be an indicator of cognitive load. The proposed metric, termed the Index of Pupillary Activity, is shown to discriminate task difficulty vis-à-vis cognitive load (if the implied causality can be assumed) in an experiment where participants performed easy and difficult mental arithmetic tasks while fixating a central target (a requirement for replication of prior work). The paper's contribution is twofold. First, full documentation is provided for the calculation of the proposed measure, which can be considered an alternative to the existing proprietary Index of Cognitive Activity (ICA); researchers can thus replicate the experiment and build their own software implementing this measure. Second, several aspects of the ICA are approached in a more data-sensitive way with the goal of improving the measure's performance.
Göbel, Fabian; Kiefer, Peter; Giannopoulos, Ioannis; Raubal, Martin: Gaze Sequences and Map Task Complexity. Inproceedings, In: Winter, Stephan; Griffin, Amy; Sester, Monika (Ed.): 10th International Conference on Geographic Information Science (GIScience 2018), pp. 30:1–30:6, LIPIcs, 114, 2018, ISBN: 978-3-95977-083-5 (Talk: 10th International Conference on Geographic Information Science (GIScience 2018), Melbourne; 2018-08-28 – 2018-08-31). Tags: eye tracking, map task complexity, sequence analysis. Abstract: Because maps are visual representations of spatial context used to communicate geographic information, analyzing gaze behavior is a promising way to improve map design. In this research we investigate the impact of map task complexity and different legend types on the visual attention of a user. With an eye tracking experiment we show that the complexity of two map tasks can be measured and compared based on AOI sequence analysis. This knowledge can help improve the design of static maps and, in the context of interactive systems, create better map interfaces that adapt to the user's current task.