2019
Giannopoulos, Ioannis: Gaze-Based Assistance for Collective Spatial Cognition. Inproceedings. In: Curtin, Kevin M; Montello, Daniel R (Ed.): Innovative Research about Spatial Thinking by Human Groups, Laboratory for Location Science, University of Alabama, 2019. (Talk: Collective Spatial Cognition Specialist Meeting, Santa Barbara, California, USA; 2019-04-17 -- 2019-04-19).
Tags: Assistance Systems, Collective Spatial Cognition, Eye Movements, gaze-based interaction, Wayfinding
Abstract: When we walk and interact in an unfamiliar environment, wayfinding can be very challenging. We have to select a proper route that will lead us to the desired destination, orient ourselves in our surroundings, monitor the environment while walking to ensure that we are still on the right track and, finally, recognize the destination. While we are wayfinding, we are also acquiring spatial knowledge, developing and enhancing our mental representation of the environment we are interacting in. Assistance aids can be utilized for this purpose, helping us to offload some of these tasks. Assistance systems can also help us coordinate our activities with others, communicate, and increase our knowledge of the relevant environment. An assistance system that knows what we have seen, what we are interested in and what we want to achieve can effectively support the process of wayfinding. Eye tracking data is a rich source, close to our cognitive processes, from which this information can be extracted in order to coordinate and manage the spatial cognition of a person or even of a larger group of people. This position paper demonstrates how research in the area of gaze-based assistance can be utilized for acquiring, organizing and utilizing the spatial knowledge of a group of people, through the example of a group of tourists.
2018
Göbel, Fabian; Kiefer, Peter; Giannopoulos, Ioannis; Duchowski, Andrew T; Raubal, Martin: Improving Map Reading with Gaze-adaptive Legends. Inproceedings. In: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, pp. 29:1–29:9, Association for Computing Machinery (ACM), New York, 2018, ISBN: 978-1-4503-5706-7.
Tags: adaptations, eye tracking, gaze-based interaction, legends, maps
Abstract: Complex information visualizations, such as thematic maps, encode information using a particular symbology that often requires a legend to explain its meaning. Traditional legends are placed at the edge of a visualization, which makes them difficult to keep in view while switching attention between content and legend. Moreover, an extensive search may be required to extract the relevant information from the legend. In this paper we propose to consider the user's visual attention in order to improve interaction with a map legend by adapting both the legend's placement and its content to the user's gaze. In a user study, we compared two novel adaptive legend behaviors to a traditional (non-adaptive) legend. We found that, with both of our approaches, participants spent significantly less task time looking at the legend than with the baseline approach. Furthermore, participants stated that they preferred the gaze-based approach of adapting the legend content (but not its placement).
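The gaze-adaptive legend described in this abstract lends itself to a compact illustration. The following Python sketch shows one possible way to realize the two adaptations the abstract mentions, moving the legend box next to the current gaze point and reducing its content to the symbol classes near the gaze. All names (GazeSample, MapSymbol, adapt_legend) and the attention radius are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a gaze-adaptive legend (assumed names, not the paper's code).
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # gaze position on the map, in screen pixels
    y: float

@dataclass
class MapSymbol:
    name: str  # symbol class, e.g. "hospital"
    x: float
    y: float

def adapt_legend(gaze: GazeSample, symbols: list, radius: float = 150.0) -> dict:
    """Return an adapted legend: placed near the gaze point and reduced to
    the symbol classes currently within the user's area of attention."""
    nearby = [s for s in symbols
              if (s.x - gaze.x) ** 2 + (s.y - gaze.y) ** 2 <= radius ** 2]
    return {
        # content adaptation: explain only the symbols the user is looking at
        "entries": sorted({s.name for s in nearby}),
        # placement adaptation: offset the legend box next to the gaze point
        "position": (gaze.x + 40, gaze.y + 40),
    }

if __name__ == "__main__":
    symbols = [MapSymbol("hospital", 300, 310), MapSymbol("school", 900, 120)]
    print(adapt_legend(GazeSample(290, 300), symbols))

In this sketch the content adaptation is a simple proximity filter around the gaze point; the paper's actual adaptation strategies and parameters are not specified here and would need to be taken from the publication itself.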