2nd Workshop on Eye Tracking and Visualization 2016 (ETVIS'16)
Questions and feedback at: michael.burch@visus.uni-stuttgart.de


Co-located with IEEE VIS in Baltimore, Maryland, USA, 23 - 28 October 2016

Welcome to the 2nd Workshop on Eye Tracking and Visualization

After last year's successful workshop co-located with IEEE VIS 2015 in Chicago, Illinois, USA, we decided to organize another workshop on this interesting, challenging, and important topic. This year we again invite you to submit papers on the general topic of combining eye tracking and visualization. Submissions may cover visualization or visual analytics techniques for eye tracking data, or eye tracking-based user evaluations that focus on visual stimuli. Papers will be reviewed by an international program committee of researchers working in visualization, eye tracking, human-computer interaction, and psychology. Authors of the best papers will be invited to submit extended versions to the Journal of Eye Movement Research.


IMPORTANT! *** Paper submission deadline: July 20th, 2016 ***


CALL FOR PAPERS

There is a growing interest in eye tracking as a research method in many communities, including information visualization, scientific visualization, and visual analytics, as well as human-computer interaction, applied perception, psychology, cognitive science, security, and mixed reality. Progress in hardware technology and falling costs of eye tracking devices have made this analysis technique accessible to a large population of researchers. Recording the observer's gaze can reveal how dynamic graphical displays are visually accessed and which information is processed in real time. Nonetheless, standardized practices for technical implementation and data interpretation have yet to be established. With this Workshop on Eye Tracking and Visualization (ETVIS), we intend to build a community of eye tracking researchers within the visualization community, covering information visualization, scientific visualization, and visual analytics. We also aim to establish connections to related fields, in particular human-computer interaction, cognitive science, and psychology. This will promote a robust exchange of established practices and innovative use scenarios.

The workshop covers topics at the intersection of visualization research (including information visualization, scientific visualization, and visual analytics) and eye tracking. Topics of interest include the following, with an emphasis on the relationship between eye tracking and visualization:

Visualization and visual analytics techniques for eye-movement data (including spatio-temporal visualization, evolution of gaze patterns, visual analysis of participant behavior, visualization of static and dynamic stimuli, and 2D vs. 3D representations of eye-movement data)

Visual analysis of gaze and eye-movement data, including visual data mining, aggregation, clustering techniques, and metrics for eye-movement data

Eye-movement data provenance, big eye-movement data

Standardized metrics for evaluating interactions with visualization

Novel methods for eye-tracking in challenging visualization scenarios

Uncertainty visualization of gaze data

Interactive annotation of gaze and stimulus data

Systems for the visual exploration of eye-movement data

Reports of eye-tracking studies that evaluate visualization or visual analytics

Eye-tracking in non-WIMP visualization environments, including mobile eye-tracking, mobile devices, virtual environments, mixed reality, and large displays

Eye-tracking-based interaction techniques for visualization

Interpreting eye-movement scanpaths from the perspective of human cognitive architecture and perceptuo-motor expertise

Cognitive models for inferring user states from gaze behavior with visualizations

Applications that rely on eye-tracking as an adaptive input parameter

Eye movement behavior in public transport systems

Eye tracking in specific application domains such as software engineering, biology, bioinformatics, medicine, and sports


PUBLICATION

TBA...