Aarhus University

Gaze & Eye Tracking

Research Focus

This topic concerns the integration of eye movements and gaze into the computer user interface. Eye-tracking technology has found many use cases since the 1980s: improving computer accessibility by enabling novel forms of computer control, understanding user behaviour through the study of visual attention, and utilising gaze as a rich source of context for implicit and adaptive user interfaces. Our team designs novel eye-based interaction concepts, implements interactive systems that integrate eye-tracking information, and conducts eye-tracking user studies.

Social impact

Eye movements have a long history of social impact, particularly in advancing user interfaces for people with disabilities by offering control methods that do not rely primarily on hand-based input. The method also holds high potential for general interface navigation, as a range of common devices such as tablets, desktop PCs, and extended reality (XR) headsets and glasses adopt eye-tracking sensors. Through these devices, novel behavioural insights can be gained: eye movements reflect the cognitive and motor processes of the user, which can be leveraged to improve the understanding of behavioural patterns, intent, and attention.

Key publications

Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi, and Hans Gellersen. 2017. Gaze + pinch interaction in virtual reality. In Proceedings of the 5th Symposium on Spatial User Interaction (SUI '17). Association for Computing Machinery, New York, NY, USA, 99–108. https://doi.org/10.1145/3131277.3132180 

Mathias N. Lystbæk, Thorbjørn Mikkelsen, Roland Krisztandl, Eric J Gonzalez, Mar Gonzalez-Franco, Hans Gellersen, and Ken Pfeuffer. 2024. Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 80, 1–12. https://doi.org/10.1145/3654777.3676331 

Franziska Prummer, Mohamed Shereef Abdelwahab, Florian Weidner, Yasmeen Abdrabou, and Hans Gellersen. 2025. It’s Not Always the Same Eye That Dominates: Effects of Viewing Angle, Handedness and Eye Movement in 3D. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25). Association for Computing Machinery, New York, NY, USA, Article 748, 1–10. https://doi.org/10.1145/3706598.3713992 

Researchers

Ken Pfeuffer

Associate Professor
Ken Pfeuffer: The Future of Text in XR Symposium 2024
Ken Pfeuffer: Evolution of XR Input

Current projects and labs