
Physical Computing

Research Focus

We focus on three key areas:

  • Multimodal Interaction
  • Fabrication
  • Tangible Interface Design

Our research contributes to HCI by investigating multisensory modalities, including visual, auditory, and haptic feedback. We aim to enhance usability, accessibility, and collaboration in diverse settings such as mobile, physical, and hybrid work environments.

Another focus is tangible interface design, which involves embedding interactive functionality into physical materials, surfaces, and shape-changing objects.

By bridging the disciplines of engineering and interaction design, we also investigate ways to democratize and simplify the creation of intelligent, personalized physical interfaces. This is accomplished through state-of-the-art fabrication technologies, sophisticated sensor integration, and end-user customization, ultimately empowering broader participation in interface design.

Social Impact

Our research addresses critical societal challenges related to accessibility, healthcare, and collaborative work.

Accessibility and Inclusion: through the development of non-visual, multimodal, and intra-oral interfaces, the group enhances digital accessibility for users with visual, motor, or situational impairments.

Health and Wellbeing: the group’s work on intra-oral and wearable interfaces contributes to non-invasive, continuous health monitoring.

Multimodal Collaboration: research on multimodal collaborative systems explores how speech, gesture, touch, and visual feedback can be integrated to support more inclusive and effective remote or co-located teamwork. These systems facilitate equitable participation and improve communication across diverse user groups and settings.

Digital Fabrication: by creating accessible fabrication methods, we empower novice users to engage in creative technology development.

Key Publications

Kleinau, J., Grønbæk, J. E. S., & Hoggan, E. (2025, April). Co-Designing Multimodal Tools for Radically Mobile Hybrid Meetings. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1-17).

Jiang, Y., Kleinau, J., Eckroth, T. M., Hoggan, E., Mueller, S., & Wessely, M. (2024, October). MouthIO: Fabricating Customizable Oral User Interfaces with Integrated Sensing and Actuation. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (pp. 1-16).

Hoggan, E. (2024). Multimodal Interaction. In Interaction Techniques and Technologies in Human-Computer Interaction (pp. 45-63). CRC Press.

Van Oosterhout, A., Hoggan, E., Rasmussen, M. K., & Bruns, M. (2019, June). DynaKnob: Combining Haptic Force Feedback and Shape Change. In Proceedings of the 2019 Designing Interactive Systems Conference (pp. 963-974).

Zhu, Y., Honnet, C., Kang, Y., Zhu, J., Zheng, A. J., Heinz, K., ... & Mueller, S. (2024, October). PortaChrome: A Portable Contact Light Source for Integrated Re-Programmable Multi-Color Textures. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (pp. 1-13).