Guest Lectures by Katrin Angerbauer and Markus Wieland

The guest lectures will take place on 22.02.2024 in our VR Lab (Room 217, InformatiKOM 2, Adenauerring 10). The talks start at 11 am, followed by a Q&A session. If you are interested in participating online, feel free to contact hci@informatik.kit.edu. Do not miss this opportunity to gain insights from experts. See you there!
 
Visualizing Lived Experiences of People with Disabilities: Presenting Current Projects and Lessons Learned within My PhD Journey
Guest: Katrin Angerbauer
Abstract:
This short talk presents Katrin's current projects at the intersection of visualization, mixed reality research, and accessibility. These include her CHI '22 investigation into the accessibility of figures in research papers; her "Inclusive Avatar Project", which explores how to make avatar representations more inclusive on social virtual reality platforms such as VRChat and what lived experiences people with disabilities report while using these avatars; and her "Access Stories Project", which examines how, and in which contexts, accessibility information and experiences can be visualized. The talk also includes personal lived experiences of disability within academia and discusses lessons learned from navigating a PhD journey with a disability.
Bio:
Katrin Angerbauer is a Ph.D. student at the University of Stuttgart in the Visualization and Virtual/Augmented Reality research group. She obtained her Bachelor's and Master's degrees in Software Engineering at the same university. Recently, she completed a research stay at UCL in London in the lab of Catherine Holloway, where she continues as an honorary research fellow until 2027. Katrin herself lives with a disability, which is a personal source of motivation for her research. Her work focuses on human-computer interaction, virtual and augmented reality, and accessibility; it aims to amplify and visualize the lived experiences of people with disabilities and to create more accessible and inclusive technology solutions.
 
Seize the Gaze: Empowering Eye Contact for People with Visual Impairment
Guest: Markus Wieland
Abstract:
Gaze serves a dual role in human social interaction: it lets us perceive information about both our environment and our conversation partner, and it lets us convey meaning through our own gaze, for example by staring. For visually impaired individuals with residual vision, however, this form of communication remains largely inaccessible. Augmented Reality offers several ways to make eye contact perceivable for people with visual impairments: for instance, Augmented Reality glasses can display a cue when a sighted person looks at the wearer, thus providing access to eye contact. I will discuss the potential of Augmented Reality glasses for perceiving eye contact, as well as the requirements such cues must meet in different situations such as job interviews, collaborative scenarios, dating, or presenting.
Bio:
After completing a master's degree in Cognitive Science at the University of Tuebingen, Markus Wieland started his PhD journey at the Human-Centered Ubiquitous Media Lab at LMU Munich, where he developed an interest in human-computer interaction and, above all, accessibility. He then moved to the University of Stuttgart to continue his PhD studies and join Prof. Sedlmair's Virtual/Augmented Reality group; his research focuses on making non-verbal communication perceivable for people with visual impairments through XR technologies.