






Real Minds in Virtual Worlds
VR Innovation Lab
Bridging the Future of Immersive VR and Human Sciences
VRS Hackathon 2023
The VRS Hackathon is an annual interdisciplinary event organized by KlaesLab at Ruhr-Universität Bochum, bringing together developers, researchers, designers, and neuroscientists to tackle real-world challenges at the intersection of XR technology and human science.
The Challenge: Accessibility Through Eye & Face Tracking
Target Audience: Patients with conditions such as ALS or those who are quadriplegic. Such individuals have severely limited or non-existent movement capabilities. Their primary forms of interaction may be restricted to eye movements and facial expressions.
Core Challenge: Teams participating in the hackathon are tasked with devising innovative virtual scenarios and interactions that rely solely on eye and face tracking. The intent is not to pose a technological challenge, given the capabilities of current HMDs. Instead, the emphasis is on designing new environments and interaction methods within the defined constraints that are both entertaining and functional for patients. The rating will focus not only on the interactions but also on the design of the environment or narrative.
The Problem: Observation vs. Interaction
Relying solely on eye tracking for interaction poses a challenge: distinguishing between merely looking at an object and intending to interact with it (sometimes called the "Midas touch" problem). Combining eye tracking with facial expressions addresses this: gaze indicates what the user is attending to, while a deliberate expression signals the intent to engage.
Some ideas for what this could look like:
- Rotation & Locomotion: Eye tracking could be used for head rotations (e.g., looking towards the right edge of the field of view to rotate right). Facial tracking could be used for locomotion—a smile might signal moving forward, while an open mouth could mean walking backward.
- Teleportation: A user might fixate on the location they want to teleport to and then make a specific facial expression to trigger the action.
- Environment Interaction: Grabbing an object could be done by fixating on an interactable object and making a specific facial expression.
(Note: A study by Dey et al. (2022) explored facial expressions for such interactions in VR. Mark Billinghurst, one of the authors, gives a talk at the VR Summit.)
Examples of What This Could Look Like
- Games where control and interaction are determined by gaze tracking, facial expressions, and voice commands.
- Simulated walking or movement experiences initiated and controlled by eye movements and facial gestures.
- Meditation applications where progression or change is dictated by the user's facial expressions or gaze (potentially exploring EEG integration).
- Communication tools emphasizing social interaction, such as a text-to-voice keyboard controlled via eye tracking for ALS patients.
- A critical functional interface that lets patients notify a care person (e.g., via a phone message) that they want the HMD taken off, by activating a dedicated button in the environment.
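The rotation-and-locomotion idea listed earlier (a smile signals moving forward, an open mouth walking backward) can be reduced to a simple mapping from expression weights to a walking speed. The function below is a sketch under assumed conventions: expression weights normalized to 0..1, and illustrative threshold and speed values.

```python
def locomotion_velocity(smile: float, jaw_open: float,
                        threshold: float = 0.6, speed: float = 1.5) -> float:
    """Map facial expressions to forward/backward walking speed (m/s):
    a clear smile moves the user forward, an open mouth moves them
    backward. The deadzone below `threshold` prevents idle or ambiguous
    expressions from causing unwanted drift."""
    if smile >= threshold and smile > jaw_open:
        return speed
    if jaw_open >= threshold and jaw_open > smile:
        return -speed
    return 0.0
```

In a real application the returned speed would feed the engine's character controller each frame, and the thresholds would need per-user calibration, since resting facial poses vary considerably between individuals.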
Keywords
Virtual Reality (VR), Eye Tracking, Facial Expression Recognition, Accessibility (ALS), Neurorehabilitation, Hackathon
Event Highlights
