The design process of building an application that uses gaze as its main gesture, drawing on XR design, the scientific theory of HCI, data analytics, and UX design and research.
During the 3D XR workshops back in 2019 in Altspace, we touched on the topic of multimodal experience at every session, and on how much people liked the platform for it. Altspace at the time was one of the most advanced event-based XR platforms (I'd argue that, up to a certain level, it still is). Multimodal interaction, which uses the strengths of each of the user's sensory modalities, like speech, gaze, gestures, audio, and six degrees of freedom, always advances the user experience, and could even be used to measure user satisfaction.
In this blog, I'm unpacking one of the projects I'm working on, which focuses on building an experience around gaze in particular, tailored to the specific needs and preferences of users so they can get their goals done as quickly and effectively as possible. It's teaching me to design solutions that use gaze interaction principles to deliver a good user experience.
The skills I used for this are XR design, the scientific theory of HCI, data analytics, and UX design and research.
In this case, the project I'm designing uses gaze as a way to measure concentration level and run a stress check.
Here is how I approached it, through a theoretical framework and its implementation aspects:
Theoretical Frameworks:
Implementation Aspects:
Comfortable Gaze Interaction: Gaze interactions remain within a comfortable ±60-degree angle, so users can engage with the virtual environment without experiencing strain during prolonged use. In this case study I use gaze for interaction; if I were using it for navigation (which I'm not here), I'd add a layer of Gaze Angle: 10°–20° below the horizon. If it were for selection, I'd add a layer of Gaze-Based Selection Precision: setting a 10-degree activation angle threshold ensures precise gaze-based interactions and accurate feedback.
Things to avoid, based on testing: gaze angle > 10° above the horizon; gaze angle > 60° below the horizon; neck rotation > 45° above the horizon.
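As a minimal sketch of how these comfort thresholds could be checked at runtime (the constants come from the testing notes above; the function names and vector convention are my own assumptions, not from any specific engine API):

```python
import math

# Comfort thresholds from the testing notes above (degrees).
MAX_ABOVE_HORIZON = 10.0   # avoid gaze angles more than 10° above the horizon
MAX_BELOW_HORIZON = 60.0   # avoid gaze angles more than 60° below the horizon
SELECTION_ANGLE = 10.0     # activation cone for gaze-based selection

def gaze_pitch_deg(gaze_dir):
    """Pitch of a unit gaze vector (x, y, z): positive above the horizon, negative below."""
    x, y, z = gaze_dir
    horizontal = math.hypot(x, z)
    return math.degrees(math.atan2(y, horizontal))

def is_comfortable(gaze_dir):
    """True when the gaze pitch stays inside the comfortable vertical range."""
    pitch = gaze_pitch_deg(gaze_dir)
    return -MAX_BELOW_HORIZON <= pitch <= MAX_ABOVE_HORIZON

def within_selection_cone(gaze_dir, to_target_dir):
    """True when the angle between gaze and target direction falls inside the 10° cone."""
    dot = sum(g * t for g, t in zip(gaze_dir, to_target_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= SELECTION_ANGLE
```

In a real headset integration, `gaze_dir` would come from the eye-tracking or head-pose API of the platform in use; the same threshold logic applies either way.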
Spatial UI Layout: In this case, placing the interactive elements within a 1.5-meter radius ensures comfortable interaction, while the content changes automatically as needed so users don't have to stretch left and right too much.
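A small sketch of this layout idea: elements spread evenly on an arc at the 1.5 m radius in front of the user. The ±30° horizontal span and the eye-height default are my assumptions for illustration, not values from the project:

```python
import math

UI_RADIUS_M = 1.5           # interactive elements live on a 1.5 m radius around the user
HORIZONTAL_SPAN_DEG = 60.0  # assumed: keep the whole layout within a ±30° arc

def layout_on_arc(n_elements, user_pos=(0.0, 1.6, 0.0)):
    """Return (x, y, z) positions for n_elements spread evenly on an arc in front of the user."""
    if n_elements == 1:
        angles = [0.0]
    else:
        step = HORIZONTAL_SPAN_DEG / (n_elements - 1)
        angles = [-HORIZONTAL_SPAN_DEG / 2 + i * step for i in range(n_elements)]
    ux, uy, uz = user_pos
    positions = []
    for a in angles:
        rad = math.radians(a)
        # Same height as the user's eyes; forward is +z, right is +x.
        positions.append((ux + UI_RADIUS_M * math.sin(rad),
                          uy,
                          uz + UI_RADIUS_M * math.cos(rad)))
    return positions
```

Because every element sits at exactly the same distance, the "content changes in place" behaviour described above can swap what each slot shows without ever moving a panel out of comfortable reach.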
While doing this, I also had to think about visual-cue feedback, since locomotion would be one of the only gestures within the experience. So we added a clear feedback response time of <100 milliseconds. This rapid feedback mechanism enhances the responsiveness and usability of the locomotion system.
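One simple way to keep that 100 ms budget honest during development is to time every feedback handler and flag the ones that miss it. This is a hypothetical sketch, not the project's actual instrumentation:

```python
import time

FEEDBACK_BUDGET_S = 0.100  # clear feedback must arrive within 100 ms

def timed_feedback(handler):
    """Wrap a feedback handler and report whether it met the 100 ms budget."""
    def wrapped(*args, **kwargs):
        start = time.perf_counter()
        result = handler(*args, **kwargs)
        elapsed = time.perf_counter() - start
        within_budget = elapsed <= FEEDBACK_BUDGET_S
        return result, elapsed, within_budget
    return wrapped

# Hypothetical usage: wrap whatever draws the locomotion cue.
@timed_feedback
def show_locomotion_marker():
    return "marker shown"
```

Logging the `within_budget` flag per frame makes latency regressions visible early, instead of only surfacing as vague "it feels sluggish" reports in playtests.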
------------------------------------------------------------------------------------------------------------------------------------
Work in Progress :
Implementation Aspects: