Designing an Application Based on HCI Scientific Theory + XR Design Principles for a Gaze Interaction Experience

The design process of building an application that uses gaze as the main gesture, drawing on XR design, scientific HCI theory, data analytics, and UX design and research.

About the project

During the 3D XR workshops back in 2019 in Altspace, we always touched on the topic of multimodal experience and how much people liked the platform for it. Altspace at the time was one of the most advanced event-based XR platforms (and, for me, up to a certain level it still is). Multimodal interaction, which plays to the strengths of each of the user's sensory modalities, like speech, gaze, gestures, audio, and six degrees of freedom (6DoF), consistently advances the user experience, and can even be used to measure user satisfaction.


In this blog, I'm unpacking one of the projects I'm working on, which focuses on building an experience around gaze in particular, tailored to the specific needs and preferences of users so they can get their goals done as quickly and effectively as possible. It has pushed me to learn how to design solutions that use gaze interaction principles to deliver a good user experience.

The skills I used for this are XR design, scientific HCI theory, data analytics, and UX design and research.

The solution

Users' Jobs to Be Done, solved from a design perspective

Using gaze to identify users' focus areas:
In the real world, we use gaze as a way to communicate with the things around us. In XR, we apply the same basic principle: let people do their thing, and show only the required minimum, when needed, by utilising gaze. Taking it one step further: how might gaze be used to measure engagement with a certain object, environment, interaction, or interface?


In this case, the project I'm designing uses gaze as a way to measure concentration level and run a stress check.

Here is how I approached it, from both a theoretical-framework and an implementation aspect:

Theoretical Frameworks:

  • Gaze-Driven Interaction: Emphasizing the use of gaze as the sole input modality for navigation, selection, and manipulation within XR environments.
  • Attention Allocation: Understanding how users allocate their visual attention to different elements within the interface, to build the correct environment.
  • Perceptual-Cognitive Load: Evaluating the cognitive load imposed by gaze-based interactions and designing interfaces that minimize cognitive strain.
  • User Engagement Metrics: Developing metrics to quantify user engagement based on gaze behavior, such as fixation duration and saccade patterns.
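As a minimal sketch of how such engagement metrics could be computed, assuming gaze samples arrive as (timestamp, yaw, pitch) tuples in seconds and degrees, a standard velocity-threshold (I-VT) classifier can separate fixations from saccades. The sample format and the 30°/s threshold are my own illustrative assumptions, not values from the project:

```python
import math

# A minimal I-VT (velocity-threshold) sketch. Gaze samples are assumed to be
# (timestamp_s, yaw_deg, pitch_deg) tuples; the 30 deg/s threshold is a
# common default in eye-tracking literature, not a value from the project.
SACCADE_VELOCITY_DEG_S = 30.0

def angular_distance_deg(a, b):
    """Small-angle approximation of the angle between two gaze directions."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def classify_intervals(samples):
    """Label each inter-sample interval as 'fixation' or 'saccade'."""
    labels = []
    for (t0, *g0), (t1, *g1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocity = angular_distance_deg(g0, g1) / dt if dt > 0 else 0.0
        labels.append("saccade" if velocity > SACCADE_VELOCITY_DEG_S else "fixation")
    return labels

def fixation_durations(samples):
    """Merge consecutive fixation intervals into per-fixation durations (s)."""
    durations, current = [], 0.0
    for ((t0, *_), (t1, *_)), label in zip(zip(samples, samples[1:]),
                                           classify_intervals(samples)):
        if label == "fixation":
            current += t1 - t0
        elif current > 0:
            durations.append(current)
            current = 0.0
    if current > 0:
        durations.append(current)
    return durations
```

Longer fixation durations and fewer saccades over an object are then the raw material for the engagement score.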

Implementation Aspects:

  • Gaze Analytics: Implementing analytics tools to track and analyze user gaze behavior
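A gaze-analytics layer can start very small: a per-frame accumulator of dwell time per gazed object, which already answers "where do users focus?". This is a hedged sketch; the object IDs and the per-frame `record_frame` API are illustrative assumptions, not the project's actual tooling:

```python
from collections import defaultdict

class GazeAnalytics:
    """Accumulates per-object dwell time from frame-by-frame gaze hits.
    Object IDs and the frame-callback API are illustrative assumptions."""

    def __init__(self):
        self.dwell_s = defaultdict(float)  # total seconds gazed per object
        self.hits = defaultdict(int)       # number of frames gazed per object

    def record_frame(self, gazed_object_id, dt_s):
        """Call once per frame with the currently gazed object (or None)."""
        if gazed_object_id is not None:
            self.dwell_s[gazed_object_id] += dt_s
            self.hits[gazed_object_id] += 1

    def top_objects(self, n=3):
        """Objects ranked by total dwell time: a proxy for focus areas."""
        return sorted(self.dwell_s.items(), key=lambda kv: kv[1], reverse=True)[:n]
```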

Results

Gaze-Driven Interaction and Attention Allocation



Comfortable Gaze Interaction: Keeping gaze interactions within a comfortable ±60-degree angle lets users engage with the virtual environment without strain during prolonged use. In this case study I use gaze for interaction, not navigation; if I were using it for navigation, I'd add a layer of Gaze Angle: 10°-20° below the horizon. And if it were for selection, I'd add a layer of Gaze-Based Selection Precision: a 10-degree activation-angle threshold ensures precise gaze-based interactions and accurate feedback.

Things to avoid, based on testing: gaze angle > 10° above the horizon; gaze angle > 60° below the horizon; neck rotation > 45° above the horizon.
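These thresholds translate directly into two small runtime checks: a comfort-zone test on the gaze direction, and an activation-cone test for selection. A minimal sketch, assuming yaw/pitch in degrees and unit 3D direction vectors (the function names and coordinate convention are my own):

```python
import math

def within_comfort_zone(yaw_deg, pitch_deg):
    """Comfort limits from testing: |yaw| <= 60 deg, and pitch kept between
    60 deg below the horizon and 10 deg above it (negative pitch = below)."""
    return abs(yaw_deg) <= 60.0 and -60.0 <= pitch_deg <= 10.0

def selection_activated(gaze_dir, target_dir, threshold_deg=10.0):
    """Selection fires only when the angle between the gaze ray and the
    target direction is under the 10-degree activation threshold.
    Both arguments are unit-length 3D direction tuples."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= threshold_deg
```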


Objects placed within 0° to 60° and 0° to −60° (horizontally and vertically) for gaze interaction, within a 90-degree field of view

Spatial UI Layout: In this case, I place the interactive elements within a 1.5-meter radius to ensure comfortable interaction, while the content changes automatically as needed so users don't have to stretch left and right too much.

Interactive elements within a 1.5-meter radius

Perceptual-Cognitive Load

While doing this, I also had to think about visual-cue feedback, since locomotion would be one of the only gestures within the experience. So we added a clear feedback response time of under 100 milliseconds. This rapid feedback mechanism enhances the responsiveness and usability of the locomotion system.
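A budget like "<100 ms" is only useful if it's measured, so a small latency monitor can log the delay between a gaze or locomotion trigger and its visual cue. This is a hedged sketch; the trigger/cue callback names are illustrative assumptions:

```python
import time

FEEDBACK_BUDGET_S = 0.100  # the <100 ms clear-feedback target

class FeedbackLatencyMonitor:
    """Logs the delay between a trigger (e.g. gaze dwell completing) and its
    visual cue being shown, so misses of the 100 ms budget surface in
    analytics. The on_trigger/on_cue_shown hooks are illustrative."""

    def __init__(self, budget_s=FEEDBACK_BUDGET_S):
        self.budget_s = budget_s
        self.latencies = []
        self._t0 = None

    def on_trigger(self):
        self._t0 = time.monotonic()

    def on_cue_shown(self):
        if self._t0 is not None:
            self.latencies.append(time.monotonic() - self._t0)
            self._t0 = None

    def misses(self):
        """Latencies that blew the feedback budget."""
        return [t for t in self.latencies if t > self.budget_s]
```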

------------------------------------------------------------------------------------------------------------------------------------

Design Result based on Scientific Theory + XR Design Principles for the Gaze Interaction Experience

------------------------------------------------------------------------------------------------------------------------------------

Work in Progress :

  • User Engagement Metrics: Developing metrics to quantify user engagement based on gaze behavior, such as fixation duration and saccade patterns.

Implementation Aspects:

  • Gaze Analytics: Implementing analytics tools to track and analyze user gaze behavior
