3D virtual and real environments can be vast, and interacting with targets both near and far is not easy. You could use a controller for ray-pointing, but this requires users to always carry the device. Freehand gestures have long been considered a natural alternative UI; however, direct manipulation becomes difficult over distance. How can we improve on this, and are gaze-based techniques a viable alternative? And how do they compare with respect to time, error, usability, …
In this paper, presented by Uta at CHI’23, we propose and evaluate selection techniques based on eye tracking and hand tracking. Our user study covers the following techniques:
- Gaze & Pinch: an interaction technique proposed in my earlier work.
- Gaze-Hand Alignment: a concept based on aligning the hand and gaze rays to select without a dedicated pinch, proposed in one of my more recent works. It comes in two variations:
- Gaze & Finger: gaze acts as a form of pre-selection (hover), indicating the target of interest; aligning the index finger with the gaze ray then confirms the selection.
- Gaze & Handray: similar to the previous technique, but the cursor of a hand ray, rather than the index finger, has to be aligned.
In the evaluation, we compared against two baselines: Handray (where you point with the arm and pinch with the same hand) and Headcrusher (a 1997 technique where you crush the object in your line of sight, interesting as a member of the family of image-plane techniques).
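The core selection principle behind the alignment techniques can be sketched in a few lines: gaze pre-selects whatever the gaze ray hits, and selection is confirmed when the manual ray (through the fingertip, or a hand-ray cursor) falls within an angular tolerance of the gaze ray. This is a minimal illustration, not the paper's implementation; the threshold value and vector representation are assumptions.

```python
import math

# Assumed angular tolerance for counting the two rays as "aligned".
ALIGN_THRESHOLD_DEG = 3.0

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angle_deg(a, b):
    """Angle in degrees between two direction vectors."""
    a, b = normalize(a), normalize(b)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def is_selected(gaze_dir, manual_dir):
    """Confirm selection of the gazed-at (pre-selected) target when
    the manual ray aligns with the gaze ray."""
    return angle_deg(gaze_dir, manual_dir) <= ALIGN_THRESHOLD_DEG

# Example: a manual ray 2 degrees off the gaze ray confirms selection.
gaze = (0.0, 0.0, 1.0)
manual = (math.sin(math.radians(2)), 0.0, math.cos(math.radians(2)))
print(is_selected(gaze, manual))  # True
```

Note that no extra click is needed: the alignment event itself acts as the confirmation, which is what distinguishes these techniques from Gaze & Pinch.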
Here is a picture with an overview of all techniques:

How do these techniques compare in principle? A detailed comparison can be seen in the following table.

What did we find out?
- Techniques ordered from high to low throughput are: Gaze&Handray (2.09), Gaze&Pinch (2.06), Gaze&Finger (1.86), Handray (1.39), Headcrusher (1.32).
- Target depth (and associated parallax) has a detrimental effect on performance of image plane techniques, but less pronounced with Gaze&Finger than with Headcrusher.
- Headcrusher was perceived as the least favourable technique and resulted in the lowest performance.
- Gaze&Handray resulted in the highest performance and the most preferences, indicating a viable approach alongside Gaze&Pinch.
- We find that all techniques except for Handray are affected by target amplitude.
- Overall, all three gaze-assisted techniques outperformed the manual pointing techniques for our tested target size of 3 degrees of visual angle.
Interestingly, the Gaze&Handray technique was even a bit faster overall than Gaze&Pinch, although the difference is not statistically significant. Of course, there is a qualitative difference, as Gaze&Pinch extends nicely to a variety of tasks. This could be further investigated for Gaze&Handray and the other Gaze-Hand Alignment technique.
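For readers unfamiliar with how throughput values like those above are obtained, here is a rough sketch of the common effective-width throughput formulation from Fitts' Law analysis (ISO 9241-9 style). The exact computation in the paper may differ, and the input values below are purely illustrative, not data from the study.

```python
import math

def effective_width(endpoint_sd):
    # W_e = 4.133 * standard deviation of the selection endpoints,
    # capturing the spread users actually produced.
    return 4.133 * endpoint_sd

def throughput(amplitude, endpoint_sd, movement_time_s):
    # TP = ID_e / MT, with effective index of difficulty
    # ID_e = log2(A / W_e + 1), measured in bits per second.
    ide = math.log2(amplitude / effective_width(endpoint_sd) + 1)
    return ide / movement_time_s

# Illustrative numbers: 20-degree movement amplitude, 0.9-degree
# endpoint spread, 1.2 s mean movement time.
print(round(throughput(20.0, 0.9, 1.2), 2))
```

Higher throughput thus reflects some combination of faster movements and tighter endpoint precision, which is why it is a useful single number for ranking the techniques.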
Check out the Abstract & full presentation here:
Abstract: Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as it naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, combining gaze with image plane pointing versus raycasting, in comparison with hands-only baselines and Gaze&Pinch as established multimodal technique. We used Fitts’ Law study design with targets presented at different depths in the visual scene, to assess effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane but less performant with increasing target depth due to parallax.
Uta Wagner, Mathias N. Lystbæk, Pavel Manakhov, Jens Emil Sloth Grønbæk, Ken Pfeuffer, and Hans Gellersen. 2023. A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23). Association for Computing Machinery, New York, NY, USA, Article 252, 1–15. https://doi.org/10.1145/3544548.3581423 (PDF: https://pure.au.dk/portal/files/312128746/Fitts_Law_Uta.pdf)