Most input devices, whether mouse, pen, or touch, have been extensively investigated and have become a big part of everyday life. Rather than adding a new UI mode, gesture, or additional sensors, how can we make all of them substantially more expressive?
Here we explore the idea of gaze-shifting: using gaze to add an indirect input mode to existing direct input devices. In essence, any input device can be used for both modes, as shown in the video examples.
Really happy that this work was nominated for the best paper award!
Modalities such as pen and touch are associated with direct input but can also be used for indirect input. We propose to combine the two modes for direct-indirect input modulated by gaze. We introduce gaze-shifting as a novel mechanism for switching the input mode based on the alignment of manual input and the user’s visual attention.
Input in the user’s area of attention results in direct manipulation, whereas input offset from the user’s gaze is redirected to the visual target. The technique is generic and can be used in the same manner with different input modalities. We show how gaze-shifting enables novel direct-indirect techniques with pen, touch, and combinations of pen and touch input.
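The mode switch described above can be sketched in a few lines: compare the manual input position with the current gaze point, and either act directly at the touch point or redirect the input to the gaze target. This is a minimal illustrative sketch, not the paper's implementation; the function names and the attention radius are assumptions.

```python
from dataclasses import dataclass
import math

# Assumed radius (px) around the gaze point that counts as "in the
# user's area of attention" -- a tunable parameter, not from the paper.
DIRECT_RADIUS_PX = 150.0

@dataclass
class InputEvent:
    x: float  # manual input position (pen or touch), screen px
    y: float

@dataclass
class Gaze:
    x: float  # current gaze point, screen px
    y: float

def resolve_input(event: InputEvent, gaze: Gaze, radius: float = DIRECT_RADIUS_PX):
    """Return (mode, target): where and how the input should act.

    Input landing near the gaze point is treated as direct manipulation
    at the input position itself; input offset from gaze is redirected
    to the visual target under the user's gaze (indirect mode).
    """
    dist = math.hypot(event.x - gaze.x, event.y - gaze.y)
    if dist <= radius:
        return "direct", (event.x, event.y)
    return "indirect", (gaze.x, gaze.y)
```

For example, a pen stroke made right where the user is looking manipulates content under the pen, while the same stroke made off to the side is applied to whatever the user is looking at.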
Gaze-Shifting: direct-indirect input with pen and touch modulated by gaze (best paper nominee)
Ken Pfeuffer, Jason Alexander, Ming Ki Chong, Yanxia Zhang, and Hans Gellersen. 2015. In Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology (UIST '15). ACM, Charlotte, NC, USA, 373-383. doi, pdf, video, talk