-
![[CHI’18 / Google] Analysis and Modeling of Grid Performance on Touchscreen Mobile Devices](https://kenpfeuffer.com/wp-content/uploads/2018/02/googlepic.jpg)
At Google in Mountain View, I worked on modelling and predicting user performance using machine learning tools (TensorFlow). The work also includes an eye-tracking study that provides an in-depth analysis of visual attention on mobile phones. It was presented in April at the CHI conference in Montreal, Canada.
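This is not the model from the paper, but a minimal sketch of what performance prediction with TensorFlow/Keras can look like: a small regression network mapping hypothetical grid and target features to a predicted selection time. All feature names and data below are placeholders.

```python
import numpy as np
import tensorflow as tf

# Placeholder features, e.g. target width, height, row, column, grid size.
X = np.random.rand(1000, 5).astype("float32")
# Placeholder label: selection time in seconds.
y = np.random.rand(1000, 1).astype("float32")

# A small regression network that predicts selection time from the features.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[:3]))  # predicted selection times for three grid targets
```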
-
![[SUI’17] Gaze + Pinch Interaction in Virtual Reality](https://kenpfeuffer.com/wp-content/uploads/2017/10/intro04-1.jpg)
An exploration of 3D eye gaze interactions in VR. The focus is on what capabilities one gains when using gaze with freehand gestures. It’s different from my prior work on 2D touchscreens – less like a traditional input device, more like a supernatural ability! Lots of examples and scenarios, built with low-fidelity prototypes.
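To give a feel for the core principle, here is a minimal, self-contained sketch (not the prototype’s implementation; all types, helpers, and thresholds are hypothetical stand-ins): the eyes pick the target, and the pinch gesture confirms and manipulates it.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    position: tuple        # (x, y, z) in world coordinates
    grabbed: bool = False

def gaze_target(objects, gaze_point, radius=0.1):
    """Return the object closest to the 3D gaze point, if within `radius`."""
    def dist(obj):
        return sum((a - b) ** 2 for a, b in zip(obj.position, gaze_point)) ** 0.5
    candidate = min(objects, key=dist, default=None)
    return candidate if candidate and dist(candidate) <= radius else None

def on_pinch_start(objects, gaze_point):
    """A pinch 'picks up' whatever the user is currently looking at."""
    target = gaze_target(objects, gaze_point)
    if target:
        target.grabbed = True
    return target

def on_hand_move(target, hand_delta):
    """While pinched, the hand (not the eyes) moves the selected object."""
    if target and target.grabbed:
        target.position = tuple(p + d for p, d in zip(target.position, hand_delta))

# Example: look at the cube, pinch to grab it, then drag the hand to move it.
cube = SceneObject("cube", (0.0, 1.0, 2.0))
held = on_pinch_start([cube], gaze_point=(0.02, 1.01, 2.0))
on_hand_move(held, hand_delta=(0.1, 0.0, 0.0))
print(cube.position)  # -> (0.1, 1.0, 2.0)
```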
-
![[PuC] Look Together: Using Gaze for Assisting Co-located Collaborative Search](https://kenpfeuffer.com/wp-content/uploads/2018/02/collab2.jpg)
This work continues the multi-user research and studies user performance in a collaborative search task. It proposes four different ways to represent the gaze cursor to users, ranging from subtle visuals (less noticeable by others) to strong ones (more noticeable by others, but also more distracting).
-
![[UIST’16] Gaze and Touch Interaction on Tablets](https://kenpfeuffer.com/wp-content/uploads/2016/10/gripclasses.jpg)
Another design space exploration for gaze input, this time on how gaze can support touch interaction on tablets. When holding the device, the free thumb is normally limited in reach, but it offers an opportunity for indirect touch input. Here we propose gaze and touch input, where touches are redirected to the gaze target, extending the thumb’s limited reach across the whole screen.
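A minimal sketch of the redirection idea under stated assumptions (the thumb-reachable zone and coordinates are hypothetical, not the paper’s parameters): touches from the holding thumb act at the gaze position instead of the touch position, while other touches stay direct.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float   # touch position in screen pixels
    y: float

def effective_point(touch, gaze_xy, thumb_zone=((0, 0), (300, 600))):
    """Redirect thumb touches to the gaze target; keep other touches direct."""
    (x0, y0), (x1, y1) = thumb_zone   # hypothetical thumb-reachable region (px)
    in_thumb_zone = x0 <= touch.x <= x1 and y0 <= touch.y <= y1
    return gaze_xy if in_thumb_zone else (touch.x, touch.y)

# A thumb tap at (50, 400) acts on the item the user looks at, (900, 200).
print(effective_point(TouchEvent(50, 400), gaze_xy=(900, 200)))   # -> (900, 200)
# A touch outside the thumb zone behaves like ordinary direct touch.
print(effective_point(TouchEvent(700, 300), gaze_xy=(900, 200)))  # -> (700, 300)
```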
-
![[UIST’15] Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze](https://kenpfeuffer.com/wp-content/uploads/2015/10/newintro11.jpg)
Most input devices, be it mouse, pen, or touch, are already extensively investigated and have become a big part of everyday life. Rather than adding a new UI mode, gesture, or additional sensors, how can we make all of them substantially more expressive? Here we explore the idea of Gaze-shifting: using gaze to modulate whether pen and touch input acts directly on the target under the hand or indirectly on the target under the eyes.
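A minimal sketch of that idea, assuming a single pixel-distance threshold (hypothetical, not the paper’s value): input that lands where the user is looking stays direct, while input elsewhere becomes indirect and acts at the gaze position.

```python
def resolve_input(input_xy, gaze_xy, threshold=80.0):
    """Return ('direct', point) or ('indirect', point) depending on whether
    the pen/touch position coincides with the gaze position (in pixels)."""
    dx, dy = input_xy[0] - gaze_xy[0], input_xy[1] - gaze_xy[1]
    if (dx * dx + dy * dy) ** 0.5 <= threshold:
        return "direct", input_xy     # manipulate under the pen/finger itself
    return "indirect", gaze_xy        # redirect the action to the gazed-at target

print(resolve_input((100, 100), (110, 95)))   # -> ('direct', (100, 100))
print(resolve_input((100, 100), (800, 500)))  # -> ('indirect', (800, 500))
```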
![[VR’19] EyeSeeThrough: Seamless Tool Selection and Application](https://kenpfeuffer.com/wp-content/uploads/2019/03/smalllight.png)
![[PhD Thesis] Extending Touch with Eye Gaze](https://kenpfeuffer.com/wp-content/uploads/2018/02/overview7.jpg)
![[CHI’17 / MSR] Thumb + Pen Interaction on Tablets](https://kenpfeuffer.com/wp-content/uploads/2017/05/ms-pic1.png?w=1036)
![[MUM’16] GazeArchers: Playing with Individual and Shared Attention in a Two-Player Look&Shoot Tabletop Game](https://kenpfeuffer.com/wp-content/uploads/2016/12/setup2.png)
![[CHI’16] Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction](https://kenpfeuffer.com/wp-content/uploads/2016/05/techniques.jpg)