At Google in Mountain View I worked on modelling and prediction of user performance, using machine learning tools such as TensorFlow. The work also includes an eye-tracking study that provides an in-depth analysis of visual attention on mobile phones. It was presented in April at the CHI conference in Montreal, Canada.
[PhD Thesis] Extending Touch with Eye Gaze
The thesis is finished! It’s the proof of 4 years of living the life of a lab-rat, it’s a manual on how to build a gaze-interactive landscape of user interfaces, it’s a most (un-)likely vision of a gaze-based future, and it’s an Inception-like exploration of design spaces within design spaces.
Continue reading “[PhD Thesis] Extending Touch with Eye Gaze”
[SUI’17] Gaze + Pinch Interaction in Virtual Reality
An exploration of 3D eye gaze interactions in VR. The focus is on what kind of capabilities one gains when using gaze together with freehand gestures. It’s different from my prior work on 2D touchscreens – less like a traditional input device, more like a supernatural ability! Lots of examples and scenarios, built with low-fidelity prototypes.
Continue reading “[SUI’17] Gaze + Pinch Interaction in Virtual Reality”
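To give a flavour of the core idea (a simplified sketch with hypothetical names, not the study’s implementation): gaze picks the target, and the pinch gesture grabs and moves it, so objects far out of arm’s reach stay usable.

```python
# Minimal sketch of the gaze + pinch division of labour (hypothetical object
# interface, not the study's code): the eyes point, the hand acts.

class GazePinchController:
    def __init__(self):
        self.grabbed = None  # object currently held by a pinch

    def update(self, gazed_object, pinching, hand_delta):
        """gazed_object: scene object hit by the gaze ray (or None);
        pinching: True while thumb and index finger touch;
        hand_delta: (dx, dy, dz) hand movement since the last frame."""
        if pinching and self.grabbed is None and gazed_object is not None:
            self.grabbed = gazed_object       # pinch "picks up" what the eyes point at
        elif not pinching:
            self.grabbed = None               # release when the pinch ends
        if self.grabbed is not None:
            self.grabbed.move_by(hand_delta)  # hand motion manipulates the gazed target
```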
[CHI’17 / MSR] Thumb + Pen Interaction on Tablets
This paper was the result of an amazing internship at MSR in 2016. We developed lots of interaction concepts using pen and touch modalities, in the typical exploratory design approach of Ken Hinckley & co. The video reached over 30,000 views!
Continue reading “[CHI’17 / MSR] Thumb + Pen Interaction on Tablets”
[PuC] Look Together: Using Gaze for Assisting Co-located Collaborative Search
This work continues the multi-user line of work and studies user performance in a collaborative search task. It proposes four different ways to represent the gaze cursor to the users, ranging from subtle visuals (less noticeable by others) to strong ones (more noticeable by others, but also more distracting).
Continue reading “[PuC] Look Together: Using Gaze for Assisting Co-located Collaborative Search”
[MUM’16] GazeArchers: Playing with Individual and Shared Attention in a Two-Player Look&Shoot Tabletop Game
A fun project to develop a game – possibly the most fun game for eye-tracking yet! It’s a tabletop UI with two Tobii EyeX trackers attached. The game explores some interesting questions: what happens when both players look at the same target? What if they don’t?
[UIST’16] Gaze and Touch Interaction on Tablets
Another design space exploration for gaze input, this time about how gaze can support touch interaction on tablets. When holding the device, the free thumb is normally limited in reach, but it offers an opportunity for indirect touch input. We propose gaze and touch input, where touches are redirected to the gaze target. This provides whole-screen reachability while using a single hand for both holding and input.
Continue reading “[UIST’16] Gaze and Touch Interaction on Tablets”
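A minimal sketch of the redirection idea (simplified, not the paper’s code): the thumb can touch anywhere comfortable near the grip, and the touch takes effect at the point the user is looking at.

```python
# Minimal sketch of gaze-redirected touch (simplified illustration).
# The thumb taps anywhere in reach; the action lands at the gaze point,
# so the whole screen is reachable with the holding hand.

def redirect_touch(thumb_xy, gaze_xy):
    """Return where the touch should act and the offset for later thumb dragging."""
    action_point = gaze_xy                        # the tap/drag starts at the gaze target
    drag_offset = (gaze_xy[0] - thumb_xy[0],      # subsequent thumb movement is applied
                   gaze_xy[1] - thumb_xy[1])      # relative to this offset (indirect input)
    return action_point, drag_offset

# Example: thumb rests near the bottom-right grip, eyes on the top-left corner
print(redirect_touch((980, 1800), (120, 160)))    # -> ((120, 160), (-860, -1640))
```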
[CHI’16] Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction
What is partially indirect? It’s a new way of interacting with two hands: one hand provides direct input via a pen, and the other indirect input via touch & gaze. This paper provides an in-depth study of this combination.
[UIST’15] Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze
Most input devices, be it mouse, pen, or touch, have already been extensively investigated and have become a big part of everyday life. Rather than adding a new UI mode, gesture, or additional sensors, how can we make all of them substantially more expressive?
Here we explore the idea of Gaze-Shifting: using gaze to add an indirect input mode to existing direct input devices. In essence, you can use any input device in both modes, as shown in the video examples.
Really happy that this work was nominated for a Best Paper Award!
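As a rough illustration of the Gaze-Shifting principle (a simplified sketch, not the paper’s implementation): if the pen or finger lands near where the user is looking, it behaves as normal direct input; if the user is looking elsewhere, the same contact acts indirectly at the gaze location instead.

```python
# Minimal sketch of the gaze-shifting idea (simplified illustration).
# The same device switches between direct and indirect mode depending on
# whether the user is looking at the point of contact.

DIRECT_RADIUS_PX = 100  # assumed threshold around the gaze point

def classify_input(contact_xy, gaze_xy):
    """Return the input mode and the point where the input should take effect."""
    cx, cy = contact_xy
    gx, gy = gaze_xy
    looking_at_contact = (cx - gx) ** 2 + (cy - gy) ** 2 <= DIRECT_RADIUS_PX ** 2
    if looking_at_contact:
        return "direct", contact_xy    # manipulate what is under the pen/finger
    return "indirect", gaze_xy         # redirect the input to the gazed-at target

# Example: pen down at (120, 400) while the eyes rest on (900, 300)
print(classify_input((120, 400), (900, 300)))   # -> ('indirect', (900, 300))
```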