[CHI’18 / Google] Analysis and Modeling of Grid Performance on Touchscreen Mobile Devices

At Google in Mountain View, I worked on modelling and predicting user performance using machine learning with TensorFlow. The work also includes an eye-tracking study that provides an in-depth analysis of visual attention on mobile phones. It will be presented in April at the CHI conference in Montreal, Canada. More info coming soon.
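To give a rough flavour of what such performance modelling can look like, here is a toy TensorFlow/Keras regression sketch; the features, data, and network are purely illustrative assumptions and not the model from the paper.

```python
# Toy sketch (not the paper's model): predicting target-selection time on a
# grid UI from a few hypothetical features, using TensorFlow/Keras.
import numpy as np
import tensorflow as tf

# Hypothetical features: grid rows, grid columns, target row, target column.
X = np.random.rand(1000, 4).astype("float32")
# Hypothetical label: selection time in seconds (synthetic, for illustration only).
y = (0.5 + X @ np.array([0.2, 0.2, 0.4, 0.4], dtype="float32")).reshape(-1, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # regression output: predicted selection time
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

print(model.predict(X[:1], verbose=0))  # predicted time for one grid/target
```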


[PhD Thesis] Extending Touch with Eye Gaze

The thesis is finished! It's the product of 4 years of living the life of a lab rat, it's a manual on how to build a gaze-interactive landscape of user interfaces, it's a most (un-)likely vision of a gaze-based future, and it's an Inception-like exploration of a design space of design spaces.

Continue reading “[PhD Thesis] Extending Touch with Eye Gaze”

[PuC] Look Together: Using Gaze for Assisting Co-located Collaborative Search

This work continues the multi-user line of research and studies user performance in a collaborative search task. It proposes four different ways to represent the gaze cursor to the users, ranging from subtle visuals (less noticeable by others) to strong visuals (more noticeable by others, but also more distracting).

Continue reading “[PuC] Look Together: Using Gaze for Assisting Co-located Collaborative Search”

[MUM’16] GazeArchers: Playing with Individual and Shared Attention in a Two-Player Look&Shoot Tabletop Game

A fun project to develop what may well be the most fun game for eye tracking! It's a tabletop UI to which we attached two Tobii EyeX trackers. The game explores some interesting questions: what if both users look at the same target? What if they don't?

Continue reading “[MUM’16] GazeArchers: Playing with Individual and Shared Attention in a Two-Player Look&Shoot Tabletop Game”

[UIST’16] Gaze and Touch Interaction on Tablets

Another design space exploration for gaze input, this time about how gaze can support touch interaction on tablets. When holding the device, the free thumb is limited in reach, but it offers an opportunity for indirect touch input. We propose gaze and touch input, where touches are redirected to the gaze target, as sketched below. This provides whole-screen reachability while using only a single hand for both holding and input.
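As a rough illustration of the core idea, here is a minimal sketch (not the actual implementation from the paper) of how a touch event could be redirected to the gaze position; the coordinates are made up for the example.

```python
# Minimal sketch of the core idea: a thumb touch anywhere on the tablet
# takes effect at the user's gaze position, if the eye tracker has a fix.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def redirect_touch(touch: Point, gaze: Point, gaze_valid: bool) -> Point:
    """Return the position at which the touch should take effect.

    If the eye tracker reports a valid gaze point, the touch is redirected
    there, giving whole-screen reach from a thumb with limited reach.
    Otherwise, fall back to ordinary direct touch.
    """
    return gaze if gaze_valid else touch

# Example: the thumb rests near the bottom-right corner, gaze is at the top-left.
effective = redirect_touch(Point(980, 700), Point(120, 80), gaze_valid=True)
print(effective)  # Point(x=120, y=80) -- the touch acts at the gaze target
```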

Continue reading “[UIST’16] Gaze and Touch Interaction on Tablets”

[UIST’15] Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze

Most input devices, be it mouse, pen, or touch, have been extensively investigated and have become a big part of everyday life. Rather than adding a new UI mode, a gesture, or additional sensors, how can we make all of them substantially more expressive?

Here we explore the idea of Gaze-Shifting: using gaze to add an indirect input mode to existing direct input devices. In essence, any input device can be used in both modes, as shown in the video examples.
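As a rough illustration of the switching rule described above, here is a minimal sketch; the coordinates and the gaze-proximity threshold are assumptions for illustration, not the paper's actual implementation.

```python
# Rough sketch (assumed, not the authors' code): pen/touch input that lands
# where the user is looking stays direct; input that lands away from the gaze
# point is treated as indirect and applied at the gaze position instead.
import math

def gaze_shift(input_x, input_y, gaze_x, gaze_y, radius=100.0):
    """Return ("direct", x, y) or ("indirect", x, y) for one input event.

    `radius` is an assumed threshold (in pixels) around the gaze point
    within which input is considered to coincide with gaze.
    """
    if math.hypot(input_x - gaze_x, input_y - gaze_y) <= radius:
        return ("direct", input_x, input_y)    # manipulate under the pen/finger
    return ("indirect", gaze_x, gaze_y)        # redirect the action to the gaze point

print(gaze_shift(400, 300, 410, 310))   # ('direct', 400, 300)
print(gaze_shift(400, 300, 1200, 200))  # ('indirect', 1200, 200)
```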

Really happy that this work was nominated for a best paper award!

Continue reading “[UIST’15] Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze”
