- ![[UIST’14] Gaze-touch: combining gaze with multi-touch for interaction on the same surface](https://kenpfeuffer.com/wp-content/uploads/2014/10/gazetouch.jpg)
How can we use the eyes better in existing user interfaces? Here we explore gaze input as a complement to multi-touch for interaction on the same surface. We present gaze-touch, a technique that combines the two modalities based on the principle of ‘gaze selects, touch manipulates’: gaze selects a target, and multi-touch gestures manipulate it, so the fingers can act on the gaze-selected target from anywhere on the surface without occluding it.
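The division of labor can be sketched in a few lines. This is a minimal, hypothetical event loop, not the paper's implementation: `gaze_select` resolves the target under the gaze point, and `touch_manipulate` routes the touch drag to that target no matter where the fingers land.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float
    y: float
    w: float = 100.0
    h: float = 100.0

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def gaze_select(targets, gaze_x, gaze_y):
    """Gaze selects: return the target under the gaze point, if any."""
    for t in targets:
        if t.contains(gaze_x, gaze_y):
            return t
    return None

def touch_manipulate(target, dx, dy):
    """Touch manipulates: apply the touch drag to the gaze-selected
    target, regardless of where on the surface the fingers are."""
    if target is not None:
        target.x += dx
        target.y += dy

targets = [Target("photo", 400, 300), Target("map", 50, 50)]
selected = gaze_select(targets, gaze_x=430, gaze_y=350)  # user looks at "photo"
touch_manipulate(selected, dx=-20, dy=10)                # fingers drag elsewhere
```

The key point the sketch illustrates: the touch coordinates never have to coincide with the target, which is what frees the technique from direct-touch occlusion.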
- ![[UIST’13] Pursuit calibration: making gaze calibration less tedious and more flexible](https://kenpfeuffer.com/wp-content/uploads/2013/10/pursitcal.jpg)
Fighting one of the biggest problems of eye trackers: calibration! Eye trackers need to be calibrated for each individual user, which makes them impractical for anything outside the lab. Here we propose a moving-target calibration that calibrates users implicitly and reliably, without them even noticing.
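The core idea can be illustrated with a toy sketch (my own illustration, not the paper's code): while an on-screen target moves along a known path and the eyes smoothly pursue it, each time step yields a (raw gaze, true target) pair, and a least-squares fit over those pairs recovers the calibration mapping, here a per-axis linear model.

```python
def fit_axis(raw, true):
    """Least-squares fit of true ~ a * raw + b for one screen axis."""
    n = len(raw)
    mean_raw = sum(raw) / n
    mean_true = sum(true) / n
    var = sum((r - mean_raw) ** 2 for r in raw)
    cov = sum((r - mean_raw) * (t - mean_true) for r, t in zip(raw, true))
    a = cov / var
    b = mean_true - a * mean_raw
    return a, b

# Known x-positions of the moving target over one pursuit episode,
# and hypothetical raw (uncalibrated) tracker output at the same times.
target_x = [100, 200, 300, 400, 500]
raw_x = [55, 105, 155, 205, 255]

a, b = fit_axis(raw_x, target_x)
calibrated_x = [a * r + b for r in raw_x]
```

A real system would first check that the gaze actually follows the target (e.g. by correlating the two trajectories) before accepting the pairs as calibration data, which is what makes the procedure implicit and robust.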
- ![[CHI’13 Demo] Pursuits: eye-based interaction with moving targets](https://kenpfeuffer.com/wp-content/uploads/2013/05/pursiits.jpg)
Eye-based interaction has commonly relied on estimating gaze direction to locate objects for interaction. We introduce Pursuits, a novel and very different eye tracking method that instead follows the trajectory of eye movement and compares it with the trajectories of objects in the field of view.
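A minimal sketch of that trajectory comparison, under the assumption that Pearson correlation is used as the similarity measure: the gaze trace over a time window is correlated against each object's known trajectory, and the best match above a threshold is selected. Names and trajectories here are made up for illustration, and only one axis is shown; a real system correlates both x and y.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def match_object(gaze_trace, objects, threshold=0.8):
    """Return the object whose trajectory best correlates with the
    gaze trace, if the correlation exceeds the threshold."""
    best, best_r = None, threshold
    for name, traj in objects.items():
        r = pearson(gaze_trace, traj)
        if r > best_r:
            best, best_r = name, r
    return best

# Hypothetical x-trajectories of two on-screen objects over one window.
objects = {
    "circle_clockwise": [0, 10, 20, 30, 40, 50],
    "circle_counter": [50, 40, 30, 20, 10, 0],
}
gaze = [2, 11, 19, 31, 42, 49]  # noisy pursuit of the clockwise circle
```

Because only relative movement matters, this kind of matching works even with uncalibrated gaze estimates, which is what ties Pursuits to the calibration work above.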
- ![[ITS’12] Investigating mid-air pointing interaction for projector phones](https://kenpfeuffer.com/wp-content/uploads/2012/09/projector.jpg)
My bachelor thesis project, looking at mid-air pointing on a projection. The unique thing is the mobile projector attached to a phone, following up on the vision of future phones that can project any screen, anywhere! I learned a lot about user studies and Fitts’ law here, and became interested in interaction design.
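For readers unfamiliar with Fitts’ law: it predicts movement time for a pointing task from target distance D and width W via the index of difficulty ID = log2(D/W + 1). A tiny sketch, with made-up regression constants purely for illustration (in practice a and b are fit to each user's measured pointing data):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' law: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a=0.1, b=0.15):
    """MT = a + b * ID. The constants a, b here are illustrative
    placeholders, normally obtained by linear regression on study data."""
    return a + b * index_of_difficulty(distance, width)

id_easy = index_of_difficulty(distance=128, width=128)  # 1.0 bit
id_hard = index_of_difficulty(distance=960, width=32)   # ~4.95 bits
```

In a pointing study like this one, each distance/width condition gives one ID, and plotting measured movement times against ID yields the regression line whose slope reflects the input device's efficiency.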
- ![[Ubicomp’15 Demo] A collaborative gaze aware information display](https://kenpfeuffer.com/wp-content/uploads/2015/09/collab.jpg)
- ![[INTERACT’15] Gaze+touch vs. touch: what’s the trade-off when using gaze to extend touch to remote displays?](https://kenpfeuffer.com/wp-content/uploads/2015/08/gaze-mt.jpg)