[Ubicomp’15 Demo] A collaborative gaze aware information display

Building a system and app to showcase some really cool possibilities of using two eye trackers. Together, users can search for hotels on a large map – both users' gaze points are indicated on the display in order to support their teamwork.

Continue reading “[Ubicomp’15 Demo] A collaborative gaze aware information display”


[INTERACT’15] Gaze+touch vs. touch: what’s the trade-off when using gaze to extend touch to remote displays?

An early study I conducted to better understand how gaze+touch and direct touch compare. The early system made the comparison a bit difficult, but overall we got good indications of user performance: gaze+touch performs worse for dragging, but is quite comparable for scaling and rotation tasks.

Continue reading “[INTERACT’15] Gaze+touch vs. touch: what’s the trade-off when using gaze to extend touch to remote displays?”

[UIST’14] Gaze-touch: combining gaze with multi-touch for interaction on the same surface

How can we use the eyes better in existing user interfaces? Here we explore gaze input to complement multi-touch for interaction on the same surface. We present gaze-touch, a technique that combines the two modalities based on the principle of ‘gaze selects, touch manipulates’. Gaze is used to select a target, and coupled with multi-touch gestures that the user can perform anywhere on the surface.
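To make the principle concrete, here is a minimal sketch of how 'gaze selects, touch manipulates' could be wired up. This is illustrative only, not the actual system: the surface class, target methods (`contains`, `translate`) and the `get_gaze_point` callback are placeholder names.

```python
# Minimal sketch of 'gaze selects, touch manipulates' (hypothetical API names).

class GazeTouchSurface:
    def __init__(self, targets, get_gaze_point):
        self.targets = targets                # objects with .contains(x, y) and .translate(dx, dy)
        self.get_gaze_point = get_gaze_point  # callable returning the current (x, y) gaze estimate
        self.selected = None

    def on_touch_down(self, touch_x, touch_y):
        # Selection is indirect: the target under the *gaze*, not under the finger.
        gx, gy = self.get_gaze_point()
        self.selected = next((t for t in self.targets if t.contains(gx, gy)), None)

    def on_touch_move(self, dx, dy):
        # Manipulation is driven by the touch gesture, wherever the hand rests on the surface.
        if self.selected is not None:
            self.selected.translate(dx, dy)

    def on_touch_up(self):
        self.selected = None
```

The key design choice this illustrates is that the touch point itself never determines the target, so the hands can stay in a comfortable resting position while the eyes do the pointing.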

Continue reading “[UIST’14] Gaze-touch: combining gaze with multi-touch for interaction on the same surface”

[UIST’13] Pursuit calibration: making gaze calibration less tedious and more flexible

Fighting one of the biggest problems of eye trackers: calibration! Eye trackers need to be calibrated for each individual user, which makes them hardly usable for anything outside the lab. Here we propose a moving-target calibration that allows users to be calibrated implicitly, reliably, and without them even noticing.
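A rough sketch of the idea, not the paper's implementation: while an on-screen target moves along a smooth trajectory, record raw gaze and target positions, check via correlation that the user is actually following the target, and only then fit a mapping from raw gaze to screen coordinates. Function names, the affine model and the threshold below are illustrative assumptions.

```python
import numpy as np

def pursuit_calibration(raw_gaze, target_pos, corr_threshold=0.8):
    """Fit a linear mapping from raw gaze to screen coordinates.

    raw_gaze, target_pos: (N, 2) arrays sampled over the same time window
    while an on-screen target moves smoothly. Samples are only used if the
    eyes actually followed the target, detected here via per-axis Pearson
    correlation (threshold is illustrative).
    """
    rx = np.corrcoef(raw_gaze[:, 0], target_pos[:, 0])[0, 1]
    ry = np.corrcoef(raw_gaze[:, 1], target_pos[:, 1])[0, 1]
    if min(rx, ry) < corr_threshold:
        return None  # user was not following the target; keep collecting

    # Least-squares affine fit: target ≈ [raw_gaze, 1] @ M
    A = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])
    M, *_ = np.linalg.lstsq(A, target_pos, rcond=None)
    return M  # later: screen_xy = np.array([gx, gy, 1.0]) @ M
```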

Continue reading “[UIST’13] Pursuit calibration: making gaze calibration less tedious and more flexible”

[CHI’13 Demo] Pursuits: eye-based interaction with moving targets

Eye-based interaction has commonly been based on estimation of eye gaze direction, to locate objects for interaction. We introduce Pursuits, a novel and very different eye tracking method that instead is based on following the trajectory of eye movement and comparing this with trajectories of objects in the field of view.
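In sketch form, the matching could look like the following (an illustration of the trajectory-correlation idea, not the deployed code): correlate the recent gaze trajectory with each moving object's trajectory and select the best match above a threshold. The window length and threshold are illustrative.

```python
import numpy as np

def pursuits_select(gaze, object_trajectories, threshold=0.8):
    """Pick the moving object the user is following, if any.

    gaze: (N, 2) recent gaze samples; object_trajectories: dict mapping an
    object id to its (N, 2) on-screen positions over the same window.
    No calibration is needed because only the *shape* of the movement is
    compared, via per-axis Pearson correlation.
    """
    best_id, best_corr = None, threshold
    for obj_id, traj in object_trajectories.items():
        cx = np.corrcoef(gaze[:, 0], traj[:, 0])[0, 1]
        cy = np.corrcoef(gaze[:, 1], traj[:, 1])[0, 1]
        corr = min(cx, cy)
        if corr > best_corr:
            best_id, best_corr = obj_id, corr
    return best_id  # None if nothing correlates strongly enough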

Continue reading “[CHI’13 Demo] Pursuits: eye-based interaction with moving targets”

[ITS’12] Investigating mid-air pointing interaction for projector phones

My bachelor thesis project, looking at mid-air pointing on a projection. The unique part is the mobile projector attached to a phone, following up on the vision of future phones that can project a screen anywhere! I learned a lot about user studies and Fitts' Law here, and got interested in interaction design.

Continue reading “[ITS’12] Investigating mid-air pointing interaction for projector phones”
