-
Another design space exploration for gaze input. This time it’s about how gaze can support touch interaction on tablets. When holding the device, the thumb of the gripping hand has limited reach, but it offers an opportunity for indirect touch input. We propose combined gaze and touch input, where touches are redirected to the gaze target. This provides…
-
Most input devices, be it mouse, pen, or touch, have already been investigated extensively and have become a big part of everyday life. Rather than adding a new UI mode, a gesture, or additional sensors, how can we make all of them substantially more expressive? Here we explore the idea of Gaze-shifting, using gaze to add an…
-
How can we use the eyes better in existing user interfaces? Here we explore gaze input to complement multi-touch for interaction on the same surface. We present gaze-touch, a technique that combines the two modalities based on the principle of ‘gaze selects, touch manipulates’. Gaze is used to select a target, and coupled with multi-touch…
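The division of labour in gaze-touch can be sketched in a few lines. This is a hypothetical illustration, not code from the paper: the class and method names (`GazeTouchSurface`, `on_touch_down`, and so on) are invented for the example, and target selection is simplified to "object nearest the gaze point".

```python
# Minimal sketch of the 'gaze selects, touch manipulates' principle.
# All names here are illustrative assumptions, not from the gaze-touch paper.

class GazeTouchSurface:
    def __init__(self, objects):
        self.objects = objects   # object name -> (x, y) position on the surface
        self.gaze = (0.0, 0.0)   # latest gaze estimate on the surface
        self.active = {}         # touch id -> name of the gaze-selected object

    def on_gaze(self, x, y):
        self.gaze = (x, y)

    def on_touch_down(self, touch_id):
        # Gaze selects: the target is the object closest to the gaze point,
        # regardless of where the finger actually lands.
        gx, gy = self.gaze
        target = min(self.objects,
                     key=lambda n: (self.objects[n][0] - gx) ** 2 +
                                   (self.objects[n][1] - gy) ** 2)
        self.active[touch_id] = target

    def on_touch_move(self, touch_id, dx, dy):
        # Touch manipulates: finger motion is applied to the gaze-selected
        # object, i.e. the touch is redirected to the gaze target.
        name = self.active[touch_id]
        x, y = self.objects[name]
        self.objects[name] = (x + dx, y + dy)

    def on_touch_up(self, touch_id):
        self.active.pop(touch_id, None)
```

For example, looking near an object in one corner while dragging a finger anywhere on the surface moves that object, leaving the object under the finger untouched.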
-
Fighting one of the biggest problems of eye trackers: calibration! Eye trackers need to be calibrated for each individual user, which makes them impractical for anything outside the lab. Here we propose a moving-target calibration that calibrates users implicitly, reliably, and without them even noticing.
-
Eye-based interaction has commonly been based on estimation of eye gaze direction, to locate objects for interaction. We introduce Pursuits, a novel and very different eye tracking method that instead is based on following the trajectory of eye movement and comparing this with trajectories of objects in the field of view.
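The trajectory-matching idea can be sketched in a few lines: correlate the recent gaze trajectory against each on-screen object's trajectory and pick the best match. The published Pursuits method uses Pearson's correlation; everything else here (function names, the per-axis `min`, the 0.8 threshold, the data layout) is an assumption made for the example.

```python
# Sketch of a Pursuits-style trajectory matcher. Pearson correlation is the
# measure used in the published work; names and thresholds are assumptions.
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    if sd_a == 0 or sd_b == 0:
        return 0.0
    return cov / (sd_a * sd_b)

def match_pursuit(gaze_x, gaze_y, objects, threshold=0.8):
    """Return the object whose trajectory correlates best with the gaze
    trajectory, or None if no correlation exceeds the threshold.
    `objects` maps a name to its (x_series, y_series) trajectory."""
    best, best_corr = None, threshold
    for name, (obj_x, obj_y) in objects.items():
        # Correlate horizontal and vertical components separately and take
        # the smaller one: the eyes must follow the object on both axes.
        corr = min(pearson(gaze_x, obj_x), pearson(gaze_y, obj_y))
        if corr > best_corr:
            best, best_corr = name, corr
    return best
```

The appeal of this formulation is that it needs no absolute gaze position at all, only relative movement, which is also why it lends itself to the implicit calibration described above.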
-
My bachelor thesis project, looking at mid-air pointing on a projection. The unique thing is the mobile projector attached to a phone, following up on the vision of future phones that can project any screen, anywhere! I learned a lot about user studies and Fitts’ Law here, and got interested in interaction design.