Blog

  • Prototyping in Virtual Reality is difficult – and even more so in Augmented and Mixed Reality (MR), where the physical and virtual worlds merge. In this project, we propose a novel approach and system to rapidly create Mixed Reality experiences, for anyone with a typical MR headset. Called SpatialProto, the system allows users to…

    Read more

  • Ever wondered whether, with AR glasses, you could simply look out your window at nature, and augmented information would appear only when you really want it? For example, as I gaze curiously at a particular mountain far away, its name and height subtly appear. This question underlies this research, which focuses on how our eyes…

    Read more

  • We look at a menu before we interact with it – which is what we aim to exploit in this project via eye tracking. In VR 3D design tools, users often hold a menu in one hand, while the other hand is reserved for drawing with a pen. To change something in a menu, we…

    Read more

  • We explored a mechanism to authenticate users in VR effortlessly, in the background. This is becoming increasingly important given the spread of head-mounted devices in AR, VR, MR, and XR, and the never-ending effort of authenticating for every session and use of a device. Of course, the headset itself could…

    Read more

  • In one of my last projects at Lancaster, I worked with Diako Mardanbegi et al. on a novel class of interaction technique for VR. Many VR applications require users to change a mode of a 3D object, e.g., to change the color of a piece of virtual furniture. This project showed a seamless way to do…

    Read more

  • At Google in Mountain View, I worked on modelling and predicting user performance using machine learning tools (TensorFlow). The work also includes an eye-tracking study that provides an in-depth analysis of visual attention on mobile phones. It was presented in April at the CHI conference in Montreal, Canada.

    Read more

  • The thesis is finished! It is the proof of four years of living the life of a lab rat, a manual on how to build a gaze-interactive landscape of user interfaces, a most (un)likely vision of a gaze-based future, and an Inception-like exploration of a design space of design spaces.

    Read more

  • An exploration of 3D eye-gaze interactions in VR, focusing on the capabilities one gains when combining gaze with freehand gestures. It is different from my prior work on 2D touchscreens – less a traditional input device, more a supernatural ability! Lots of examples and scenarios, built with low-fidelity prototypes.

    Read more

  • This paper was the result of an amazing internship at MSR in 2016. We developed many interaction concepts using pen and touch modalities, following the typical exploratory design approach of Ken Hinckley & co. The video reached over 30,000 views!

    Read more

  • This work continues the multi-user research and studies user performance in a collaborative search task. It proposes four different ways to represent the gaze cursor to users, ranging from subtle visuals (less noticeable by others) to strong visuals (more noticeable by others, but also more distracting).

    Read more