Introducing “Eye-Hand Symbiosis”, an interaction paradigm in which the eyes and hands form a symbiotic relationship in the user interface, enabling a way of human-computer interaction that extends beyond each individual modality toward a wide range of novel interactive capabilities.
One of the most evolved relationships in our body exists between the eyes and the hands. We can intuitively coordinate what we see with what we touch, hold, and manipulate with our hands. This is not an easy feat, as each organ in itself performs highly complex movements, often for seemingly distinct purposes. Our hands are highly expressive and flexible and can learn skills and gestures, whereas our eyes indicate intent, attention, and interest, effortlessly and instantly. Combined, they form the basis for powerful human abilities to experience and manipulate the world around us.
In the history of Human-Computer Interaction (HCI), major interfaces are based on the coordination of these two organs. The interactions follow a clear division of labour between the two: the hands provide commands to the computer via input devices, and the eyes take the role of perceiving the visual output on a screen. This long-lasting paradigm has transcended the eras of computing from command-line interfaces to desktop and mobile devices, arguably the most successful and widely available human-computer systems of our times.
Yet, with increasing advances in computing technology and many innovations across research and industry, a new type of human-computer interaction is emerging on the horizon that has the potential to transform the way we have been interacting all along. The particular technological advance we are witnessing is the increasingly usable, accurate, and secure ability to sense what our eyes are focusing on via eye-tracking sensors. Before the millennium, eye tracking was primarily a means to understand how our eyes work in psychology and medicine. Since then, various digital devices have been enhanced with eye-tracking sensors, and in the future these may extend to contact lenses, glasses, smartphones, and in principle any computing device.
Integrated at a pervasive scale across the entire landscape of digital devices we use every day, this will make it possible to involve both the eyes and the hands actively in the interaction with computers, something researchers are beginning to explore.
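To make the idea of active eye-hand interaction more concrete, here is a minimal sketch of one possible division of labour: gaze selects the target, and a hand gesture confirms the action. All names and types below are illustrative assumptions, not a real eye-tracking API.

```python
# Hypothetical sketch of gaze-plus-gesture input: the eyes indicate
# *where* to act, the hand indicates *what* to do. Purely illustrative.
from dataclasses import dataclass
from typing import Optional


@dataclass
class GazeSample:
    """A single gaze point in screen coordinates (assumed units: pixels)."""
    x: float
    y: float


@dataclass
class Widget:
    """An on-screen element with an axis-aligned bounding box."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, gaze: GazeSample) -> bool:
        return self.x0 <= gaze.x <= self.x1 and self.y0 <= gaze.y <= self.y1


def dispatch(gaze: GazeSample, gesture: str, widgets: list) -> Optional[str]:
    """Combine modalities: the gazed-at widget becomes the target of the
    hand gesture. Returns a command description, or None if the user is
    not looking at any widget."""
    for widget in widgets:
        if widget.contains(gaze):
            return f"{gesture} on {widget.name}"
    return None
```

In this sketch the hand never has to travel to the target at all, which is exactly the kind of effort reduction the scenario below plays with; a real system would additionally need fixation filtering and calibration, which are omitted here.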
A (not so serious) scenario for a smartphone
To illustrate how the interaction paradigm could play out in practice, and what potential advantages arise, here is an example dialogue between Kate, a mother, and her son John, about a first experience with the future technology on their smartphone.
Kate: Oh my god, this Instagram feed is endless, my thumb feels like it’s on fire! I almost can’t move it anymore… But I’ve just got to finish the stories of today…
John: Mum, that’s what I was telling you all year! Just use the new eye-hand features. It makes it so much easier.
Kate: No thanks, I told you I don’t want my phone to track even my eyes.
John: Come on, it already knows what you’re looking at… With all the new pictures and clips you’re touching, there is literally no difference. Just try it once, here, take my phone.
Kate: Ok, just once… Oh, it seems to know when I want to swipe and does it for me. That’s kind of smart. Haha, if I had that earlier, my hands probably wouldn’t be so stiff all the time… But sometimes I still have to swipe by myself? I guess technology isn’t perfect.
John: So it’s using that AI tech, you can set it up to do much more, and faster. My friends at the club use it all the time. But for me it was too much, it sometimes did things that I did not want. Just go to settings and change the options.
Kate: It’s doing it – wow, now that is cool! This is almost no work at all. But I see… When I look slightly differently from usual, the system reacts weirdly… I mean, it’s so much better than before… But I do wonder, if it knows where I look, why not just place a button at the bottom that I can trigger with my eyes explicitly…
John: Oh that’s actually a new experimental feature that does that. It’s a beta, so you have to use the other app.
Kate: It does? Okay, that’s what I’m talking about! For me, this is the perfect balance, as I’m in control but have no effort at all for my fingers! Love it. Too bad it only works for Instagram… I mean, I could use this for work, texting with my friends…
John: Well, there are actually rumours of a new type of smartphone coming up that is said to have a whole new operating system optimised for this kind of stuff, they call it Eye-Hand Symbiosis. Some of the nerds at school say that this is gonna completely change how we use computers at large…