
Can’t find your pen? Then just think about it—the shape, how it feels in your hand and the colour of the ink you want—and your 3D printer will print it for you. Last year a Chilean enterprise, Thinker Thing, developed software called Emotional Evolutionary Design that interprets the designs you imagine and helps 3D print those objects.
The University of Minnesota has created a quadcopter controlled by your mind: put on the cap, think left and it goes left, think up and it goes up. Pretty amazing!
Whilst these may be a little way off from changing our daily lives, development of natural user interfaces (NUIs) is well underway in both the commercial and medical worlds. A NUI is simply the adaptation of technology to human behaviour such as voice, gesture, eye or body movement and, yes, even thought control. The ambition is to create a seamless experience between the physical and digital worlds we now inhabit.
Some everyday NUIs already exist, such as Siri, Kinect and Leap Motion, but for a number of reasons they have not yet become all-pervasive. Nonetheless, large companies such as Apple, Microsoft, Samsung and Google, to name but a few, are charging ahead with development.
Consumers' desire to waggle their fingers, wave their hands and so on in order to interact with whatever they are doing in the digital universe is also growing.
According to the Asian Human Interface Index, 56 per cent of respondents would like to interact with devices through air motion gestures, and 67 per cent through voice commands.
There is also significant activity around NUI in the startup community. For example, in 2013 Google acquired Flutter, a company whose ambition, in its own words, is “to power the eyes of our devices”. Its premise is that you can control any interaction with a wave of the hand. What makes it clever and unique is that it uses the camera in your device or your webcam to do this.
There are some who believe that whilst NUIs are without doubt the future, humans remain tactile and the sensation of touch is important. Redux Labs, a haptics startup, shares this view and is building screens that create "bending waves" to deliver sensation at specific points in the user experience. This can create a virtual button or a scroll bar that you can feel under your finger.
So what is my point? Each new wave of technology blurs the physical and digital divide in our behaviour while also changing our expectations of how we can interact. Each advance creates new opportunities for brands to engage consumers.
In the Fiat Live Store you connect to the site and a real person wearing a head cam becomes your personal avatar, taking you on a guided tour of the car based on your questions and requirements. Once connected, everything is voice- and gesture-controlled. It is a revolutionary way to view a car without actually going to a showroom.
Now imagine that haptic technology were embedded in whatever device you were using: you could have the sensation of running your hand over the cloth or leather of the seat or dashboard whilst chatting away with your avatar. That adds a whole new dimension. Why go to a physical store when you can see, feel and buy a product 100 per cent digitally and have it delivered directly to your home?
NUIs are currently all single-stream. Uptake will really accelerate when they become relevant to everyone on a mass scale. Think about how often you might be speaking to someone, typing and thinking of something else all at the same time. Being able to type, open a file with a gesture and say the name of a song that then starts playing is when NUIs will align with our true behaviour.
Kristian Barnes is CEO, Vizeum Asia-Pacific