WHY THIS MATTERS IN BRIEF
Today’s computer interfaces are still limiting, even with the advent of voice and touch, so HyperSurfaces is setting them free.
In the future one thing is certain: as technology proliferates around us, the keyboard and mouse will slowly die out and we will all be interfacing with technology in new ways – whether it’s telepathically, via neural interfaces like the ones Facebook and friends are developing, or using simpler means such as gestures and voice, something known as behavioural computing. But as technology finds its way into everything from windows to clothing, I can’t help feeling that the interfaces we have in the labs today are still, to some degree, missing the point. After all, wouldn’t it be nice to interact with the devices and gadgets around us, from our self-driving cars to our smart home hi-fis, using whatever interface is most convenient – whether it’s a book, a wall, or even a piece of furniture?
And this is precisely what a new innovation from Bruno Zamborlin and his company HyperSurfaces delivers. HyperSurfaces taps into the power of Artificial Intelligence (AI) and machine learning to “turn any object of any material, shape and size” into a user interface.
See it in action
Imagine a wooden kitchen table that can be used to control lighting or room temperature, a floor that’s able to determine if the intruder in your house is just the cat or a would-be thief, the surface of a door transformed into one big interface or the inner surface of a car door acting as a button-free control panel. These are some of the examples offered by the HyperSurfaces system.
The company’s technology works by combining vibration sensing with neural network algorithms running on dedicated microchips.
“Every time we interact with an object, we create a distinctive vibration pattern which dedicated sensors, coupled with our patented algorithms, can transform into digital commands,” said Zamborlin, who heads an international development team split between London and Los Angeles.
All of the data processing is undertaken in real time on the chip itself, meaning that once a use case model is loaded onto the system-on-chip, it can work without needing to access external systems such as data processing in the cloud.
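To make the idea concrete, here is a minimal, purely illustrative sketch of the general pipeline the company describes – a vibration waveform reduced to features and fed through a small neural network that maps it to a command. None of this is HyperSurfaces’ actual code, model, or API; the feature choices, weights, and command names are invented for the example, and a real system would run a trained model on a dedicated microchip rather than hand-picked weights in Python.

```python
# Hypothetical sketch of vibration-to-command classification, entirely
# on-device: raw samples -> feature vector -> tiny dense layer -> command.

COMMANDS = ["knock", "swipe", "tap"]  # invented labels for the demo

def extract_features(samples):
    """Reduce a raw vibration waveform to three simple features:
    mean absolute amplitude, peak amplitude, and zero-crossing rate."""
    n = len(samples)
    mean_abs = sum(abs(s) for s in samples) / n
    peak = max(abs(s) for s in samples)
    zero_crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    ) / (n - 1)
    return [mean_abs, peak, zero_crossings]

def classify(features, weights, biases):
    """One dense layer plus argmax: a stand-in for the on-chip network."""
    scores = [
        sum(w * f for w, f in zip(row, features)) + b
        for row, b in zip(weights, biases)
    ]
    return COMMANDS[scores.index(max(scores))]

# Toy weights chosen by hand; a real model would be trained on labelled
# vibration recordings for the specific surface, material, and use case.
WEIGHTS = [
    [0.2, 1.5, -2.0],   # "knock": sharp peak, few zero crossings
    [1.0, -0.5, 3.0],   # "swipe": sustained amplitude, many crossings
    [0.5, 0.5, 0.5],    # "tap": middling everything
]
BIASES = [0.0, 0.0, -0.2]

# A synthetic "knock": one sharp spike followed by a brief rebound.
knock = [0.0] * 20 + [0.9, -0.7, 0.3] + [0.0] * 20
print(classify(extract_features(knock), WEIGHTS, BIASES))  # -> knock
```

Because the whole chain is a handful of arithmetic operations, it is easy to see why inference like this can run on a system-on-chip without any cloud round-trip, which is the property the company emphasises.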
“HyperSurfaces aims to revolutionise the way we live, blending the data world within any object around us,” the company stated in a press release. “Consumer electronics, IoT, retail, transportation, augmented reality, smart facilities, all these domains can potentially be changed forever.”
Development of the system continues, but you can see what the new interface has to offer, and its potential, in the video above.