Product Design
Published: 25 Aug 2016

Soli 2.0: Invisible Interface

Soli: Interaction sensor using radar to motion-track hand gestures

Created by Google’s research lab ATAP, Project Soli utilises radar technology to enable touchless interaction with digital devices. Following its initial launch in 2015, an updated version – Soli 2.0 – was revealed this spring.

Soli is an interaction sensor that uses a miniature radar to translate a set of universal hand gestures into virtual actions, such as tapping finger and thumb together to press a virtual button. Physical feedback occurs naturally through the sensation of the fingers touching. The sensor can accurately track intricate hand movements at sub-millimetre precision and high speed.
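
As a purely illustrative aside, the sketch below shows how a stream of recognised gestures might be translated into virtual actions in software; the gesture names, event structure and handler functions are assumptions made for the example and are not part of Google's Soli SDK.

```python
# Hypothetical sketch: mapping recognised hand gestures to virtual actions.
# Gesture names and the event structure are illustrative assumptions,
# not part of Google's actual Soli API.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    """A single gesture reported by a (hypothetical) radar sensor driver."""
    name: str          # e.g. "finger_tap", "thumb_slide"
    confidence: float  # recogniser confidence, 0.0 to 1.0


def press_button() -> None:
    print("virtual button pressed")


def scroll(amount: int = 1) -> None:
    print(f"scrolled by {amount}")


# Each universal gesture is translated into a virtual UI action.
ACTIONS: Dict[str, Callable[[], None]] = {
    "finger_tap": press_button,        # tapping fingers together = press
    "thumb_slide": lambda: scroll(1),  # sliding thumb along finger = scroll
}


def handle(event: GestureEvent, threshold: float = 0.8) -> None:
    """Dispatch a gesture to its action, ignoring low-confidence detections."""
    action = ACTIONS.get(event.name)
    if action and event.confidence >= threshold:
        action()


if __name__ == "__main__":
    handle(GestureEvent(name="finger_tap", confidence=0.93))
```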

Primarily designed for smartwatches, Soli currently aims to free small screens from digital buttons. The other challenge for Soli 2.0 was to reduce power consumption: the initial version needed an external power source to run, whereas Soli 2.0’s new chip uses less power and works autonomously.

Although voice control is hyped as the next generation of interface control, Google foresees a future where hand gestures become the common mode of operating digital devices. The technology holds vast potential, and Google intends to push Soli into the consumer market and integrate it into other products by encouraging developers to build, test and evolve applications with it.

The use of skin as an interface between the physical and digital worlds is explored, along with further examples of tactile interfaces, in our S/S 2018 Design Direction, Charge.

See also UI/UX Design for Future Consumers, where we look at placing the user at the heart of the design process, and outline some future trajectories for UI and UX design.
