Google is experimenting with putting gesture controls everywhere through its Project Soli. The project was announced at last month’s Google I/O 2015, the annual developer conference where the company unveils its latest developer products.
Developed by Google ATAP, the skunkworks division focused on creating new technology for consumers, Project Soli turns the user’s hands and fingers into an interface for devices using radar, which detects objects in motion through high-frequency radio waves.
"A typical model of the way you think about radar is like a police radar or baseball where you just have an object and you measure its speed," explains Project Soli's design lead Carsten Schwesig. "But actually we are beaming out a continuous signal that gets reflected by an arm, for example ... so you measure the differences between the emitted and the received signal. It's a very complex wave signal and from that we can apply signal processing and machine learning techniques to detect gestures."
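The principle Schwesig describes, comparing a continuous emitted signal with its reflection to recover motion, can be sketched in a few lines. The following is a minimal, hypothetical continuous-wave radar simulation (the sample rate, carrier frequency, and Doppler shift are illustrative values, not Soli's actual specifications): a reflected copy of the transmitted tone comes back frequency-shifted by a moving hand, and mixing the two signals exposes that shift as a low-frequency difference tone.

```python
import numpy as np

# Illustrative parameters only -- not Soli's actual operating values.
fs = 1_000_000        # sample rate, Hz
f_tx = 100_000        # transmitted carrier frequency, Hz
f_doppler = 200       # Doppler shift caused by a moving hand, Hz
t = np.arange(0, 0.05, 1 / fs)

tx = np.cos(2 * np.pi * f_tx * t)                      # emitted signal
rx = 0.5 * np.cos(2 * np.pi * (f_tx + f_doppler) * t)  # reflected, shifted copy

# Multiplying (mixing) the signals produces sum and difference frequencies;
# the difference component sits at the Doppler shift itself.
mixed = tx * rx
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)

# Search only the low end of the spectrum, where the difference tone lives.
low = freqs < 1000
measured = freqs[low][np.argmax(spectrum[low])]
print(measured)  # recovers the 200 Hz Doppler shift
```

In a real system the reflected waveform is far messier than a single shifted tone, which is where the signal-processing and machine-learning stages Schwesig mentions come in: they map those complex spectral patterns onto specific finger gestures.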
While companies like Leap Motion and Intel already offer motion controllers, those systems rely on cameras for motion tracking. Here’s a look at Project Soli and what it hopes to bring to the table: