Prototype of Project Soli

Project Soli

Google’s Advanced Technology and Projects lab, ATAP, is a hotbed of rapid technological innovation. But for Project Soli, one of the highlights of the ATAP presentation at the 2015 I/O developer conference, the engineering team reached for a 130-year-old technology: radar.

Though the term ‘radar’ wasn’t coined until the run-up to World War II, scientists realized as early as the 1880s that solid objects could reflect radio waves and that the reflections might be useful for detecting and locating things at a distance. Over the last year or so, ATAP engineers have been working on a 21st-century update to this epiphany after realizing that, if a radio chip was small enough, it could be used to sense fine-grained motions of the hand and fingers in mid-air — and might allow wearables and other devices to dispense once and for all with the stylus, the button, and even the touchscreen.

“What we propose instead is that we use your hand motion vocabulary, the familiar hand motions you already learn from tools you’re using every day, for the interaction,” said project lead Ivan Poupyrev on the I/O stage. “There’s a broad vocabulary of motions which can be created by your hand.”

Instead of dragging your thumb against a touchscreen to scroll a map, you could drag it against your index finger. Instead of tapping a screen or a button, you could tap the tips of your finger and thumb together. Instead of flicking a screen to scroll through a menu, you could flick your thumb against a fingertip. And unlike infrared or camera-based systems for gesture input that require waving your hands through empty air, Poupyrev said, these gestures naturally provide their own haptic feedback.
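To make the substitution concrete, here is a minimal sketch of how such mid-air gestures could be routed to the same handlers an app already uses for touch input. The gesture names and the dispatch interface are invented for illustration; none of this comes from an announced Soli API.

```python
# Hypothetical routing of Soli-style micro-gestures to the touch actions they
# would replace. Gesture names and the handler interface are invented for
# illustration only.
from typing import Callable, Dict

GESTURE_TO_ACTION = {
    "thumb_slide_on_index": "scroll",     # instead of dragging a thumb on a touchscreen
    "finger_thumb_tap": "tap",            # instead of tapping a screen or button
    "thumb_flick_on_fingertip": "flick",  # instead of flicking through a menu
}

def dispatch(gesture: str, handlers: Dict[str, Callable[[], None]]) -> None:
    """Route a recognized mid-air gesture to the handler its touch equivalent would use."""
    action = GESTURE_TO_ACTION.get(gesture)
    if action in handlers:
        handlers[action]()

# An app could reuse the handlers it already has for touch input.
handlers = {
    "scroll": lambda: print("scrolling the map"),
    "tap": lambda: print("button pressed"),
    "flick": lambda: print("next menu item"),
}
dispatch("finger_thumb_tap", handlers)  # prints "button pressed"
```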

To do this, the project team had to first shrink a broad-beam radar emitter and receiver down to fit on a chip that could be included in a wearable device — no small task in itself. Then the team had to standardize its gestures and come up with algorithms that could recognize them based on the complex reflections and scatterings of the radar signal bouncing off a user’s moving fingers.
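Google hasn’t detailed those recognition algorithms, but the broad shape of the problem — turning short windows of reflected signal into a gesture label — can be sketched. The feature choices, the nearest-template classifier, and the gesture names below are assumptions for illustration, not Soli’s actual pipeline.

```python
# Illustrative sketch of a radar gesture-recognition loop, not Soli's actual
# algorithm: window the reflected signal, reduce each window to a few features,
# and match it against templates learned from labeled recordings.
import numpy as np

GESTURES = ["thumb_slide_on_index", "finger_thumb_tap", "thumb_flick_on_fingertip"]

def features(window):
    """Collapse a (frames x channels) window of reflections into a feature vector.
    The features used here (energy, spread, frame-to-frame change) are assumptions."""
    return np.array([
        np.mean(np.abs(window)),                   # overall reflected energy
        np.std(window),                            # spread of the return
        np.mean(np.abs(np.diff(window, axis=0))),  # how quickly the return changes
    ])

def classify(window, templates):
    """Nearest-template matching stands in for whatever model Soli really uses."""
    f = features(window)
    return min(templates, key=lambda g: np.linalg.norm(f - templates[g]))

# Hypothetical templates that would come from offline training on labeled gestures.
rng = np.random.default_rng(0)
templates = {g: rng.random(3) for g in GESTURES}
print(classify(rng.random((64, 8)), templates))  # prints one of the gesture labels
```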

Google’s Project Soli puts touchless gesture control at your fingertips with tiny radar chip

The result, at least in the demos performed onstage, is impressive. Small, familiar gestures seemed to translate into fine control over virtual knobs, sliders and other on-screen inputs. The chip can even use the hand’s distance from the device to interpret context: in a demo watch app, holding the hand about 5 inches away set the hour, while raising it to about 7 inches switched over to setting the minutes.
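That distance-based mode switch is straightforward to picture in code. A minimal sketch follows, assuming the sensor reports hand distance in inches and using a cutoff between the demo’s 5-inch and 7-inch positions; the function names are invented.

```python
# Sketch of the distance-based context switch from the watch demo: how far the
# hand is from the device decides whether a dial gesture adjusts hours or minutes.
# The sensing interface and the exact cutoff between the 5" and 7" positions are assumptions.
def mode_for_distance(distance_in):
    """Pick the edit mode from the hand's distance (in inches) from the device."""
    return "set_hour" if distance_in < 6.0 else "set_minute"

def apply_dial(hour, minute, distance_in, steps):
    """Apply a virtual-knob gesture of `steps` increments in the current mode."""
    if mode_for_distance(distance_in) == "set_hour":
        hour = (hour + steps) % 24
    else:
        minute = (minute + steps) % 60
    return hour, minute

print(apply_dial(9, 30, distance_in=5.0, steps=2))   # hand close: hours change -> (11, 30)
print(apply_dial(9, 30, distance_in=7.0, steps=15))  # hand raised: minutes change -> (9, 45)
```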

Soli still needs regulatory approval, but the plan is to set it loose among the developer community as soon as possible — along with an API that will let developers access everything from the raw data up to gesture-level interpretations. But there are several unanswered questions about the project at this stage. One is how developers will converge on consistent uses for each of the gestures from app to app and from device to device. Another is how susceptible Soli is to interference or accidental triggering. And most important, especially for wearables: How will it impact battery life?
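The API itself hasn’t been published, so any code is speculative, but “everything from the raw data up to gesture-level interpretations” suggests a layered, subscription-style interface. A purely hypothetical sketch of what that could look like:

```python
# Purely hypothetical sketch of a layered, subscription-style interface in the
# spirit of "raw data up to gesture-level interpretations". None of these class
# or method names come from an announced Soli SDK.
class SoliSensor:
    LEVELS = ("raw", "features", "gestures")

    def __init__(self):
        self._subscribers = {level: [] for level in self.LEVELS}

    def subscribe(self, level, callback):
        """Register a callback at one of the processing levels."""
        self._subscribers[level].append(callback)

    def publish(self, level, payload):
        """Called by the (hypothetical) sensor driver as data flows through the pipeline."""
        for callback in self._subscribers[level]:
            callback(payload)

sensor = SoliSensor()
# A gesture-driven app can ignore the signal processing entirely...
sensor.subscribe("gestures", lambda g: print(f"gesture: {g}"))
# ...while a research tool might tap the raw reflections instead.
sensor.subscribe("raw", lambda frame: print(f"raw frame with {len(frame)} samples"))

sensor.publish("gestures", "finger_thumb_tap")
sensor.publish("raw", [0.0] * 64)
```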

To learn more about Project Soli, watch Poupyrev’s full I/O presentation — or check out the video below to get a quick overview and see some concepts for how radar-controlled smart gadgets could work.

Previous I/O 2015 coverage: Google reveals Brillo OS and Weave connectivity schema for IoT devices, Google’s Project Jacquard brings touch to textiles

By Ted Burnham, Professional Combobulator

Video
