If you recall, the Google Pixel 4 series featured gesture controls powered by the Soli radar chip, which detects hand motions using radar. The Soli sensor was created by Google’s Advanced Technology & Projects group, also known as ATAP. The group has now demonstrated more possible applications for the Soli sensor in future products.
Google ATAP Soli Project:
The group stated its goal “to create ambient, socially intelligent devices that are controlled by the wave of a hand or turn of the head”. For this, Google employed the same Soli sensor in the new research. However, rather than using it to control a device such as a smartphone, the ATAP group used the sensor to recognize our everyday activities.
In a documentary shared on YouTube, the researchers revealed how they used a mesh of Soli sensors placed within a device to monitor people’s movement. These “ambient, socially aware devices” use radar and machine learning, including deep learning, to learn what human behaviors signify.
Google ATAP Head of Design Leonardo Giusti says, “We believe as technology becomes more present in our life, it’s fair to start asking technology itself to take a few more cues from us”.
He added, “We’re inspired by how people interact with one another. As humans, we understand each other intuitively, without saying a single word. We pick up on social cues, subtle gestures that we innately understand and react to”.
As per Giusti, this research is based on proxemics—the study of how people use the space around them and the distance they keep from one another in social situations. The ATAP system investigated these and other social cues in order to create gadgets that understand and react to human activities.
Google’s own Nest Hub Max already detects when a person is approaching and wakes up to highlight reminders, calendar events, or other notifications. However, there can be times when you pass the device while looking in an entirely different direction.
For such scenarios, the Soli-powered gadgets can recognize cues like body orientation, the pathway you might be taking, and the direction your head is facing. The collected data is then processed by machine learning algorithms to determine whether or not the person really wants to interact with the device.
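Google hasn’t published the actual model, but the idea of combining such cues into an interaction decision can be illustrated with a minimal sketch. Everything below is hypothetical: the feature names, thresholds, and the simple rule standing in for the learned classifier are assumptions, not ATAP’s implementation.

```python
from dataclasses import dataclass

@dataclass
class RadarFrame:
    """Hypothetical per-frame features derived from the radar signal."""
    distance_m: float       # how far the person is from the device
    approach_speed: float   # m/s toward the device (negative = moving away)
    head_angle_deg: float   # 0 means the head faces the device directly

def wants_to_interact(frame: RadarFrame) -> bool:
    # Heuristic stand-in for the learned model: someone who is close,
    # approaching, and facing the device probably intends to use it.
    facing = abs(frame.head_angle_deg) < 30.0
    approaching = frame.approach_speed > 0.1
    nearby = frame.distance_m < 1.5
    return facing and (approaching or nearby)
```

A person walking up while looking at the screen would trigger the device, while someone passing by with their head turned away would not, which matches the Nest Hub Max scenario described above.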
The above video demonstrates what this new technology is capable of. For instance, the company showed a smart display installed on a wall that shows the temperature and weather forecast in a small font when no one is around. As someone approaches, the forecast fills the entire screen, showing, for example, whether rain is likely.
Another example shows how you may answer a call just by walking up to the device. Similarly, a Soli-equipped device can automatically pause a video when you step away and resume it when you return.
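That pause/resume behavior amounts to a small state machine driven by presence. Here is a minimal sketch of the idea, assuming a presence signal derived from the radar distance reading; the threshold and class names are invented for illustration.

```python
PRESENCE_THRESHOLD_M = 2.0  # assumed distance beyond which the viewer has "stepped away"

class MediaSession:
    """Toy playback controller that reacts to viewer presence."""

    def __init__(self) -> None:
        self.playing = False

    def update(self, viewer_distance_m: float) -> None:
        present = viewer_distance_m < PRESENCE_THRESHOLD_M
        if present and not self.playing:
            self.playing = True   # viewer returned: resume playback
        elif not present and self.playing:
            self.playing = False  # viewer left: pause playback
```

Feeding the session a stream of distance readings pauses playback the moment the viewer crosses the threshold and resumes it when they come back, mirroring the demo in the video.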
Google ATAP Soli Project: Availability
There’s no word yet on when this technology will be available commercially. After all, it’s still in the research phase. Google’s current Soli-powered products are the Nest Hub Max and the Pixel 4 and 4 XL smartphones. The tech giant removed the sensor from the Pixel 4’s successors to keep costs down. However, we hope that Google will reintroduce it in the near future with improved capabilities to ease our living.
- Meanwhile, check out our review of the Google Pixel 6.