
This is what the Google Pixel 4's Soli radar "sees"

Pixel 4, face unlock

Google announced the Pixel 4 in October of last year, placing special emphasis on one of the phone's technologies that had taken the company longest to develop: Soli. Using radar combined with a series of other sensors, Google equipped the device with a system able to identify and interpret gestures made around the phone and act accordingly.

When we reviewed the Google Pixel 4 XL, we concluded that, despite its potential, the Motion Sense system based on Soli technology still had a long way to go, especially in terms of functionality. Even so, it quickly became one of the favorite features of many Pixel 4 owners, mainly because this same technology powers part of the excellent face unlock system built into the Pixel 4 series.

Now, several months after the phone was introduced, and with many already awaiting the arrival of the new Pixel 4a, Google has explained in more depth how Soli technology works in a post on its official AI blog.

Google explains what's behind the Pixel 4's Motion Sense system

Pixel 4 XL, front

In the post, Google explains how the Pixel 4 can detect when the user reaches out to pick up the phone, turning on the screen automatically, or when a hand moves from side to side over the screen to skip to the next song.
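To make that behavior concrete, here is a minimal sketch, in Python, of how recognized Motion Sense events could be mapped to the actions just described. It is purely illustrative, not Pixel code; the event names and actions are hypothetical.

```python
# A purely illustrative sketch (not Pixel code) of dispatching
# recognized Motion Sense events to the actions described above.
# Event names and action strings are hypothetical.
def on_soli_event(event: str) -> str:
    actions = {
        "reach": "wake screen",          # hand approaching the phone
        "swipe_left": "next track",      # hand passing over the screen
        "swipe_right": "previous track",
    }
    return actions.get(event, "ignore")

print(on_soli_event("reach"))       # wake screen
print(on_soli_event("swipe_left"))  # next track
```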

Through an animated image, available below these lines, you can see how the Soli radar system perceives a movement as it is made. Logically, the image is of very low quality and resolution, which, on the other hand, helps ensure the system poses no privacy risk. However, Google explains that no more detail is needed to detect a movement and interpret it through models generated via machine learning.

Google explains that the farther the subject making the movement is from the radar, the weaker the signal Soli receives. In the images, signal intensity is represented by greater or lesser brightness. As explained, of the three shared images, the one on the left shows what the radar "would see" when a person walks near the phone, the second an arm reaching out to pick up the device, and the third the side-to-side gesture used to change songs.
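That fading with distance matches what the standard radar equation predicts: received power falls off with the fourth power of range. Here is a back-of-the-envelope Python sketch mapping that power to brightness; the numbers are illustrative only, since Soli's actual parameters have not been published.

```python
# A back-of-the-envelope sketch of why distant targets look dimmer.
# The standard radar equation says received power scales with 1/R^4,
# so mapping power to pixel brightness reproduces the "closer = brighter"
# effect. Numbers are illustrative; Soli's real parameters aren't public.

def relative_brightness(distance_m: float, reference_m: float = 0.1) -> float:
    """Received power relative to a target at reference_m (P ~ 1/R^4)."""
    return (reference_m / distance_m) ** 4

for d in (0.1, 0.2, 0.5, 1.0):  # a hand at 10 cm vs. a person at 1 m
    print(f"{d:4.1f} m -> {relative_brightness(d):.4f} of reference brightness")
```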

Another curiosity of this system is the process Google used to train the technology. An artificial intelligence model was built with the TensorFlow framework and trained on millions of gestures recorded from thousands of participants. Apparently, thanks to this machine learning process, Soli is even capable of "filtering out" interference generated by the device itself, such as the vibrations or movements produced by the phone's speaker when playing music.
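Google confirms TensorFlow was the framework, but has not published the model itself. As a rough illustration of the idea, here is a minimal, hypothetical tf.keras sketch that classifies a short sequence of low-resolution radar intensity frames into a handful of gestures; the input shape, layer sizes, and label set are all assumptions.

```python
# A minimal, hypothetical tf.keras sketch of a radar gesture classifier.
# This is NOT Google's model: the input shape (12 frames of 32x32
# intensity maps), layer sizes, and gesture label set are assumptions.
import numpy as np
import tensorflow as tf

GESTURES = ["presence", "reach", "swipe"]  # hypothetical label set

model = tf.keras.Sequential([
    tf.keras.Input(shape=(12, 32, 32, 1)),                 # frame sequence
    tf.keras.layers.TimeDistributed(
        tf.keras.layers.Conv2D(8, 3, activation="relu")),  # per-frame features
    tf.keras.layers.TimeDistributed(tf.keras.layers.MaxPooling2D()),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Flatten()),
    tf.keras.layers.LSTM(32),                              # motion over time
    tf.keras.layers.Dense(len(GESTURES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-ins for the millions of recorded gestures the post describes.
x = np.random.rand(64, 12, 32, 32, 1).astype("float32")
y = np.random.randint(0, len(GESTURES), size=(64,))
model.fit(x, y, epochs=1, verbose=0)
print(GESTURES[int(np.argmax(model.predict(x[:1], verbose=0)))])
```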

Regardless of how useful this technology proves to be today, it is certainly fascinating that, after years of development, Google has managed to miniaturize Soli and bring it to life, ultimately creating a remarkable gesture control system.
