With the advancement of wearable tech and other mobile devices, there is still no simple way to interact with their tiny screens. At I/O 2015, Google's Advanced Technology and Projects (ATAP) group unveiled Project Soli, a gesture-sensing technology that aims to change how we interact with everything from smartwatches and tablets to large-screen displays, without touching their displays at all.
The notion behind Soli is similar to that of Leap Motion and other gesture-based controllers. What sets Project Soli apart is that it uses a radar-based sensor to detect natural hand and finger motions. The radar can track sub-millimeter motions at high speed, with a refresh rate of 10,000 frames per second, and with an accuracy not found in other gesture controllers, which rely on cameras for motion tracking and are therefore limited in effectiveness and precision. Further, Soli leans on natural haptic feedback: for instance, rubbing two fingertips together produces both a motion the radar can sense and a tactile sensation for the user.
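To put that refresh rate in perspective, here is a quick back-of-the-envelope calculation (my own, not from Google): even a fast fingertip flick moving at 1 m/s travels only a tenth of a millimeter between consecutive radar frames, which is why sub-millimeter tracking is feasible at this rate.

```python
# Back-of-the-envelope: how far does a fingertip move between two
# consecutive radar frames at Soli's quoted 10,000 fps refresh rate?
frame_rate_hz = 10_000
finger_speed_m_per_s = 1.0  # assumption: a fast fingertip flick

mm_moved_per_frame = finger_speed_m_per_s * 1000 / frame_rate_hz
print(mm_moved_per_frame)  # 0.1 mm between consecutive frames
```

So at the quoted frame rate, even fast gestures advance by well under a millimeter per frame, leaving plenty of temporal resolution for fine motion tracking.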
When it comes to implementation, the sensor can act as a virtual slider or volume knob, or be used to zoom in and out of images. It could also be used in meetings to switch from one slide to the next with a swipe. According to Project Soli founder Ivan Poupyrev, within 10 months ATAP, working with Infineon Technologies, has managed to miniaturize the system into a tiny chip smaller than a quarter, which can be embedded into various objects, even the smallest wearable tech such as smartwatches. Google's demo video offers a glimpse of how it works.
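Since the Soli API had not been released at the time of writing, the sketch below is purely hypothetical: it shows one way a "virtual volume knob" driven by per-frame radar readings might work. The names `SoliFrame`, `VirtualVolumeKnob`, and `on_frame` are my own inventions, not part of any real Soli SDK, and the displacement values are made up for illustration.

```python
# Hypothetical sketch of a radar-driven virtual volume knob.
# Assumption: none of these names exist in any real Soli SDK; the API
# was unreleased at the time of writing, so this is purely illustrative.
from dataclasses import dataclass


@dataclass
class SoliFrame:
    """One radar frame: estimated fingertip displacement in millimeters."""
    displacement_mm: float  # positive = slide one way, negative = the other


class VirtualVolumeKnob:
    """Integrates tiny per-frame finger motions into a 0-100 volume level."""

    MM_PER_VOLUME_STEP = 0.5  # assumed sensitivity: 0.5 mm of slide = 1 step

    def __init__(self, volume: int = 50) -> None:
        self.volume = volume
        self._accumulated_mm = 0.0

    def on_frame(self, frame: SoliFrame) -> int:
        # At 10,000 fps each frame contributes only a tiny displacement,
        # so we accumulate motion until it crosses a full volume step.
        self._accumulated_mm += frame.displacement_mm
        steps = int(self._accumulated_mm / self.MM_PER_VOLUME_STEP)
        if steps:
            self._accumulated_mm -= steps * self.MM_PER_VOLUME_STEP
            self.volume = max(0, min(100, self.volume + steps))
        return self.volume


# Simulate 1,000 frames of a slow finger slide (1/64 mm per frame,
# i.e. 15.625 mm total): the knob rises from 50 to 81.
knob = VirtualVolumeKnob(volume=50)
for _ in range(1000):
    knob.on_frame(SoliFrame(displacement_mm=0.015625))
print(knob.volume)  # 81
```

Accumulating sub-step motion rather than reacting to every frame keeps the control smooth: thousands of tiny radar readings per second are folded into a handful of discrete volume changes.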
Later this year, Google's ATAP group plans to release an API so developers can build applications on top of Soli. Those applications will determine the future of Project Soli and could change the way users interact with their devices and other appliances.