Researchers at Carnegie Mellon University recently demoed a concept device called LumiWatch, which uses lasers to project a smartwatch's UI onto the wearer's arm, providing ample space to read information and interact with the projections. Now, a team of researchers at the Georgia Institute of Technology has developed a system that takes the idea of button-less navigation gestures to a whole new level.

Called FingerPing, the technology employs a smart band and a ring to execute different commands and control devices through basic hand gestures, without even touching the device in question.

The setup relies on 'acoustic chirps': sound signals emitted by the smart ring that travel through the hand and are shaped by the positions of the finger bones. The chirps are then picked up by the smart band and processed to recognize the corresponding gesture. Although the technology is still in the development phase, the setup can already recognize 22 distinct gestures, which can be mapped to commands such as playing/pausing music, interacting with a T9 keyboard, etc.
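The pipeline described above (chirp out, pose-altered signal in, gesture classified) can be illustrated with a toy sketch. Everything here is an assumption for illustration only: the feature count, the gesture names, the synthetic "signatures", and the nearest-centroid matcher stand in for whatever signal-processing chain the team actually uses.

```python
import math
import random

random.seed(0)

NUM_FEATURES = 16  # hypothetical: energy in 16 frequency bins of the received chirp

# Synthetic per-gesture frequency-response signatures (stand-ins for
# real calibration data captured by the band's receivers).
GESTURES = ["thumb_to_index", "thumb_to_middle", "fist", "open_palm"]
signatures = {g: [random.random() for _ in range(NUM_FEATURES)] for g in GESTURES}

def classify(reading, signatures):
    """Return the gesture whose stored signature is closest
    (Euclidean distance) to the incoming reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(signatures, key=lambda g: dist(reading, signatures[g]))

# Simulate a noisy measurement of one known gesture and classify it.
true_gesture = "fist"
reading = [v + random.gauss(0, 0.05) for v in signatures[true_gesture]]
print(classify(reading, signatures))
```

The idea being sketched is that each hand pose changes how sound propagates through the hand, so each pose yields a repeatable signature at the receivers; a classifier then only needs to match a noisy reading against stored examples.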

FingerPing has produced encouraging results so far, identifying with high accuracy the different poses and the numbers one through ten signed in American Sign Language (ASL). "A wearable is always on you, so you should have the ability to interact through that wearable at any time in an appropriate and discreet fashion. When we're talking, I can still make some quick reply that doesn't interrupt our interaction," said Cheng Zhang, the research scholar who helmed the project, titled 'FingerPing: Recognizing Fine-grained Hand Poses Using Active Acoustic On-body Sensing'.

Zhang added that his goal is to integrate the technology into wearable devices. FingerPing is quite a promising technology: the ability to navigate your smartphone's UI or trigger different tasks remotely with finger gestures would be remarkably convenient, and of course, insanely cool at the same time.