The uSens virtual reality company is turning its sights towards the mobile market. The company announced the uSens AR Hand Tracking SDK, which enables developers to leverage its AI-based positional and skeletal tracking technology to create gesture-based interactions for mobile AR applications.
We first encountered uSens in late 2016, when we had a chance to test-drive the company’s Fingo hand tracking peripheral, which gave us the impression that uSens was a hardware company. However, last year at GDC, we learned that uSens, like Leap Motion, sees itself as a software company first and foremost. The company’s founders told us that “hardware is the easy part.” They were focused on perfecting the hard part: software.
uSens’ technology uses computer vision and deep learning to capture hand motion with full skeletal tracking. The company’s algorithm can keep track of up to 22 joints per hand, and it’s capable of tracking multiple hands at once.
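To make the skeletal-tracking idea concrete, here is a minimal sketch of how an application might consume per-hand joint data and derive a simple gesture from it. The joint indices, `Hand` class, and pinch threshold are illustrative assumptions for this article, not uSens' actual SDK API.

```python
from dataclasses import dataclass
import math

# Hypothetical joint layout: these names and indices are illustrative,
# not uSens' API. A skeletal tracker reports 22 joints per hand.
THUMB_TIP, INDEX_TIP = 4, 8  # assumed indices into a 22-joint array

@dataclass
class Hand:
    # 22 (x, y, z) joint positions in meters, one entry per tracked joint
    joints: list

def is_pinching(hand, threshold_m=0.02):
    """Flag a pinch when thumb and index fingertips are within ~2 cm."""
    return math.dist(hand.joints[THUMB_TIP], hand.joints[INDEX_TIP]) < threshold_m

# Two tracked hands, as a multi-hand tracker might report them
open_hand = Hand(joints=[(0.0, 0.0, 0.0)] * 22)
open_hand.joints[THUMB_TIP] = (0.00, 0.00, 0.30)
open_hand.joints[INDEX_TIP] = (0.08, 0.02, 0.30)  # fingertips far apart

pinched = Hand(joints=[(0.0, 0.0, 0.0)] * 22)
pinched.joints[THUMB_TIP] = (0.00, 0.00, 0.30)
pinched.joints[INDEX_TIP] = (0.01, 0.00, 0.30)  # fingertips ~1 cm apart

print(is_pinching(open_hand))  # False
print(is_pinching(pinched))    # True
```

The point of full skeletal data is exactly this: with fingertip positions rather than a single hand centroid, gestures like pinch, grab, or point become simple geometric tests on joint coordinates.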
Our past encounters with uSens involved tracking technology for VR hardware, but this year uSens shifted its sights towards AR-capable mobile devices. Anli He, co-founder and CEO of uSens, explained that the VR headset market took off more slowly than the company had hoped, but uSens sees a lot of potential in smartphone-based AR experiences.
“This opens a whole new world of possibilities for developers, enabling them to create a truly one-of-a-kind experience for a mainstream audience. Similar to how touchscreens enabled even the most technologically challenged to embrace smartphones, providing an easy and natural way for users to engage with AR/VR objects and environments will play a major role in boosting consumer adoption,” He said in a press release.
uSens’ newfound appreciation for smartphone AR is a sharp contrast to its position just a year ago. When we spoke to uSens CTO Yue Fei at GDC in 2017, he told us that smartphone cameras wouldn’t be suitable for the company’s technology: they are optimized for still photography, with low frame rates and narrow fields of view, neither of which is conducive to tracking a full set of hand joints in AR applications.
Despite its earlier concerns about the performance of mobile devices, the company adapted its computer vision and deep learning algorithms to work with smartphones and tablets. You don’t even need a powerful flagship device to use uSens’ technology, giving developers a wide potential audience for applications built with uSens hand tracking. The company said it hadn’t found a smartphone that can’t handle its hand-tracking software, including budget $100 devices.
uSens is already running a small private beta of its AR Hand Tracking SDK, and the company is now ready to bring on more testers to perfect the software before releasing the SDK publicly. Interested parties can contact uSens for access to the beta. uSens is also offering demonstrations of its hand-tracking technology at the Augmented World Expo this week.