Today saw the release of Leap Motion, the 3D gestural navigation controller for PCs and Macs. Like its cousin the Xbox Kinect, Leap Motion uses sensors to track physical gestures. Where Kinect tracks your entire body, Leap Motion tracks fine movements of the arms, hands, and fingers, letting users input information through touch-free 3D gestural navigation.
Leap Motion can be used to navigate operating systems (Windows, Mac), to cruise through Google Earth, to draw a digital picture, to generate experimental music, or to dissect a virtual frog, as seen in the AirSpace Leap Motion app store. In the future, surgeons could perform surgeries and airline pilots could control their planes with this technology, according to the vendor.
The success or failure of Leap Motion will derive from the strength of the app ecosystem that grows up around it:
As with touch screens, ground-up applications work best: “touch-first” applications – those reimagined from the ground up with touch as the primary navigational method – generally appeal to users more than “touch-second” experiences, where touch was bolted onto an existing application. Similarly, gesture-controlled experiences need to be rethought from the ground up, and the same is true for voice-controlled apps. Developers will need to change the way they work in the coming years, collaborating with designers and experts in human anatomy, for all of this to work. Until that happens, the technology will remain marginal.