It’s not often that a new product release has the potential to reshape the way people work and play. The PC, the browser, the smartphone – all of these products fell into that category.
Microsoft’s new HoloLens has the potential to do the same. (Check out some photos from Gizmodo here, and this video; neither comes close to doing the actual experience justice.)
Yes, that’s a big claim. But I’m here to challenge your thinking with this assertion: Over the next few years, HoloLens will set the bar for a new type of computing experience, one that suffuses how we work, how we shop, how we learn, and how we experience media. And other vendors will have to respond to this innovation in holographic, mixed-reality computing.
Today saw the release of Leap Motion, the 3D gestural navigation controller for PCs and Macs. Like its cousin the Xbox Kinect, Leap Motion uses sensors to track physical gestures. Where Kinect tracks your entire body, Leap Motion tracks fine movements of the arms, hands, and fingers, enabling touch-free 3D gestural input and navigation.
Leap Motion can be used to navigate operating systems (Windows, Mac), to cruise through Google Earth, to draw a digital picture, to generate experimental music, or to dissect a virtual frog, as seen in the AirSpace Leap Motion app store. In the future, surgeons could perform surgeries and airline pilots could control their planes with this solution, according to the vendor.
The success or failure of Leap Motion will derive from the strength of the app ecosystem that grows up around it:
As with touch screens, ground-up applications work best... “Touch-first” applications – those reimagined from the ground up with touch as the primary navigational method – generally appeal to users more than “touch-second” experiences, where touch was added to an existing application. Similarly, gesture-controlled experiences need to be rethought from the ground up. The same is true for voice-controlled apps. Developers will need to change the way they work in the coming years, collaborating with designers and experts in human anatomy, for all of this to work. Until that happens, the technology will remain marginal.