Wearable computing devices (like Google Glass, Jawbone Up, Nike+ FuelBand, iHealth, and Samsung Galaxy Gear, among others) have made a big splash in the consumer market. My colleague Sarah Rotman Epps’ analysis shows that Google Glass could be the next big App Platform. Fitness wearables might be a bit overhyped, but it’s nevertheless becoming common to see people sporting Nike+ FuelBand devices everywhere you go. No less a tech industry luminary than Mary Meeker recently declared wearables the next wave of computing (see slide 49).
When people think of futuristic user interfaces (Forrester analysts included), they often invoke the 2002 Tom Cruise movie Minority Report. The imagery in the movie offers a compelling vision of how next-generation technologies – gestural control, voice command, 3D visuals, multi-screen interactions – can empower computing experiences.
Where did Minority Report get this vision? From a man named John Underkoffler, Chief Scientist at a company called Oblong. He designed the computer interfaces in the film.
I had the pleasure of visiting Oblong’s Boston office recently, where I saw demonstrations of several technologies. Most interesting to me was the company’s Mezzanine offering, an “infopresence” conference room that the company sells to enterprises today.
The solution involves equipping a conference room (or multiple rooms – it also works across long-distance telepresence locations) with a number of monitors (five in the room I visited), teleconferencing equipment (industry-standard products work well), ceiling-mounted sensors (for interpreting gestural controls), and a whiteboard (a physical one, visible to a camera). Workers control the room with a wand that works via both gestural controls and a button.
Putting all of these things together, workers can collaborate both within the room itself and with remote teams (or remote individual team members). The resulting experience, in my view, offers two sets of benefits:
Today saw the release of Leap Motion, the 3D gestural navigation controller for PCs and Macs. Like its cousin the Xbox Kinect, Leap Motion uses sensors to track physical gestures. Where Kinect tracks your entire body, Leap Motion tracks fine movements of the arms, hands, and fingers, allowing users to input information through touch-free 3D gestural control.
Leap Motion can be used to navigate operating systems (Windows, Mac), to cruise through Google Earth, to draw a digital picture, to generate experimental music, or to dissect a virtual frog, as seen in the AirSpace Leap Motion app store. In the future, surgeons could perform surgeries and airline pilots could control their planes with this solution, according to the vendor.
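To make the idea of "tracking fine movements of the hands" concrete, here is a minimal sketch of how an application might classify a swipe gesture from a stream of tracked palm positions. The frame data and thresholds here are hypothetical stand-ins, not the actual Leap Motion SDK; a real integration would read per-frame hand data from the vendor's API instead.

```python
def detect_swipe(palm_x_positions, min_distance=80.0):
    """Classify a horizontal swipe from a sequence of palm x-coordinates.

    palm_x_positions: x-coordinates (in mm, a hypothetical unit choice)
    sampled across successive tracking frames for one hand.
    Returns 'swipe-right', 'swipe-left', or None if the hand did not
    travel far enough to count as a deliberate gesture.
    """
    if len(palm_x_positions) < 2:
        return None
    # Net horizontal displacement over the window of frames.
    displacement = palm_x_positions[-1] - palm_x_positions[0]
    if displacement > min_distance:
        return "swipe-right"
    if displacement < -min_distance:
        return "swipe-left"
    return None
```

The design point this illustrates is the one gesture developers keep rediscovering: raw sensor frames are easy to get, but turning them into reliable, intentional commands (and ignoring incidental hand movement) is where the application work lies.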
The success or failure of Leap Motion will derive from the strength of the app ecosystem that grows up around it:
As with touch screens, ground-up applications work best... “Touch-first” applications – those reimagined from the ground up with touch as the primary navigational method – generally appeal to users more than “touch-second” experiences, where touch was bolted onto an existing application. Similarly, gesture-controlled experiences need to be rethought from the ground up. The same is true for voice-controlled apps. Developers will need to change the way they work in coming years, collaborating with designers and experts in human anatomy, for all of this to work. Until that happens, the technology will remain marginal.
Voice-controlled intelligent assistants offer a tantalizingly productive vision of end user computing. Using voice commands, users can extend the computing experience to not just mobile scenarios, but to hyper-mobile, on-the-go situations (such as while driving). With wearables like Google Glass, voice command promises even deeper integration into hyper-mobile experiences, as this video demonstrates. And voice controlled intelligent assistants can also enable next-generation collaboration tools like MindMeld.
In spite of this promise, there remains a lurking sense that voice control is more of a gimmick than a productivity enhancer. (As of the time I posted this blog, a Google search for Siri+gimmick yielded… “about 2,430,000 results”). To see where voice control really stands, we surveyed information workers in North America and Europe about their use of voice commands.
Information workers’ use of voice control today:
In reality, many information workers with smartphones are already using voice commands – at least occasionally. Our survey revealed that:
Today, Samsung places much greater strategic emphasis on its enterprise business, which is now a “top three priority” globally for the company. Symbolizing this new commitment to enterprise customers, on June 11th Samsung opened a new Executive Briefing Center (EBC) in its Ridgefield Park, NJ office. The EBC offers enterprise customers and Samsung’s many partners an opportunity to experience Samsung’s vertically-optimized enterprise offerings in context.
I attended the opening, which enjoyed executive-level support from the President and CEO of Samsung Electronics North America, Yangkyu (Y.K.) Kim; the President of Samsung Electronics America, Tim Baxter; and the Senior Vice President of Samsung’s Enterprise Business, Tod Pike. I also spent an hour learning more about the Samsung value proposition for enterprise customers from Tod, including the excerpted Q&A below.
Samsung’s Enterprise Business Division focuses on a vertical strategy that includes Education, Healthcare, Retail, Financial Services, and Hospitality... and which isn’t just about devices, though their product offerings in hospitality TVs, notebook and tablet PCs, virtualization, wireless printers, and digital signage play a prominent role. Samsung also brings together enterprise-savvy partners like Crestron and Nuance Communications – along with numerous systems integrators and other channel partners – to deliver software, content, and services along with those devices.
I recently spoke with Tim Tuttle, the CEO of Expect Labs, a company that operates at the vanguard of two computing categories: Voice recognition (a field populated by established vendors like Nuance Communications, Apple, and Google) and what we can call the Intelligent Assistant space (which is probably most popularly demonstrated by IBM’s “Jeopardy”-winning Watson). In their own words, Expect Labs leverages “language understanding, speech analysis, and statistical search” technologies to create digital assistant solutions.
Expect Labs built the application MindMeld to make the conversations people have with one another “easier and more productive” by integrating voice recognition with an intelligent assistant in an intuitive tablet application. They have coined the term “Anticipatory Computing Engine” to describe their solution, which offers users a new kind of collaboration environment. (Expect Labs aims to provide an entire platform for this type of computing).
“Hello, I’m J. P. Gownder, and I serve Infrastructure and Operations professionals!” That’s my new greeting to Forrester’s clients. (I borrowed – aka “stole” – this opening line from my excellent colleague, Laura Ramos, who recently rejoined the Forrester analyst ranks herself).
After eight years in a variety of roles at Forrester, I’ve joined the Infrastructure and Operations (I&O) team as a Vice President and Principal Analyst. I’ll be collaborating with analyst colleagues (please see below) on I&O’s forthcoming Workforce Enablement Playbook. I&O pros face the constant challenge of empowering their companies’ workers with devices and services to make them successful in their jobs… as well as navigating the growing challenge of employees who choose to bring their own technology to work instead.
More specifically, I’ll be researching at least five issues pertinent to I&O pros:
Regardless of what our minds conjure up when we think of airline travel, one thing we can readily observe is that while the weather, the experience of the flight crew, the mechanical condition of the aircraft, and the destination of the flight are all variables, the system of getting an aircraft from one place to another, in one piece, is extraordinarily reliable. Herb Kelleher of Southwest Airlines once joked that the airline business is the only place where the capital assets travel at 500 miles per hour.
Every commercial flight starts with a flight plan, a flight crew, an aircraft, and a destination. The dispatcher creates the plan based on the expected conditions for the flight, the limitations of the pilot and passengers, and the capabilities of the aircraft. Time is built into the plan to climb to cruise altitude and to descend again to reach the destination safely. How much fuel will be required is built into the plan and pumped into the tanks. Every activity is done to achieve a singular purpose: getting the aircraft and its passengers safely to the destination, and everyone involved knows where the destination is. Aviation is a study in viable systems design.
How strange it seems, then, that thousands of IT projects begin every day, but more than one-third of them crash en route. Why? I would argue that it's because there is seldom a clear destination in mind, a rational plan to get there, or a viable systems approach in place to execute the plan. Most of the time, the destination and the means to get there are only vague estimates, and the elements of the strategy are rooted in hope.
Chances are that you have employees using Apple Macs at your firm today, and they’re doing this without the support and guidance of the infrastructure and operations (I&O) organization. IT consumerization has put an end to the days of one operating system (OS) to support. For I&O pros, this change carries new concerns about security, potential information loss, and unexpected support needs, to name a few. Forrester has found that IT organizations struggle in building a support and management strategy for Macs that works.
Fortunately, many firms have blazed the trail and figured out how to support both employee-owned and company-owned Macs, and we've assembled our findings in the latest document on managing Macs. Hint: leave the Windows PC management tools and techniques in the toolbox. It’s easy to understand why I&O professionals sometimes apply the familiar tools and techniques of the Windows world to managing Macs, but the reality is that they are different animals: what is a best practice for one is irrelevant for the other, and can even cripple worker productivity.
End User Computing is at the Root of the VMware Family Tree
Examine the roots of the VMware family tree, and End User Computing is the longest root of 'em all. It's where it all began, back in 1999 with a cool little product that let me run Windows on top of Linux. It was like magic for software customer demos of complex enterprise apps. I could royally screw up a demo environment an hour before a demo for a $15M deal by adding just one field to the screen that the customer demanded to see, but instead of soiling my underwear in a panic, I could go back to my most recently saved state of less than an hour before. Brilliant! It was a tool for me to be more effective in my job. Hold that thought.
So with this heritage in mind – and a general respect for VMware's products, honed over the past 15 years of growth and change, including its fantastic tools that help I&O professionals manage virtualized environments – I was delighted to see End User Computing as the focus of general session demos and breakout sessions. I was looking forward to learning more about Wanova Mirage to see if it could help on the employee freedom and personal innovation front. Those of you following this space know what I think of what I like to call Soviet Bloc Virtual Desktop Infrastructures.
Virtuosity as the Root of Innovation and the Dangers of Hosted VDI