AR and VR technologies aren't new. Virtual reality first experienced a boom of interest in the early 1990s, spurred by the 1991 book Virtual Reality by Howard Rheingold. In 1995, Angelina Jolie starred in the movie Hackers, which introduced mass audiences to head-mounted VR display technology. But the early promise of the technology fell apart due to underperforming graphics, attention-jarring lag times, outlandish hardware requirements, and the lack of an application ecosystem. No VR market emerged (outside of niche categories like military usage) until Facebook acquired the Kickstarter-funded startup Oculus for $2 billion in March 2014.
You've probably heard about the Quantified Self (QS), a movement that aims to capture, analyze, and act upon data from the human body in the interest of better health, fitter athletes, and sharper minds. Today, QS is giving way to QW -- the Quantified Workforce. A variety of technologies -- devices, software, services -- can quantify the health, fitness, mental acuity, timeliness, and collaboration of workers. Many of these services are ready for prime time, but they present some implementation challenges. These challenges aren't primarily technological; they're related to privacy, workers' rights, and human resources policies. Done right, though, quantifying the workforce can drive both top- and bottom-line growth in your company's business.
I've analyzed this trend in a new report, Smart Body, Smarter Workforce. Here are just a couple of examples of how quantifying the workforce can drive better business outcomes:
Lower the company's insurance rates. In January 2014, Forrester predicted that insurance companies would offer lower rates to individuals who donned wearables -- and we are now seeing that response. In April 2015, John Hancock announced an opportunity for buyers of its term and life insurance policies to earn a discount of up to 15% on their insurance rates by wearing a Fitbit, sharing the data with the company, and meeting certain activity levels.
It’s not often that a new product release has the potential to reshape the way people work and play. The PC, the browser, the smartphone – all of these products fell into that category.
Microsoft’s new HoloLens has the potential to do the same. (Check out some photos from Gizmodo here -- though they don't come close to capturing the actual experience -- and this video, which doesn't do it justice either.)
Yes, that’s a big claim. But I’m here to challenge your thinking with this assertion: Over the next few years, HoloLens will set the bar for a new type of computing experience that suffuses our jobs, our shopping experiences, our methods for learning, and how we experience media, among other life vectors. And other vendors will have to respond to this innovation in holographic, mixed reality computing.
This weekend, I’ll be heading off to Las Vegas for the 2015 Consumer Electronics Show (CES). Infrastructure & Operations leaders should – and do – keep tabs on the news coming out of CES. In this era of consumerization, bring-your-own (BYO) technology, and Shadow IT, CES announcements affect the I&O role more than ever before. I have three tips for how to think about CES 2015:
Look at consumer technologies through a workforce lens. So many smart, connected products quickly migrate to the workforce. Sometimes these technologies enter via BYO and segue into company-owned, as tablets have done over the past few years. In other cases, vendors that target consumers immediately see the value their products can bring to workforce scenarios. For example, I recently spoke with Jonathan Palley, CEO of Spire, which makes a wearable device that tracks not just activity but also state of mind (tension versus calm, focus versus distraction, and related states). While the product launched to the consumer market just about a week ago, Jonathan made clear that “workforce is a huge part of our strategy as well.” Imagine helping workers remain in a more productive, less stressed state of mind via wearables.
We're living in a time when smart, connected devices -- tablets, smartphones, wearable devices, Internet of Things (IoT) devices, and the like -- are being woven into the Business Technology (BT) Agenda of most companies. Nowhere is this trend more intimately applied to the customer experience than in healthcare, where devices near our bodies, on our bodies, or even inside our bodies are changing the way doctors, insurers, and other healthcare players think about patient care.
In a major new report, Four Ways Connected Devices Improve Patient Care, we've researched how mobile, cloud, and connected devices come together to reshape the patient care experience. Technology innovations on the device and services side are creating new treatment options. And systemic changes to the healthcare system are creating both challenges and opportunities, which these emerging technologies can help address. For instance:
Busy doctors spend too much time on electronic health record (EHR) data entry. And when they use a traditional PC in the room with a patient, it's not always a great experience; one doctor told us he felt his "back was to the patient" too often. The solution? Moving to a Surface Pro 3 tablet, armed with better software, which allows the clinician to face the patient directly while still saving time -- and gaining accuracy -- on EHR data entry.
In 2015, wearables will hit the mass market: With Apple’s much-anticipated Apple Watch slated for release early next year, the already hype-heavy conversation will reach new heights. My colleague Anjali Lai wrote a report analyzing the true addressable market of the Apple Watch from a quantitative and qualitative data perspective -- covered right here on the Data Digest -- to interject some strong data-driven analysis into the conversation.
My colleagues Sophia Vargas, Michael Yamnitsky, and I have just published a new Quick Take report, "HP Announces Innovative Tools That Will Bridge Physical And Digital Worlds." Sophia and Michael have written about 3D printing for CIOs previously, and all three of us are interested in how computing and printing technologies can inform the BT Agenda of technology managers.
Fresh off of the announcement that HP will split into two publicly traded companies, one of those new entities -- HP Inc, the personal computing and printing business -- announced its vision for the future with two new products that help users cross the divide between physical and digital. The Multi-Jet Fusion 3D printer represents HP's long-awaited entry into 3D printing, with disruptively improved speed and quality compared to existing market entries. The Sprout desktop PC combines a 3D scanner with a touchscreen monitor, a touchscreen display mat, and specialized software that allows users to scan real objects, then manipulate them easily in digital format.
In both cases, a video demonstration helps you to really grok what the product is about.
CNET posted a video tour of the Multi-Jet Fusion 3D printer on YouTube:
Today Salesforce.com offered a formal update on its Salesforce Wear offering (which I wrote about at its release here). Salesforce Wear is a set of developer tools and reference applications that allows enterprises to create applications for an array of wearable devices and link them to Salesforce1, a cloud-based platform that connects customers with apps and devices.
Salesforce’s entry into the wearables space has been both bold and well-timed. Salesforce Wear constitutes a first mover in the wearables platform space; while Android Wear offers a platform, it only reaches Android Wear-based devices -- unlike Salesforce Wear, which operates across a wide array of wearable devices. While it’s early to market, it’s not too early: Enterprises in a wide array of verticals are leveraging wearables worn by employees or by customers to redesign their processes and customer experiences, as I have written.
Too many wearables today have screens that look like miniaturized smartphones.
Just as smartphone interfaces shouldn’t be PC screens shrunk down to 4-5”, smartwatches shouldn’t look like smartphones shrunk to 1”. Nor is it a matter of responsive web design (RWD), which resizes web content to fit the screen.
Samsung's Gear 2 looks like a tiny smartphone screen.
Instead, it’s a different type of design philosophy – one with DNA in the mobile revolution, and then extending mobile thinking even further.
Let’s start with the concept of mobile moments. As my colleagues write in The Mobile Mind Shift, mobile moments are those points in time and space when someone pulls out a mobile device to get what he or she wants immediately, in context. In the case of wearables, the wearer often won’t need to pull out a device -- it’s affixed to her wrist, clothing, or eyeglasses. But she might need to lift her wrist, as a visitor to Disney World must do with a MagicBand.
Now we’re getting closer to what wearables should be. But there are additional dimensions to wearables that obviate the need for pixel-dense screens:
Wearables are opening up exciting new scenarios for consumers and enterprise users alike, but the wider conversation on wearables has taken a privacy-oriented turn. The New York Times and WIRED, among others, have covered the emerging privacy concerns associated with wearable devices.
Particular ire has developed against Google Glass. An online activist group, Stop the Cyborgs, opposes Google Glass and related wearables, which the organization says will "normalize ubiquitous surveillance." Stop the Cyborgs offers downloads of anti-Glass graphics for posting in public places and online to spread the message that wearables are inherent privacy violators.
In a major new Forrester report, we present data and insights to help Infrastructure & Operations professionals who are piloting or planning to trial wearables navigate the privacy waters. As a teaser, here are some of our findings: