I am just back from the first ever Cognitive Computing Forum, organized by DATAVERSITY in San Jose, California. I am not new to artificial intelligence (AI); I was a software developer in the early days of AI, just out of university. Back then, if you worked in AI, you were called a SW Knowledge Engineer, and you used symbolic programming (LISP) and first-order logic programming (Prolog) or predicate calculus (MRS) to develop “intelligent” programs. Lots of research was done on knowledge representation and on tools to support knowledge engineers in developing applications that by nature required heuristic problem solving. Heuristics are necessary when problems are ill-defined, non-linear, and complex. Deciding which financial product you should buy based on your risk tolerance, the amount you are willing to invest, and your personal objectives is a typical problem we used to solve with AI.
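The advisory problem described above was typically encoded as explicit heuristic rules in a knowledge base. A minimal Python sketch of that rule-based style, in the spirit of 1980s expert systems (the rules, thresholds, and product names here are hypothetical, purely for illustration):

```python
# Toy rule-based "financial product advisor" (hypothetical rules and products).
# Each branch is a heuristic, much as a Prolog or LISP knowledge base would
# encode expert judgment rather than a computed optimum.

def recommend_product(risk_tolerance: str, amount: float, objective: str) -> str:
    """Apply simple heuristic rules to pick a financial product."""
    if risk_tolerance == "low":
        # Conservative investors: protect principal first.
        return "government bonds" if objective == "preserve capital" else "balanced fund"
    if risk_tolerance == "medium":
        # Moderate investors: diversify more as the amount grows.
        return "index fund" if amount < 50_000 else "diversified portfolio"
    # High risk tolerance: accept volatility in pursuit of growth.
    return "growth stocks" if objective == "long-term growth" else "sector fund"

print(recommend_product("low", 10_000, "preserve capital"))  # government bonds
print(recommend_product("medium", 75_000, "retirement"))     # diversified portfolio
```

The point is not the trivial logic but the style: the expertise lives in hand-authored rules, which is exactly why such systems were brittle outside their narrow domain.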
Fast forward 25 years, and AI is back with a new name: it is now called cognitive computing. An old friend of mine, who’s never left the field, says, “AI has never really gone away, but has undergone some major fundamental changes.” Perhaps it never really went away from labs, research, and very niche business areas. The change, however, is largely about the context: hardware and software scale constraints are gone, and there are tons of data and knowledge digitally available (ironically, AI missed big data 25 years ago!). But this is not what I want to focus on.
Unified information architecture, data governance, and standard enterprise BI platforms remain a journey down a long and winding road. Even if one deploys the "latest and greatest" BI tools and best practices, the organization may not be getting any closer to the light at the end of the tunnel because:
Technology-driven enterprise BI is scalable but not agile. For the last decade, top-down data governance, centralization of BI support on standardized infrastructure, scalability, robustness, support for mission-critical applications, minimized operational risk, and the drive toward an absolute single version of the truth — the good side of enterprise BI — were the strategies that allowed organizations to reap multiple business benefits. However, today's business outlook is much different, and one cannot pretend to put new wine into old wineskins. If these were the only best practices, why does Forrester research consistently find that homegrown or shadow BI applications by far outstrip applications created on enterprise BI platforms? Our research often uncovers that — here's where the bad part comes in — enterprise BI environments are complex, inflexible, and slow to react and are therefore largely ineffective in the age of the customer. More specifically, our clients report that their enterprise BI applications do not have all of the data they need, do not have the right data models to support the latest use cases, take too long to deliver, and are too complex to use. These are just some of the reasons Forrester's latest survey indicated that approximately 63% of business decision-makers use an equal amount or more of homegrown versus enterprise BI applications. An astonishingly minuscule 2% of business decision-makers reported using solely enterprise BI applications.
In our recent report, Closing The Experience Gaps, Ted Schadler and I discussed two key elements of meeting customers’ rising expectations: creating an architecture for cross-channel experience delivery and developing a philosophy and culture of business agility. Given that it builds on many of the concepts we outlined in Software Must Enhance Your Brand, I wanted to highlight the key aspects of the second element: developing a philosophy and culture of business agility.
Closing the experience gaps — performance, convenience, personalization, and trust — requires a different mindset. The shift in customer expectations, fueled by an increasing rate of technology change, means that firms need to act more like a cloud-based ISV, not a traditional IT shop. This requires an agile process and continuous development from small teams spanning business, design, and technology competencies. Part of this makeover includes improving technical and design competencies. Companies like GE and Wal-Mart have dramatically upskilled their technology teams.
At the core of this new mindset are five cultural, process, and skill imperatives:
Align business and technology executives. Successful customer experience transformation efforts at Delta Air Lines and The Home Depot have at their core an accommodation between the CEO, business executives, and the CIO.
Embrace an agile, sense-and-respond continuous delivery process. Great customer experiences today are table stakes tomorrow. To continuously improve experiences, companies must work differently, in small agile teams that span business, design, and technology — what we call IDEA teams.
In 2012, the number of smartphone subscribers worldwide passed the 1 billion mark, primarily due to adoption in North America and Europe. But the focus of the smartphone market is now shifting toward Asia Pacific, the Middle East and Africa (MEA), and Latin America. These three regions, which are home to 84% of the world’s population, will contribute a significant proportion of the next 2.5 billion subscribers, which Forrester believes will happen by 2019. According to our recently published Forrester Research World Mobile and Smartphone Adoption Forecast, 2014 to 2019 (Global), Asia Pacific is the fastest-growing region in terms of subscribers with a CAGR of 14%, followed by MEA, and Latin America. Some of the findings from the forecast:
Low-cost smartphones are turning feature phone subscribers into smartphone subscribers. Chinese companies such as iocean, JiaYu, Meizu, Xiaomi, and Zopo and Indian players like Karbon and Micromax are flooding the market with sub-$200 Android-based smartphones. Declining smartphone prices and shrinking feature phone product lines have contributed to a steep rise in smartphone subscriptions: More than 46% of mobile subscribers owned a smartphone in 2013, compared with 9% in 2009. By 2019, we expect that 85% of all mobile subscribers will have smartphones.
The focus is shifting to India. India is the fastest-growing market for smartphones; as such, it’s attracting most of the focus from vendors. Gionee, Huawei, Konka, Lenovo, Xiaomi, and ZTE have recently entered the market, and Google launched its Android One program in partnership with Indian companies to provide sub-$100 Android phones.
For those of us who write and think about the future of healthcare, the story of rapid and systemic change rocking the healthcare system is a recurrent theme. We usually point to the regulatory environment as the source of change. Laws like the Affordable Care Act and the HITECH Act are glaring disruptive forces, but what empowers these regulations to succeed? Perhaps the deepest cause of change affecting healthcare, and the most disruptive force, is the digitalization of our clinical records. As we continue the switch to electronic charts, the force of the vast data being collected becomes increasingly obvious. One-fifth of the world’s data is purported to be administrative and clinical medical records. Recording medical observations, lab results, diagnoses, and the orders that care professionals make in digital form is a game-changer.
Workflows are dramatically altered because caregivers spend so much of their time using the system to record clinical facts and must balance these record-keeping responsibilities with the more traditional bedside skills. They have access to more facts more easily than before, which allows them to make better judgments. The increasing ability of caregivers to see what their colleagues are doing, or have done, across institutional boundaries is allowing for better coordination of care. The use of clinical data for research into what works and what is efficient is becoming pervasive. This research is conducted by combining records from several institutions and having the quality committees of individual institutions look at the history of care within their institutions to enhance the ways in which they create the institutional standards of care. The data represents a vast resource of evidence that allows great innovation.
Bill Gates said, “People everywhere love Windows.” Whether or not you agree, the fact that Microsoft Windows remains the de facto standard for business productivity after nearly three decades suggests that many still do. But as the sales figures of Microsoft’s competitors suggest, people everywhere love lots of other things too. And one of the reasons they love them so much is that they like to get things done, and sometimes that means getting away from the office to a quiet place, or using a technology that isn’t constrained by corporate policies and controls, so they can be freer to experiment, grow their skills, and develop their ideas uninhibited.
Technology managers I speak with are aware of this, but they’re justifiably paranoid about security, costs, and complexity. These conflicting forces are inspiring rapid innovation across a mosaic of technologies that Forrester collectively calls digital workspace delivery systems. Many vendors are involved, including Microsoft, Citrix, VMware, Dell, nComputing, Amazon Web Services, Fujitsu, AppSense, Moka5, and more. The goal of our work is to help companies develop their capabilities for delivering satisfying Microsoft Windows desktop and application experiences to a wide range of users, devices, and locations.
Storage has been confined to hardware appliance form factors for far too long. Over the past two decades, innovation in the storage space has shifted from proprietary hardware controllers and processors to proprietary software running on commodity x86 hardware. The hardware driving backup appliances, NAS systems, iSCSI arrays, and object storage systems is often quite similar in terms of processors and components, yet despite this fact, I&O professionals are still used to purchasing single-purpose systems that lock customers into a technology stack.
Over the past few years, companies such as HP (StoreVirtual VSA), Nexenta, Sanbolic, and Maxta have released software-only storage offerings to compete head to head with proprietary hardware appliances, and they have found some success with cost-conscious enterprises and service providers. The software-only storage revolution is now ready for prime time, with startup offerings reaching maturity and established players such as IBM, EMC, and NetApp jumping into the market.
I&O professionals should consider software-only storage because:
The storage technology acquisition process is broken. Any storage purchase you complete today will be bound to your data center for the next three to five years. When business stakeholders and clients need storage resources for emerging use cases such as object storage and flash storage, they often do not have the luxury of waiting for storage teams to complete RFPs and product evaluations. With software-only storage, access to new technology can be accelerated to meet customers' provisioning velocity needs.
Day one of the first Cognitive Computing Forum in San Jose, hosted by DATAVERSITY, gave a great perspective on the state of cognitive computing: promising, but early. I am here this week with my research director, Leslie Owens, and analyst colleague Diego LoGudice. Gathering research for a series of reports in our cognitive engagement coverage, we debriefed tonight on what we heard and the questions these insights raise. Here are some key takeaways:
1) The big data mind shift of exploring and accepting failure is a heightened principle. Chris Welty, formerly at IBM and a key developer of Watson and its Jeopardy-winning solution, preached restraint: the analytic pursuit of perfect answers delivers no business value. Keep your eye on the prize and move the needle on what matters, even if your batting average is only .300 (30%). The objective is a holistic pursuit of optimization.
2) The algorithms aren't new; the platform capabilities and greater access to data are what allow us to put cognitive computing to production use. Every speaker, academic, vendor, and expert alike, agreed that the algorithms created decades ago are the same. Hardware and the volume of available data have made neural networks and other machine-learning algorithms both practical and more effective.
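To underline how old these algorithms are: the perceptron learning rule, the basic update behind a single neural-network unit, dates to Rosenblatt's work in the late 1950s. A minimal sketch in plain Python (no libraries), training it on the logical AND function:

```python
# Perceptron learning rule (Rosenblatt, 1958): adjust weights by the
# prediction error on each example. The algorithm is decades old; only
# the scale of data and hardware applied to it is new.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single perceptron on (inputs, target) pairs."""
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # -1, 0, or +1
            w[0] += lr * err * x1        # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron converges on it.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in AND]
print(preds)  # [0, 0, 0, 1]
```

The same update rule, stacked into layers and run over vastly more data on modern hardware, is essentially what powers today's deep-learning results.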
I come to Forrester after working in the Solution Marketing and Corporate Marketing groups at a large customer service software provider. That role put me in touch with contact center technology buyers and the overburdened folks responsible for actually making great customer service happen every day. I saw up close the impact of the age of the customer on the thinking, processes, behavior, and technology choices of contact center professionals around the world. They are facing a world in which consumers are much less willing to settle for mediocre and impersonal experiences when dealing with customer service organizations. As consumers, we all want effortless service delivered via whatever channel is most convenient at the moment, and we want companies to know just the right amount of information about us, but not too much, at the moment of the interaction.
That is a very tough nut to crack for contact center managers, supervisors, and agents. My research coverage will primarily focus on two areas that can help contact center pros begin to address these issues:
Telcos across the world — and especially mobile operators — are struggling with increasing network complexity and lower customer satisfaction due to exploding data traffic, decreasing ARPU, and OTT players marginalizing their opportunities to generate new revenue from content. The Japanese market, with one of the highest ARPUs, has been the battlefield on which technology providers compete to help local telcos serve their high-value customers in a country where people have very high expectations of telecommunications services. Two weeks ago, I participated in Nokia Networks’ analyst days in Tokyo and was interested to see how the company has increased its share in Japan over the past couple of years. To continue its success in the age of the customer, Nokia Networks must help Japanese telcos better win, serve, and retain customers.
Two days of briefings and discussions convinced me that Nokia Networks must address three critical items to maintain its leadership position in LTE radio in Japan:
Optimizing its networks to make its coverage and performance the best it can be in this very high-density market.
Introducing customized features from its Japan R&D lab to meet the most demanding operators in the world.
Helping telcos meet or exceed their customers’ expectations via a customer experience management (CEM) solution, although the revenue contribution is much smaller. Obviously, what customers experience and perceive is what really decides how effective all of the network improvements have been.