Day one of the first Cognitive Computing Forum in San Jose, hosted by Dataversity, gave a great perspective on the state of cognitive computing: promising, but early. I am here this week with my research director Leslie Owens and analyst colleague Diego Lo Giudice. Gathering research for a series of reports in our cognitive engagement coverage, we were able to debrief tonight on what we heard and the questions these insights raise. Here are some key takeaways:
1) The big data mind shift to explore and accept failure is a heightened principle. Chris Welty, formerly at IBM and a key developer of Watson and its Jeopardy-winning solution, preached restraint. The analytic pursuit of perfect answers delivers no business value. Keep your eye on the prize and move the needle on what matters, even if your batting average is only .300 (30%). The objective is a holistic pursuit of optimization.
2) The algorithms aren't new; platform capabilities and greater access to data are what let us put cognitive computing to production use. Every speaker, whether academic, vendor, or independent expert, agreed that the algorithms created decades ago are the same. Hardware and the volume of available data have made neural networks and other machine learning algorithms both practical and more effective.
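The point is easy to demonstrate: the perceptron, the ancestor of today's neural networks, dates to Rosenblatt's work in the late 1950s, and its learning rule fits in a dozen lines. A minimal sketch (the training data and hyperparameters here are purely illustrative):

```python
# A minimal perceptron (Rosenblatt, late 1950s) learning the AND function.
# The algorithm itself predates "big data" by half a century; what's new is
# the scale of data and hardware we can apply to such models.

def train_perceptron(samples, epochs=10, lr=0.1):
    """samples: list of (inputs, label) pairs with label in {0, 1}."""
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, label in samples:
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction  # 0 when correct; +/-1 when wrong
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# AND truth table as training data
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train_perceptron(data)
print([predict(weights, bias, x) for x, _ in data])  # [0, 0, 0, 1]
```

The same update rule, scaled up into multi-layer networks and fed far more data on far faster hardware, is what underpins today's production systems.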
When it comes to data technology, are you lost in translation? What's the difference between data federation, virtualization, and data- or information-as-a-service? Are columnar databases also relational? Does one use the same or different tools for BAM (Business Activity Monitoring) and for CEP (Complex Event Processing)? These questions are just the tip of the iceberg in the rich and complex world of enterprise data and information terminology. Enterprise application developers and data and information architects already manage multiple challenges on a daily basis, and the last thing they need to deal with is misunderstandings of the various data technology component definitions.
The tide is turning on privacy. Since the earliest days of the World Wide Web, there has been an increasing sense that the Internet would effectively kill privacy – and in the wake of the NSA PRISM program revelations, that sentiment was stronger than ever. However, by using Forrester’s Technographics 360 methodology, which blends multiple qualitative and quantitative data sources, we found that attitudes on privacy are evolving: Consumers are beginning to shift from a state of apathy and resignation to caution and empowerment.
China faces a growing air pollution problem — one of the consequences of its significant economic growth over the past two decades. Surrounded by a large number of coal-burning factories in Hebei province, Beijing faces ever-worsening smog. To tackle this problem, the city government has implemented new policies and laws, such as the Beijing Air Pollution Control Regulations, that provide guidance to technology vendors developing smog control solutions.
Optimized Energy Management Is The Key To Reducing Air Pollution
Beijing’s government is focusing on air quality monitoring and has invited tech vendors like Baidu, IZP Technologies, and Yonyou to develop solutions. The city wants to show the source of pollutants and how they will disperse across Beijing a couple of days in advance — but that doesn’t do anything to reduce the smog itself. Rather, the key to reducing air pollution is changing how China consumes energy. For example, the government could use big data analytics to:
Optimize factories’ energy consumption. Asset-intensive industries like steel, cement, and chemicals face challenges in analyzing the vast amounts of data generated by energy-monitoring sensors and devices. Tech vendors like Cisco and IBM could leverage their Internet of Things data analysis technology to help customers turn this data into actionable insights. For example, one steel factory in Hebei province is considering technology that identifies when an oxygen furnace is wasting energy because the temperature of the output smoke is too high.
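That furnace example boils down to simple anomaly detection on a sensor stream. A hypothetical sketch of the idea (the sensor readings and the 300 °C efficiency threshold are invented for illustration, not real plant specifications):

```python
# Illustrative sketch: flag time intervals where an oxygen furnace is likely
# wasting energy because the output smoke (flue) temperature runs too high.
# The threshold and readings below are assumptions for the example.

FLUE_TEMP_LIMIT_C = 300.0  # assumed efficiency threshold, not a real spec

def find_waste_intervals(readings, limit=FLUE_TEMP_LIMIT_C):
    """readings: list of (timestamp, flue_temp_c) in time order.
    Returns (start, end) pairs covering runs of readings above the limit."""
    intervals, start = [], None
    for ts, temp in readings:
        if temp > limit and start is None:
            start = ts                      # entering an over-limit run
        elif temp <= limit and start is not None:
            intervals.append((start, ts))   # run ended at this reading
            start = None
    if start is not None:                   # run extends to the last reading
        intervals.append((start, readings[-1][0]))
    return intervals

readings = [(0, 250), (1, 310), (2, 320), (3, 280), (4, 305), (5, 290)]
print(find_waste_intervals(readings))  # [(1, 3), (4, 5)]
```

Real Internet-of-Things deployments layer streaming infrastructure and trained models on top, but the business insight — turning raw sensor data into "this furnace wasted energy between these times" — has this shape.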
The battle over customer versus internal business process requirements and priorities has been fought — and the internal processes lost. Game over. Customers are now empowered with mobile devices and ubiquitous, cloud-based, all-but-unlimited access to information about products, services, and prices. Customer stickiness is extremely difficult to achieve: customers demand instant gratification of their ever-changing needs, tastes, and requirements, while switching vendors is just a matter of a few taps on a mobile phone. Forrester calls this phenomenon the age of the customer. The age of the customer elevates business and technology priorities to achieve:
Business agility. Forrester consistently finds one common thread running through the profile of successful organizations — the ability to manage change. In the age of the customer, business agility often equals the ability to adapt, react, and succeed in the midst of an unending stream of customer-driven requirements. Forrester sees agile organizations making decisions differently by embracing a new, more grass-roots-based management approach. Employees down in the trenches, in individual business units, are the ones in close touch with customer problems, market shifts, and process inefficiencies. These workers are often in the best position to understand challenges and opportunities and to make decisions to improve the business. It is only when responses to change come from within, from these highly aware and empowered employees, that enterprises become agile, competitive, and successful.
The Eyeo Festival took place in Minneapolis last week. I missed it. I missed it for a very good reason, which is that I just started a new job as a Principal Analyst at Forrester Research. But I still followed from afar, wishing I could hear firsthand about some of the fantastic projects and ideas that get presented there (and I’ll certainly check out the videos as they get posted).
What is the Eyeo Festival, you might be wondering? It’s a small annual conference that “brings together creative coders, data designers, and creators working at the intersection of data, art, and technology for inspiring talks, workshops, labs, and events.” I’ve been to two out of the four conferences and have come away both times incredibly inspired and impressed. This is not just big data. This is big, beautiful, informative data. The coders, designers, and creators both at Eyeo and elsewhere provide living proof that big (and small) data doesn’t have to be ugly, messy, or impossible to understand.
It can have an emotional impact and make a point, as in this project by Kim Rees and Periscopic, which uses mortality data from the World Health Organization to estimate the number of years lost to gun deaths in 2013 alone.
Recent news of a computer program that passed the Turing Test is a great achievement for artificial intelligence (AI). Pulling down the barrier between human and machine has been a decades-long holy grail pursuit. Right now, it is a novelty. In the near future, the implications are immense.
Which brings us to why you should care.
Earlier this week the House majority leader, Eric Cantor, suffered an enormous defeat in Virginia's Republican primary at the hands of Tea Party candidate David Brat. No one predicted this: the polls were wrong, by a long shot. Frank Luntz, a Republican pollster and communications advisor, offered his opinion on what was missing in a New York Times op-ed piece: a lack of face-to-face discussions and interviews with voters. He asserts that because data collection was limited to discrete survey questions, it lacked context. Information such as voter mood, perceptions, motives, and overall mindset was missing. Even if you collected quantitative data across a variety of sources, you wouldn't get to these predictive indicators.
The new wave of AI (over the next two to five years) makes capturing this insight possible, and at scale. Marketing organizations are already using such capabilities to test advertising messages and positioning in focus group settings. But if pollsters took this a step further and ingested full discussions (in-person research interviews or transcripts, street polls, social media, news discussions and interviews, and other sources where citizens' points of view on voting manifest directly and indirectly), that rich content would translate into more accurate and insightful information.
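To make the idea concrete, here is an illustrative sketch of scoring voter mood across interview transcripts. The lexicon and transcripts are invented for the example; real systems would use trained natural-language models rather than a hand-built word list:

```python
# Toy example only: estimate voter mood from transcripts with a tiny
# hand-built lexicon. The word lists and transcripts are illustrative
# assumptions, not a real polling methodology.

POSITIVE = {"hope", "trust", "better", "support"}
NEGATIVE = {"angry", "betrayed", "worse", "ignored"}

def mood_score(text):
    """Net (positive - negative) word count, normalized by words matched.
    Returns a value in [-1.0, 1.0]; 0.0 when no lexicon words appear."""
    tokens = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    matched = pos + neg
    return (pos - neg) / matched if matched else 0.0

transcripts = [
    "I feel ignored and angry, nothing gets better.",
    "There is hope, and I trust the new candidate.",
]
for t in transcripts:
    print(round(mood_score(t), 2), "|", t)
```

Even this crude scoring surfaces the kind of mood signal that discrete survey questions miss; the point of the next AI wave is doing this with far richer models over millions of such sources.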
Big data this, big data that. Hardly a day goes by when we're not bombarded with messages about the big data platforms and technologies that will solve all our marketing problems. Let's be honest though: these tools and technologies alone simply won’t solve the big data challenge. But the effect of all that media and market hype? A lot of confusion and mistrust on the part of marketing leaders about what big data really is, what it can do, and how it should be incorporated into business strategy. And that's holding a lot of firms back from maximizing the power of the data at their disposal.
By now you're asking yourself how anything I've said so far is different or unique. Here it is: "big data" isn't about exabytes or petabytes. It's not about velocity. It's not a project or Hadoop or any other single thing. Big data is a journey that every company must take to close the gap between the data that's available to them and the business insights they're deriving from that data. This is a definition that business and technology leaders alike can understand and use to better win, serve, and retain customers.
My colleague, Brian Hopkins, and I have just published a pair of reports -- researched and written in parallel -- to help our marketing and technology management clients work together to tackle the opportunities and challenges of big data. Here are a few of the most interesting "a-ha" moments of the research:
Big data is undergoing big change, but most companies are missing it or just grasping at the edges. My colleague Fatemeh Khatibloo and I have just completed an exhaustive study of the big data phenomenon. We found a familiar pattern: business confusion in the face of stern warnings about the dangers of big data and vendor-sponsored papers extolling its benefits. Here’s what we found hidden beneath the buzz:
As data explodes, so do old ways of doing business.
Everywhere we look, we find businesses using more diverse, messier, and larger data sets to stay competitive in the age of the customer — like the consumer goods firm that allocated marketing dollars based on flu trend predictions and the oil and gas companies that used weather data to predict iceberg flows and extend their drilling season. Savvy businesses find ways to turn more data into a competitive advantage. If your firm doesn't get this, it won't be pretty — starting in the not-too-distant future.
Technology managers and architects can’t afford to sit back and think that their Hadoop project will deliver everything the business needs. Nor can you afford to think that big data isn’t for you because you don’t have that much data. Why? Because “big data” is really the practices and technologies that close the gap between the available data and the ability to turn that data into business insight — insight that your firm needs to survive and thrive in the age of the customer. Four things to understand: