Notes from the TechAmerica Europe seminar in Brussels, March 27, 2013
This may not be the most timely event write-up ever produced, but in light of all the discussions I’ve had on the same themes during the past few weeks, I thought I’d share my notes anyway.
The purpose of the event was to peel away some of the hype around the “big data” discussion and, from a European perspective, take a look at the opportunities as well as the challenges brought by the increasing amounts of data that are available and the technologies that enable their exploitation. As was to be expected, an ever-present subtext was the potential of laws and regulations being put in place which, while well-intentioned, can ultimately stifle innovation and even act against consumer interests. And speaking of innovation: another theme running through several of the discussions was the seeming lack of technology-driven innovation in Europe, in particular when considered in the context of an economic environment in dire need of every stimulus it can get.
The scene was set by John Boswell, senior VP, chief legal officer, and corporate secretary at SAS, who provided a neat summary of the technology developments (cheap storage, unprecedented access to compute power, pervasive connectivity) giving rise to countless opportunities related to the availability, sharing, and exploitation of ever-increasing amounts of data. He also outlined the threats posed to companies, governments, and individuals by those with more sinister intent when it comes to data exploitation, be it for ideological, financial, or political reasons. Clearly, those threats require mitigation, but John also made the point that “regulatory overlays” can hinder progress by limiting, or even preventing altogether, the free flow of data.
IBM has just announced that one of Australia’s “big four” banks, the ANZ, will adopt the IBM Watson technology in their wealth management division for customer service and engagement. Australia has always been an early adopter of new technologies but I’d also like to think that we’re a little smarter and savvier than your average geek back in high school in 1982.
IBM’s Watson announcement is significant, not necessarily because of the sophistication of the Watson technology, but because of IBM's ability to successfully market the Watson concept.
To take us all back a little, the term ‘cognitive computing’ emerged in response to the failings of what was once termed ‘artificial intelligence’. Though the underlying concepts have been around for 50 years or more, AI remains a niche and specialist market with limited applications and a significant trail of failed or aborted projects. That’s not to say that we haven’t seen some sophisticated algorithm-based systems evolve. There’s already a good portfolio of large-scale, deep analytic systems developed in the areas of fraud, risk, forensics, medicine, physics, and more.
BI professionals spend a significant portion of their time trying to instill the discipline of data-driven performance management into their business partners. However, isn’t there something wrong with teaching someone else to fly when you’re still learning to walk? Few BI pros have a way to measure their BI performance quantitatively (46% do not measure BI performance efficiencies and 55% do not measure effectiveness). Everyone collects statistics on database and BI application server performance, and many conduct periodic surveys to gauge business users’ level of satisfaction. But how do you really know if you have a high-performing, widely used, popular BI environment? For example, you should be tracking:
Efficiency metrics, such as the number of times a report is used, the number of duplicate/similar reports, etc.
Effectiveness metrics, such as the average number of clicks to find a report, the number of clicks within a report to find an answer to a question, and many others
Metric attributes/dimensions, such as users, roles, departments, LOBs, regions, and others
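Metrics like these can be computed directly from a BI server's audit or usage log. The sketch below is purely illustrative (the log schema, field names, and the "rarely used" threshold are hypothetical; real audit logs vary by vendor), showing how simple efficiency metrics fall out of such a log:

```python
from collections import Counter

def bi_efficiency_metrics(usage_log):
    """Compute simple BI efficiency metrics from a usage log.

    usage_log: list of dicts with keys 'report', 'user', 'department'
    (a hypothetical schema for illustration only).
    """
    runs_per_report = Counter(e["report"] for e in usage_log)
    users_per_report = {}
    for e in usage_log:
        users_per_report.setdefault(e["report"], set()).add(e["user"])
    # Reports run at most once are candidates for duplicate/unused review.
    rarely_used = [r for r, n in runs_per_report.items() if n <= 1]
    return {
        "runs_per_report": dict(runs_per_report),
        "distinct_users": {r: len(u) for r, u in users_per_report.items()},
        "rarely_used": sorted(rarely_used),
    }

log = [
    {"report": "sales_summary", "user": "ann", "department": "Sales"},
    {"report": "sales_summary", "user": "bob", "department": "Sales"},
    {"report": "legacy_margin", "user": "ann", "department": "Finance"},
]
metrics = bi_efficiency_metrics(log)
print(metrics["runs_per_report"])  # {'sales_summary': 2, 'legacy_margin': 1}
print(metrics["rarely_used"])      # ['legacy_margin']
```

The department field hints at the third category: grouping the same counts by department, role, or region turns raw usage numbers into the metric dimensions described above.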
The deluge of customer data shows no signs of abating. The perpetually connected customer leaves data footprints in every interaction with a brand. This presents tremendous opportunities for the customer insights professionals and analytics practitioners tasked with analyzing this data: not only to get smarter about customers but also to ensure that the insights are appropriately used at the point of customer interaction.
When we asked customer analytics users about the challenges and drivers of customer analytics adoption, we found that data integration and data quality continue to inhibit broader adoption of customer analytics, while users still want to use analytics to improve the data-driven focus of the organization and to drive satisfaction and customer retention.
Forrester’s Customer Analytics Playbook guides customer insights professionals, marketing scientists, and customer analytics practitioners into this new reality of customer data, helping them discover analytics opportunities, plan for greater sophistication, take steps toward building a customer analytics capability, and continually monitor the progress of analytics initiatives. It will include 12 chapters (and an executive overview) that cover different aspects of customer analytics.
Whether you are just starting on your BI journey or are continuing to improve on past successes, a shortage of skilled and experienced BI resources is going to be one of your top challenges. You are definitely not alone in this quest. Here are some scary statistics:
“By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills, as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.” (Source: May 2012 McKinsey Global Institute report on Big Data)
“… trigger a talent shortage, with up to 190,000 skilled professionals needed to cope with demand in the US alone over the next five years.” (Source: 2012 Deloitte report on technology trends)
“Fewer than 25% of the survey respondents worldwide said they have the skills and resources to analyze unstructured data, such as text, voice, and sensor data.” (Source: 2012 research report by IBM and the Saïd Business School at the University of Oxford)
Digital capability – social, mobile, cloud, data & analytics – disrupts business models, introduces new competitive threats, and places new demands on your business. Highlighting this fact: Forrester’s 2012 “Digital Readiness Assessment” survey found that 65% of global executives say they are “excited about the changes that digital tools and experiences will bring” to their company.
While most people know these digital trends are coming, far fewer know how to purchase these cutting-edge digital capabilities. What companies will you rely on? Where are the new risks? What are the pricing models? In the survey mentioned above, only 32% of the same sample agreed that their organization “has policies and business practices in place to adapt” to those digital changes.
This is important, since developing the breadth of digital capabilities your company needs cannot all be done in-house. To succeed, your company will need to access the strengths of its supplier ecosystem, maximize value from strategic partners, and leverage emerging supplier models.
This is a tremendous opportunity for sourcing and vendor management professionals to increase the strategic value they provide to their business. But to do this, you’ll need to balance your traditional cost-cutting goals with business expectations for growth, innovation, and value.
Executives at leading-edge organizations drive growth, innovate, and disrupt industries through emerging technologies: social, mobile, cloud, analytics, sensors, GIS, and others. In a recent survey, 85% of executives said that “the need to drive innovation and growth” would have a moderate or high impact on IT services spending. But today’s technology buyers face a fragmented, fast-moving landscape of niche technology and services providers in newer spaces (social, mobile, cloud), as well as new offerings from their largest global partners.
Often the leading- and bleeding-edge disruption comes from business stakeholders rather than from IT or sourcing executives, and sourcing executives struggle to keep up with the fast pace of change that the business demands. Our research shows that this fragmented, divisional, siloed approach to buying (often under the radar) can create risk and work against enterprise IT strategy decisions.
To help their organizations navigate through these emerging options, we have identified three key principles of IT sourcing strategy:
Change the rules for working with vendors and partners. To thrive in the world of digital disruption and to enable sourcing of emerging technologies and services that drive digital disruption, sourcing strategists must create new rules for working with technology partners. They must increase the emphasis on innovation and differentiation and treat partners who excel in these dimensions differently from other tiered suppliers.
The analytics community is experiencing a rebirth. A renewal. A renaissance. Why? Data is bursting from every corner, from every device, allowing brands to deliver relevant messages and offers to their customers. So being an analytics connoisseur is more important now than ever. I mean, who else is going to play with all this data . . . and actually enjoy it?
Organizations must develop relevant marketing strategies across devices -- to different customers -- and have the advanced measurement and analytic frameworks to fuel decisions. And the perpetually connected customer is forcing organizations to act quickly, so near-real-time insights are paramount. My past research addresses this: specifically, how analytics professionals can use attribution as a way to understand the true value of each interaction point. This is even more complex because of the increase in cross-device usage. As a result, analytics pros are using savvy ways to connect information and to measure cross-device impact and incremental value.
Mobile BI and cloud BI are among the top trends that we track in the industry. Our upcoming Enterprise BI Platforms Wave™ will dedicate a significant portion of the vendor evaluation to these two capabilities. These capabilities are far from yes/no checkmarks. Just asking vague questions like “Can you deliver your BI functionality on mobile devices?” and “Is your BI platform available in the cloud as software-as-a-service?” will lead to incomplete vendor answers, which in turn may lead you to make the wrong vendor selections. Instead, we plan to evaluate these two critical BI platform capabilities along the following parameters:
Animations. Does the product support animations? For example, if a particular dimension, such as time, has hundreds or thousands of values (as in daily values over multiple years), manually clicking through every day is not practical. Launching an automated, animated scroll up and down such a dimension is a more practical approach.
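The mechanics behind such an animated scroll are straightforward to sketch. The example below is purely illustrative (not any vendor's API): it enumerates the daily members of a time dimension and produces one frame per member, which a UI would then render on a timer instead of requiring a manual click per value:

```python
from datetime import date, timedelta

def dimension_members(start, end):
    """Enumerate the daily members of a time dimension, oldest first."""
    days = (end - start).days + 1
    return [start + timedelta(days=i) for i in range(days)]

def animation_frames(members, measure):
    """Yield (member, value) pairs in order -- one frame per member.

    'measure' stands in for whatever query returns the value to
    display for a given dimension member (hypothetical here).
    """
    for m in members:
        yield m, measure(m)

members = dimension_members(date(2011, 1, 1), date(2012, 12, 31))
frames = list(animation_frames(members, lambda d: d.toordinal() % 100))
print(len(frames))  # 731 daily frames over two years -- impractical to click through one by one
```

Two years of daily values already means 731 frames, which makes the case from the paragraph above concrete: stepping through them automatically is the only practical way to watch a measure evolve along such a dimension.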
Reflections from the 10th Safer Internet Day Conference in Berlin, February 5th 2013
Earlier this month, I had the pleasure of speaking at the Safer Internet Day Conference in Berlin, organized by the Federal Ministry of Consumer Protection, Food and Agriculture and BITKOM, the German Association for Information Technology, Telecommunication and New Media. The conference title, ‘Big Data – Gold Mine or Dynamite?’, set the scene; after my little introductory speech on what big data really means and why it is a relevant topic for all of us (industry, consumers, and government), the follow-up presentations pretty much focused either on the ‘gold mine’ or the ‘dynamite’ aspect. To come straight to the point: I was very surprised, if not slightly shocked, at how deep a gap became visible between the industry on the one side and the government (mainly the data protection authorities) on the other.
While industry representatives, spearheaded by the BITKOM president Prof. Dieter Kempf and speakers from IBM, IMS Health, SAS, and others, highlighted interesting showcases and future opportunities for big data, Peter Schaar, the Federal Commissioner for Data Protection, seemed to be on a crusade to protect ‘innocent citizens’ from the ‘baddies’ in the industry.