I had a conversation recently with Brian Lent, founder, chairman, and CTO of Medio. If you don’t know Brian, he has worked with companies such as Google and Amazon to build and hone their algorithms and is currently taking predictive analytics to mobile engagement. The perspective he brings as a data scientist not only has ramifications for big data analytics, but drastically shifts the paradigm for how we architect our master data and ensure quality.
We discussed big data analytics in the context of behavior and engagement. Think shopping carts and search. At the core, analytics is about the “closed loop.” It is, as Brian says, a rinse and repeat cycle. You gain insight for relevant engagement with a customer, you engage, then you take the results of that engagement and put them back into the analysis.
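The rinse-and-repeat cycle Brian describes can be sketched in a few lines of code. This is a minimal illustration only, with entirely hypothetical function names and a toy "engagement" step; it simply shows the shape of the loop: analyze, engage, feed the result back into the history.

```python
# Minimal sketch of the closed-loop ("rinse and repeat") analytics cycle.
# All names and the toy logic are hypothetical, for illustration only.

def analyze(history):
    """Derive a simple insight: the event seen most often so far."""
    counts = {}
    for event in history:
        counts[event["item"]] = counts.get(event["item"], 0) + 1
    return max(counts, key=counts.get) if counts else None

def engage(customer, recommendation):
    """Stand-in for a real engagement (email, in-app offer, ad, ...)."""
    # Toy assumption: the customer always accepts the recommendation.
    return {"item": recommendation, "accepted": True}

def closed_loop(history, customer, cycles=3):
    for _ in range(cycles):
        insight = analyze(history)           # 1. gain insight
        result = engage(customer, insight)   # 2. engage the customer
        history.append(result)               # 3. put the result back in
    return history

events = [{"item": "search:shoes"}, {"item": "cart:shoes"}, {"item": "search:shoes"}]
print(closed_loop(events, customer="alice"))
```

The key design point is the third step: the outcome of each engagement becomes new input data, which is what distinguishes a closed loop from one-shot reporting.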
Sounds simple, but think about what that means for data management. Brian provided two principles:
How has a local company managed to defeat global giants like Pepsi, Coca-Cola, and Watsons in its market segment and hold market leadership for more than a decade? The answer comes from Nongfu Spring, a Chinese company in the manufacturing and retail industries. In my recent report, “Case Study: Technology Innovation Enables Nongfu Spring To Strengthen Market Leadership,” I analyzed the key factors behind its success and provided related best practices from an enterprise architecture perspective. These factors include:
Business strategy is enterprise architecture's top priority. EA pros often need to be involved in project-level IT activities to resolve issues and help IT teams put out fires. But it's much more important that architects have a vision, clearly understand the business strategy, and thoroughly consider the road map that will support it, so that they can address the root causes of challenges.
Agile infrastructure sets up the foundation for scalable business growth. Infrastructure scalability is the basis of business scalability. Infrastructure experts should consider not only the agility that virtualization and IaaS solutions will bring to next-generation infrastructure, but also network-level load balancing among multiple telecom carriers. They should also refine the network topology for enterprise security.
Yesterday Intel had a major press and analyst event in San Francisco to talk about its vision for the future of the data center, anchored on what has become in many eyes the virtuous cycle of future infrastructure demand: mobile devices and “the Internet of things” driving cloud resource consumption, which in turn spews out big data, which spawns storage and the requirement for yet more computing to analyze it. As usual with these kinds of events from Intel, it was long on serious vision and strong on strategic positioning, but a bit parsimonious on actual future product information, with a couple of interesting exceptions.
Content and Core Topics:
No major surprises on the underlying demand-side drivers. The proliferation of mobile devices, the impending Internet of Things, and the mountains of big data they generate will combine to continue increasing demand for cloud-resident infrastructure, particularly servers and storage, both of which present Intel with an opportunity to sell semiconductors. Needless to say, Intel laced its presentations with frequent reminders about who is the king of semiconductor manufacturing.
Initial business intelligence (BI) deployment efforts are often difficult to predict and may dwarf the investment you made in BI platform software. The effort and costs associated with professional services, whether you use internal staff or hire contractors, depend not only on the complexity of business requirements like metrics, measures, reports, dashboards, and alerts, but also on the number of data sources you are integrating, the complexity of your data integration processes, and logical and physical data modeling. At the very least, Forrester recommends considering the following components and their complexity to estimate development, system integration, and deployment effort:
I’ve been presenting research on big data and data governance for the past several months where I show a slide of a businesswoman doing a backbend to access data on her laptop. The point I make is that data management has to be hyper-flexible to meet a wider range of analytic and consumption demands than ever before. Translated, you need to cross-train for data management to have cross-fit data.
The challenge is that traditional data management takes a one-size-fits-all approach. Data systems are purpose-built. If organizations want to reuse a finance warehouse for marketing and sales purposes, it often isn’t a match and a new warehouse is built. If you want to get out of this cycle and go from data couch potato to data athlete, a cross-fit data training program should focus on:
Context first. Understanding how data is used and will provide value drives platform design. Context indicates more than where data is sourced from and where it will be delivered. Context answers: operations or analytics, structured or unstructured, persistent or disposable? These guide decisions around performance, scale, sourcing, cost, and governance.
Data governance zones. Command and control data governance creates a culture of “no” that stifles innovation and can cause the business to go around IT for data needs. The solution is to create policies and processes that give permission as well as mitigate risk. Loosen quality and security standards in projects and scenarios that are in contained environments. Tighten rules and create gates when called for by regulation, where there are ethical conflicts, or when data quality or access exposes the business to significant financial risk.
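The “context first” questions above lend themselves to a simple decision sketch. The platform choices and rules below are illustrative assumptions of my own, not recommendations from the research; the point is that a handful of context answers can drive platform and governance decisions mechanically.

```python
# Hypothetical sketch: mapping context answers to platform and governance
# decisions. The platform names and rules are illustrative assumptions.

def choose_platform(usage, structure, lifespan):
    """usage:     'operations' or 'analytics'
       structure: 'structured' or 'unstructured'
       lifespan:  'persistent' or 'disposable'"""
    if usage == "operations":
        # Operational data carries financial/regulatory risk: tighten gates.
        return "transactional store with strict quality gates"
    if structure == "unstructured":
        return "data lake / object storage"
    if lifespan == "disposable":
        # Contained sandbox: loosen quality and security standards.
        return "sandbox cluster, loosened governance"
    return "governed warehouse"

print(choose_platform("analytics", "structured", "persistent"))
```

A real context model would cover sourcing, performance, and cost dimensions as well, but even this toy version shows how context, rather than a one-size-fits-all template, can drive the design.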
We recently attended Amdocs' customer event in Singapore. Amdocs is gradually adjusting its strategy to reflect one of the most fundamental changes in the ICT industry today: Increasingly, business line managers (think the marketing or sales officer) are the ones influencing sourcing decisions. Traditional decision-makers, CTOs and CIOs, are no longer the sole ICT decision-makers. Amdocs is addressing this shift by:
Strengthening its customer experience portfolio. Successful telcos will try to regain lost relevance through improved customer experience. Marketing, portfolio product development, and sales are therefore growing in importance for telcos. Amdocs’ integrated customer experience offering, CES 9, provides telcos with a multichannel experience, proactive care, and self-service tools.
Betting big on big data/analytics. Amdocs is leveraging big data/analytics to provide real-time, predictive, and prescriptive insights to telcos about their customers’ behaviour. Communications-industry-specific converged charging and billing solutions as well as other catalogue solutions give Amdocs the opportunity to provide more value to telcos than some of the other players.
I attended Google’s annual Atmosphere road show recently, an event aimed at presenting solutions for business customers. The main points I took away were:
Google’s “mosaic” approach to portfolio development offers tremendous potential. Google has comprehensive offerings covering communications and collaboration solutions (Gmail, Google Plus), contextualized services (Maps, Compute Engine), application development (App Engine), discovery and archiving (Search, Vault), and access tools to information and entertainment (Nexus range, Chromebook/Chromebox).
Google’s approach to innovation sets an industry benchmark. Google is going for 10x innovation, rather than the typical industry approach of pursuing 10% incremental improvements. Compared with its peers, this “moonshot” approach is unorthodox. However, moonshot innovation constitutes a cornerstone of Google’s competitive advantage. It requires Google’s team to think outside established norms. One part of its innovation drive encourages staff to spend 20% of their work time outside their day-to-day tasks. Google is a rare species of company in that it does not treat experiments that don’t work out as failures. Google cuts its losses and looks at the lessons learned, and employees move on to new projects.
The Obama 2012 campaign famously used big data predictive analytics to influence individual voters. They hired more than 50 analytics experts, including data scientists, to predict which voters would be positively persuaded by political campaign contact such as a call, door knock, flyer, or TV ad. Uplift modeling (aka persuasion modeling) is one of the hottest forms of predictive analytics, for obvious reasons — most organizations wish to persuade people to do something, such as buy! In this special episode of Forrester TechnoPolitics, Mike interviews Eric Siegel, Ph.D., author of Predictive Analytics, to find out: 1) What exactly is uplift modeling? and 2) How did the Obama 2012 campaign use it to persuade voters? (< 4 minutes)
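To make the idea concrete: one common approach to uplift modeling, the “two-model” method, estimates response rates separately for a treated group (people who were contacted) and a control group (people who were not), then scores the difference. The sketch below is a toy illustration of that general technique, not the campaign’s actual method, which the episode leaves to the interview; the segments and data are invented.

```python
# Toy "two-model" uplift sketch (illustrative only).
# Uplift(x) = P(respond | treated, x) - P(respond | control, x)
from collections import defaultdict

def fit_rate_model(records):
    """Estimate P(respond) per segment from (segment, responded) pairs."""
    hits, totals = defaultdict(int), defaultdict(int)
    for segment, responded in records:
        totals[segment] += 1
        hits[segment] += int(responded)
    return {s: hits[s] / totals[s] for s in totals}

def uplift(segment, treated_model, control_model):
    return treated_model.get(segment, 0.0) - control_model.get(segment, 0.0)

# Invented data: voters contacted (treated) vs. not contacted (control).
treated = [("young_urban", 1), ("young_urban", 1), ("rural", 0), ("rural", 0)]
control = [("young_urban", 0), ("young_urban", 1), ("rural", 0), ("rural", 1)]

t_model = fit_rate_model(treated)
c_model = fit_rate_model(control)
print(uplift("young_urban", t_model, c_model))  # positive: contact helps
print(uplift("rural", t_model, c_model))        # negative: contact may backfire
```

The negative-uplift case is what makes the technique interesting for campaigns: it identifies people you should *not* contact, because contact makes the desired outcome less likely.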
Where customer experience and analytics meet, in real time
For a while now, I’ve been using Hailo as a European poster child for innovation in the context of big data analytics. Due to the level of interest generated by this example, and the number of questions I’ve received along the way about Hailo, its technology and business model, etc., I decided to put together this blog post rather than write loads of separate emails.
Ironically, I’ve not actually been able to use Hailo myself (much as I would like to), as I have neither an iOS- nor an Android-based smartphone. I have, however, met lots of people who’re using Hailo as customers, and I’ve also spoken to taxi drivers about it. I have yet to meet anybody who isn’t a fan.
For those of you who don’t know Hailo, it’s an app that allows you to hail a registered cab from your smartphone; as it was started in London, it’s often also called “the black cab app.” With the company founders being three London cabbies (black cab drivers), the entire service has been uniquely focused around the needs of the two main participants in a taxi ride: the customer and the driver.
Notes from the TechAmerica Europe seminar in Brussels, March 27, 2013
This may not be the most timely event write-up ever produced, but in light of all the discussions I’ve had on the same themes during the past few weeks, I thought I’d share my notes anyway.
The purpose of the event was to peel away some of the hype layers around the “big data” discussion, and — from a European perspective — take a look at the opportunities as well as the challenges brought by the increasing amounts of data that are available and the technologies that enable their exploitation. As was to be expected, an ever-present subtext was the potential of having laws and regulations put in place which — while well-intentioned — can ultimately stifle innovation and even act against consumer interests. And speaking of innovation: Another theme running through several of the discussions was the seeming lack of technology-driven innovation in Europe, in particular when considered in the context of an economic environment in dire need of every stimulus it can get.
The scene was set by John Boswell, senior VP, chief legal officer, and corporate secretary at SAS, who provided a neat summary of the technology developments (cheap storage, unprecedented access to compute power, pervasive connectivity) giving rise to countless opportunities related to the availability, sharing, and exploitation of ever-increasing amounts of data. He also outlined the threats posed to companies, governments, and individuals by those with more sinister intent when it comes to data exploitation, be it for ideological, financial, or political reasons. Clearly, those threats require mitigation, but John also made the point that “regulatory overlays” can also hinder progress, by limiting or even preventing altogether the free flow of data.