CMOs historically focused narrowly on marketing and promotion. That’s not enough in the age of the customer. The CMO of 2015 must own the most important driver of business success -- the customer experience -- and represent the customer’s perspective in corporate strategy. Andy Childs at Paychex is a great example -- he owns not only traditional marketing but strategic planning and M&A.
We are in a golden age of data breaches; just this week, the United States Postal Service was the latest casualty, and consumer attitudes about data security and privacy are evolving accordingly. If your data security and privacy programs exist only to meet compliance requirements, you’re going to be in trouble. Data (and the resulting insights) is power, but data can also be the downfall of an organization when it is improperly handled or lost.
In 2015, Forrester predicts that privacy will be a competitive differentiator. There is a maze of conflicting global privacy laws to navigate and business partner requirements to meet in today’s data economy. There’s also a fine line between cool and creepy, and often it’s blurred. Companies such as Apple are sensitive to this and are adjusting their strategies and messaging accordingly. Meanwhile, customers — both consumers and businesses — vote with their wallets.
An IT mindset has dominated the way organizations view and manage their data. Even as issues of quality and consistency rear their ugly heads, the solution has often been to turn to tools and approach data governance in a project-oriented manner. Sustainability has been a challenge, often relegated to IT managing and updating data management tools (MDM, data quality, metadata management, information lifecycle management, and security). Forrester research has shown that less than 15% of organizations have business-led data governance that is linked to business initiatives, objectives, and outcomes. But this is changing. More and more organizations are looking toward data governance as a strategic enterprise competence as they adopt a data-driven culture.
This shift from project to strategic program requires more than basic workflow, collaboration, and data profiling capabilities to institutionalize data governance policies and rules. The conversation can't start with the data management technology (MDM, data quality, information lifecycle management, security, and metadata management) that will apply the policies and rules. It has to begin with what the organization is trying to achieve with its data; this is a strategy discussion and process. The implication: governing data requires a rethink of your operating model. New roles, responsibilities, and processes emerge.
On May 5, 2014, Target announced the resignation of its CEO, Gregg Steinhafel, in large part because of the massive and embarrassing customer data breach that occurred just before the 2013 U.S. holiday season kicked into high gear. After a security breach or incident, the CISO (or whoever is in charge of security) or the CIO, or both, are usually axed. Someone’s head has to roll. But the resignation of the CEO is unusual, and I believe this marks an important turning point in the visibility, prioritization, importance, and funding of information security. It’s an indication of just how much:
Security directly affects the top and bottom line. Early estimates of the cost of Target's 2013 holiday security breach indicate a potential customer churn of 1% to 5%, representing anywhere from $30 million to $150 million in lost net income. Target's stock fell 11% after it disclosed the breach in mid-December, but investors pushed shares up nearly 7% on the news of recovering sales. In February 2014, the company reported a 46% decline in profits due to the security breach.
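The cited range is straightforward arithmetic: each percentage point of customer churn maps to roughly $30 million in lost net income, which implies an annual net income base of about $3 billion (an inference; the base figure is not stated in the source). A back-of-envelope sketch:

```python
# Back-of-envelope churn cost model implied by the cited estimates.
# The ~$3B net income base is an assumption inferred from the
# "$30 million per 1% churn" figure; it is not stated in the source.
NET_INCOME = 3_000_000_000  # assumed annual net income, in dollars

def churn_loss(churn_rate):
    """Lost net income, assuming churned customers' contribution
    scales linearly with the churn rate."""
    return NET_INCOME * churn_rate

for rate in (0.01, 0.05):
    print(f"{rate:.0%} churn -> ${churn_loss(rate) / 1e6:.0f}M lost net income")
```

Running the sketch reproduces the $30 million to $150 million range for 1% to 5% churn.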
Poor security will tank your reputation. The last thing Target needed was to be a permanent fixture of the 24-hour news cycle during the holiday season. Sure, like other breached companies, Target’s reputation will likely bounce back, but it will take a lot of communication, investment, and other efforts to regain customer trust. The company announced last week that it will spend $100 million to adopt chip-and-PIN technology.
According to recent Business Technographics data, half of US enterprise technology management professionals report that 1) they have no way to gain a single view of status and availability across their portfolio of cloud services, 2) they don’t have a clear way to assess the risk of using a third-party public as-a-service offering, and/or 3) they have no way to manage how providers handle their data.
An interesting debate is ensuing regarding how to best protect cloud data, given the market landscape. So far two modalities are emerging:
A. In-line encryption: inserting an encryption gateway between the enterprise and the SaaS provider that encrypts and/or tokenizes all data before it goes to the cloud, keeping it safe while still interoperating with public cloud systems.
B. The human-firewall model: IT closely monitors activity with context/content analytics and anomaly detection tools.
The truth lies somewhere between the two. By carefully applying Forrester’s data security and control framework, clients should incrementally encrypt data deemed sensitive to compliance or regulation, such as credit card and Social Security numbers, and closely monitor all activity across users and cloud applications.
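A minimal sketch of the field-level half of this approach, tokenizing only designated sensitive fields before a record leaves for the cloud while the token-to-value vault stays on premises (the field names and vault design here are illustrative assumptions, not any vendor's implementation):

```python
import secrets

# Fields deemed sensitive to compliance or regulation (illustrative).
SENSITIVE_FIELDS = {"credit_card", "ssn"}

class TokenVault:
    """On-premises vault mapping opaque tokens back to original values."""
    def __init__(self):
        self._store = {}

    def tokenize(self, value):
        token = "tok_" + secrets.token_hex(8)  # random, non-reversible token
        self._store[token] = value
        return token

    def detokenize(self, token):
        return self._store[token]

def outbound(record, vault):
    """Replace sensitive fields with tokens before sending a record
    to the SaaS provider; non-sensitive fields pass through untouched."""
    return {k: vault.tokenize(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

vault = TokenVault()
safe = outbound({"name": "Alice", "ssn": "123-45-6789"}, vault)
# The cloud application sees only the token; the real SSN never
# leaves the enterprise, yet the record remains usable downstream.
```

Selective tokenization like this preserves cloud-side functionality for non-sensitive fields, which is why the incremental approach tends to beat encrypting everything wholesale.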
Many of us in the information security space have a proud legacy of purchasing only best-of-breed point solutions. In my early days as an information security practitioner, I only wanted to deploy these types of standalone solutions. One of the problems with this approach is that it results in a bloated security portfolio with little integration between security controls. This bloat adds unneeded friction to the infosec team’s operational responsibilities. We talk about adding friction to make the attacker’s job more difficult; what about this self-imposed friction? S&R pros’ jobs are hard enough. I’m not suggesting that you eliminate best-of-breed solutions from consideration; I’m suggesting that any “point solution” that functions in isolation and adds unneeded operational friction shouldn’t be considered.
For decades, firms have deployed applications and BI on independent databases and warehouses, supporting custom data models, scalability, and performance while speeding delivery. It’s become a nightmare to try to integrate the proliferation of data across these sources in order to deliver the unified view of business data required to support new business applications, analytics, and real-time insights. The explosion of new sources, driven by the triple-threat trends of mobile, social, and the cloud, amplified by partner data, market feeds, and machine-generated data, further aggravates the problem. Poorly integrated business data often leads to poor business decisions, reduces customer satisfaction and competitive advantage, and slows product innovation — ultimately limiting revenue.
Forrester’s latest research reveals how leading firms are coping with this explosion using data virtualization, leading us to release a major new version of our reference architecture, Information Fabric 3.0. Since Forrester invented the category of data virtualization eight years ago with the first version of information fabric, these solutions have continued to evolve. In this update, we reflect new business requirements and new technology options including big data, cloud, mobile, distributed in-memory caching, and dynamic services. Use information fabric 3.0 to inform and guide your data virtualization and integration strategy, especially where you require real-time data sharing, complex business transactions, more self-service access to data, integration of all types of data, and increased support for analytics and predictive analytics.
Information fabric 3.0 reflects significant innovation in data virtualization solutions, including:
Earlier this month, The Information Technology & Innovation Foundation (ITIF) published a prediction that the U.S. cloud computing industry stands to lose up to $35 billion by 2016 thanks to the National Security Agency (NSA) PRISM project, leaked to the media in June. We think this estimate is too low: the loss could be as high as $180 billion, or a 25% hit to overall IT service provider revenues, in that same timeframe. That is, if you believe the assumption that government spying is a greater concern than the business benefits of going cloud.
Having read through the thoughtful analysis by Daniel Castro at ITIF, we commend him and the think tank on their reasoning and cost estimates. However, the analysis limits the impact to the actions of non-US corporations. The high-end figure assumes US-based cloud computing providers would lose 20% of the potential revenues available from the foreign market. We believe this revelation would have two additional impacts:
1. US customers would also bypass US cloud providers for their international and overseas business - costing these cloud providers up to 20% of this business as well.
2. Non-US cloud providers will lose as much as 20% of their available overseas and domestic opportunities due to other governments taking similar actions.
Let's examine these two cases in a bit more detail.
What happens in Vegas shouldn’t stay in Vegas. I was out at BlackHat with other members of the Forrester team over a week ago (seems like yesterday!). It was two jam-packed days of popping into briefings, guzzling copious amounts of green tea, meeting new people, and learning new things. In general, I like to keep an eye and ear out for startups to see what’s bubbling up, and I came across a few at BlackHat:
Co3 Systems. Co3 Systems* helps automate the four pillars of incident response (prepare, assess, manage, and report) and breaks down responsibilities and response steps to ensure that best practices are followed, along with compliance with regulatory requirements. They just updated their security module to include threat intelligence feeds from iSIGHT Partners, AlienVault, Abuse.ch, and SANS, and recently rolled out an EU data privacy and breach notification update to the product. I’m a numbers nerd, so when they let me play with the solution, I immediately started running simulations that estimated the cost of a breach.
FileTrek. FileTrek provides visibility and transparency into where data resides, how it’s being accessed, moved, used, changed, and shared between people, devices, and files. No, it’s not DLP. It’s more like the mother of all audit trails that takes context and sequence of events into account. That way, if someone who is supposed to have access to data starts to do things with it beyond what they normally do, FileTrek will flag it as suspicious activity.
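The flagging pattern described above, an authorized user suddenly acting outside their normal behavior, can be illustrated with a toy per-user baseline (a deliberately simplified sketch of the general idea, not FileTrek's actual method; the user and action names are hypothetical):

```python
from collections import defaultdict

class AuditBaseline:
    """Toy per-user activity baseline: flag any action a user has
    never performed during a training window. Real products use far
    richer context and sequence analysis; this only shows the pattern."""
    def __init__(self):
        self._seen = defaultdict(set)

    def observe(self, user, action):
        """Record an action seen during the baselining period."""
        self._seen[user].add(action)

    def is_suspicious(self, user, action):
        """True if an otherwise-authorized user deviates from their baseline."""
        return action not in self._seen[user]

baseline = AuditBaseline()
for action in ("open", "read", "edit"):
    baseline.observe("alice", action)

baseline.is_suspicious("alice", "read")         # -> False: normal activity
baseline.is_suspicious("alice", "bulk_export")  # -> True: outside her pattern
```

The point is that access control alone passes both requests; only the behavioral context distinguishes them.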
At a recent Enterprise Mobility event, I spoke with a few Asia-based IT directors about their journey in the age of the consumerization of IT and how they were dealing with bring-your-own technology (BYOT) at work. Their responses ranged from ‘fear of the unknown’ (as in ‘how do we deal with this trend?’) to ‘paralysis by analysis’ (as in ‘let’s arm ourselves with as much information as possible and analyze it to death’).
The issue is that their employees are already accessing corporate email on their own mobile devices, which means these IT managers are scrambling to catch up on managing BYOT in their organizations. In fact, the IT head at a large FMCG organization admitted that he did not know where to start with managing BYOT.
Security and compliance were key concerns for these IT folks, and their concerns are valid. Trend Micro predicts, for example, that 91% of targeted attacks begin with spear-phishing, a highly targeted type of phishing aimed at specific individuals or groups within an organization. This risk was highlighted by a recent spear-phishing attack on a South Korean bank. The security provider also predicts that there will be 1 million malicious Android apps in the wild by the end of 2013 – another red flag for organizations coping with the rise of Android devices in their workplace.