Customers value tailored offerings. But consumers are increasingly aware of what Forrester calls the “privacy-personalization paradox” — that is, the tension between their desire for personalization and their desire to keep their data private. A 2013 survey by Populus for Sky IQ of 3,097 UK adults found that 51% believe it is useful for brands to know some information about them, and 53% trust brands to act responsibly with their data. The same survey reveals that 79% of respondents are careful about the type of information they pass to organizations, 63% worry about how much personal data they have revealed online, 48% say that data privacy is an issue they think about, and 46% do not trust social networks with their personal data.
Yet the ruling by the European Court of Justice in May 2014 that an individual can demand that "irrelevant or outdated" information be deleted from search results reminds CIOs that privacy management needs to be a top business priority. Privacy regulation is now a risk factor that no CIO should underestimate. CIOs who do underestimate it risk large fines, damage to their organization’s reputation and customer trust, and even the loss of their own jobs. But businesses that design their IT infrastructure with privacy regulation in mind build a competitive advantage for the age of the customer. Our upcoming report, Customer Privacy Is A European CIO Priority, highlights that the successful CIO:
Boring as it may appear, the World Conference on International Telecommunications (WCIT), which just took place in Dubai under the auspices of the International Telecommunication Union (ITU), matters to all Internet users globally. To us, the three most important questions that were discussed are:
Should national governments have greater influence over the global regulation of the Internet?
Should over-the-top providers (OTTs) like Google and business networks be governed by international telecom regulations?
Should the underlying business model of the Internet change from a free and neutral exchange of data to a “sender pays” model?
This week, the New York Times ran a series of articles about data center power use (and abuse): “Power, Pollution and the Internet” (http://nyti.ms/Ojd9BV) and “Data Barns in a Farm Town, Gobbling Power and Flexing Muscle” (http://nyti.ms/RQDb0a). Among the claims made in the articles was that data centers were “only using 6 to 12% of the energy powering their servers to deliver useful computation.” As with a lot of media broadsides, the reality is more complex than the dramatic claims made in these articles. Technically, they are correct that only a very small fraction of the electricity going to a server is used to perform useful work, but this dramatic claim is not a fair representation of the overall efficiency picture. The Times analysis fails to take into consideration that not all of the power in a data center goes to servers, so the claim of 6% efficiency for the servers is not representative of the real operational efficiency of the complete data center.
On the other hand, while I think the Times chooses drama over even-keeled reporting, the actual picture for even a well-run data center is not as good as its proponents would claim. Consider:
A new data center with a PUE of 1.2 (very efficient), with 83% of the power going to IT workloads.
Then assume that 60% of the remaining power goes to servers (storage and network get the rest), for a net of almost 50% of total power going into servers. If the servers run at an average utilization of 10%, then only 10% of 50%, or 5%, of the power actually goes to real IT processing. Of course, the real "IT number" is servers plus storage plus network, so depending on how you account for them, the IT usage could be as high as 38% (.83 * .4 + .05).
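The arithmetic above can be sketched in a few lines. This is a minimal illustration of the same back-of-the-envelope calculation, using the article's assumed figures (PUE of 1.2, 60% of IT power to servers, 10% utilization), not measured data:

```python
# Back-of-the-envelope data center efficiency calculation,
# using the illustrative assumptions from the text above.
pue = 1.2                 # power usage effectiveness of the facility
it_share = 1 / pue        # fraction of total power reaching IT gear (~83%)
server_share = 0.60       # fraction of IT power going to servers (assumed)
utilization = 0.10        # average server utilization (assumed)

# Fraction of total facility power doing "useful" server computation.
useful_server = it_share * server_share * utilization   # ~5%

# If storage and network power is counted as fully useful IT load,
# the overall "IT usage" figure rises accordingly.
generous_it = it_share * (1 - server_share) + useful_server   # ~38%

print(f"useful server power: {useful_server:.1%}")
print(f"generous IT usage:   {generous_it:.1%}")
```

Both numbers match the text: about 5% of facility power goes to real server processing, and roughly 38% if storage and network are credited in full.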
A few months ago I shared a flight with a very pleasant lady from a European regulatory body. After shoulder surfing her papers and seeing we were both interested in information security (ironic paradox acknowledged!) we had a long chat about how enterprises could stand a chance against the hacktivist and criminal hordes so intent on stealing their data.
My flight-buddy felt that the future lay in open and honest sharing between organisations – i.e. when one is hacked they would immediately share details of both the breach and the method with their peers and wider industry; this would allow the group to look for similar exploits and prepare to deflect similar attacks. Being somewhat cynical, and having worked in industry, I felt that such a concept was idealised and that organisations would refuse to share such information for fear of reputational or brand damage – she acknowledged that it was proving tougher than she had expected to get her organisations to join in with this voluntary disclosure!
Across the US and Europe we are seeing a move toward ‘mandatory’ breach disclosure; however, the requirements have seemingly disparate intentions. US requirements focus on breaches that may impact an organisation's financial condition or integrity, whilst EU breach notification is very focussed on cases where there may have been an exposure of personal data. Neither of these seems to be pushing us toward this nirvana of ‘collaborative protection’.
In the UK, I’m aware that certain organisations, within specific sectors, will share information within their small closed communities; unfortunately, this is not widespread and certainly does not reflect the ‘open and honest’ sharing my flight-buddy envisaged.