EMC's Project Bourne morphed into ViPR at the EMC World 2013 event in Las Vegas last week. It seems like everyone has a different take on what should be included in software-defined storage (SDS), and my definition and implementation guidelines can be found in this report. Like other vendors, EMC is promising to revolutionize the way customers provision, manage, and create storage resources using ViPR, which will become a key component of the vendor's Software Defined Data Center strategy for virtualizing compute, networking, and storage resources. Unlike in prior years, when EMC bombarded attendees with dozens of product launches, this year's show focused almost entirely on ViPR, which makes sense given the importance of this technology. ViPR is expected to become generally available in the latter half of 2013, and like all other SDS implementations, it is designed to reduce the number of administrators it takes to manage rapidly growing data repositories through automation and self-service provisioning. So what's under ViPR's covers?
The Wall Street Journal published a point-counterpoint article on cloud-hosted file sync/share solutions like Dropbox, Google Docs, and myriad others. They chose a title I wouldn't have used myself, but there you have it.
I took the pro side. You can read the whole article here.
My side of the argument is here:
Yes: Employees Are Doing What's Best for the Company
By Ted Schadler
Why do employees use cloud-based solutions like Dropbox, Box and SugarSync to sync and share files? As well over 100 million Dropbox customers have learned, it's because these services make it a cinch to move files from a computer to a tablet to a smartphone to another computer and back again. And it's a much better solution than email for sharing a bucket of files with others.
These services began life with a focus on home scenarios. But it didn't take savvy employees long to realize that these services also solve three big productivity problems at work: 1) getting all your work files on every device you use for work; 2) sharing files with colleagues; and 3) sharing files with trusted partners and customers.
So, should IT organizations allow employees to use these cloud-based services? That question is patently absurd. Why should an IT organization dictate what employees do to get their work done? Who made IT responsible for policing employee behavior and tools?
When you hear the words “end user computing”, what do you think of? If you’re in infrastructure & operations (I&O), you might think about the corporate standard laptop or desktop you’ve just selected and will provision to most of your employees over the next couple of years. Or your corporate standard OS image that you stamp on those systems: locked down, loaded with the management & security agents and corporate apps you think those employees need. Or perhaps even the corporate standard smartphone that you’ve handed out to the employees who needed mobile email access. You might think of these things because they all help I&O organizations deliver and support technology for employees more efficiently. These techniques help you address the historical “ask” from your colleagues outside of IT: “Give us technology while absolutely minimizing the impact you have on our bottom line.”
Andras Cser probed a sore spot in IAM last week with his post, “XACML Is Dead.” It’s a necessary conversation (though I did see a glint in his eye at the Forrester BT Forum after he pressed Publish!). Our Q3 2012 Identity Standards TechRadar showed that XACML has already crested the peak of its moderate success trajectory, heading for decline. We haven’t seen its business value-add or ecosystem grow since then, despite the publication of XACML 3.0 and a few other bright spots, such as Axiomatics’ recent funding round.
It’s not that we don’t need an interoperable solution for finer-grained access control. But the world’s demands for loosely coupled identity and access systems have gotten...well, more demanding. The solution needs to be friendly to open web API security and management. It needs to be friendly to mobile developers. And it most certainly needs to be prepared to tackle the hard parts of integrating authorization with truly heterogeneous cloud services and applications, where business partners aren’t just enterprise clones, but may be tiny and resource-strapped. This admittedly gets into business rather than technical challenges, but every ounce of technical friction makes success in the business realm less likely.
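To make “finer-grained access control” concrete, here is a minimal attribute-based access check in Python. This is an illustrative sketch of the kind of decision XACML standardizes, not XACML itself; the attribute names and the policy rule below are invented for the example.

```python
# Minimal attribute-based access control (ABAC) sketch. The single rule
# below is hypothetical: managers may approve invoices under $10,000
# during business hours. Real deployments externalize such rules into a
# policy language (XACML or otherwise) rather than hard-coding them.

def evaluate(subject, resource, action, context):
    """Return 'Permit' or 'Deny' for one illustrative policy rule."""
    if (
        action == "approve"
        and resource.get("type") == "invoice"
        and subject.get("role") == "manager"
        and resource.get("amount", 0) < 10_000
        and 9 <= context.get("hour", 0) < 17
    ):
        return "Permit"
    return "Deny"

decision = evaluate(
    subject={"role": "manager"},
    resource={"type": "invoice", "amount": 4_500},
    action="approve",
    context={"hour": 11},
)
print(decision)  # Permit
```

The point of an interoperable standard is that the rule lives outside the application, expressed in a shared format that any compliant policy engine can evaluate; the sketch above hard-codes it only to show the shape of the decision.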
After several related discussions in Washington, DC, this past week, it is clear that the question of how to use and manage the growing wealth of data, and how to incorporate it into an existing information governance organization and infrastructure (whatever its maturity), is top of mind in the public sector as well. These questions are particularly timely for the federal government with the publication of the new Executive Order on Open Data and the accompanying Memorandum on Open Data Policy – Managing Information as an Asset. Do government agencies need a CDO in order to do this?
If they do, what functions does the new role take on? Does it take on new uses of data for business strategy? Who retains responsibility for the existing functions of information management and data governance? Then, from the organizational perspective, where does this new role sit? Who reports to the CDO? Gene discusses these questions in his blog. With the increasing importance of data and the information it generates, organizations need to get their heads around these new assets — both for internal use and for sharing with partners external to the organization. But the proliferation of “chiefs” doesn’t seem to be the answer. Information is an asset to the company, yes. And it needs to be managed. But not all assets have their own chief, nor should they.
I’m currently finishing up my presentation for the Internet Retailer Conference & Expo in June: I’ll be presenting on the non-US marketplace options for brands as part of the global eCommerce track. In preparation for the session, I’ve had a chance to catch up with a number of established global online marketplaces for brands as well as agencies helping to develop some of the marketplace storefronts.
While many North American and European brands are familiar with local marketplace players, it’s worthwhile highlighting just a few key marketplace options originating outside of these two regions:
Channel partners are bullish about their growth prospects. In fact, in a recent Forrester survey in North America (NA) and Europe, 59% of channel partners said they expect to grow by more than 10% in each of the next two years. However, partners will need help and hand-holding as they aim for greater sophistication and higher growth targets, especially around cloud-based services. Forrester research indicates that three-quarters of channel partners in NA and Europe now sell cloud-based solutions (up dramatically from two years ago). These solutions now make up 26% of their overall revenue, a percentage they expect to increase in coming years.
In my recent report, Seeding the Cloud Channel, I highlight three key areas where partners will need support from both their tech vendors and their distributors:
Diagnostic tools and services to assess current maturity and set a transformation road map. Partners will first have to collaborate with their principal vendors to gauge the fitness of their organizations for an annuity-based business model — and whether they can sustain that model in the long run. Vendors need to create assessment tools to evaluate their partners' business model transformation potential. For example, Cisco Systems built its OnPlus ROI Tool expressly for partners to model the myriad business model options and scenario decisions they face. This will not only help partners identify their pertinent strengths and weaknesses, but will also help them plan their future growth strategy.
Ever wonder how Las Vegas casinos catch card-counting teams at Blackjack tables, like the MIT team immortalized in the film “21” with Kevin Spacey? They use many techniques, some of which are confidential, but one we know about is their use of Entity Analytics on many intersecting streams of information about their patrons or potential employees. I recently had the chance to learn more about Entity Analytics and Big Data from one of the top industry thought leaders, Jeff Jonas of IBM.
[Photos: Jeff Jonas; Kevin Spacey in “21”]
This opportunity came when Marcel Jemio, Chair of the Fiscal Service Data Stewards at the US Treasury Dept. (and a Forrester client), invited me to a presentation Jeff gave at a special internal event at the Fiscal Service in Washington, D.C. So of course I leapt at the opportunity! Marcel opened the session with an overview of why Treasury is interested in data and analytics: Treasury is charged with helping the nation guard against the kind of national or global financial collapse that triggered the 2007-2009 recession. Therefore it’s crucial that the stewards of the nation’s financial data, like Marcel and his colleagues, continuously improve the insights we gain from this data.
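As a rough illustration of what entity analytics does — correlating records from different data streams that likely refer to the same real-world person — here is a toy Python sketch. The field names and matching rules are invented for the example; they are not Jonas's actual algorithms, which operate incrementally over far richer data.

```python
# Toy entity-resolution sketch: link records from separate streams
# (e.g., a casino's patron list and its job-applicant list) that appear
# to describe the same person. The normalization and matching rules are
# deliberately simplistic illustrations.

def normalize(name):
    """Lowercase a name and strip punctuation/whitespace for comparison."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def same_entity(a, b):
    """Two records match if they share a phone number, or share a
    normalized name plus date of birth."""
    if a.get("phone") and a.get("phone") == b.get("phone"):
        return True
    return (
        normalize(a["name"]) == normalize(b["name"])
        and a.get("dob") == b.get("dob")
    )

patrons = [{"name": "J. Smith", "dob": "1980-05-01", "phone": "555-0100"}]
applicants = [{"name": "John Smith", "dob": "1980-05-01", "phone": "555-0100"}]

matches = [(p, a) for p in patrons for a in applicants if same_entity(p, a)]
print(len(matches))  # 1
```

The interesting part in production systems is doing this continuously as each new observation arrives, so that a match surfaces the moment the connecting record shows up — which is how intersecting streams catch a card-counting team member who also applied for a dealer job.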
As an analyst on Forrester's Customer Insights team, I spend a lot of time counseling clients on best-practice customer data usage strategies. And if there's one thing I've learned, it's that there is no such thing as a 360-degree view of the customer.
Here's the cold, hard truth: you can't possibly expect to know your customer, no matter how much data you have, if all of that data 1) is about her transactions with YOU and 2) is hoarded away from your partners. And this isn't just about customer data either -- it's about product data, operational data, and even cultural-environmental data. As our customers become more sophisticated and collaborative with each other ("perpetually connected"), organizations must do the same. That means sharing data, creating collaborative insight, and becoming willing participants in open data marketplaces.
Now, why should you care? Isn't it kind of risky to share your hard-won data? And isn't the data you have enough to delight your customers today? Sure, it might be. But I'd put money on the fact that it won't be for long, because digital disruptors are out there shaking up the foundations of insight and analytics, customer experience, and process improvement in big ways. Let me give you a couple of examples:
YouTube finally announced this week that it would allow channels to charge monthly fees to access content on YouTube. Some have predicted that YouTube’s subscription model would undercut its ad model in an echo of the infamous pay-wall problem that has bedeviled online newspapers as they shifted from ad-supported to paid. Others have suggested that this shows that YouTube is up against an advertising wall of its own making — advertisers will only pay so much to advertise against this amateur and semi-pro content (and to be fair, I am in this camp even though I don’t think this fact is dire). And still others gleefully wait to watch as YouTube learns how hard it is to get people to pay for things online.
In fact, all three of these things are minor asides in YouTube’s decision-making, as I see it. Rather than reacting to these and other constraints, YouTube is acting proactively on an imminent opportunity. YouTube is basically making a grab for more of everything that matters:
More business model options. TV is both ad-supported and subscription-supported, and that works just fine. It gives companies like HBO the creative flexibility to generate content that advertisers may not be ready for, and it gives companies like Scripps the freedom to promise more home-focused entertainment that home-focused advertisers care about. That flexibility is crucial to the ongoing success of those companies, and it will be crucial to YouTube as well. Although in YouTube’s case, I would be surprised if, in the one- to two-year time frame, subscription revenue exceeded 10% or 15% of advertising revenue.