This case study is from TJ Keitt's and my social business playbook report, “The Road To Social Business Starts With A Burning Platform.” A social business uses technology to work efficiently using a common collaboration platform -- without being constrained by server availability or storage capacity. Here’s the story.
If you've already consolidated dozens of email systems from every vendor and era onto a single managed instance of Exchange 2007, made the shift to support 70 or more state agencies by operating as an ISP, and crunched 20 SharePoint instances down to a single scalable data center, what else is there to do? After all, you've already achieved a high state of IT operational efficiency and process optimization.
If you are Ed Valencia, CTO and Deputy Commissioner, and Tarek Tomes, Assistant Commissioner for Customer and Service Management, at the State of Minnesota’s IT department (MN.IT), you step back and ask, “Has what we’ve done really helped the business communicate and collaborate efficiently and effectively?” They knew they could do more by moving their collaboration workloads into the cloud.
So they took a gamble that Microsoft's Office 365 Dedicated offering was ready for the State of Minnesota. Office 365 Dedicated has opened new doors for people throughout the State of Minnesota government. Agencies can collaborate with one another because the common collaboration platform integrates the disparate directories of the different government entities. For example, the Governor can send a message to every agency in the executive branch through this common platform.
For social media evangelists, the question on everyone's mind is this: "How do we effectively measure the business value of social initiatives?"
Even when we get close, there's always that pesky issue of causation vs. correlation — can we really prove causation even for examples with high correlation between social initiatives and business outcomes? (Read Freakonomics, or watch the documentary, for insights into the challenges of causation vs. correlation.)
Recently I attended one of the day-long events in Munich that Google offers as part of its Atmosphere on Tour road show, which visits 24 cities globally in 2012. The event series targets enterprise customers and aims to get them interested in Google’s enterprise solutions, including Google Apps, search, analytics, and mapping services, as well as the Chromebook and Chromebox devices.
Google Enterprise has existed as a division for some time, but only fairly recently did Google begin actively pushing its enterprise solutions into the market through marketing initiatives. The cloud delivery model clearly plays a central role in Google’s enterprise pitch (my colleague Stefan Ried also gave a presentation on the potential of cloud computing at the event).
Still, the event itself was a touch light on details and remained pretty high level throughout. Whilst nobody expects Google to communicate a detailed five-year plan, it would have been useful to get more insight into Google’s vision for the enterprise and how it intends to cater to enterprise needs. Thankfully, prior to the official event, Google shared some valuable details of this vision with us. The four main themes that stuck out for us are:
In addition to the eHealth initiatives mentioned in my previous blog, I wanted to call out another T-city program that struck close to home for me — the “tumor conference program.” The idea is simple, but the impact is enormous. The program’s official objective is to “make possible the interdisciplinary exchange of experiences between doctors, therapists, and cancer specialists, and to support the process flow of a tumor conference by using a modern communications solution.” But for many patients, the objective is more than “process flow,” it is about universal access to healthcare and access to specialists in the fields they need — in this case, access to the cancer specialists that are affiliated with research centers and university hospitals. These conferences are vital to extending access beyond just the big cities to the smaller towns and rural areas. And we’re not talking about Africa or India — we’re talking about Europe, and developed countries on other continents.
On Monday, February 13, HP announced its next turn of the great wheel for servers with the announcement of its Gen8 family of servers. Interestingly, since the announcement was ahead of Intel’s official announcement of the supporting E5 server CPUs, HP had absolutely nothing to say about the CPUs or performance of these systems. But even if the CPU information had been available, it would have been a sideshow to the main thrust of the Gen8 launch — improving the overall TCO (particularly Opex) of servers by making them more automated, more manageable, and easier to remediate when there is a problem, along with enhancements to storage, data center infrastructure management (DCIM) capabilities, and a fundamental change in the way that services and support are delivered.
With a little more granularity, the major components of the Gen8 server technology announcement included:
Onboard Automation – A suite of capabilities and tools that provides improved agentless local intelligence to enable quicker, lower-labor-cost provisioning, including faster boot cycles, “one click” firmware updates of single or multiple systems, intelligent and greatly improved boot-time diagnostics, and run-time diagnostics. This is apparently implemented by more powerful onboard management controllers and by pre-provisioning a lot of software on built-in flash memory, which is used by the onboard controller. HP claims that the combination of these tools can increase operator productivity by up to 65%. One of the eye-catching features is an iPhone app that will scan a code printed on the server, go back through the Insight Management Environment stack, and trigger the appropriate script to provision the server.[i] Possibly a bit of a gimmick, but a cool-looking one.
Based on the very high interest in this blog and its cloud predictions, we are planning to host a Forrester teleconference entitled "2012 — The Year The Cloud Matures: A Deeper Dive Into 10 Cloud Predictions For The Upcoming Year" on February 28th, 1-2 p.m. EST/6-7 p.m. UK time, where we will walk through the 10 predictions below one by one. For more details and registration, please follow the link to the teleconference web page.
1. Multicloud becomes the norm
As companies quickly adopt a variety of cloud resources, they’ll increasingly have to manage several different cloud solutions, often from different providers. By the end of 2012, cloud customers will already be using more than 10 different cloud apps on average. Cloud orchestration will become a big topic and an opportunity for service providers.
2. The Wild West of cloud procurement is over
While 2011 still witnessed different stakeholders within a company brokering a lot of cloud deals (sometimes unsanctioned by IT), most companies will have established a formal cloud strategy by the end of 2012, including the business models between IT and lines of business for their own private cloud resources.
My colleague James Staten recently wrote about AutoDesk Cloud as an exemplar of the move toward App Internet, the concept of implementing applications that are distributed between local and cloud resources in a fashion that is transparent to the user except for the improved experience. His analysis is 100% correct, and AutoDesk Cloud represents a major leap in CAD functionality, intelligently offloading the inherently parallel and intensive rendering tasks and facilitating some aspects of collaboration.
But (and there’s always a “but”), having been involved in graphics technology on and off since the '80s, I would say that “cloud” implementation of rendering and analysis is something that has been incrementally evolving for decades. There are hundreds of well-documented distributed environments in which desktops fluidly shipped their renderings to local rendering and analysis farms, what we would today call private clouds, with the results shipped back to the creating workstations. This work was largely developed and paid for by universities and by media companies as part of major movie production projects. Some of these installations were of significant scale, such as “Massive,” the rendering and animation farm for "Lord of the Rings" that had approximately 1,500 compute nodes, and a subsequent installation at Weta that may have up to 7,000 nodes. In my (admittedly arguable) opinion, the move to AutoDesk Cloud, while representing a major jump in capabilities by making the cloud accessible to a huge number of users, is not a major architectural innovation but rather an incremental step.
One of the many interesting topics of discussion we get into in our Social Business Strategy workshops is around the social ecosystem. This is the name I have given the collection of business capabilities potentially enhanced by one or more social technologies.
First, let me define social technologies. Note that I’m using the word “technology” quite deliberately in place of the more common term “social media,” because “social media” is too often associated with consumer-facing technology deployed in support of marketing. In defining the entire social ecosystem, I prefer the more generic “technology.” I define social technology as “any technology that enables one-to-many communications in a public forum (or a semi-public one if behind a security firewall).”
Marketers, how are you getting along with IT these days? It matters more than it used to. The job your company expects you to do is more and more entwined with technology. And so are the people in your target market.
Our research at Forrester shows that almost half of US adults say technology is important to them. And the ecosystem of suppliers of marketing-centric technologies and services is ballooning. So whatever your aim as a marketer — whether it’s listening to the market, engaging with potential customers, or measuring the results of those efforts — you can’t do your job without this expanding array of new channels, new services, and new products.
This technology entwinement is especially tight when your company tackles the challenge of mastering the flow of customer data throughout the organization, from inputs across customer touchpoints, to the many ways you subsequently engage those customers. The struggle is not only in how to do this but also in how to do it sustainably: How to remember what data’s been collected, how it’s been used, what the outcomes have been, and on and on.
Where it gets messy is that marketers and IT often sing from different hymnals when it comes to making the most of all the relevant technologies. You’re eager to get to market with exciting new tools for engaging with potential customers, and you’re willing to experiment. But your IT colleagues often seem to be focused above all on cutting costs and avoiding risk — goals that rarely mesh well with what you’re trying to get done as a marketer. Not surprisingly, one marketing exec that Forrester interviewed recently called IT the “Department of No.”
Whereas in the past it may have been possible (even expected!) for marketing and IT to work at arm’s length, it’s not an option anymore.