And Then There Were Three Cloud Email Giants

With Cisco's shuttering of Cisco Mail, multitenant cloud email is now (as my colleague Chris Voce called it) a battle royale among Microsoft, Google, and IBM, where winning takes products, scale, sales channels, and a big ecosystem of support.

I am not surprised that Cisco bailed on cloud email. All the signs were there:

  • The company overpaid for PostPath in the midst of a buying spree. PostPath (which made some folks a lot of money when it sold for $215M) was just one of 17 acquisitions Cisco made in 2007 and 2008. Clearly Cisco was feeling confident that it could buy its way into new markets. (And it did with WebEx.)
  • Cisco Mail was always about to be released "any day now." It's fine to preannounce a product so that buyers know it's coming, but Cisco Mail never quite shipped. The one reference customer never returned my phone calls.
  • Cisco's collaboration platform doesn't require email. Messaging is one of the four big boxes of collaboration stuff. (The others are conferencing, workspaces, and social technology.) Messaging in particular can be carved out and offered separately. Cisco doesn't need email. It has WebEx and video conferencing. (The jury's still out on presence, chat, video hosting, and social technology.)
Read more

CardSpace Is Dead. Long Live Back-Channel Access.

Microsoft announced during last week's RSA conference that it would not be shipping Windows CardSpace 2.0. A lot of design imperatives weighed on that one deliverable: security, privacy, usability, bridging the enterprise and consumer identity worlds – and being the standard-bearer of the "identity metasystem" and the "laws of identity" to boot.  Something had to give. What are the implications for security and risk professionals?

Alas, the CardSpace model had phishing-resistance properties that cloud-based identity selectors will find hard to replicate. But without wide adoption on the open Web, those properties weren't going to make a dent anyway. Over time, we'll have to look for other native-app solutions to that problem.
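
For readers wondering what "back-channel access" can look like in practice, here is a minimal, purely illustrative sketch of an OAuth-style code-for-token exchange done server to server, so the sensitive exchange never rides through the browser front channel. The endpoint, client ID, and secret are hypothetical, and this is not CardSpace's or any vendor's actual protocol.

```python
# Illustrative only: a hypothetical OAuth-style "back channel" exchange.
# The relying party's server talks directly to the identity provider over TLS,
# so long-lived credentials never pass through the browser front channel.
import requests

IDP_TOKEN_ENDPOINT = "https://idp.example.com/oauth/token"  # hypothetical endpoint
CLIENT_ID = "relying-party-app"                             # hypothetical client ID
CLIENT_SECRET = "stored-server-side-only"                   # hypothetical secret


def redeem_authorization_code(code: str, redirect_uri: str) -> dict:
    """Exchange a short-lived front-channel code for tokens over the back channel."""
    resp = requests.post(
        IDP_TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": redirect_uri,
        },
        auth=(CLIENT_ID, CLIENT_SECRET),  # client authenticates server to server
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"access_token": "...", "expires_in": 3600}
```

The open question, of course, is whether this kind of server-side exchange can make up for losing a phishing-resistant native selector in front of the user.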

Read more

Cisco Sends A Recall On Its Cloud Email Strategy

Infrastructure & operations executives have shown tremendous interest in using the cloud to provision email and collaboration services for their employees. In fact, in a recent Forrester survey, nearly half of IT executives reported that they are either interested in or planning a move to the cloud for email. Why? It can be more cost-effective, increase your flexibility, and help tame the historical business and technical challenges of deploying these tools yourself.

To date, we’ve talked about four core players in the market: Cisco, Google, IBM, and Microsoft. According to a recent blog post, Cisco has chosen to no longer invest in Cisco Mail. Cisco Mail was formerly known as WebEx Mail, and before that the email platform belonged to PostPath, which Cisco acquired in 2008 with the intention of providing a more complete collaboration stack alongside its successful WebEx and voice services. I’ve gathered feedback and worked with my colleagues Ted Schadler, TJ Keitt, and Art Schoeller to synthesize what this means for infrastructure & operations pros and for their coordination with Content & Collaboration colleagues.

 So what happened and what does it mean for I&O professionals? Here’s our take:

Read more

Citrix Acquires EMS-Cortex

Another year, and Citrix’s strategy of acquiring interesting companies continues: it has announced the purchase of EMS-Cortex. The acquisition caught my eye because EMS-Cortex provides a web-based “cloud control panel” that service providers and end users can use to manage the provisioning and delegated administration of hosted business applications in a cloud environment, including XenApp, Microsoft Exchange, BlackBerry Enterprise Server, and a number of other critical business applications. In theory, this means that customers and vendors will be able to “spin up” core business services quickly in a multitenant environment.
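
To make the delegated, multitenant provisioning idea concrete, here is a minimal sketch of what a control-panel interaction might look like conceptually. The classes, service names, and parameters are hypothetical; this is not EMS-Cortex’s actual API.

```python
# Hypothetical sketch of delegated provisioning in a multitenant control panel.
# This is not EMS-Cortex's actual API; it only illustrates a reseller or tenant
# admin "spinning up" a hosted service for a single tenant.
from dataclasses import dataclass, field


@dataclass
class Tenant:
    name: str
    services: dict = field(default_factory=dict)  # service name -> settings


class ControlPanel:
    """Central point where provisioning rights are delegated per tenant."""

    def __init__(self) -> None:
        self.tenants = {}

    def add_tenant(self, name: str) -> Tenant:
        self.tenants[name] = Tenant(name)
        return self.tenants[name]

    def provision(self, tenant_name: str, service: str, **settings) -> str:
        # A real panel would drive Exchange, XenApp, BES, etc. through their own
        # management interfaces; here we simply record the request.
        tenant = self.tenants[tenant_name]
        tenant.services[service] = settings
        return f"{service} provisioned for {tenant_name}"


panel = ControlPanel()
panel.add_tenant("acme-corp")
print(panel.provision("acme-corp", "hosted-exchange", mailboxes=250, quota_gb=2))
```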

It is an interesting acquisition: vendors are starting to recognize that if their customers are to achieve “cloudonomics,” they must ease the route to cloud adoption. While the acquisition is potentially a good move for Citrix, I think it will be interesting for I&O professionals to see how Citrix plans to integrate this ease of deployment with existing business service management processes, especially if the EMS-Cortex solution is going to be used in live production environments.

Read more

Juniper’s QFabric: The Dark Horse In The Datacenter Fabric Race?

It’s been a few years since I was a disciple and evangelist for HP ProCurve’s Adaptive EDGE Architecture (AEA). Plain and simple, before the 3Com acquisition it was HP ProCurve’s networking vision: the architecture philosophy created by John McHugh (once HP ProCurve’s VP/GM, currently the CMO of Brocade), Brice Clark (HP ProCurve Director of Strategy), and Paul Congdon (CTO of HP Networking) during a late-night brainstorming session. The trio believed that network intelligence would move from the traditional enterprise core to the edge and be controlled by centralized policies. Policies based on company strategy and values would come from a policy manager, and the edge would be tied together by a high-speed, resilient interconnect, much like a carrier backbone (see Figure 1). As soon as users connected to the network, the edge would take control and deliver a customized set of advanced applications and services based on user identity, device, operating system, business needs, location, time, and business policies. This architecture would let infrastructure and operations professionals create an automated, dynamic platform delivering the agility businesses need to remain relevant and competitive.
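
To make the policy-at-the-edge idea concrete, here is a minimal, purely illustrative sketch of how centrally defined rules might be evaluated the moment a user attaches to an edge device. The attributes, rules, and values are my own invention, not anything from HP’s or Juniper’s products.

```python
# Illustrative sketch only: a centralized policy set evaluated at the network edge.
# The session attributes and rules are invented for illustration.

POLICIES = [
    # (predicate over session attributes, services granted at the edge)
    (lambda s: s["role"] == "engineer" and s["device"] == "managed-laptop",
     {"vlan": 20, "qos": "high", "apps": ["code-repo", "video-conferencing"]}),
    (lambda s: s["location"] == "guest-wifi" and 8 <= s["hour"] <= 18,
     {"vlan": 99, "qos": "best-effort", "apps": ["web-only"]}),
]

DEFAULT_POLICY = {"vlan": 99, "qos": "best-effort", "apps": []}


def edge_decision(session: dict) -> dict:
    """Apply the first matching centrally defined rule when a user connects."""
    for predicate, services in POLICIES:
        if predicate(session):
            return services
    return DEFAULT_POLICY


print(edge_decision({"role": "engineer", "device": "managed-laptop",
                     "location": "hq", "hour": 9}))
# -> {'vlan': 20, 'qos': 'high', 'apps': ['code-repo', 'video-conferencing']}
```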

As the HP white paper introducing the EDGE said, “Ultimately, the ProCurve EDGE Architecture will enable highly available meshed networks, a grid of functionally uniform switching devices, to scale out to virtually unlimited dimensions and performance thanks to the distributed decision making of control to the edge.” Sadly, after John McHugh’s departure, HP buried the strategy in favor of its converged infrastructure slogan: Change.

Read more

Don’t Think You Need to Know Your Organization’s Biz Requirements? Think Again.

This past month or so, I’ve been working with a number of Forrester clients who are either coming up on end-of-life storage hardware or adding more capacity to their existing environments. In either case, the question always starts with “Who should we be using?” This situation comes up frequently, and I felt the need to point out some changes happening in organizations’ IT environments and why this should be one of the last questions to ask.

  • Virtualization continues to move forward in most organizations. Although most environments are only 30% to 40% virtualized, there is an aggressive initiative to virtualize as much as possible. In Forrester surveys, virtualization was one of the top three initiatives for 2010, and I have no doubt it will be for 2011 as well. This means there is a great deal of responsibility (and budget) on the virtualization administrator to make this happen.
  • Teams are being assembled to think about and design for a private cloud. This is no longer an abstract initiative; it is actually happening. Rollouts may vary from one organization to another, but the reality is that business growth initiatives are forcing IT to evolve its overall environment to support them. And if it isn’t, there’s a problem.
  • Businesses are moving at lightning speed. Today, the competitive landscape in any industry is aggressive. Organizations are looking to up their game, creating new growth initiatives, and leveraging technology platforms to do it. There are so many resources at their fingertips (public cloud services from AWS, etc.) that they can essentially bypass an IT department and, if savvy enough, use external resources for their needs. The bottom line is that if IT can’t do it fast enough, IT becomes less relevant to the business.
Read more

Retrospective On Agile After 10 Years

Last week, Agile reached the milestone of 10 years. Ten years ago, a group of 17 thought leaders met to draw up the original guiding principles and manifesto for Agile. These principles not only define the approach of many Agile methods but also serve as the line in the sand when discussing whether something is Agile or not. There is little doubt in my mind that the creation of these principles set in motion one of the most important transformations the software development industry has seen, focusing development teams on the things that really matter and challenging ideas about work, processes, and practices that make no sense in the development of software. But the question is: after 10 years, should the guiding principles be updated?

Reviewing the principles

When working with organizations that are introducing Agile methods, conflict sometimes arises when reviewing the principles in the following areas:

  • Individuals and interactions over processes and tools. The words individuals and tools usually cause the most concern when adopting Agile. Software development is a team sport, yet this principle focuses on individuals. And as Agile grows in use, tools increasingly play a key role.
  • Working software over comprehensive documentation. The challenge of documentation is often discussed when talking about this principle. As the adoption of Agile grows, so does its use in industries where documentation is crucial and legislated.
  • Customer collaboration over contract negotiation. This is very true with internal development teams, but when external vendors enter the mix, contracts are important and will form the basis of the engagement.
  • Responding to change over following a plan. Implementation of this principle often results in a lack of upfront planning, which runs contrary to most organizations' funding processes.
Read more

Intel Discloses Details on “Poulson,” Next-Generation Itanium

This week at ISSCC, Intel made its first detailed public disclosures about its upcoming “Poulson” next-generation Itanium CPU. While not in any sense complete, the details they did disclose paint a picture of a competent product that will continue to keep the heat on in the high-end UNIX systems market. Highlights include:

  • Process — Poulson will be produced in a 32 nm process, skipping the intermediate 45 nm step that many observers expected to see as a step down from the current 65 nm Itanium process. This is a plus for Itanium consumers, since it allows for denser circuits and cheaper chips. With an industry record 3.1 billion transistors, Poulson needs all the help it can get keeping size and power down. The new process also promises major improvements in power efficiency.
  • Cores and cache — Poulson will have 8 cores and 54 MB of on-chip cache, a huge amount, even for a cache-sensitive architecture like Itanium. Poulson will have a 12-issue pipeline instead of the current 6-issue pipeline, promising to extract more performance from existing code without any recompilation.
  • Compatibility — Poulson is socket- and pin-compatible with the current Itanium 9300 CPU, which will mean that HP can move more quickly into production shipments when it's available.
Read more

Social Breathes New Life Into Knowledge Management For Customer Service

You have to admit that knowledge management (KM) is hard — it’s hard to explain, hard to implement, hard to do right. It’s not just technology. It is a combination of organizational realignment, process change, and technology combined in the right recipe that is needed to make KM successful. And when it is successful, it delivers real results — reduced handle times, increased agent productivity and first closure rates, better agent consistency, increased customer satisfaction. Check out the case studies on any of the KM vendors' sites to see real statistics. Yet despite these success stories, and despite there being commercially viable KM solutions on the market for over 10 years, I am unsure whether KM really ever crossed the chasm.  

Why, then, are we seeing renewed interest in KM in 2011? I attribute it to listening to (and acting on) the voice of agents and customers, coupled with a loosening of the strings on tightly controlled content; together, these have breathed new life into KM. The most common trends include:

  • Using more flexible authoring workflows. In the past, knowledge was authored by editors who were not on the front lines of customer service, who anticipated the questions they thought customers would ask, and who used language that was not consistent with customer-speak. Authored content would go through a review cycle and finally be published days after it was initially written. Today, many companies are implementing “just-in-time” authoring, where agents fielding questions from customers, not backroom editors, create content that is immediately available in draft form to other agents. Content then evolves based on usage, and the most frequently used content is published to a customer site, making knowledge leaner and more relevant to real-life situations. (A simple sketch of this lifecycle follows below.)
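
Here is a minimal sketch of that just-in-time lifecycle; the states, promotion threshold, and field names are hypothetical rather than any particular vendor’s data model.

```python
# Hypothetical sketch of "just-in-time" knowledge authoring: agents draft articles
# on the front line, drafts are immediately visible to other agents, and usage,
# not an editorial calendar, promotes an article to the customer-facing site.
from dataclasses import dataclass

PROMOTE_AFTER_USES = 25  # hypothetical promotion threshold


@dataclass
class Article:
    title: str
    body: str
    author: str            # a front-line agent, not a backroom editor
    state: str = "draft"   # drafts are already visible to other agents
    uses: int = 0

    def record_use(self) -> None:
        """Each reuse by an agent counts toward publishing on the customer site."""
        self.uses += 1
        if self.uses >= PROMOTE_AFTER_USES and self.state != "published":
            self.state = "published"


article = Article("Reset a locked account", "Step-by-step fix ...", author="agent_jsmith")
for _ in range(PROMOTE_AFTER_USES):
    article.record_use()
print(article.state)  # -> published
```
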
Read more

In SaaS, Product Management Is Essential, Not Optional

A while back, I had a spirited exchange with someone who took a rather extreme position about the role of product management in software-as-a-service (SaaS) companies. The less polite version: he staked out a silly position (just poll your customers about what to build, and eliminate the PM role altogether), so I felt obliged to refute it. With two additional years of research into SaaS and PM, it's worth returning to that contretemps for a moment.

Then, I argued that PM was necessary in SaaS ventures.

Now, it's clear that PM is not merely necessary but essential.

Product Managers Are Really Innovation Managers
If the sole responsibility of PM were requirements, as my foil assumed, a poll might replace a person, but only if you had no interest in the long-term success of your company. Polls are extremely useful tools, and it's far better to use them than not to. However, polls have their limitations: most obviously, as a tool for taking the temperature of the customers you have, they tell you absolutely nothing about the customers you don't have yet.

This week, I've already written a little about the need to go beyond polls to use other tools, such as serious games. In an upcoming post, I'll have more to say on the topic. For now, I'll just say that, in both business and politics, voting alone is not the ticket to ultimate success.

Read more