Microsoft was kind enough to invite me to Microsoft's Dynamics Fall Analyst Event — a two-day event packed with product, strategy, customer, and partner information. The focus was clearly on Microsoft Dynamics CRM 2011. This product and the go-to-market strategy are clear and focused. Here are my thoughts:
The Dynamics CRM 2011 product is good. Today, Microsoft Dynamics CRM is used by 23,000 customers and 1.4 million end users in 80 countries and 40+ languages. That in itself is impressive. However, Microsoft wants to do better. It has focused on the user experience and UI in the 2011 product in hopes of driving increased adoption. Dynamics CRM 2011 is deeply integrated with Outlook, Office Communicator, SharePoint, Office 365, and Bing. It can be easily personalized: a business user, without the help of IT, can set up a dashboard. It has rich reporting and analytics. It works on mobile devices, including the iPhone. Microsoft realizes that the product still has limitations, especially around Web self-service customer service capabilities. Its near-term plans are to address this gap, as well as to add support for the phone channel and for social customer service. Right now, however, these holes offer specialty customer service vendors a chance to make inroads.
I have been working on a research document, to be published this quarter, on the impact of 8-socket x86 servers based on Intel’s new Xeon 7500 CPU. In a nutshell, these systems have the performance of the best-of-breed RISC/UNIX systems of three years ago, at a substantially better price, and their overall performance improvement trajectory has been steeper than competing technologies for the past decade.
This is probably not shocking news and is not the subject of this current post, although I would encourage you to read the document when it is finally published. During the course of researching it, I spent time trying to prove or disprove my thesis that x86 system performance solidly overlapped that of RISC/UNIX, using available benchmark results. The process highlighted for me the limitations of using standardized benchmarks for performance comparisons. There are now so many benchmarks available that system vendors run each benchmark only on selected subsets of their product lines, if at all. Additionally, most benchmarks suffer from several common flaws:
They are results from high-end configurations, in many cases far beyond any normal use case, and the results cannot be interpolated to smaller, more realistic configurations.
They are often the result of teams of very smart experts tuning the system configuration and the application and system software parameters for optimal results. For a large benchmark such as SAP or TPC, it is probably reasonable to assume that more than 1,000 variables are involved in the tuning effort. This makes the results very much like EPA mileage figures: the consumer is guaranteed not to exceed these numbers.
Forrester’s Smart City Tweet Jam was a great success. On Tuesday (morning, afternoon, or evening, depending on your time zone), smart city followers around the globe participated in an hour of intense tweeting on smart cities. We touched on a range of issues, from the definitions of a “city” and a “smart city” and the evolution toward the goal of becoming smart, to the challenges city leaders face and the business models that enable adoption of technology-based solutions. We ended with a contrarian view that “smart cities” might just be a fad. But that was quickly refuted with reminders of the growing challenges faced by cities and the imperative of facing these challenges in a sustainable manner.
One hour, 62 Twitterers, and 389 tweets later, we were exhausted – at least I was. But we were pleased to have aired and shared our opinions about the challenges, the potential solutions to those challenges, and the paths and business models that will make those solutions possible in the short run and, hopefully, sustainable in the longer term. Below are some excerpts from the conversation, which included many interesting points of view and contributions. I've also included a visual representation of the key words and topics discussed during the Tweet Jam, created using ManyEyes. For more stats and the full transcript, check out #smartcityjam.
With about 41,000 attendees, 1,800 sessions, and a whopping 63,000-plus slides, Oracle OpenWorld 2010 (September 19-23) in San Francisco was certainly a mega event with more information than one could possibly digest or even collect in a week. While the main takeaway for every attendee depends, of course, on the individual’s area of interest, there was a strong focus this year on hardware due to the Sun Microsystems acquisition. I’m a strong believer in the integration story of “Hardware and Software. Engineered to Work Together.” and really liked the Iron Man 2 showcase all around the event; but, because I’m an application guy, the biggest part of the story, including the launch of Oracle Exalogic Elastic Cloud, was a bit lost on me. And the fact that Larry Ellison basically repeated the same story in his two keynotes didn’t really resonate with me — until he came to what I was most interested in: Oracle Fusion Applications!
Ready, set, go. Earlier this week IBM announced its Smart City Challenge – a competition for cities to help investigate and launch smart city initiatives. IBM will award $50 million worth of technology and services to help 100 municipalities across the globe. Each city has to articulate a plan addressing several strategic issues, and demonstrate a track record of successful problem solving, a commitment to the use of technology, and a willingness to provide access to city leaders. Hmmm...this sounds exactly like IBM’s existing target market.
The challenge for IBM is to demonstrate that this program is incremental to IBM’s existing activities with cities and local governments. This program really is an opportunity to extend smart city activities – both from a philanthropy perspective and from a business development perspective. (I’m acknowledging that there can be business development in philanthropy.) Will cities that have not yet embarked on a smart city initiative or program now consider applying for funding and assistance in starting down that path?
One way to ensure a broader, and incremental, audience is to get the word out – and actually evangelize to cities that have not yet recognized the benefits of technology as a means of addressing their critical pain points. Many of these are perhaps smaller cities, which leads me to another recommendation.
Microsoft began opening its own retail stores in 2009 and recently began a push into more US cities. A recent post by George Anderson on Forbes.com about Microsoft's new store format prompted me into some late-night analysis. It appears Microsoft's store format strategy is to ride in the draft of Apple by building larger-format stores very near, if not adjacent to, Apple's own stores. As a retail analyst and both an Apple and Microsoft customer for over 25 years, I feel compelled to weigh Microsoft's retail strategy against Apple's (and since I cover retail strategy from a CIO perspective, it feels appropriate to publish here).
Comparing eight success factors
Location: I'll start here, as it was the subject of the original post. Across from Apple may be the only sensible choice for MS, but the challenge MS faces is that Apple is a destination store, i.e., people plan to go there for the experience. This makes it less likely that shoppers will browse the MS store simply because it is nearby. On the other hand, assuming MS runs promotions to attract traffic to its stores, those promotions are likely to drive additional traffic to Apple as well. Predicted winner = Apple.
Store architecture: Size isn't everything! Sure, Microsoft can copy Apple, go for outstanding store designs, and even build them bigger, but Apple's architecture is designed to reinforce a consistent brand image: minimalist, clean lines, designer. Microsoft's designs can reinforce many things about its brand, but it's hard to see the consistency in a way that's possible with Apple. Predicted winner = Apple.
It will come as little surprise to most of you that the overall GRC market is still saturated with relatively small vendors, many of which continue to struggle to maintain their market niches. At the same time, a handful of market leaders (notably BWise, IBM/OpenPages, MetricStream, RSA/Archer, and Thomson Reuters/Paisley) continue to distance themselves from the rest of the pack, while several large competitors (including Oracle, SAP, SAS, Software AG, and Wolters Kluwer) put more and more pressure on the market all the time.
It's been interesting to watch these vendors that competed head-to-head regularly for SOX compliance deals now drifting further apart . . . some focusing more on risk management and analytics, some strengthening their compliance and content offerings, some building deeper integration with IT systems, and others building bridges into audit departments. The current environment of increased government oversight and regulation — and in some cases, reform of whole industries — worldwide promises to bring a strong resurgence to the GRC platform market overall, which means increased competition both from veteran vendors and newcomers alike.
Fujitsu? Who? I recently attended Fujitsu’s global analyst conference in Boston, which gave me an opportunity to check in with the best-kept secret in the North American market. Even Fujitsu execs admit that many people in this largest of IT markets think that Fujitsu has something to do with film, and few of us have ever seen a Fujitsu system installed in the US unless it was a POS system.
So what is the management of this global $50 billion information and communications technology company, with a competitive portfolio of client, server, and storage products and a global service and integration capability, going to do about its lack of presence in the world’s largest IT market? In a word, invest. Fujitsu’s management, judging from its history and what it has disclosed of its plans, intends to invest in the US over the next three to four years to consolidate its estimated $3 billion in North American business into a more manageable (simpler) set of operating companies, and to double down on hiring and selling into the North American market. The fact that it has given itself multiple years to do so is indicative of what I have always thought of as both Fujitsu’s greatest strength and one of its major weaknesses – the company operates on Japanese time, so to speak. For an American company to undertake to build a presence over multiple years with seeming disregard for quarterly earnings would be almost unheard of, so Fujitsu’s management gets major kudos for that. On the other hand, years of observing the company from a distance also leads me to believe that its approach to solving problems inherently lacks the sense of urgency of some of its competitors.
We inhabit an age in which empowering technology is readily available first to individuals, not institutions. Consumers and employees will always get the new good stuff first. And it will always be so. The economics of technology investment seal that deal. The consumer market is bigger and easier to get started in.
In this empowered era, smart mobile devices, social technology, pervasive video, and cloud computing are the anchor tenants of the new technology platform. These technologies are available to every consumer and employee – even yours. The question is what to do about it. Two things:
Because customers can hijack your brand (consumers in the US make 500 billion impressions on each other online every year), you have to empower your customers with better information than they can get from their networks. You have to honor your customers as a marketing channel.
Because employees have ready access to technology to improve their working lives, you have to give employees permission -- and protection -- to adopt these technologies. You have to honor employees' use of consumer technology as a source of incremental and sometimes breakthrough innovation.