Recently I participated in a roundtable discussion by members of Forrester’s EA Council on “Getting Strategic In A Tactical World.” Members talked through the challenge of maintaining a strategic focus when the IT (or business) organization is very tactical, and of getting the enterprise architecture function to strike the right balance of tactical and strategic activities. “Strategic/Tactical Focus” is one of the dimensions of the Archetypes of EA that Forrester has written about, including in this blog, and the balance between tactical and strategic is a key factor both in how the larger organization views EA’s relevance and in the support it provides to EA.
One of the participants, who headed a team of more than 50 architects, asked the others, “How is your department funded – as overhead operations or as part of the project investment budget?” The person who asked this question said that his organization is more than 70% funded out of the project budget. Others responded with a range of 100% operations to 100% project-based. The comments around these different funding mixes were very interesting (all comments paraphrased):
“It’s easier to justify the size of my team if the funding is tied to the amount of project investments we are making.”
“Investment funding levels are too variable – two years ago we cut way back, now we’ve ramped way up. If my team size were a function of investment funding, we wouldn’t be prepared for the amount of investment we are making now.”
“EA funding as part of ongoing operations budget makes us look like overhead. I don’t want architecture to look like some sort of overhead.”
EMC today moved to fill the hole in its portfolio in scale-out file storage (and its limited success in NAS in general) with the acquisition of Isilon, for a cool $2.25 billion. It looks like HP helped set the going price for the few remaining independent storage vendors on the scene when it spent around $2.5B on 3PAR earlier this year. Isilon, similar to 3PAR, has about 1,500 customers and around $150MM in revenue, so this is an acquisition based on technology and future growth potential more than on market share.
File data is often estimated to be the fastest-growing data type and is also unpredictable in its growth patterns. Isilon’s technology uses a scale-out architecture that fits the behavior of big users of file data. You can start small, with a cluster of a few nodes, and then grow over time, and you aren’t locked in to a given level of performance: you can deploy nodes with different performance and density profiles. This flexibility, combined with extremely easy management and refreshes without complex data migration or downtime, makes for a powerful offering. EMC will take a big step up from its current offerings with the addition of the Isilon technology, probably the best option from an independent vendor today.
From its birth as one of the highest-rated track sessions at IT Forum earlier this year to its recent publication as the Forrester report "Best Practices: Building High-Performance Application Development Teams," Jeffrey Hammond's research on the techniques that leading development shops use to drive their success has been wowing application development professionals.
Are you facing this challenge? Are your business stakeholders demanding faster delivery of more innovation? Is the software you deliver increasingly vital to the success of your business? Then you can't afford to miss this upcoming event:
Webinar: "Building High-Performance Developer Teams"
Hosted by: Jeffrey Hammond, Principal Analyst, and Mike Gilpin, VP and Research Director
When: Thursday, November 18, 2010, at 11 a.m. EST (GMT-05:00)
Duration: 1 hour
Microsoft was kind enough to invite me to its Dynamics Fall Analyst Event — a two-day event packed with product, strategy, customer, and partner information. The focus was clearly on Microsoft Dynamics CRM 2011. This product and the go-to-market strategy are clear and focused. Here are my thoughts:
The Dynamics CRM 2011 product is good. Today, Microsoft Dynamics CRM is used by 23,000 customers and 1.4 million end users in 80 countries, in more than 40 languages. That in itself is impressive. However, Microsoft wants to do better. It has focused on the user experience and UI in the 2011 product in hopes of driving increased adoption. Dynamics CRM 2011 is deeply integrated with Outlook, Office Communicator, SharePoint, Office 365, and Bing. It can be easily personalized: a business user, without the help of IT, can set up a dashboard. It has rich reporting and analytics. It works on mobile devices, including the iPhone. Microsoft realizes that this product still has limitations, especially around Web self-service customer service capabilities. Its near-term plans are to address this, as well as to add capabilities around support for the phone channel and for social customer service. Right now, however, these holes offer specialty customer service vendors a chance to make inroads.
I have been working on a research document, to be published this quarter, on the impact of 8-socket x86 servers based on Intel’s new Xeon 7500 CPU. In a nutshell, these systems have the performance of the best-of-breed RISC/UNIX systems of three years ago, at a substantially better price, and their overall performance improvement trajectory has been steeper than competing technologies for the past decade.
This is probably not shocking news, and it is not the subject of this post, although I would encourage you to read the document when it is finally published. During the course of researching it, I spent time trying to prove or disprove, with available benchmark results, my thesis that x86 system performance solidly overlaps that of RISC/UNIX. The process highlighted for me the limitations of using standardized benchmarks for performance comparisons. There are now so many benchmarks available that system vendors run each benchmark on only selected subsets of their product lines, if at all. Additionally, most benchmarks suffer from several common flaws:
They are results from high-end configurations, in many cases far beyond any normal use case, and the results cannot be interpolated to smaller, more realistic configurations.
They are often the result of teams of very smart experts tuning the system configuration and the application and system software parameters for optimal results. For a large benchmark such as SAP or TPC, it is probably reasonable to assume that more than 1,000 variables are involved in the tuning effort. This makes the results very much like EPA mileage figures — the consumer is guaranteed not to exceed these numbers.
Forrester’s Smart City Tweet Jam was a great success. On Tuesday morning/afternoon/evening, smart city followers around the globe participated in an hour of intense tweeting on smart cities. We touched on a range of issues, from the definitions of a “city” and a “smart city” and the evolution toward the goal of becoming smart to the challenges city leaders face and the business models that enable adoption of technology-based solutions. We ended with a contrarian view that “smart cities” might just be a fad. But that was quickly refuted with reminders of the growing challenges faced by cities and the imperative of facing those challenges in a sustainable manner.
One hour, 62 Twitterers, and 389 tweets later, we were exhausted – at least I was. But we were pleased to have aired and shared our opinions about the challenges, the potential solutions to those challenges, and the paths and business models that will make those solutions possible in the short run and, hopefully, sustainable in the longer term. Below are some excerpts from the conversation, though there were many more interesting points of view and contributions to the discussion. I've also included a visual representation of the key words and topics discussed during the Tweet Jam, created using ManyEyes. For more stats and the full transcript, check out #smartcityjam.
With about 41,000 attendees, 1,800 sessions, and a whopping 63,000-plus slides, Oracle OpenWorld 2010 (September 19-23) in San Francisco was certainly a mega event with more information than one could possibly digest or even collect in a week. While the main takeaway for every attendee depends, of course, on the individual’s area of interest, there was a strong focus this year on hardware due to the Sun Microsystems acquisition. I’m a strong believer in the integration story of “Hardware and Software. Engineered to Work Together.” and really liked the Iron Man 2 showmanship all around the event; but because I’m an application guy, the biggest part of the story, including the launch of Oracle Exalogic Elastic Cloud, was a bit lost on me. And the fact that Larry Ellison basically repeated the same story in his two keynotes didn’t really resonate with me — until he came to what I was most interested in: Oracle Fusion Applications!
Ready, set, go. Earlier this week IBM announced its Smart City Challenge – a competition to help cities investigate and launch smart city initiatives. IBM will award $50 million worth of technology and services to help 100 municipalities across the globe. Each city has to articulate a plan with several strategic issues it would like to address and demonstrate a track record of successful problem solving, a commitment to the use of technology, and a willingness to provide access to city leaders. Hmmm...this sounds exactly like IBM’s existing target market.
The challenge for IBM is to demonstrate that this program is incremental to IBM’s existing activities with cities and local governments. This program really is an opportunity to extend smart city activities – both from a philanthropy perspective and from a business development perspective. (I’m acknowledging that there can be business development in philanthropy.) Will cities that have not yet embarked on a smart city initiative or program now consider applying for funding and assistance in starting down that path?
One way to ensure a broader, and incremental, audience is to get the word out – and actually evangelize to cities that have not yet understood the benefits of technology as a means of addressing their critical pain points. Many of these are perhaps smaller cities, which leads me to another recommendation.
Microsoft began opening its own retail stores in 2009 and recently began a push into more US cities. A recent post by George Anderson on Forbes.com about Microsoft's new store format prompted me to do some late-night analysis. It appears Microsoft's store format strategy is to ride in Apple's draft by building larger-format stores very near, if not adjacent to, Apple's own stores. As a retail analyst and both an Apple and a Microsoft customer for over 25 years, I feel compelled to weigh Microsoft's retail strategy against Apple's (and since I cover retail strategy from a CIO perspective, it feels appropriate to publish here).
Comparing eight success factors
Location: I'll start here, as it was the subject of the original post. Across from Apple may be the only sensible choice for MS, but the challenge MS has is that Apple is a destination store, i.e., people plan to go there for the experience. That makes it less likely shoppers will decide to browse the MS store just because it is close. On the other hand, assuming MS runs promotions to attract traffic to its stores, it is likely to drive additional traffic to Apple as well. Predicted winner = Apple.
Store architecture: Size isn't everything! Sure, Microsoft can copy Apple, go for outstanding store designs, and even build them bigger, but Apple's architecture is designed to reinforce a consistent brand image: minimalist, clean lines, designer. Microsoft's designs can reinforce many things about its brand, but it's hard to see the same consistency that's possible with Apple. Predicted winner = Apple.