[For some reason this has been unpublished since April — so here it is well after AMD announced its next spin of the SeaMicro product.]
At its recent financial analyst day, AMD indicated that it intended to differentiate itself by creating products that were advantaged in niche markets, with specific mention, among other segments, of servers, and to generally shake up the trench warfare that has had it on the losing side of its lifelong battle with Intel (my interpretation, not AMD management’s words). Today, at least for the server side of the business, it made a move that can potentially offer it visibility and differentiation by acquiring innovative server startup SeaMicro.
SeaMicro has attracted our attention since its appearance (blog post 1, blog post 2) with its innovative architecture that dramatically reduces power and improves density by sharing components like I/O adapters, disks, and even BIOS over a proprietary fabric. The irony here is that SeaMicro came to market with a tight alignment with Intel, who at one point even introduced a special dual-core packaging of its Atom CPU to allow SeaMicro to improve its density and power efficiency. Most recently SeaMicro and Intel announced a new model that featured Xeon CPUs to address the more mainstream segments that were not a part of SeaMicro’s original Atom-based offering.
I’m excited to announce the recent publication of Welcome To The Era Of Digital Intelligence. This idea has been brewing for a long time, and it shouldn’t surprise anybody who follows interactive marketing or web analytics. The macro marketing environment has changed – and continues to rapidly evolve – to accommodate new touchpoints, sophisticated consumers, and highly coordinated multichannel customer experiences. And as the remit of marketing expands, so too must that of marketing analytics.
It’s clear that traditional analytics approaches were not designed to handle the breadth of channels, devices, data volume, and speed that fuel today’s digital interactions. The endemic symptoms of these gaps are plain for anyone to see: the proliferation of analysis tools, the explosion of data warehousing projects, and the struggle to translate analytics into actionable insights. We need to take a step back and reimagine an analytics framework that adequately supports modern digital marketing.
Forrester calls this updated approach to marketing analytics “digital intelligence,” defined as:
The capture, management, and analysis of data to provide a holistic view of the digital customer experience that drives the measurement, optimization, and execution of marketing tactics and business strategies.
Digital intelligence comprises six “layers”:
Digital data inputs – incorporating data from all digital marketing touchpoints
Business data inputs – putting digital marketing data into context with data from the business
Data processing – collecting, integrating, and managing data with a high degree of speed and granularity
Emerging ARM server vendor Calxeda has been hinting for some time that it had a significant partnership announcement in the works, and while we didn’t necessarily disbelieve them, we hear a lot of claims from startups telling us to “stay tuned” for something big. Sometimes they pan out; sometimes they simply go away. But this morning Calxeda surpassed our expectations by unveiling just one major systems partner – but it just happens to be Hewlett Packard, which dominates the worldwide market for x86 servers.
At its core (unintended but not bad pun), the HP Hyperscale business unit’s Project Moonshot and Calxeda’s server technology are about improving the efficiency of web and cloud workloads, and they promise improvements in excess of 90% in power efficiency, with similar improvements in physical density, compared with current x86 solutions. As I noted in my first post on ARM servers and other documents, even if these estimates turn out to be exaggerated, there is still a generous window within which to do much, much better than current technologies. And because workloads (such as memcache, Hadoop, and static web serving) will be selected for their fit to this new platform, the workloads that run on it will potentially come close to the cases quoted by HP and Calxeda.
Calxeda, one of the most visible stealth-mode startups in the industry, has finally given us an initial peek at the first iteration of its server plans, which both meets our inflated expectations for this ARM server startup and validates some of the initial claims of ARM proponents.
While still holding its actual delivery dates and detailed specifications close to its vest, Calxeda did reveal the following cards from its hand:
The first reference design, which will be provided to OEM partners as well as delivered directly to selected end users and developers, will be based on an ARM Cortex A9 quad-core SOC design.
The SOC, as Calxeda will demonstrate with one of its reference designs, will enable OEMs to design servers as dense as 120 ARM quad-core nodes (480 cores) in a 2U enclosure, with an average consumption of about 5 watts per node (1.25 watts per core) including DRAM.
While Calxeda was not forthcoming with details about performance, topology, or protocols, the SOC will contain an embedded fabric that allows the individual quad-core SOC servers to communicate with each other.
Most significantly for prospective users, Calxeda is claiming, and has some convincing models to back up these claims, that it will provide an advantage of 5X to 10X in performance/watt (and even higher when price is factored in, for a performance/watt/dollar metric) over any products it expects to see in the market when it ships.
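The density and power figures above are internally consistent, which a quick back-of-the-envelope sketch confirms (the rack-level extrapolation at the end is my own assumption, not a Calxeda or HP claim):

```python
# Sanity-check of the quoted Calxeda reference-design figures.
nodes_per_2u = 120        # quad-core ARM nodes in a 2U enclosure
cores_per_node = 4
watts_per_node = 5.0      # average per node, including DRAM

total_cores = nodes_per_2u * cores_per_node        # 480 cores per 2U
total_watts = nodes_per_2u * watts_per_node        # 600 W per 2U
watts_per_core = watts_per_node / cores_per_node   # 1.25 W per core

# Hypothetical extrapolation: a standard 42U rack filled with 2U enclosures.
enclosures_per_rack = 42 // 2
cores_per_rack = total_cores * enclosures_per_rack  # 10,080 cores per rack

print(total_cores, total_watts, watts_per_core, cores_per_rack)
```

At roughly 600 W per 2U for 480 cores, the density story is what makes the efficiency claims plausible for workloads that parallelize well across many small nodes.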
Here we are again. As we approach Labor Day, less than three weeks after IBM announced its agreement to acquire Unica (see my blog post with Suresh Vittal here), comScore announced yesterday that it has acquired the venerable European Web analytics vendor Nedstat.
Total cash and stock consideration for the purchase is valued at approximately $36.7 million. Additionally, nearly the entire Nedstat staff, numbering about 120, will stay on at comScore.
Official information is available through comScore, the comScore corporate blog, and the regulatory filing for those of you who are financially minded. I also had the opportunity to speak with comScore CEO and co-founder Magid Abraham, who generously took time out of a very hectic day for a call.
The acquisition is predicated on the following benefits:
Geographic expansion. Nedstat provides an established European presence from which to serve current and prospective comScore clients in the region.
Product enhancement. comScore will enhance its Unified Digital Measurement (UDM) platform with Nedstat technology.
Deeper client relationships. The opportunity to upsell comScore’s existing client base with new and expanded product offerings.
It's a funny thing: I was sitting in my office on Monday trying to decide what to blog about next, and as if on cue, we had very exciting news yesterday in the Web analytics business. IBM announced its agreement to acquire Coremetrics. So ended the deliberations on blog topics.
You can read the official press release from IBM here and the Coremetrics blog post here. The deal is subject to standard regulatory approvals in the US and Europe prior to closing. IBM has substantial experience in M&A, having acquired 90 companies since 1999 (source: Wikipedia), so I would expect this deal to proceed with a high degree of precision to a successful conclusion. It is also interesting to note that this is not IBM's first go at Web analytics; its previous foray ended in 2006 when it divested Surfaid to Coremetrics...so in a sense we're going full circle with this transaction.
I benefit from fortuitous scheduling, as I had already planned to have dinner last night with John Squire, Chief Strategy Officer at Coremetrics. Timing, as they say, is everything. (By the way, if you are ever in the mood for an excellent pastrami sandwich and/or Belgian beer - on tap no less - I highly recommend Refuge in San Carlos, California)
Coremetrics will operate as a unit of IBM's application and integration middleware division, which is a common approach by IBM as shown by previous acquisitions such as ILOG. New business sales will shift over to IBM's core sales groups, but account management will remain within the Coremetrics team in an effort to make the transition seamless for current clients.
I was traveling for the past couple of weeks in the United Kingdom to meet with clients. Following a set of very successful meetings, I ran into a bit of trouble. Just as I was planning to return home, a volcano in Iceland erupted and brought air travel in Europe to a standstill. I had to spend an additional six days in London. I never thought I would utter that combination of words; it just goes to show that sometimes truth is stranger than fiction.
(picture credited to AP Photo/Icelandic Coast Guard)
All things considered I can't complain too much. Obviously it is never fun to have travel plans disrupted or to be away from family longer than anticipated. But there are far worse places to be stranded than London! It's a wonderful city. And I have many clients, colleagues and friends there, so I kept quite busy, and was able to work from Forrester's London office while awaiting the green light to come home. About a dozen Forrester employees were in a similar situation, and the company did a great job of making sure we were ok and provided much needed support; I'm sure many travelers were not so fortunate.
It is interesting how the web became my constant companion as I made my best efforts to stay productive during the crisis and find my way home. I frequented the travel websites (American Airlines, Marriott), the UK and EU air transport authorities (NATS), news sites (BBC and Sky), and most of all Twitter (#ashtag) to stay up to date on the volcano news and ensure that I had a place to sleep every night and a seat reserved on the earliest flight home. Turning to Twitter for real-time, crowdsourced news was a real revelation: it often scooped the big news websites, and it provided a sense of community; a lot of us were stuck in this mess together!
Last week I hit a major personal milestone. My first report as a Forrester analyst went live!
As thrilling as this is for me, I hope it will be even more exciting for Customer Intelligence professionals.
The report is titled How Web Analytics Will Emerge As A Cornerstone Of Customer Intelligence, and is based on the premise that the web is the common denominator for customer experiences and that this information can be harnessed and subsequently applied throughout the enterprise. This report outlines the future trajectory of Web analytics technology and gives CI professionals pragmatic advice about how to use that technology as a foundational component of customer intelligence that fuels multichannel marketing effectiveness.
Marketers today have a dizzying array of online and offline touchpoints at their disposal, but without a doubt all roads lead through the Web. For most organizations, Web sites, microsites, landing pages, communities, and other interactive properties are mission-critical for acquiring, retaining, and nurturing customers and other target audiences. By definition this reality makes the Web one of the most crucial sources of insight for Customer Intelligence (CI) professionals. To put that insight into action, firms must leverage Web analytics beyond isolated Web site marketing and operations to feed analysis, decision support, and execution for the entire marketing function.
I believe that Web analytics will extend beyond the Web site in two phases.
First - Web analytics platforms will cement their position as the nucleus of online measurement by continuing their current diversification efforts to extend beyond core Web analytics capabilities.
I was on client calls most of the day, and when I came up for air in the afternoon to check my RSS reader and TweetDeck to see what was going on in the world, I made a fascinating discovery. Like many of you, I came across the following post from the Google Analytics Blog:
This was most unexpected, and my Thursday suddenly got a lot more interesting.
Before we go any further let me state that I have not been briefed by Google on this news item. This post is purely based on my own initial thoughts on the matter.
The blog post announces Google's plans to release a browser plug-in that would allow consumers to opt-out of Google Analytics tracking. This offering is still in development, and the post offers no specifics on the release date, although it implies that this is only weeks away.
(Side note: It is also interesting to note the language used in the post. The post leads with "As an enterprise-class web analytics solution..." This isn't a surprising or entirely inappropriate assertion, but it strongly implies Google's aspirations for GA.)
There are many reasons why Google's course of action is counterintuitive. Naturally, the marketer in me recoils at the idea of voluntarily allowing measurable data to slip through our hands. Rationalizing web analytics data is already hard enough, and now this? And we can certainly debate the true privacy impact of web analytics on consumers.