Oracle Says No To Itanium – Embarrassment For Intel, Big Problem For HP

Richard Fichera

Oracle announced today that it is going to cease development for Itanium across its product line, stating that it believed, after consultation with Intel management, that x86 was Intel’s strategic platform. Intel of course responded with a press release that specifically stated that at least two additional Itanium products remain in active development – Poulson (whose initial specifications, if not availability, have been announced), and Kittson, of which little is known.

This is a huge move, and one that seems like a kick carefully aimed at the you-know-whats of HP’s Itanium-based server business, which competes directly with Oracle’s SPARC-based Unix servers. If Oracle stays the course in the face of what will certainly be immense pressure from HP, mild censure from Intel, and consternation on the part of many large customers, the consequences are pretty obvious:

  • Intel loses prestige, credibility for Itanium, and a potential drop-off of business from its only large Itanium customer. Nonetheless, the majority of Intel’s server business is x86, and it will, in the end, suffer only a token loss of revenue. Intel’s response to this move by Oracle will be muted – public defense of Itanium, but no fireworks.
Read more

ARM Servers - Calxeda Opens The Kimono For A Tantalizing Tease

Richard Fichera

Calxeda, one of the most visible stealth mode startups in the industry, has finally given us an initial peek at the first iteration of its server plans, and they both meet our inflated expectations from this ARM server startup and validate some of the initial claims of ARM proponents.

While still holding their actual delivery dates and details of specifications close to their vest, Calxeda did reveal the following cards from their hand:

  • The first reference design, which will be provided to OEM partners as well as delivered directly to selected end users and developers, will be based on an ARM Cortex A9 quad-core SOC design.
  • The SOC, as Calxeda will demonstrate with one of its reference designs, will enable OEMs to design servers as dense as 120 ARM quad-core nodes (480 cores) in a 2U enclosure, with an average consumption of about 5 watts per node (1.25 watts per core) including DRAM.
  • While not forthcoming with details about the performance, topology or protocols, the SOC will contain an embedded fabric for the individual quad-core SOC servers to communicate with each other.
  • Most significantly for prospective users, Calxeda claims – and has some convincing models to back up these claims – that it will provide 5X to 10X the performance/watt (and an even greater advantage when price is factored in, for a performance/watt/$ metric) of any products it expects to see on the market when its product ships.
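The density and power figures above are internally consistent; here is a quick back-of-the-envelope check using only the numbers Calxeda quoted (vendor claims, not independent measurements):

```python
# Sanity-check Calxeda's quoted 2U density and power figures.
# All inputs are the vendor's claims, not measured data.
nodes_per_2u = 120        # quad-core SOC nodes per 2U enclosure
cores_per_node = 4
watts_per_node = 5        # average per node, including DRAM

total_cores = nodes_per_2u * cores_per_node      # cores per 2U enclosure
enclosure_watts = nodes_per_2u * watts_per_node  # total enclosure draw
watts_per_core = watts_per_node / cores_per_node # per-core power budget

print(total_cores, enclosure_watts, watts_per_core)  # 480 600 1.25
```

The 480 cores and 1.25 W/core match the figures in the bullet above; the implied ~600 W per 2U enclosure is well within ordinary rack power budgets.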
Read more

IBM And ARM Continue Their Collaboration – Major Win For ARM

Richard Fichera

Last week IBM and ARM Holdings Plc quietly announced a continuation of their collaboration on advanced process technology, this time with a stated goal of developing ARM IP optimized for IBM physical processes down to a future 14 nm size. The two companies have been collaborating on semiconductors and SOC design since 2007, and this extension has several important ramifications for both companies and their competitors.

It is a clear indication that IBM retains a major interest in low-power and mobile computing, despite its previous divestment of its desktop and laptop computers to Lenovo, and that it will be in a position to harvest this technology, particularly ARM's modular approach to composing SOC systems, for future productization.

For ARM, the implications are clear. Its latest announced product, the Cortex A15, which will probably appear in system-level products in approximately 2013, will initially be produced in 32 nm with a roadmap to 20 nm. The existence of a roadmap to a potential 14 nm product serves notice that the new ARM architecture will have a process roadmap that keeps it on Intel’s heels for another decade. ARM has parallel alliances with TSMC and Samsung as well, and there is no reason to think that these will not be extended, but the IBM alliance is an additional insurance policy. In addition to semiconductor technology, IBM has a deep well of systems and CPU IP that certainly cannot hurt ARM.

Read more

ARM-Based Servers – Looming Tsunami Or Just A Ripple In The Industry Pond?

Richard Fichera

From nothing more than an outlandish speculation, the prospects for a new entrant into the volume Linux and Windows server space have suddenly become much more concrete, culminating in an immense buzz at CES as numerous players, including NVIDIA and Microsoft, stoked the fires with innuendo, announcements, and demos.

Consumers of x86 servers are always on the lookout for faster, cheaper, and more power-efficient servers. In the event that they can’t get all three, the combination of cheaper and more energy-efficient seems to be attractive to a large enough chunk of the market to have motivated Intel, AMD, and all their system partners to develop low-power chips and servers designed for high density compute and web/cloud environments. Up until now the debate was Intel versus AMD, and low power meant a CPU with four cores and a power dissipation of 35 – 65 Watts.

The Promised Land

The performance trajectory of processors that were formerly purely mobile device processors, notably the ARM Cortex, has suddenly introduced a new potential option into the collective industry mindset. But is this even a reasonable proposition, and if so, what does it take for it to become a reality?

Our first item of business is to figure out whether or not it even makes sense to think about these CPUs as server processors. My quick take is yes, with some caveats. The latest ARM offering is the Cortex A9, with vendors offering dual core products at up to 1.2 GHz currently (the architecture claims scalability to four cores and 2 GHz). It draws approximately 2W, much less than any single core x86 CPU, and a multi-core version should be able to execute any reasonable web workload. Coupled with the promise of embedded GPUs, the notion of a server that consumes much less power than even the lowest power x86 begins to look attractive. But…

Read more

11 Meanings of Why-My-BI-Application-Is-Not-Useful

Boris Evelson

When a user of a BI application complains about the application not being useful - something that I hear way too often - what does that really mean? I can count at least 11 possible meanings, and potential reasons:

1. The data is not there, because

  • It's not in any operational sources, in which case the organization needs to implement a new application or process, or get that data from an outside source
  • It is in an operational source, but not accessible via the BI application.

The data is there, but

2. It's not usable as is, because

  • There are no common definitions, common metadata
  • The data is of poor quality
  • The data model is wrong, or out of date

3. I can't find it, because I

  • Can't find the right report
  • Can't find the right metadata
  • Can't find the data
  • I don't have access rights to the data I am looking for

4. I don't know how to use my application, because I

  • Was not trained
  • Was trained, but the application is not intuitive or user-friendly enough

5. I can't/don't have time to do it myself - because I just need to run my business, not do BI! - and

  • I don't have support staff
  • I am low on IT priority list

6. It takes too long to

  • Create a report/query
  • Run/execute a report/query
Read more

Pros and cons of using a vendor provided analytical data model in your BI implementation

Boris Evelson

The following question comes from many of our clients: what are some of the advantages and risks of implementing a vendor provided analytical logical data model at the start of any Business Intelligence, Data Warehousing or other Information Management initiatives? Some quick thoughts on pros and cons:

Pros:

  • Leverage vendor knowledge from prior experience and other customers
  • May fill in the gaps in enterprise domain knowledge
  • Best if your IT dept does not have experienced data modelers 
  • May sometimes serve as a project, initiative, solution accelerator
  • May sometimes break through a stalemate between stakeholders failing to agree on metrics, definitions

Cons:

  • May sometimes require more customization effort than building a model from scratch
  • May create differences of opinion and potential roadblocks with your own experienced data modelers
  • May reduce competitive advantage of business intelligence and analytics (since competitors may be using the same model)
  • Goes against “agile” BI principles that call for small, quick, tangible deliverables
  • Goes against top down performance management design and modeling best practices, where one does not start with a logical data model but rather
    • Defines departmental, line of business strategies  
    • Links goals and objectives needed to fulfill these strategies  
    • Defines metrics needed to measure the progress against goals and objectives  
    • Defines strategic, tactical and operational decisions that need to be made based on metrics
Read more

BI In The Cloud? Yes, And On The Ground, Too

Boris Evelson

Slowly but surely, with lots of criticism and skepticism, the business intelligence (BI) software-as-a-service (SaaS) market is gaining ground. It's a road full of peril — at least two BI SaaS startups have failed this year — but what software market segment has not seen its share of failures? Although I do not see a stampede to replace traditional BI applications with SaaS alternatives in the near future, BI SaaS does have a few legitimate use cases even today, such as complementary BI, in coexistence with traditional BI, BI workspaces, and BI for small and some midsize businesses. 

In our latest BI SaaS research report we recommend the following structured approach to see if BI SaaS is right for you and if you are ready for BI SaaS:

  1. Map your BI requirements and IT culture to one of five BI SaaS use cases
  2. Evaluate and consider scenarios where BI SaaS may be a right or wrong fit for you
  3. Select the BI SaaS vendor that fits your business, technical, and operational requirements, including your tolerance for risk

First, we identified the following five BI SaaS use cases:

  1. Coexistence case: on-premises BI complemented with SaaS BI in enterprises
  2. SaaS-centric case in enterprises: main BI application in enterprises committed to SaaS
  3. SaaS-centric case in midmarket: main BI application in midsized businesses
  4. Elasticity case: BI for companies with strong variations in activity from season to season
  5. Power user flexibility case: BI workspaces are often considered necessary by power analysts
Read more

Bottom Up And Top Down Approaches To Estimating Costs For A Single BI Report

Boris Evelson

How much does it cost to produce a single BI report? Just like typical answers to most other typical questions, the only real answer is “it depends”. But let’s build a few scenarios:

Scenario 1: Services only. Bottom up, ABC approach.

Assumptions.


  • Medium complexity report. Two data sources. 4 way join. 3 facts by 5 dimensions. Prompting, filtering, sorting ranking on most of the columns. Some conditional formatting. No data model changes.
  • Specifications and design – 2 person days. Development and testing - 1 person day. UAT – 1 person day.
  • Loaded salary for an FTE of $120,000/yr, or about $460/day.
  • Outside contractor $800/day.

Cost of 1 BI report: $1,840 if done by 2 FTEs, or $2,520 if done by 1 FTE (end user) and 1 outside contractor (developer). Sounds inexpensive? Wait.
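The two totals follow directly from the day rates and the four person-days of effort. A minimal sketch of the arithmetic (the two-days-at-each-rate split in the mixed case is inferred from the $2,520 figure, since the post does not spell out which tasks the contractor covers):

```python
# Bottom-up (ABC) cost of one medium-complexity BI report,
# using the post's assumed rates and effort, not industry benchmarks.
fte_day = 460          # rounded from $120,000/yr loaded salary
contractor_day = 800

days = {"specs_and_design": 2, "dev_and_test": 1, "uat": 1}
total_days = sum(days.values())   # 4 person-days

cost_all_fte = total_days * fte_day            # all work done in-house
# Mixed staffing: the $2,520 total implies two days billed at each rate.
cost_mixed = 2 * fte_day + 2 * contractor_day

print(cost_all_fte, cost_mixed)  # 1840 2520
```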


Scenario 2. Top down. BI software and services:

Assumptions:

  • Average BI software deal per department (as per the latest BI Wave numbers) - $150,000
  • 50% of the software cost is attributable to canned reports, the rest is allocated to ad-hoc queries, and other forms of ad-hoc analysis and exploration.
  • Average cost of effort and services - $5 per every $1 spent on software (anecdotal evidence)
  • Average number of reports per small department - 100 (anecdotal evidence)
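Under these assumptions the top-down math works out to a per-report figure well above the bottom-up estimate. A sketch of the calculation (allocating both software and services by the 50% canned-report share is my reading of the assumptions, since the post cuts off before working the numbers):

```python
# Top-down cost per canned report, using the post's assumptions.
software_per_dept = 150_000   # average BI software deal per department
canned_share = 0.5            # fraction attributable to canned reports
services_per_software_dollar = 5
reports_per_dept = 100

canned_software = software_per_dept * canned_share                # 50% of the deal
canned_services = canned_software * services_per_software_dollar  # effort/services
cost_per_report = (canned_software + canned_services) / reports_per_dept

print(cost_per_report)  # 4500.0
```

That is roughly double the mixed-staffing bottom-up estimate of $2,520, which appears to be the contrast the two scenarios are set up to make.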
Read more

Forrester BI Maturity Survey Results Are In

Boris Evelson

Our latest BI maturity survey results are in. We used exactly the same questions from our online BI maturity self assessment tool to survey over 200 Forrester clients. Now you can compare your own BI maturity level against your peers by using data from the survey.

In the self-assessment tool and in the survey we ask over 30 questions in the following 6 categories:

  • Governance
  • Organization
  • Processes
  • Data and technology
  • Measurement
  • Innovation

Our clients rated themselves on a scale of 1 to 5 (5 if they strongly agree with our statement, 1 if they strongly disagree). Here are the overall results. Keep in mind that these results do not evaluate BI maturity across ALL businesses, but rather in businesses that are already pretty far ahead in their BI implementations (they are Forrester clients, they read our research reports, they talk to our research analysts):

  • Governance 3.00
  • Organization 2.74
  • Processes 2.47
  • Data and technology 2.73
  • Measurement 2.11
  • Innovation 2.00
Read more

How To Differentiate Advanced Data Visualisation Solutions

Boris Evelson

I get many inquiries from clients on how to select a data visualization vendor/solution. The criteria that my clients often cite are:

  • Thick and thin client
  • Dynamic visualizations, not just static charts 
  • Ability to pull data from multiple sources
  • OLAP-like functionality
Read more