Calxeda, one of the most visible stealth mode startups in the industry, has finally given us an initial peek at the first iteration of its server plans, and they both meet our inflated expectations from this ARM server startup and validate some of the initial claims of ARM proponents.
While still holding its actual delivery dates and detailed specifications close to its vest, Calxeda did reveal the following cards from its hand:
The first reference design, which will be provided to OEM partners as well as delivered directly to selected end users and developers, will be based on an ARM Cortex A9 quad-core SOC design.
The SOC, as Calxeda will demonstrate with one of its reference designs, will enable OEMs to design servers as dense as 120 ARM quad-core nodes (480 cores) in a 2U enclosure, with an average consumption of about 5 watts per node (1.25 watts per core) including DRAM.
While not forthcoming with details about the performance, topology or protocols, the SOC will contain an embedded fabric for the individual quad-core SOC servers to communicate with each other.
Most significantly for prospective users, Calxeda claims, and has some convincing models to back up these claims, that it will deliver 5X to 10X the performance/watt (even higher when price is factored in, for a performance/watt/$ metric) of any competing products it expects to see when it brings the product to market.
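Calxeda's density claims are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses only the figures stated above (120 quad-core nodes per 2U at roughly 5 watts per node); the per-enclosure power total is our own extrapolation, not a Calxeda figure:

```python
# Back-of-the-envelope check of Calxeda's claimed density figures.
nodes_per_2u = 120     # quad-core SOC nodes per 2U enclosure (claimed)
cores_per_node = 4
watts_per_node = 5.0   # average per node, including DRAM (claimed)

total_cores = nodes_per_2u * cores_per_node
watts_per_core = watts_per_node / cores_per_node
enclosure_watts = nodes_per_2u * watts_per_node

print(total_cores)      # 480 cores per 2U
print(watts_per_core)   # 1.25 W per core
print(enclosure_watts)  # 600.0 W per enclosure (our extrapolation)
```

The 480-core and 1.25 W/core results match the claims in the text exactly, which suggests the per-node number is where any real-world variance will show up.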
If you're a security and risk professional in charge of protecting consumer-facing applications, you may have heard that OpenID is a “toy,” an insecure protocol, or other critiques. Then came the recent news that former early adopter 37signals is dropping its OpenID login support, which has occasioned some soul-searching in the Web 2.0 identity community. Check out commentary from Scott Gilbertson of Wired's WebMonkey, Dare Obasanjo, and reaction from “social login” vendor JanRain.
When OpenID appeared on the scene, more robust solutions based on SAML had been under way for many years and were seeing adoption, but only in scenarios involving limited circles of trust (typically point-to-point enterprise outsourcing scenarios and specialized higher-education communities) rather than in broad-based consumer populations.
Last week IBM and ARM Holdings Plc quietly announced a continuation of their collaboration on advanced process technology, this time with a stated goal of developing ARM IP optimized for IBM physical processes down to a future 14 nm size. The two companies have been collaborating on semiconductors and SOC design since 2007, and this extension has several important ramifications for both companies and their competitors.
It is a clear indication that IBM retains a major interest in low-power and mobile computing, despite its previous divestment of its desktop and laptop computer business to Lenovo, and that it will be in a position to harvest this technology, particularly ARM's modular approach to composing SOC systems, for future productization.
For ARM, the implications are clear. Its latest announced product, the Cortex A15, which will probably appear in system-level products in approximately 2013, will initially be produced at 32 nm with a roadmap to 20 nm. The existence of a roadmap to a potential 14 nm product serves notice that the new ARM architecture will have a process roadmap that keeps it on Intel's heels for another decade. ARM has parallel alliances with TSMC and Samsung as well, and there is no reason to think these will not be extended, but the IBM alliance is an additional insurance policy. Beyond semiconductor technology, IBM has a deep well of systems and CPU IP that certainly cannot hurt ARM.
From nothing more than an outlandish speculation, the prospects for a new entrant into the volume Linux and Windows server space have suddenly become much more concrete, culminating in an immense buzz at CES as numerous players, including NVIDIA and Microsoft, stoked the fires with innuendo, announcements, and demos.
Consumers of x86 servers are always on the lookout for faster, cheaper, and more power-efficient servers. In the event that they can't get all three, the combination of cheaper and more energy-efficient seems to be attractive to a large enough chunk of the market to have motivated Intel, AMD, and all their system partners to develop low-power chips and servers designed for high-density compute and web/cloud environments. Up until now the debate was Intel versus AMD, and low power meant a CPU with four cores and a power dissipation of 35 to 65 watts.
The Promised Land
The performance trajectory of processors that were formerly purely mobile device processors, notably the ARM Cortex, has suddenly introduced a new potential option into the collective industry mindset. But is this even a reasonable proposition, and if so, what does it take for it to become a reality?
Our first item of business is to figure out whether or not it even makes sense to think about these CPUs as server processors. My quick take is yes, with some caveats. The latest ARM offering is the Cortex A9, with vendors offering dual core products at up to 1.2 GHz currently (the architecture claims scalability to four cores and 2 GHz). It draws approximately 2W, much less than any single core x86 CPU, and a multi-core version should be able to execute any reasonable web workload. Coupled with the promise of embedded GPUs, the notion of a server that consumes much less power than even the lowest power x86 begins to look attractive. But…
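The appeal of those numbers is easy to illustrate with rough arithmetic, using the approximately 2 W Cortex A9 figure above against the 35 W low end of the x86 range cited earlier (illustrative framing only; real comparisons depend heavily on per-core performance and workload):

```python
# Rough power-budget framing: how many ~2 W ARM SOCs fit in the
# power envelope of one low-power x86 CPU (illustrative only).
arm_watts = 2.0    # dual-core Cortex A9, approximate (from the text)
x86_watts = 35.0   # low end of the 35-65 W x86 range cited earlier

chips_per_budget = x86_watts / arm_watts
print(chips_per_budget)   # 17.5 ARM chips per x86 power budget
```

Even if each ARM chip delivers only a fraction of the x86 part's throughput, a ratio like this leaves a lot of headroom for the performance/watt claims discussed above.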
When a user of a BI application complains about the application not being useful - something that I hear way too often - what does that really mean? I can count at least 11 possible meanings, and potential reasons:

1. The data is not there, because
   - It's not in any operational sources, in which case the organization needs to implement a new app or a new process, or get that data from an outside source
   - It is in an operational source, but not accessible via the BI application

Or the data is there, but

2. It's not usable as is, because
   - There are no common definitions or common metadata
   - The data is of poor quality
   - The data model is wrong, or out of date
3. I can't find it, because I
   - Can't find the right report
   - Can't find the right metadata
   - Can't find the data
   - Don't have access rights to the data I am looking for
4. I don't know how to use my application, because I
   - Was not trained
   - Was trained, but the application is not intuitive or user-friendly enough
5. I can't/don't have time to do it myself, because I just need to run my business, not do BI!
The following question comes from many of our clients: what are some of the advantages and risks of implementing a vendor-provided analytical logical data model at the start of any Business Intelligence, Data Warehousing or other Information Management initiatives? Some quick thoughts on pros and cons:

Pros:
- Leverages vendor knowledge from prior experience and other customers
- May fill in gaps in enterprise domain knowledge
- Best if your IT department does not have experienced data modelers
- May sometimes serve as a project, initiative, or solution accelerator
- May sometimes break through a stalemate between stakeholders failing to agree on metrics and definitions

Cons:
- May sometimes require more customization effort than building a model from scratch
- May create differences of opinion and potential roadblocks from your own experienced data modelers
- May reduce the competitive advantage of business intelligence and analytics (since competitors may be using the same model)
- Goes against “agile” BI principles that call for small, quick, tangible deliverables
- Goes against top-down performance management design and modeling best practices, where one does not start with a logical data model but rather:
  - Defines departmental and line-of-business strategies
  - Links goals and objectives needed to fulfill these strategies
  - Defines metrics needed to measure progress against goals and objectives
  - Defines strategic, tactical, and operational decisions that need to be made based on those metrics
Slowly but surely, with lots of criticism and skepticism, the business intelligence (BI) software-as-a-service (SaaS) market is gaining ground. It's a road full of peril — at least two BI SaaS startups have failed this year — but what software market segment has not seen its share of failures? Although I do not see a stampede to replace traditional BI applications with SaaS alternatives in the near future, BI SaaS does have a few legitimate use cases even today, such as complementary BI coexisting with traditional BI, BI workspaces, and BI for small and some midsize businesses.
In our latest BI SaaS research report we recommend the following structured approach to see if BI SaaS is right for you and if you are ready for BI SaaS:
Map your BI requirements and IT culture to one of five BI SaaS use cases
Evaluate and consider scenarios where BI SaaS may be a right or wrong fit for you
Select the BI SaaS vendor that fits your business, technical, and operational requirements, including your tolerance for risk
First, we identified the following five BI SaaS use cases:
Coexistence case: on-premises BI complemented with SaaS BI in enterprises
SaaS-centric case in enterprises: main BI application in enterprises committed to SaaS
SaaS-centric case in midmarket: main BI application in midsized businesses
Elasticity case: BI for companies with strong variations in activity from season to season
Power user flexibility case: BI workspaces are often considered necessary by power analysts
Our latest BI maturity survey results are in. We used exactly the same questions from our online BI maturity self-assessment tool to survey over 200 Forrester clients. Now you can compare your own BI maturity level against your peers by using data from the survey.
In the self-assessment tool and in the survey, we ask over 30 questions in the following 6 categories:
Data and technology
Our clients rated themselves on a scale of 1 to 5 (5 if they strongly agree with our statement, 1 if they strongly disagree). Here are the overall results. Keep in mind that these results do not evaluate BI maturity across ALL businesses, but rather in businesses that are already pretty far ahead in their BI implementations (they are Forrester clients, they read our research reports, they talk to our research analysts):
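A minimal sketch of how a self-assessment on this kind of 1-to-5 agreement scale can be tallied per category. Only "Data and technology" and the 1-to-5 scale come from the survey description above; the other category names and all scores below are hypothetical:

```python
# Hypothetical tally for a 1-5 agreement-scale self-assessment.
# "Data and technology" is from the survey; the other category
# names and all scores are illustrative placeholders.
responses = {
    "Data and technology": [4, 3, 5, 4],
    "Governance": [2, 3, 3],
    "Processes": [3, 4, 2, 3, 4],
}

def category_average(scores):
    """Average the 1-5 ratings within one category."""
    return sum(scores) / len(scores)

for category, scores in responses.items():
    print(category, round(category_average(scores), 2))
# Data and technology 4.0
# Governance 2.67
# Processes 3.2
```

Averaging per category rather than overall is what makes a maturity profile comparable across respondents, since it shows where a business leads or lags rather than collapsing everything into one number.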