Earlier this week Dell joined arch-competitor HP in endorsing ARM as a potential platform for scale-out workloads by announcing “Copper,” an ARM-based version of its PowerEdge-C dense server product line. Dell’s announcement and positioning, while a little less high-profile than HP’s February announcement, are intended to serve the same purpose — to enable an ARM ecosystem by providing a platform for exploring ARM workloads, and to give Dell a visible presence in the event that the market begins to take off.
Dell’s platform is based on a four-core Marvell ARMv7 SoC, which Dell claims delivers somewhat higher performance than the Calxeda part, although it draws more power: 15W per node, including RAM and local disk. The server uses the PowerEdge-C form factor of 12 vertically mounted server modules in a 3U enclosure, each carrying four server nodes, for a total of 48 servers and 192 cores per enclosure. In a departure from other PowerEdge-C products, the Copper server has integrated Layer 2 network connectivity spanning all servers, so the unit can serve as a low-cost test bed for clustered applications without external switches.
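The density and power figures above are easy to sanity-check. The following sketch works through the arithmetic; the per-enclosure numbers (12 modules, 4 nodes each, 4 cores per node, roughly 15W per node) come from Dell’s announcement, while the full-rack extrapolation is purely my own illustrative assumption.

```python
# Back-of-the-envelope math for the Copper 3U enclosure described above.
MODULES_PER_3U = 12    # vertically mounted server modules per enclosure
NODES_PER_MODULE = 4   # server nodes per module
CORES_PER_NODE = 4     # four-core Marvell ARMv7 SoC
WATTS_PER_NODE = 15    # claimed per-node power, including RAM and local disk

nodes = MODULES_PER_3U * NODES_PER_MODULE   # 48 servers per enclosure
cores = nodes * CORES_PER_NODE              # 192 cores per enclosure
enclosure_watts = nodes * WATTS_PER_NODE    # 720 W per 3U enclosure

# Hypothetical extrapolation (my assumption, not Dell's claim):
# a 42U rack holds 14 such enclosures.
enclosures_per_rack = 42 // 3
rack_nodes = nodes * enclosures_per_rack
rack_watts = enclosure_watts * enclosures_per_rack

print(nodes, cores, enclosure_watts, rack_nodes, rack_watts)
# → 48 192 720 672 10080
```

At roughly 720W for 48 servers, the appeal for scale-out workloads is obvious; the open question is whether per-node performance is adequate.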
Dell is offering this server to selected customers, not as a GA product, along with open source versions of the LAMP stack, Crowbar, and Hadoop. Canonical currently supplies Ubuntu for ARM servers, and Dell is actively working with other partners. Dell expects OpenStack to be available for demos in May, and there is an active Fedora project underway as well.
In another sign that the movement toward converged infrastructure and vertically integrated solutions is becoming ever more mainstream, HP and Microsoft recently announced a line of specialized appliances that combine integrated hardware and pre-packaged software targeting Exchange email, business analytics with Microsoft SharePoint and PowerPivot, and data warehousing with SQL Server. The offerings include:
HP E5000 Messaging System – Microsoft Exchange mailboxes in standard configurations of 500 to 3,000 mailboxes. The product incorporates a pair of servers derived from HP's blade family in a new 3U rack enclosure, plus storage and Microsoft Exchange software, and is installed by HP as a turnkey system.
HP Business Decision Appliance – Integrated servers and SQL Server PowerPivot software targeting analytics in midmarket and enterprise groups, tuned for 80 concurrent users. This offering is based on standard HP rack servers and integrated Microsoft software.
HP Enterprise Data Warehouse Appliance – Intended to compete with Oracle Exadata, at least for data warehouse applications, this appliance targets enterprise data warehouses in the hundreds-of-terabytes range. Like Exadata, it is a massive stack of integrated servers and software: 13 HP rack servers, 10 HP MSA storage units, and integrated Ethernet, InfiniBand and FC networking, along with Microsoft SQL Server 2008 R2 Parallel Data Warehouse software.
The following question comes up with many of our clients: what are the advantages and risks of implementing a vendor-provided analytical logical data model at the start of a Business Intelligence, Data Warehousing or other Information Management initiative? Some quick thoughts on pros and cons:
Pros:
Leverages vendor knowledge from prior experience and other customers
May fill gaps in enterprise domain knowledge
Especially useful if your IT department lacks experienced data modelers
May serve as a project, initiative or solution accelerator
May break through a stalemate between stakeholders who cannot agree on metrics and definitions
Cons:
May require more customization effort than building a model from scratch
May provoke differences of opinion and potential roadblocks from your own experienced data modelers
May reduce the competitive advantage of business intelligence and analytics, since competitors may be using the same model
Goes against “agile” BI principles that call for small, quick, tangible deliverables
Goes against top-down performance management design and modeling best practices, where one does not start with a logical data model but rather:
  Defines departmental and line-of-business strategies
  Links goals and objectives needed to fulfill these strategies
  Defines metrics needed to measure progress against goals and objectives
  Defines the strategic, tactical and operational decisions that need to be made based on those metrics
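The top-down chain above (strategies drive goals, goals drive metrics, metrics drive decisions) can be made concrete as a small data-structure sketch. All class names, fields, and the example values below are my own illustrative assumptions, not part of any vendor model.

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    description: str  # a decision informed by a metric
    level: str        # "strategic", "tactical", or "operational"

@dataclass
class Metric:
    name: str
    decisions: list = field(default_factory=list)  # decisions this metric drives

@dataclass
class Goal:
    objective: str
    metrics: list = field(default_factory=list)    # metrics measuring progress

@dataclass
class Strategy:
    line_of_business: str
    goals: list = field(default_factory=list)      # goals fulfilling the strategy

# Hypothetical example: only once this chain is defined would a
# logical data model follow from it.
churn = Metric("monthly_churn_rate",
               [Decision("adjust retention offers", "tactical")])
retain = Goal("reduce customer churn", [churn])
strategy = Strategy("consumer services", [retain])
print(strategy.goals[0].metrics[0].name)  # → monthly_churn_rate
```

The point of the sketch is the direction of the arrows: the data model is derived from decisions and metrics, not the other way around.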
I just came back from an exciting week in Orlando, FL, shuttling between the SAP SAPPHIRE and IBM Cognos Forum conferences. Thank you, my friends at SAP and IBM, for putting the two conferences right next to each other (in both time and location), and for saving me an extra trip!
Both conferences showed new and exciting products, and both vendors are making great progress toward my vision of “next generation BI”: automated, pervasive, unified and limitless. I track about 20 different trends under these four categories, but one in particular is catching my attention these days. It stayed largely under the covers at both conferences, and I was struggling with how to verbalize it, until my good friend and peer Mark Albala, of http://www.info-sight-partners.com, put it in excellent terms for me in an email earlier today: it’s all about “pre-discovery” vs. “post-discovery” of data.
I have always predicted that Open Source BI has to reach critical mass before it becomes a viable alternative to large enterprise BI platforms. All the individual components (a mixture of Open Source BI projects and commercial vendor wrappers around them) are slowly but surely catching up to their bigger closed source BI brothers. Talend and Kettle (a Pentaho-led project) offer data integration components such as ETL; Mondrian and Palo (SourceForge projects) provide OLAP servers; BIRT (an Eclipse project), Actuate, Jaspersoft and Pentaho have impressive reporting components; Infobright innovates with a columnar DBMS well suited to BI; and productized offerings from consulting companies such as the Europe-based Engineering Ingegneria Informatica, with SpagoBI, provide some Open Source BI component integration.
However, even large closed source BI vendors that have acquired multiple BI components over the years still struggle with full, seamless component integration. So what chance do Open Source BI projects and vendors, with independent leadership structures and often differing priorities, have of integrating highly critical BI components such as metadata, data access layers, GUIs, common prompting/sorting/ranking/filtering approaches, drill-throughs from one product to another, and so on? Today, close to none. However, a consolidation of such products and technologies under one roof could indeed create the much-needed critical mass and give these individual components a chance to grow into enterprise-quality BI solutions.
Consistently rated as one of the most popular features of Forrester Events, one-on-one meetings give you the opportunity to discuss the unique technology issues facing your organization with Forrester analysts. Business & Technology Leadership Forum attendees may schedule up to two 20-minute one-on-one meetings with the Forrester analysts of their choice, depending on availability. Registered attendees will be able to schedule one-on-one meetings starting Monday, September 15, 2008. Book early!