Does The Good Old 80/20 Rule Work For Estimating BI Costs?

Boris Evelson

I get tons of questions about "how much it costs to develop an analytical application." Alas, as most of us unfortunately know, the only real answer to that question is “it depends.” It depends on the scope, requirements, technology used, corporate culture, and at least a few dozen more dimensions. However, at the risk of huge oversimplification, in many cases we can apply the good old 80/20 rule as follows (a rough back-of-the-envelope sketch putting these splits together appears after the examples below):

Components

  • ~20% for software, hardware, and other data center and communications infrastructure
  • ~80% for full time employees, outside services (analysis, design, coding, testing, integration, implementation, etc), new processes, new initiatives (governance, change management, training)

Initial software costs (~80%) vs. Ongoing software license maintenance costs (~20% / year)

Direct (~20%) vs. Indirect costs (~80%). Here are some examples:

Direct ~20%

  • Data integration for reporting and analysis
  • Data cleansing processes for reporting and analysis
  • Reporting and analytical databases such as Data Warehouses and Data Marts
  • Reporting / querying / dashboards
  • OLAP (Online Analytical Processing)
  • Analytical MDM (Master Data Management)
  • Analytical metadata management
  • Data mining, predictive analytics
  • BI-specific SOA (Service-Oriented Architecture) or other types of EAI (Enterprise Application Integration)
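
To make the arithmetic concrete, here is a minimal, purely illustrative sketch (in Python) of how these rough splits might be combined into a budget estimate. The percentages mirror the heuristics above, but the $500K software/infrastructure figure and the three-year horizon are hypothetical assumptions, not benchmarks.

```python
# Illustrative only: applies the rough 80/20 heuristics described above to a
# hypothetical BI program budget. Every figure is an assumption, not a benchmark.

def estimate_bi_costs(software_and_infra: float, years: int = 3) -> dict:
    """Back-of-the-envelope BI cost estimate based on the 80/20 splits."""
    # ~20% software/hardware/infrastructure vs. ~80% people, services, processes
    people_and_services = software_and_infra * (0.80 / 0.20)

    # Initial software cost (~80%) vs. ongoing license maintenance (~20% per year)
    initial_software = software_and_infra * 0.80
    annual_maintenance = software_and_infra * 0.20

    total_initial = software_and_infra + people_and_services
    total_over_period = total_initial + annual_maintenance * years

    return {
        "software_and_infra": software_and_infra,
        "people_and_services": people_and_services,
        "initial_software": initial_software,
        "annual_maintenance": annual_maintenance,
        "total_over_period": total_over_period,
        # Direct (~20%) vs. indirect (~80%) view of the same initial total
        "direct_costs": total_initial * 0.20,
        "indirect_costs": total_initial * 0.80,
    }


if __name__ == "__main__":
    # Hypothetical example: $500K of software, hardware, and infrastructure
    for line_item, amount in estimate_bi_costs(500_000).items():
        print(f"{line_item:>20}: ${amount:,.0f}")
```

Swap in your own figures; the point is simply that the people, process, and ongoing-maintenance line items dominate the software line item.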
Read more

The Role Of IT Operations In Archiving

Stephanie Balaouras

Yesterday IBM announced the availability of its new IBM Information Archive Appliance. The appliance replaces IBM’s DR550. The new appliance offers significantly increased scale and performance because it’s built on IBM’s Global Parallel File System (GPFS), adds more interfaces (NAS and an API to Tivoli Storage Manager), and accepts information from multiple sources – IBM content management and archiving software and, eventually, third-party software. Tivoli Storage Manager (TSM) is embedded in the appliance to provide automated tiered disk and tape storage as well as block-level deduplication. TSM’s block-level deduplication will reduce storage capacity requirements, and its disk and tape management capabilities will let IT continue to leverage tape for long-term data retention. All these appliance subcomponents are transparent to the IT end user who manages the appliance – he or she just sees one console for defining collections and the retention policies for those collections.

Read more

Cloud DR Services Are Real

Stephanie Balaouras

There is a lot of hype surrounding the cloud, and I'm usually not one to join it. But in the case of cloud-based backup and disaster recovery services (I've been trying to use the term IT service continuity, but it hasn't caught on yet), these services are available today, they address major pain points in IT operations, and organizations of all sizes – not just small and medium businesses – can leverage them.

Storage-as-a-service is relatively new. Today the main value proposition is as a cloud target for on-premise deployments of backup and archiving software. If you need to retain data for extended periods of time (a year or more in most cases), tape is still the more cost-effective option given its low capital acquisition cost and removability. But if you have long-term data retention needs and you want to eliminate tape, that's where a cloud storage target comes in: electronically vault that data to a storage-as-a-service provider who can store it at cents per GB. You just can't beat the economies of scale these providers are able to achieve.

If you're a small business that doesn't have the staff to implement and manage a backup solution, or if you're an enterprise looking for a PC backup or remote-office backup solution, I think it's worthwhile to compare the three-year total cost of ownership of an on-premise solution versus backup-as-a-service.
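
To make that comparison concrete, here's a minimal, purely illustrative sketch (in Python) of a three-year TCO comparison between an on-premise backup solution and backup-as-a-service. All prices, data volumes, and growth rates below are hypothetical assumptions for illustration, not quotes from any vendor or service provider.

```python
# Illustrative only: a rough three-year TCO comparison between an on-premise
# backup solution and backup-as-a-service. Every figure below is a hypothetical
# assumption for illustration, not vendor pricing.

def on_premise_tco(hardware: float, software: float, annual_maintenance_pct: float,
                   annual_admin_cost: float, years: int = 3) -> float:
    """Capital purchase up front, plus maintenance and admin labor each year."""
    return (hardware + software
            + (hardware + software) * annual_maintenance_pct * years
            + annual_admin_cost * years)


def backup_as_a_service_tco(protected_tb: float, price_per_gb_month: float,
                            annual_data_growth: float, years: int = 3) -> float:
    """Pay-as-you-go: a monthly per-GB fee on a data set that grows each year."""
    total = 0.0
    tb = protected_tb
    for _ in range(years):
        total += tb * 1024 * price_per_gb_month * 12  # GB * $/GB/month * 12 months
        tb *= 1 + annual_data_growth
    return total


if __name__ == "__main__":
    # Hypothetical small-business scenario: 5 TB protected, 20% annual data growth
    on_prem = on_premise_tco(hardware=40_000, software=15_000,
                             annual_maintenance_pct=0.20, annual_admin_cost=10_000)
    baas = backup_as_a_service_tco(protected_tb=5, price_per_gb_month=0.10,
                                   annual_data_growth=0.20)
    print(f"On-premise 3-year TCO:          ${on_prem:,.0f}")
    print(f"Backup-as-a-service 3-year TCO: ${baas:,.0f}")
```

The point of the exercise isn't the specific numbers; it's that the per-GB service fee, data growth, and the staff time you avoid are the variables that actually drive the decision.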

Read more

Deduplication Market Undergoes Rapid Changes

Stephanie Balaouras

In May, I blogged about NetApp's announced acquisition of deduplication pioneer Data Domain. The announcement triggered an unsolicited counter-offer from EMC, followed by another counter from NetApp. But after a month of offers, counter-offers, and regulatory reviews, EMC ultimately outbid NetApp with an all-cash offer of $2.1 billion. I believe that Data Domain would have been a better fit in the current NetApp portfolio; it would have been easier for NetApp to reposition its current VTL for large enterprises that still planned to leverage tape. It's also said that more than half of Data Domain's current employees are former NetApp employees, so there would have been a clear cultural fit as well.

For $2.1 billion, EMC gets Data Domain's more than 3,000 customers and 8,000 installs, but it also gets a product that, in my opinion, overlaps with its current Quantum-based disk libraries, the DL1500 and DL3000. In Forrester inquiries and current consulting engagements, Data Domain is regularly up against the EMC DL1500 and DL3000. EMC will need to quickly explain to customers how it plans to position its new Data Domain offerings alongside its current DL family – both the Quantum- and FalconStor-based DLs – as well as its broader data protection portfolio, which includes NetWorker and Avamar, products that also offer deduplication.

Read more

Information Post-Discovery - Latest BI Trend

Boris Evelson

I just came back from an exciting week in Orlando, FL, shuttling between the SAP SAPPHIRE and IBM Cognos Forum conferences. Thank you, my friends at SAP and IBM, for putting the two conferences right next to each other (time- and location-wise) and for saving me an extra trip!

Both conferences showed new and exciting products, and both vendors are making great progress towards my vision of “next generation BI”: automated, pervasive, unified, and limitless. I track about 20 different trends under these four categories, but there’s a particular one that is especially catching my attention these days. It stayed largely under the covers at both conferences, and I was struggling with how to verbalize it, until my good friend and peer, Mark Albala, of http://www.info-sight-partners.com, put it in excellent terms for me in an email earlier today: it’s all about “pre-discovery” vs. “post-discovery” of data.

Read more

Free BI Is Still No Free Lunch

Boris Evelson

In my recent BI Belt Tightening For Tough Economic Times document, I explored a few low-cost alternatives to traditional, mainstream, and typically relatively expensive Business Intelligence (BI) tools. While some of these alternatives were indeed a fraction of the cost of a typical large enterprise BI software license, truly zero-cost options were even fewer. But there were some. For example, you can:

  • Leverage no-cost bundled BI software already in-house. Small departments and workgroups may be able to leverage BI software that comes bundled at no additional cost with BI appliances, database management systems (DBMSes), and application licenses. Consider using these few free licenses from Actuate, IBM Cognos, Information Builders, Jaspersoft, Microsoft, MicroStrategy, Panorama, Pentaho, and SAP Business Objects for additional functions such as testing, QA, and prototyping. While these few free licenses are just a drop in the bucket compared with a typical large enterprise’s BI license requirements, do look around and don’t waste money on BI products you may already have.
Read more

Are There BI Implications In The Rumored IBM/Sun Merger? You Betcha!

Boris Evelson

I have always predicted that Open Source BI has to reach critical mass before it becomes a viable alternative to large enterprise BI platforms. All the individual components (a mixture of Open Source BI projects and commercial vendor wrappers around them) are slowly but surely catching up to their bigger closed source BI brothers. Talend and Kettle (a Pentaho-led project) offer data integration components like ETL; Mondrian and Palo (SourceForge projects) have OLAP servers; BIRT (an Eclipse project), Actuate, Jaspersoft, and Pentaho have impressive reporting components; Infobright innovates with a columnar DBMS well suited for BI; and productized offerings from consulting companies like the European-based Engineering Ingegneria Informatica – SpagoBI – offer some Open Source BI component integration.

However, even large closed source BI vendors that acquired multiple BI components over the years still struggle with full, seamless component integration. So what chance do Open Source BI projects and vendors – with independent leadership structures and often varying priorities – have of integrating highly critical BI components such as metadata, data access layers, GUIs, common prompting/sorting/ranking/filtering approaches, drill-throughs from one product to another, etc.? Today, close to none. A potential consolidation of such products and technologies under one roof, however, can indeed create a much-needed critical mass and give these individual components a chance to grow into large-enterprise-quality BI solutions.

Read more

BI Nirvana

Boris Evelson

I had an amazing client experience the other day. I had searched long and hard for a client with a flawless, perfect, 100% efficient and effective BI environment and applications. My criteria were tough, and that's why it took me so long (I've been searching for as long as I've been in the BI business, almost 30 years). These applications had to be plug & play, involve little or no manual setup, be 100% automated, incorporate all relevant data and content, and allow all end users to self-serve every single BI requirement. Imagine my utter and absolute amazement when I finally stumbled on one.

The most remarkable part was that this was a very typical large enterprise. It grew over many years through multiple acquisitions and, as a result, had many separate and disconnected front- and back-office applications running on various platforms and architectures. Its senior management suffered from a typical myopic attitude, mostly based on immediate gratification and caused by a compensation structure that rewarded only immediate tangible results and did not put significant weight and emphasis on long-term goals and plans. Sound familiar? If you haven't worked for one of these enterprises, the color of the sky in your world is probably purple.

Read more

Facing Microsoft Licensing Decisions? Bridge The Gap Between Operations And Sourcing

Christopher Voce

Whether to sign or renew an Enterprise Agreement with Microsoft is a sticky question that many organizations face. For many companies, spend on Microsoft licensing – whether through Enterprise Agreements or Select License agreements – can be a significant portion of the IT budget. Some of you may be directly responsible for negotiating the agreement, but many more of you work with sourcing professionals who negotiate the agreements with Microsoft or resellers. The increasing complexity of Microsoft licensing decisions requires more heads at the table. For Infrastructure and Operations pros, your voice is critical in the decision process. Certainly, the current state of your Microsoft products and your future rollouts over the life of the agreement (and beyond) play a role, but there are other factors to consider. Some of the other key questions you’ll face include:

Read more

The Cloud, Not Deduplication Alone, Will Lead To The Demise Of Tape

Stephanie Balaouras

On Friday, Iron Mountain and Microsoft announced a new partnership. Customers of Microsoft's backup offering, Data Protection Manager (DPM) 2007 service pack 1, can electronically vault redundant copies of their data to Iron Mountain's CloudRecovery service. This is welcome news for DPM customers. Customers will continue to back up locally to disk for instant restore, but rather than vault data to tape and physically transport tape to an offsite storage service provider, they will vault data over the Internet to Iron Mountain. For disaster recovery purposes and long-term retention, you need this redundant copy of your data offsite. By eliminating the physical tape transport, you eliminate the risk of lost or stolen tapes as well as the need to deploy some kind of tape encryption solution. Microsoft DPM hasn't taken the backup world by storm since its introduction in 2005, but each subsequent release has added critical features and application support. Additionally, because it is often bundled with Microsoft System Center, I expect adoption will increase among small and medium businesses (SMBs) and small and medium enterprises (SMEs).

Read more