Does The Good Old 80/20 Rule Work For Estimating BI Costs?

Boris Evelson

I get tons of questions about "how much it costs to develop an analytical application." Alas, as most of us know, the only real answer to that question is "it depends." It depends on the scope, requirements, technology used, corporate culture, and at least a few dozen more dimensions. However, at the risk of huge oversimplification, we can often apply the good old 80/20 rule as follows:

Components

  • ~20% for software, hardware, and other data center and communications infrastructure
  • ~80% for full time employees, outside services (analysis, design, coding, testing, integration, implementation, etc), new processes, new initiatives (governance, change management, training)

Initial software costs (~80%) vs. ongoing software license maintenance costs (~20% per year)

Direct (~20%) vs. indirect costs (~80%). Here are some examples (a rough numeric sketch of how these splits combine follows the list):

Direct ~20%

  • Data integration for reporting and analysis
  • Data cleansing processes for reporting and analysis
  • Reporting and analytical databases such as data warehouses and data marts
  • Reporting / querying / dashboards
  • OLAP (Online Analytical Processing)
  • Analytical MDM (Master Data Management)
  • Analytical metadata management
  • Data mining, predictive analytics
  • BI-specific SOA (Service-Oriented Architecture) or other types of EAI (Enterprise Application Integration)
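
To make the arithmetic concrete, here is a back-of-the-envelope Python sketch of how the splits above combine. The ~80/20 ratios come from this post; the $1M input figure is an invented example, not a benchmark.

```python
# Back-of-the-envelope BI cost estimator based on the ~80/20 splits above.
# The ratios are rules of thumb from this post; the input is a made-up example.

def estimate_bi_costs(software_and_infra: float) -> dict:
    """Given the ~20% software/hardware/infrastructure slice,
    back into the rest of the BI program budget."""
    total = software_and_infra / 0.20               # infra is ~20% of the total
    return {
        "total_program": total,
        "software_and_infra": software_and_infra,   # ~20%
        "people_and_process": total * 0.80,         # FTEs, services, governance, training
        "annual_sw_maintenance": software_and_infra * 0.20,  # ~20% of software per year
        "direct": total * 0.20,                     # BI-specific work (DW, ETL, reporting)
        "indirect": total * 0.80,                   # everything surrounding it
    }

if __name__ == "__main__":
    for item, cost in estimate_bi_costs(1_000_000).items():
        print(f"{item:>22}: ${cost:,.0f}")
```

The point of the rule falls out immediately: a $1M software and infrastructure bill implies a roughly $5M total program, with people, process, and organizational change dominating the spend.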
Read more

IBM Is Innovating In Servers Again

James Staten

It’s good to see IBM has returned to the world of x86 server innovation with its latest eX5 line of servers announced this week.

Read more

Intel vs. AMD still isn’t a fair fight

James Staten

There are more hindrances to AMD's ability to penetrate the market with its Opteron CPUs, and Intel's not at fault this time. In an earlier blog post on the AMD-Intel settlement I brought up an example of a type of incompatibility that exists between the two CPU makers that isn't covered by the settlement – live migration of virtual machines. There's more to this story.
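
To make the live-migration example concrete: migrating a running VM generally requires the destination host to expose at least the CPU features the guest was started with, and Intel and AMD expose different virtualization extensions and instruction sets. A simplified sketch of the kind of check a hypervisor performs follows; the feature names and rules are illustrative assumptions, not any vendor's actual code.

```python
# Simplified sketch of the feature-set check behind live migration.
# Flags and rules are illustrative; real hypervisors compare CPUID bits,
# vendor IDs, and microarchitecture-specific state.

SOURCE_HOST = {"vendor": "GenuineIntel", "features": {"sse4_2", "vmx", "aes"}}
DEST_HOST = {"vendor": "AuthenticAMD", "features": {"sse4a", "svm", "aes"}}

def can_live_migrate(src: dict, dst: dict) -> bool:
    # A vendor mismatch alone usually blocks migration: the virtualization
    # extensions (Intel VT-x vs. AMD-V) and CPU flags differ between makers.
    if src["vendor"] != dst["vendor"]:
        return False
    # The destination must expose every feature the running VM may have used.
    return src["features"] <= dst["features"]

print(can_live_migrate(SOURCE_HOST, DEST_HOST))  # False: Intel -> AMD
```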

Read more

Cloud Is Defined, Now Stop the Cloudwashing

James Staten

This blog post is a response to an article by Alex Williams on ReadWriteWeb. Thanks for the shout out, Alex, and for bringing more attention to the contentious issue of cloud computing definitions. While Forrester research reports are created exclusively for our clients, our definition is freely available:

A standardized IT capability (services, software, or infrastructure) delivered via Internet technologies in a pay-per-use, self-service way.
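
One way to use the definition is as a three-part test for cloudwashing. A toy sketch, where the criteria are simply a restatement of the definition above and the example offering is invented:

```python
# Toy cloudwashing test: an offering qualifies as cloud only if it meets
# all three criteria of the definition above. The example input is invented.

def is_cloud(offering: dict) -> bool:
    return (
        offering.get("standardized", False)     # not a bespoke, one-off build
        and offering.get("pay_per_use", False)  # metered, not a fixed contract
        and offering.get("self_service", False) # provisioned without a ticket
    )

hosted_vms_on_contract = {"standardized": True, "pay_per_use": False, "self_service": False}
print(is_cloud(hosted_vms_on_contract))  # False: that's hosting, not cloud
```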

Read more

Cloud Computing Belongs On Your 3-Year Roadmap

James Staten

Welcome to the fourth quarter of 2009 – what we at Forrester call planning season for most IT departments. In a typical year, this is the time when infrastructure and operations professionals spend lots of cycles burning through what remains of the 2009 budget and building plans for investment in 2010, with the hope of gathering a bit more budget than last year. Of course, this is no ordinary year. Economists and financial prognosticators, like our own Andrew Bartels, are predicting a long recovery from the recession and further delays in IT spending. That means another year of your infrastructure getting older. There are two ways of looking at this problem, and thus at your budget proposals for 2010:

Read more

The Role Of IT Operations In Archiving

Stephanie Balaouras

Yesterday IBM announced the availability of its new IBM Information Archive Appliance, which replaces IBM's DR550. The new appliance has significantly increased scale and performance because it's built on IBM's General Parallel File System (GPFS), offers more interfaces (NAS and an API to Tivoli Storage Manager), and accepts information from multiple sources – IBM content management and archiving software and, eventually, third-party software. Tivoli Storage Manager (TSM) is embedded in the appliance to provide automated tiered disk and tape storage as well as block-level deduplication. TSM's block-level deduplication will reduce storage capacity requirements, and its disk and tape management capabilities will let IT continue to leverage tape for long-term data retention. All these appliance subcomponents are transparent to the IT end user who manages the appliance – he or she just sees one console for defining collections and the retention policies for those collections.
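
To picture that administrative model – one console where collections and their retention policies are defined – here is a hypothetical Python sketch. The class names, fields, and rules are my own illustration, not IBM's actual interface.

```python
# Hypothetical model of "define collections and retention policies."
# Names and fields are illustrative, not IBM's actual API.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RetentionPolicy:
    min_retention: timedelta    # how long objects must be kept
    purge_when_expired: bool    # whether expired objects may be deleted

@dataclass
class Collection:
    name: str
    policy: RetentionPolicy

    def can_delete(self, ingested: date, today: date) -> bool:
        expired = today >= ingested + self.policy.min_retention
        return expired and self.policy.purge_when_expired

emails = Collection("email-archive", RetentionPolicy(timedelta(days=7 * 365), True))
print(emails.can_delete(date(2001, 1, 1), date(2009, 10, 8)))  # True: past 7 years
```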

Read more

When is a VMware cloud a vCloud? Let the API wars begin

James Staten

Attention, enterprises – pop quiz: if your favorite hosting provider launches a cloud service that supports VMware vSphere and is part of the VMware vCloud initiative, is it providing you with the rich vCloud functionality VMware is touting at VMworld this week?

Read more

Xen.org: Stop the Forking. Forrester: About Time

James Staten

Xen.org, the open source community behind the leading IaaS cloud computing hypervisor, finally made a bold move today by stepping up to deliver a complete open source virtual infrastructure for cloud platforms. Prior to this release, Xen.org had been content to manage and maintain the core Xen hypervisor and let its partners build solutions around it. The problem with this approach was that while the hypervisor itself was compatible across these solutions, the infrastructure and how you managed it were not.
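
A toy illustration of that gap: two providers can run the same Xen hypervisor underneath yet expose incompatible management layers on top, so automation written for one breaks on the other. The provider classes and method names below are invented for illustration.

```python
# Toy illustration: identical Xen hypervisor underneath, different
# (invented) management APIs on top, so tooling is not portable.

class VendorACloud:
    hypervisor = "xen"
    def launch_instance(self, image_id: str) -> str:
        return f"vendorA-vm-{image_id}"

class VendorBCloud:
    hypervisor = "xen"
    def create_vm(self, template: str, pool: str) -> str:
        return f"vendorB-vm-{template}@{pool}"

# A script written for vendor A fails on vendor B even though the guests
# themselves would run unchanged on either Xen host:
for cloud in (VendorACloud(), VendorBCloud()):
    try:
        print(cloud.launch_instance("web-01"))
    except AttributeError:
        print(f"{type(cloud).__name__}: management API mismatch")
```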

Read more

Cloud DR Services Are Real

Stephanie Balaouras

There is a lot of hype surrounding cloud, and I'm usually not one to join in. But in the case of cloud-based backup and disaster recovery services (I'm trying to use the term "IT service continuity," but it hasn't caught on yet), these services are available today, they address major pain points in IT operations, and organizations of all sizes – not just small and medium businesses – can leverage them.

Storage-as-a-service is relatively new. Today its main value proposition is as a cloud target for on-premise deployments of backup and archiving software. If you need to retain data for extended periods of time (a year or more in most cases), tape is still the more cost-effective option given its low capital acquisition cost and removability. But if you have long-term data retention needs and want to eliminate tape, that's where a cloud storage target comes in: electronically vault that data to a storage-as-a-service provider who can store it at cents per GB. You just can't beat the economies of scale these providers are able to achieve.

If you're a small business without the staff to implement and manage a backup solution, or an enterprise looking for a PC backup or remote-office backup solution, I think it's worthwhile to compare the three-year total cost of ownership of an on-premise solution versus backup-as-a-service.
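
A back-of-the-envelope version of that comparison might look like the sketch below. Every figure is a placeholder assumption to be replaced with real quotes and labor rates; the structure, not the numbers, is the point.

```python
# Back-of-the-envelope 3-year TCO: on-premise backup vs. backup-as-a-service.
# All figures are placeholder assumptions, not vendor pricing.

YEARS = 3
DATA_GB = 2_000                       # protected data set (assumed)

def on_premise_tco() -> float:
    hardware_and_software = 25_000    # server, disk target, licenses (assumed)
    annual_maintenance = 5_000        # support contracts (assumed)
    annual_admin_labor = 12_000       # fraction of an FTE (assumed)
    return hardware_and_software + YEARS * (annual_maintenance + annual_admin_labor)

def backup_as_a_service_tco() -> float:
    per_gb_month = 0.50               # service fee per GB per month (assumed)
    return DATA_GB * per_gb_month * 12 * YEARS

print(f"On-premise, 3 yr:    ${on_premise_tco():,.0f}")
print(f"As-a-service, 3 yr:  ${backup_as_a_service_tco():,.0f}")
```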

Read more

Deduplication Market Undergoes Rapid Changes

Stephanie Balaouras

In May, I blogged about NetApp's announced acquisition of deduplication pioneer Data Domain. The announcement triggered an unsolicited counter-offer from EMC, followed by another counter from NetApp. But after a month of offers, counter-offers, and regulatory reviews, EMC ultimately outbid NetApp with an all-cash offer of $2.1 billion. I believe that Data Domain would have been a better fit in the current NetApp portfolio; it would have been easier for NetApp to reposition its current VTL for large enterprises that still planned to leverage tape. It's also said that more than half of Data Domain's current employees are former NetApp employees, so there would have been a clear cultural fit as well.

For $2.1 billion, EMC gets Data Domain's more than 3,000 customers and 8,000 installs, but it also gets a product that, in my opinion, overlaps with its current Quantum-based disk libraries, the DL1500 and DL3000. In Forrester inquiries and current consulting engagements, Data Domain regularly comes up against the EMC DL1500 and DL3000. EMC will need to quickly explain to customers how it plans to position its new Data Domain offerings alongside its current DL family – both the Quantum- and FalconStor-based DLs – as well as its broader data protection portfolio, which includes NetWorker and Avamar, both of which also offer deduplication.

Read more
