Over the past several months, I've been receiving a lot of questions about replication for continuity and recovery. One thing I've noticed, however, is that there is a lot of confusion around replication and its uses. To combat this, my colleague Stephanie Balaouras and I recently put out a research report called "The Past, Present, And Future Of Replication" where we outlined the different types of replication and their use cases. In addition to that, I thought it would be good to get some of the misconceptions about replication cleared up:
Myth: Replication is the same as high availability.
Reality: Replication can help enable high availability and disaster recovery, but it is not a solution in and of itself. In the case of an outage, simply having another copy of the data at an alternate site won't help if you don't have a failover strategy or solution. Some host-based replication products do come with integrated failover and failback capabilities.
Myth: Replication is too expensive.
Reality: It's true that array-based replication has traditionally been expensive because it requires like-to-like storage and additional licensing fees. However, two factors have mitigated this expense: 1) several storage vendors no longer charge an extra licensing fee for replication; and 2) several alternatives to array-based replication allow you to use heterogeneous storage and come at a significantly lower acquisition cost. Replication products fall into one of four categories (roughly from most to least expensive):
There has been an interesting PR battle in Washington over the last few weeks about the number of massive regulations still on the administration's agenda. House Minority Leader John Boehner wrote a memo to President Obama citing a list of 191 proposed rules expected to have a more than $100 million impact on the economy (each!) and asking for clarification on the number of these pending rules that would surpass the $1 billion mark. The acting head of the Office of Management and Budget responded, saying that the number of "economically significant bills" passed last year actually represented a downward trend, and the current number on the agenda is more like 13.
For those of you wanting a little more clarification, you can search through the OMB's Unified Agenda and Regulatory Plan by economic significance, key terms, entities affected, and other criteria. Making sense of all of these proposed rules will take time, but it will help you get an idea of issues that your organization may have to face in the near future.
Coincidentally, my latest report, The Regulatory Intelligence Battlefield Heats Up, went live yesterday. In this paper, I offer an overview of different available resources to keep up with new and changing regulations as well as relevant legal guidance.
I get a lot of inquiries asking me to name the best CRM professional services providers (PSPs). Business and IT managers worry about the cost and risk of failure when engaging consultants and systems integrators to improve the performance of their mission-critical customer-facing business processes.
Organizations entrust PSPs with important tasks – not just “screwing in software.” In a survey of 119 companies that I conducted a few years ago, nearly 28% used PSPs to help develop their strategic vision for CRM, 42% used PSPs to define business objectives for CRM, 44% to align business processes with the CRM strategy, and 56% to define the conceptual design for CRM technology solutions. PSPs were used by 60% of enterprises to establish detailed design requirements and by 64% to implement CRM solutions.
However, there are significant risks in working with CRM consulting or systems integration providers. In the same study, I found that four out of 10 companies would not recommend their CRM PSP to others after the work was completed.
I recommend that you use 12 evaluation criteria to increase your odds of success. How well does your CRM PSP stack up against these standards?
Demonstrable knowledge of the technical characteristics of CRM applications. This is the most important of the 12 criteria. Business and IT executives expect their PSP to bring an expert understanding of the specific CRM applications and related technologies to the projects they are engaged to support.
Demonstrable knowledge of the requirements of the industry. Organizations expect their CRM PSP to have a deep knowledge of the business challenges in their industry and insight into unique sector characteristics and to be familiar with industry jargon and culture.
This morning Intel announced plans to buy security vendor McAfee for $7.7 billion, valuing the company at a 60% premium over its market cap as of yesterday's close. The valuation is about 5 times revenue for the trailing four quarters, which is about typical for M&A deals in the security industry, and it suggests that both parties negotiated well. The price is not so high that it makes Intel look like Daddy Warbucks, but not so low that it looks like McAfee was desperate to sell.
But of course “a not so high price” is all relative. Nearly $8 billion is a lot of money. What on earth does Intel expect to get for all of the money it is spending on McAfee? I’ve been scratching my head over this, and despite McAfee CTO George Kurtz’s helpful blog post, I am still struggling to figure this one out. Let’s look at some of the stated rationales for the deal:
Forrester's latest forecast for the technology economy is bullish, which by extension means good news for providers of software and services focused on improving corporate sustainability.
In our new outlook for IT spending by businesses and governments, we estimate that the market will hit $1.58 trillion in 2010, up almost 8 percent from the depressed 2009 level, and grow by a further 8.4 percent to $1.71 trillion in 2011 (global purchases expressed in U.S. dollars). U.S. government data about the overall economy, and tech vendors' Q1-Q2 financial reports, buttress our expectation that IT spending will grow at more than double the rate of the overall economy in 2010-11 and even beyond. See the details in Andrew Bartels's latest report here.
We expect that some of the prime beneficiaries of this positive outlook for IT spending will be those services and software suppliers that are focused on helping clients improve their sustainability posture. In particular, we are very positive on the outlook for sustainability consulting, and for enterprise carbon and energy management (ECEM) software.
Our research team is working now on reports that will update our outlook and spending forecasts for these two exciting markets. As we work with clients in enterprise IT organizations, it's clear that the "green IT" of yesterday is becoming the "IT for green" of tomorrow; that is, IT organizations and infrastructure are increasingly being deployed to meet the corporatewide sustainability challenge, not just improving IT's own energy efficiency and CO2 footprint.
Business users often view IT SVM as a bottleneck to getting new technologies that will make them more personally and professionally productive and that also will help ensure their firm’s competitiveness. Affordable mobile technologies like smartphones (Research In Motion’s BlackBerry, Apple’s iPhone, and Google’s Android are the most popular platforms), cellular data air cards, network-connected PCs and tablets, etc. make it a lot easier (and hence more tempting than ever) to bypass IT’s seemingly archaic and slow-moving sourcing practices.
As a result, employees are increasingly bringing their own technology into the workplace. Forrester calls this trend “tech populism,” and it’s quickly gaining momentum. It might be a personal laptop sitting on the desk next to the corporate PC, or a personal smartphone that’s loaded up with self-purchased applications and freeware, either or both connecting to the employer’s guest WiFi Internet connection.
Additionally, more and more traveling employees are leaving their business laptops at the office or at home. On one- or two-day trips, they rely instead on a smartphone for simple email access, complemented by personal USB thumb drives holding downloaded reference and presentation documents, which get viewed (and possibly manipulated or transferred between multiple noncorporate devices) on a hotel, public kiosk, customer, colleague, or partner PC. So much for IT’s security policy enforcement! Once a file or document can be downloaded onto a personal external drive, what happens to it afterward is at the discretion of the employee, who, after all, is usually just trying to be more productive with hardware and software tools that are easier to use than those provided by the company.
Without a doubt, the tech industry’s new economics are creating major tumult in the marketplace. “Services,” not products, and “in the cloud,” not on the computer, are just two of the major trends forcing IT services providers to continually predict future market demand and adjust strategy accordingly. More than ever, it’s imperative to understand where firms will rely on third-party providers in the coming year . . . and also where they’ll increase spend.
As you may know, Forrester fields a 20-minute Web survey each year to commercial buyers of enterprise IT services as part of Forrester’s Forrsights for Business Technology (formerly named “Business Data Services”). This year, we’ll continue to collect responses from IT decision-makers at companies with 1,000 or more employees across the US, Canada, France, the UK, and Germany. As we’re designing the survey now, our commitment to strategists is that we’ll write the questions with your underlying need in mind: to predict and quantify tech industry growth and disruption.
Here are a few new questions you’ll be able to answer with our 2010 data insights:
Which areas of innovation are turned into business- or IT-funded projects? . . . How mature is vendor governance/oversight compared with three years ago? . . . How are firms dealing with the rising influence of Digital Natives? . . . What are the plans, strategies, and barriers for moving from a staff augmentation to a fully managed services model? . . . How will an uptick in selective sourcing strategies translate to you as the service provider tailoring your go-to-market plans according to current customer challenges?
And, of course, we’ll continue to ask traditional questions around services plans, budgets, and preferred vendors.
In a dramatic move this morning, Dell announced its intention to purchase 3PAR, an innovative enterprise storage vendor, for $1.15 billion in cash. If there were any doubts remaining about Dell’s commitment to be a force in the storage market alongside EMC, IBM, HDS, et al., this deal should put them to rest. Dell acquired the iSCSI storage array vendor EqualLogic in November of 2007, clustered NAS vendor Exanet in February of this year, and most recently data deduplication vendor Ocarina this past July, as well as putting together a partnership with object storage vendor Caringo. Clearly this is a significant list of deals, but the strategy was incomplete without an enterprise-class primary storage system of Dell’s own. 3PAR, whose products generally compete with high-end systems in terms of performance and availability, will give Dell the ammunition it needs to go head-to-head with the big guys.
Dell has cultivated a relatively successful partnership with EMC for mid-range and enterprise storage for some years, but in spite of Dell’s claim to be invested in that relationship going forward, this deal clearly puts pressure on it. Initially, there is a gap between the SMB-focused EqualLogic products and the high-end offerings from 3PAR, which will be filled by the Clariion products from EMC; but in the long run, Dell is likely to be motivated to move out of EMC’s shadow and build its storage brand on proprietary products based on these acquisitions.
This acquisition ends a good deal of speculation about who might buy 3PAR, with HP the main alternative suitor. HP now faces a build-or-buy decision as it continues to try to redefine itself in storage amidst a patchwork of the aging EVA platform, partner technology from Hitachi on the high end, and acquisitions in iSCSI and clustered file storage, but no clearly defined long-term vision or anchor technology.
It’s probably fair to say that the computer community is obsessed with speed. After all, people buy computers to solve problems, and generally the faster the computer, the faster the problem gets solved. The earliest benchmark that I have seen was published in “High-Speed Computing Devices” (Engineering Research Associates, McGraw-Hill, 1950). It cites the Marchant desktop calculator as achieving a best-in-class result of 1,350 digits per minute for addition, and the threshold problems then were figuring out how to break down Newton-Raphson equation solvers for maximum computational efficiency. And so the race begins…
Not much has changed since 1950. While our appetites are now expressed in GFLOPS per CPU and TFLOPS per system, users continue to push for escalation of performance in numerically intensive problems. Just as we settled down to a relatively predictable performance model with standard CPUs and cores glued into servers and aggregated into distributed computing architectures of various flavors, along came the notion of attached processors. First appearing in the 1960s and 1970s as attached mainframe vector processors and attached floating-point array processors for minicomputers, attached processors have always had a devoted and vocal minority of supporters within the industry. My own brush with them was as a developer using a Floating Point Systems array processor attached to a 32-bit minicomputer to speed up a nuclear reactor core power monitoring application. When all was said and done, the 50X performance advantage of the FPS box had decreased to about 3.5X for the total application. Not bad, but a defeat of expectations. Subsequent brushes with attempts to integrate DSPs with workstations left me a bit jaundiced about the future of attached processors as general-purpose accelerators.
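The gap between a 50X kernel speedup and a roughly 3.5X application result is Amdahl's law at work: only the fraction of runtime that actually runs on the attached processor gets faster. A minimal sketch of that arithmetic (the ~73% accelerated fraction below is back-solved from the 50X and 3.5X figures, not a number from the original project):

```python
def amdahl_speedup(accel_fraction: float, accel_factor: float) -> float:
    """Overall speedup when only part of a workload is accelerated.

    accel_fraction: share of original runtime that the accelerator handles
    accel_factor:   speedup applied to that share
    """
    return 1.0 / ((1.0 - accel_fraction) + accel_fraction / accel_factor)

# If ~73% of the application's runtime ran on the array processor,
# a 50X kernel speedup yields only about 3.5X overall:
print(round(amdahl_speedup(0.73, 50.0), 1))  # → 3.5

# Even an infinitely fast accelerator is capped by the serial remainder:
# with 27% of the work left on the host, the ceiling is 1/0.27 ≈ 3.7X.
```

The same ceiling explains why later attached processors (and today's GPUs) disappoint whenever the host-side portion of an application isn't shrunk along with the kernels.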
Forrester has received more than 1,000 inquiries on SaaS and cloud services to date in 2010. With SaaS gaining maturity and even becoming the more common way to deploy software in some categories, firms are increasingly opting for SaaS solutions in place of packaged apps.
With the growing uptake of SaaS, Forrester has seen a change in the nature of questions about SaaS. Firms are not only asking basic questions around the whens and whys of SaaS but also more strategic questions around SaaS sourcing and vendor management, as well as how to set up the organizational structure and hire the right skills to succeed with SaaS deployments.
Stay tuned for the full analysis of Forrester's SaaS inquiry data for the first half of 2010, to be published shortly.
Also, for anyone interested in a more in-depth analysis of SaaS and cloud services trends and best practices, we are hosting our first full-day workshop on the topic in Forrester’s Cambridge, Mass., headquarters on September 16. For more details about this event, please click here.
Please share your thoughts and connect with me on Twitter @lizherbert.