The Forrester Wave™: BPM Suites, Q3 2010 — BPM Suites Deliver Broad Support For Business-Led Process Transformation

Clay Richardson

Over the past decade, BPM suites promised to put the business in the driver’s seat for delivering process improvement to the enterprise. However, most of these promises fell flat, relegating business stakeholders to the role of backseat drivers, directing IT on how best to steer process improvement.

In the latest update to our BPM suites Forrester Wave report, Forrester evaluated 11 leading vendors against 148 product feature, platform, and market presence criteria. The Forrester Wave provides a head-to-head comparison of which BPM suites best support the needs of comprehensive process improvement programs that demand tight collaboration and coordination across business and IT stakeholders. Here's a sneak peek at the findings from our new report, "The Forrester Wave: Business Process Management Suites, Q3 2010."

  • Time-to-value and fit-to-purpose are top priorities. Process professionals are searching for ways to trim the fat from bloated BPM initiatives and constantly ask about tools and best practices for making BPM leaner and meaner. Leading vendors — like Pega and Appian — are responding to the need for leaner and more fit-to-purpose BPM suites by providing targeted solution frameworks, embedding agile project management features, and delivering highly customizable end user work environments. 
     
Read more

Will Plug-In Hybrids Change The Data Center?

Richard Fichera

In a recent discussion with a group of infrastructure architects, power architecture, especially UPS engineering, was on the table as a topic. There was general agreement that UPS systems are a necessary evil, cumbersome and expensive beasts to put into a DC, and a lot of speculation about alternatives. There was also consensus that the goal was to develop a solution that would be more granular to install and deploy, and thus allow easier, ad hoc decisions about which resources to protect, and agreement that battery technologies and current UPS architectures were not optimal for this kind of solution.

So what if someone were to suddenly expand battery technology R&D investment by a factor of maybe 100x, expand high-capacity battery production by a giant factor, and drive prices down precipitously? That’s a tall order for today’s UPS industry, but it’s happening now courtesy of the auto industry and the anticipated wave of plug-in hybrid cars. While batteries for cars and batteries for computers certainly have their differences in terms of depth and frequency of charge/discharge cycles, packaging, lifespan, etc., there is little doubt that investments in dense and powerful automotive batteries and power management technology will bleed through into the data center. Throw in recent developments in high-charge capacitors (referred to in the media as “super capacitors”), which bridge the impedance mismatch between spike demands and a chemical battery’s dislike of sudden state changes, and you have all the foundational ingredients for a major transformation in the way we think about supplying backup power to our data center components.
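
To put the opportunity in rough numbers, here is a minimal back-of-envelope sketch. The 16 kWh pack size and the 10 kW rack load are illustrative assumptions (roughly plug-in-hybrid class and a moderately dense rack), not figures from any vendor:

```python
# Rough ride-through arithmetic: how long could one automotive-class battery pack
# carry a single data center rack? All inputs below are illustrative assumptions.

pack_capacity_kwh = 16.0   # assumed plug-in-hybrid-class pack capacity
usable_fraction = 0.8      # assume only part of the pack is usable per discharge
rack_load_kw = 10.0        # assumed average draw for one rack

ride_through_hours = pack_capacity_kwh * usable_fraction / rack_load_kw
print(f"Ride-through per pack: {ride_through_hours:.1f} hours "
      f"(~{ride_through_hours * 60:.0f} minutes)")
```

Even under these conservative assumptions, a single pack covers far more than the few minutes needed to ride out a transfer to generator power, which hints at the kind of granular, per-resource protection the architects were describing.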

Read more

Think You Know About All The Big US Government Regulations Coming Up? All 191 Of Them?

Chris McClean

There has been an interesting PR battle in Washington over the last few weeks about the number of massive regulations still on the administration's agenda. House Minority Leader John Boehner wrote a memo to President Obama citing a list of 191 proposed rules expected to have a more than $100 million impact on the economy (each!) and asking for clarification on the number of these pending rules that would surpass the $1 billion mark. The acting head of the Office of Management and Budget responded, saying that the number of "economically significant bills" passed last year actually represented a downward trend, and the current number on the agenda is more like 13.

For those of you wanting a little more clarification, you can search through the OMB's Unified Agenda and Regulatory Plan by economic significance, key terms, entities affected, and other criteria. Making sense of all of these proposed rules will take time, but it will help you get an idea of issues that your organization may have to face in the near future.

Coincidentally, my latest report, "The Regulatory Intelligence Battlefield Heats Up," went live yesterday. In this paper, I offer an overview of the different resources available for keeping up with new and changing regulations, as well as relevant legal guidance.

Read more

IT Spending Rebound Will Mean More Spending On Sustainability Software And Services

Chris Mines

Forrester's latest forecast for the technology economy is bullish, which by extension means good news for providers of software and services focused on improving corporate sustainability.

In our new outlook for IT spending by businesses and governments, we estimate that the market will hit $1.58 trillion in 2010, up almost 8 percent from the depressed 2009 level, and grow by a further 8.4 percent to $1.71 trillion in 2011 (global purchases expressed in U.S. dollars). U.S. government data about the overall economy, and tech vendors' Q1-Q2 financial reports, buttress our expectation that IT spending will grow at more than double the rate of the overall economy in 2010-11 and even beyond. See the details in Andrew Bartels's latest report here.
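
For readers who want to sanity-check those figures, here is a minimal sketch of the arithmetic; note that the 2009 base is derived from the numbers above rather than quoted directly from the forecast:

```python
# Quick check of the IT spending figures cited above (all in trillions of US dollars).
spend_2010 = 1.58      # 2010 market estimate
growth_2010 = 0.08     # "up almost 8 percent" from the depressed 2009 level
growth_2011 = 0.084    # forecast growth rate into 2011

implied_2009 = spend_2010 / (1 + growth_2010)
forecast_2011 = spend_2010 * (1 + growth_2011)

print(f"Implied 2009 market:  ${implied_2009:.2f} trillion")   # roughly $1.46 trillion
print(f"Forecast 2011 market: ${forecast_2011:.2f} trillion")  # roughly $1.71 trillion
```

The 2011 figure reproduces the $1.71 trillion cited above, and the implied 2009 base comes out to roughly $1.46 trillion.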

We expect that some of the prime beneficiaries of this positive outlook for IT spending will be those services and software suppliers that are focused on helping clients improve their sustainability posture. In particular, we are very positive on the outlook for sustainability consulting, and for enterprise carbon and energy management (ECEM) software.

Our research team is working now on reports that will update our outlook and spending forecasts for these two exciting markets. As we work with clients in enterprise IT organizations, it's clear that the "green IT" of yesterday is becoming the "IT for green" of tomorrow; that is, IT organizations and infrastructure are increasingly being deployed to meet the corporatewide sustainability challenge, not just to improve IT's own energy efficiency and CO2 footprint.

Read more

Forrester’s Forrsights For Business Technology: We’ve Launched Our 2010 Services Survey

John McCarthy

Without a doubt, the tech industry’s new economics are creating major tumult in the marketplace. “Services,” not products, and “in the cloud,” not on the computer, are just two of the major trends forcing IT services providers to continually predict future market demand and adjust strategy accordingly. More than ever, it’s imperative to understand where firms will rely on third-party providers in the coming year . . . and also where they’ll increase spend.

As you may know, Forrester fields a 20-minute Web survey each year to commercial buyers of enterprise IT services as part of Forrester’s Forrsights for Business Technology (formerly named “Business Data Services”). This year, we’ll continue to collect responses from IT decision-makers at companies with 1,000 or more employees across the US, Canada, France, the UK, and Germany. As we’re designing the survey now, our commitment to strategists is that we’ll write the questions with your underlying need in mind: to predict and quantify tech industry growth and disruption. 

Here are a few new questions you’ll be able to answer with our 2010 data insights:

  • Which areas of innovation are turned into business- or IT-funded projects?
  • How mature is vendor governance/oversight compared with three years ago?
  • How are firms dealing with the rising influence of Digital Natives?
  • What are the plans, strategies, and barriers for moving from a staff augmentation model to a fully managed services model?
  • How will an uptick in selective sourcing strategies affect how you, as a service provider, tailor your go-to-market plans to current customer challenges?

And, of course, we’ll continue to ask traditional questions around services plans, budgets, and preferred vendors. 

Read more

The Need For Speed – GPUs Emerge As Mainstream Accelerators

Richard Fichera

It’s probably fair to say that the computer community is obsessed with speed. After all, people buy computers to solve problems, and generally the faster the computer, the faster the problem gets solved. The earliest benchmark that I have seen is published in “High-Speed Computing Devices” (Engineering Research Associates, McGraw-Hill, 1950). It cites the Marchant desktop calculator as achieving a best-in-class result of 1,350 digits per minute for addition, and the threshold problems then were figuring out how to break down Newton-Raphson equation solvers for maximum computational efficiency. And so the race begins…

Not much has changed since 1950. While our appetites are now expressed in GFLOPS per CPU and TFLOPS per system, users continue to push for escalating performance on numerically intensive problems. Just as we settled down to a relatively predictable performance model, with standard CPUs and cores glued into servers and aggregated into distributed computing architectures of various flavors, along came the notion of attached processors. First appearing in the 1960s and 1970s as attached mainframe vector processors and attached floating-point array processors for minicomputers, attached processors have always had devoted and vocal minority support within the industry. My own brush with them was as a developer using a Floating Point Systems array processor attached to a 32-bit minicomputer to speed up a nuclear reactor core power monitoring application. When all was said and done, the 50X performance advantage of the FPS box had decreased to about 3.5X for the total application. Not bad, but a defeat of expectations. Subsequent brushes with attempts to integrate DSPs with workstations left me a bit jaundiced about the future of attached processors as general-purpose accelerators.
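
That 50X-to-3.5X collapse is classic Amdahl's Law at work: only part of the application's runtime actually ran on the array processor. As a minimal sketch, the fraction below is back-calculated from the figures above rather than something measured at the time:

```python
# Amdahl's Law back-of-envelope for the attached-processor anecdote above.
# Assumption: the accelerator speeds up only a fraction f of total runtime.

def overall_speedup(f, accel):
    """Overall speedup when fraction f of the work runs accel times faster."""
    return 1.0 / ((1.0 - f) + f / accel)

accel = 50.0      # headline speedup of the FPS box
observed = 3.5    # delivered speedup for the whole application

# Rearranged Amdahl's Law: solve for the accelerable fraction f.
f = (1.0 - 1.0 / observed) / (1.0 - 1.0 / accel)

print(f"Accelerable fraction of runtime: ~{f:.0%}")        # roughly 73%
print(f"Sanity check: {overall_speedup(f, accel):.2f}X")   # roughly 3.50X
```

In other words, roughly a quarter of the runtime never touched the accelerator, and that unaccelerated remainder was enough to cap the delivered speedup far below the headline number.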

Read more

Forrester Inquiries On SaaS Becoming More Sophisticated, More Around Organizational Design And Skill Sets

Liz Herbert

Forrester has received more than 1,000 inquiries on SaaS and cloud services to date in 2010. With SaaS gaining maturity and even becoming the more common way to deploy software in some categories, firms are increasingly opting for SaaS solutions in place of packaged apps.

With the growing uptake of SaaS, Forrester has seen a change in the nature of questions about SaaS. Firms are not only asking the basics about the whens and whys of SaaS; they are also asking more strategic questions about SaaS sourcing and vendor management, as well as how to set up their organizational structure and hire the right skills to succeed with SaaS deployments.

Stay tuned for the full analysis of Forrester's SaaS inquiry data for the first half of 2010, to be published shortly.

Also, for anyone interested in a more in-depth analysis of SaaS and cloud services trends and best practices, we are hosting our first full-day workshop on the topic at Forrester’s Cambridge, Mass., headquarters on September 16. For more details about this event, please click here.

Please share your thoughts and connect with me on Twitter @lizherbert.

Liz

App Development Managers Should Care About Oracle’s Suit Against Google

John R. Rymer

As much fun as the juicy details of the Oracle-Google lawsuit are, the meaning of the suit for enterprise application development managers is, well, philosophical. Aside from sweating over the legal status of your Android phone (if you own one), the lawsuit won’t create drama for your shop. But the long-term implications are serious. Henceforth, Java will be a marching band rather than a jazz collective. Oracle’s action will reduce the independent innovation that has made Java what it is, causing developers to seek new ideas from sources outside of Java. Your Java strategy, as a result, will get more complicated.

A little background: Since the late ’90s, the primary source of Java innovation has been open source projects that either fix Java limitations or provide low-cost alternatives to vendor products. But Java’s position as a wellspring of innovation has been declining in recent years as many Web developers shifted their attention to dynamic languages, pure Web protocols, XML programming, and other new ideas. This trend has been particularly pronounced in the client tier for Web applications, where alternative rich Internet application technologies including Ajax frameworks like Dojo and container-based platforms like Adobe Flash/Flex have replaced client-side Java. Java virtual machines are a foundation of these efforts, but the enterprise and mobile Java platforms are not.

In setting Java’s future course, Oracle had two philosophies to choose from.

Read more

Forrester Groundswell Awards: Less Than Two Weeks Left To Submit Your Entry

Peter Burris

The deadline to submit your entry for the Forrester Groundswell Awards is August 27, just two weeks away. The submissions we received last year, which we wrote up in this Forrester report, provided invaluable assistance to Forrester clients seeking ways to optimize Groundswell-related investments.

We hope you’ll participate this year as well. Josh Bernoff, one of the authors of Groundswell, just posted his advice on how to create a great entry. I have reposted it below for our technology industry clients:

______________________________________________

If you haven't entered yet but plan to, this advice is for you. (If you just want to see other people's entries, click on the items at the left of the Awards site.)

Read more

Preview Of PCI DSS 1.3 – Oops 2.0 – Released

John Kindervag

The PCI Security Standards Council released the summary of changes for the new version of PCI — 2.0.  Merchants, you can quit holding your breath as this document is a yawner — as we’ve long suspected it would be.  In fact, to call it 2.0 is a real stretch as it seems to be filled — as promised by earlier briefings with the PCI SSC — merely with additional guidance and clarifications. Jeff, over at the PCI Guru, has a great review of the summary doc so I won’t try to duplicate his detailed analysis. The most helpful part of the doc is an acknowledgement that more guidance on virtualization — the one function per server stuff — will finally be addressed.

Suffice it to say, it doesn’t look good for all those DLP vendors looking for Santa Compliance to leave them a little gift under the tree this year. I’ve been hearing hopeful rumors (that I assume start within the bowels of DLP vendor marketing departments) that PCI would require DLP in the next version. Looks like it’s going to be a three-year wait to see if Santa will finally stop by their house.

Remember that this is a summary of changes so there’s not that much meat yet. The actual standard will be pre-released early next month with the final standard coming out after the European Community Meeting in October.