Analysts get the benefit (and sometimes the burden) of dozens of briefings per year from hopeful vendors trying to convince us that they are the next big thing. Here’s a typical example of marketing-speak messaging, an amalgam of all the mistakes that will ensure a vendor goes on our "not with a barge pole" list.
“Exvezium is a leading provider of Purchasing and Supply Optimization (PSO) solutions, focused on the automotive, retail, financial services, and government sectors. Customers such as Mutt Publishing, Shania Entertainment, and the Steiner Wig Corporation have chosen Exvezium for its very unique requisition automation, online tendering and award optimization capabilities. Leading analyst firm Milometer classed Exvezium as a Strong Challenger in its Sourcery Square 2009 evaluation.
"The four best practices for implementing PSO are getting executive buy-in, choosing a configurable solution, supporting constraint-based awarding, and maximizing event activity," said CEO, President and Founder Mark Ettingbabble. "Exvezium supports these through our combination of cutting-edge technology and best-in-class services."
What’s wrong with this? Here are my dirty dozen analyst pet hates:
OK, a bit of a stretch here, but I did spend 15 minutes one-on-one with the great hurler last week at the Xerox analyst conference at Citi Field in New York. And thankfully, the Mets were not playing. Tom signed my baseball as I toyed with asking him about Roger Clemens, steroids, and Hall-of-Fame-type questions, and the best I could come up with was simply asking how hard he threw the ball in his prime. He scowled and looked at me as if talking to a 5-year-old and said, "There are three important things about pitching — and yes, velocity is one, but location and the ball's movement are the others, and speed is the least important." So I thought about this, and it occurred to me that we focus on speed — in this case — only because we have radar guns that can measure it well. Movement and location are more difficult to measure, so we just ignore them. And perhaps this is a problem with performance management in business today: We focus not on the more important metrics, but on the ones we can conveniently grasp. Contact center call duration, for example, is much less important than the number of successful customer encounters. So thanks, Tom, for this insight — perhaps we should spend a bit more time taking an outside-in approach to metrics.
I frequently get asked the question of how many databases a DBA typically manages. Over the past five years, I have interviewed hundreds of organizations on this topic, asking them about their ratios and how they improved them. Typically I find that the current industry average is 40 databases to a DBA for large enterprises ($1 billion+ in revenue), with the lowest ratio seen around eight and the highest at 275. So, why this huge variation? There are many factors that I see in customer deployments that contribute to this variation, such as the size of a database, database tools, version of databases, DBA expertise, formalization of database administration, and production versus nonproduction.
This ratio is usually constrained by the total size of all databases a DBA manages, which tends to be around five terabytes per DBA. A terabyte-sized database remains far harder to manage than a 100 GB one; larger databases require extra tuning, backup, recovery, and upgrade effort. In other words, one DBA can effectively manage 25 databases of 200 GB each, or five 1-terabyte databases. And these figures include both production and nonproduction databases.
What factors can help improve the ratio? Cloud, tools, the latest DBMS version (with its automation), and the DBMS product used — SQL Server, Oracle, DB2, MySQL, or Sybase. Although most DBMS vendors have improved manageability over the years, customer feedback suggests that Microsoft SQL Server tends to have the best ratios.
Although you should aim for the 40:1 ratio and the 5-terabyte cap, I believe you should establish your own baseline based on your database inventory and DBA headcount, and use that as the basis for improving the ratio over time.
Carrying on from my thoughts in Part 1: It’s time to start deploying purely standards-based infrastructure outside the data center; data center protocols are only just being created for a converged and virtualized world. With the number of tested and deployed standard protocols available, there’s no excuse for networks to be locked in to a particular vendor's proprietary protocols when standards-based network solutions provide access to compelling volume economics, the flexibility to adopt a much wider array of solutions, and relief from hiring specialized talent to run a science project. Although many organizations understand that standards-based networking gives them the flexibility to choose from the best available solutions at a lower cost of ownership, they still feel trapped. Below are the three top shackles and the keys to unlock them:
Look for the new "Community" tab on the Forrester site. This is your access to a community of like-minded peers. You can use the community to start and participate in discussions, share ideas and experiences, and help guide Forrester Research for your role. The success or failure of this community effort depends largely on you. The analysts will participate, but in this forum they carry less weight than you, the Forrester I&O user. So help us, help your peers, and help yourself make this an active and thriving online community. Some thoughts to get you going: Have any particularly good or bad experiences with products, solutions, or technology? What key enablers are you looking at as you transform your data centers and operations? What does "cloud" mean to you? Any thoughts on vendor management and negotiations? This is just a random, stream-of-consciousness selection. Make the community yours by adding your own topics.
For those of you who have followed my research of the collaboration software space, you'll find that I have argued that the real whitespace for vendors is in facilitating interactions between different companies (see examples here and here). This advice, though, has always been given in the spirit of helping vendors enter the market and tell a differentiated story; my goal is always to get product marketers away from spinning tales of travel savings (which everyone does). Recently, I finished a report that explored why intercompany collaboration is important to the actual running of a tech industry business. Like any good story, it's a three-part narrative:
The definition of a B2B tech customer is changing. There was a time when a tech vendor selling to businesses only had to deal with the IT department. As such, the product design and messaging revolved around fulfilling the requirements of a techie audience: speeds and feeds, interoperability and security. Now? Business leaders are involved in technology decisions, shifting the design points of technology and its marketing to ease of use and ability to solve business problems. Further muddling this view, individual information workers are increasingly able to provision their own hardware and software, thanks to Web-based technologies and consumer technologies -- like Apple laptops and iPhones -- that IT departments are grudgingly accepting. The pull of these many groups on tech vendors has complicated the job of tech product managers and marketers: They now have to develop their product for and market it to a wider range of people with different interests.
There’s been a lot going on with what Forrester calls the “interaction-centric customer service vendors.” These are the vendors that manage high-volume, transaction-oriented relationships — those often encountered in B2C environments — over the multiple communication channels (email, chat, social, phone, etc.) that exist today.
RightNow announced its CX for Facebook app, to be released in November. This app creates a “Support” tab on a company’s wall and allows users to interact via social and traditional channels right from Facebook. Users can find answers from community content or from the corporate knowledgebase, ask the community questions, follow, participate in, and track discussions, propose an idea, ask an agent (in either a public or a private conversation), and more. It’s a nicely designed app, and something that RightNow needed to release, given the availability of similar ones from eGain, Genesys, Parature, etc.
eGain also solidified its social footprint by announcing its Social Experience Suite — a customer interaction hub that manages both traditional and social interactions. The new version includes a social-blended agent desktop, a single-sourced knowledgebase spanning all channels (traditional plus social, again), and a unified customer record. The version also includes forums and adapters to monitor social networks through integrations with Facebook, Twitter, Google, and Yahoo search.
Forrester has long advocated adoption of a “business technology” approach to replace traditional IT. “BT” recognizes the fundamental role information technology plays in all aspects of business – and the need for business decision-makers to be deeply involved in setting technology strategy, priorities, and even delivering solutions. But how does this tight coupling of business and technology decision-making actually work?
My colleague Alexander Peters and I have just witnessed a situation that illustrates that having the right organizational structure and technology-savvy businesspeople is crucial to a BT transition.
The organization developed an IT strategy 10 years ago based on three best practices:
Major business processes would be implemented on a single, modern, flexible platform.
The platform would employ SOA to ensure that it could adapt to unforeseen needs.
The platform would run in the consolidated, scalable, and efficient data center of a service provider.
Today, the organization has not yet achieved its top goal of a single platform for all of its major processes. It has a new SOA/Java environment, but it processes a little more than half of the required workload. Older systems do the rest. Most disturbing:
The development investment has been many times greater than expected at the outset.
The annual cost of IT operations doubled versus the baseline.
System reliability went down with the new environment.
Technology innovation and business disruption are changing the software market today. Cloud computing is blurring the line between applications and services, and smart solutions are combining hardware with software into new, purpose-engineered solutions. We are happy to announce that we have launched our Forrester Forrsights Software Survey, Q4 2010, to predict and quantify the future of the software market and help IT vendors tap into the insights of approximately 2,500 IT decision-makers across North America and Western Europe.
The survey will provide insights on the strategic direction and spending plans of enterprises from very small businesses to global enterprises, segmented by industry and country. In comparison with last year’s survey, we significantly boosted the sample size this year for the energy (oil and gas, utilities, and mining) and healthcare industries; we’ll be able to provide an in-depth analysis for these industries along with retail, financial services, high tech, and other industries.
Key themes for this year’s software survey include the following topics:
Cloud computing. Besides a 360-degree overview on current and future adoption rates of software-as-a-service (SaaS) for different software applications, we are going much deeper this year and have asked IT decision-makers about their cloud strategy for application replacement as well as for different data and transaction types.
Integrated information technology. Purpose-engineered solutions combining hardware with software are promising higher performance and faster implementation times. But do IT users really buy into single-vendor strategies?
As Forrester’s EA tools specialist, I regularly receive inquiries from EA teams that are having trouble choosing the "single repository of truth" for the entire enterprise. Generally, after a long decision process, they are oscillating between two products, hesitating in many cases because no one product is able to satisfy all the architects: the enterprise architects, the solution architects, and sometimes the business architects. One product satisfies some architects but not others, and vice versa. In the end, choosing one single product would satisfy no one: for each option that satisfies a few, some architects will not use it (generally, for good reason), and it will not give others the information they require to do their job. For these EA teams, the dream of a "single repository of truth" is becoming a nightmare. I encounter this sort of dilemma in half of the inquiries I receive about EA tools, particularly within the largest companies.
My answers are sometimes difficult for these EA teams to hear:
First: Do all team members agree on EA objectives for the next two to three years? Do all architects know and share the same IT objectives and priorities? If EA and IT objectives/priorities are not clear, it is not surprising that they want different tools, because a universal EA tool does not really exist at this time. The recent document I published about the EA management suite as a third generation of EA tools explains how the most recent two generations complement each other.