I frequently get asked how many databases a DBA typically manages. Over the past five years, I have interviewed hundreds of organizations on this topic, asking them about their ratios and how they improved them. Typically I find that the current industry average is 40 databases to a DBA for large enterprises ($1 billion+ in revenue), with the lowest ratio seen around eight and the highest at 275. So, why this huge variation? Many factors in customer deployments contribute to it, such as database size, database tools, database version, DBA expertise, the formalization of database administration, and the mix of production versus nonproduction databases.
This ratio is usually limited by the total size of all the databases a DBA manages, which tends to cap out around 5 terabytes per DBA. A 1-terabyte database remains far harder to manage than a 100 GB one: larger databases demand extra tuning, backup, recovery, and upgrade effort. In other words, one DBA can effectively manage 25 databases of 200 GB each or five databases of 1 terabyte each. These figures include both production and nonproduction databases.
What factors can help improve the ratio? Cloud, tools, the latest DBMS version (with its automation), and the DBMS product used – SQL Server, Oracle, DB2, MySQL, or Sybase. Although most DBMS vendors have improved manageability over the years, customer feedback suggests that Microsoft SQL Server tends to have the best ratios.
Although you should aim for the 40:1 ratio and the 5-terabyte cap, I believe you should first establish your own baseline from your database inventory and DBA headcount and use that as the basis for improving the ratio over time.
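The capacity arithmetic above can be sketched in a few lines of Python; the helper name and the sample inventories are illustrative assumptions, not from the post:

```python
# Illustrative sketch of the database-to-DBA baseline arithmetic.
# The function name and sample inventories are hypothetical.

def baseline(db_sizes_gb, num_dbas):
    """Return (databases per DBA, terabytes per DBA) for an inventory."""
    total_tb = sum(db_sizes_gb) / 1000  # decimal GB -> TB
    return len(db_sizes_gb) / num_dbas, total_tb / num_dbas

# One DBA with 25 databases of 200 GB each hits the ~5 TB cap...
print(baseline([200] * 25, num_dbas=1))   # (25.0, 5.0)
# ...as does one DBA with five 1-terabyte databases.
print(baseline([1000] * 5, num_dbas=1))   # (5.0, 5.0)
```

Comparing your own inventory's output against the 40:1 and 5 TB benchmarks gives you the baseline to track over time.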
Carrying on from my thoughts in Part 1: It’s time to start deploying purely standards-based infrastructure outside the data center; data center protocols are just starting to be created for a converged and virtualized world. With the number of tested and deployed standards-based protocols available, there’s no excuse for networks to be locked in to a vendor's proprietary protocols. Standards-based network solutions provide access to compelling volume economics, the flexibility to adopt a much wider array of solutions, and relief from hiring specialized talent to run a science project. Although many organizations understand that standards-based networking gives them the flexibility to choose from the best available solutions at a lower cost of ownership, they still feel trapped. Below are three of the top shackles and the keys to unlock them:
Look for the new "Community" tab on the Forrester site. This is your access to a community of like-minded peers. You can use the community to start and participate in discussions, share ideas and experiences, and help guide Forrester Research for your role. The success or failure of this community effort depends largely on you. The analysts will participate, but in this forum they carry less weight than you, the Forrester I&O user. So help us, help your peers, and help yourself make this an active and thriving online community. Some thoughts to get you going: Have any particularly good or bad experiences with products, solutions, or technology? What key enablers are you looking at as you transform your data centers and operations? What does "cloud" mean to you? Any thoughts on vendor management and negotiations? This is just a random stream-of-consciousness selection. Make the community yours by adding your own topics.
For those of you who have followed my research of the collaboration software space, you'll find that I have argued that the real whitespace for vendors is in facilitating interactions between different companies (see examples here and here). This advice, though, has always been given in the spirit of helping vendors enter the market and tell a differentiated story; my goal is always to get product marketers away from spinning tales of travel savings (which everyone does). Recently, I finished a report that explored why intercompany collaboration is important to the actual running of a tech industry business. Like any good story, it's a three-part narrative:
The definition of a B2B tech customer is changing. There was a time when a tech vendor selling to businesses only had to deal with the IT department. As such, the product design and messaging revolved around fulfilling the requirements of a techie audience: speeds and feeds, interoperability and security. Now? Business leaders are involved in technology decisions, shifting the design points of technology and its marketing to ease of use and ability to solve business problems. Further muddling this view, individual information workers are increasingly able to provision their own hardware and software, thanks to Web-based technologies and consumer technologies -- like Apple laptops and iPhones -- that IT departments are grudgingly accepting. The pull of these many groups on tech vendors has complicated the job of tech product managers and marketers: They now have to develop their product for and market it to a wider range of people with different interests.
Earlier this year I told you the story of a business executive who told us how critical it is that business — not IT — drive process improvement initiatives. Here is another interesting case that my colleague, VP and Principal Analyst John Rymer, and I have just witnessed.
It is the story of a business organization that developed an IT strategy based on three best practices:
The core business processes would be implemented on a single modern, flexible platform.
The platform would be service-oriented to ensure clear accountabilities and flexibility for future needs.
The platform development and operations would be outsourced to a shared services provider.
We reviewed the strategy 10 years after it was conceived, only to find that it has not yet achieved its top strategic goal. More disturbing:
The development investment has been far greater than expected at the outset.
The annual cost of IT operations doubled versus the baseline.
The reliability of the processes converted to the new environment went down.
There’s been a lot going on with what Forrester calls the “interaction-centric customer service vendors.” These are the vendors that manage the high-volume, transaction-oriented relationships often encountered in B2C environments, over the multiple communication channels (email, chat, social, phone, etc.) that exist today.
RightNow announced its CX for Facebook app, to be released in November. This app creates a “Support” tab on a company’s wall and allows users to interact via social and traditional channels right from Facebook. Users can find answers in community content or the corporate knowledgebase; ask the community questions; follow, participate in, and track discussions; propose ideas; ask an agent (in either a public or a private conversation); and more. It’s a nicely designed app, and something that RightNow needed to release, given the availability of similar ones from eGain, Genesys, Parature, etc.
eGain also solidified its social footprint by announcing its Social Experience Suite — a customer interaction hub that manages both traditional and social interactions. The new version includes a social-blended agent desktop, a single-sourced knowledgebase spanning all channels (traditional plus social, again), and a unified customer record. It also includes forums and adapters for monitoring social networks through integrations with Facebook, Twitter, Google, and Yahoo search.
Forrester has long advocated adoption of a “business technology” approach to replace traditional IT. “BT” recognizes the fundamental role information technology plays in all aspects of business – and the need for business decision-makers to be deeply involved in setting technology strategy, priorities, and even delivering solutions. But how does this tight coupling of business and technology decision-making actually work?
My colleague Alexander Peters and I have just witnessed a situation that illustrates that having the right organizational structure and technology-savvy businesspeople is crucial to a BT transition.
The organization developed an IT strategy 10 years ago based on three best practices:
Major business processes would be implemented on a single, modern, flexible platform.
The platform would employ SOA to ensure that it could adapt to unforeseen needs.
The platform would run in the consolidated, scalable, and efficient data center of a service provider.
Today, the organization has not yet achieved its top goal of a single platform for all of its major processes. It has a new SOA/Java environment, but it processes a little more than half of the required workload. Older systems do the rest. Most disturbing:
The development investment has been many times greater than expected at the outset.
The annual cost of IT operations doubled versus the baseline.
System reliability went down with the new environment.
Technology innovation and business disruption are changing the software market today. Cloud computing is blurring the line between applications and services, and smart solutions are combining hardware with software into new, purpose-engineered solutions. We are happy to announce that we have launched our Forrester Forrsights Software Survey, Q4 2010, to predict and quantify the future of the software market and to help IT vendors tap into the insights of approximately 2,500 IT decision-makers across North America and Western Europe.
The survey will provide insights on the strategic direction and spending plans of organizations ranging from very small businesses to global enterprises, segmented by industry and country. Compared with last year’s survey, we significantly boosted the sample size this year for the energy (oil and gas, utilities, and mining) and healthcare industries; we’ll be able to provide in-depth analysis for these industries along with retail, financial services, high tech, and others.
Key themes for this year’s software survey include the following topics:
Cloud computing. Besides a 360-degree overview on current and future adoption rates of software-as-a-service (SaaS) for different software applications, we are going much deeper this year and have asked IT decision-makers about their cloud strategy for application replacement as well as for different data and transaction types.
Integrated information technology. Purpose-engineered solutions combining hardware with software are promising higher performance and faster implementation times. But do IT users really buy into single-vendor strategies?
As Forrester’s EA tools specialist, I regularly receive inquiries from EA teams that are having trouble choosing the "single repository of truth" for the entire enterprise. Generally, after a long decision process, they are oscillating between two products, hesitating in many cases because no one product satisfies all the architects: the enterprise architects, the solution architects, and sometimes the business architects. One product satisfies some architects but not others, and vice versa. In the end, choosing a single product would satisfy no one: whichever option pleases a few, some architects will not use it (generally for good reason), and it will not give others the information they need to do their jobs. For these EA teams, the dream of a "single repository of truth" is becoming a nightmare. I encounter this dilemma in half of the inquiries I receive about EA tools, particularly within the largest companies.
My answers are sometimes difficult for these EA teams to hear:
First: Do all team members agree on EA objectives for the next two to three years? Do all architects know and share the same IT objectives and priorities? If EA and IT objectives and priorities are not clear, it is not surprising that the architects want different tools, because a universal EA tool does not really exist at this time. The recent document I published about the EA management suite as a third generation of EA tools explains how the two most recent generations complement each other.
Like thousands of Oracle clients and a dozen or so Forrester analysts, I was at Oracle OpenWorld last week. One of the big news items was the announcement of the availability of Fusion Applications. The creation of these new applications has been a massive effort, involving many of Oracle’s top software designers and developers working for over five years. My preliminary opinion, shared by my colleagues, is that Fusion apps do have some useful new features and a better user interface than prior Oracle products, and that they provide a more credible SaaS option than Oracle's prior On Demand offerings.
However, there seems to me to be a lack of clarity about how Fusion apps fit into the evolution of the Oracle family of apps. To its credit, Oracle has stated that it will be responsive to clients, neither forcing them to convert to Fusion nor making staying on existing apps unattractive by withholding support and enhancements. Instead, it wants to make Fusion apps so attractive that clients will want to adopt them, either (rarely) as a whole suite or (more likely) as step-by-step replacements of, or additions to, existing app products. Still, that leaves unclear what Oracle sees as the endgame for Fusion vs. its other app products.
As I see it, there are four scenarios for how Fusion apps will relate over time to the existing portfolio of apps that Oracle has acquired and continues to support through its Applications Unlimited position:
Fusion apps take over and replace the other applications over time.
Fusion apps become yet another app product line, which co-exists with the other apps.
Fusion app features and functions percolate into and are absorbed into the other apps, which persist indefinitely.
Fusion apps provide new categories of applications, which get brought into the other app families as add-ons.