The concept of traditional TV watching needs to be redefined. The TV has evolved from a passive device with a single content source to a simple, large-display screen on which numerous activities come together. In the past, consumers had a TV device (typically the set-top box), a movie device (typically the DVD player), and possibly a gaming device connected to the TV. But now, the game console streams movies, the set-top box records TV, and the PC does everything. The next step is connected TVs — HDTVs that incorporate a direct connection to the Internet, whether wired or wireless.
Connected TVs are a big deal to manufacturers, but Forrester Technographics® data shows that consumers are struggling to understand the benefits. One challenge these manufacturers face is that when viewers are in "TV mode," they seem unable to imagine doing anything with these TVs other than watching more TV. When questioned, people appear to want one thing: more video. For both men and women, the top responses focus on getting access to well-known sources of TV shows and movies like Netflix, Blockbuster, or the regular broadcast and cable networks.
I frequently get asked how many databases a DBA typically manages. Over the past five years, I have interviewed hundreds of organizations on this topic, asking about their ratios and how they improved them. I find that the current industry average is 40 databases per DBA for large enterprises ($1 billion+ in revenue), with the lowest ratio around eight and the highest at 275. So why the huge variation? Many factors I see in customer deployments contribute to it: database size, database tools, DBMS version, DBA expertise, formalization of database administration, and the production versus nonproduction mix.
This ratio is usually limited by the total size of all databases that a DBA manages, which tends to cap out around 5 terabytes per DBA. A terabyte-sized database remains difficult to manage compared to one that's 100 GB in size; larger databases require extra tuning, backup, recovery, and upgrade effort. In other words, one DBA can effectively manage 25 databases of 200 GB each or five 1-terabyte databases. These figures include both production and nonproduction databases.
What factors can help improve the ratio? Cloud deployment, management tools, the latest DBMS version (with its automation features), and the DBMS product used: SQL Server, Oracle, DB2, MySQL, or Sybase. Although most DBMS vendors have improved manageability over the years, customer feedback suggests that Microsoft SQL Server tends to have the best ratios.
Although the 40:1 ratio and the 5-terabyte cap are worthwhile targets, I believe you should establish your own baseline from your database inventory and DBA headcount, and use that as the basis for improving the ratio over time.
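The rules of thumb above (roughly 40 databases and 5 terabytes per DBA) can be turned into a simple staffing estimate. The sketch below is purely illustrative; the function name and thresholds are my own assumptions based on the averages discussed, not a formal Forrester model:

```python
# Rough staffing estimate from the two rules of thumb discussed above:
# a DBA handles at most ~40 databases and ~5 TB of total data.
# Thresholds are illustrative assumptions, not benchmarks.
import math

MAX_DBS_PER_DBA = 40    # industry-average database count per DBA
MAX_TB_PER_DBA = 5.0    # total managed size cap per DBA, in TB

def dbas_needed(db_sizes_gb):
    """Estimate DBAs required for a list of database sizes (in GB)."""
    total_tb = sum(db_sizes_gb) / 1024.0
    by_count = math.ceil(len(db_sizes_gb) / MAX_DBS_PER_DBA)
    by_size = math.ceil(total_tb / MAX_TB_PER_DBA)
    return max(by_count, by_size)  # whichever constraint binds first

# 25 databases of 200 GB each stay within both limits
print(dbas_needed([200] * 25))  # → 1
```

The point of modeling both constraints is that either one can bind first: a shop with many small databases hits the count limit, while a shop with a few very large databases hits the size cap.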
Carrying on from my thoughts in Part 1: It’s time to start deploying purely standards-based infrastructure outside the data center; data center protocols are just starting to be created for a converged and virtualized world. With the number of tested and deployed standards-based protocols available, there’s no excuse for networks to be locked in to a particular vendor’s proprietary protocols when standards-based network solutions provide access to compelling volume economics, the flexibility to adopt a much wider array of solutions, and relief from hiring specialized talent to run a science project. Although many organizations understand that standards-based networking gives them the flexibility to choose from the best available solutions at a lower cost of ownership, they still feel trapped. Below are the three top shackles, and the keys to unlock them:
Our internal deadlines are looming for Forrester’s Marketing And Strategy Forum EMEA 2010, to be held in London on November 18 and 19. Pretty soon, all of our presentations have to be reviewed, content-edited and fact-checked, and then submitted. In case you hadn’t noticed, we have put together a special track at this event for marketing professionals in the tech industry; this runs on the Friday from 11:40 till 15:30. I will kick off and moderate this “conference within a conference,” where we will explore the idea that tech industry marketing should no longer be communicating product differentiation; it should be the difference. As technology becomes commoditized, customers take control of the vendor-user interaction, and social media becomes a standard interaction channel, marketing must move its contribution from just educating customers and persuading them to accept the product to a more strategic role of enabling interactions with customers to solve their problems -- an engagement model that Forrester calls "customer enablement."
We will also be talking about community marketing, marketing in a global economy, and aligning sales and marketing. Some of the presentations are based on our previous Marketing Forum held in Los Angeles back in March. But I have cajoled my colleagues into making sure that they illustrate their presentations with EMEA-based case studies and examples. I have been particularly energized to do this in the past few weeks as I have been looking forward to attending the biennial Ryder Cup golf contest between the USA and Europe, held this weekend in my home town in Wales.
Look for the new "Community" tab on the Forrester site. This is your access to a community of like-minded peers. You can use the community to start and participate in discussions, share ideas and experiences, and help guide Forrester Research for your role. The success or failure of this community effort depends largely on you. The analysts will participate, but in this forum they carry less weight than you, the Forrester I&O user. So help us, help your peers, and help yourself make this an active and thriving online community. Some thoughts to get you going: Have you had any particularly good or bad experiences with products, solutions, or technology? What key enablers are you looking at as you transform your data centers and operations? What does "cloud" mean to you? Any thoughts on vendor management and negotiations? This is just a stream-of-consciousness selection. Make the community yours by adding your own topics.
For those of you who have followed my research of the collaboration software space, you'll find that I have argued that the real whitespace for vendors is in facilitating interactions between different companies (see examples here and here). This advice, though, has always been given in the spirit of helping vendors enter the market and tell a differentiated story; my goal is always to get product marketers away from spinning tales of travel savings (which everyone does). Recently, I finished a report that explored why intercompany collaboration is important to the actual running of a tech industry business. Like any good story, it's a three-part narrative:
The definition of a B2B tech customer is changing. There was a time when a tech vendor selling to businesses only had to deal with the IT department. As such, the product design and messaging revolved around fulfilling the requirements of a techie audience: speeds and feeds, interoperability and security. Now? Business leaders are involved in technology decisions, shifting the design points of technology and its marketing to ease of use and ability to solve business problems. Further muddling this view, individual information workers are increasingly able to provision their own hardware and software, thanks to Web-based technologies and consumer technologies -- like Apple laptops and iPhones -- that IT departments are grudgingly accepting. The pull of these many groups on tech vendors has complicated the job of tech product managers and marketers: They now have to develop their product for and market it to a wider range of people with different interests.
There’s been a lot going on with what Forrester calls the “interaction-centric customer service vendors.” These are the vendors that manage high-volume, transaction-oriented relationships — those often encountered in B2C environments — over the multiple communication channels (email, chat, social, phone, etc.) that exist today.
RightNow announced its CX for Facebook app, to be released in November. This app creates a “Support” tab on a company’s wall and allows users to interact via social and traditional channels right from Facebook. Users can find answers from community content or from the corporate knowledgebase, ask the community questions, follow, participate in, and track discussions, propose an idea, ask an agent (in either a public or a private conversation), and more. It’s a nicely designed app, and something that RightNow needed to release, given the availability of similar ones from eGain, Genesys, Parature, etc.
eGain also solidified its social footprint by announcing its Social Experience Suite — a customer interaction hub that manages both traditional and social interactions. The new version includes a social-blended agent desktop, a single-sourced knowledgebase across all channels (traditional and social), and a unified customer record. The version also includes forums and adapters to monitor social networks through integrations with Facebook, Twitter, Google, and Yahoo search.
Forrester has long advocated adoption of a “business technology” approach to replace traditional IT. “BT” recognizes the fundamental role information technology plays in all aspects of business – and the need for business decision-makers to be deeply involved in setting technology strategy, priorities, and even delivering solutions. But how does this tight coupling of business and technology decision-making actually work?
My colleague Alexander Peters and I have just witnessed a situation that illustrates that having the right organizational structure and technology-savvy businesspeople is crucial to a BT transition.
The organization developed an IT strategy 10 years ago based on three best practices:
Major business processes would be implemented on a single, modern, flexible platform.
The platform would employ SOA to ensure that it could adapt to unforeseen needs.
The platform would run in the consolidated, scalable, and efficient data center of a service provider.
Today, the organization has not yet achieved its top goal of a single platform for all of its major processes. It has a new SOA/Java environment, but it processes a little more than half of the required workload. Older systems do the rest. Most disturbing:
The development investment has been many times greater than expected at the outset.
The annual cost of IT operations doubled versus the baseline.
System reliability went down with the new environment.
Technology innovation and business disruption are changing the software market today. Cloud computing is blurring the line between applications and services, and smart solutions are combining hardware with software into new, purpose-engineered solutions. We are happy to announce that we have launched our Forrester Forrsights Software Survey, Q4 2010, to predict and quantify the future of the software market and help IT vendors tap into the insights of approximately 2,500 IT decision-makers across North America and Western Europe.
The survey will provide insights on the strategic direction and spending plans of enterprises from very small businesses to global enterprises, segmented by industry and country. In comparison with last year’s survey, we significantly boosted the sample size this year for the energy (oil and gas, utilities, and mining) and healthcare industries; we’ll be able to provide an in-depth analysis for these industries along with retail, financial services, high tech, and other industries.
Key themes for this year’s software survey include the following topics:
Cloud computing. Besides a 360-degree overview of current and future adoption rates of software-as-a-service (SaaS) for different software applications, we are going much deeper this year and have asked IT decision-makers about their cloud strategy for application replacement as well as for different data and transaction types.
Integrated information technology. Purpose-engineered solutions combining hardware with software are promising higher performance and faster implementation times. But do IT users really buy into single-vendor strategies?
As Forrester’s EA tools specialist, I regularly receive inquiries from EA teams that are having trouble choosing the "single repository of truth" for the entire enterprise. Generally, after a long decision process, they are oscillating between two products, hesitating in many cases because no one product is able to satisfy all the architects: the EAs, the solution architects, and sometimes the business architects. One product satisfies some architects but not others, and vice versa. In the end, choosing a single product would satisfy no one: whichever option pleases a few, some architects will not use it (generally for good reason), and it will not give the rest the information they need to do their jobs. For these EA teams, the dream of a "single repository of truth" is becoming a nightmare. I encounter this dilemma in half of the inquiries I receive about EA tools, particularly within the largest companies.
My answers are sometimes difficult for these EA teams to hear:
First: Do all team members agree on EA objectives for the next two to three years? Do all architects know and share the same IT objectives and priorities? If EA and IT objectives and priorities are not clear, it is not surprising that the architects want different tools, because a universal EA tool does not really exist at this time. The recent document I published on the EA management suite as a third generation of EA tools explains how the two most recent generations complement each other.