Forrester Blogs For Business Technology Professionals
This is a roll-up of all Forrester blogs written for Business Technology Professionals. Role-specific blogs are listed below. Visit Forrester.com to learn how we make Business Technology Professionals successful every day.
Many large organizations have finally “seen the light” and are trying to figure out the best way to treat their critical data as the trusted asset it should be. As a result, master data management (MDM) strategies and the enabling architectures, organizational and governance models, methodologies, and technologies that support the delivery of MDM capabilities are…in a word…HOT! But the concept of MDM -- and the homegrown or vendor-enabled technologies that attempt to deliver that elusive “single version of truth,” “golden record,” or “360-degree view” -- has been around for decades in one form or another (e.g., data warehousing, BI, data quality, EII, CRM, and ERP have all, at one time or another, promised to deliver it).
The current market view of MDM has matured significantly over the past five years, and today many organizations are on their way to successfully delivering multi-domain, multiform master data solutions across various physical and federated architectural approaches. But the long-term evolution of the MDM concept is far from over. There remains a tremendous gap between the limited business value most MDM efforts deliver today and what MDM and data management evangelists believe MDM can deliver in terms of business optimization, risk mitigation, and competitive differentiation.
What will the next evolution of the MDM concept look like in the next 3, 5, and 10 years? Will the next breakthrough be one that’s focused on technology enablement? How about information architecture? Data governance and stewardship? Alignment with other enterprise IT and business strategies?
My colleague Margo Visitacion and I are finishing up a new report, Seven Pragmatic Practices To Improve Software Quality, that will publish in a few weeks. In writing it, we realized that not everyone has the same definition of quality. More often than not, application development professionals define software quality simply as fewer bugs, but software quality means a whole lot more than that.
Forrester defines software quality as:
Software that meets business requirements, provides a satisfying user experience, and has fewer defects.
What It Means: Quality Is A Team Sport
Quality must move beyond the purview of just QA professionals and must become an integrated part of the entire software development life cycle (SDLC) to reduce schedule-killing rework when business requirements are misunderstood, improve user satisfaction, and reduce the risks of untested nonfunctional requirements such as security and performance.
We are getting many requests for help on iPad strategies for the enterprise. It's clear why. iPads are a tremendously empowering technology that any employee can buy. My colleague Andy Jaquith has a report coming real soon now on the security aspects of iPhones and iPads, and I'm launching research on case studies of iPad in the enterprise.
I am currently hearing about three business scenarios for iPad and tablets, but I'd love to hear about your experiences, plans, concerns, or frustrations. Ping me at tschadler(at)forrester(dot)com. Here are the three scenarios:
Salespeople out in the field. This is the "Hollywood pitch deck" scenario. The iPad, particularly with a cover that can prop it up a bit, is a great way to scroll through slides to show a customer or demonstrate a Web site. In one situation, I heard that there's a competition brewing over who can manipulate the Web site upside down (so the client across the table sees it right side up) without making any mistakes. Now there's a new skill for sales: upside-down Web browsing.
Executives on an overnight trip. No, iPad doesn't replace a laptop (at least not yet; more on this below). But it's great for email, calendar, reviewing documents, and presenting PDF or Keynote decks.
When digging into the data from the September 2009 Global State Of Enterprise Architecture Online Survey, I found an interesting correlation: Survey respondents who reported a high degree of business and IT process standardization also reported that EA was more effective and more influential within the organization. As the level of standardization decreased, so did EA effectiveness and influence. Just take a look at this sample data from a report that recently went live on our website:
Why does this correlation exist? We’ve been saying (and most clients have been agreeing) that process standardization is a keystone to effectiveness across all areas of IT: apps, infrastructure, PMOs, you name it. When I look at IT organizations in my research, those that focus on standardizing processes or that live in an environment of highly standardized business processes tend to be doing a better job.
But simply being more standardized can’t be the “secret sauce” for EA success. There must be something that standardization does to an organization — a window or door that it creates — that enables IT functions such as EA to get better at what they do. Based on deeper analysis of our data, this is my hypothesis:
During CScape at Cisco Live, one of the more interesting conversations I had started with a simple question: Is social software (and collaboration software in general) a set of standalone applications or features of other business applications? This sprang from a discussion on the future of the collaboration technology business and really speaks to a couple of important developments in the market:
There has been a lot of negative press and commentary regarding the recent Queensland Health Implementation of Continuity Project (SAP HR and payroll), which experienced a very public failure when many employees went unpaid due to multiple points of failure in the project. The recent Auditor-General's report on the process is damning, spreading the blame across multiple agencies and the systems integration partner, IBM. I make no claims to be familiar with the intricate details of the process, but I have read the report and feel I have a clear understanding of the (many!) points of failure.
While this project did seem to be a monumental failure, I would suggest that we consider two important facts:
VMware today released an incremental upgrade to its core vSphere platform and took the opportunity to do some product repackaging and pricing actions - the latter being a big win for enterprise customers. The vSphere 4.1 enhancements focused on scalability to accommodate larger and larger virtual pools. The number of VMs per pool and number of hosts and VMs per instance of vCenter have been ratcheted up significantly, which will simplify large environments. The new network and storage I/O features and new memory compression and VMotion improvements will help customers pushing the upper limits of resource utilization. Storage vendors will laud the changes to vStorage too, which finally ends the conflict between what storage functions VMware performs versus what arrays do natively.
The company also telegraphed the end of life for ESX in favor of the more modern ESXi hypervisor architecture.
But for the majority of VMware shops, the pricing changes are perhaps the most significant. It has been a longstanding pain that in order to use some of the key value-add management features, such as Site Recovery Manager and AppSpeed, you had to license them across the full host even if you only wanted to apply that feature to a few VMs. This led to some unnatural behavior, such as grouping business-critical applications on the same host - cost optimization trumping availability best practices. Thankfully, that has now been corrected.
As I cruised the pavilion at Cisco Live in Las Vegas last week, the display that held my attention the longest was the Collaboration ROI booth. There, the network infrastructure provider making waves in the collaboration software market was demonstrating calculations it had done on how its various solutions were improving efficiency and productivity for specific jobs in verticals like retail banking. In the example I reviewed, banks using virtual loan officers were able to obtain more small business customers because the bank was able to have someone "there" to answer the prospective customer's questions. Now, with all the activity going on around me, why was this so fascinating? Put simply, it relates to a fundamental issue for all vendors hoping to compete in the collaboration software space: How do you differentiate in this crowded market?