5:30am, the family is asleep and it’s time to get ready – today is Analyst Day in Frankfurt. I’m on the road 2h45min before the event starts (1h20min should be sufficient), but sometimes the traffic is terrible. Last week I missed a flight because the highway was completely closed after an accident, and I had to give up after three hours of driving for nothing. The moments when the worry of missing an appointment slowly turns into certainty are the ones that cost me some of my (remaining) hair.
(Of course) I arrive much too early, but other analysts are already there (probably they don’t sleep at all). Plenty of time to go through my presentation once more for some final adjustments and for some small talk with customers who arrived early.
One minute before the kick-off, I make the last slide changes and load them onto the presentation laptop. Another analyst colleague goes first. I have seen some of the slides a hundred times and look around at the faces of the attendees. For most, it’s the first time they see, for example, our market sizing and forecasting data, and they scribble hectic notes into their notebooks. They don’t know yet that we will distribute all the slides after the event. I’m getting a bit nervous, but I’m used to it. If I’m ever not nervous anymore before a presentation, it’ll get boring for me and the audience, and I should probably do something else.
What is BI? There are two prevailing definitions out there – broad and narrow. The broad definition (using our own wording) is that BI is a set of methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information, used to enable more effective strategic, tactical, and operational insight and decision-making. But if we stick to this definition, shouldn’t we then include data integration, data quality, master data management, data warehousing, and portals in BI? I know lots of folks would disagree and fit these into data management or information management segments, but not BI.
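As a toy sketch of that broad “raw data into meaningful information” definition (entirely my own illustration – the table, column names, and sample figures are invented, and any real BI stack does far more), a few lines of Python with an in-memory SQLite database show the essence of the transformation:

```python
import sqlite3

# Raw data: individual sales transactions (invented sample values).
raw = [
    ("2010-01-05", "EMEA", 1200.0),
    ("2010-01-07", "EMEA", 800.0),
    ("2010-01-09", "APAC", 450.0),
    ("2010-02-02", "APAC", 600.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", raw)

# "Meaningful and useful information": revenue by region, ranked --
# the kind of summary a decision-maker can actually act on.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "GROUP BY region ORDER BY SUM(amount) DESC"
):
    print(region, total)  # EMEA 2000.0, then APAC 1050.0
```

The point is only that the raw rows by themselves support no decision; the aggregated, ranked view does – and everything in the broad definition (methodologies, architectures, technologies) exists to make that step trustworthy and repeatable at scale.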
Then there is the narrow definition, used when referring to just the top layers of the BI architectural stack, such as reporting, analytics, and dashboards. But even there, as Jim Kobielus and I discovered while preparing to launch our BI TechRadar 2010 research, we could count over 20 (!) product categories: advanced analytics, analytical performance management, scorecards, BI appliances and BI SaaS, BI-specific DBMS, BI workspaces, dashboards, geospatial analytics, low-latency BI, metadata-generated BI apps, non-modeled exploration and in-memory analytics, OLAP, open source BI and SaaS BI, packaged BI apps, process/content analytics, production reports and ad hoc query builders, search UI for BI, social network/media analytics, text analytics, and web analytics.
To make matters worse, some folks out there are now trying to draw a sharp line between BI and analytics, pushing a “core, traditional BI is commoditized; analytics is where differentiation is today” message. Hmmm – I thought I was building analytical apps using OLAP starting back in the early ’80s.
So when you need to make an application modernization decision -- what to do with a given application -- how do you begin the decision-making process? In the past, modernization decisions were often simply declared -- "We are moving to this technology" -- for a number of reasons, such as, it:
Keeps us current on technology.
Provides a more acceptable user-interface or integration capability.
Increases our exposure to access by external customers.
Increases the volume of business transactions we can process.
Trades custom/bespoke applications for standardized application packages such as ERP, payroll, human resources, etc.
Fast-forward to today -- you could simply go with your gut -- declare a solution based on what you currently know (or think you know) about the application in question. But it's a new day, baby -- a proposal like that, without proper justification, is likely to be met with one of two responses from management:
There’s a lot of hype out there by many vendors who claim that they have tools and technologies to enable BI end user self service. Do they? When you analyze whether your BI vendor can support end user self service, consider the following types of “self service” and related BI tool requirements:
#1. Self service for average, casual users.
What do these users need to do?
Run and lightly customize canned reports and dashboards
Run ad hoc queries
Add calculated measures
Fulfill their BI requirements with little or no training (typically one needs a search-like, not a point-and-click, UI for this)
What capabilities do they need for this?
Report and dashboard templates
Customizable prompts, sorts, filters, and ranks
Report, query, dashboard building wizards
Semantic layer (not all BI vendors have a rich semantic layer)
Prompting for columns (not all BI vendors let you do that)
Drill anywhere (only BI vendors with ROLAP and multisourcing / data federation provide this capability)
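To make the “report templates with customizable prompts, sorts, filters, and ranks” idea concrete, here is a minimal, hypothetical sketch (not any vendor’s actual API – the function, table, and column names are all invented) of a canned report that a casual user could rerun with different prompt values and no query-writing skills:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, product TEXT, revenue REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("EMEA", "A", 500.0), ("EMEA", "B", 300.0),
    ("APAC", "A", 700.0), ("APAC", "B", 200.0),
])

def canned_report(region, sort_desc=True, top_n=10):
    """A report 'template': the query shape is fixed by the template author;
    the casual user only supplies prompt values (filter, sort, rank)."""
    order = "DESC" if sort_desc else "ASC"
    sql = ("SELECT product, SUM(revenue) AS total FROM orders "
           "WHERE region = ? GROUP BY product "
           f"ORDER BY total {order} LIMIT ?")
    return conn.execute(sql, (region, top_n)).fetchall()

print(canned_report("EMEA"))           # prompt: region filter
print(canned_report("APAC", top_n=1))  # prompt: rank (top 1 product)
```

A real BI tool wraps this pattern in a wizard and a semantic layer, but the division of labor is the same: the template hides the query, and the prompts are the only surface the casual user touches.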
#2. Self service for advanced, power users.
What do these users need to do?
Perform what-if scenarios (this often requires write back, which very few BI vendors allow)
Add metrics, measures, and hierarchies not supported by the underlying data model (typically one needs some kind of in-memory analytics capability for this)
Explore based on new (not previously defined) entity relationships (typically one needs some kind of in-memory analytics capability for this)
Explore data without knowing exactly what one is looking for (typically one needs a search-like UI for this)
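The first item above – a what-if scenario that requires write back – can be sketched roughly like this (an in-memory toy under my own invented names and numbers, not any BI tool’s actual write-back facility):

```python
import sqlite3

# In-memory copy of the data, so the power user can change it freely
# without touching the system of record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE forecast (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO forecast VALUES (?, ?)",
                 [("EMEA", 1000.0), ("APAC", 600.0)])

def total_revenue():
    return conn.execute("SELECT SUM(revenue) FROM forecast").fetchone()[0]

baseline = total_revenue()

# What-if: write back a 50% uplift for EMEA, then re-aggregate.
conn.execute("UPDATE forecast SET revenue = revenue * 1.5 "
             "WHERE region = 'EMEA'")
scenario = total_revenue()

print(baseline, scenario)  # prints: 1600.0 2100.0
```

This is the crux of why so few vendors support the use case: the tool must hold a writable, user-private copy of the model (hence the in-memory connection here) and recompute aggregates on the fly after each edit, rather than serving pre-built, read-only cubes.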
Every year, I take 250 to 300 calls from Forrester clients. The vast majority of these calls are from executives embroiled in the process of trying to select the right CRM technology solution to support their business strategy. From these conversations, I have distilled a set of decision criteria to help you quickly cut through the CRM tech vendor underbrush.
Ability to meet your specific business requirements. You have to know what business outcomes you are trying to achieve, and define the business capabilities that you need to support, before you seriously consider investing in a CRM software solution. Although the core capabilities of leading CRM software vendors are quite similar, the companies I hear from still place a very high importance on the solution meeting the functional and technology criteria that are specific to their needs. Can the vendor meet your use-case requirements?
Ease of use for front-line workers. My clients expect CRM software to demonstrate the capability to make people more productive in their work, and this is predicated on how easy the solution is to use. Good usability encourages user adoption. Is the solution UI modern and adaptable to diverse role-based requirements?
Capability to provide advanced analytic abilities. My clients place a high value on CRM vendors' ability to provide analytic tools to better understand customer behavior and make insightful customer-facing decisions using the myriad customer data collected. Analytics are the key to unlocking the value in CRM applications. Does the vendor have powerful and easy-to-use business intelligence capabilities?
For many of my clients, 2009 was a difficult year as they struggled in response to the sudden and dramatic downturn in the economy. Although many CRM technology projects were deferred or cancelled last year, I see this trend being strongly reversed in 2010. Every day I get calls from companies large and small who tell me that they are now releasing funds to invest in improving the customer-facing business processes that were neglected during the past 18 months.
The underlying trends driving the need for effective and efficient customer management processes have not disappeared. In fact, the need for companies to effectively engage with their customers has never been more important. Locking in customer loyalty through deeper engagement and differentiated experiences will continue to be a critical priority, but navigating the complex CRM solution vendor landscape and organizing projects for success will continue to be challenging. In 2010, you must carefully choose the best opportunities for quick wins, spend wisely on the right CRM solutions, and manage project risk. Take advantage of Forrester data, methods, and tools to capitalize on the improving economic climate and drive top-line growth.
I have designed this webinar to offer you a roadmap to the specific Forrester data, techniques, and tools that you can immediately put to use to implement our six-step methodology for CRM success:
1. Understand the customer of the future
2. Define the right CRM strategy and priorities
3. Build a rock-solid business case
4. Risk-proof your project
5. Resolve customer data management dilemmas
6. Negotiate the right software pricing and licensing agreements
Here is the link to the registration page. You do not have to be a Forrester client to join in.
Recently, I discussed complexity with a banker working on measuring and managing complexity at a North American bank. His approach is very interesting: he found a way to operationalize complexity measurement and thus to provide concrete data for managing it. While I’m not in a position to disclose any more details, we also talked about the nature of complexity. In the absence of any other definition, I offered a draft definition that I have assembled over time from a number of “official” definitions. Complexity is the condition of:
Late last year, Forrester published a “big idea” research report identifying Product-centric Development as a distinctive, value-based approach to software development characteristic of highly successful commercial software companies. As part of this research, we made the call that product (or service) centricity can occur regardless of whether internal or outsourced resources do the bulk of the work. Rather, it is a team’s orientation to the company’s value chain, its partnership with customers and business stakeholders, and its ownership of the business results that its software ultimately delivers that are the really important, defining characteristics of a ‘product-centric’ development shop.
Recently, I came across a great example of this kind of high-value development work spanning internal and external resources in Dr Pepper Snapple Group’s (DPS) partnership with HCL. Why does this mixed-development model work? Two key success factors include:
- HCL’s service delivery team is continually in step with DPS’s business environment. By staffing dedicated service delivery managers (SDMs) on site, DPS can call on HCL to help tackle both strategic management issues, such as reducing shrinkage and achieving on-time delivery, and day-to-day problems, such as app latency and downtime, with a "one-stop-shop" liaison who can own the problem and drive resolution across technology silos.
Recently, I’ve been getting more inquiries about risk-based testing. Along with agile test methods and test estimation, test teams turning their eyes to risk-based testing is another positive step toward integrating quality throughout the SDLC. Yes, I still see QA engineers having to put on their evangelist hats to educate their developer brothers and sisters that quality is more than just testing (don’t get me wrong, consistent unit and integration testing is a beautiful thing). However, any time business and technology partners think about impact and dependencies in their approach to a solid, workable application, they elevate quality to the next level.
Keep asking those questions about risk-based testing – and make sure that you’re covering all of the angles, including:
A couple of days ago, Texas Stadium was reduced to a pile of rubble. Wow. The former home of my Dallas Cowboys, the site of victories and record-setting performances — gone in a matter of minutes. Was I sad? Yeah. But also a bit relieved. The Cowboys have moved to their new, super-duper (and quite ostentatious) stadium, Texas Stadium memorabilia has been auctioned off, and the poor, neglected building had become quite an eyesore.
Sometimes destroying something unusable is the best way to move forward.
Run that statement past your approach to documenting software requirements. When was the last time you took a step back to evaluate how your organization represents requirements? If it’s been a while and your business analysts are still delivering massive, text-heavy, all-encompassing business requirements documents (BRDs), it’s time for some demolition of your own.
Why? Compelling forces have converged, drastically changing the way application development teams author software requirements. Organizations are recognizing the connection between software failure and poor requirements, and authoring better requirements has become a major initiative in many firms. At the same time, Lean and Agile are front and center, so right-sizing requirements documentation is on everyone’s radar. Bottom line, you need to do more with less: author stronger requirements while minimizing waste and being as lean as possible.