For some time there have been rumors about Deutsche Bank having selected TCS BaNCS for some or all of its international subsidiaries. Today, both Deutsche Bank and Tata Consultancy Services (TCS) published a press release announcing that Deutsche Bank will implement TCS BaNCS Core Banking as its new core banking platform for Global Transaction Banking (GTB). The first international subsidiary, located in Abu Dhabi, went live three days ago. I discussed the deal with N. Ganapathy Subramaniam (NGS), the president of TCS Financial Solutions.
EA teams like to know how mature their EA practice is. There are a lot of EA maturity models out there. You will find some of these assessments and maturity models discussed in a 2009 Forrester report. Many EA teams share the idea that there is a single “ultimate EA model” and that EA leaders should strive to move up the ladder to this ultimate model. It’s like a video game – you try to get to the next level.
For the past three months, the EA team’s Researcher Tim DeGennaro has been looking at these models and Forrester’s research on EA best practices to create a framework for assessing EA programs. The task looked deceptively simple: Develop criteria based on the best practices we see in leading EA organizations, create an objective scale to rate an organization’s progress, offer reporting to illuminate next steps, and wrap it all in an easy-to-use assessment package. What we’ve found so far is not only that it is impossible to avoid the effects of subjectivity and lack of context but also that many assessment styles disagree on the most crucial aspect: What exactly is EA supposed to be aiming for?
Are you interested in business intelligence, do you wonder about the future of the analytics market, or do you have a question about advanced analytics technologies?
Then join the Forrester analysts Rob Karel, Boris Evelson, Clay Richardson, Gene Leganza, Noel Yuhanna, Leslie Owens, Suresh Vittal, William Frascarelli, David Frankland, Joe Stanhope, Zach Hofer-Shall, Henry Peyret, and me for an interactive TweetJam on Twitter about the state of advanced analytics on Wednesday, December 15th, 2010 from 12:00 p.m. – 1:00 p.m. EST (18:00 – 19:00 CET) using the Twitter hashtag #dmjam. We’ll share the results of our recent research on the analytics market space and discuss how it will change with new technologies entering the scene and maturing over time.
Business intelligence is the fastest-growing software market today as companies drive business results based on deeper insights and better planning, and advanced analytics is the spearhead of BI technologies that can unlock new dimensions of business performance. But what exactly is ‘advanced’ analytics, what technologies are available, and how can they be used efficiently?
Much more detailed information can be found on the blog of Forrester analyst James Kobielus, who will lead us through the discussion during the TweetJam. The overview graphic above, taken from his blog, lists the different elements of advanced analytics today.
Here are some of the questions we want to debate during our TweetJam discussion:
What exactly is and isn’t advanced analytics?
What are the chief business applications of advanced analytics?
Our Q3 2010 Global Financial Services Architecture Online Survey shows that 79% of the surveyed financial services firms are either already working on transforming their application landscape or plan to start this effort by 2012 at the latest. The need for greater business agility and flexibility, new business capabilities, an improved ability to cope with changing markets, more differentiation, and increased market share are key drivers for a large share of these financial services firms.
Coping with these drivers requires a large amount of architectural flexibility; therefore, architectural flexibility needs to be an integral element of any decision in favor of or against a given architecture or off-the-shelf banking platform within a transformation initiative. Consequently, it is no surprise that 43% of the surveyed firms expect more than one-third of their business applications to leverage service-oriented architecture and use business services in the next 18 to 24 months; an additional 19% think that more than half of their applications will utilize business services within that time frame.
Step back and think: How would you answer the question, “What does your IT group deliver to your business?” Your answer will indicate how you think about the relationship between business and technology, and, in turn, it will affect your business agility and your business-IT alignment.
If you answer anything close to “IT delivers and integrates solutions to meet business requirements,” your mental model boils down to thinking that your solutions support the business: Business is one thing; solutions are a separate thing. If the business has a problem, let it come ask IT for some application to address the problem — maybe IT will even help the business figure out what it needs. Each application supports (typically overlapping) parts of the business, and IT performs whatever behind-the-scenes integration is necessary to gain some degree of coherency across the whole. The result is the sort of siloed spaghetti application mess that most organizations are dealing with — even if SOA and BPM and the rest make it easier to deal with.
Early next year I'm going to ask Sourcing & Vendor Management professionals to vote on which software companies' licensing policies they most resent as unfair. Fairness is a subjective quality, but it seems to me that some policies penalize customers for circumstances beyond their control that are unrelated to the value they are getting from the software. Others have serious consequences that may not have been apparent to buyers when they agreed to the contract. Fair software pricing charges some companies more than others, but in a logical, transparent way that is related to value. Jim Hagemann Snabe (SAP's co-CEO) explained software pricing best practice extremely well in this recent interview with Computerweekly.com's Warwick Ashford:
"Q: What is SAP doing to meet user demand for greater clarity on licensing and pricing?"
MyCustomer.com recently asked me what my thoughts were about CRM: Why initial CRM projects failed, what has now changed to make deployments successful, and what the future holds for CRM. Here is the first part of my point of view, as well as a link to a series of three published articles from MyCustomer.com.
Question: Nearly a decade ago, estimates suggested that a very large proportion of CRM projects were failing. What were the main problems undermining CRM projects in those days?
Answer: The main problems undermining CRM projects a decade ago were mismatched expectations with reality in three categories: technology, process and people.
The first CRM systems were not fully baked and had large feature holes that were not always communicated to the purchaser. The technology was not intuitive or easy to use. It was hard to implement, had a long time-to-value, and was hard to become proficient in. It was even harder to change the business processes that had been implemented — changes that were necessary to stay in line with evolving business needs.
CRM systems were also difficult to integrate with a company’s IT ecosystem, which meant that many actions needed to be repeated in multiple systems. (For example, consider a CRM system that was not integrated with a company’s email system. A salesperson would have to cut and paste a customer communication from their email correspondence into the CRM system, which was labor intensive and often not done.)
On the heels of Forrester's GRC Market Overview last month, this week we published my Governance, Risk, And Compliance Predictions: 2011 And Beyond report. Based on our research with GRC vendors, buyers, and users, this paper highlights the aggressive regulatory environment and greater attention to risk management as drivers for change. Specifically, here is a brief summary of the top five trends we will see next year:
Increasing vendor competition will continue to bring more choices and more confusion. Strong market growth will encourage more technology and service vendors to get into the market, which means the fragmentation (which I've discussed previously) and confusion will continue.
Can you remember a year when your business both (1) grew in a healthy way and (2) changed more slowly than the year before? Besides a company’s early startup years, such would be the exception, not the rule. So, in 2011, your business is likely to continue accelerating its pace of change. A recent Forrester report, The Top 15 Technology Trends EA Should Watch: 2011 To 2013, named both business rules and SOA policy as items for your watch list — because both of them help accelerate business change.
Back in the mainframe days — and even into minicomputer, client/server, and Web applications — nearly all of the business logic for every application was tightly wrapped up in the application code. A few forward-thinking programmers might have built separate parameter files with a small bit of business-oriented application configuration, but that was about it. But, business changes too quickly to have all of the rules locked up in the code.
Some have tried the route of having businesspeople do their own programming — and many vendor tools through the years have tried creatively (though unsuccessfully) to make development simple enough for that. But business is too complex for businesspeople to do all of their own programming.
Enter business rules, SOA policy, and other ways to pull certain bits of business logic out of the code where it would otherwise be buried. What makes these approaches valuable is that they are targeted, contained, and can have appropriate life cycles built around them — allowing businesspeople to change what they are qualified, authorized, and approved to change.
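To make the idea concrete, here is a minimal sketch (not tied to any particular rules engine; the rule table and field names are purely illustrative) of business logic expressed as data rather than code. Changing a threshold or a discount becomes a rule change with its own approval life cycle, not an application release:

```python
# Hypothetical discount rules externalized from application code.
# In practice this table might live in a rules engine or a
# configuration store owned and maintained by the business side.
DISCOUNT_RULES = [
    {"name": "gold_tier", "condition": lambda o: o["customer_tier"] == "gold", "discount": 0.15},
    {"name": "bulk_order", "condition": lambda o: o["quantity"] >= 100, "discount": 0.10},
]

def apply_discount(order):
    """Return the order total after applying the first matching rule."""
    for rule in DISCOUNT_RULES:
        if rule["condition"](order):
            return order["total"] * (1 - rule["discount"])
    return order["total"]  # no rule matched: full price

order = {"customer_tier": "gold", "quantity": 10, "total": 200.0}
print(apply_discount(order))  # → 170.0 (gold_tier rule matched)
```

The application code only evaluates the rules; what the rules say is owned, versioned, and changed by the people who are qualified to change it.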
Most, if not all, technology improvements need what is commonly referred to as “complementary inputs” to yield their full potential. For example, Gutenberg's invention of movable type wouldn't have been viable without progress in ink, paper, and printing press technology. IT innovations likewise depend on complements to take hold. Internal clouds will affect applications, configuration, monitoring, and capacity management. External clouds will need attention to security and to performance issues related to network latency. Financial data availability is also an important cloud adoption criterion and must be addressed. Without progress in these complementary technologies, the benefits of cloud computing cannot be fully realized.
Internal cloud technology is going to offer embedded physical/virtual configuration management, VM provisioning, orchestration of resources, and most probably, basic monitoring or data collection in an automated environment, with a highly abstracted administration interface. This has the following impact:
More than ever, we need to know where things are. Discovery and tracking of assets and applications in real time is now critical: As configurations can be easily changed and applications easily moved, control of the data center requires complete visibility. Configuration management systems must adapt to this new environment.
Applications must be easily movable. To take advantage of the flexibility offered by orchestration, provisioning, and configuration automation, applications must be easily loaded and configured. This assumes that there is, upstream of the application release, an automated process that will “containerize” the application, its dependencies, and its configuration elements. This will affect the application life cycle (see figure).
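As a rough illustration of that “containerize” step (a hypothetical sketch — the bundle layout and manifest format are invented for this example, not any specific product's), the release process can package the application together with a machine-readable manifest of its dependencies and configuration, so that downstream provisioning can load and configure it automatically:

```python
# Minimal sketch: bundle an application directory plus a manifest
# describing its dependencies and configuration into one archive.
import io
import json
import pathlib
import tarfile

def containerize(app_dir: pathlib.Path, config: dict, dependencies: list, out: pathlib.Path) -> None:
    """Package app_dir and a manifest.json into a gzipped tar archive."""
    manifest = {"config": config, "dependencies": dependencies}
    with tarfile.open(out, "w:gz") as tar:
        tar.add(app_dir, arcname="app")  # the application and its files
        data = json.dumps(manifest, indent=2).encode()
        info = tarfile.TarInfo("manifest.json")
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))  # self-describing metadata
```

Because the archive carries its own dependency and configuration description, an orchestration layer can move, load, and configure the application without manual intervention — which is exactly the life-cycle change described above.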