The Internet of Things is hyped, no question. But let's talk about the INTEGRATION of Things.
It’s been a while since Bosch completed its acquisition of the German BPM and integration vendor Inubit AG in October 2011. Two years later, Inubit has not only settled in well at the Bosch Group; it has become the nucleus of Bosch’s overall software business and helps the traditional manufacturer of automotive parts and consumer electronics embrace the additional business model of a software vendor.
Nevertheless, calling the conference ConnectedWorld signals the repositioning of the former general-purpose BPM and integration software toward the Internet of Things. This is where Bosch, with its dominant automotive footprint and its good market share in home appliances in Europe, is strong. It is a natural move to focus Bosch Software Innovations on the areas of Bosch's core business. In this context, it is no surprise that every second visitor at the show is a Bosch employee who wants to understand whether and how their Bosch units can use the new software assets. Ideally this results not only in internal use but in joint external products. Today the clear majority of Bosch's software revenues come from external customers and are not yet related to other Bosch products.
I recently had a meeting with executives from Tech Mahindra, an India-based IT services company, which was refreshing both for the candor with which they discussed the overall mechanics of a support and integration model with significant components located half a world away and for their insights on the realities and limitations of automation, one of the hottest topics in IT operations today.
On the subject of the mechanics and process behind their global integration work, the eye-opener for me was the depth of internal process behind the engagements. The common (possibly only common in my mind, since I have had less exposure to these companies than some of my peers) mindset of “develop the specs, send them off, and receive code back” is no longer even remotely realistic. Performing a successful complex integration project takes a reliable set of processes that can link the efforts of the roughly 20% to 40% of the staff on-site with the client to the supporting teams back in India, plus a massive investment in project management, development frameworks, and collaboration tools, a hallmark of all the successful Indian service providers.
From the client I&O group's perspective, the relationship between the outsourcer and internal groups becomes much more than an arm's-length process; it becomes a tightly integrated team in which the main visible differentiator is who pays whose salary rather than any strict team, task, or function boundary. For the integrator, this is a strong positive, since it makes it difficult for the client to disengage and gives the teams early knowledge of changes and new project opportunities. From the client side there are both drawbacks and benefits: disengagement is difficult, but knowledge transfer is tightly integrated and efficient.
For decades, firms have deployed applications and BI on independent databases and warehouses, supporting custom data models, scalability, and performance while speeding delivery. It’s become a nightmare to try to integrate the proliferation of data across these sources in order to deliver the unified view of business data required to support new business applications, analytics, and real-time insights. The explosion of new sources, driven by the triple-threat trends of mobile, social, and the cloud, amplified by partner data, market feeds, and machine-generated data, further aggravates the problem. Poorly integrated business data often leads to poor business decisions, reduces customer satisfaction and competitive advantage, and slows product innovation — ultimately limiting revenue.
Forrester’s latest research reveals how leading firms are coping with this explosion using data virtualization, leading us to release a major new version of our reference architecture, Information Fabric 3.0. Since Forrester invented the category of data virtualization eight years ago with the first version of information fabric, these solutions have continued to evolve. In this update, we reflect new business requirements and new technology options, including big data, cloud, mobile, distributed in-memory caching, and dynamic services. Use information fabric 3.0 to inform and guide your data virtualization and integration strategy, especially where you require real-time data sharing, complex business transactions, more self-service access to data, integration of all types of data, and increased support for analytics, including predictive analytics.
Information fabric 3.0 reflects significant innovation in data virtualization solutions, including:
“Figuring out how to think about the problem.” That’s what Albert Einstein said when asked what single event was most helpful in developing the Theory of Relativity. Application integration is a problem. A big problem. Not to mention data, B2B, and other domains of integration. As an industry analyst and solution architect, what I’m most interested in first is how to think about the problem.
Pop Quiz: The Goal of Integration
Which of the following statements best articulates the goal of integration strategy?
A. The goal of integration is to keep data in sync across two or more siloed applications.
B. The goal of integration is to improve business outcomes by achieving consistent, coherent, effective business operations.
The correct answer is B. Was that too easy? Apparently not, because most of the integration strategies I see are framed as if the answer were A. Most, but not all — and it’s the ones framed around B that I’m most interested in. Here’s the difference:
A-style integration centers on technology. It begins with data and business logic fractured across application silos, and then asks, “How can integration technologies make it easier to live with this siloed mess?”
B-style integration centers on business design. It begins with a businessperson’s view of well-oiled business operations: streamlined processes, consistent transactions, unified tools for each user role, purpose-built views of data, and the like. It designs these first — that is, it centers on business design — and then asks, “How can integration technologies give us coherent business operations despite our application silos?”
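To make the contrast concrete, here is a minimal sketch of B-style thinking in Python. All record shapes and field names are hypothetical: the point is that you design the canonical, business-centered view of a customer first, then give each silo one mapping into it, rather than wiring silos to each other point to point.

```python
# Hypothetical silo records: a CRM and a billing system each hold a
# fragment of the customer under their own field names.
crm_record = {"cust_id": "C-1001", "full_name": "Ada Lovelace", "segment": "enterprise"}
billing_record = {"account_no": "C-1001", "balance_due": 120.50, "currency": "USD"}

def to_canonical(crm: dict, billing: dict) -> dict:
    """Map two silo schemas into one canonical customer view.

    The canonical model is designed from the business's view of a
    customer; each silo needs only one mapping into it, instead of a
    point-to-point mapping to every other silo.
    """
    assert crm["cust_id"] == billing["account_no"], "records must refer to the same customer"
    return {
        "customer_id": crm["cust_id"],
        "name": crm["full_name"],
        "segment": crm["segment"],
        "open_balance": billing["balance_due"],
        "currency": billing["currency"],
    }

customer = to_canonical(crm_record, billing_record)
```

Adding a third silo (say, a support-ticket system) then means writing one more mapping into the canonical model, not two more point-to-point mappings.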
I've noticed a bit of a disturbing pattern of late in my cloud discussions with clients: they have been talking about hybrid cloud in the future tense. If you are planning for hybrid down the road, I have a wake-up call for you. Too late: you are already hybrid.
If your company has even a single SaaS application in use today, I can almost guarantee it's connected to something inside your data center, which gives you a hybrid cloud. So hybrid isn't a future state that arrives after you have a private cloud in place and IT Ops chooses to connect that private cloud to a public cloud. Look at it through the lens of a business process or application service composed of different components, some cloud-based, some on-premises. From an Infrastructure & Operations perspective, hybrid cloud means a cloud service connected to any other corporate resource (a back-office app, your web site, your intranet, another SaaS app you have under contract, and yes, even your private cloud). Any of these types of connections presents the same integration impact, whether you established the connection or not. If you are like the typical enterprise that answered our Forrsights Q4 2012 Software Survey, you have more than six SaaS applications in place today (that you know about), so cloud integration is likely well established already. And about one third of the developers who responded to our Forrsights Q1 2013 Developer Survey said they have already deployed applications to the public cloud; twenty-five percent also admitted to putting application integrations in place.
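A minimal sketch can illustrate why one SaaS connection already makes you hybrid. Here a hypothetical SaaS contacts API is stubbed out, and an in-memory SQLite database stands in for an on-premises back-office system; in a real deployment the fetch would be an authenticated HTTPS call to the vendor's endpoint.

```python
import sqlite3

def fetch_saas_contacts():
    """Stub for a hypothetical SaaS API call.

    In a real hybrid setup this would be an authenticated HTTPS request
    to the vendor's REST endpoint; the record shape here is an assumption.
    """
    return [
        {"id": "c1", "email": "jane@example.com"},
        {"id": "c2", "email": "raj@example.com"},
    ]

def sync_to_on_prem(conn: sqlite3.Connection) -> int:
    """Upsert SaaS contacts into an on-premises table.

    The moment something like this runs on a schedule, the SaaS app and
    the data center are integrated: a hybrid cloud, whether IT planned
    the connection or not.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS contacts (id TEXT PRIMARY KEY, email TEXT)")
    rows = [(c["id"], c["email"]) for c in fetch_saas_contacts()]
    conn.executemany(
        "INSERT INTO contacts (id, email) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        rows,
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
synced = sync_to_on_prem(conn)
```

The upsert makes the sync safe to rerun, which is exactly how these quiet, unplanned integrations tend to end up on a cron schedule.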
If your organization is like nearly every other one I've talked to in the past 20+ years, you have a spaghetti chart of integration connections between all the siloed applications that run your business. Your customer is fractured across five applications. Your fulfillment process is broken across eight applications. Just try to pull together the data necessary to tell how profitable one of your products is. Or, as you implement mobile, external APIs, custom B2B connections, and more, how will you provide consistent, coherent access to your transactions and data?
Making sense of all the mess has been an important priority for years. The question is "how?" Forrester's latest research finds that it's time for a new kind of integration strategy. We call it "Digital Business Design":
A business-centered approach to solution architecture, implementation, and integration that brings business and technology design together by placing design priority on user roles, business transactions, processes, canonical information, events, and other business aspects that embody a complete definition of a business.
Two months ago, we announced our upcoming Forrester Forrsights Software Survey, Q4 2010. Now the data is back from more than 2,400 respondents in North America and Europe and provides us with deep and sometimes surprising insights into the software market dynamics of today and the next 24 months.
We’d like to give you a sneak preview of interesting results around some of the most important trends in the software market: cloud computing, integrated information technology, business intelligence, mobile strategy, and overall software budgets and buying preferences.
Companies Start To Invest More Into Innovation In 2011
After the recent recession, companies are starting to invest more in 2011, with 12% and 22% of companies planning to increase their software budgets by more than 10% or between 5% and 10%, respectively. At the same time, companies will invest a significant part of the additional budget into new solutions. While 50% of the total software budgets are still going into software operations and maintenance (Figure 1), this number has significantly dropped from 55% in 2010; spending on new software licenses will accordingly increase from 23% to 26% and custom-development budgets from 23% to 24% in 2011.
Cloud Computing Is Getting Serious
In this year’s survey, we took a much deeper look into companies’ strategies and plans around cloud computing, going beyond simple adoption numbers. We tested to what extent cloud computing is making its way from complementary services into business-critical processes, replacing core applications, and moving sensitive data into public clouds.
Fujitsu? Who? I recently attended Fujitsu’s global analyst conference in Boston, which gave me an opportunity to check in with the best-kept secret in the North American market. Even Fujitsu execs admit that many people in this largest of IT markets think Fujitsu has something to do with film, and few of us have ever seen a Fujitsu system installed in the US unless it was a POS system.
So what is the management of this global $50 billion information and communications technology company, with a competitive portfolio of client, server, and storage products and a global service and integration capability, going to do about its lack of presence in the world’s largest IT market? In a word, invest. Fujitsu’s management, judging from its history and what it has disclosed of its plans, intends to invest in the US over the next three to four years to consolidate its estimated $3 billion in North American business into a more manageable (simpler) set of operating companies and to double down on hiring and selling into the North American market. The fact that they have given themselves multiple years to do so is very indicative of what I have always thought of as both Fujitsu’s greatest strength and one of its major weaknesses: they operate on Japanese time, so to speak. For an American company to undertake to build a presence over multiple years with seeming disregard for quarterly earnings would be almost unheard of, so Fujitsu’s management gets major kudos for that. On the other hand, years of observing them from a distance also lead me to believe that their approach to solving problems inherently lacks the sense of urgency of some of their competitors.
Software AG announced today a significant change in its executive structure. After the acquisition of webMethods back in 2007, the second-largest software vendor in Germany acquired IDS Scheer last year, a topic we explored in this report.
If you have followed Software AG over this period, you might have noticed that the way CEO Karl-Heinz Streibich runs a post-merger process can involve dramatic disruptions in the company's executive structure. Dave Mitchell, the former webMethods CEO, left a few months after that acquisition. Today, the chief product officer, Dr. Peter Kürpick, surprisingly left the company. Peter had been a member of the executive board since 2005, and although his contract officially runs until 2013, he is leaving immediately at his own request. He stood for the successful turnaround of Software AG’s product strategy and repositioned Software AG from an outmoded mainframe shop into a leading global integration player. The successful merging of Software AG’s mainframe and integration know-how with the newer webMethods product stack into one interoperable integration stack was one of Peter’s major achievements. Peter also took over responsibility for Software AG’s ETS (mainframe) product strategy after the integration business reached solid stability. He would have had the skills and experience to create a consistent technology stack spanning from the mainframe through the webMethods integration layer up to the business architecture tools of IDS Scheer (ARIS).