Information Fabric 3.0 Delivers The Next Generation Of Data Virtualization

Noel Yuhanna

For decades, firms have deployed applications and BI on independent databases and warehouses, supporting custom data models, scalability, and performance while speeding delivery. Integrating the data that has proliferated across these sources to deliver the unified view of business data required for new business applications, analytics, and real-time insights has become a nightmare. The explosion of new sources, driven by the triple-threat trends of mobile, social, and the cloud and amplified by partner data, market feeds, and machine-generated data, further aggravates the problem. Poorly integrated business data often leads to poor business decisions, reduces customer satisfaction and competitive advantage, and slows product innovation, ultimately limiting revenue.

Forrester’s latest research reveals how leading firms are coping with this explosion using data virtualization, leading us to release a major new version of our reference architecture, Information Fabric 3.0. Since Forrester invented the category of data virtualization eight years ago with the first version of information fabric, these solutions have continued to evolve. This update reflects new business requirements and new technology options, including big data, cloud, mobile, distributed in-memory caching, and dynamic services. Use information fabric 3.0 to inform and guide your data virtualization and integration strategy, especially where you require real-time data sharing, complex business transactions, more self-service access to data, integration of all types of data, and increased support for analytics and predictive analytics.

Information fabric 3.0 reflects significant innovation in data virtualization solutions, including:

Read more

Cisco's Acquisition Of Composite Software Brings Data Intelligence To The Networks

Noel Yuhanna

Cisco’s acquisition of Composite Software is unlike the acquisitions it has made in the past. This one makes networks more knowledgeable about data, a piece that has been missing from Cisco’s framework.

Today, the networks that digital information flows through are not data-aware; to a network, data is just bits and bytes. There is no built-in intelligence to tell routers that some data needs higher priority when routed, or that it needs to travel to another location before reaching its destination. That data intelligence piece is missing, and this is where Composite Software comes in. Composite Software is a data virtualization company that knows what data is being used, how the data needs to be transformed and routed, and which data has higher priority.
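To make the idea of data-aware routing concrete, here is a minimal sketch, purely illustrative and not based on any Cisco or Composite Software API, of how a layer with knowledge of the data could map metadata about a payload to a network priority class instead of treating everything as opaque bytes. All names and the policy itself are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """Metadata a data-aware layer could expose about a payload (hypothetical)."""
    source: str            # e.g., an order database or a log archive
    category: str          # e.g., "transactional", "analytical", "bulk"
    latency_sensitive: bool

def routing_priority(flow: DataFlow) -> int:
    """Map data-level metadata to a priority class (higher = more urgent).

    Illustrative policy only: latency-sensitive transactional data outranks
    analytical queries, which outrank bulk transfers.
    """
    if flow.category == "transactional" and flow.latency_sensitive:
        return 3
    if flow.category == "analytical":
        return 2
    return 1  # bulk or unknown traffic gets the lowest class

# Example: an order lookup is prioritized over an archive replication job.
print(routing_priority(DataFlow("orders_db", "transactional", True)))  # 3
print(routing_priority(DataFlow("log_archive", "bulk", False)))        # 1
```

The point of the sketch is simply that once metadata about the data travels with it, the network has something to act on; without it, every flow looks the same.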

Data virtualization provides an abstraction layer over information from many disparate data sources, so it can integrate data across applications, databases, files, virtualized platforms, clouds, and more. Composite Software is one of the leading data virtualization companies and is often shortlisted by customers, largely because of its strong product offering. It supports some of the most complex data virtualization deployments in existence, in part because it has been active in this market as long as, or longer than, any other player. A key component in any large data virtualization implementation is the network, which must ensure consistent performance when accessing all of the disparate data, especially when that data is spread across many servers, clouds, and virtualized platforms.
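As a rough illustration of the abstraction-layer idea, the sketch below presents one unified view over two differently shaped sources, a relational-style table and a document-style store, without copying either into a new repository. The data, names, and structure are invented for the example; real data virtualization products add query optimization, caching, security, and support for many more source types.

```python
# Minimal sketch of a virtual, federated view over two disparate sources.

relational_customers = [  # stand-in for rows from a SQL database
    {"id": 1, "name": "Acme Corp", "region": "EMEA"},
    {"id": 2, "name": "Globex", "region": "APAC"},
]

document_orders = [  # stand-in for documents from a REST API or file store
    {"customer_id": 1, "total": 1200.0},
    {"customer_id": 1, "total": 300.0},
    {"customer_id": 2, "total": 75.5},
]

def customer_order_view():
    """Join the two sources on demand and expose one unified schema."""
    for customer in relational_customers:
        orders = [o for o in document_orders if o["customer_id"] == customer["id"]]
        yield {
            "customer": customer["name"],
            "region": customer["region"],
            "order_count": len(orders),
            "total_spend": sum(o["total"] for o in orders),
        }

# Consumers query the virtual view; the underlying data stays where it lives.
for row in customer_order_view():
    print(row)
```

Because the join happens at query time rather than through a copy, the quality of the network between the consumer and the underlying sources directly determines how well such a view performs, which is the connection to Cisco's interest.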

Read more

Enterprise Data Management Is Not The Holy Grail

Brian Hopkins

From my first days as a baby architect, I was spoon-fed the idea that enterprise data management (EDM) was the solution to our data woes. Some call it enterprise information management, or other names that all describe a holistic approach to managing data that is business-led and centered on stewardship and governance. The DMBOK provides a picture that describes this concept very well; check it out.

Here’s the problem: Most firms are not able to internalize this notion and act accordingly. There are myriad reasons why this is so, and we can all list off a bunch of them if we put our minds to it. Top of my list is that the lure of optimizing for next quarter often outweighs next year’s potential benefits.

Here’s another problem: Most EAs cannot do much about this. We are long-term, strategic people who can clearly see the benefits of EDM, which may lead us to spend a lot of time promoting the virtues of this approach. As a result, we get bloody bruises on our heads and waste time that could be spent doing more-productive things.

I do think that taking a long-term, holistic approach is the best thing to do; in my recently published report "Big Opportunities In Big Data," I encourage readers to maintain this attitude when considering data at extreme scale. But we need to pursue short-term fixes as well. Let me go a step further and say that making short-term progress on nagging data management issues, with solutions that take months rather than years, is more important to our firms than being the EDM town crier. Hopefully my rationale is clear: We can be more effective this way as long as our recommendations keep the strategic in mind.

Read more