Semantic Technology Is Not Only For Data Geeks

You can't bring up semantics without someone inserting an apology for the geekiness of the discussion. If you're a data person like me, geek away! But for everyone else, it's a topic best left alone. Well, like every geek before them, the semantic geeks now have their day — and may just rule the data world.

It begins with a seemingly innocent set of questions:

"Is there a better way to master my data?"

"Is there a better way to understand the data I have?"

"Is there a better way to bring data and content together?"

"Is there a better way to personalize data and insight to be relevant?"

Semantics discussions today are born out of the data chaos that our traditional data management and governance capabilities are struggling under. They're born out of the fact that even with the best big data technology and analytics being adopted, business stakeholder satisfaction with analytics decreased by 21% from 2014 to 2015, according to Forrester's Global Business Technographics® Data And Analytics Survey, 2015. Innovative data architects and vendors realize that semantics is the key to bringing context and meaning to our information so we can extract those much-needed business insights at scale and, more importantly, personalized.

Read more

Agile Development And Data Management Do Coexist

A frequent question I get from data management and governance teams is how to stay ahead of, or on top of, the Agile development process that app dev pros swear by. New capabilities are spinning out faster and faster, with little attention paid to compliance with data standards and policies.

Well, if you can't beat them, join them . . . and that's what your data management pros are doing, jumping into Agile development for data. 

Forrester's survey of 118 organizations shows that just over half have implemented Agile development in some shape or form to deliver on data capabilities. While they lag about one to two years behind app dev's adoption, the results are already beginning to show: a better handle on design and architectural decisions, improved data management collaboration, and better alignment of developer skills to the tasks at hand.

But we have a long way to go. The number one reason to adopt Agile development is to speed up the release of data capabilities. The problem is, speed is often the only reason it is adopted. Lost in the rush for speed is Agile development's key value: quality. So, while data management is getting it done, it may be sacrificing the value new capabilities bring to the business.

Let's take an example. Where Agile makes sense to start is where teams can quickly spin up data models and integration points in support of analytics. Unfortunately, this capability delivery may be restricted to a small group of analysts who need access to data. Score "1" for moving a request off the list; score "0" for scaling insights widely to where action will be taken quickly.

Read more

Are Data Preparation Tools Changing Data Governance?

First there was Hadoop. Then there were data scientists. Then came Agile BI on big data. Drum roll, please . . . bum, bum, bum, bum . . .

Now we have data preparation!

If you are as passionate about data quality and governance as I am, then the 5+ year wait for a scalable capability to take on data trust is amazingly validating. The era of "good enough" when it comes to big data is giving way to an understanding that the way analysts got away with "good enough" was through a significant amount of manual data wrangling. As an analyst, it must have felt like your parents saying you can't see your friends or play outside until you've cleaned your room (and if it's anything like my kids' rooms, that's a tall order).

There is no denying that analysts are the first to benefit from data preparation tools such as Alteryx, Paxata, and Trifacta. For them, it's a matter of shortening the time to insight. What is still unrecognized in the broader data management and governance strategy is that these early forays are laying the foundation for data citizenry and the cultural shift toward a truly data-driven organization.

Today's data reality is that consumers of data are like any other consumers: they want to shop for what they need. This data consumer journey begins with a look at their own spreadsheets, databases, and warehouses. When they can't find what they want there, data consumers turn to external sources such as partners, third parties, and the web. Data preparation tools are how they assess the value of that data and ultimately decide whether to procure it, and possibly pay for it. The other outcome of this data-shopping experience is that consumers take on the risk and accountability for the value of the data as it is introduced into analysis, decision-making, and automation.

Read more

Data Governance and Data Management Are Not Interchangeable

Since when did data management and data governance become interchangeable?

This is a question that has both confounded and frustrated me. Because business units increasingly drive decision making and hold the purse strings on technology purchases, data management vendors have pursued business stakeholders, and in the process the term data governance was hijacked to snuff out the bad taste of IT data projects gone sour.

The funny thing is, vendors actually began drinking their own marketing Kool-Aid and now think of their MDM, quality, security, and lifecycle management products as data governance tools and solutions. Storage and virtualization vendors are even starting to grok this, claiming they govern data. Big data vendors jumped over data management altogether and simply call their catalog, security, and lineage capabilities data governance.

Yes, this is a pet peeve of mine - just as data integration is now called blending, and data cleansing and transformation are now called wrangling or data preparation. But more on that in another blog . . .

First, you (vendor or data professional) cannot simply sweep the history of legacy data investments that were limited in results and painful to implement under the Mad Men carpet. Own it, and address the challenges through technology innovation rather than words.

Read more

Let's Break All The Data Rules!

When I think about data, I can't help but think about hockey. As a passionate hockey mom, it's hard to separate my conversations about data all week with clients from the practices and games I sit through, screaming encouragement to my son and his team (sometimes to the embarrassment of my husband!). So when I recently saw a documentary on the building of the Russian hockey team that our miracle US hockey team beat at the 1980 Olympics, the story of Anatoli Tarasov stuck with me.

Before the 1960s, Russia didn't have a hockey team. Then the Communist party determined that it was critical that Russia build one — and compete on the world stage. It selected Anatoli Tarasov to build and coach the team. He couldn't see films on hockey. He couldn't watch teams play. There was no reference on how to play the game. And yet, he built a world-class hockey club that not only beat the great Nordic teams but went on to crush the Canadian teams that were the standard for hockey excellence.

This is a lesson for us all when it comes to data. Do we stick with our standards and recipes from Inmon and Kimball? Do we follow check-box assessments from CMMI, DM-BOK, or TOGAF's information architecture framework? Do we rely on governance compliance to police our data?

Or do we break the rules and create our own that are based on outcomes and results? This might be the scarier path. This might be the riskier path. But do you want data to be where your business needs it, or do you want to predefine, constrain, and bias the insight?

Read more

Is Zombie Data Taking Over?

It is easy to get ahead of ourselves with all the innovation happening with data and analytics. I wouldn't call it hype, as that would imply no value or competency has been achieved. But I would say that what is bright, shiny, and new is always more interesting than the ordinary. 

And, to be frank, there is still a lot of ordinary in our data management world.

In fact, over the past couple of weeks, my discussions with companies have focused, unusually, on the ordinary. The questions centered on the basic, foundational aspects of data management and governance — and they came from companies I have seen talk publicly about their data management successes.

"Where do I clean the data?"

"How do I get the business to invest in data?"

"How do I get a single customer view of my customer for marketing?"

What this tells me is that companies are under siege by zombie data. 

Data lives in our businesses under outdated data policies and rules. Data processes and systems persist single-purpose data. As data pros turn over application rocks and navigate through database bogs to centralize data for analytics and virtualize views for new data capabilities, zombie data lurches out to consume more of the environment, blocking other potential insight to preserve the status quo.

The questions you and your data professional cohorts are asking, as illustrated above, are anything but basic. The fact that these foundational building blocks have to be assessed once again demonstrates that organizations are on a path to crush the zombie data siege, democratize data and insight, and advance the business. 

Keep asking basic questions — if you aren't, zombie data will eventually take over, and you and your organization will become part of the walking dead.

Read more

Yellow Elephants and Pink Unicorns Don't Tell The Real Big Data Story

Big data and Hadoop (Yellow Elephants) are so synonymous that you can easily overlook the vast landscape of architecture that goes into delivering on big data's value. Data scientists (Pink Unicorns) are likewise elevated to god status as the only role that can truly harness the power of big data, making insight from big data seem as far away as a manned journey to Mars. However, this week, as I participated in the DGIQ conference in San Diego and colleagues and friends attended the Hadoop Summit in Belgium, it became apparent that organizations are waking up to the fact that there is more to big data than a "cool" playground for the privileged few.

The perspective that the insight supply chain is the driver and catalyst of action from big data is starting to take hold. Capital One, for example, illustrated that if insights from analytics and data in Hadoop are going to influence operational decisions and actions, you need the same degree of governance you established in traditional systems. In a conversation, Amit Satoor of SAP Global Marketing described a performance apparel company linking big data to operational and transactional systems at the edge of customer engagement, and noted that it had to be easy for application developers to implement.

Hadoop distribution, NoSQL, and analytics vendors need to step up the value proposition to be more than where the data sits and how sophisticated you can get with the analytics. In the end, if you can't govern quality, security, and privacy at the scale of edge end-user and customer engagement scenarios, the efforts to migrate data to Hadoop and the investment in analytic tools cost more than dollars; they cost you your business.

Read more

3 Ways Data Preparation Tools Help You Get Ahead Of Big Data

The business has an insatiable appetite for data and insights. Even in the age of big data, the number one issue for business stakeholders and analysts is getting access to the data. Once access is achieved, the next step is "wrangling" the data into a usable data set for analysis. The term "wrangling" itself creates a nervous twitch, unless you enjoy the rodeo. But the goal of the business isn't to be an adrenaline junkie. The goal is to get insight that helps it smartly navigate increasingly complex business landscapes and customer interactions. Those who get this have introduced a softer term, "blending" - yet another term dreamed up by data vendor marketers to avoid the dreaded conversation about data integration and data governance.

The reality is that you can't market-message your way out of the fundamental problem: big data is creating data swamps, even in the best-intentioned efforts. (This is the reality of big data's first principle of schema-less data.) Data governance for big data is primarily relegated to cataloging data and its lineage, which serves the data management team but creates a new kind of nightmare for analysts and data scientists - working with a card catalog that rivals the Library of Congress. Dropping in a self-service business intelligence tool or advanced analytic solution doesn't solve the problem of familiarizing the analyst with the data. Analysts will still spend up to 80% of their time just trying to create the data set from which to draw insights.
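To make the "wrangling" concrete, here is a minimal sketch of the kind of manual cleanup that eats so much analyst time and that data preparation tools automate: standardizing fields and de-duplicating records pulled from different sources. The records, field names, and normalization rules are purely hypothetical.

```python
# Illustrative only: two "sources" with inconsistently formatted customer
# records, normalized and de-duplicated into one usable data set.

def normalize(record):
    """Standardize casing and strip whitespace so records can be compared."""
    return {
        "email": record["email"].strip().lower(),
        "name": record["name"].strip().title(),
    }

def dedupe(records):
    """Keep the first record seen for each email address."""
    seen = {}
    for rec in map(normalize, records):
        seen.setdefault(rec["email"], rec)
    return list(seen.values())

# Hypothetical rows: the same customer, captured two different ways.
crm_rows = [{"email": " Pat@Example.COM ", "name": "pat jones"}]
web_rows = [{"email": "pat@example.com", "name": "Pat Jones "}]

cleaned = dedupe(crm_rows + web_rows)
print(cleaned)  # one merged record instead of two conflicting ones
```

Trivial as it looks, multiply this by hundreds of fields and dozens of sources and it is easy to see where the 80% goes.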

Read more

Beyond Big Data's Vs: Fast Data Is More Than Data Velocity

When you hear the term fast data, the first thought is probably the velocity of the data. That's not unusual in the realm of big data, where velocity is one of the V's everyone talks about. However, fast data encompasses more than a data characteristic; it is about how quickly you can get and use insight.

Working with Noel Yuhanna on an upcoming report on how to develop your data management roadmap, we found speed to be a continuous theme. Clients consistently call out speed as what holds them back. How they interpret what speed means is the crux of the issue.

Technology management thinks about how quickly data is provisioned. The solution is a faster engine: in-memory grids like SAP HANA become the tool of choice. This is the wrong way to think about it. Simply serving up data with faster integration and a higher-performance platform is what we have always done - better box, better integration software, better data warehouse. Why use the same solution that, in a year or two, runs up against the same wall?

The other side of the equation is that sending data out faster ignores what business stakeholders and analytics teams want. Speed to the business encompasses self-service data acquisition, faster deployment of data services, and faster changes. The reason: they need to act on the data and insights.

The right strategy is to create a vision oriented toward business outcomes. Today's reality is that it is no longer about being first to market; we have to be first to value - first to value with our customers, and first to value with our business capabilities. The speed at which insights are gained, and ultimately how they are put to use, is your data management strategy.

Read more

The Theory of Data Trust Relativity

Since the dawn of big data, data quality and data governance professionals have been shouting from the rooftops about the impact of dirty data. Data scientists are just as loudly yelling back that "good enough" data is the new reality. Data trust has turned relative.

Consider these data points from the recent Forrester Business Technographics Survey On Data And Analytics and our Online Global Survey On Data Quality And Trust:

  • Nearly 9 out of 10 data professionals rate data quality as a very important or important aspect of information governance
  • 43% of business and technology management professionals are somewhat confident in their data, and 25% are concerned
Read more