Context – the missed side of “Big” data

Data challenges continue to occupy our mindshare, as huge amounts of time, effort, and cost are invested in finding solutions. In most cases, technology advancements seem to guide the pursuit of the right solution. This is in line with the traditional approach in which information technology (IT) teams build expansive data landscapes while business teams dictate requirements from the outside looking in. The key premise behind such an approach is a relentless focus on data.

Most such solutions have attempted to deliver efficient storage, speedy retrieval, and user-friendly reporting through huge data warehouses, data marts, and decision support tools. This heavy focus on data management resolved some data access and retrieval issues. However, it did not deliver the right enterprise intelligence capability, precisely because of that excessive focus on managing the "data."

Fast forward to today, and the new paradigms of "Big" data and Internet of Things (IoT) technologies are being offered in the hope of solving volume, variety, velocity, and possibly veracity issues. The technology certainly helps, as "Big" data offers many exciting new avenues and possibilities. But as we pursue new-age data management, we must ask: should we continue with our "data" management obsession? Will it lead to the right business intelligence outcome? What lessons have we learned from past undertakings?

Today, exabyte-scale data is already real as we begin to scale the zettabyte and yottabyte peaks. But to what end? The hard truth is that unwieldy data volumes rarely lead to meaningful information, knowledge, or intelligence. The traditional belief that if data exists, it can be used has left us with humongous data repositories that are of little use.

The lesson: an obsession with data, sans proper "context," can neither tell a story nor paint a picture.

The “context of data”

A dictionary definition of the word "context" is:

"The parts of a written or spoken statement that precede or follow a specific word or passage, usually influencing its meaning or effect" (courtesy

Most meanings and interpretations of "context" emphasize the importance of understanding the complete background in order to gain a holistic understanding and put something to its proper, exact use. A simple example is the price of a product; the question is, "Does this price, as a number, represent the value of the product to the consumer?" The short answer is "no," not unless the consumer can relate it to additional facets such as quality, features, and usage. The value of the product becomes clear only when the context of the price is described or understood.
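The price example above can be sketched in code. This is a minimal, illustrative sketch only: the `ContextualPrice` class and its facets (quality rating, features, typical usage) are hypothetical attributes chosen for this example, not a prescribed schema.

```python
from dataclasses import dataclass

# A bare number: a "price" with no context. On its own, it says
# nothing about the value of the product to the consumer.
price = 499.00

# The same price enriched with illustrative context facets.
@dataclass
class ContextualPrice:
    amount: float
    currency: str
    quality_rating: float  # e.g., an average review score out of 5
    features: list
    typical_usage: str

    def describe(self) -> str:
        # Communicate the price together with its context.
        return (f"{self.currency}{self.amount:.2f} for a "
                f"{self.quality_rating}/5-rated product with "
                f"{len(self.features)} key features, "
                f"used for {self.typical_usage}")

laptop_price = ContextualPrice(
    amount=499.00,
    currency="$",
    quality_rating=4.3,
    features=["8 GB RAM", "256 GB SSD"],
    typical_usage="everyday browsing and office work",
)
print(laptop_price.describe())
```

The point is not the class itself but the contrast: the bare `price` variable is data, while `describe()` conveys data plus context.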

The poor use of, and poor return on investment from, huge data repositories is due to what is missing: the "context" of the data in question. Without context, consumers cannot understand the meaning and background of the data well enough to put it to its correct purpose. Presently, many enterprises are trying to make sense of their expansive data landscapes so their users have a purpose for that data. Technology now offers uninhibited capabilities to capture and store new data. But the question is: how do we capture the "context"?

Cueris’ WeaveXT™ approach prescribes a context-focused approach in which context definition starts upstream, in sync with user demand for a specific outcome, decision support, or analytical intelligence. The approach comprises the following major steps:

  1. Start with the context – describe all facets of the problem. To comprehend the problem, understand historical events, activities, outcomes, and stakeholder roles. This collaborative, interactive step is the foundation for evolving high-quality data analysis.
  2. Identify, acquire, and prepare the right data (in quality and quantity) to enrich the context. Quality implies data whose meaning is accurate, that is available on time, that comes from the right source, and so on. The goal is to present holistic views of the data over its entire life-cycle, from inception through consumption. The right data leads to the weaving of context strands that support a variety of uses, each based on its respective context.
  3. Next is the analysis algorithm design, using the context strands populated with data in the preceding step. This iterative step focuses on evolving algorithms that support analysis targeting specific outcomes.
  4. In this last step, the analysis results are prepared for presentation to end users. The focus is on communicating outcomes such that consumers can easily relate to, and understand, the prediction in reference to the context. The goal is to provide actionable intelligence that supports diligent decision making and proactive mitigation actions.
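The four steps above can be sketched as a simple pipeline. This is a hypothetical illustration of the flow, not WeaveXT itself: all function names, the dictionary-based "context strand" representation, and the stand-in averaging "algorithm" are assumptions made for the example.

```python
# Illustrative four-step, context-first pipeline (hypothetical names).

def define_context(problem: str) -> dict:
    # Step 1: start with the context - capture facets of the problem,
    # its history, and the stakeholder roles involved.
    return {"problem": problem,
            "stakeholders": ["business analyst", "data scientist"],
            "history": []}

def acquire_data(context: dict, records: list) -> dict:
    # Step 2: attach only the right data (here, records from a
    # trusted source) to the context as "strands."
    context["strands"] = [r for r in records if r.get("source") == "trusted"]
    return context

def analyze(context: dict) -> float:
    # Step 3: a stand-in analysis algorithm over the context strands
    # (a plain average, purely for illustration).
    values = [r["value"] for r in context["strands"]]
    return sum(values) / len(values) if values else 0.0

def present(context: dict, result: float) -> str:
    # Step 4: communicate the outcome in reference to the context,
    # so consumers can relate the number back to the problem.
    return f"For '{context['problem']}': predicted value {result:.1f}"

records = [{"source": "trusted", "value": 10.0},
           {"source": "trusted", "value": 14.0},
           {"source": "unknown", "value": 99.0}]
ctx = acquire_data(define_context("monthly demand forecast"), records)
print(present(ctx, analyze(ctx)))
```

Note how the context object is created first and carried through every step; the data, the algorithm, and the presentation all hang off it rather than the other way around.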

Context framing starts with subject matter experts, data scientists, and other stakeholder teams. Each step concludes with a list of specific technical or non-technical actions, which may include the deployment of particular tools, processes, or technologies. This conversation emphasizes the importance of defining the context upstream, during the early stages of analytics.
