The advent of Big Data has certainly created bigger opportunities. But success in implementing BIG DATA initiatives lies in being able to extract RELEVANT KNOWLEDGE from the heaps of data generated. This onerous task is akin to finding a needle in a haystack. More importantly, the right questions need to be asked within this sphere: are conversions the right metric, or simply the number of orders? For what sorts of products is price elasticity even relevant?

The key question is often how many of these signals are RELEVANT, and how our current data can help us generate these solutions. It is imperative to understand that the great hope of micro-segmentation or micro-customization can often be a mirage. In our desire to micro-customize, we tend to micro-segment, and our mathematical models therefore run the risk of overfitting.

Thus, if we intelligently extract long-term patterns and qualify them with the surplus of digital information, we are doing the right thing. However, if we look for too much depth within a small bucket of time, we enter the territory of overfitting: looking for more truth than actually exists in the current data.
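The overfitting risk described above can be illustrated with a minimal sketch (not from the article; the data and model choices here are hypothetical). A simple model captures a long-term linear trend in a small, noisy sample, while an overly flexible model, analogous to micro-segmenting a short window of data, fits the noise perfectly but generalizes poorly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a stable long-term trend (y = 2x) plus noise.
x_train = np.linspace(0, 1, 10)                          # small "bucket of time"
y_train = 2.0 * x_train + rng.normal(0, 0.2, size=10)
x_test = np.linspace(0, 1, 50)                           # fresh data from the same process
y_test = 2.0 * x_test + rng.normal(0, 0.2, size=50)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

simple = np.polyfit(x_train, y_train, deg=1)    # long-term pattern: a straight line
flexible = np.polyfit(x_train, y_train, deg=9)  # "micro-segmented" fit: one wiggle per point

train_simple, train_flexible = mse(simple, x_train, y_train), mse(flexible, x_train, y_train)
test_simple, test_flexible = mse(simple, x_test, y_test), mse(flexible, x_test, y_test)

# The flexible model wins on the training window but loses badly out of sample.
print(f"train MSE: simple={train_simple:.4f}  flexible={train_flexible:.4f}")
print(f"test  MSE: simple={test_simple:.4f}  flexible={test_flexible:.4f}")
```

The degree-9 fit drives training error to nearly zero, yet its out-of-sample error explodes, which is precisely the "more truth than exists in the data" failure mode: the pattern it found will not stand the test of time.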

Long-term patterns, qualified by additional information, on the other hand, help us create immense value: preventing value-chain snags and making appropriate responses to potentially disappointed customers. The ability to proactively attend to such customers and prospects is what makes data both BIG and PREDICTIVE. By contrast, micro-segmentation on short-term data can yield insights that will not stand the test of time, wasting an excessive investment of time and resources.