In the world of buzz, big data has been all the rage, with lots of companies chasing the space from lots of different angles. In my previous life, I worked with business intelligence leaders, and I know their work does not come cheap. You are usually looking at a minimum of $250K just to fire up their engines, and then each engagement tends to be unique or a one-off. As John and I were thinking about our startup, we kept coming back to the "common" data challenges of emerging companies for whom the world of big data was outside their resources. When we started to engage startups, we found most would love to have a big data problem with tons of customers, but the reality is that most firms don't have that critical mass. Given their limited information about their customers, it is time to apply a small data approach.

In researching what is being said about the value of small data, and how to amplify it, I asked: what would you need? Naturally, the first place I went to investigate small data was Google, where I found two interesting sources. Martin Lindstrom, a branding expert, has written a book, Small Data. Lindstrom approaches the topic from a consumer market perspective, through careful observation of customers in their own environments. I also found Allen Bonde of Small Data Group, a former McKinsey and Yankee Group consultant with some of the earliest thinking in this area. Bonde has proposed a definition in which small data is focused on people, whereas big data is about machines. Although Lindstrom and Bonde come at small data from a B2C and product development perspective, I keep thinking there is a B2B side, where close scrutiny of individual customer information can give you bigger company and industry targeting insights.

The challenge with small data is that it is limited, she says, stating the obvious. John and I kept discussing what was needed to turn small data into big data. We identified two key elements:
1. A rich database to augment or enrich the small data
2. Deeper intelligence to further develop insights

With that, we got to work on Escape Velocity Inc., where we would combine the small customer data of B2B clients with an enterprise-class B2B database. Then we would use AI and machine learning as the amplifier of the combined data sources. We have been busy building and enhancing our 19M-record B2B customer database while building algorithms to solve common small data challenges, like the high percentage of generic email addresses. This has given us the ability to take nominal information, frequently only a generic email address, and expand and enrich it to drive targeting, demand, and revenue creation.
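To make the enrichment idea concrete, here is a minimal sketch of matching a generic email address against a company database. Everything here is illustrative: `COMPANY_DB`, `enrich_contact`, and the sample records are hypothetical stand-ins, not Escape Velocity's actual data or algorithms.

```python
# Hypothetical stand-in for a large B2B company database,
# keyed by email domain. A real system would query millions of records.
COMPANY_DB = {
    "acme.com": {"company": "Acme Corp", "industry": "Manufacturing", "employees": 500},
}

# Common generic mailbox names that carry no personal identity.
GENERIC_LOCALS = {"info", "sales", "support", "contact", "admin"}

def enrich_contact(email: str) -> dict:
    """Expand a bare (often generic) email into a targeting profile."""
    local, _, domain = email.partition("@")
    firmographics = COMPANY_DB.get(domain.lower(), {})
    return {
        "email": email,
        "is_generic": local.lower() in GENERIC_LOCALS,
        **firmographics,  # company, industry, size, etc., when matched
    }

profile = enrich_contact("info@acme.com")
```

Even when the sender is anonymous (`info@`), the domain alone can recover firmographic detail, which is the kind of amplification a small data approach relies on.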

I keep coming back to the idiom: start small, but think big. I think companies will have far more success if they attack the challenges of the voluminous world of data we live in with small steps and steady progress. Are we approaching the challenges of big data correctly?