The Consumer Guide: Choosing the Right Interaction Analytics Vendor for Your Big Data Needs

Lately I’ve been contemplating the purchase of a new laptop for my family. I have one for work, of course, but my family needs one that the kids can take over to play games, use for homework, and that generally serves as an all-purpose connection to the world for everyone. It doesn’t need every whiz-bang feature out there, but I want it to be reliable, easy to use, and to meet some general requirements that we laid out as a family. Since I’m not one who enjoys a big sales pitch in the store, I tend to do my research at home, and I often turn to that popular magazine that rates items on the handy scale with the red and black circles (you know the one I’m talking about, right?). It’s unbiased and homes in on the key features you need to know about.

A few days after I used this publication to help me make a decision, I was working with a client who was interested in using customer interaction analysis output as an input to their big data efforts. As we’ve discussed, these interactions are the missing piece of the puzzle, so I was pleased the client was going this route to fully understand their customer base. It got me thinking: what criteria would be needed to earn the highest rating, the full red circle, for interaction analytics in a big data world? You already know that your company requires a solution provider that can meet certain technology requirements to capture, synthesize, and analyze customer interaction data. But in a world where it’s easy to be overwhelmed by the amount of information available, especially when it comes to big data, the real benefit of analyzing interaction data is the ability to predict things (like churn, sales, or CSAT) in the form of probabilities. Below I’ve identified three key things to look for to ensure success.


Scalable

When talking to vendors, this word gets thrown around a lot. Most vendors work from data samples because transcription takes a lot of horsepower. The trouble with this approach on its own is that companies develop a hypothesis in advance and then test it against the samples. People carry an unconscious bias into both the selection and the interpretation of sample data; using all of the interaction data removes it. “N=all” allows patterns that would not normally appear to be found at a granular level, with no bias. It’s important to find a technology solution built on a framework and architecture that can process 100% of interactions, quickly, with a minimal footprint. Simply put, samples of interactions aren’t good enough when you’re trying to build predictive models or study intricate probabilities. For example, when studying churn, a customer’s propensity to leave is often based on a journey through several trigger points that occur over a series of interactions: first they reach out about a technical issue, then they call back to say the issue is still happening and describe the resolution paths they’ve tried, and then they call a third time to report it’s still not resolved, just after receiving their bill following the end of a promotion. Given the proper information, an unbiased model can determine that customers who hit several of these events are statistically far more likely to churn than others. Unfortunately, if you were only collecting a sample, you may not have captured how many times each event happened, or the correlations between them, degrading the overall accuracy and actionability of the model.
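To make the sampling point concrete, here is a minimal Python sketch with made-up journeys and a hypothetical three-event trigger sequence; it illustrates the principle only and does not reflect any vendor’s model. With every interaction available, a simple rule flags the at-risk customers; once interactions are randomly sampled, trigger events go missing and the same rule under-counts them.

```python
import random

# Hypothetical interaction journeys: one list of event tags per customer
# (made-up data for illustration only).
journeys = {
    "cust_a": ["tech_issue", "issue_repeat", "unresolved_after_promo"],
    "cust_b": ["tech_issue", "billing_question"],
    "cust_c": ["tech_issue", "issue_repeat", "unresolved_after_promo"],
}

# An assumed churn "journey": all three events, in order.
TRIGGER_SEQUENCE = ["tech_issue", "issue_repeat", "unresolved_after_promo"]

def is_churn_risk(events):
    """True if the trigger events appear, in order, within the journey."""
    it = iter(events)
    # "x in iterator" consumes the iterator, so this checks an ordered
    # subsequence rather than mere membership.
    return all(trigger in it for trigger in TRIGGER_SEQUENCE)

# N=all: every interaction of every customer is analyzed.
full_risks = {c for c, ev in journeys.items() if is_churn_risk(ev)}

# A ~50% sample of interactions: some trigger events are dropped,
# so journeys no longer match the full sequence.
random.seed(1)
sampled = {c: [e for e in ev if random.random() < 0.5]
           for c, ev in journeys.items()}
sampled_risks = {c for c, ev in sampled.items() if is_churn_risk(ev)}

print(sorted(full_risks))     # customers flagged with all data
print(sorted(sampled_risks))  # flagged after sampling: some risks vanish
```

The design point is that the pattern lives across interactions, not within one, so dropping any single interaction from a journey can erase the whole signal.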

You also need to be able to process results quickly, sometimes on an hourly basis. The ultimate goal of any predictive model that you feed with all your interactions is to generate information you can act on, and the timeliness of those actions is often paramount to overall success. Returning to the churn example: if the model identifies customers who are not only likely to churn but likely to churn the same day, then you need to know who they are so you can mitigate the risk proactively, or reach out right after their last interaction to address their problem before they leave. The only way to achieve this is with a solution whose framework and architecture process 100% of interactions, ensuring all relevant events are captured quickly and with a minimal footprint.


Flexible

It is important to find a solution that lets your company fully explore everything contained in your audio interactions without the concern that you will need to go through the time-consuming process of re-indexing and re-processing audio you’ve already ingested. Once interactions are processed, everything should be available in machine-readable form, for as long as you want, for the sole purpose of analytics.

Another key factor to consider is how quickly you can get started. Solutions should be deployable on-site or in the cloud within weeks, not months. They should integrate quickly with other systems and metadata, guiding users through business case construction and cause-and-effect connections.


Open

There are three groups at the heart of big data: the data creators, the data analysts, and the data capitalists. An open format and standard for information flowing to and from your interaction analytics platform is crucial to supporting all three roles at your company. No vendor should hold your company’s information and data hostage; there is too much at stake. Many analytics companies use security as a reason to charge additional fees or keep you from your data, making it costly to use with other software applications. It’s essential that you work with a vendor who will treat your data as well as you do. Look for one who uses an open encryption methodology, can bulk-decrypt and extract the data so that it stays secure throughout the extraction and transfer process, and will keep it in accordance with PCI standards without using those standards as an excuse for why the data can’t be used in other applications. Integrating with other systems is essential, and so is the ability to export the data into usable formats for unlimited purposes. It’s key to find a system designed to make it easy to get information in and out, so you’ll be able to use it in big data environments.
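As a small illustration of what “usable formats” can mean in practice, the sketch below writes the same processed-interaction records out as both JSON and CSV using only the Python standard library. The record fields are hypothetical and stand in for whatever machine-readable output a platform exposes; nothing here reflects a specific vendor’s schema.

```python
import csv
import io
import json

# Hypothetical processed-interaction records (illustrative fields only).
records = [
    {"call_id": "1001", "customer": "cust_a", "topic": "tech_issue",
     "churn_score": 0.82},
    {"call_id": "1002", "customer": "cust_b", "topic": "billing",
     "churn_score": 0.12},
]

# JSON for downstream big-data pipelines and APIs...
json_blob = json.dumps(records, indent=2)

# ...and CSV for spreadsheet or BI tools.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=records[0].keys())
writer.writeheader()
writer.writerows(records)
csv_blob = buf.getvalue()
```

The point is less the two formats themselves than the absence of friction: if the platform exposes its output as plain machine-readable records, this kind of export is a few lines, not a professional-services engagement.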

When searching for a big data analytics solution provider, you may not be able to find a consumer-friendly report to help you make your decision, but there are significant advantages to choosing one vendor over another. When considering a vendor or partner for customer interaction analytics, remember that a technology that is scalable, flexible, and open should be ranked the highest, so you set your company up for big data success and fully leverage your customer interaction data asset.

{Photo credit: Wikimedia}

Categories: Choosing The Best