Spend Analysis II: Why Data Analysis Is Avoided

Today’s post is from Eric Strovink of BIQ.

If I have learned one thing during my career as a software developer and software company executive, it’s this: contrary to what I believed when I was a know-it-all 20-something, there are a lot of clever people in the world. And clever people make smart decisions (for example, reading this blog, which thousands do every day).

One of those decisions is the decision NOT to perform low-probability ad hoc data analysis. It’s a sensible decision. Sometimes it’s based on empirical study and hard-won experience, and sometimes it’s a gut feel; but either way, the decision has a strong rational basis. It’s just not worthwhile.

A picture is helpful:


[Chart: expected value of an ad hoc analysis, plotted against days required and probability of success]

The above shows the expected value of an ad hoc analysis of a $100K savings opportunity. On the X axis is the number of days required to prepare and analyze the data; on the Y axis, the probability that the analysis will be fruitful. The expected value is simply the probability of success times the $100K savings, minus the analyst-days consumed times the daily opportunity cost. I chose a $700 opportunity cost per analyst-day, but choose your own number; it doesn’t really matter.

Note that the graph is mostly “underwater”; that is, the expected value of the analysis is largely negative. Unless the probability of success is quite high, or the time taken to perform the analysis is quite low, it’s simply not a good plan to undertake it.
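
To make the “underwater” region concrete, here is a minimal sketch of the arithmetic behind the chart, assuming the simple model above (probability times savings, minus days times daily cost) and the post’s constants:

```python
# Expected value of chasing a $100K opportunity, under the simple model:
#   EV = p_success * savings - days * daily_cost
SAVINGS = 100_000   # size of the opportunity ($)
DAILY_COST = 700    # opportunity cost per analyst-day ($)

def expected_value(p_success: float, days: float) -> float:
    """Expected value of spending `days` on a hunch with success probability `p_success`."""
    return p_success * SAVINGS - days * DAILY_COST

# Scan the (days, probability) grid: most of it is "underwater" (negative EV).
for days in (5, 20, 60, 120):
    for p in (0.01, 0.05, 0.10, 0.25):
        print(f"{days:>3} days at p={p:.2f}: EV = ${expected_value(p, days):>10,.2f}")
```

Even a one-in-four hunch is a money-loser under this model once the analysis stretches past a few months.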

We are all faced, from time to time, with the decision whether or not to explore a hunch. But an analyst has only about 220 working days in a year, and sending a key analyst off on a wild goose chase could be a serious setback. It’s a risky decision, so we don’t make it, and our hunches remain hunches forever.

But what if it weren’t risky at all?

Nothing can be done about the “probability that the analysis will be fruitful”; that’s fixed. But plenty can be done about the “number of days required to prepare and analyze the data.” Suppose a dataset could be built in 5 minutes and analyzed in under an hour. That turns the expected value of speculative analysis sharply positive; suddenly it is a very good idea indeed to perform ad hoc analysis of all kinds.
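
Under the same assumed model, the break-even probability scales linearly with the days invested, which is why collapsing the timeline from weeks to an hour changes the decision entirely:

```python
# Break-even success probability under the same model: EV > 0 when
#   p_success > days * DAILY_COST / SAVINGS
SAVINGS, DAILY_COST = 100_000, 700

def breakeven_probability(days: float) -> float:
    return days * DAILY_COST / SAVINGS

print(breakeven_probability(20))     # 0.14: a 20-day study must pay off 14% of the time
print(breakeven_probability(1 / 8))  # 0.000875: an hour's work (1/8 day) pays off at ~0.1%
```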

And that’s good news, because there is a ton of useful data floating around the average company that nobody ever looks at. Top of the list? Invoice-level detail, from which all kinds of interesting conclusions can be drawn. Try this experiment: acquire invoice-level (price-by-quantity, or PxQ) data from a supplier with whom you have a contract. Dump it into an analysis dataset and chart price point by SKU over time. Chances are you’ll find what most companies find: something very wrong, such as prices all over the map for the same SKU (ironically, this sometimes happens even with an e-procurement system that’s supposed to prevent it). If you have a contract, only one of those prices is correct; the rest are not, and they represent money on the table that you can recover trivially.
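
If the invoice extract lands in a flat file, the check itself is only a few lines. Here is a hypothetical sketch in Python with pandas; the file name and the columns (sku, invoice_date, unit_price) are assumptions, and your extract will differ:

```python
import pandas as pd

# Hypothetical invoice-level (PxQ) extract with columns: sku, invoice_date, unit_price.
# (invoice_date drives the price-over-time chart; the summary below just measures spread.)
invoices = pd.read_csv("supplier_invoices.csv", parse_dates=["invoice_date"])

# For each SKU: how many distinct prices were we charged, and how far apart are they?
by_sku = invoices.groupby("sku")["unit_price"].agg(
    price_points="nunique", low="min", high="max"
)
by_sku["spread_pct"] = (by_sku["high"] - by_sku["low"]) / by_sku["low"] * 100

# Any SKU billed at more than one price is a candidate for recovery:
# under a contract, only one of those prices is correct.
suspects = by_sku[by_sku["price_points"] > 1].sort_values("spread_pct", ascending=False)
print(suspects.head(20))
```

Charting unit_price against invoice_date for the worst offenders gives you the price-by-SKU-over-time picture described above.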

Of course, please don’t spend weeks or months on the exercise, because then it won’t pay off. Instead, find yourself a data analysis tool with which you can move quickly, or a services provider who can use such a tool efficiently for you (and thus make a contingency-based analysis worthwhile for both of you). Bottom line: if you can’t build a dataset by yourself, in minutes, you’ll end up underwater, just like the graph.

Next installment: Crosstabs Aren’t “Analysis”

Previous installment: It’s the Analysis, Stupid
