Procurement Trend #06. Data-Based Predictive Analytics

Three annoying anti-trends remain. We’re so close to the end that we can almost taste the bittersweet victory, but the sour taste lingers as we must continue to provide those fashionably challenged futurists with counter-examples to the trends of their forefathers, trends that no one who hadn’t locked themselves in a windowless padded room would try to pass off as trends of tomorrow. We want to shame them for their stupidity, but we will leave their hard-earned humiliation to LOLCat, who is obviously quite fed up at having to spend yet another life listening to their ludicrousness, yet still finds the time to point out that LOLCats have been sustainable at least since the first corrugated cardboard box was created.

So why do these pit-dwelling prophets from Hawalius keep pushing us trends from the rubbish pile? Besides the fact that some of them obviously spent the best part of the last decade in a rancid cave, it’s probably because they look around, see the laggard organizations still struggling with last decade’s technology, and assume they can still sell last decade’s leftover snake oil in today’s marketplace. Thus, if most organizations are struggling with proper historical spend analysis, data-based predictive analytics is obviously a future trend, because

  • good decisions require good data

    and so few organizations have good data

  • inventory forecasting is getting harder and harder

    as sudden changes in unemployment rates, interest rates, and brand sentiment, as well as unexpected supply chain delays or competitive product introductions, can all have a large impact on demand

  • market prices are getting even harder to predict in volatile markets

    and profitability often depends on slim margins

Which would be great reasoning if leading organizations hadn’t figured this out over a decade ago and started doing something about it shortly thereafter!

So what does this mean to you?

Clean and Enrich Your (Master) Data

Dirty data dictates dastardly decisions. And those never end well. But don’t go crazy trying to clean it: 100% clean data is a pipe dream, and, as with most situations, the 80/20 rule, or, to be more precise, the 90/10 rule, applies. Clean and enrich as required to confidently map 90%+ of spend, including 90%+ of the spend for the top 90%+ of suppliers and the top 90%+ of products, and stop when the effort exceeds the return. With a good mapping tool, the mapping can be done by hand in a week, even for the largest Fortune 500, and depending on how good the data is, the analyst might even get to 95% or 98%. Then, identify any glaring weaknesses (such as missing supplier financial or risk data, market data, or cost breakdowns relative to a Bill of Materials) that are important from a spend analysis or should-cost modelling viewpoint, and get that data.
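Since the advice above boils down to a mapping exercise with a coverage target, here is a minimal sketch in Python of rule-based spend mapping with a 90% coverage check. The keyword rules and spend records are hypothetical placeholders; a real mapping tool works from supplier, GL code, and description against a proper taxonomy.

```python
# Minimal sketch of rule-based spend mapping with a coverage check.
# The rules and records below are hypothetical; a real tool maps on
# supplier, GL code, description, and more, against a full taxonomy.

from dataclasses import dataclass

@dataclass
class SpendRecord:
    supplier: str
    description: str
    amount: float

# Hypothetical mapping rules: keyword -> category.
RULES = {
    "laptop": "IT Hardware",
    "freight": "Logistics",
    "steel": "Raw Materials",
}

def map_category(record: SpendRecord) -> str | None:
    text = record.description.lower()
    for keyword, category in RULES.items():
        if keyword in text:
            return category
    return None  # unmapped: a candidate for manual review

def coverage(records: list[SpendRecord]) -> float:
    """Fraction of total spend that maps to a category."""
    total = sum(r.amount for r in records)
    mapped = sum(r.amount for r in records if map_category(r))
    return mapped / total if total else 0.0

records = [
    SpendRecord("Acme", "Laptop docking stations", 120_000),
    SpendRecord("FastShip", "Inbound freight, Q3", 80_000),
    SpendRecord("MetalCo", "Cold-rolled steel coil", 300_000),
    SpendRecord("Misc Inc", "Sundry services", 40_000),
]

pct = coverage(records)
print(f"Mapped {pct:.0%} of spend")
if pct < 0.90:
    # Review unmapped records largest-first and stop when effort > return.
    print("Below the 90% target: review unmapped records by spend.")
```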

Put Protocols and Safeguards in Place to Keep your (Master) Data that Way

It’s going to take time, money, and manpower to map, clean, and enrich the data, and that time, money, and manpower will be wasted if protocols aren’t put in place to ensure that not just anyone can update master data, or at least not without review and verification. Put workflows and approvals in place to minimize the chances of bad data getting into the system, or of good data getting out of whack over time.
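As one illustration of such a safeguard, here is a minimal sketch in Python of a request-and-approve gate on master data updates. The field names and the single-approver, separation-of-duties rule are assumptions for illustration, not the workflow of any particular MDM product.

```python
# Minimal sketch of a review gate for master data updates.
# Field names and the single-approver rule are illustrative assumptions;
# real MDM systems offer configurable multi-step workflows and audit logs.

from dataclasses import dataclass

@dataclass
class ChangeRequest:
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    requested_by: str
    approved_by: str | None = None

master_data = {"SUP-001": {"name": "MetalCo", "duns": "123456789"}}
pending: list[ChangeRequest] = []

def request_change(req: ChangeRequest) -> None:
    # No one writes directly to master data; every change is queued.
    pending.append(req)

def approve(req: ChangeRequest, approver: str) -> None:
    # Separation of duties: the requester cannot approve their own change.
    if approver == req.requested_by:
        raise PermissionError("Requester cannot approve their own change")
    req.approved_by = approver
    master_data[req.record_id][req.field_name] = req.new_value

req = ChangeRequest("SUP-001", "duns", "123456789", "987654321", "analyst_a")
request_change(req)
approve(req, "steward_b")  # only now does the master record actually change
print(master_data["SUP-001"])
```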

Automatically Augment Your (Master) Data with Market Data

Good historical data is good, but current market data is better. With both past and current data, you not only know current conditions; with market data that is updated regularly, you can also compute trends.
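For instance, here is a minimal sketch in Python of augmenting a category record with a regularly updated market index and computing a simple trend as a least-squares slope. The index values are made up; in practice the series would come from a commodity index, exchange rate, or pricing feed.

```python
# Minimal sketch: augment a category with a (made-up) market index series
# and compute a simple trend as the slope of a least-squares fit.
# Real market data would come from a commodity index or pricing feed.

from statistics import mean

# Hypothetical monthly index values for a raw-materials category.
market_index = [102.0, 104.5, 103.8, 106.2, 108.9, 110.4]

def trend_slope(series: list[float]) -> float:
    """Least-squares slope: average index change per period."""
    n = len(series)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(series)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, series))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

category = {"name": "Raw Materials", "historical_spend": 3_600_000}
category["market_index_latest"] = market_index[-1]
category["market_trend_per_month"] = trend_slope(market_index)
print(category)  # the trend is roughly +1.65 index points per month
```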

Use All the Data to Predict Trends and Make Sourcing Decisions

Use the computed trends, together with current market movements, to predict likely future conditions. Based on this data, you can judge whether or not it is a good time to source a category and lock in long-term pricing.
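Continuing the hypothetical index from the previous sketch, a simple decision rule might extrapolate the trend a few periods out and flag the category for long-term lock-in when prices appear to be rising. The 2% threshold and 6-month horizon are illustrative assumptions; a real decision would also weigh volatility, risk, and supplier terms.

```python
# Minimal sketch of a lock-in signal from the trend computed above.
# The 2% threshold and 6-month horizon are illustrative assumptions only.

def lock_in_signal(latest: float, slope: float,
                   horizon: int = 6, threshold: float = 0.02) -> str:
    projected = latest + slope * horizon  # naive linear extrapolation
    change = (projected - latest) / latest
    if change > threshold:
        return "Prices trending up: consider sourcing now and locking in long-term pricing."
    if change < -threshold:
        return "Prices trending down: consider short-term awards and re-sourcing later."
    return "Market roughly flat: decide on other factors (risk, capacity, terms)."

# Using the hypothetical index from the previous sketch:
print(lock_in_signal(latest=110.4, slope=1.65))
```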