Daily Archives: April 27, 2026

Next Generation Analytics NEEDS to Surface Root Cause Analysis …

… but relationship modelling alone is NOT going to get us there!

In another great article by Xavier Olivera of Hackett Spend Matters, he dives into how procurement analytics needs to work – moving from visibility to orientation – because current procurement analytics offerings, while reasonably good and actionable at the process level compared to where they were a few years ago, are poor at helping users orient themselves when a specific goal or problem comes into focus.

He notes that when a procurement leader decides they want to improve X, the challenge is no longer visibility. It is knowing which analytics matter for that objective and which do not. But all the analytics platforms give them today is metrics; they don’t give them direction. Even if the user knows which metric to drill into first (because it is the highest, the lowest, or an outlier), all they can see is the data that contributed to that metric: for spend, the transactions; for a supplier rating, the Net Promoter Scores; for a process, the time in each step.

The users see the immediate “what”, but not the “why”. Why were the transactions high? Is this the market price, has the quantity gone up, or is the supplier charging above the agreed-upon rate? For a rating, is it because the performance wasn’t up to spec, the delivery is consistently late, or the service/interactions are very poor? For a process, which step’s time was too long (compared to the average)? And even if you can dig down another level, you still won’t see why it was too long.

According to Xavier, in situations like these, analytics has to work differently. When a procurement leader wants to improve contract compliance, the starting point should not be a full review of all compliance metrics, benchmarks and dashboards. It should be a guided path that surfaces the specific reports, KPIs and comparisons most likely to explain the gap, given the organization’s operating context.

Which is a great start, but just surfacing those reports, KPIs, and comparisons that are statistically relevant or deviations from a norm doesn’t explain the gap; it just captures the gap. A KPI only becomes meaningful once it is examined in the right context, and it only becomes useful if there is enough data to allow the system to determine, with high statistical likelihood, the root cause and the actions that could address that root cause (and not just the symptom these systems surface today).

Xavier then tells us that the ability to orient analytics effectively depends on the data’s structure, which is partially right, but doesn’t quite capture the entire requirement. He goes on to state that procurement outcomes do not arise from isolated transactions … they emerge over time from relationships, and that analytics is most effective when the underlying data model can express these relationships explicitly. Which is closer. But the reality is that this still isn’t enough for proper root cause analysis.

It’s critical, because without relationships you can’t trace the end metric back to the source data, but just being able to identify the source data only tells you what is fundamentally wrong, not why, or what you need to do about it.

That’s where analytics needs to get to.

If your steel category transactions are high, you can trace back to the contracts and check whether the rates are per the contract, the shipping is per the carrier quote, the tonnage is as expected, and the breakdown across steel categories is appropriate for your current product lines or construction products. If any rates or tonnage don’t add up, you know the issue is in the invoices – but you don’t know why they are being paid. Were the new rates not properly encoded? Were the tolerances within acceptable limits and the automatic OK-to-Pay issued despite the mismatch? Are category managers blindly overriding the system because the supplier was threatening late shipments if payments didn’t appear on time?
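To make the trace concrete, here is a minimal sketch of the kind of invoice-versus-contract diagnosis described above. All field names, rates, and tolerance values are hypothetical illustrations, not any vendor's actual data model:

```python
from dataclasses import dataclass

@dataclass
class InvoiceLine:
    supplier: str
    rate: float      # invoiced price per tonne (hypothetical unit)
    tonnage: float

@dataclass
class ContractTerms:
    rate: float            # agreed price per tonne
    rate_tolerance: float  # e.g. 0.02 = 2% drift before a mismatch flag

def diagnose(line: InvoiceLine, terms: ContractTerms) -> str:
    """Classify why an invoice line may have been paid despite a rate gap."""
    drift = (line.rate - terms.rate) / terms.rate
    if abs(drift) <= terms.rate_tolerance:
        # Mismatch is inside tolerance, so automatic OK-to-Pay would fire.
        return "within tolerance: auto OK-to-Pay would fire"
    if drift > 0:
        # Above-contract rate: encoding error or manual override are candidates.
        return f"rate {drift:.1%} above contract: check encoding or overrides"
    return "rate below contract: verify new rates were encoded"

print(diagnose(InvoiceLine("AcmeSteel", rate=103.0, tonnage=40),
               ContractTerms(rate=100.0, rate_tolerance=0.02)))
```

The point of the sketch is that the classification step (tolerance, encoding, override) is exactly the "why" layer that today's drill-downs stop short of.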

In Xavier’s example, if contract compliance is low, why? Is it just a few suppliers, or even a single supplier, across a category? If just a few suppliers, are they unaware of the contract because of personnel changeover? Did a new industry regulation adversely affect them? Was it actually the fault of a carrier or sub-tier supplier they had no control over? This is what you need to determine to ensure that compliance actually improves and stays improved.

In other words, you need more than the data, you need models that capture what the data element used in a KPI is, who or what creates the data in the first place (and how they create that data), what the data range and typical mean/median/mode values are, what positively or negatively impacts the data, and what can be done if a shift is desired in the data.
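One way to picture such a model is a structure that carries the KPI's definition, provenance, typical range, drivers, and remediation options alongside its value. The sketch below is purely illustrative; every field name, driver, and number is a hypothetical example of the kind of knowledge the text says must be captured:

```python
from dataclasses import dataclass, field

@dataclass
class KPIModel:
    """Knowledge about a KPI, not just its value (illustrative structure)."""
    name: str
    definition: str               # what the underlying data element is
    source: str                   # who/what creates the data, and how
    typical_range: tuple          # (low, high) expected values
    drivers: dict = field(default_factory=dict)       # factor -> "+" or "-"
    remediations: list = field(default_factory=list)  # standard actions

    def explain(self, value: float) -> str:
        lo, hi = self.typical_range
        if lo <= value <= hi:
            return f"{self.name}={value}: within typical range"
        direction = "high" if value > hi else "low"
        # Drivers whose effect sign matches the direction of the deviation.
        likely = [f for f, eff in self.drivers.items()
                  if (eff == "+") == (value > hi)]
        return (f"{self.name}={value} is {direction}; "
                f"likely drivers: {likely}; options: {self.remediations}")

compliance = KPIModel(
    name="contract_compliance",
    definition="share of spend invoiced at contracted rates",
    source="AP invoice matching, per supplier per category",
    typical_range=(0.85, 1.0),
    drivers={"supplier personnel turnover": "-", "rate encoding errors": "-"},
    remediations=["re-share contract terms", "audit rate tables"],
)
print(compliance.explain(0.70))
```

With the drivers and remediations attached to the metric itself, a low reading can be explained and acted on rather than merely reported, which is the shift the paragraph above argues for.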

Without this intelligence baked into the model, even if the root data in the system can be uncovered, a user won’t understand what it means or where to start doing something about it. That’s where analytics needs to get to for analysts to be proactive instead of reactive.

And this is another area where the Busch-Lamoureux approach to Exact Purchasing will help. When you define your categories at a granular level appropriate to the quadrant of the pocket cube they occupy, you not only know what influences their cost, but also what influences their supply, what defines their quality, and what role third parties (that you may have to monitor) play. You have the foundations for doing real proactive analysis and identifying not only what “good” is, but what is most likely contributing to a “not good” metric or data point, and what standard options exist to address, and try to improve, that data point (as you need to mitigate high-risk and manage highly complex categories at a detailed level).

In other words, the future is knowledge-based models that capture not just data points and calculations, but what the data points actually mean and what factors (represented by other data points) directly influence the data points you are analyzing.