Category Archives: Spend Analysis

Why Are CFOs and CPOs Still Delusional When it Comes to Analytics?

the doctor was recently asked if an organization needs a dedicated Sourcing Spend Analytics solution if the organization already has a generic BI tool that sits on top of its ERP or data warehouse. Well, while the answer is No in theory, it’s rarely No in practice. This is because even if the generic platform you have can support (sourcing) spend analysis, chances are it hasn’t been set up for that. And it will need to be (heavily) customized.

So you either need to engage a consultancy and do a lot of customization, or buy a dedicated solution that is ready out of the box. And, preferably, if possible, buy one that is built on top of your BI platform, if you bought a best-in-class one (like Tableau or Qlik).

As we noted in our piece last year that asked why we still have first generation ERP/Data Warehouse BI, most arguments for generic BI have more holes than Swiss cheese. As the Spend Master noted himself ten years ago in his classic, but still under-read, piece on screwing up the screw-ups in BI:

  • central databases, like the kind favoured by most BI tools, don’t solve the analysis problem
  • business analysts should be able to construct BI datasets on their own
  • a governance and stewardship program, which is likely the reason for the generic BI platform acquisition, doesn’t actually put any meat on the table
  • cleansing is often the problem, not basic analysis & reporting
  • BI systems are difficult to use and set up, it is difficult to create ad hoc reports, and it is impossible to change the dataset organization … all the stuff that makes spend analysis, you know, valuable

Plus,

  • BI reports are pretty generic, and not fine tuned to Sourcing, Procurement, or Finance
  • BI engines work on one schema — the ERP schema … which is rarely suited to spend analysis
  • BI engines expect all of the data to come from the ERP; spend analysis systems don’t.
  • The ability of first (and even second generation) BI engines to create arbitrary reports is considerably overstated.

Hopefully someday soon CPOs and CFOs alike will get the point that if you want to do proper Sourcing and Procurement Spend Analysis, you need a proper Sourcing and Procurement Spend Analysis Solution.

Don’t Throw Away That Old Spend Cube, Spendata Will Recover It For You!

And if you act fast, to prove they can do it, they’ll recover it for free. All you have to do is provide them 12 months of data from your old cube. More on this at the end of the post, but first …

As per our article yesterday, many organizations, often through no fault of their own, end up with a spend cube (filled with their IP) that they spent a lot of money to acquire, but which they can’t maintain — either because it was built by experts using a third party system, built by experts who did manual re-mappings with no explanations (or repeatable rules), built by a vendor that used AI “pattern matching”, or built by a vendor that ceased supporting the cube (and simply provided it to the company without any of the rules that were used to accomplish the categorization).

Such a cube is unusable, and unless maintainable rules can be recovered, it’s money down the drain. But, as per yesterday’s post, it doesn’t have to be.

  1. It’s possible to build the vast majority of spend cubes on the largest data sets in a matter of days using the classic secret sauce described in our last post.
  2. All mappings leave evidence, and that evidence can be used to reconstruct a new and maintainable rules set.

Spendata has figured out that it’s possible to reverse engineer old spend cubes by deriving new rules by inference, based on the existing mappings. This is possible because the majority of such (lost) cubes are indirect spending cubes (where most organizations find the most bang for their buck). These can often be mapped to 95% or better accuracy using just Vendor and General Ledger code, with outliers mapped (if necessary) by Item Description.

And it doesn’t matter how your original cube was mapped (keyword matching algorithms, the deep neural net du jour, or Elves from Rivendell), because supplier, GL-code, and supplier + GL-code patterns can be deduced from the original mappings, and then poked at with intelligent (AI) algorithms to find and address the exceptions.
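
To make the inference idea concrete, here is a minimal sketch (in Python, using pandas) of how rules can be derived from an already-mapped cube. It is illustrative only: the column names, the 95% consistency threshold, and the rule format are assumptions, not Spendata’s actual algorithm or data model.

```python
import pandas as pd

# Illustrative only: column names ("vendor", "gl_code", "amount", "category")
# and the 95% consistency threshold are assumptions, not Spendata's method.
cube = pd.read_csv("old_cube_export.csv")

def infer_rules(df, keys, min_share=0.95):
    """Derive 'keys -> category' rules wherever the old cube's mappings are
    consistent for at least min_share of the spend on that key combination."""
    rules = []
    for key_vals, group in df.groupby(keys):
        if not isinstance(key_vals, tuple):
            key_vals = (key_vals,)
        spend_by_cat = group.groupby("category")["amount"].sum()
        if spend_by_cat.empty:
            continue
        if spend_by_cat.max() / spend_by_cat.sum() >= min_share:
            rules.append((dict(zip(keys, key_vals)), spend_by_cat.idxmax()))
    return rules

# Derive layers in the order a cube is typically built:
gl_rules        = infer_rules(cube, ["gl_code"])
vendor_rules    = infer_rules(cube, ["vendor"])
vendor_gl_rules = infer_rules(cube, ["vendor", "gl_code"])
# Whatever these layers cannot explain consistently becomes the exception
# list, to be reviewed by item description (or by hand).
```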

In fact, Spendata is so confident of its reverse-engineering that — for at least the first 10 volunteers who contact them (at the number here) — they’ll take your old spend cube and use Spendata (at no charge) to reverse-engineer its rules, returning a cube to you so you can see the results (as well as the reverse-engineering algorithms that were applied) and the sequenced plain-English rules that can be used (and modified) to maintain it going forward.

Note that rules-based mapping has a big advantage that black-box AI solutions lack: at refresh time you can easily see any new items that are unmapped, and define rules to handle them. This pays off in two ways.

  1. You can see if you are spending where you are supposed to be spending against your contracts and policies.
  2. You can see how fast new suppliers, products, and human errors are entering your system. [And you can speak with the offending personnel in the latter case to prevent these errors in the future.]

And mapping this new data is not a significant effort. If you think about it, how many new suppliers with meaningful spending does your company add in one month? Is it five? Ten? Twenty? It’s not many, and you should know who they are. The same goes for products. Chances are you’ll be able to keep up with the necessary rule additions and changes in an hour a month. That’s not much effort for having a spend cube you can fully understand and manage and that helps you identify what’s new or changed month over month.
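
For the curious, here is a rough sketch of what that refresh-time check amounts to. The rule format (a list of condition/category pairs, applied in order with later, more specific rules overriding earlier ones) and the field names are assumptions for illustration, not any vendor’s actual implementation.

```python
def apply_rules(txn, rules):
    """Return the category of the last matching rule (later rules are more
    specific and override earlier ones), or None if nothing matches."""
    category = None
    for conditions, mapped_to in rules:
        if all(txn.get(field) == value for field, value in conditions.items()):
            category = mapped_to
    return category

def refresh(new_transactions, rules):
    """Map a new month of data and return whatever the rules don't cover."""
    unmapped = []
    for txn in new_transactions:
        category = apply_rules(txn, rules)
        if category is None:
            unmapped.append(txn)   # new supplier, new product, or a data-entry error
        else:
            txn["category"] = category
    return unmapped                # usually a short list: review, add a few rules, re-run
```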

If you’re interested in doing this, the doctor is interested in the results, so let SI know what happens and we’ll publish a follow-up article.

And if you take Spendata up on the offer:

  1. take a view of the old cube with 13 consecutive months of data
  2. give Spendata the first 12 consecutive months, and get the new cube back
  3. then add the 13th month of data to the new cube to see what the reverse-engineered rules miss.

You will likely find that the new rules catch almost all of the month 13 spending, showing that the maintenance effort is minimal, and that you can update the spend cube yourself without dependence on a third party.
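
In code terms, the month-13 test is just a hold-out check. A quick sketch, assuming the apply_rules helper from the earlier sketch and an amount field on every transaction:

```python
def holdout_coverage(month13, rules):
    """Share of month-13 transactions and spend that the recovered rules catch."""
    mapped = [t for t in month13 if apply_rules(t, rules) is not None]
    txn_coverage   = len(mapped) / len(month13)
    spend_coverage = sum(t["amount"] for t in mapped) / sum(t["amount"] for t in month13)
    return txn_coverage, spend_coverage

# If spend coverage comes back at 95%+, the ongoing maintenance is a handful
# of new rules a month, which you can add yourself.
```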

Is That Old Spend Cube Money Down the Drain?

How many times has this happened? You hire some experts to help with a sourcing effort, they produce a one-off spend analysis, you run some initiatives and realize some savings, and … a year later, you’ve got an obsolete spend cube with IP you’ve paid a lot of money for, but can neither use nor extend, because either the experts didn’t share the process they used to create the cube or, even worse, they used “AI” with “intelligent transaction pattern matching” and there simply aren’t any rules to share.

Or, as often happens (due to the competitive landscape), maybe your original vendor has lost interest in spend analysis, or has left the business, or was acquired and sidelined — and your spend analysis system is either end-of-life, largely unsupported, or obsolete. What then?

Well, you have two options:

  1. Write it off, throw it away, and start all over again
  2. Recover the cube

And yes, you read that right, recover the cube!

You’re probably asking how that can be done, especially if the original cube was mapped with AI, or with one-time overlay rules that were created by an expert and lost in the sands of time.

With intelligence, observation, and an application of proper, inverse AI that sifts through the evidence left behind and generates real rules to start you off. Those rules can then be extended in a system that supports layering in a logical fashion, allowing not only a re-creation of the original cube, but an improvement that fixes the original errors and takes into account changes in the business since the cube was created.

And yes, this is possible, because mappings leave evidence, the same way a suspect at a scene leaves evidence, and that evidence can be unearthed by applying the digital equivalent of classic archaeological techniques that have been used for over a century to interpret the past. (the doctor has given presentations on this; if you are intrigued, contact him)

And it’s even easier in the case of spend analysis when you remember that you can completely map even a Fortune 100’s spend by hand in less than a week to high accuracy by using the classic secret sauce of:

  1. map the GL codes
  2. map the suppliers
  3. map the suppliers and GL codes
  4. map the exceptions
  5. map the (significant) exceptions to the exceptions

… and then run the rules in the same order.

This works because the vast majority of spend cubes are on indirect spend, and indirect spend cubes can almost always be mapped effectively this way. Even if there is no specific GL code in the data set, there should be similar patterns around the key fields that determine GL code (product description, SKU, etc.). And what doesn’t match defines the exceptions.
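
Here is a toy illustration of that layering, under the assumption that each later (more specific) layer simply overrides the category assigned by earlier layers. The GL codes, vendors, descriptions, and categories are made up for the example.

```python
# Toy layered rule set; everything here is invented for illustration.
LAYERS = [
    # 1. GL codes
    [({"gl_code": "6100"}, "Office Supplies")],
    # 2. suppliers
    [({"vendor": "STAPLES"}, "Office Supplies")],
    # 3. suppliers + GL codes
    [({"vendor": "STAPLES", "gl_code": "6450"}, "Janitorial Supplies")],
    # 4./5. exceptions (and exceptions to the exceptions), e.g. by description
    [({"vendor": "STAPLES", "description": "ERGONOMIC CHAIR"}, "Office Furniture")],
]

def map_transaction(txn):
    """Run the layers in order; the last matching rule wins."""
    category = "UNMAPPED"
    for layer in LAYERS:
        for conditions, mapped_to in layer:
            if all(txn.get(field) == value for field, value in conditions.items()):
                category = mapped_to
    return category

print(map_transaction({"vendor": "STAPLES", "gl_code": "6450", "description": "MOP HEADS"}))
# -> "Janitorial Supplies" (the layer 3 supplier + GL rule overrides the layer 2 supplier rule)
```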

In other words, it’s theoretically possible to reverse engineer a cube once you understand the foundations of most spend cubes and learn how to interpret the mapping evidence left behind.

But, is anyone doing this?

… And Stop Paying for More Analysis Software Than You Need!

Yesterday SI featured a guest post from Brian Seipel, who advised you to Stop Paying for More Analysis than You Need because, simply put, a lot of analytics effort and reports yield little to no return. As Brian expertly noted,

  • Sometimes 80% classification at the transactional level is enough
    Especially if you can get 95%+ by supplier or dollar volume (see the sketch after this list). Once it’s easy to see there’s no opportunity in a category (either because it’s all under contract, the spend is low, the spend versus market price on what is classified leaves little savings opportunity, etc.), why classify more?
  • If you are producing a heap of reports on a regular basis, many won’t get looked at
    Especially if the reports aren’t telling you anything new. Plus, as previously explained on SI, a great Spend Analysis Report is useful three times: first to detect an opportunity, then midway through the project pursuing that opportunity to make sure the plan is coming together, and finally at the end of the project to gauge the realized savings. That’s it.
  • A 20% savings isn’t always meaningful
    You’re probably overspending on office supplies by 20%, but it may not matter. If office supplies spend is only 10K (because you’ve moved to a mostly paperless office thanks to investments in second monitors, tablets, and secure electronic distribution, and janitorial supplies fall under MRO), and capturing that 2K would take a week of effort running a simple event and negotiating a master contract when your fully burdened cost is 2K a day, is it worth it? Heck no. You don’t spend 10K to save 2K. It’s all about the ROI.
  • Speculative analysis on categories you have no control over may not pay out
    Just because you can show Marketing they are overspending by 50% doesn’t mean they are going to do anything about it. If they solemnly believe you can’t measure talent or impact on a spend basis, and you have no say over the final award, you will be fighting an uphill battle. The argument should be made to the C-Suite, but it has to come from the CPO, so until she is ready to take that battle on, spending on an analysis whose conclusion you can predict from intuition and market analysis is not going to give you the ROI you need today.
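
To illustrate the first point above, here is a small sketch of measuring classification coverage two ways, by transaction count and by dollar volume. The column names and the UNMAPPED placeholder are assumptions about a hypothetical extract.

```python
import pandas as pd

# Hypothetical extract: an "amount" per transaction plus a "category" that is
# blank (or UNMAPPED) for anything not yet classified.
txns = pd.read_csv("transactions.csv")

classified = txns["category"].notna() & (txns["category"] != "UNMAPPED")

pct_by_count = classified.mean()
pct_by_spend = txns.loc[classified, "amount"].sum() / txns["amount"].sum()

# 80% by transaction count is often 95%+ by spend, because the unclassified
# tail is mostly many small, low-value transactions.
print(f"classified: {pct_by_count:.0%} of transactions, {pct_by_spend:.0%} of spend")
```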

Put all this together and you get some rules about what you should be looking for, and spending on, when you select an analytics system (especially if you are not a do-it-yourselfer, even though there are systems today that are ridiculously easy to use compared to the reporting systems that first rolled out two decades ago).

  • Don’t overpay for auto-class
    No one wants to manually classify transactions (even though a crack analyst can classify a Fortune 500 spend by hand in 2 to 3 days to 95%+ accuracy with a powerful multi-level rules-based system offering regular expression pattern matching, augmented intelligence, and drag-and-drop reclassification). But considering how easy it is to manually classify the straggler transactions once you’ve achieved 90%+ auto-classification to a best-in-class industry categorization (with 95%+ reliability), don’t overpay for auto-class. In fact, don’t pay extra at all; there are a dozen systems with this feature that can get you there. Only pay extra for a system that makes it easy to accomplish mappings and re-mappings and maintain them in a consistent and non-conflicting manner.
  • It doesn’t matter how many reports there are out of the box
    Because, once you get through the first set of projects that fix the spend issues identified, they will all be useless anyway. What matters is how many templates there are for customizing your own. It’s all about being able to define the top X from a subset of categories, geographies, suppliers, departments, users, etc. that are likely to contain your best opportunities, not just the top X by spend or transaction volume. It’s about the Shneiderman diagrams and bubble charts on the dimensions that matter, on the relevant subset of data. It should be easy to define any type of report you may need to run regularly on whatever filtered subset of data is relevant to you at the time.
  • Totals, CheckSums, and Data Validations Should be Easy
    … and auto-run on every data import (a sketch of such checks follows this list). You want to be able to focus your mapping and verification efforts where the spend, and potential opportunity, is large enough to be worth your time, know that the totals add up (to what is expected), and know that the data wasn’t corrupted on export or import. The system should verify that the data is within the appropriate time window, that at least the key dimensions (supplier [id], GL code, etc.) are within expected sets and ranges, and that source system identifiers are present.
  • Built In Category Intelligence is only valuable if you need it
    … don’t pay for community spend intelligence, integrated market feeds, or best-practice templates for categories you don’t source (regularly) or that don’t constitute a significant savings opportunity, especially if those fees are ongoing as part of a subscription. Unless it’s intelligence you will use every month, pay for it as a one-off from a market intelligence vendor that offers that service.
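
As a rough illustration of the totals, checksums, and validations point above, here is the kind of check set that could auto-run on every import. The column names, reference GL-code set, and time window are all assumptions about a hypothetical extract, not a particular product’s feature.

```python
import pandas as pd

# Illustrative checks only; column names, the reference GL-code set, and the
# expected time window are assumptions about a hypothetical extract.
EXPECTED_GL_CODES = {"6100", "6200", "6450"}
WINDOW = (pd.Timestamp("2018-01-01"), pd.Timestamp("2018-12-31"))

def validate_import(df, expected_total):
    """Basic checks to auto-run on every load, before any mapping or analysis."""
    issues = []
    if abs(df["amount"].sum() - expected_total) > 0.01:
        issues.append("control total does not match the source system")
    dates = pd.to_datetime(df["invoice_date"])
    if dates.min() < WINDOW[0] or dates.max() > WINDOW[1]:
        issues.append("transactions fall outside the expected time window")
    if df["supplier_id"].isna().any() or df["gl_code"].isna().any():
        issues.append("missing supplier ids or GL codes")
    if not set(df["gl_code"].dropna().astype(str)).issubset(EXPECTED_GL_CODES):
        issues.append("unexpected GL codes in this load")
    if df["source_system"].isna().any():
        issues.append("missing source system identifiers")
    return issues   # an empty list means the load passed the basic checks
```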

The reality is that second generation spend analysis systems are now a commodity, and you can get a great enterprise platform subscription, starting in the low to mid five figures annually, that does more than most organizations need. (And personal consultant licenses to great products for much, much less.) Don’t overpay for the software; save the money for the analyst who can use it to find you savings.

Stop Paying for More Analysis than You Need


Today we welcome another guest post from Brian Seipel, a Procurement Consultant at Source One Management Services, focused on helping corporations understand their spend profile and develop actionable strategies for cost reduction and supplier relationship management. Brian has a lot of real-world project experience in supply chain distribution and brings some unique insight on the topic.

I wrapped up a large spend analysis initiative recently. The project spanned a dozen international operating companies with over two dozen stakeholders pitching in. By the end, we analyzed roughly one million transactions from dozens of disparate systems. It was a lot of work to be sure, but it also provided an unparalleled view into over $1 billion in spend.

Despite the heavy lift, this analysis was critical. It served as the foundation for identifying strategic sourcing projects slated to save this organization millions. The benefit far outweighed the cost.

This is not always the case.

We live in an age where analytics reign supreme. Some organizations staff whole teams to churn out an uncountable (and maybe uncontrollable) number of spreadsheets, reports, and dashboards filled to the brim with data. Other organizations hire third parties like yours truly or implement state-of-the-art analytics packages to crunch these numbers. Either way, end users are left with more data points than they’d ever care to actually use in their decision-making processes.

I feel like I’ve slammed a lot of hyperbole into a few short paragraphs. Let’s dial it back with a simple statement and follow-up question: even in this data-forward world, we need to ensure that we’re not wasting valuable resources on analyses that don’t warrant the effort. So how do we tell which efforts are worth the time?

Let’s break that down into a few more specific questions.

What direct impact are we trying to make?

This sounds like a throw-away question, but it isn’t. Think of the last ten reports you personally handed off to your boss or your boss’ boss. If I were a betting man, I’d say you could take at least one of them out of your weekly stack without the end user even noticing. Why? Because the people consuming these reports are inundated by data. They don’t have time to sift through reports generated for the sake of bureaucracy.

If you can look at a report and not know what specific challenge it helps solve, odds are good the answer is “none.” Sync up with the end user and confirm it provides the value you think it does.

How much of an impact can we expect?

A spend analysis has a clear enough direct impact on a defined challenge – we need to understand where money is going, to which suppliers, at what point in time, in order to identify projects to reduce cost. That said, some spend may not warrant the attention.

This may sound a bit like a “chicken vs. egg” issue, since we often can’t estimate value before we dig into the numbers. That said, we should have a general figure in our minds before investing the time. Saving 20% on office supplies is great when your Staples bill is six figures. Drop that to a spare few thousand every year and the value just isn’t there.

How much buy-in can we expect?

Are relevant stakeholders likely to pursue the projects your analysis shines light on? If not, do you have the leverage, authority, or sheer charm and charisma needed to turn them? I’ve seen plenty of projects die on the vine because of hesitation or outright hostility on the part of key stakeholders. Investing in analytics for projects destined to fail before they start is a sucker’s game.

There’s a decades-gone-by phrase that old timers in the IT industry will recognize: “Nobody ever got fired for buying IBM.” The elements of fear, uncertainty, and doubt that made it effective back then are still relevant today. Think of the last time your office’s internet connection dropped off, even for a few minutes. Were you thinking about the cost savings your new provider offers? Cost savings may be good, but IT knows reliable uptime is better and is what makes or breaks them.

How deep does our dive need to be?

It pays to get down into the weeds when creating a spec list or generating an in-depth market basket. Once you’ve established the value of a project, it makes sense to invest in it by pulling the devil out of the details. Ending on a detailed note doesn’t mean we need to start the same way, though.

I pick on office supplies a lot when giving an example here. Let’s go back to that six figure Staples spend from earlier. How many pens, pencils, dry erase markers, reams of paper, and other supplies make up that figure? We’re looking at potentially thousands of line items. Remember the goal of our spend analysis – identify projects that can lead to cost savings. Do we really care about each individual line item right now? Will knowing how many black ballpoints versus blue felt tips make project identification easier? No – in fact, spending too much time on this granular detail now will only waste time and lead to potential lost opportunity costs.


I understand the knee-jerk reaction to traverse that DIKW (Data-Information-Knowledge-Wisdom) pyramid, I really do. It often is the right call. At the same time, there’s something to be said for taking a step back and looking at the bigger picture.

Every action we take needs to have purpose. Don’t waste time on a report today just because you ran it yesterday. Understand how your analysis fits into your organization’s goals and, if you find it doesn’t, cut ties so you can focus on more impactful endeavours.

Thanks, Brian!