Category Archives: Spend Analysis

The Key Reason Spend Analyses Fail (that Often Goes Overlooked)


Today we welcome another guest post from Brian Seipel, a Procurement Consultant at Source One Management Services who is focused on helping corporations understand their spend profile and develop actionable strategies for cost reduction and supplier relationship management. Brian has a lot of real-world project experience in sourcing and brings some unique insight to the topic.

Organizations that develop an understanding of their spend have an edge when it comes to strategic sourcing: they better understand where money is being spent, with whom, and on what than others who enter into the process either blindly or as a knee-jerk reaction to an incumbent price hike. This is particularly important for tail spend and for those indirect spend categories that too often fly under the radar.

That edge isn’t a given, however. Building a spend analysis can serve as the foundation for strong opportunity assessments, but doing so won’t automatically lead to better sourcing projects. Organizations that spend time on spend analyses can and do still fail at strategic sourcing for one very big reason: we put too much faith in the front-end process of building the analysis and forsake the back-end, leaving a critical gap in our understanding of our spend profile.

The Front-End Spend Analysis

The first steps of a spend analysis are akin to cleaning out your basement. What’s the first thing you do? Before sorting into keep-or-toss piles can begin, even before moving and opening boxes – we need to turn on the light and survey the room. “Turning on the light” is really what the front-end of a spend analysis is. Our goal is to shine a light on the spend we have so sourcing project identification can begin. How does a spend analysis accomplish this?


  • Cleansing & Consolidation. Take all of the disparate data sources that make up our profile and create a single view of them, cleaning up supplier names and other critical fields along the way. For example, refer to the supplier “Dun & Bradstreet” by that single name, even when spend from a second data set refers to “D&B” (see the sketch after this list).
  • Classification. With all spend in one consolidated set, we can now attach meaningful classifications. The best way to do this is worthy of a discussion of its own, so let’s simply say care should be taken here. Choose a system that speaks to your organization’s process, products, and objectives.
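To make these two steps concrete, here is a minimal sketch in Python. The alias table, GL codes, and category names are invented for illustration; a real implementation would be driven by far larger rule sets and a proper taxonomy.

```python
# Minimal sketch: normalize supplier names, then classify transactions
# with simple ordered rules. All names and categories are illustrative.

SUPPLIER_ALIASES = {
    "d&b": "Dun & Bradstreet",
    "dun and bradstreet": "Dun & Bradstreet",
    "dun & bradstreet inc": "Dun & Bradstreet",
}

# Ordered rules: first match wins. (supplier, gl_code) -> category;
# None acts as a wildcard on that field.
CLASSIFICATION_RULES = [
    ("Dun & Bradstreet", None, "Market Intelligence"),
    (None, "6510", "IT Managed Services"),
]

def normalize_supplier(raw_name):
    key = raw_name.strip().lower().rstrip(".")
    return SUPPLIER_ALIASES.get(key, raw_name.strip())

def classify(supplier, gl_code):
    for rule_supplier, rule_gl, category in CLASSIFICATION_RULES:
        if rule_supplier not in (None, supplier):
            continue
        if rule_gl not in (None, gl_code):
            continue
        return category
    return "Unclassified"

txn = {"supplier": "D&B", "gl_code": "7200", "amount": 1250.00}
txn["supplier"] = normalize_supplier(txn["supplier"])
print(txn["supplier"], classify(txn["supplier"], txn["gl_code"]))
# -> Dun & Bradstreet Market Intelligence
```

The key design point: normalization and classification are both rules, so every mapping decision is visible, repeatable, and easy to correct when a new alias or supplier shows up.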

Let’s cook up an example. Say we want to look into our IT spend to see where we can cut costs. We conduct a spend analysis covering the points above and learn the following: we have four locations using four different managed IT service providers offering similar services at four different price points.

This is the type of intel that suggests a strategic sourcing initiative may be called for. Pitting these suppliers against each other in a market event will drive down costs and potentially streamline operations if we can establish a single supplier for all four locations. We can estimate these savings by building a baseline spend profile and comparing it to the average savings this strategy has historically delivered in the category.
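As a back-of-the-envelope illustration, here is that arithmetic in Python; all figures are invented for the example.

```python
# Hypothetical baseline: annual spend per location on managed IT services.
baseline = {"Location A": 120_000, "Location B": 100_000,
            "Location C": 90_000, "Location D": 90_000}

historical_savings_rate = 0.15  # assumed category average from past events

total_baseline = sum(baseline.values())  # 400,000
estimated_savings = total_baseline * historical_savings_rate
print(f"Baseline: ${total_baseline:,} -> estimated savings: ${estimated_savings:,.0f}")
# Baseline: $400,000 -> estimated savings: $60,000
```

Simple enough. So why do sourcing initiatives often fail to deliver?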

Moving Into Opportunity Assessment

Because we just committed a big mistake: we took our initial view of the spend and jumped right to goal setting without taking the time to properly scope. We went from turning on the basement light to selling boxes, en masse and unopened, directly on eBay without knowing what was inside.

As we go to market, our sourcing event fails each of our four locations for different reasons:


  • The first location is locked into a multi-year contract with a painful termination clause. Without scoping, we didn’t know what our contractual obligations looked like.
  • The second location isn’t locked into a contract, but is locked in by a lack of competition in the market. Without scoping, we never looked beyond our own buying history into the market landscape.
  • The third location is free of both of these problems, but this isn’t their first rodeo. In the past, they used the providers that locations one and two use now, but abandoned them due to severe performance issues. Without scoping, we couldn’t get a good enough view into the decision making process that led to incumbent relationships.
  • Finally, our fourth location. No issues with suppliers, contracts, or market competition. The problem here? When we dig into the spend, we realize the bulk was capex: the purchase of equipment for a new server room buildout. Now that the equipment is purchased, we won’t see this spend come back around for years to come. Without scoping, we assumed the spend was annually recurring, and now we have next to nothing.

Better Spend Analysis through Better Scoping

Once our spend analysis is complete, we’ll need to bring additional stakeholders into the fold. Bring in the employees who actually interact with these suppliers and their products and work with them to develop a sourcing history:


  • Did we accurately describe how you use this supplier with our chosen classification system?
  • What are we specifically buying from this supplier, and are these purchases made regularly or only once every few years?
  • How was this supplier selected, and who chose them? Were any competitors engaged at the same time? How did this incumbent beat them out?
  • What does this supplier do well? Where are their biggest points of failure?
  • Has this category been sourced recently? How was the event conducted, and what was the result?

Beyond this interview, ask these stakeholders to provide copies of any active MSAs, SOWs, SLAs, or any other document that can help define the relationship. Of particular note will be termination clauses. What date does the agreement end, and what are the renewal terms? What steps do we follow to terminate on that date, and by when do they need to be taken? If terminating before that date, are there any penalties?
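A minimal sketch of a contract register that captures those answers and flags approaching notice deadlines; the field names and figures are invented for illustration.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ContractRecord:
    supplier: str
    agreement_end: date        # the date the agreement ends
    notice_days: int           # notice required to terminate at end of term
    early_term_penalty: float  # penalty if terminated before agreement_end

    @property
    def notice_deadline(self) -> date:
        # Last day we can serve notice and still exit at end of term.
        return self.agreement_end - timedelta(days=self.notice_days)

def upcoming_deadlines(contracts, within_days=90, today=None):
    """Contracts whose termination-notice deadline falls within the horizon."""
    today = today or date.today()
    horizon = today + timedelta(days=within_days)
    return [c for c in contracts if today <= c.notice_deadline <= horizon]

contracts = [
    ContractRecord("Provider A", date(2026, 3, 31), 90, 25_000.0),
    ContractRecord("Provider B", date(2027, 1, 15), 60, 0.0),
]
for c in upcoming_deadlines(contracts, today=date(2025, 12, 1)):
    print(f"{c.supplier}: serve notice by {c.notice_deadline}")
```

Even a register this simple keeps termination clauses from ambushing a sourcing event mid-flight.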

From Insight to Action

Building a detailed spend analysis takes time, and the commitment of resources that could be doing other things. As such, you need to ensure you get a good ROI out of the exercise.

The best way to do that is to see beyond the front-end of what a spend analysis is (the unification, cleansing, and classification of spend data) and consider what a spend analysis helps Procurement do (identify strategic sourcing initiatives and estimate potential impact). Scoping is a critical part of this process, and properly scoping opportunities that a spend analysis shines a light on is a great way to get that ROI.

Thanks, Brian!

Why Are CFOs and CPOs Still Delusional When it Comes to Analytics?

the doctor was recently asked if an organization needs a dedicated Sourcing Spend Analytics solution if the organization already has a generic BI tool that sits on top of its ERP or data warehouse. Well, while the answer is No in theory, it’s rarely No in practice. This is because even if the generic platform you have can support (sourcing) spend analysis, chances are it hasn’t been set up for that. And it will need to be (heavily) customized.

So you either need to engage a consultancy to do a lot of customization, or buy a dedicated solution that is ready out of the box — and, preferably, if possible, buy one that is built on top of your BI platform if you have already invested in one (like Tableau or Qlik) that is best in class.

As we noted in our piece last year that asked why we still have first generation ERP/Data Warehouse BI, most arguments for generic BI have more holes than Swiss cheese. As the Spend Master noted himself ten years ago in his classic, but still under-read, piece on screwing up the screw-ups in BI:

  • central databases, like the kind favoured by most BI tools, don’t solve the analysis problem
  • business analysts should be able to construct BI datasets on their own
  • a governance and stewardship program, which is likely the reason for the generic BI platform acquisition, doesn’t actually put any meat on the table
  • cleansing is often the problem, not basic analysis & reporting
  • BI systems are difficult to use and set up, it is difficult to create ad hoc reports, and it is impossible to change the dataset organization … all the stuff that makes spend analysis, you know, valuable

Plus,

  • BI reports are pretty generic, and not fine tuned to Sourcing, Procurement, or Finance
  • BI engines work on one schema — the ERP schema … which is rarely suited to spend analysis
  • BI engines expect all of the data to come from the ERP. SA systems don’t.
  • The ability of first (and even second generation) BI engines to create arbitrary reports is considerably overstated.

Hopefully someday soon CPOs and CFOs alike will get the point that if you want to do proper Sourcing and Procurement Spend Analysis, you need a proper Sourcing and Procurement Spend Analysis Solution.

Don’t Throw Away That Old Spend Cube, Spendata Will Recover It For You!

And if you act fast, to prove they can do it, they’ll recover it for free. All you have to do is provide them 12 months of data from your old cube. More on this at the end of the post, but first …

As per our article yesterday, many organizations, often through no fault of their own, end up with a spend cube (filled with their IP) that they spent a lot of money to acquire, but which they can’t maintain — either because it was built by experts using a third party system, built by experts who did manual re-mappings with no explanations (or repeatable rules), built by a vendor that used AI “pattern matching”, or built by a vendor that ceased supporting the cube (and simply provided it to the company without any of the rules that were used to accomplish the categorization).

Such a cube is unusable, and unless maintainable rules can be recovered, it’s money down the drain. But, as per yesterday’s post, it doesn’t have to be.

  1. It’s possible to build the vast majority of spend cubes on the largest data sets in a matter of days using the classic secret sauce described in our last post.
  2. All mappings leave evidence, and that evidence can be used to reconstruct a new and maintainable rules set.

Spendata has figured out that it’s possible to reverse engineer old spend cubes by deriving new rules by inference, based on the existing mappings. This is possible because the majority of such (lost) cubes are indirect spending cubes (where most organizations find the most bang for their buck). These can often be mapped to 95% or better accuracy using just Vendor and General Ledger code, with outliers mapped (if necessary) by Item Description.

And it doesn’t matter how your original cube was mapped — keyword matching algorithms, the deep neural net du jour, or by Elves from Rivendell — because supplier, GL-code, and supplier-and-GL-code patterns can be deduced from the original mappings, and then poked at with intelligent (AI) algorithms to find and address the exceptions.
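To illustrate the inference idea, here is a sketch of the general technique (not Spendata’s actual algorithm; the field names and the 95% threshold are assumptions): promote any supplier that maps overwhelmingly to one category into a supplier rule, do the same for the remaining supplier-and-GL-code pairs, and let whatever is left fall out as exceptions.

```python
from collections import Counter, defaultdict

def infer_rules(transactions, threshold=0.95):
    """Derive maintainable rules from an already-mapped cube. Each
    transaction dict carries supplier, gl_code, and the category the
    old (black-box) cube assigned it."""
    by_supplier = defaultdict(Counter)
    for t in transactions:
        by_supplier[t["supplier"]][t["category"]] += 1

    supplier_rules, remainder = {}, []
    for supplier, counts in by_supplier.items():
        category, hits = counts.most_common(1)[0]
        if hits / sum(counts.values()) >= threshold:
            supplier_rules[supplier] = category  # one rule covers it all
        else:
            remainder.extend(t for t in transactions
                             if t["supplier"] == supplier)

    by_pair = defaultdict(Counter)
    for t in remainder:
        by_pair[(t["supplier"], t["gl_code"])][t["category"]] += 1

    pair_rules, exceptions = {}, []
    for pair, counts in by_pair.items():
        category, hits = counts.most_common(1)[0]
        if hits / sum(counts.values()) >= threshold:
            pair_rules[pair] = category
        else:
            exceptions.extend(t for t in remainder
                              if (t["supplier"], t["gl_code"]) == pair)

    return supplier_rules, pair_rules, exceptions
```

The output is exactly what the original black box never gave you: a plain, ordered rule set you can read, edit, and re-run.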

In fact, Spendata is so confident of its reverse-engineering that — for at least the first 10 volunteers who contact them (at the number here) — they’ll take your old spend cube and use Spendata (at no charge) to reverse-engineer its rules, returning a cube to you so you can see the results (as well as the reverse-engineering algorithms that were applied) and the sequenced plain-English rules that can be used (and modified) to maintain it going forward.

Note that there’s a big advantage to rules-based mapping that is not found in black-box AI solutions — you can easily see any new items at refresh time that are unmapped, and define rules to handle them. This pays off in two ways.

  1. You can see if you are spending where you are supposed to be spending against your contracts and policies.
  2. You can see how fast new suppliers, products, and human errors are entering your system. [And you can speak with the offending personnel in the latter case to prevent these errors in the future.]

And mapping this new data is not a significant effort. If you think about it, how many new suppliers with meaningful spending does your company add in one month? Is it five? Ten? Twenty? It’s not many, and you should know who they are. The same goes for products. Chances are you’ll be able to keep up with the necessary rule additions and changes in an hour a month. That’s not much effort for having a spend cube you can fully understand and manage and that helps you identify what’s new or changed month over month.

If you’re interested in doing this, the doctor is interested in the results, so let SI know what happens and we’ll publish a follow-up article.

And if you take Spendata up on the offer:

  1. take a view of the old cube with 13 consecutive months of data
  2. give Spendata the first 12 consecutive months, and get the new cube back
  3. then add the 13th month of data to the new cube to see what the reverse-engineered rules miss.

You will likely find that the new rules catch almost all of the month 13 spending, showing that the maintenance effort is minimal, and that you can update the spend cube yourself without dependence on a third party.
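Under the same illustrative assumptions as the inference sketch above, the month-13 check is a few lines:

```python
def coverage(transactions, supplier_rules, pair_rules):
    """Share of held-out (month 13) spend the recovered rules can map."""
    mapped = total = 0.0
    for t in transactions:
        total += t["amount"]
        if (t["supplier"] in supplier_rules
                or (t["supplier"], t["gl_code"]) in pair_rules):
            mapped += t["amount"]
    return mapped / total if total else 0.0

# month_13 = [...]  # the held-out month of transactions
# print(f"{coverage(month_13, supplier_rules, pair_rules):.1%} of spend mapped")
```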

Is That Old Spend Cube Money Down the Drain?

How many times has this happened? You hire some experts to help with a sourcing effort, they produce a one-off spend analysis, you run some initiatives and realize some savings, and … a year later, you’ve got an obsolete spend cube with IP you’ve paid a lot of money for, but can neither use nor extend, because either the experts didn’t share the process they used to create the cube or, even worse, they used “AI” with “intelligent transaction pattern matching” and there simply aren’t any rules to share.

Or, as often happens (due to the competitive landscape), maybe your original vendor has lost interest in spend analysis, or has left the business, or was acquired and sidelined — and your spend analysis system is either end-of-life, largely unsupported, or obsolete. What then?

Well, you have two options:

  1. Write it off, throw it away, and start all over again
  2. Recover the cube

And yes, you read that right, recover the cube!

You’re probably saying, how can that be done, especially if the original cube was mapped with AI or one-time overlay rules that were created by an expert and lost in the sands of time?

With intelligence, observation, and an application of proper inverse AI that sifts through the evidence left behind and generates real rules to start you off — rules that can then be extended in a system that supports layering in a logical fashion, allowing not only a re-creation of the original cube, but an improvement that fixes original errors and takes into account changes in the business since the cube was created.

And yes, this is possible, because mappings leave evidence, the same way a suspect at a scene leaves evidence, and that evidence can be unearthed by applying the digital equivalent of classic archaeological techniques that have been used for over a century to interpret the past. (the doctor has given presentations on this and if you are intrigued, contact him)

And it’s even easier in the case of spend analysis when you remember that you can completely map even a Fortune 100’s spend by hand in less than a week to high accuracy by using the classic secret sauce of:

  1. map the GL codes
  2. map the suppliers
  3. map the suppliers and GL codes
  4. map the exceptions
  5. map the (significant) exceptions to the exceptions

… and then run the rules in the same order.

This works because the vast majority of spend cubes are on indirect spend, and indirect spend cubes can almost always be mapped effectively this way. Even if there is no specific GL code in the data set, there should be similar patterns around the key fields that determine the GL code (product description, SKU, etc.). And what doesn’t match defines the exceptions.
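A minimal sketch of that layering (with invented rule sets; the txn_id key for exception overrides is an assumption, and the two exception layers are compressed into one for brevity): run the passes in the listed order and let each later, more specific pass overwrite the earlier, more general ones.

```python
def map_spend(transactions, gl_rules, supplier_rules, pair_rules, exceptions):
    """Apply rule layers in the 'secret sauce' order; later, more
    specific layers override earlier, more general ones."""
    for t in transactions:
        category = gl_rules.get(t["gl_code"])
        category = supplier_rules.get(t["supplier"], category)
        category = pair_rules.get((t["supplier"], t["gl_code"]), category)
        category = exceptions.get(t["txn_id"], category)
        t["category"] = category or "Unclassified"
    return transactions
```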

In other words, it’s theoretically possible to reverse engineer a cube once you understand the foundations of most spend cubes and learn how to interpret the mapping evidence left behind.

But, is anyone doing this?

… And Stop Paying for More Analysis Software Than You Need!

Yesterday SI featured a guest post from Brian Seipel who advised you to Stop Paying for More Analysis than You Need because, simply put, a lot of analytics effort and reports yield little to no return. As Brian expertly noted,

  • Sometimes 80% classification at the transactional level is enough
    Especially if you can get 95%+ coverage by supplier or dollar volume (see the sketch after this list). Once it’s easy to see there’s no opportunity in a category (either because it’s all under contract, the spend is low, the spend versus market price on what is classified leaves little savings opportunity, etc.), why classify more?
  • If you are producing a heap of reports on a regular basis, many won’t get looked at
    Especially if the reports aren’t telling you anything new. Plus, as previously explained on SI, a great Spend Analysis Report is useful three times: the first time, to detect an opportunity; midway through a project to capture an identified savings opportunity, to make sure the plan is coming together; and at the end of the project, to gauge the realized savings. That’s it.
  • A 20% savings isn’t always meaningful
    You’re probably overspending on office supplies by 20%, but it may not matter. If office supplies spend is only 10K (because you’ve moved to a mostly paperless office thanks to investments in second monitors, tablets, and secure electronic distribution, and janitorial supplies fall under MRO), and capturing that 2K would take a week of effort running a simple event and negotiating a master contract when your fully burdened cost is 2K a day, is it worth it? Heck no. You don’t spend 10K to save 2K. It’s all about the ROI.
  • Speculative analysis on categories you have no control over may not pay out
    Just because you can show Marketing they are overspending by 50% doesn’t mean they are going to do anything about it. If they firmly believe you can’t measure talent or impact on a spend basis, and you have no say over the final award, you will be fighting an uphill battle. While the argument should be made to the C-Suite, it has to come from the CPO, so until she is ready to take the battle on, spending on an analysis whose outcome you can predict from intuition and market analysis is not going to give the ROI you need today.
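To see why 80% of transactions can easily be 95%+ of dollars, measure coverage both ways; a minimal sketch (the numbers in the comment are invented):

```python
def coverage_stats(transactions):
    """Classification coverage by transaction count vs. by dollars. A
    handful of large, well-known suppliers usually dominates spend, so
    dollar coverage typically runs well ahead of transaction coverage."""
    classified = [t for t in transactions if t["category"] != "Unclassified"]
    txn_cov = len(classified) / len(transactions)
    dollar_cov = (sum(t["amount"] for t in classified)
                  / sum(t["amount"] for t in transactions))
    return txn_cov, dollar_cov

# e.g. 8,000 of 10,000 transactions classified (80%), but if those 8,000
# carry $9.5M of a $10M profile, dollar coverage is 95%.
```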

When you put all this together, you get some rules about what you should be looking for, and spending on, when you select an analytics system (especially if you are not a do-it-yourselfer, even though there are systems today that are ridiculously easy to use compared to the reporting systems that first rolled out two decades ago).

  • Don’t overpay for auto-class
    While no one wants to manually classify transactions (even though a crack analyst can classify a Fortune 500 spend by hand in 2 to 3 days to 95%+ with a powerful multi-level rules-based system with regular expression pattern matching, augmented intelligence, and drag-and-drop reclassification capability), considering how easy it is to manually classify straggler transactions once you’ve achieved 90%+ auto-classification to a best-in-class industry categorization (with 95%+ reliability), don’t overpay for auto-class. In fact, don’t pay extra at all — there are a dozen systems with this feature that can get you there. Only pay extra for a system that makes it easy to accomplish mappings and re-mappings and maintain them in a consistent and non-conflicting manner.
  • It doesn’t matter how many reports there are out of the box
    Because, once you get through the first set of projects that fix the spend issues identified, they will all be useless anyway. What matters is how many templates there are for customizing your own. It’s all about being able to define the top X from a subset of categories, geographies, suppliers, departments, users, etc. that are likely to contain your best opportunities, not just the top X by spend or transaction volume. It’s about the Shneiderman diagrams and bubble charts on the dimensions that matter, on the relevant subset of data. It should be easy to define any type of report you may need to run regularly on whatever filtered subset of data is relevant to you at the time.
  • Totals, Checksums, and Data Validations Should be Easy
    … and auto-run on every data import (see the sketch after this list). You want to be able to focus your mapping and verification efforts where the spend, and potential opportunity, is large enough to be worth your time, know that the totals add up (to what is expected), and know that the data wasn’t corrupted on export or import. The system should verify that the data is within the appropriate time window, that at least the key dimensions (supplier [id], GL code, etc.) are within expected sets and ranges, and that source system identifiers are present.
  • Built In Category Intelligence is only valuable if you need it
    … don’t pay for community spend intelligence, integrated market feeds, or best-practice templates for categories you don’t source (regularly) or that don’t constitute a significant savings opportunity, especially if those fees are ongoing as part of a subscription. Unless it’s intelligence you will use every month, pay for it as a one-off from a market intelligence vendor that offers that service.
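A sketch of the kind of load-time checks the third bullet describes (field names and the tolerance are illustrative):

```python
from datetime import date

def validate_import(transactions, expected_total, window_start, window_end,
                    valid_gl_codes, tolerance=0.01):
    """Basic checks to auto-run on every import: control total,
    date window, key-dimension membership, source system presence."""
    issues = []
    total = sum(t["amount"] for t in transactions)
    if abs(total - expected_total) > tolerance:
        issues.append(f"control total mismatch: {total} vs {expected_total}")
    for t in transactions:
        if not (window_start <= t["date"] <= window_end):
            issues.append(f"{t['txn_id']}: date {t['date']} outside window")
        if t["gl_code"] not in valid_gl_codes:
            issues.append(f"{t['txn_id']}: unknown GL code {t['gl_code']}")
        if not t.get("source_system"):
            issues.append(f"{t['txn_id']}: missing source system id")
    return issues

# issues = validate_import(txns, expected_total=1_234_567.89,
#                          window_start=date(2025, 1, 1),
#                          window_end=date(2025, 1, 31),
#                          valid_gl_codes={"6510", "7200"})
```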

The reality is that second generation spend analysis systems are now a commodity, and you can get a great enterprise platform subscription, starting in the low to mid five figures annually, that does more than most organizations need. (And personal consultant licenses to great products for much, much less.) Don’t overpay for the software; save your budget for the analyst who can use it to find you savings.