Category Archives: Spend Analysis

… And Stop Paying for More Analysis Software Than You Need!

Yesterday SI featured a guest post from Brian Seipel who advised you to Stop Paying for More Analysis than You Need because, simply put, a lot of analytics effort and reports yield little to no return. As Brian expertly noted,

  • Sometimes 80% classification at the transactional level is enough
    Especially if you can get to 95%+ by supplier or dollar volume. Once it’s easy to see there’s no opportunity in a category (because it’s all under contract, the spend is low, the spend versus market price on what is classified leaves little savings opportunity, etc.), why classify more?
  • If you are producing a heap of reports on a regular basis, many won’t get looked at
    Especially if the reports aren’t telling you anything new. Plus, as previously explained on SI, a great Spend Analysis Report is useful three times: first, to detect an opportunity; midway through a project to capture an identified savings opportunity, to make sure the plan is coming together; and at the end of the project, to gauge the realized savings. That’s it.
  • A 20% savings isn’t always meaningful
    You’re probably overspending on office supplies by 20%, but it may not matter. If office supplies spend is only 10K (because you’ve moved to a mostly paperless office thanks to investments in second monitors, tablets, and secure electronic distribution, and janitorial supplies sits under MRO), and capturing that 2K would take a week of effort running a simple event and negotiating a master contract when your fully burdened cost is 2K a day, is it worth it? Heck no. You don’t spend 10K to save 2K. It’s all about the ROI (see the quick sanity check after this list).
  • Speculative analysis on categories you have no control over may not pay out
    Just because you can show Marketing they are overspending by 50% doesn’t mean they are going to do anything about it. If they sincerely believe you can’t measure talent or impact on a spend basis, and you have no say over the final award, you will be fighting an uphill battle. While the argument should be made to the C-Suite, it has to come from the CPO, so until she is ready to take that battle on, paying for an analysis whose outcome you can predict from intuition and market analysis is not going to give you the ROI you need today.
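To make the office supplies arithmetic concrete, here is a quick ROI sanity check. A minimal Python sketch; the function name and structure are ours, the numbers are from the example above.

```python
def sourcing_roi(spend: float, savings_rate: float,
                 effort_days: float, burdened_daily_cost: float) -> float:
    """Savings captured divided by the cost of the effort to capture them."""
    return (spend * savings_rate) / (effort_days * burdened_daily_cost)

# The office supplies example: 10K spend, 20% savings, a week (5 days)
# of effort at a fully burdened cost of 2K per day.
print(sourcing_roi(10_000, 0.20, 5, 2_000))  # 0.2 -> spend 10K to save 2K
```

Anything much below 1.0 is a non-starter; in practice you want a healthy multiple before committing analyst time.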

When you put all this together, you get some rules about what you should be looking for, and spending on, when you select an analytics system (especially if you are not a do-it-yourselfer, even though there are systems today that are ridiculously easy to use compared to the reporting systems that first rolled out two decades ago).

  • Don’t overpay for auto-class
    While no one wants to manually classify transactions (even though a crack analyst can classify a Fortune 500 spend by hand to 95%+ in 2 to 3 days with a powerful multi-level rules-based system offering regular expression pattern matching, augmented intelligence, and drag-and-drop reclassification; a sketch of the rules-based approach follows this list), consider how easy it is to manually classify straggler transactions once you’ve achieved 90%+ auto-classification to a best-in-class industry categorization (with 95%+ reliability). Don’t overpay for auto-class. In fact, don’t pay extra at all: there are a dozen systems with this feature that can get you there. Only pay extra for a system that makes it easy to accomplish mappings and re-mappings and maintain them in a consistent and non-conflicting manner.
  • It doesn’t matter how many reports there are out of the box
    Because, once you get through the first set of projects that fix the spend issues identified, they will all be useless anyway. What matters is how many templates there are for customizing your own. It’s all about being able to define the top X from a subset of categories, geographies, suppliers, departments, users, etc. that are likely to contain your best opportunities, not just the top X by spend or transaction volume. It’s about the Shneiderman diagrams and bubble charts on the dimensions that matter, over the relevant subset of data. It should be easy to define any type of report you may need to run regularly on whatever filtered subset of data is relevant to you at the time.
  • Totals, CheckSums, and Data Validations Should be Easy
    … and auto-run on every data import (a minimal sketch of such import checks also follows this list). You want to be able to focus your mapping and verification efforts where the spend, and potential opportunity, is large enough to be worth your time, know that the totals add up (to what is expected), and know that the data wasn’t corrupted on export or import. The system should verify that the data is within the appropriate time window, that at least the key dimensions (supplier [id], GL code, etc.) are within expected sets and ranges, and that source system identifiers are present.
  • Built In Category Intelligence is only valuable if you need it
    … don’t pay for community spend intelligence, integrated market feeds, or best-practice templates for categories you don’t source (regularly) or that don’t constitute a significant savings opportunity, especially if those fees are ongoing as part of a subscription. Unless it’s intelligence you will use every month, pay for it as a one-off from a market intelligence vendor that offers that service.
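On the auto-class point: here is a minimal sketch of a multi-level, rules-based classifier with regular expression pattern matching. The patterns, categories, and function are hypothetical illustrations, not any particular vendor’s implementation.

```python
import re

# Ordered rules: first match wins, so more specific rules sit above
# generic ones -- this is what keeps mappings non-conflicting.
RULES = [
    (re.compile(r"toner|ink cartridge", re.I), "Office Supplies / Imaging"),
    (re.compile(r"staples|office depot", re.I), "Office Supplies"),
    (re.compile(r"fedex|\bups\b|\bdhl\b", re.I), "Logistics / Parcel"),
]

def classify(supplier: str, description: str) -> str:
    """Classify one transaction from the supplier name and line description."""
    text = f"{supplier} {description}"
    for pattern, category in RULES:
        if pattern.search(text):
            return category
    return "UNCLASSIFIED"  # stragglers left for quick manual mapping

print(classify("Staples Inc", "HP 26A toner"))  # Office Supplies / Imaging
```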
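And on the validation point: a minimal sketch of checks that could auto-run on every import. The column names, control-total tolerance, and error messages are all hypothetical.

```python
import pandas as pd

def validate_import(df: pd.DataFrame, expected_total: float,
                    start: str, end: str, valid_gl_codes: set) -> list:
    """Return a list of problems found in a freshly imported batch."""
    errors = []
    if abs(df["amount"].sum() - expected_total) > 0.01:        # control total
        errors.append("total does not match source-system control total")
    if not pd.to_datetime(df["invoice_date"]).between(start, end).all():
        errors.append("rows fall outside the expected time window")
    if not df["gl_code"].isin(valid_gl_codes).all():           # expected sets
        errors.append("unknown GL codes present")
    if df["source_system"].isna().any():                       # provenance
        errors.append("missing source system identifiers")
    return errors
```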

The reality is that second generation spend analysis systems are now a commodity, and you can get a great enterprise platform subscription, starting in the low to mid five figures annually, that does more than most organizations need. (And personal consultant licenses to great products for much, much less.) Don’t overpay for the software; save the money for the analyst who can use it to find you savings.

Stop Paying for More Analysis than You Need


Today we welcome another guest post from Brian Seipel, a Procurement Consultant at Source One Management Services focused on helping corporations understand their spend profile and develop actionable strategies for cost reduction and supplier relationship management. Brian has a lot of real-world project experience in supply chain distribution and brings some unique insight to the topic.

I wrapped up a large spend analysis initiative recently. The project spanned a dozen international operating companies with over two dozen stakeholders pitching in. By the end, we analyzed roughly one million transactions from dozens of disparate systems. It was a lot of work to be sure, but it also provided an unparalleled view into over $1 billion in spend.

Despite the heavy lift, this analysis was critical. It served as the foundation for identifying strategic sourcing projects slated to save this organization millions. The benefit far outweighed the cost.

This is not always the case.

We live in an age where analytics reign supreme. Some organizations staff whole teams to churn out an uncountable (and maybe uncontrollable) number of spreadsheets, reports, and dashboards filled to the brim with data. Other organizations hire third parties like yours truly or implement state-of-the-art analytics packages to crunch these numbers. Either way, end users are left with more data points than they’d ever care to actually use in their decision-making processes.

I feel like I’ve slammed a lot of hyperbole into a few short paragraphs. Let’s dial it back with a simple statement and follow-up question: even in this data-forward world, organizations need to ensure they’re not wasting valuable resources on analyses that don’t warrant the effort. So how do we tell which efforts are worth the time?

Let’s break that down into a few more specific questions.

What direct impact are we trying to make?

This sounds like a throw-away question, but it isn’t. Think of the last ten reports you personally handed off to your boss or your boss’ boss. If I were a betting man, I’d say you could take at least one of them out of your weekly stack without the end user even noticing. Why? Because the people consuming these reports are inundated by data. They don’t have time to sift through reports generated for the sake of bureaucracy.

If you can look at a report and not know what specific challenge it helps solve, odds are good the answer is “none.” Sync up with the end user and confirm it provides the value you think it does.

How much of an impact can we expect?

A spend analysis has a clear enough direct impact on a defined challenge – we need to understand where money is going, to which suppliers, at what point in time, in order to identify projects to reduce cost. That said, some spend may not warrant the attention.

This may sound a bit like a “chicken vs. egg” issue, since we often can’t estimate value before we dig into the numbers. That said, we should have a general figure in mind before investing the time. Saving 20% on office supplies is great when your Staples bill is six figures. Drop that to a spare few thousand every year and the value just isn’t there.

How much buy-in can we expect?

Are relevant stakeholders likely to pursue the projects your analysis shines light on? If not, do you have the leverage, authority, or sheer charm and charisma needed to turn them? I’ve seen plenty of projects die on the vine because of hesitation or outright hostility on the part of key stakeholders. Investing in analytics for projects destined to fail before they start is a sucker’s game.

There’s a decades-gone-by phrase that old timers in the IT industry will recognize: “Nobody ever got fired for buying IBM.” The elements of fear, uncertainty, and doubt that made it effective back then are still relevant today. Think of the last time your office’s internet connection dropped off, even for a few minutes. Were you thinking about the cost savings your new provider offers? Cost savings may be good, but IT knows reliable uptime is better, and uptime is what makes or breaks them.

How deep does our dive need to be?

It pays to get down into the weeds when creating a spec list or generating an in-depth market basket. Once you’ve established the value of a project, it makes sense to invest in it by pulling the devil out of the details. Ending on a detailed note doesn’t mean we need to start the same way, though.

I pick on office supplies a lot when giving an example here. Let’s go back to that six-figure Staples spend from earlier. How many pens, pencils, dry erase markers, reams of paper, and other supplies make up that figure? We’re looking at potentially thousands of line items. Remember the goal of our spend analysis: identify projects that can lead to cost savings. Do we really care about each individual line item right now? Will knowing how many black ballpoints versus blue felt tips make project identification easier? No. In fact, spending too much time on this granular detail now will only waste time and carry an opportunity cost of its own.


I understand the knee-jerk reaction to traverse that DIKW (Data-Information-Knowledge-Wisdom) pyramid, I really do. It often is the right call. At the same time, there’s something to be said for taking a step back and looking at the bigger picture.

Every action we take needs to have purpose. Don’t waste time on a report today just because you ran it yesterday. Understand how your analysis fits into your organization’s goals and, if you find it doesn’t, cut ties so you can focus on more impactful endeavours.

Thanks, Brian!

Spend Rappin’ … The Sequel

Now don’t you give me all that JIVE about code I used before you’s alive
Cause this ain’t nineteen ninety five — ain’t even two thousand and five
Now I’m the guy named Lamoureux and Spend is the one thing that I know
So every year, in summer time, I’ll celebrate it with a rhyme!

Gonna save it, gonna shave it, gonna make it good,
Gonna take it all down through your neighbourhood.
Gonna wring it, gonna sling it till it’s understood.
My rap’s about to happen, like the knee you was slappin’,
Or the toe you been tappin’ on a hunk of wood.
‘Bout a two fisted dude, with a friendly attitude
and a sack full of savings for the people on the block.

He’s an old grey beard, maybe looks kind of weird,
and if you ever seen him he could give you quite a shock.

Now people let me tell ya about this guy
the dude who’s still slicing spend through July
Now his wit is out, his gloves on the ground,
best you stay to watch him cut it down.
When this dude gets to work on your spend block,
you will be glued to just one spot,
as the master works at a solid pace,

get a taste of the waste thrown in your face.

this old spend slayer will lay down a heavy layer
of his spend mapping rhythm to a cross-mapped beat
he don’t need no database, just a chunk of spend to trace
and a family of vendors that will roll up neat

I was in a quiet mood, which was good for a brood
as not a sound did abound as he ploughed through the mound

and you will utter a gasp as he slices through the past
lays your mav’rick spend bare faster than a white hare
while you’re up in the attic dealing with the static
that your current spending tool is programmed to give
he’s got an all new app that don’t give a cr@p
about where your data’s from or what form it is in

It’s quick, it’s sharp, and always on the mark
Delivering success on his chinny, chin, chin
it does away with “cubes”, and OLAP attitudes
and treats the spend as a set to be mapped on whim
He’s cool for a fool throwin’ out every rule
every hour of the day when the hot sun shines
Though the beard was-a cleared, I still have never cheered
like I did on that day when he discovered cloud nine

You know I’m right, your spend’s a fright
you need a guide to help you lay it out right
So if you ask him nice, once or twice
he might just show you the hand of sleight
How he syncs disparate data in real time
Whether ERP, Flat File, or API
Without AI or one hundred thousand rules
At a speed that’s so fast, no time to drool

When he gets down to work, this fine old gent
Whips up live reports that are heaven sent
Built on cross-linked filters that stay in sync
As he cross-drills down through multiple data sinks

There are just no words that are fit to describe
How this expert makes your data come alive
The tricks he employs are out of the realm
Of what you will get unless he’s at the helm
You’ve just never seen spend insight like this
When you map your data with his clever twists
Your old Ford engine is now a Mercedes AMG F1
The power at your fingers is second to none

This old spend dude never left the keys
up late till all’s where it should be
But if he were posting here today
he’d say Truthful Spending and to all a good day!

Long time readers know that Sourcing Innovation used to have a Spend Rappin’ holiday tradition until Opera Solutions acquired BIQ. But even that couldn’t stop the old spend dude, who, after some time off and some contemplative thought, got back to the keys and came up with a whole new approach for do-it-yourself spend analysis that is really so easy your grade schooler could do it (and, sadly, probably do it better than you).

If you haven’t checked it out yet, check out Spendata. As per the doctor‘s deep dive over on Spend Matters (Part 1 and Part 2, registration required), it really is a leap forward in D.I.Y. Spend Analysis. Easy creation and propagation of views using a new concept called filter coins, no more static reports (as every view is a report that can be exported at any time), and no more traditional time-consuming ETL — map, cleanse, enrich on the fly, in real time, in any sequence you like, across any data sets you like, and cross-join and sync ’em all using whatever scheme makes sense to you. Nonsense you say? All we can say is this isn’t grandpa’s spend reporting tool.

It’s Christmas in July. Hence our new Spend Rappin’ tradition begins!

Contract Compliance Trust But Verify: Part III Monitoring Demand

Today’s post is from Eric Strovink, the spend slayer of Spendata (real savings, real simple). Eric was previously CEO of BIQ; before that, he led the implementation of Zeborg’s ExpenseMap, which was acquired by Emptoris and became its spend analysis solution.

When you join transaction data to contract data in order to validate contract price compliance, it is possible to discover lots of interesting information. Some of it can be quite surprising.

For example, you might notice that off-contract items make up a surprisingly large proportion of the spend. This may be trending up over time, so it is worth doing a time-series analysis. You might also notice a pattern of overcharges on particular items, which could be an easily-corrected disconnect on the vendor’s side regarding contract terms.
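As a sketch of that time-series analysis: a few lines of pandas, assuming the transactions have already been joined to contract prices (as in Part II), with hypothetical file and column names.

```python
import pandas as pd

# Hypothetical columns: invoice_date, amount, and contract_price
# (NaN when the item isn't on the contract), from the Part II join.
tx = pd.read_csv("joined_transactions.csv", parse_dates=["invoice_date"])
tx["on_contract"] = tx["contract_price"].notna()

monthly = (
    tx.assign(month=tx["invoice_date"].dt.to_period("M"))
      .groupby(["month", "on_contract"])["amount"].sum()
      .unstack(fill_value=0)
)
monthly["off_contract_share"] = monthly[False] / (monthly[False] + monthly[True])
print(monthly)  # a rising off_contract_share is the trend to watch
```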

In Excel, these analyses require new pivot tables and, concomitantly, more maintenance effort on refresh. But in a spend analysis system, the model can be augmented with additional pivot-table-equivalents in seconds, with just a few mouse clicks. And, refresh is not an issue, because the spend analysis system updates everything automatically upon loading new transactions. So, much more interesting analyses become real possibilities — including monitoring demand.

The Who

Suppose that we have from the vendor not only the item pricing, but also an idea of who within the organization is doing the purchasing. This enables us not only to identify off-contract spending, but also to find the source of the leakage within the organization, so that corrective action can be taken internally.

There are a number of ways that “Who bought the items” can find its way into PxQ data. Sometimes it is present as a matter of course; sometimes it requires effort.

  • If the item is a catalog buy or punch-out, invoice items likely already contain the cost center.
  • If a PO number was provided to the vendor, invoice items should contain the PO. The PO can be easily translated to cost center (well, “easily” if the PO data can be linked in, as it can be with a spend analysis system).
  • If there’s a useful delivery address on the invoice, that can be mapped to a cost center using the spend analysis system’s mapping tools (of course, you need access to the mapping tools, and they need to be simple to use).
  • Your contract with the vendor could require a cost center to be provided on the invoice as a prerequisite for payment. No cost center, no payment.
  • Corporate purchasing cards are by definition associated with a cost center, so these can be mapped to cost center using the spend analysis system’s mapping tools.
  • Consultants put project codes on invoices; lawyers put matter numbers. These can be mapped to cost centers as well. Any invoice without a project code or matter number shouldn’t be paid.
  • Some spend already has a fixed cost center, for example with copiers. Each copier is assigned a cost center, which shows up on the invoice.

In a nutshell, if you want to have a cost center attached to each row of an invoice, it is very doable, and very worthwhile.
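For illustration, here is what attaching cost centers might look like programmatically. A minimal pandas sketch with hypothetical lookup tables and column names, standing in for the mapping tools described above.

```python
import pandas as pd

def attach_cost_centers(invoices: pd.DataFrame,
                        po_to_cc: pd.DataFrame,
                        card_to_cc: pd.DataFrame) -> pd.DataFrame:
    """Attach a cost center to each invoice row: try the PO lookup first,
    then fall back to the purchasing-card lookup. Assumes each PO and each
    card maps to exactly one cost center."""
    out = invoices.merge(po_to_cc, on="po_number", how="left")
    out = out.merge(card_to_cc, on="card_number", how="left",
                    suffixes=("", "_card"))
    out["cost_center"] = out["cost_center"].fillna(out["cost_center_card"])
    return out.drop(columns="cost_center_card")
```

The same pattern extends to delivery addresses, project codes, and matter numbers: each is just another lookup table applied in priority order.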

Let’s revisit the dashboard from Part II.

  • We can see a breakdown of overcharge buys by cost center (blue). A similar breakdown of off-contract items helps identify who is buying off-contract. There may be very good reasons for this, of course; and those reasons need to be understood, so that we can either get those items onto the contract, or channel the buying to similar items that are on contract.
  • We can see a time-series analysis of item buys by class, with an associated chart (red). Over time, fewer items are being bought at the contract price, which is not a good trend.
  • We can see all the buys, showing both contract and overcharged prices (green). This is all we need to show to the vendor — just dump it to Excel, email the spreadsheet, done.

[Dashboard screenshot]
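Those three views reduce to simple rollups. Here is a rough pandas equivalent, with hypothetical file and column names; the real dashboard was of course built in a spend analysis tool, not a script.

```python
import pandas as pd

# Continuing from the Part II join; hypothetical columns:
# cost_center, month, quantity, invoice_price, contract_price.
buys = pd.read_csv("joined_transactions_with_cost_centers.csv")

buys["overcharged"] = buys["invoice_price"] > buys["contract_price"]
buys["overcharge_amt"] = ((buys["invoice_price"] - buys["contract_price"])
                          .clip(lower=0) * buys["quantity"])

# Blue: overcharge totals by cost center. Red: monthly buy counts by class.
# Green: the full detail, dumped to a spreadsheet to send to the vendor.
by_cc = buys[buys["overcharged"]].groupby("cost_center")["overcharge_amt"].sum()
by_month = buys.groupby(["month", "overcharged"]).size().unstack(fill_value=0)
buys.to_excel("for_the_vendor.xlsx", index=False)  # requires openpyxl
```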

The basic pattern of this type of analysis doesn’t change with the commodity. Provided that the goods or services can be standardized with a fixed price, and that a contract price is available, the technique is always the same, and the analysis is always worthwhile, if only to prove that the contract is in place and actually working.

Thanks, Eric!

Contract Compliance Trust But Verify Part II: Monitoring the Vendor

Today’s post is from Eric Strovink, the spend slayer of Spendata (real savings, real simple). Eric was previously CEO of BIQ; before that, he led the implementation of Zeborg’s ExpenseMap, which was acquired by Emptoris and became its spend analysis solution.

If you have a contract with a vendor, you should be paying the contract price. But until you check, you don’t really know — and what you find out may surprise you.

In Part I of this series we discussed the two pieces of data required — transactions from the vendor, and contract prices for the items under contract. The next step is to join those two datasets together, in this case by Part Number.

Here is what that might look like if we do it in Excel:

This was done by the following steps (a programmatic equivalent is sketched after the list):

  • Sorting the contract prices by Part Number so VLOOKUP will work
  • Building a helper column K which is the difference between invoice price and VLOOKUP’d contract price (hidden)
  • Building a VLOOKUP to compare contract price to invoice price (shown)
  • Building a Pivot Table to roll up column L
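For comparison, here is the same join and rollup outside Excel. A minimal pandas sketch with hypothetical file and column names standing in for the two datasets from Part I; the merge plays the role of the VLOOKUP.

```python
import pandas as pd

tx = pd.read_csv("vendor_transactions.csv")    # part_number, invoice_price, quantity
contract = pd.read_csv("contract_prices.csv")  # part_number, contract_price

# Left join on part number: the VLOOKUP equivalent (no pre-sorting needed).
joined = tx.merge(contract, on="part_number", how="left")

# Classify each line; NaN contract prices mark items not on the contract.
joined["status"] = "at/below contract price"
joined.loc[joined["contract_price"].isna(), "status"] = "not on contract"
joined.loc[joined["invoice_price"] > joined["contract_price"], "status"] = "overcharged"

# The pivot-table equivalent: roll up the line counts by status.
print(joined["status"].value_counts())
```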

Lots more could be done. For example, we could:

  • Add a computation of the amount of overcharge.
  • Add year-month to the pivot table, giving us an idea as to the distribution of the overcharges. Have they all occurred recently, or just in the relatively distant past?
  • Produce a table of only the overcharged items, in order to send it to the vendor with a request for compensation.
  • Identify “who” is buying the excluded items (more on this in Part III).

However, as the model becomes more complex, it becomes more difficult to maintain. What happens next month, when a new tranche of transactions is available? Who updates the model? Each of the formulas and pivot tables needs to be updated carefully — a process that’s irritating and time-consuming at best, as well as highly error-prone.

Make it Easy, not Hard

A spend analysis tool can make this a lot easier. Load the two datasets and link them by Part Number. Then build a price difference column, set up a range, and you’re done. This requires no advanced Excel knowledge and produces a model that updates automatically when new data are added. This dashboard was put together using Spendata, but there are certainly other options.

[Dashboard screenshot]

And now, adding next month’s data to the analysis is anticlimactic — literally a couple of clicks, and everything auto-updates. So, even if you could “do it in Excel”, you won’t, because it’s just too painful. But if you use the right tools, you can produce compliance models quickly, and you can maintain them with near-zero effort.

We’ll conclude our discussion in Part III: Monitoring Demand. Thanks, Eric!