Category Archives: Spend Analysis

Reporting is Not Analysis — And Neither Are Spreadsheets, Databases, OLAP Solutions, or “Business Intelligence” Solutions

… and one of the best explanations the doctor has ever read on this topic (which he has been writing about for over two decades) was just published over on the Spendata blog on Closing the Analysis Gap. Written by the original old grey beard himself (who arguably built the first standalone spend analysis application back in 2000 and then redefined what spend analysis was not once, but twice, in two subsequent start-ups that each built an entirely new analytics application with a completely different, more in-depth approach), it’s one of the first articles to explain why every general purpose solution that you’re currently using to try to do analysis doesn’t actually do true analysis, and why you need a purpose-built analysis solution if you really want to find results and, in our world, do some Spend Rappin’.

We’re not going to repeat the linked article in its entirety, so we’ll pause for you to go read it …


… we said, go to the linked article and read it … we’ll wait …


READ IT! Then come back. Here’s the linked article again …


Thank you for reading it. Now we’ll continue.

As summarized by the article, we have the following issues:

Tool | Issue | Resolution | Loss of Function
Spreadsheet | Data limit; lack of controls/auditability | Database | No dependency maintenance; no hope of building responsive models
Database | Performance on transactional data (even with expert optimization) | OLAP Database | Data changes are offline only & tedious; what-if analysis is non-viable
OLAP Database | Interfaces, like SQL, are inadequate | BI Application | Schema freezes to support existing dashboards; database read only
BI Application | Read-only data and limited interface functionality | Spreadsheets | Loss of friendly user interfaces and data controls/auditability

In other words, the cycle of development from stone-age spreadsheets to modern BI tools, which was supposed to take us from simple calculation capability to true mathematical analysis in the space age using the full breadth of mathematical techniques at our disposal (both built-in and through linkages to external libraries), has instead taken us back to the beginning to begin the cycle anew, while trying to devour itself like an Ouroboros.

[Image: Ouroboros. Source: Wikipedia]


Why did this happen? The usual reasons. Partly because some of the developers couldn’t see a resolution to the issues when they were first developing these solutions, or at least a resolution that could be implemented in a reasonable timeframe; partly (and sometimes mostly) because vendors were trying to rush a solution to market (to take your money); and partly (and sometimes largely) because the marketers keep hammering the message that what they have is the only solution you need until all the analysts, authors, and columnists repeat the same message to the point they believe it. (Even though the users keep pounding their heads against the keyboard when given a complex analysis assignment they just can’t do … without handing it off to the development team to write custom code, or cutting corners, or making assumptions, or whatever.) [This could be an entire rant on its own about how the rush to MVP and marketing mania sometimes causes more ruin than salvation, but considering volumes still have to be written on the dangers of dunce AI, we’ll have to let this one go.]

The good news is that we now have a solution you can use to do real analysis, and this is much more important than you think. The reality is that if you can’t get to the root cause of why a number is what it is, it’s not analysis. It’s just a report. And we don’t care if you can drill down to the raw transactions that the analysis was derived from; that’s not the root cause, that’s just supporting data.

For example, “profit went down because warranty costs increased 5%” is not helpful. Why did warranty costs go up? Just being able to trace down to the transactions where you see 60% of that increase is associated with products produced by Substitional Supplier is not enough (and in most modern analysis/BI tools, that’s all you can do). Why? Because that’s not analysis.

Warranty costs increasing 5% is the inevitable result of something that happened. But what happened? If all you have is payables data, you need to dive into the warranty claim records to see what happened. That means you need to pull in the claim records, then pull out the products and original customer order numbers and look for any commonalities or trends in that data. Maybe after pulling all this data in you see that, of the 20 products you offer (where each would account for 5% of the claims if all things were equal), there are 2 products that account for 50% of the claims. Now you have a root cause of the warranty spend increase, but not yet a root cause of what happened, or how to do anything about it.

To figure that out, you need to pull in the customer order records and the original purchase order records and link the product sent to the customer with a particular purchase order. When you do this, and find out that 80% of those claims relate to products purchased on the last six monthly purchase orders, you know the products that are the problem. You also know that something happened six months or so ago that caused those products to be more defective.
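A drill-down like this is, at its core, a join-and-summarize loop. The sketch below shows the idea in pandas, using tiny inline stand-ins for the claims and purchase order exports (all column names and thresholds here are hypothetical, chosen purely for illustration):

```python
import pandas as pd

# Toy stand-ins for exports from the claims and procurement systems
claims = pd.DataFrame({
    "claim_id": [1, 2, 3, 4, 5, 6],
    "product":  ["SwitchBoxA", "SwitchBoxA", "SwitchBoxB", "SwitchBoxB", "Sensor", "Cable"],
    "order_no": [101, 102, 103, 104, 105, 106],
})
orders = pd.DataFrame({
    "order_no": [101, 102, 103, 104, 105, 106],
    "product":  ["SwitchBoxA", "SwitchBoxA", "SwitchBoxB", "SwitchBoxB", "Sensor", "Cable"],
    "po_month": ["2024-05", "2024-06", "2024-05", "2024-06", "2023-11", "2023-12"],
})

# Step 1: which products are claimed far above their "fair share"?
share = claims["product"].value_counts(normalize=True)
suspects = share[share >= 0.25].index.tolist()

# Step 2: link the suspect claims back to the purchase orders that sourced them
linked = claims[claims["product"].isin(suspects)].merge(orders, on=["order_no", "product"])

# Step 3: are those claims concentrated in recent purchase orders?
by_po_month = linked.groupby("po_month")["claim_id"].count()
print(suspects, dict(by_po_month))
```

On real data, a spike in the last few monthly POs is exactly the signal described above: something changed around that time, and that is where you drill next.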

Let’s say both of these products are web-enabled remote switch control boxes that your manufacturing clients use to remotely turn on-and-off various parts of their power and control systems (for lighting, security monitoring, etc.) and you also have access, in the PLM system, to the design, bill of materials (BOM), and tier 2 suppliers and know a change takes 30 to 60 days to take effect. So you query the tier 1 BOM from 6, 7, 8, and 9 months ago and discover that 8 months ago the tier 2 supplier for the logic board changed (and nothing else) for both of these units. Now you are close to the root cause and know it is associated with the switch in component and/or supplier.

At this point you’re not sure if the logic board is defective, the tier 1 supplier is not integrating it properly, or the specs aren’t up to snuff, but as you have figured out this was the only change, you know you are close to the root cause. Now you can dive in deep to figure out the exact issue, and work with the engineering team to see if it can be addressed.

You continue your analysis of all available data across the systems and, after diving in, you see that, despite the contract requiring that any change be signed off by the local engineering team only after they do their own independent analysis to verify the product meets the specs and all quality requirements, engineering signed off on the specs but never signed off on the quality tests, which were never submitted. You can then place a hold on all future orders for the product, get on the phone with the tier 1 supplier and insist they expedite 10 units of the logic board by air freight for quality testing, and get on the phone with engineering to make sure they independently test the logic boards as soon as they arrive.

Then, when the product, which is designed for 12V power inputs, arrives and the engineers do their stress tests and discover that the logic board, which was spec’ed to handle voltage spikes to 15V (because some clients’ power backup systems run off of chained automotive batteries), actually burns out at 14V, you have your root cause. You can then force the tier 1 supplier to go back to the original board from the original supplier, or find a new board from the current supplier that meets the spec … and solve the problem. [And while it’s true you can’t assume that all of the failure increases were the logic board without examining each and every unit of each and every claim, in this situation, statistically, most of the increase in failures will be due to this change, as it was the only change.]

In other words, true analysis means being able to drill into raw data, bring in any and all associated data, do analysis and summaries of that data, drill in, bring in related data, and repeat until you find something you can tie to a real world event that led to something that had a material impact on the metrics that are relevant to your business. Anything less is NOT analysis.

Source-to-Pay+ is Extensive (P14) … So Do Not Stop at Spend Analysis!

As we discussed in Part 8, once you have your eProcurement baseline, that’s just the beginning. The very beginning. Even though not all modules are equal, and not all modules will return equal results, you, and your organization, will need all of Source-to-Pay eventually. However, since you can’t implement it all at once, you take it one module at a time.

After eProcurement, if you aren’t 100% sure where the most value will come from, you go on to spend analysis and use it to help you identify the best opportunities, and those opportunities may indicate the next best module to implement (for your organization at the current time). After that, you may have a clear answer, or, you may not. Sometimes the analysis indicates almost equal opportunity between sourcing and contracting, between contracting and supplier management, or between sourcing and supplier management. (Or, you might not have the manpower or expertise to do the analysis you need to get the right answer.)

So if it’s unclear as to which solution to choose next, it’s back to arguments and logic in an effort to determine which of the three aforementioned solutions to choose.

How about Strategic Sourcing? It’s the one technology proclaimed to identify the most savings and deliver the best results. The truth is that while it almost always identifies the most savings, it doesn’t actually deliver those savings, or even guarantee them. The savings are guaranteed by the contract, delivered by the (new) supplier, and captured by the eProcurement system.

So how about a Contract Management System? In order to guarantee the cost reductions, you need the contract. Or it’s just a quote that’s given today, denied tomorrow. But, as we indicated in a prior post, you don’t need a contract management system for a contract. You need (e-)paper, (e-)ink, and a pair of (e-)signatures. The right contract management system makes it easy to author, negotiate, manage, track, and enforce a contract. But the contract itself is up to people, and if they don’t agree, there’s no contract, and, thus, no need for a contract management system.

This just leaves Supplier Management. But is this where we start? If we think about the value sourcing identifies, it’s generated by the supplier. So it’s critical that the supplier perform. If we have a good supplier management solution, it will track the supplier’s progress, alert us to issues, and assist us in managing the relationship if intervention is required. It will enable performance management, which is critical because if the supplier doesn’t perform and/or doesn’t adhere to the contract, then it doesn’t matter how great the sourcing event was or how good the contract inked was.

And so, because suppliers, and relationships with them, are key, when all things are about equal, or when it’s hard to identify where to go next, we go with supplier management.

Start the dive in Part 15.

Source-to-Pay+ Is Extensive (P12) … Here are Some Spend Analysis Vendors

As promised in our last installment (Part 11), where we outlined the baseline capabilities that are needed for a solution to qualify as a modern spend analysis solution, here are some vendors that you can consider that meet most of the requirements. Note that, where spend analysis is concerned, some companies actually use two solutions: one as part of their platform ecosystem, which serves as the centralized master data store for spend analysis with the central “cube” and pre-configured reports for management, and a standalone best-of-breed powerhouse tool for free-form what-if analytics, where power analysts can slice, dice, and reconfigure the data as they wish without impacting anyone else in the organization. Thus, it’s okay to choose two different, complementary, solutions if that meets your needs better than one (or keeps your users happy and using a system vs. trying to bypass it).

Note that, as with the list of e-Procurement Vendors we provided in Part 7, this list is in no-way complete (as no analyst is aware of every company), is only valid as of the date of posting (as companies sometimes go out of business and acquisitions happen all of the time in our space), and does not include generic business intelligence or analytic applications offered by providers without any specialization in spend analysis. (Nor does it include vendors that are only focussed on one vertical. While a couple of vendors below have a primary vertical, our understanding is that they can support other, related, verticals and have some generic elements of spend analysis.)

Also note that, and we want to be very clear here, not all vendors are equal, and we’d venture to say that NONE of the following are equal. The companies listed below are of all sizes (very small to very large, relative to vendor sizes in our space), cover the baselines differently (in terms of percentage of features offered, how deep those features are, how integrated analytics is [or can be] with other modules, and how customized the solution can be for an organization or the vertical in which it plays), offer different additional features, have different types of service offerings (backed up by different expertise), focus on different company sizes, and focus on different ecosystems (such as plugging into other platforms/ecosystems, serving as the Source-to-Pay master data repository or controller, offering a plug-and-play model for a larger, or different, ecosystem) etc.

Do your research, and reach out to an expert for help if you need it in compiling a starting short list of relevant, comparable, vendors for your organization and its specific needs. For many of these vendors, good starting points can again be found in the Sourcing Innovation archives, Spend Matters Pro, and Gartner Cool Vendor write-ups if any of these sources has a write-up on the vendor.

And, again, note that if we say Source-to-Pay, it means that the vendor offers modules that also cover baseline capability across most of Sourcing, Supplier/Vendor Management, Contract Management, e-Procurement, and/or e-Invoicing/Accounts Payable/Invoice-to-Pay. As to whether or not SI would consider those modules as meeting the majority of baseline functional requirements, you will have to (wait for and) check the starting vendor lists in those areas.

Company | Employees (LinkedIn) | HQ (State, Country) | Other Offerings/Notes
Alteryx | 3065 | California, USA |
Analytics8 SpendView | 213 | Illinois, USA |
Anaplan | 2395 | California, USA | Finance, Sales & Marketing, HR, Supply Chain
AnyData Solutions | 10 | United Kingdom | Supplier Management, Contract Management
Corcentric Platform | 587 | New Jersey, USA | Source-to-Pay, Payments
Coupa | 3666 | California, USA | Source-to-Pay, Treasury, Contingent Workforce, Supply Chain Planning
Delicious Data | 27 | Germany |
ElectrifAI | 132 | New Jersey, USA | Contract Analytics, Supply Chain Analytics
Everstream | 165 | California, USA | Supplier Risk
GEP | 4640 | New Jersey, USA | Source-to-Pay, Supply Chain
Ignite Procurement | 60 | Sweden | Contract Management, Supplier Management
intelflow | 7 | Germany | Procurement Intelligence
Ivalua | 848 | California, USA | Source-to-Pay, Direct Materials
Jaggaer ONE | 1263 | North Carolina, USA | Source-to-Pay, Inventory Management, Supplier Network, Direct Materials
kiresult | 5 | Germany |
LevaData | 58 | California, USA | Direct Materials
McKinsey (Orpheus) | 15 | Germany |
Metric Insights | 18 | California, USA |
Neqo | 8 | France |
Onventis (Spendency) | 139 | Germany | Source-to-Pay, Direct Materials
Oversight Systems | 145 | Georgia, USA | Payment Monitoring
PRGX | 1421 | Georgia, USA | M&A Analytics, Retail Analytics, Audits
RightSpend | 23 | New York, USA | Marketing Procurement
Pro(a)Act | 5 | Sweden |
Robobai | 50 | Australia | Sustainability, Risk, Treasury
Rosslyn | 65 | United Kingdom |
SAP Ariba | 84 | California, USA | Source-to-Pay, Supplier Network
Scalue | 6 | Germany |
ScanMarket (Unit4) | 60 | Denmark | Sourcing, Supplier Management, Contract Management
Sourcing Insights | 9 | Indiana, USA | Contract Management, Risk Management
SpendBoss | 3 | North Carolina, USA |
Sievo | 303 | Finland | Project Management
Silvon | 18 | Illinois, USA |
Simfoni | 260 | California, USA | eSourcing, Tail Spend Management
Spendata | ?? | Massachusetts, USA |
SpendKey | ?? | United Kingdom |
SpendHQ | 76 | Georgia, USA | Procurement Performance Management
SpendWorx | 7 | California, USA | Market Intelligence
Suplari | 10 | Washington, USA |
Tamr | 169 | Massachusetts, USA | Healthcare
The Smart Cube | 1004 | United Kingdom | Services
Xelix | | United Kingdom | Payment Monitoring

Onwards to Part 13!

Source-to-Pay+ Is Extensive (P11) … What Do You Need For (A) Spend Analysis (Baseline), Installment 2

In our last post (Part 10), we reviewed the spend analysis process, which, in short, is:

  • Extract the relevant data
  • Load the data into the solution (mapping it to a starting taxonomy)
  • Structure for the types of analyses you need to perform
  • Analyze the data and get useful insights to
  • Act on the insights you get

From that process, we identified that the core requirements a spend analysis system needs to support are those that enable:

  • Load
  • Structure
  • Analyze

with a focus on

  • Efficiency

Let’s take these requirements one by one.

Load: The first step is to get the data in. It needs to be easy to ingest large data files and map the data to a starting taxonomy that can be manipulated for the purposes of analysis. In particular, it must handle data files in the universal formats: CSV and other classic row- or column-oriented layouts. The ingestion needs to be fast and intelligent and learn from everything the user does, so that the next time the application sees a similar record, it knows what to do with it. This allows us to identify our first two core requirements:

  • rules: the application needs to support rules that allow for deterministic based (re)mappings when certain data values (within a tolerance) are identified (and these rules need to be easily editable over time as needed)
  • hybrid AI: that can analyze the data and suggest the rules for the user to select to speed up rule definition and mapping during load
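To make the rules requirement concrete, here is a minimal sketch of deterministic mapping rules applied at load time. The rule table, field names, and categories are all hypothetical; a real tool would let users edit this table interactively, and the “hybrid AI” layer would propose new rows for it:

```python
import re

# Hypothetical mapping rules: (field, pattern, category), evaluated in order.
RULES = [
    ("supplier",    r"staples|office depot", "Indirect > Office Supplies"),
    ("description", r"freight|shipping",     "Logistics > Freight"),
    ("gl_code",     r"^6100",                "Indirect > Marketing"),
]

def map_record(record, rules=RULES):
    """Return the category of the first matching rule, else a review bucket."""
    for field, pattern, category in rules:
        if re.search(pattern, str(record.get(field, "")), re.IGNORECASE):
            return category
    return "Unmapped (review)"

txn = {"supplier": "Office Depot Inc", "description": "toner", "gl_code": "5200"}
print(map_record(txn))  # -> "Indirect > Office Supplies"
```

The key property is determinism: the same record always maps the same way, the rules are auditable, and a rule edited today automatically remaps matching records on the next load.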

Structure: The next step is to structure the data for analysis. In spend analysis, the core structure is a

  • Cube: the application must be able to build a custom cube for each type of analysis required; one size, and thus one cube, does NOT fit all; the cubes must also support derived dimensions using measures and summaries

Sometimes the cube needs to be explored, which means that the application also needs to support

  • Drill Down: to the data of interest
  • Filters: to define the relevant data subset
  • Views: that can be configured and customized using measures, drill downs, and filters for easy exploration and easy revisiting
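The cube / drill-down / filter trio above can be illustrated in a few lines of pandas (the spend records and dimension names here are invented for the example; a real spend analysis tool wraps this in a point-and-click interface):

```python
import pandas as pd

# Toy spend records with two dimensions (category, region) and one measure
spend = pd.DataFrame({
    "category": ["IT", "IT", "MRO", "MRO", "IT"],
    "supplier": ["Acme", "Acme", "Bolt", "Bolt", "Zen"],
    "region":   ["EU", "NA", "EU", "EU", "NA"],
    "amount":   [100.0, 250.0, 80.0, 120.0, 50.0],
})

# "Cube": spend summarized along two dimensions
cube = spend.pivot_table(index="category", columns="region",
                         values="amount", aggfunc="sum", fill_value=0)

# "Filter" + "drill down": restrict to one slice, then open the next dimension
it_eu = spend[(spend["category"] == "IT") & (spend["region"] == "EU")]
drill = it_eu.groupby("supplier")["amount"].sum()
print(cube, drill, sep="\n")
```

A saved “view” is essentially this: a remembered combination of filters, drill path, and measures that can be revisited without rebuilding it by hand.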

Also, while the theory is that you have one record in your ERP, AP, etc. for a supplier, product, and other real-world entity, the reality is that you have multiple (multiple [multiple]) entries, so the application has to also support

  • Familying of like entities: suppliers, products, and even locations
  • Mapping of child organizations to their parents where you can cut master contracts / agreements (such as with hotel chains)
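Familying boils down to two steps: normalize the raw entity strings so near-duplicates collapse, then map known children onto a parent. A minimal sketch, with an entirely hypothetical parent table (real tools maintain these mappings via rules plus curated enrichment data):

```python
import re

# Hypothetical child -> parent table (e.g. hotel properties to the chain)
PARENTS = {
    "IBM": "IBM", "IBM GLOBAL SERVICES": "IBM",
    "COURTYARD BY MARRIOTT": "MARRIOTT", "MARRIOTT": "MARRIOTT",
}

def normalize(name):
    """Strip punctuation and legal suffixes so near-duplicates collapse."""
    name = re.sub(r"[.,]", "", name.upper())
    name = re.sub(r"\b(INC|CORP|LLC|LTD|GMBH)\b", "", name)
    return re.sub(r"\s+", " ", name).strip()

def family(name):
    norm = normalize(name)
    return PARENTS.get(norm, norm)  # fall back to the normalized name

print(family("I.B.M. Corp"), family("Courtyard by Marriott"))
```

With familying in place, a spend rollup by `family(...)` shows the true leverage you have with a parent organization, rather than fragmenting it across dozens of near-duplicate supplier records.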

At this point, we’ve built a cube, and we’re ready for:

Analysis: where we analyze our slices of the data to get insight that we can eventually act on; this requires:

  • Measures: that can summarize the data in a meaningful way
  • Benchmarks: that can be compared against
  • Reports: which can be bookmarked views that show the right summary (and can be saved or printed)
  • Data Science Hooks: to external algorithms and libraries for forecast generation, trend analysis, etc.

And at this point, while we don’t necessarily have everything the doctor would want in a modern spend analysis system, we almost have everything that is needed to meet the baseline, with one exception, and that’s the functionality needed to enable

Efficiency which, in spend analysis, equates to the technical requirements that eliminate the need to “reinvent the wheel” every time the analysis effort needs to be repeated. The problem with traditional spend analysis systems is that any time the data changes, all of the work has to be repeated. A good system will remember and preserve everything that was done, identify just the changed and new data, and pull that in. Some systems do this okay, but if the underlying data source changes, they fall apart.

Matters get worse when there’s more than one user, which is the case in most organizations: the implementation creates a central “master” cube, and everyone has to work off of it. Usually this involves each analyst creating a copy of the master cube and working off of that copy. And then, when the master cube is updated, every analyst has to create a fresh copy and start all over.

Better systems will allow the user to pull in “just the new data” if the structure of the core cube hasn’t changed and the data can be mapped by the existing rules. But any time the base cube undergoes even a minor structural change, all of the analysts have to start again, from scratch. But this is mitigated if the system supports

  • Inheritance: which creates every user’s cube as a sub-cube of another system cube or the master cube and, when any parent cube changes, use the relationship to automatically propagate any changes without any effort required on the part of the user

There are, of course, other features and functions that can be added to increase efficiency even more, but this one capability makes a spend analysis system exponentially more efficient than any system that came before.

We should note that, as of today, only one spend analysis system supports full inheritance, but a couple support partial inheritance and are attempting to improve their offering. So keep this in mind when you are comparing solutions, as not all will be equal.

Continue to Part 12.

Source-to-Pay+ Is Extensive (P10) … What Do You Need For (A) Spend Analysis (Baseline), Installment 1

In Part 8 we briefly reviewed the major modules in Source-to-Pay in an attempt to identify which module to work on after e-Procurement, and concluded that you select Spend Analysis and start using it (even without integration) as soon as possible, because Spend Analysis not only helps your organization identify its best opportunities, but also indicates what module should come next (in terms of implementation and integration).

Then, in Part 9 we elaborated on our comment that spend analysis can help you identify the most important Source-to-Pay modules for your organization based upon the types of opportunities that are identified. We identified situations in which Supplier Management, Contract Management, Risk Management, Source-to-Pay, and even I2P are relevant to capturing opportunities. We did this to illustrate the criticality of getting going on spend analysis as soon as possible.

The next step is to identify what you need in a spend analysis solution. But before we can do that, we need to review the basic spend analysis process:

  • you need to extract the relevant data from the relevant applications
  • you need to load the data into the spend analysis solution (and map it to a starting taxonomy)
  • you need to structure the data for the various types of analyses you want to perform
  • you need to perform the analyses and get insight
  • you need to take action, which involves initiating processes, tracking progress, and getting results

Looking at this process, you need whatever functionality is required to

  • Load,
  • Structure and
  • Analyze the data

Most older platforms don’t support modern API hooks or data transfer standards, so the reality is that you will need to export the data from those platforms yourself; any “extraction” in the spend analysis platform will be limited to requesting data through the APIs, and in the formats, that the spend analysis tool supports. As a result, the “extraction” part of the process is mostly outside the scope of the spend analysis tool.

Similarly, most organizations will have, or want, to use other tools to create projects, assign actions, track progress, and so on. As a result, the “act”ion part of the process is often mostly outside the spend analysis tool with, of course, the ability to push the results out in a standard format through a supported API.

Thus, in order to define a solid spend analysis baseline, we need to define all of the functionality to

  • Load,
  • Structure and
  • Analyze the data

and, most importantly, do it in a manner that

  • supports efficiency.

In other words, the last thing you want to do is have to repeat the entire process every time data is updated or re-classified in the source system. In our next installment, Part 11, we will review the core functionality required for each of these four core requirements.