Category Archives: Spend Analysis

Source-to-Pay+ Is Extensive (P11) … What Do You Need For (A) Spend Analysis (Baseline), Installment 2

In our last post (Part 10), after reviewing the spend analysis process, which, in short, is:

  • Extract the relevant data
  • Load the data into the solution (mapping it to a starting taxonomy)
  • Structure for the types of analyses you need to perform
  • Analyze the data and get useful insights
  • Act on those insights

We identified that the core requirements a spend analysis system needs to support are those that enable:

  • Load
  • Structure
  • Analyze

with a focus on

  • Efficiency

Let’s take these requirements one by one.

Load: The first step is to get the data in. It needs to be easy to ingest large data files and map the data to a starting taxonomy that can be manipulated for the purposes of analysis, particularly data files in classic CSV, row, or column formats that are universal. The ingestion needs to be fast and intelligent, learning from everything the user does so that the next time the application sees a similar record, it knows what to do with it. This allows us to identify our first two core requirements:

  • rules: the application needs to support rules that allow for deterministic (re)mappings when certain data values (within a tolerance) are identified, and these rules need to be easily editable over time as needed
  • hybrid AI: that can analyze the data and suggest rules for the user to select, speeding up rule definition and mapping during load (a minimal mapping sketch follows)
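To make this concrete, below is a minimal sketch, in Python, of what deterministic rule-based mapping during load can look like. The rule structure, field names, and sample records are all hypothetical; a real application would layer its AI-suggested rules on top of the same mechanism.

    # Minimal sketch (hypothetical rules and field names) of deterministic,
    # rule-based mapping of loaded spend records to a starting taxonomy.
    import csv
    import re
    from io import StringIO

    # Each rule: the field to test, a pattern, and the category to assign.
    RULES = [
        {"field": "supplier",    "pattern": r"(?i)staples|office depot", "category": "Office Supplies"},
        {"field": "description", "pattern": r"(?i)laptop|notebook",      "category": "IT Hardware"},
    ]

    def classify(record, rules, default="Unmapped"):
        """Return the category of the first rule whose pattern matches the record."""
        for rule in rules:
            if re.search(rule["pattern"], record.get(rule["field"], "")):
                return rule["category"]
        return default

    # A classic CSV load: anything left "Unmapped" is a candidate for a new rule.
    raw = StringIO("supplier,description,amount\nStaples,Copy paper,120.50\nDell,Latitude laptop,899.00\n")
    for row in csv.DictReader(raw):
        print(row["supplier"], "->", classify(row, RULES))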

Structure: The next step is to structure the data for analysis. In spend analysis, the core structure is a

  • Cube: the application must be able to build a custom cube for each type of analysis required; one size, and thus one cube, does NOT fit all; the cubes must also support derived dimensions built using measures and summaries (a minimal cube sketch follows)
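As a rough illustration, assuming a pandas-style environment and hypothetical column names, a purpose-built cube with a derived measure might be sketched like this:

    # Minimal sketch (pandas, hypothetical columns) of building a custom cube
    # with a derived measure from already-classified spend records.
    import pandas as pd

    spend = pd.DataFrame({
        "category": ["IT Hardware", "IT Hardware", "Office Supplies", "Office Supplies"],
        "supplier": ["Dell", "Dell", "Staples", "Office Depot"],
        "quantity": [10, 4, 200, 150],
        "amount":   [8990.0, 3596.0, 241.0, 190.5],
    })

    # One cube per analysis: here, spend and volume by category and supplier.
    cube = spend.pivot_table(index=["category", "supplier"],
                             values=["amount", "quantity"], aggfunc="sum")

    # A derived measure built from the base measures and summaries.
    cube["avg_unit_price"] = cube["amount"] / cube["quantity"]
    print(cube)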

Sometimes the cube needs to be explored, which means that the application also needs to support

  • Drill Down: to the data of interest
  • Filters: to define the relevant data subset
  • Views: that can be configured and customized using measures, drill-downs, and filters for easy exploration and easy revisiting (a view sketch follows this list)
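One way to picture a configurable, revisitable view, again assuming a pandas-style environment and hypothetical names, is as a saved combination of a filter, a drill-down path, and the measures to summarize:

    # Minimal sketch (pandas, hypothetical names) of a saved "view": a reusable
    # combination of a filter, a drill-down path, and the measures to summarize.
    import pandas as pd

    spend = pd.DataFrame({
        "category": ["IT Hardware", "IT Hardware", "Office Supplies"],
        "supplier": ["Dell", "Lenovo", "Staples"],
        "amount":   [8990.0, 4200.0, 241.0],
    })

    def apply_view(df, view):
        """Filter the data, then drill down and summarize the chosen measures."""
        filtered = df.query(view["filter"]) if view["filter"] else df
        return filtered.groupby(view["drill_down"])[view["measures"]].sum()

    it_spend_by_supplier = {                       # a bookmarkable view definition
        "filter": "category == 'IT Hardware'",
        "drill_down": ["supplier"],
        "measures": ["amount"],
    }
    print(apply_view(spend, it_spend_by_supplier))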

Also, while the theory is that you have one record in your ERP, AP, etc. for each supplier, product, and other real-world entity, the reality is that you have multiple (multiple [multiple]) entries, so the application also has to support

  • Familying of like entities: suppliers, products, and even locations (a simple familying sketch follows this list)
  • Mapping of child organizations to their parent when you can cut master contracts / agreements (such as with hotel chains)
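A trivially simple illustration of familying, using only the Python standard library and a hypothetical similarity threshold (real tools use far more sophisticated matching plus user-confirmed rules):

    # Minimal sketch (standard library, hypothetical threshold) of familying
    # near-duplicate supplier records under a single representative name.
    from difflib import SequenceMatcher

    suppliers = ["Acme Corp", "ACME Corporation", "Acme Corp.", "Globex Inc"]

    def family(names, threshold=0.7):
        """Group names whose similarity to a family's representative meets the threshold."""
        families = []                              # list of (representative, members)
        for name in names:
            for rep, members in families:
                if SequenceMatcher(None, name.lower(), rep.lower()).ratio() >= threshold:
                    members.append(name)
                    break
            else:
                families.append((name, [name]))
        return families

    for rep, members in family(suppliers):
        print(rep, "->", members)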

At this point, we’ve built a cube, and we’re ready for:

Analysis: where we analyze our slices of the data to get insight that we can eventually act on; this requires:

  • Measures: that can summarize the data in a meaningful way
  • Benchmarks: against which the measures can be compared (a comparison sketch follows this list)
  • Reports: which can be bookmarked views that show the right summary (and can be saved or printed)
  • Data Science Hooks: to external algorithms and libraries for forecast generation, trend analysis, etc.
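For instance, a measure compared against a benchmark is what turns a summary into an actionable insight; here is a minimal sketch with hypothetical benchmark figures:

    # Minimal sketch (pandas, hypothetical benchmark figures) of comparing a
    # measure against a benchmark to surface where action is most warranted.
    import pandas as pd

    cube = pd.DataFrame({
        "category":       ["IT Hardware", "Office Supplies"],
        "avg_unit_price": [899.0, 1.21],
    })
    benchmarks = pd.DataFrame({
        "category":        ["IT Hardware", "Office Supplies"],
        "benchmark_price": [850.0, 1.30],
    })

    report = cube.merge(benchmarks, on="category")
    report["variance_pct"] = (report["avg_unit_price"] / report["benchmark_price"] - 1) * 100

    # Positive variance means paying above benchmark: a candidate for action.
    print(report.sort_values("variance_pct", ascending=False))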

And at this point, while we don’t necessarily have everything the doctor would want in a modern spend analysis system, we almost have everything that is needed to meet the baseline, with one exception, and that’s the functionality needed to enable

Efficiency, which, in spend analysis, equates to the technical requirements that eliminate the need to “reinvent the wheel” every time the analysis effort needs to be repeated. The problem with traditional spend analysis systems is that any time the data changes, all of the work has to be repeated. A good system will remember and preserve everything that was done, identify only the changed and new data, and pull that in. Some systems do this okay, but if the underlying data source changes, they fall apart.

However, when there’s more than one user, which is the case in most organizations, the typical implementation creates a central “master” cube that everyone has to work off of. Usually each analyst creates a copy of that central cube and works off the copy; then, when the master cube is updated, they have to create a new copy and start all over.

Better systems will allow the user to pull in “just the new data” if the structure of the core cube hasn’t changed and the data can be mapped by the existing rules. But any time the base cube undergoes even a minor structural change, all of the analysts have to start again from scratch. This is mitigated, however, if the system supports

  • Inheritance: which creates every user’s cube as a sub-cube of another system cube or the master cube and, when any parent cube changes, uses the relationship to automatically propagate those changes without any effort required on the part of the user (a minimal sketch follows)
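One way to picture inheritance, purely as a hypothetical design sketch: each user cube is stored as a transformation of its parent, so refreshing the parent automatically rebuilds every descendant.

    # Minimal sketch (hypothetical design) of cube inheritance: a user cube is
    # defined as a transformation of its parent, so a parent refresh propagates
    # to every descendant with no analyst rework.
    import pandas as pd

    class Cube:
        def __init__(self, data=None, parent=None, transform=None):
            self.parent, self.transform, self.children = parent, transform, []
            self.data = data
            if parent is not None:
                parent.children.append(self)
                self.refresh()

        def refresh(self):
            """Re-derive this cube from its parent, then propagate to descendants."""
            if self.parent is not None:
                self.data = self.transform(self.parent.data)
            for child in self.children:
                child.refresh()

    master = Cube(data=pd.DataFrame({"category": ["IT", "MRO"], "amount": [100.0, 40.0]}))
    it_cube = Cube(parent=master, transform=lambda df: df[df["category"] == "IT"])

    # New data lands in the master; the sub-cube follows automatically.
    master.data = pd.DataFrame({"category": ["IT", "MRO"], "amount": [150.0, 45.0]})
    master.refresh()
    print(it_cube.data)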

There are, of course, other features and functions that can be added to increase efficiency even more, but this one capability makes a spend analysis system exponentially more efficient than any system that came before.

We should note that, as of today, only one spend analysis system supports full inheritance, but a couple support partial inheritance and are attempting to improve their offering. So keep this in mind when you are comparing solutions, as not all will be equal.

Continue to Part 12.

Source-to-Pay+ Is Extensive (P10) … What Do You Need For (A) Spend Analysis (Baseline), Installment 1

In Part 8 we briefly reviewed the major modules in Source-to-Pay in an attempt to identify which module to work on after e-Procurement, and concluded that you select Spend Analysis and start using it (even without integration) as soon as possible, because Spend Analysis not only helps your organization identify its best opportunities, but also which module should come next (in terms of implementation and integration).

Then, in Part 9 we elaborated on our comment that spend analysis can help you identify the most important Source-to-Pay modules for your organization based upon the types of opportunities that are identified. We identified situations in which Supplier Management, Contract Management, Risk Management, Source-to-Pay, and even I2P are relevant to capture opportunities. We did this to illustrate the criticality of getting going on spend analysis as soon as possible.

The next step is to identify what you need in a spend analysis solution. But before we can do that, we need to review the basic spend analysis process:

  • Extract: you need to extract the relevant data from the relevant applications
  • Load: you need to load the data into the spend analysis solution (and map it to a starting taxonomy)
  • Structure: you need to structure the data for the various types of analyses you want to perform
  • Analyze: you need to perform the analyses and get insight
  • Act: you need to take action, which involves initiating processes, tracking progress, and getting results

Looking at this process, you need whatever functionality is required to

  • Load,
  • Structure and
  • Analyze the data

Most older platforms don’t support modern API hooks or data transfer standards, so the reality is that you will need to export the data from those platforms yourself; any “extraction” in the spend analysis platform will be limited to requesting data, via the API calls the tool supports, in the standard formats it accepts. As a result, the “extraction” part of the process is mostly outside the scope of the spend analysis tool.

Similarly, most organizations will have, or want, to use other tools to create projects, assign actions, track progress, and so on. As a result, the “act”ion part of the process is often mostly outside the spend analysis tool with, of course, the ability to push the results out in a standard format through a supported API.

Thus, in order to define a solid spend analysis baseline, we need to define all of the functionality to

  • Load,
  • Structure and
  • Analyze the data

and, most importantly, do it in a manner that

  • supports efficiency.

In other words, the last thing you want to do is have to repeat the entire process every time data is updated or re-classified in the source system. In our next installment, Part 11, we will review the core functionality required for each of these four core requirements.

Coronavirus/COVID-19 Response: Analytics Can Help Get You Through the Crisis

In the first stage of the pandemic, mines close, processors close, or other suppliers of critical raw materials become unavailable, and your direct procurement becomes threatened. You have to identify new sources of supply quickly to maintain supply assurance, while also making the best selection for the business to keep total cost of ownership acceptable and predictable (as a lower-cost but risky alternative could put you back in the same position in a few months). You need good analytics to make the right decision.

In the second stage of the pandemic, factories close, certain distribution channels become unstable, distributor stockpiles run out, and indirect goods become scarce and problematic across key categories. And you need to respond. Good analytics will again be key, as you don’t want to be going back to market in three to six months, but you also need to keep costs down to ensure you have the cash to deal with cost spikes in direct lines where supply unavailability significantly tips the supply/demand balance or where costly expedited logistics will be needed. You again need good analytics to make the right decision.

And unless you have a modern best-of-breed Source-to-Pay suite with great analytics embedded or a best-of-breed stand-alone analytics solution, you don’t have anywhere close to what you need. Just a few of the questions you will need to answer include:

  • How much am I paying now for a product, and how much should I pay based on today’s commodity pricing and currency volatility?
  • How do I understand the cost impact of supplier failure?
  • How do I understand the cost impact of raw material availability?
  • How do I identify outliers that might signify future issues or opportunities?

… along with dozens more. So how do you answer these questions? What technologies do you choose? Check out the doctor’s CORONAVIRUS RESPONSE: Advanced Procurement Analytics — find the risks hiding in your data, prioritize and take action Pro piece over on Spend Matters. Even if you don’t have Pro access, the content in front of the paywall is still useful and might give you some ideas on where to start.
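As a flavour of the first question, here is a back-of-the-envelope sketch (all figures hypothetical) of adjusting a baseline price for commodity index and currency movement to get a should-pay price:

    # Minimal sketch (hypothetical figures) of "what should I pay today?":
    # adjust the baseline price for commodity index and currency movement.
    baseline_price = 10.00                 # unit price when the contract was struck
    commodity_share = 0.60                 # portion of the price driven by the commodity
    index_then, index_now = 100.0, 85.0    # commodity index at signing vs. today
    fx_then, fx_now = 1.00, 1.05           # currency rate at signing vs. today

    should_pay = baseline_price * (
        commodity_share * (index_now / index_then) + (1 - commodity_share)
    ) * (fx_now / fx_then)

    current_price = 9.90
    print(f"should pay ~{should_pay:.2f}, paying {current_price:.2f}, "
          f"gap {current_price - should_pay:+.2f}")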

Furthermore, No Modern 2020 Platform Will Be Without What-if?!

As you may have noticed, the doctor has been on a bit of a bent lately defining what a modern S2P platform is, as he’s completely fed up with all of the “digital” bullshit where marketers are trying to sell everything old like it’s new again, technically advertising solutions that are less powerful than what the doctor could code on his 8088 three decades ago! (It had a 2400 baud modem, so the requirement of network connectivity was even met.)

And if you think the doctor is being a bit extreme, go back and re-read the definitions of “digital” and “analysis”, ask some pointed questions to these vendors about what their solutions can really do, and you’ll find that maybe, just maybe, he’s not being that extreme at all. It’s sad how many vendors believe that a fancy new UX on a weak Procurement 2.0 solution all of a sudden makes it 3.0 and 4.0 ready, when all they are really doing is putting lipstick on a pig (and no self-respecting pig wants to wear lipstick)!

Yesterday we defined the levels of analytics and hopefully made it clear that there shouldn’t be a single platform on your consideration list that doesn’t have at least basic prescriptive capability, and that you should also make sure the vendor is on a permissive analytics journey before signing on the bottom line!

But that’s not all you need to demand in a platform. You also need to demand a platform with embedded What If? capability.

It’s going to be a while before the predictive analytics work across all the situations a procurement specialist needs them to work in, and even longer until the platform supports the insights needed for permissive analytics. But, in the interim, procurement specialists still need to extract value from analytics — and that value is going to come from What If?.

What If? demand next year is the same as this year: what will the total cost be if the cost stays flat? What if demand rises 10%?

What If? the delivery is late by a day? By 3 days? By a week? What if the order is routed to the backup supplier? The backup location?

What If? the supplier’s financial woes get worse? What if the supplier goes bankrupt?

What If? the contract milestone isn’t hit? What is the impact? What is the risk?

The procurement professional needs to be able to ask What If? throughout the platform and, more importantly, throughout the analytics. Some Reports should be interactive and allow the user to project the next quarter, year, etc. of data using current data and advanced What If? algorithms. Anything less won’t be enough.
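As a trivial illustration of what an interactive What If? projection boils down to (all figures hypothetical):

    # Minimal sketch (hypothetical figures) of a What If? projection: next
    # year's total cost under flat demand, a demand increase, and a demand
    # increase combined with a price rise.
    demand_this_year = 12_000          # units bought this year
    unit_cost = 4.25                   # current contracted unit cost

    scenarios = {
        "flat demand, flat cost": (1.00, 1.00),
        "demand +10%, flat cost": (1.10, 1.00),
        "demand +10%, cost +5%":  (1.10, 1.05),
    }

    for name, (demand_factor, cost_factor) in scenarios.items():
        total = demand_this_year * demand_factor * unit_cost * cost_factor
        print(f"{name}: {total:,.2f}")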

… And Advanced Analytics Should Be a Must in 2020!

Just like any vendor can claim to have a digital procurement solution because, as we clearly explained last week, email and spreadsheets technically count, any vendor can claim to have analytics. Consider the definition:

the analysis of data, typically large sets of business data, by the use of mathematics, statistics, and computer software

And then consider the common definition of analysis:

a presentation, usually in writing, of the results of this process

This means that any software that provides a canned report summarizing a data set (average, mean, etc.) qualifies. MRP software from four decades ago had canned reports that did this, so it qualifies too. Thus, since computers are modern in the grand scheme of human history, any vendor can tell you with a straight face that they have a modern platform with a modern analytics solution if it runs on a computer, supports bid collection in a spreadsheet, and contains a canned report summary — especially if they were an English or Arts Major (especially since we are in the post-modern phase in their worldview).

DO YOU REALLY WANT TWO-PLUS DECADES OLD TECHNOLOGY?

Think carefully about this — because if you don’t ask the right questions, use the right measuring stick, and get beyond this “digital” and baseline “analytics” crap, that’s precisely what you might get.

What you have to know is that there are levels to analysis. And while the number of levels might vary depending on how granular you want to get (there are at least five represented in today’s technology platforms), these are the seven levels the doctor likes to use.

1. Classificative
At this level, data is classified into buckets for the purpose of basic analytics.

2. Descriptive
At this level, basic statistics are run to compute summary, typically canned, reports on the data.

For decades, this is all you got, and many vendors still try to pass this off as sufficient.

3. Diagnostic
At this level, the user is given the ability either to define their own reports to drill in and find the potential root causes of issues identified in the canned reports, or to run more advanced statistics (beyond just average and mean) to identify correlations in the data that point to those root causes.

Most platforms developed or upgraded in the last five years in S2P, Sourcing, and Spend Analysis have this capability. But this is not enough any more, especially when there are do-it-yourself software packages for under 1K that can allow you to get to the next level, which has been around in specialized demand planning and analytics for decades.

4. Predictive
At this level, the platform employs statistical trend analysis, advanced clustering, and/or machine learning to identify trends and predict future costs, risks, performance, etc.

A few platforms are starting to incorporate this, but this should be a baseline requirement considering that ERPs, demand planning tools, and advanced BI tools have had at least some capability here for close to two decades.
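As a bare-bones illustration of the predictive level (Python 3.10+ standard library, hypothetical price history), even a simple linear trend fit yields a forward projection:

    # Minimal sketch (standard library, hypothetical price history) of the
    # predictive level: fit a linear trend and project the next period's cost.
    from statistics import linear_regression

    quarters = [1, 2, 3, 4, 5, 6]
    unit_price = [10.2, 10.4, 10.9, 11.1, 11.6, 11.8]

    slope, intercept = linear_regression(quarters, unit_price)
    next_quarter = 7
    print(f"trend {slope:+.3f}/quarter, forecast for Q{next_quarter}: "
          f"{slope * next_quarter + intercept:.2f}")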

5. Prescriptive
At this level, the platform is not just identifying and computing future trends, but providing advice on what to do as a result of those trends.

Leading platforms are starting down this path, but given that the foundations of prescriptive analytics have been around for over two decades, and that best practices in sourcing and procurement have been around almost as long, if a platform can’t provide not only insight but also recommendations on what to do with that insight, it will never even achieve 3.0 objectives … meaning 4.0 will never be a reality.

In other words, any platform without some prescriptive capability is behind and not one you should be investing in.

6. Permissive
At this level, the prescriptive analytics are used to power automatic actions based on embedded rules. If the platform determines that a commodity typically on a one-year contract is at an all-time low, it might initiate the renewal event two months early to lock in a rate, provided a rule is defined that says events can be initiated up to three months early when prices drop below contracted rates and are projected to be within 2% of the projected low.
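A minimal sketch of such a rule, with hypothetical thresholds and data, just to show how little it takes for a platform to act on a prescription:

    # Minimal sketch (hypothetical thresholds and data) of a permissive rule:
    # auto-initiate an early renewal event when forecast conditions are met.
    from datetime import date, timedelta

    contract_end = date(2020, 9, 30)
    contracted_rate = 100.0
    current_price = 92.0
    projected_low = 91.0
    today = date(2020, 7, 15)

    rule = {
        "max_early_start_days": 90,           # events may start up to ~3 months early
        "max_pct_above_projected_low": 2.0,   # price must be within 2% of the projected low
    }

    window_open = today >= contract_end - timedelta(days=rule["max_early_start_days"])
    below_contract = current_price < contracted_rate
    near_low = current_price <= projected_low * (1 + rule["max_pct_above_projected_low"] / 100)

    if window_open and below_contract and near_low:
        print("Initiating renewal event early to lock in the rate.")
    else:
        print("Conditions not met; waiting.")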

Few platforms are here, but you should be looking for a configurable platform with rules that permit simple automation based on both entered and derived data values from the application and the data it contains. Permissive analytics is a cornerstone of the Procurement 4.0 promise, so make sure your chosen vendor is building in permissive analytics capability. It can be fledgling to start, but something needs to be there, or it won’t be there when you need it.

7. Cognitive
At this level, the platform embeds machine learning and advanced AI techniques to not only make good predictions but also choose the right actions to take on those predictions, without any user intervention, for run-of-the-mill sourcing and procurement processes and events. When we reach Procurement 4.0, such systems will not only eliminate 98% of tactical work to allow buyers to focus on the strategic, but also eliminate 90%+ of strategic work identified as relatively low value (at the time) and allow buyers to focus on strategic efforts that present the greatest opportunity to provide value … truly optimizing the limited Procurement resources available.