Category Archives: Spend Analysis

Coronavirus/COVID-19 Response: Analytics Can Help Get You Through the Crisis

In the first stage of the pandemic, mines close, processors close, or other suppliers of critical raw materials become unavailable, and your direct procurement becomes threatened. You have to identify new sources of supply quickly to maintain supply assurance, while also making the best selection for the business to keep total cost of ownership acceptable and predictable (as a lower-cost but riskier alternative could put you back in the same position in a few months). You need good analytics to make the right decision.

In the second stage of the pandemic, factories close, certain distribution channels become unstable, distributor stockpiles run out, and indirect goods become scarce and problematic across key categories. And you need to respond. Good analytics will again be key, as you don’t want to be going back to market in three to six months, but you also need to keep costs down to ensure you have the cash to deal with cost spikes in direct lines where supply unavailability significantly tips the supply/demand balance or where costly expedited logistics will be needed. You again need good analytics to make the right decision.

And unless you have a modern best-of-breed Source-to-Pay suite with great analytics embedded or a best-of-breed stand-alone analytics solution, you don’t have anywhere close to what you need. Just a few of the questions you will need to answer include:

  • How much am I paying now for a product, and how much should I pay based on today’s commodity pricing and currency volatility?
  • How do I understand the cost impact of supplier failure?
  • How do I understand the cost impact of raw material availability?
  • How do I identify outliers that might signify future issues or opportunities?

… along with dozens more. So how do you answer these questions? What technologies do you choose? Check out the doctor’s CORONAVIRUS RESPONSE: Advanced Procurement Analytics — find the risks hiding in your data, prioritize and take action Pro piece over on Spend Matters. Even if you don’t have Pro access, the content in front of the paywall is still useful and might give you some ideas on where to start.
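To give a flavour of the first question above (what should I be paying today given commodity and currency movement), here is a minimal, purely illustrative sketch; the prices, material share, and index movements are hypothetical placeholders, not a recommended should-cost model:

```python
# Minimal should-cost sketch: adjust a baseline unit price for commodity
# and currency movement. All figures are hypothetical placeholders.

def should_cost(base_price, material_share, commodity_change, fx_change):
    """Estimate what you 'should' pay today.

    base_price       -- unit price at the time of the last award
    material_share   -- fraction of the price driven by the raw material
    commodity_change -- fractional change in the commodity index since award
    fx_change        -- fractional change in the supplier's currency vs. yours
    """
    material_adjusted = base_price * material_share * (1 + commodity_change)
    non_material = base_price * (1 - material_share)
    return (material_adjusted + non_material) * (1 + fx_change)

# Example: $10.00 unit price, 60% material content, commodity index up 8%,
# supplier currency up 3% against the buying currency.
print(round(should_cost(10.00, 0.60, 0.08, 0.03), 2))  # ~10.79
```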

Furthermore, No Modern 2020 Platform Will Be Without What-if?!

As you may have noticed, the doctor has been on a bit of a bent lately defining what a modern S2P platform is, as he’s completely fed up with all of the “digital” bullshit where marketers are trying to sell everything old like it’s new again, advertising solutions that are technically less powerful than what the doctor could code on his 8088 three decades ago! (It had a 2400 baud modem, so the requirement of network connectivity was even met.)

And if you think the doctor is being a bit extreme, go back and re-read the definitions of “digital” and “analysis”, ask some pointed questions to these vendors about what their solutions can really do, and you’ll find that maybe, just maybe, he’s not being that extreme at all. It’s sad how many vendors believe that a fancy new UX on a weak Procurement 2.0 solution all of a sudden makes it 3.0 and 4.0 ready when all they are really doing is putting lipstick on a pig (and no self-respecting pig wants to wear lipstick)!

Yesterday we defined the levels of analytics and hopefully made it clear that there shouldn’t be a single platform on your consideration list that doesn’t have at least basic prescriptive capability, and that you should also make sure the vendor is on the journey to permissive analytics before signing on the bottom line!

But that’s not all you need to demand in a platform. You also need to demand a platform with embedded What If? capability.

It’s going to be a while before the predictive analytics work across all the situations a procurement specialist needs them to work in, and even longer until the platform supports the insights needed for permissive analytics. But, in the interim, procurement specialists still need to extract value from analytics — and that value is going to come from What If?.

What If? demand next year is the same as this year: what will the total cost be if the unit cost stays flat? What if demand rises 10%?

What If? the delivery is late by a day? By 3 days? By a week? What if the order is routed to the backup supplier? The backup location?

What If? the supplier’s financial woes get worse? What if the supplier goes bankrupt?

What If? the contract milestone isn’t hit? What is the impact? What is the risk?

The procurement professional needs to be able to ask What If? throughout the platform and, more importantly, throughout the analytics. Reports should be interactive and allow the user to project the next quarter, year, etc. of data using current data and advanced What If? algorithms. Anything less won’t be enough.
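To make the idea concrete, here is a minimal sketch of a What If? projection on spend data; the demand and cost figures are invented for the example, and any real platform would, of course, drive this from live data:

```python
# Illustrative What If? projection: next year's spend under demand and
# unit-cost scenarios. All numbers are invented for the example.

baseline = {"unit_cost": 12.50, "annual_demand": 40_000}

scenarios = {
    "demand flat, cost flat": {"demand_change": 0.00, "cost_change": 0.00},
    "demand +10%, cost flat": {"demand_change": 0.10, "cost_change": 0.00},
    "demand +10%, cost +5%":  {"demand_change": 0.10, "cost_change": 0.05},
}

for name, s in scenarios.items():
    demand = baseline["annual_demand"] * (1 + s["demand_change"])
    unit_cost = baseline["unit_cost"] * (1 + s["cost_change"])
    print(f"{name}: projected spend = {demand * unit_cost:,.0f}")
```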

… And Advanced Analytics Should Be a Must in 2020!

Just like any vendor can claim to have a digital procurement solution because, as we clearly explained last week, email and spreadsheets technically count, any vendor can claim to have analytics. Consider the definition:

the analysis of data, typically large sets of business data, by the use of mathematics, statistics, and computer software

And then consider the common definition of analysis:

a presentation, usually in writing, of the results of this process

This means that any software that provides a canned report summarizing a data set (average, mean, etc.) qualifies. MRP software from four decades ago had canned reports that did this, and so it qualifies. Thus, since computers are modern in the grand scheme of human history, any vendor can tell you with a straight face that they have a modern platform with a modern analytics solution if it runs on a computer, supports bid collection in a spreadsheet, and contains a canned report summary — especially if they were an English or Arts major (especially since we are in the post-modern phase in their worldview).

DO YOU REALLY WANT TWO-PLUS DECADES OLD TECHNOLOGY?

Think carefully about this — because if you don’t ask the right questions and use the right measuring stick, that’s precisely what you’ll get unless you see beyond this “digital” and baseline “analytics” crap.

What you have to know is that there are levels to analysis. And while the number of levels might vary depending on how granular you want to get (today’s technology platforms span at least five of them), these are the seven levels the doctor likes to use.

1. Classificative
At this level, data is classified into buckets for the purpose of basic analytics.
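As a trivial sketch of what this looks like in practice, consider a rules-based mapper that drops transactions into buckets by keyword (the keywords and categories below are placeholders, not a real taxonomy):

```python
# Toy rules-based classifier: map transaction descriptions to spend buckets.
# Keywords and categories are placeholders, not a real taxonomy.

RULES = {
    "Office Supplies": ["paper", "toner", "stationery"],
    "IT Hardware":     ["laptop", "monitor", "server"],
    "Logistics":       ["freight", "courier", "shipping"],
}

def classify(description):
    text = description.lower()
    for category, keywords in RULES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "Unclassified"

print(classify("HP LaserJet toner cartridge"))   # Office Supplies
print(classify("Inbound freight, Q3 invoices"))  # Logistics
```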

2. Descriptive
At this level, basic statistics are run to compute summary, typically canned, reports on the data.

For decades, this is all you got, and many vendors still try to pass this off as sufficient.
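In code terms, descriptive analytics amounts to little more than canned summary statistics over the classified data, along the lines of this sketch (the amounts are invented):

```python
# Descriptive analytics in a nutshell: canned summary statistics per bucket.
# Amounts are invented for the example.
from statistics import mean, median

spend = {
    "Office Supplies": [120.0, 75.5, 310.0, 42.9],
    "IT Hardware":     [1450.0, 980.0, 2300.0],
}

for category, amounts in spend.items():
    print(category,
          "total:", round(sum(amounts), 2),
          "mean:", round(mean(amounts), 2),
          "median:", round(median(amounts), 2))
```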

3. Diagnostic
At this level, the user is given the ability either to define their own reports and drill in to find the potential root causes of issues identified in those reports, or to run more advanced statistics (beyond just averages and medians) to identify correlations in the data that point to potential root causes of issues.

Most platforms developed or upgraded in the last five years in S2P, Sourcing, and Spend Analysis have this capability. But this is not enough anymore, especially when there are do-it-yourself software packages for under $1K that can allow you to get to the next level, which has been around in specialized demand planning and analytics for decades.
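A minimal sketch of the drill-down idea, using pandas on a hypothetical data set (columns and figures invented), might look like:

```python
# Diagnostic sketch: drill into a category to see which supplier and plant
# drive the spend. DataFrame contents are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "category": ["Packaging"] * 6,
    "supplier": ["A", "A", "B", "B", "C", "C"],
    "plant":    ["East", "West", "East", "West", "East", "West"],
    "spend":    [120_000, 80_000, 310_000, 95_000, 60_000, 55_000],
})

# Roll up by supplier, then drill into the outlier by plant.
print(df.groupby("supplier")["spend"].sum().sort_values(ascending=False))
print(df[df["supplier"] == "B"].groupby("plant")["spend"].sum())
```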

4. Predictive
At this level, the platform employs statistical trend analysis, advanced clustering, and/or machine learning to identify trends and predict future costs, risks, performance, etc.

A few platforms are starting to incorporate this, but this should be a baseline requirement considering ERPs, demand planning, and advanced BI tools have had at least some capability here for close to two decades.
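At its simplest, predictive here means fitting a trend to history and extrapolating, as in this sketch (quarterly prices made up for the example; real platforms would use far richer models):

```python
# Predictive sketch: fit a linear trend to historical unit prices and
# extrapolate one period ahead. Prices are made up for the example.
import numpy as np

prices = np.array([10.2, 10.5, 10.4, 10.9, 11.1, 11.4])  # last six quarters
periods = np.arange(len(prices))

slope, intercept = np.polyfit(periods, prices, 1)
next_quarter = slope * len(prices) + intercept
print(f"Projected next-quarter price: {next_quarter:.2f}")
```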

5. Prescriptive
At this level, the platform is not just identifying and computing future trends, but providing advice on what to do as a result of those trends.

Leading platforms are starting down this path, but given that the foundations of prescriptive analytics have been around for over two decades, and that best practices in sourcing and procurement have been around almost as long, if a platform can’t provide not only insight but also recommendations on what to do with that insight, it will never even achieve 3.0 objectives … meaning 4.0 will never be a reality.

In other words, any platform without some prescriptive capability is behind and not one you should be investing in.
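As a trivial sketch of the difference between predicting and prescribing, consider a rule that turns a projected price movement into a recommended action (the thresholds and wording are placeholders, not best-practice values):

```python
# Prescriptive sketch: turn a projected price trend into a recommended action.
# Thresholds and wording are placeholders, not best-practice values.

def recommend(current_price, projected_price, contracted_price):
    change = (projected_price - current_price) / current_price
    if projected_price > contracted_price:
        return "Hold: the contracted rate beats the projected market rate."
    if change <= -0.03:
        return "Wait: prices are projected to fall; defer the buy if possible."
    if change >= 0.03:
        return "Act: prices are projected to rise; lock in volume now."
    return "Neutral: no price-driven action recommended."

print(recommend(current_price=11.40, projected_price=11.85, contracted_price=12.00))
```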

6. Permissive
At this level, the prescriptive analytics are used to power automatic actions based on embedded rules. If the platform determines that a commodity typically on a one-year contract is at an all-time low, it might initiate the renewal event two months early to lock a rate in, provided a rule is defined that says events can be initiated up to three months early if prices drop below contracted rates and are projected to be within 2% of the projected low.

Few platforms are here, but you should be looking for a configurable platform with rules that permit simple automation based on both entered and derived data values from the application and the data it contains. Permissive analytics is a cornerstone of the Procurement 4.0 promise, so make sure your chosen vendor is building in permissive analytic capability. It can be fledgling to start, but something needs to be there or it won’t be there when you need it.
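Using the renewal example above, a permissive rule might look something like the sketch below; the rule parameters and the initiate_event() call are hypothetical, not a real platform API:

```python
# Permissive sketch: a rule that auto-initiates a renewal event early when
# market prices dip. Rule parameters and initiate_event() are hypothetical.
from datetime import date, timedelta

def initiate_event(event_type, due_date):
    # Stand-in for a real platform call that kicks off a sourcing event.
    print(f"Initiating {event_type} event ahead of {due_date}")

def maybe_initiate_renewal(contract_end, contracted_price,
                           market_price, projected_low,
                           max_early_days=90, low_tolerance=0.02):
    early_window_open = contract_end - timedelta(days=max_early_days)
    below_contract = market_price < contracted_price
    near_projected_low = market_price <= projected_low * (1 + low_tolerance)
    if date.today() >= early_window_open and below_contract and near_projected_low:
        initiate_event("renewal", contract_end)
        return True
    return False

# Example: contract ends in 60 days, market is below contract and near the
# projected low, so the renewal event fires early.
print(maybe_initiate_renewal(date.today() + timedelta(days=60),
                             contracted_price=12.00, market_price=11.40,
                             projected_low=11.30))
```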

7. Cognitive
At this level, the platform embeds machine learning and advanced AI techniques to not only make good predictions but choose the right actions to take on those predictions without any user intervention for run-of-the-mill sourcing and procurement processes and events. When we reach Procurement 4.0, such systems will not only eliminate 98% of tactical work to allow buyers to focus on the strategic, but eliminate 90%+ of strategic work identified as relatively low value (at the time) and allow buyers to focus on strategic efforts that present the greatest opportunity to provide value … truly optimizing the limited Procurement resources available.

20 Analytics Predictions from the “Experts” for 2020 Part I

Guess how many will be 100% accurate?

(We’ll give you a hint. You only need one hand. You won’t need your thumb. And you’ll probably have fingers to spare.)

the doctor has been scouring the internet for the usual prediction articles to see what 2020 won’t have in store. Because if there is just one thing overly optimistic futurist authors are good at, it’s pointing out what won’t be happening anytime soon, even though it should be.

This is not to say they’re all bust — some will materialize eventually and others indicate where a turning point may be needed — but they’re definitely not this year’s reality (and maybe not even this decade’s).

So, to pump some reality into the picture, the doctor is going to discuss the 19 anti-predictions that are taking over mainstream Net media … and then discuss the 1 prediction he found that is entirely 100% accurate.

In no particular order, we’ll take the predictions one by one.

Performance benchmarks will be replaced by efficiency benchmarks

This absolutely needs to happen. Performance benchmarks only tell you how good you’ve done, not how good you are going to do in the future. The only indication of that is how good you are doing now, and this is best measured by efficiency. But since pretty much all analytics vendors are just getting good at performance benchmarks and dashboards, you can bet efficiency is still a long way off.

IoT becomes queryable and analyzable

… but not in real-time. Right now, the best that will happen is that the signals will get pushed into a database on a near-real time schedule (which will be at least daily), indexed on a near-real time basis (at least daily), and support meaningful queries that can provide real, usable, actionable information that will help users make decisions faster than ever before (but not yet real-time).

Rise of data micro-services

Data micro-services will continue to proliferate, but does this mean that they will truly rise, especially in a business — or Procurement — context? The best that will happen is that more analytics vendors will integrate more useful data streams for their clients to make use of — market data, risk data, supplier data, product data, etc. — but real-time micro-service subscriptions are likely still a few years off.

More in-memory processing

In-memory processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where more vendors will spend big on memory and move to all in-memory processing, or abandon it.

More natural-language processing

Natural language processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where more vendors will dive in any faster or abandon it. It’s the same-old, same-old.

Graph analytics

Graph analytics will continue to worm its way into analytics platforms, but this won’t be the year it breaks out and takes over. Most vendors are still using traditional relational databases … object databases are still a stretch.

Augmented analytics

The definition of augmented is a system that can learn from human feedback and provide better insights and/or recommendations over time. While we do have good machine learning technology that can learn from human interaction and optimize (work)flows, when it comes to analytics, good insights come from identifying the right data to present to the user and, in particular, data that extends beyond organizational data, such as current market rates, supplier risk data, product performance data, etc.

Until we have analytics platforms that are tightly integrated with the right market and external data, and machine learning that learns not just from user workflows on internal data, but external data and human decisions based on that external data, we’re not going to have much in the way of useful augmented analytics in spend analysis platforms. The few exceptions in the next few years will be those analytics vendors that live inside consultancies that do category management, GPO sourcing, and similar services that collect meaningful market data on categories and savings percentages to help customers do relevant opportunity analysis.

Predictive analytics

As with natural language processing, predictive analytics will continue to be the same-old, same-old predictions based on traditional trend analysis. There won’t be much ground-breaking here, as only the vendors that are working on neural networks, deep learning, and other AI technologies will make any advancements — but the majority of these vendors are not (spend) analytics vendors.

Data automation

RPA is picking up, but like in-memory processing and semantic technology, it’s not going to all-of-a-sudden become mainstream, especially in analytics. Especially since it’s not just automating input and out-of-the-box reports that is useful, but automating processes that provide insight. And, as per our discussion of augmented analytics, insight requires external data integrated with internal data in meaningful trends.

No-code analytics

Cue the Woody Woodpecker laugh track please! Because true analytics is anything but low-code, let alone no-code. It’s lots and lots and lots of code. Hundreds and thousands and hundreds of thousands of lines of code. Maybe the UI makes it easy to build reports and extract insights with point-and-click and drag-and-drop, and allows an average user to do it without scripting, but the analytics provider will be writing even more code than you know to make that happen.

Come back tomorrow as we tackle the next ten.

A Single Version of Truth!

Today’s guest post is from the spend master himself, Eric Strovink of Spendata.

An oft-repeated benefit of data warehouses in general, and spend analysis systems specifically, is the promise of “a single version of truth.” The argument goes like this: in order to take action on any savings initiative, company stakeholders must first agree on the structure and organization of the data. Then, and only then, can real progress be made.

The problem, of course, is that truth is slippery when it comes to spend data. What, for example, is “tail spend”? Even pundits can’t agree. Should IT labor be mapped to Professional Services, HR, or Technology? For that matter, what should a Commodity structure look like in the first place? Can anyone agree on a Cost Center hierarchy, when there are different versions of the org chart due to acquisitions, dotted-line responsibilities, and other (necessary) inconsistencies?

What tends to happen is that the “single version of truth” ends up being driven by a set of committee decisions, resulting in generic spending data that is much less useful than it could be. Spend analysts uncover opportunities by creating new data relationships to drive insights, not by running displays or reports against static data. So, when the time comes to propose savings initiatives, the very system that’s supposed to support decision-making is less useful than it should be; or worst-case, not useful at all.

Questions and Answers: Metadata

Do we have preferred vendors? Do buyers and stakeholders agree on which vendors are preferred? What vendors are “untouchable” because of long-term contracts or other entanglements? For that matter, with which vendors do we actually have contracts, and what do we mean by “contract”? Are there policies that mandate against a particular savings initiative, such as lack of centralized control over laser printer procurement, or the absence of a policy on buying service contracts? Can we identify and annotate opportunities and non-opportunities, by vendor or by Commodity?

The answers to these (and many other) questions produce “metadata” that needs to be combined with spend data in order to inform the next steps in a savings program. The nature of this metadata is that it’s almost certainly inaccurate when first entered. We’ll need to modify it, pretty much continually, as we learn more; for example, finding out that although John may have dealt with Vendor X and has correctly indicated that he’s dealt with them, it’s actually Carol who owns the relationship. We may also determine that the Commodity mapping isn’t helpful; network wiring, for example, might need to belong with IT, not Facilities.

Alternative Truths

As we add more and more metadata to the system — information that is critical to driving a savings program — we encounter the need to refine and reorganize data to reflect new insights and new information. Data organization is often quite purpose-specific, so multiple different versions of the data must be able to be spawned quickly and coexist without issues. This requires an agile system with completely different characteristics than a centralized system with an inflexible structure and a large audience. In essence, one must learn to become comfortable with alternative truths, because they are essential to the analysis process.

So what happens to the centralized spend analysis system, proudly trotted out to multiple users, with its “single version of truth?” Well, it chugs along in the background, making its committee members happy. Meanwhile, the real work of spend analysis must be (and is) done elsewhere.

Thanks, Eric!