Category Archives: Spend Analysis

Furthermore, No Modern 2020 Platform Will Be Without What-if?!

As you may have noticed, the doctor has been on a bit of a bent lately defining what a modern S2P platform is, as he's completely fed up with all of the "digital" bullshit where marketers are trying to sell everything old like it's new again, technically advertising solutions that are less powerful than what the doctor could code on his 8088 three decades ago! (It had a 2400 baud modem, so the requirement of network connectivity was even met.)

And if you think the doctor is being a bit extreme, go back and re-read the definitions of "digital" and "analysis", ask these vendors some pointed questions about what their solutions can really do, and you'll find that maybe, just maybe, he's not being that extreme at all. It's sad how many vendors believe that a fancy new UX on a weak Procurement 2.0 solution all of a sudden makes it 3.0 and 4.0 ready when all they are really doing is putting lipstick on a pig (and no self-respecting pig wants to wear lipstick)!

Yesterday we defined the levels of analytics and hopefully made it clear that there shouldn't be a single platform on your consideration list that doesn't have at least basic prescriptive capability, and that you should also make sure the vendor is on a journey toward permissive analytics before signing on the dotted line!

But that’s not all you need to demand in a platform. You also need to demand a platform with embedded What If? capability.

It’s going to be a while before the predictive analytics work across all the situations a procurement specialist needs them to work in, and even longer until the platform supports the insights needed for permissive analytics. But, in the interim, procurement specialists still need to extract value from analytics — and that value is going to come from What If?.

What If? demand next year is the same as this year and the cost stays flat, what will the total cost be? What if demand rises 10%?

What If? the delivery is late by a day? By 3 days? By a week? What if the order is routed to the backup supplier? The backup location?

What If? the supplier’s financial woes get worse? What if the supplier goes bankrupt?

What If? the contract milestone isn’t hit? What is the impact? What is the risk?

The procurement professional needs to be able to ask What If? throughout the platform and, more importantly, throughout the analytics. Reports should be interactive and allow the user to project the next quarter, year, etc. of data using current data and advanced What If? algorithms. Anything less won’t be enough.
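For the technically inclined, here is a minimal sketch of what such a What If? projection boils down to under the hood. This is not any particular vendor's implementation; the baseline figures and scenario names are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A hypothetical What If? scenario: percentage changes applied to the baseline."""
    demand_change: float = 0.0   # e.g. 0.10 means demand rises 10%
    cost_change: float = 0.0     # e.g. 0.05 means unit cost rises 5%

def project_total_cost(current_demand: float, current_unit_cost: float,
                       scenario: Scenario) -> float:
    """Project next year's total cost under the given scenario."""
    projected_demand = current_demand * (1 + scenario.demand_change)
    projected_cost = current_unit_cost * (1 + scenario.cost_change)
    return projected_demand * projected_cost

# Illustrative baseline: 10,000 units at $42.50 each (hypothetical figures).
flat = project_total_cost(10_000, 42.50, Scenario())                          # demand and cost flat
demand_up = project_total_cost(10_000, 42.50, Scenario(demand_change=0.10))   # demand +10%

print(f"Flat scenario:        ${flat:,.2f}")
print(f"Demand +10% scenario: ${demand_up:,.2f}")
```

An interactive report would simply let the user drag the scenario parameters and recompute on the fly.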

… And Advanced Analytics Should Be a Must in 2020!

Just like any vendor can claim to have a digital procurement solution because, as we clearly explained last week, email and spreadsheets technically count, any vendor can claim to have analytics. Consider the definition:

the analysis of data, typically large sets of business data, by the use of mathematics, statistics, and computer software

And then consider the common definition of analysis:

a presentation, usually in writing, of the results of this process

This means that any software that provides a canned report summarizing a data set (average, mean, etc.) qualifies. MRP software from four decades ago had canned reports that did this, and it qualifies. Thus, since computers are modern in the grand scheme of human history, any vendor can tell you with a straight face that they have a modern platform with a modern analytics solution if it runs on a computer, supports bid collection in a spreadsheet, and contains a canned report summary — especially if they were an English or Arts Major (especially since we are in the post-modern phase in their worldview).

DO YOU REALLY WANT TWO-PLUS DECADES OLD TECHNOLOGY?

Think carefully about this — because if you don’t ask the right questions, use the right measuring stick, and get beyond this “digital” and baseline “analytics” crap, that’s precisely what you might get.

What you have to know is that there are levels to analysis. And while the number of levels might vary depending on how granular you want to get, there are at least five represented in today’s technology platforms, and these are the seven levels the doctor likes to use.

1. Classificative
At this level, data is classified into buckets for the purpose of basic analytics.

2. Descriptive
At this level, basic statistics are run to compute summary, typically canned, reports on the data.

For decades, this is all you got, and many vendors still try to pass this off as sufficient.
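For illustration only, a canned descriptive report is little more than grouped summary statistics over the spend table. A minimal pandas sketch (the columns and amounts are invented):

```python
import pandas as pd

# Hypothetical spend records; real data would come from AP/PO extracts.
spend = pd.DataFrame({
    "category": ["IT", "IT", "Facilities", "Facilities", "IT"],
    "amount":   [12000, 8500, 4300, 9900, 15000],
})

# A "canned" descriptive report: total, average, and transaction count per category.
report = spend.groupby("category")["amount"].agg(["sum", "mean", "count"])
print(report)
```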

3. Diagnostic
At this level, the user is given the ability to define their own reports and drill in to find potential root causes of issues identified in those reports, or to run more advanced statistics (beyond just average and mean) that identify correlations in the data pointing to potential root causes.

Most platforms developed or upgraded in the last five years in S2P, Sourcing, and Spend Analysis have this capability. But this is not enough any more, especially when there are do-it-yourself software packages for under 1K that allow you to get to the next level, which has been around in specialized demand planning and analytics tools for decades.
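As a rough illustration of the correlation half of that capability, something like the following surfaces candidate root causes for further drill-down. The data and column names are invented, and a strong correlation is only a lead, not a conclusion.

```python
import pandas as pd

# Hypothetical monthly data joining spend with a couple of operational drivers.
monthly = pd.DataFrame({
    "freight_spend":    [110, 132, 128, 150, 171, 165],
    "expedited_orders": [12, 18, 17, 25, 31, 29],
    "headcount":        [40, 41, 40, 42, 41, 42],
})

# Correlate every driver against freight spend; a high value flags a candidate
# root cause that still needs a drill-down to confirm.
correlations = (monthly.corr()["freight_spend"]
                .drop("freight_spend")
                .sort_values(ascending=False))
print(correlations)
```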

4. Predictive
At this level, the platform employs statistical trend analysis, advanced clustering, and/or machine learning to identify trends and predict future costs, risks, performance, etc.

A few platforms are starting to incorporate this, but this should be a baseline requirement considering ERPs, demand planning, and advanced BI tools have had at least some capability here for close to two decades.
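A minimal sketch of the kind of statistical trend analysis meant here: fit a simple linear trend to historical unit prices and project it forward. This is purely illustrative (and the prices invented); real platforms will use far more sophisticated models.

```python
import numpy as np

# Hypothetical trailing twelve months of unit prices for a commodity.
prices = np.array([10.2, 10.4, 10.3, 10.6, 10.8, 10.7,
                   11.0, 11.1, 11.3, 11.2, 11.5, 11.6])
months = np.arange(len(prices))

# Fit a linear trend and project the next three months.
slope, intercept = np.polyfit(months, prices, deg=1)
future_months = np.arange(len(prices), len(prices) + 3)
forecast = slope * future_months + intercept

print(f"Trend: {slope:+.3f} per month")
print("Next three months:", np.round(forecast, 2))
```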

5. Prescriptive
At this level, the platform is not just identifying and computing future trends, but providing advice on what to do as a result of those trends.

Leading platforms are starting down this path, but given that the foundations of prescriptive analytics have been around for over two decades and that best practices in sourcing and procurement have been around almost as long, if a platform can’t provide not only insight but also recommendations on what to do with that insight, it will never even achieve 3.0 objectives … meaning 4.0 will never be a reality.

In other words, any platform without some prescriptive capability is behind and not one you should be investing in.
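To make the jump from predictive to prescriptive concrete, here is a toy sketch that turns a price prediction into advice. The thresholds are invented for illustration and are not a best-practice rule set.

```python
def recommend_action(current_price: float, contracted_price: float,
                     projected_low: float) -> str:
    """Turn a price prediction into advice (illustrative thresholds only)."""
    # Price is below contract and close to the projected low: act now.
    if current_price < contracted_price and current_price <= projected_low * 1.02:
        return "Recommend: bring the renewal event forward and lock in the current rate."
    # Price has drifted well above contract: look for alternatives.
    if current_price > contracted_price * 1.10:
        return "Recommend: investigate demand shifting or alternate suppliers."
    return "Recommend: no action; continue monitoring the trend."

print(recommend_action(current_price=9.80, contracted_price=10.50, projected_low=9.70))
```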

6. Permissive
At this level, the prescriptive analytics are used to power automatic actions based on embedded rules. For example, if the platform determines that a commodity typically bought on a one-year contract is at an all-time low, it might initiate the renewal event two months early to lock in a rate, provided a rule is defined that says events can be initiated up to three months early when prices drop below contracted rates and are projected to be within 2% of the projected low.

Few platforms are here, but you should be looking for a configurable platform with rules that permit simple automation based on both entered and derived data values from the application and the data it contains. Permissive analytics is a cornerstone of the Procurement 4.0 promise, so make sure your chosen vendor is building in permissive analytic capability. It can be fledgling to start, but something needs to be there, or it won’t be there when you need it.
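A rough sketch of how the rule described above might be expressed, assuming a hypothetical initiate_sourcing_event() hook standing in for whatever the platform actually exposes. Nothing here reflects a real platform's API.

```python
from datetime import date, timedelta

def initiate_sourcing_event(contract_end: date) -> None:
    # Stand-in for the platform's actual event-creation call (hypothetical).
    print(f"Renewal event initiated ahead of contract end on {contract_end}.")

def maybe_initiate_renewal(current_price: float, contracted_price: float,
                           projected_low: float, contract_end: date,
                           max_early_days: int = 90) -> bool:
    """Permissive rule: start the renewal early if the price has dropped below
    contract and sits within 2% of the projected low, no more than 90 days out."""
    price_trigger = (current_price < contracted_price
                     and current_price <= projected_low * 1.02)
    earliest_start = contract_end - timedelta(days=max_early_days)
    if price_trigger and date.today() >= earliest_start:
        initiate_sourcing_event(contract_end)
        return True
    return False

# Illustrative run: contract ends in two months, prices near the projected low.
maybe_initiate_renewal(9.80, 10.50, 9.70,
                       contract_end=date.today() + timedelta(days=60))
```

The point is that the thresholds and look-ahead window are configuration, not code, in a properly permissive platform.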

7. Cognitive
At this level, the platform embeds machine learning and advanced AI techniques to not only make good predictions but choose the right actions to take on those predictions without any user intervention for run-of-the-mill sourcing and procurement processes and events. When we reach Procurement 4.0, such systems will not only eliminate 98% of tactical work to allow buyers to focus on the strategic, but eliminate 90%+ of strategic work identified as relatively low value (at the time) and allow buyers to focus on strategic efforts that present the greatest opportunity to provide value … truly optimizing the limited Procurement resources available.

20 Analytics Predictions from the “Experts” for 2020 Part I

Guess how many will be 100% accurate?

(We’ll give you a hint. You only need one hand. You won’t need your thumb. And you’ll probably have fingers to spare.)

the doctor has been scouring the internet for the usual prediction articles to see what 2020 won’t have in store. Because if there is just one thing overly optimistic futurist authors are good at, it’s pointing out what won’t be happening anytime soon, even though it should be.

This is not to say they’re all bust — some will materialize eventually and others indicate where a turning point may be needed — but they’re definitely not this year’s reality (and maybe not even this decade’s).

So, to pump some reality into the picture, the doctor is going to discuss the 19 anti-predictions that are taking over mainstream Net media … and then discuss the 1 prediction he found that is entirely 100% accurate.

In no particular order, we’ll take the predictions one by one.

Performance benchmarks will be replaced by efficiency benchmarks

This absolutely needs to happen. Performance benchmarks only tell you how well you’ve done, not how well you are going to do in the future. The only indication of that is how well you are doing now, and that is best measured by efficiency. But since pretty much all analytics vendors are just getting good at performance benchmarks and dashboards, you can bet efficiency benchmarks are still a long way off.

IoT becomes queryable and analyzable

… but not in real-time. Right now, the best that will happen is that the signals will get pushed into a database on a near-real time schedule (which will be at least daily), indexed on a near-real time basis (at least daily), and support meaningful queries that can provide real, usable, actionable information that will help users make decisions faster than ever before (but not yet real-time).

Rise of data micro-services

Data micro-services will continue to proliferate, but does this mean that they will truly rise, especially in a business — or Procurement — context? The best that will happen is that more analytics vendors will integrate more useful data streams for their clients to make use of — market data, risk data, supplier data, product data, etc. — but real-time micro-service subscriptions are likely still a few years off.

More in-memory processing

In-memory processing will continue to increase at the same rate it’s been increasing at for the last decade. No more, no less. We’re not at the point where more vendors will spend big on memory and move to all in-memory processing, or abandon it.

More natural-language processing

Natural language processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where vendors will dive in any faster or abandon it. It’s the same-old, same-old.

Graph analytics

Graph analytics will continue to worm its way into analytics platforms, but this won’t be the year it breaks out and takes over. Most vendors are still using traditional relational databases … object databases are still a stretch.

Augmented analytics

The definition of augmented analytics is a system that can learn from human feedback and provide better insights and/or recommendations over time. While we do have good machine learning technology that can learn from human interaction and optimize (work)flows, when it comes to analytics, good insights come from identifying the right data to present to the user and, in particular, data that extends beyond organizational data, such as current market rates, supplier risk data, product performance data, etc.

Until we have analytics platforms that are tightly integrated with the right market and external data, and machine learning that learns not just from user workflows on internal data, but external data and human decisions based on that external data, we’re not going to have much in the way of useful augmented analytics in spend analysis platforms. The few exceptions in the next few years will be those analytics vendors that live inside consultancies that do category management, GPO sourcing, and similar services that collect meaningful market data on categories and savings percentages to help customers do relevant opportunity analysis.

Predictive analytics

As with natural language processing, predictive analytics will continue to be the same-old same-old predictions based on traditional trend analysis. There won’t be much ground-breaking here as only the vendors that are working on neural networks, deep learning, and other AI technologies will make any advancements — but the majority of these vendors are not (spend) analytics vendors.

Data automation

RPA is picking up, but like in-memory processing and semantic technology, it’s not going to all-of-a-sudden become mainstream, especially in analytics. Especially since it’s not just automating input and out-of-the-box reports that is useful, but automating processes that provide insight. And, as per our discussion of augmented analytics, insight requires external data integrated with internal data in meaningful trends.

No-code analytics

Cue the Woody Woodpecker laugh track, please! Because true analytics is anything but low-code. It’s lots and lots and lots of code. Hundreds and thousands and hundreds of thousands of lines of code. Maybe the UI makes it easy to build reports and extract insights with point-and-click and drag-and-drop and allows an average user to do it without scripting, but the analytics provider will be writing even more code than you know to make that happen.

Come back tomorrow as we tackle the next ten.

A Single Version of Truth!

Today’s guest post is from the spend master himself, Eric Strovink of Spendata.

An oft-repeated benefit of data warehouses in general, and spend analysis systems specifically, is the promise of “a single version of truth.” The argument goes like this: in order to take action on any savings initiative, company stakeholders must first agree on the structure and organization of the data. Then and only then can real progress be made.

The problem, of course, is that truth is slippery when it comes to spend data. What, for example, is “tail spend”? Even pundits can’t agree. Should IT labor be mapped to Professional Services, HR, or Technology? For that matter, what should a Commodity structure look like in the first place? Can anyone agree on a Cost Center hierarchy, when there are different versions of the org chart due to acquisitions, dotted-line responsibilities, and other (necessary) inconsistencies?

What tends to happen is that the “single version of truth” ends up being driven by a set of committee decisions, resulting in generic spending data that is much less useful than it could be. Spend analysts uncover opportunities by creating new data relationships to drive insights, not by running displays or reports against static data. So, when the time comes to propose savings initiatives, the very system that’s supposed to support decision-making is less useful than it should be; or worst-case, not useful at all.

Questions and Answers: Metadata

Do we have preferred vendors? Do buyers and stakeholders agree on which vendors are preferred? Which vendors are “untouchable” because of long-term contracts or other entanglements? For that matter, with which vendors do we actually have contracts, and what do we mean by “contract”? Are there policies (or policy gaps) that work against a particular savings initiative, such as a lack of centralized control over laser printer procurement, or the absence of a policy on buying service contracts? Can we identify and annotate opportunities and non-opportunities, by vendor or by Commodity?

The answers to these (and many other) questions produce “metadata” that needs to be combined with spend data in order to inform the next steps in a savings program. The nature of this metadata is that it’s almost certainly inaccurate when first entered. We’ll need to modify it, pretty much continually, as we learn more; for example, finding out that although John may have dealt with Vendor X and has correctly indicated that he’s dealt with them, it’s actually Carol who owns the relationship. We may also determine that the Commodity mapping isn’t helpful; network wiring, for example, might need to belong with IT, not Facilities.

Alternative Truths

As we add more and more metadata to the system — information that is critical to driving a savings program — we encounter the need to refine and reorganize data to reflect new insights and new information. Data organization is often quite purpose-specific, so multiple different versions of the data must be able to be spawned quickly and coexist without issues. This requires an agile system with completely different characteristics than a centralized system with an inflexible structure and a large audience. In essence, one must learn to become comfortable with alternative truths, because they are essential to the analysis process.

So what happens to the centralized spend analysis system, proudly trotted out to multiple users, with its “single version of truth”? Well, it chugs along in the background, making its committee members happy. Meanwhile, the real work of spend analysis must be (and is) done elsewhere.

Thanks, Eric!

The Key Reason Spend Analyses Fail (that Often Goes Overlooked)


Today we welcome another guest post from Brian Seipel, a Procurement Consultant at Source One Management Services focused on helping corporations understand their spend profile and develop actionable strategies for cost reduction and supplier relationship management. Brian has a lot of real-world project experience in sourcing and brings some unique insight on the topic.

Organizations that develop an understanding of their spend have an edge when it comes to strategic sourcing: they better understand where money is being spent, with whom, and on what than others who enter into the process either blindly or as a knee-jerk reaction to an incumbent price hike. This is particularly important for tail spend in those spend categories on the indirect side that too often fly under the radar.

That edge isn’t a given, however. Building a spend analysis can serve as the foundation for strong opportunity assessments, but doing so won’t automatically lead to better sourcing projects. Organizations that spend time on spend analyses can and do still fail at strategic sourcing for one very big reason: we put too much faith in the front-end process of building the analysis and forsake the back-end, leaving a critical gap in our understanding of our spend profile.

The Front-End Spend Analysis

The first steps of a spend analysis are akin to cleaning out your basement. What’s the first thing you do? Before sorting into keep-or-toss piles can begin, even before moving and opening boxes, we need to turn on the light and survey the room. “Turning on the light” is really what the front-end of a spend analysis is. Our goal is to shine a light on the spend we have so sourcing project identification can begin. How does a spend analysis accomplish this?


  • Cleansing & Consolidation. Take all of the disparate data sources that make up our spend profile and create a single view of them, cleaning up supplier names and other critical fields along the way. For example, referring to the supplier “Dun and Bradstreet” by that single name, even when spend from a second data set refers to “D&B” (see the sketch after this list).
  • Classification. With all spend in one consolidated set, we will now attach meaningful classifications. The best way to do this is worthy of a discussion of its own, so let’s simply say care should be taken here. Choose a system that speaks to your organization’s processes, products, and objectives.
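To make both bullets a little more tangible, here is a minimal pandas sketch of supplier-name normalization and rule-based classification. The alias table, categories, and amounts are all invented for illustration.

```python
import pandas as pd

# Hypothetical raw AP extract with inconsistent supplier names.
raw = pd.DataFrame({
    "supplier": ["Dun and Bradstreet", "D&B", "DUN & BRADSTREET", "Acme Office Supply"],
    "amount":   [5200, 1800, 3100, 950],
})

# Cleansing & consolidation: map known aliases to a single normalized name.
aliases = {"d&b": "Dun and Bradstreet", "dun & bradstreet": "Dun and Bradstreet"}
raw["supplier_clean"] = (raw["supplier"].str.strip().str.lower()
                         .map(aliases)
                         .fillna(raw["supplier"]))

# Classification: attach a category meaningful to the organization.
categories = {"Dun and Bradstreet": "Professional Services > Credit & Risk Data",
              "Acme Office Supply": "Indirect > Office Supplies"}
raw["category"] = raw["supplier_clean"].map(categories)

print(raw.groupby(["category", "supplier_clean"])["amount"].sum())
```

In practice the alias and category tables are far larger and maintained over time, which is exactly why this step deserves its own discussion.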

Let’s cook up an example. Let’s say we want to look into our IT spend to see where we can cut costs. We conduct a spend analysis covering the points above and learn the following: We have four locations using four different managed IT service providers offering similar services at four different price points.

This is the type of intel that suggests a strategic sourcing initiative may be called for. Pitting these suppliers against each other in a market event will drive down costs and potentially streamline operations if we can establish a single supplier for all four locations. We can estimate these savings by building a baseline spend profile and comparing it to the average savings we’ve historically achieved by following this strategy within this category. Simple enough. So why do sourcing initiatives often fail to deliver?
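For instance, with entirely made-up figures, the estimate is just the baseline spend multiplied by the average savings rate historically achieved with this strategy in the category:

```python
# Hypothetical annual spend at the four locations and an assumed historical
# average savings rate for competitive events in this category.
location_spend = {"Site A": 240_000, "Site B": 180_000, "Site C": 150_000, "Site D": 310_000}
avg_savings_rate = 0.12   # 12%, illustrative only

baseline = sum(location_spend.values())
estimated_savings = baseline * avg_savings_rate

print(f"Baseline spend:    ${baseline:,.0f}")
print(f"Estimated savings: ${estimated_savings:,.0f}")
```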

Moving Into Opportunity Assessment

Because we just committed a big mistake: we took our initial view of the spend and jumped right to goal setting without taking the time to properly scope. We went from turning on the basement light to selling boxes, en masse and unopened, directly on eBay without knowing what was inside.

As we go to market, our sourcing event fails each of our four locations for different reasons:


  • The first location is locked into a multi-year contract with a painful termination clause. Without scoping, we didn’t know what our contractual obligations looked like.
  • The second location isn’t locked into a contract, but is locked in by a lack of competition in the market. Without scoping, we never looked beyond our own buying history into the market landscape.
  • The third location is free of both of these problems, but this isn’t their first rodeo. They previously used the providers that locations one and two use, but abandoned them due to severe performance issues. Without scoping, we can’t get a good enough view into the decision making process that led to the incumbent relationships.
  • Finally, our fourth location. No issues with suppliers, contracts, or market competition. The problem here? When we dig into the spend, we realize the bulk was capex: the purchase of equipment for a new server room buildout. Now that the equipment is purchased, we won’t see this spend come back around for years to come. Without scoping, we assumed spend was annually recurring, and now we have next to nothing.

Better Spend Analysis through Better Scoping

Once our spend analysis is complete, we’ll need to bring additional stakeholders into the fold. Bring in the employees who actually interact with these suppliers and their products and work with them to develop a sourcing history:


  • Did we accurately describe how you use this supplier with our chosen classification system?
  • What are we specifically buying from this supplier, and are these purchases made regularly or only once every few years?
  • How was this supplier selected, and who chose them? Were any competitors engaged at the same time? How did this incumbent beat them out?
  • What does this supplier do well? Where are their biggest points of failure?
  • Has this category been sourced recently? How was the event conducted, and what was the result?

Beyond this interview, ask these stakeholders to provide copies of any active MSAs, SOWs, SLAs, or any other document that can help define the relationship. Of particular note will be termination clauses. What date does the agreement end, and what are the renewal terms? What steps do we follow to terminate on that date, and by when do they need to be taken? If terminating before that date, are there any penalties?

From Insight to Action

Building a detailed spend analysis takes time, and the commitment of resources that could be doing other things. As such, you need to ensure you get a good ROI out of the exercise.

The best way to do that is to see beyond the front-end of what a spend analysis is (the unification, cleansing, and classification of spend data) and consider what a spend analysis helps Procurement do (identify strategic sourcing initiatives and estimate potential impact). Scoping is a critical part of this process, and properly scoping opportunities that a spend analysis shines a light on is a great way to get that ROI.

Thanks, Brian!