
Procurement should NOT be reimagined!

It’s not just vendors that have latched onto the Marketing Madness we addressed in last week’s article, where we tried to help you decipher ten meaningless phrases polluting the Procurement technology landscape, but consultants and thought leaders as well. And while the marketing madmen fill us with meaningless messaging, these consultants are feeding us dangerous delusions that we can solve our problems by simply redefining Procurement as something it is not.

Procurement is not something to be reimagined, because it is not something that should even be redefined at its core. The purpose of Procurement has not changed since the first known Purchasing manual, The Handling of Railway Supplies: Their Purchase and Disposition, was published back in 1887, nor should it change. It’s the process of sourcing, acquiring, and paying for the goods and services the organization needs, and doing it in a manner that ensures the products meet the need, at the best price, and show up at the right time, with as many orders as possible being “perfect” (or, more precisely, problem free).

Key aspects are thus:

  • Supplier Discovery and Vetting (Risk and Compliance)
  • RFP Creation or Auction (Product/Service Verification and Competitive Pricing)
  • Award and Contract (Negotiation and Terms and Conditions)
  • Catalogs, Purchase Orders, Pre-Scheduled Deliveries, Auto-Reorders (“Buying”)
  • Logistics Routing, Delivery Scheduling and Monitoring (Risk Management)
  • Invoice Processing and Payment (Payment Confirmation, Fraud Prevention)
  • Quality Assurance and Inventory Management (Loss Minimization)

There is nothing to imagine here. And definitely NOTHING to re-imagine here. Now, when supply assurance is still near an all-time low (due to geopolitical instability, rampant inflation, unpredictable demand, etc.), it’s time to double down on what is critical and get it right, not wander off to Imaginationland searching for a magical solution to tough, real-world problems.

New and improved processes might increase the chance of success (by decreasing the odds that something is missed), and new technologies might increase the level of automation (and decrease the amount of manual [e-]paper pushing), but neither fundamentally changes the work that must be done, the effort that must be made, or the human intelligence (HI) that must be applied to get the job done. No amount of “re-imagining” will change this. As we’ve said before, and will probably have to say again and again and again, there is no big red easy button, and no amount of imagining (or re-imagining) will create one. So, if someone tells you to re-imagine procurement, you tell them the same thing you should tell them if they spew Marketing Madness: CUT THE CR@P!

Is Procurement Complexity at an All-Time High?

A couple of months ago, CIPS and RS released the 2024 Indirect Procurement Report ‘Maintaining Focus’, which focussed on the state of the sector for those responsible for supplies supporting maintenance, repair, & operations (MRO). The survey, which drew a record number of responses (including a larger number of younger respondents than in prior years), provided, like many others, a lot of data points and statistics, but unlike other surveys, it was pretty surprising.

Why?

Usually, when a report asks about business pressures, challenges, top areas of focus, etc., there are two or three responses that the majority of respondents agree on. But in this survey, agreement was scattered:

  • the top business pressure: 32% agreement
  • the top challenge: 37%
  • the top day-to-day challenge: 33%
  • the top activity to drive efficiency: 27%
  • the top strategy: 26%
  • the largest challenge to delivering ESG: 39%
  • the biggest driver of downtime: 19%
  • the top indicator for supplier performance management: 44% (which is as close as we get to 50%)
  • the top reason to adopt new tech: 32%
  • the top benefit of a digital procurement service: 29%

And so on. It would have been really useful if CIPS had done a study as to why (especially when we are so used to Deloitte, McKinsey, and Accenture studies with so much agreement), but they didn’t. So we have to hypothesize.

And the hypothesis that the doctor is coming to is that complexity is at an all-time high and, because of this, most procurement professionals, especially newer professionals, just don’t know where to focus. There are too many challenges, too many demands, too many conflicting goals and pressures within the organization, and too many possibilities to address them, and with all the meaningless marketing mayhem and Gen-AI garbage, there’s no real guidance out there.

All in all, except for the most die-hard seasoned professionals who remember the last time Procurement was this challenging (which was decades ago, since the 2000s and 2010s saw the constant introduction of newer, greater technology; steady, stable globalization; affordable (if not cheap) logistics; lots of sustainability talk, but no real regulations (beyond RoHS, WEEE, and their ilk); etc.), most Procurement professionals have never had so many challenges, demands, regulations, and technology options to deal with.

And if the doctor’s right, then what is the solution? (He’ll tell you one thing: it’s not intake to orchestrate, but that’s a different rant [but see point 11 of the market madness].) It’s a very good question, and, right now, even the doctor doesn’t have the complete answer. But while technology will obviously be part of the answer, the full answer will require clarity and Human Intelligence (HI). So get ready to wake up and use your brain. There’s no big red easy button for the mess McKinsey and their ilk have (helped to) put us in (with excessive outsourcing and an utter lack of clarity on tech)!

Demystifying the Marketing Madness for you!

The marketing madness is returning, the incomprehensibility is increasing, and the terminology is almost terrifying, so here’s the simplest easy-peasy guide the doctor can make to interpreting what the messaging is actually saying, if it’s saying anything at all!

AI-enabled/AI-backed/AI-enhanced/AI-driven: We don’t actually have any capabilities that you won’t find in one to three dozen of our peers, but since they’ve all jumped on the “AI” bandwagon, we will too and use the exact same meaningless messaging. (Remember, there are NO valid uses for Gen-AI in Procurement, and most valid uses for “AI” are constrained to specific use cases; the rest of the time it’s just rules-based RPA/Automation.)

Autonomous Sourcing: If you configure enough rules, or, even worse, turn on our Gen-AI auto-negotiator, the platform, given a demand, will auto-configure and run a sourcing event to the point where it selects a supplier and sends out an award notification, with little to no guarantee it’s what you wanted (if you turned on Gen-AI).
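To make the “configure enough rules” part concrete, here is a minimal, purely illustrative sketch of what a rules-driven auto-award boils down to; the Bid structure, rule names, and thresholds are assumptions for the example, not any vendor’s actual configuration or API:

    from dataclasses import dataclass

    @dataclass
    class Bid:
        supplier: str
        price: float
        lead_time_days: int
        approved_vendor: bool

    # Hypothetical rules an admin might configure; real platforms bury dozens
    # of these behind a UI, but the underlying logic is the same.
    MAX_LEAD_TIME_DAYS = 14
    REQUIRE_APPROVED_VENDOR = True

    def auto_award(bids: list[Bid]) -> Bid | None:
        """Filter bids by the configured rules, then award the lowest price."""
        eligible = [
            b for b in bids
            if b.lead_time_days <= MAX_LEAD_TIME_DAYS
            and (b.approved_vendor or not REQUIRE_APPROVED_VENDOR)
        ]
        return min(eligible, key=lambda b: b.price) if eligible else None

    bids = [
        Bid("Acme", 980.0, 10, True),
        Bid("Globex", 950.0, 21, True),    # fails the lead-time rule
        Bid("Initech", 940.0, 7, False),   # fails the approved-vendor rule
    ]
    winner = auto_award(bids)
    print(winner.supplier if winner else "no eligible bid, route to a human")  # Acme

No imagination required: it’s rules applied in a fixed order, nothing more.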

Delightful Procurement: terribly sorry, but even the doctor can’t translate this one!

Intake-to-Procure: Takes a request in, but doesn’t do anything with it … unless you have a Procurement system it can automate or punch into. (As the doctor has said, intake on its own is Pay-Per-View on your data, and something that SHOULD be included in every proper Procurement solution because you should not have to pay another third party to see YOUR data!)

Margin Multiplier: Our ROI isn’t much better than other best-in-class solutions appropriately applied (the difference between the savings achievable from an average Strategic Procurement/Source-to-Pay platform and a Best-in-Class Strategic Procurement/Source-to-Pay platform appropriately applied is typically less than 2% [unless one platform includes appropriate SSDO and the other doesn’t] … i.e. you might get 12% savings instead of 10%), but since it’s best-in-class, you might be able to multiply your margin if all the math works out (3% to 6% instead of 3% to 5.8%), and Margin Multiplier just sounds so much cooler!
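To put rough numbers on how small that difference is, here is a purely illustrative calculation; the revenue, baseline margin, and addressable spend figures are assumptions chosen for the example, not benchmarks:

    # Illustrative assumptions: 100M revenue, 3% baseline margin, and 25M of
    # addressable spend actually run through the platform.
    revenue = 100_000_000
    baseline_profit = revenue * 0.03
    addressable_spend = 25_000_000

    for label, savings_rate in [("average platform", 0.10), ("best-in-class platform", 0.12)]:
        savings = addressable_spend * savings_rate
        new_margin = (baseline_profit + savings) / revenue
        print(f"{label}: {savings_rate:.0%} savings -> {new_margin:.2%} margin")

    # average platform: 10% savings -> 5.50% margin
    # best-in-class platform: 12% savings -> 6.00% margin

Under these assumptions both platforms roughly double the margin; the gap between the average and the best-in-class platform is about half a point.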

Orchestration: Cloud-based middleware that allows you to connect platforms using their APIs through a UX and build data-based workflows that pull data from one platform and push it to another while controlling a multi-application process. Unless it supports integration beyond source-to-pay applications, it’s likely not that useful, as it just ADDS to solution sprawl when you can direct connect the S2P applications yourself using their APIs and rules-based automation to push and pull data (as they all work on essentially the same data).
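For a sense of what that direct connection amounts to, here is a minimal sketch of a rules-based pull/push between two S2P applications; the endpoints, field names, and token handling are hypothetical placeholders, not any vendor’s actual API:

    import requests

    # Hypothetical endpoints; substitute your S2P vendors' actual APIs.
    SOURCING_AWARDS_API = "https://sourcing.example.com/api/v1/awards"
    P2P_PURCHASE_ORDER_API = "https://p2p.example.com/api/v1/purchase-orders"

    def sync_awards_to_purchase_orders(api_token: str) -> None:
        """Pull new awards from the sourcing platform and push them to the
        procure-to-pay platform as draft purchase orders -- the same pull/push
        an orchestration layer would sit in the middle of."""
        headers = {"Authorization": f"Bearer {api_token}"}

        awards = requests.get(
            SOURCING_AWARDS_API, headers=headers, params={"status": "new"}, timeout=30
        ).json()

        for award in awards:
            # Rules-based field mapping: both sides work on essentially the same data.
            purchase_order = {
                "supplier_id": award["supplier_id"],
                "line_items": award["line_items"],
                "currency": award["currency"],
                "status": "draft",
            }
            requests.post(
                P2P_PURCHASE_ORDER_API, json=purchase_order, headers=headers, timeout=30
            ).raise_for_status()

A scheduled job running a handful of functions like this covers much of what an orchestration layer would otherwise be selling back to you.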

Smart Procurement: Procurement powered by rules-based workflows, but smart just sounds cool, eh?

Spend Orchestration: We don’t do anything different than all the other orchestration providers, but it sure sounds cool!

Sustainable Procurement: Generally speaking, this simply means you can see supplier / product sustainability (carbon, etc.) data when sourcing, but we don’t actually help you identify more sustainable suppliers or, more importantly, figure out how to work with your suppliers to decrease their carbon footprint, raw material utilization, fresh water footprint, etc.

Supplier Insights: An extensible, centralized supplier information/relationship management platform that can be augmented with ALL related supplier finance, product, location, compliance, risk, ESG, and other relevant data. A capability offered by a few dozen platforms, which means this platform isn’t that special.

In short, all of this new marketing gibberish is essentially complete bullcr@p, and I have to echo the desire of Sarah Scudder and Dr. Elouise Epstein for Procurement solution providers to tell us what their solutions actually do and, in the doctor’s words, CUT THE CR@P!

The Gen AI Fallacy

For going on 7 (seven) decades, AI cult members have been telling us that if they just had more computing power, they’d solve the problem of AI. For going on 7 (seven) decades, they haven’t.

They won’t as long as we don’t fundamentally understand intelligence, the brain, or what is needed to make a computer brain.

Computing will continue to get exponentially more powerful, but it’s not just a matter of more powerful computing. The first AI program had a single core to run on. Today’s AI programs have 10,000-core super clusters. The first AI programmer had only his salary and elbow grease to code and train the model. Today’s AI companies have hundreds of employees and Billions in funding, and have spent 200M to train a single model … which, upon release to the public, told us we should all eat one rock per day. (Which shouldn’t be unexpected, as the number of cores we have today powering a single model is still less than the number of neurons in a pond snail.)

Similarly, the “models” will get “better”, relatively speaking (just like deep neural nets got better over time), but if they are not 100% reliable, they can never be used in critical applications, especially when you can’t even reliably predict confidence. (Or, even worse, you can’t even have confidence the result won’t be 100% fabrication.)

When the focus was narrow machine learning and focussed applications, and we accepted the limitations we had, progress was slow, but it was there, it was steady, and the capabilities and solutions improved yearly.

Now the average “enterprise” solution is decreasing in quality and application, which is going to erase decades of building trust in the cloud and reliable AI.

And that’s the fallacy. Adding more cores and more data just accelerates the capacity for error, not improvement.

Even a smart Google Engineer said so. (Source)

Challenging the Data Foundation ROI Paradigm

Creactives SpA recently published a great article, Challenging the ROI Paradigm: Is Calculating ROI on Data Foundation a Valid Measure, which was made even greater by the fact that they are technically a Data Foundation company!

In a nutshell, Creactives is claiming that trying to calculate direct ROI on investments in data quality itself as a standalone business case is absurd. And they are totally right. As they say, the ROI should be calculated based on the total investment in data foundation and the analytics it powers.

The explanation they give cuts straight to the point.

It is as if we demand an ROI from the construction of an industrial shed that ensures the protection of business production but is obviously not directly income-generating. ROI should be calculated based on the total investment, that is, the production machines and the shed.

In other words, there’s no ROI on Clean Data or on Analytics on their own.

And they are entirely correct — and this is true whether you are providing a data foundation for spend analysis, supplier discovery and management, or compliance. If you are not actually doing something with that data that benefits from better data and better foundations, then the ROI of the data foundation is ZERO.

Creactives is helping to bring to light three fallacies that the doctor sees all the time in this space. (This is very brave of them, considering that they are the first data foundation company to admit that their value is zero unless embedded in a process that will require other solutions.)

Fallacy #1. A data cleansing/enrichment solution on its own delivers ROI.

Fallacy #2. You need totally cleansed data before you can deploy a solution.

Fallacy #3. Conversely, you can get ROI from an analytics solution on whatever data you have.

And all of these are, as stated, false!

ROI is generated from analytics on cleansed and enriched data. And that holds true regardless of the type of analytics being performed (spend, process, compliance, risk, discovery, etc.).

And that’s okay, because this is a situation where the ROI from both is often exponential, and considerably more than the sum of its parts. Especially since analytics on bad data sometimes delivers a negative return! What the analytics companies don’t tell you is that the quality of the result is fully dependent on the quality, and completeness, of the input. Garbage in, garbage out. (Unless, of course, you are using AI, in which case, especially if Gen-AI is any part of that equation, it’s garbage in, hazardous waste out.)

So compute the return on both. (And it’s easy to partition the ROI by investment. If the data foundation is 60% of the investment, it is responsible for 60% of the return, and its share of the overall ROI is simply 0.6 × Return/Investment.)
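As a worked example of that attribution, with figures made up purely for illustration:

    # Illustrative figures only.
    data_foundation_cost = 600_000
    analytics_cost = 400_000
    total_investment = data_foundation_cost + analytics_cost           # 1,000,000
    total_return = 3_000_000                                           # from analytics on clean data

    overall_roi = total_return / total_investment                      # 3.0x

    # Partition the return by each component's share of the investment.
    foundation_share = data_foundation_cost / total_investment         # 0.6
    foundation_return = foundation_share * total_return                # 1,800,000
    foundation_roi_contribution = foundation_return / total_investment # 0.6 x (Return/Investment) = 1.8x

    print(overall_roi, foundation_roi_contribution)                    # 3.0 1.8

The two attributed shares (1.8x for the data foundation and 1.2x for the analytics) add back up to the overall 3.0x, which is the whole point: neither piece earns its share without the other.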

Then find additional analytics-based applications that you can run on the clean data, increase the ROI exponentially (while decreasing the data foundation’s share of the overall cost equation), and watch the value of the total solution package soar!