Category Archives: rants

Why Are You Still Buying That Fancy New Piece of Software That

  • Could Get You Sued?
  • Increases The Chance You Will Be Hacked!
  • Could Result in a $100 Million Processing Error?
  • Could Shut Down Your Organization’s Systems for Days!
  • Helps Your Employees Commit Fraud?

If someone told you this when evaluating a piece of software, and asked if you wanted to buy it, I’m sure the vast majority of you would say HELL NO!

In which case I want you to please tell me: why are you all still riding the AI Hype Train, buying and using Gen-AI everywhere?

It has already resulted in lawsuits and losses!
The Air Canada lawsuit over the Gen-AI chatbot is just one notable well publicized example.

AI systems are AI-coded, and AI code carries a much greater security risk
because it is generated from training repositories that contain large amounts of untested, unverified, and high-risk code — producing code so full of security holes it's a hacker's dream! (See this great piece in the ACM on The Drunken Plagiarists.)

AI systems negotiate on the data they have
and with a single decimal point error, you could be paying 10X what you need to. Not to mention, they don't always translate right. Remember, the experimental AI DOGE used claimed 8 Billion in savings on an 8 Million contract!

Bad data generated by an AI system and fed into a legacy system with poor data validity checks can shut it down.
Plus, Gen-AI can also push out bad updates faster than any human can, and you could easily have your own CrowdStrike situation!

Now it’s being used by employees to generate fake receipts
that look so real that, if the employee does a few seconds of research (to get the restaurant info, current menu prices, tax code, etc.), you can't distinguish the generated image from the real thing. And, before you say "Ramp solves this", well, it only does if the employee is lazy (which, let's face it, is human nature, so you'll catch about 90% of it). But what happens when a user strips the metadata, which, FYI, can be as easy as taking a picture of the picture … oops! (And if you're a hacker, running it through a metadata stripper/replacement routine is even easier, as you're just hotkeying a background task.)

AI is good. Gen-AI has its [limited] uses. But unrestricted and unhinged mass adoption of untested, unverified AI for inappropriate uses is bad. So why do you keep doing it?

Especially since it’s now proven it’s worse for you than some illegal drugs! (Source)

Features ARE NOT Applications; But Applications Require Features!

THE PROPHET recently asked What Procurement Tech Product Categories Were Really Just Features All Along? It's a great question, except he cheated.

He cheated with the first 5!

  • Supplier performance management
  • Supplier quality management
  • Supplier information management / supplier master data management
  • Supplier diversity
  • Supplier risk management (not supply chain risk!)

We’ve known for years it should be one Supplier 360 solution! (Even though no one offers that when you consider all of the elements that should be there. Heck, none of them even offer the 10 basic CORNED QUIP requirements … in fact, good luck finding a solution that offers 5 of those requirements among the 100+ supplier management solutions).

He cheated again with the next 3!

  • Should cost / cost modeling (for procurement, not design engineers)
  • RFX and reverse auctions (when not bundled with broader capabilities or services)
  • Sourcing optimization

We’ve also known for years it should be cost-model and optimization backed sourcing (auction, RFX, hybrid, single source negotiation, etc.) … otherwise, it’s an incomplete solution. But only a fraction of the 80+ sourcing platforms offer true optimization (less than 10) and fewer still do extensive cost modelling. (Note that we are focussed on modelling, not cost estimation — that requires data, and that can, and probably should, be a third party data feed.)

And he was wrong on the last front.

Real Spend Analytics should be standalone. Wrapping restricts it! The modules you use should provide all the specific views you need, but the reason that spend analysis quickly becomes shelfware in most organizations today is the same reason it became shelfware 20 years ago … once you exhaust the limits of the interface it's wrapped in, it becomes useless. Go back to the series Eric and I wrote 18 years ago (which you can, since Sourcing Innovation didn't delete everything more than a decade old when it had to change servers in 2024, unlike Spend Matters when it did its site upgrade in 2023).

But Very, Very right in that features are not applications!

And very, very right in that too many start-ups are launching today as features (which will only survive if acquired and rolled up into existing applications and platforms), and not solutions. While apps dominate the consumer world, in business there is not always an app for that, and, frankly, there shouldn’t be. This focus on point-based apps is ridiculous. It’s not features, it’s functions. It’s not apps, it’s platforms. It’s not orchestration (and definitely not spend orchestration), it’s ecosystems!

Recent stats, such as those published by Spendesk, put the average number of apps a business uses at 371, with an average of 253 for SMBs and 473 for enterprise firms. WHAT. THE. F6CK? This is insane. How many departments does an average organization have? Less than 10. How many key functional areas? Less than 12. Often less than 10! How many core tasks in each function? Usually less than 6. That says, in the worst case, an enterprise might have 72 distinct critical tasks which might need their own application (but probably not). This says that SMBs have at least 3 times the apps they should have, mid-size organizations at least 5 times, and enterprises over 6 times. That is insane! No wonder there are so many carbon copy SaaS optimizers (as we covered in our piece on sacred cows), because if you have that many SaaS apps, you have features, not applications. And you need to replace sets of these with functional applications that solve your core problems.
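The back-of-the-envelope math above can be sketched in a few lines (the department, function, and task counts are this post's own worst-case assumptions, and the app counts are the Spendesk averages cited above):

```python
# Worst-case estimate of distinct critical tasks that could each justify an app,
# using the post's assumptions: at most 12 key functional areas, at most 6 core
# tasks per function.
functional_areas = 12
tasks_per_area = 6
justifiable_apps = functional_areas * tasks_per_area  # 72

# Reported averages from the Spendesk stats cited above.
reported = {"SMB": 253, "overall average": 371, "enterprise": 473}

for segment, count in reported.items():
    ratio = count / justifiable_apps
    print(f"{segment}: {count} apps = {ratio:.1f}x the worst-case {justifiable_apps}")
```

Even against the most generous worst case, every segment is carrying multiples of the applications it could plausibly justify.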

(And if you want to know how to prevent app sprawl, before buying yet-another-app, ask yourself “is this supporting a function that should be done on its own, or just a task that should be part of an existing function” … if the latter, it’s a feature, not an application, and if the application it should be part of does not have an upgrade/module that supports the task, then you have the wrong application and it’s time to replace it, not pointlessly extend the ecosystem!)

The Best Way to Survive the AI-Powered Apocalypse? Go Old School!

If you’ve been following along, you know that a great purge is coming on two fronts. All the pundits agree on that! On the first front, a large number of vendors are going bye bye, as we’ve been telling you since our first post on the Marketplace Madness. On the second front, they took ‘er jobs. Except it’s not they, it’s AI.

So doesn’t this mean that if you want to survive the days ahead, you should find the most advanced AI provider that isn’t going to get purged in the near future, adopt the tech, replace as much staff as you can with AI, find a way to survive the hardship, and come out ahead when everyone else decides that’s what they have to do?

Well, for the vast majority of the analysts and pundits, it is exactly what you should do — and do it right now. It’s AI overload all the time. And just when most hype cycles start to die down, this one gets a second wind of hurricane proportions.

But, in fact, it’s the last thing you should do. Instead, you should implement a Gen-AI ban and Agentic AI ban immediately, identify classic ML-powered, AI-augmented intelligence tech that can supercharge your team, acquire it, and train your team on it immediately. Because you can get the same results as any Agentic AI if you employ the right classic ML-powered, human-driven AI technology with the right algorithms, analytics, optimization, etc. Sure, a human might be a little slower than an algorithm that can work 24/7/365 without a break, but a human who is appropriately skilled and trained will make up for this with something the AI doesn’t have: true intelligence.

You see, the thing about Gen-AI and Agentic AI is that it works great until it doesn’t. As per our recent post, Gen-AI is full of problems; we noted that Gen-AI can:

  • get you sued
  • increase the chance you will be hacked
  • result in Million/Billion-Plus processing errors
  • shut down your organization’s systems for days
  • help your employees commit fraud

And those are the good side effects from its hallucinations. There are much worse side effects that can happen. If you refer back to our posts on the valid uses for Gen AI and the valid uses for Gen AI in Procurement, you'll recall that:

  • the embedded biases, that you might not even be aware of, could result in decisions diametrically opposed to what you are expecting
  • when it computes two options that are equally likely to generate the same end result for the company relative to the KPI it is using, there’s no guarantee it will select the right option — and there’s always a right option, especially if one option for cost savings is a longer term contract so the supplier can upgrade equipment and the other option is forcing the supplier to cut an already razor thin margin 50%
  • the hallucinations eventually become real, as the systems get so advanced that they not only create super realistic evidence to back up their recommendations, but take over your entire system in the background, so that you don’t know that a web request to verify a claim is actually still being processed by the AI now running in the background
  • it starts negotiations and cutting contracts you haven’t even authorized yet
  • it becomes you … and you get blamed for all its mistakes

In other words, ignore the Gen-AI and Agentic-AI technologies that are not the miracle cures they are promised to be. The miracle cures are the last-generation ML-based AI technologies that were just about to transform your operations under the expert fingers of your leading practitioners, not some probabilistic monstrosity that requires an entire data center to run to generate an output no one can verify using a system no one understands. Hone your chops on those and you’ll get the results you need, without having to deal with unexpected, possibly catastrophic, failures along the way.

After all, when we told you about all of the great advancements that were coming in Source To Pay in our classic series (indexed here), none of it required Gen-AI to achieve!

Your Upteenth Reminder That Every Dollar Saved By Procurement Goes Straight to the Bottom Line!

… while 10 cents from every additional sale might make it, if you’re lucky!

A week or so ago, in a comment on yet another post about the tariff crisis (to which, as I keep saying, the only solution is BTCHaaS), Joël Collin-Demers said COVID was the instigating event that pushed Procurement front and center, when it was really the (first) elevating event in over a decade.

The first event that really put ProcureTech on the map was the 2008 financial crisis. This is because companies had to stop the bleeding, fast, and charged Procurement to get ‘er done. But once the markets settled, and the provider base stabilized, and companies willing to spend the money they needed to implement proper tech and get more efficient did so, Procurement kind of faded into the background again. That’s because, when markets rise, and sales rise, the C-Suite focusses entirely on revenue, almost to the point of irrationality, because the faster that revenue rises, the higher the valuation, and the more money they can make on the markets and trades.

However, the 2008 financial crisis is why the M&A and PE activity started to ramp up in ProcureTech in the early teens, given the importance placed on cost cutting in its wake. And it's why, had something else happened sooner, Procurement would have risen up the organizational chart faster, instead of falling back into obscurity at the many organizations that returned undue focus to Sales and Marketing.

This, of course, belies the sad, sorry, state of affairs of North American business that still sees marketing and sales as the key to growth in a shrinking economy (and yes, with birth rates declining in almost all first world countries, it is a shrinking economy) when the real key is cost management. Remember your business 101 equation: Profit = Revenue – Expenses.

This says that every dollar of revenue you add is eaten up by the total cost to acquire that dollar — the total cost of that good or service, which is usually at least 90 cents of that dollar.

However, every dollar of expense you cut is gone in its entirety. Every dollar saved goes straight to the bottom line.
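The Business 101 arithmetic above can be made concrete; the 10% net margin here is an illustrative assumption, matching the "at least 90 cents" cost of goods figure quoted above:

```python
# Profit = Revenue - Expenses. At an assumed ~10% net margin, each new sales
# dollar nets only ~10 cents of profit, while each dollar of expense cut is
# worth a full dollar of profit.
margin = 0.10  # assumed net margin on incremental sales

profit_from_one_sales_dollar = 1.00 * margin  # ~10 cents makes it
profit_from_one_dollar_saved = 1.00           # the full dollar hits the bottom line

# New revenue required to equal one dollar of Procurement savings:
equivalent_revenue = 1.00 / margin

print(f"$1 of new sales -> ${profit_from_one_sales_dollar:.2f} profit")
print(f"$1 of savings   -> ${profit_from_one_dollar_saved:.2f} profit")
print(f"Revenue needed to match $1 saved: ${equivalent_revenue:.2f}")
```

At that margin, it takes ten dollars of new sales to match the bottom-line impact of a single dollar saved.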

Thus, Procurement is 10 times as valuable as sales! And yet, the marketing madmen will try to hide that from you to protect their multi-million dollar budgets!

So if you want to survive the crisis of the day, whatever that crisis may be, it’s not sales, it’s not marketing, it’s not finance, it’s not executive leadership or vision, it’s Procurement. Plain and simple. Maximize every dollar spent while eliminating those that don’t need to be.

Unless, of course, you are a ProcureTech vendor, in which case, as per a previous post, skip the fairy dust and buzzwords, focus on your customers' pain, and put together some educational materials (marketing and training) that will help them ease the bleeding. If you’ve forgotten how to do that, or never learned, there are those of us who can help you!

Who Needs The Beef?

For those of you who have been following my rants, especially on intake-to-orchestrate (which really is clueless for the popular kids, as it doesn’t do anything unless you already have all the systems you need and don’t know how to connect them), you’ll know that one of my big qualms, to this day, is Where’s the Beef?, because while the intake and orchestrate buns are nice and fluffy and likely very tasty, they aren’t filling. If you want a full stomach, you need the beef (or at least a decent helping of Tofu, which, unless you are vegetarian, won’t taste as good or be quite as filling, but will give you the sustenance you need).

And you need filling. Specifically, you need the part of the application that does something — that takes the input data (possibly properly transformed), applies the complex algorithms, and produces the output you need for a transaction or to make a strategic decision. That’s not intake-to-orchestrate, that’s not a fancy UI/UX, that’s not an agent that can perform transactional tasks that fall within scope, and that’s NOT a fancy bun. It’s the beef.

But, apparently, at least as far as THE PROPHET is concerned, (bio) re-engineering is going to eliminate the need for the beef. Apparently, the buns are going to have all the nutrients (or data processing abilities) you need to function and do your job.

In THE PROPHET‘s latest analogy, today’s enterprise technology burger consists of:

  • the patty: (not to be mistaken for the paddy) which combines enterprise technology and labour (which means it really should be the patty [labour] and the trimmings [technology] in this analogy)
  • the upper bun: and
  • the lower bun: which collectively provide you a way to cleanly get a grip on the patty

But tomorrow’s enterprise technology burger will consist of:

  • the upper bun: which will be replaced by a new type of technology that fuses co-pilots and agentic systems to power autonomous agents and replaces the patty [labour] and part of the trimmings
  • the lower bun: which will represent the next generation data store and information supply chain and build in “self-healing” technology for data maintenance and replace the other part of the trimmings

… and that’s it. NO BEEF! Just two co-dependent buns that are destined to fuse into a roll … and not a very tasty one at that. Because this roll will, apparently, operate fully autonomously and never get anywhere near you, leaving you perpetually hungry.

Now, apparently, not all parts of the patty (with its complex amino acid chains and protein structures) will be capable of being (bio) re-engineered into the buns right away and the patty won’t disappear all at once, just shrink bit by bit over the next decade until there’s nothing left and the last protein structure is absorbed (or replaced by a good enough AI-generated facsimile — they can do that now too). In THE PROPHET‘s view, legacy systems of record (ERP/MRP, payment platforms, etc.) will be the last to be replaced, and those will survive along with the legacy labour to maintain them until they can finally be split up into components and absorbed into the bun.

In other words, in THE PROPHET‘s view, you don’t need the patty, and, more specifically, you don’t need (or even want) the beef. I have to argue this is NOT the case.

1. You Need the Beef

Thinking that the patty can be completely absorbed into the buns results from a lack of understanding of enterprise software architecture best practices, and of software development in general.

The best architecture we have, which took years to get to, is MVC, which stands for

  • Model: specifically, data model, which should be at the bottom (and could be absorbed into a data bun)
  • View: specifically, the UI/UX we interact with (and could be absorbed into a soft, warm, sweet smelling sourdough bun)
  • Controller: the core algorithms and data processing, which needs to be its own layer that supports the UX (and allows the UX to reconfigure the processing steps and outputs as needed) and can be cross-adapted to the best available data sources (that need to remain independent)
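A minimal sketch of that separation, with hypothetical names and Python for illustration: the Controller owns the core algorithms (the "beef"), and either neighbouring layer can be swapped out without touching it.

```python
class SpendModel:
    """Model: the data layer. Could be swapped for any conforming data source."""
    def transactions(self):
        # Hypothetical (supplier, amount) records standing in for a real feed.
        return [("Acme", 1200.0), ("Initech", 800.0), ("Acme", 500.0)]

class SpendController:
    """Controller: the core processing that turns raw data into decisions."""
    def __init__(self, model):
        self.model = model

    def spend_by_supplier(self):
        totals = {}
        for supplier, amount in self.model.transactions():
            totals[supplier] = totals.get(supplier, 0.0) + amount
        return totals

class ConsoleView:
    """View: the UI/UX layer; only renders what the controller produces."""
    def render(self, totals):
        for supplier, total in sorted(totals.items()):
            print(f"{supplier}: {total:,.2f}")

# Wire the layers together; each can be replaced without touching the others.
view = ConsoleView()
view.render(SpendController(SpendModel()).spend_by_supplier())
```

Collapse the Controller into either bun and you lose exactly this substitutability: the processing becomes welded to one data store or one UI, and the "beef" disappears.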

Moreover, even Bill Gates, who predicts AI will have devastating effects across all industries, realizes that you can’t replace coders, energy experts, and biologists, and, by extension, jobs that require constantly evolving code, organic structures, or energy expertise. So you will still need labour that creates, and relies on, highly specialized algorithms and expert interpretations of outputs to do their jobs. That also means that, in our field, strategic sourcing and procurement professionals cannot be replaced, but tactical AP clerks are on their way out as AP software automatically processes 99% to 99.9% of invoices with no human involvement, even those with missing data and errors, handling the return, correction, negotiation, etc. until all of the data matches and costs are within tolerance.

2. You Want the Beef!

The whole point of modern architectures and engineering is to minimize legacy code / technical debt and maximize tactical data processing and system throughput (and have the system do as much thunking as possible, which is what it’s good at). If you try to push too much into the lower bun, you don’t have separation of data and processing, which means it’s almost impossible to validate the data as it’s not data you’re getting, but processed data, which means that the system might be continually pushing wrong data to the outer bun, even with good data fed in, due to a bug deep in the transformation and normalization code. But your automatic checks and fail safes would never catch it because you’ve turned what should be a crystal (clear) box into a black box! If you try to push too much processing into the upper bun, you have to replicate common functionality across every agent and application, leading to a lot of replication and bloat that consumes too much space, uses too much energy, and makes the systems even harder to maintain than the legacy applications of today.

So while the burger of tomorrow might be different with a much leaner, more protein rich, patty (with less sauce and unhealthy trimmings), and the bread might be a super healthy natural yeast-free multi-grain flat bread, making for a smaller (and possibly less appetizing burger from a surface view), it still needs to be a burger and anyone who thinks otherwise has joined the pretty fly Gen-AI in hallucination land!