Author Archives: thedoctor

The MOST Important Clause in Your (Procure) Tech (SaaS) Contract (Part I)

You might think there is no one most important clause, as there are a lot of important clauses, especially if you ask around:

  • In Procurement, you will want implementation in the promised timeframe
  • Finance will want holdbacks and penalties if functionality is not delivered or timeframes are not met
  • Risk Management will want clauses around cyber-security and privacy
  • Legal will be very concerned about governing venue, liability, and standard termination clauses;
  • etc.

And those are all important, but the reality is:

  • regardless of what’s in the contract, the solution will be implemented when it gets implemented, and delays will be blamed on your IT team, partners, etc., especially when the delay is actually the vendor’s fault
  • you have to prove it was the vendor’s fault to get any penalties enforced, and that will be very hard indeed
  • good clauses alone are not enough; if a cyber-breach or data-breach happens, your customers will still come after you
  • the legal venue usually isn’t that important; the only time liability typically comes into play is if customer data is fraudulently accessed as a result of the provider’s failure in security, or there is a massive prolonged system failure; and, no matter how bad the performance, the contract won’t be terminated unless there is outright fraud, because the organization still needs a system
  • etc.

which means that, while important, unless there was outright fraudulent representation (or serious negligent misrepresentation) in the signing of the contract, none of these clauses really matter, as they aren’t protecting you nearly as much as you think they are: any damages you would be awarded in court would be limited to fees paid, which could be dwarfed by the legal fees and mounting losses while you waited months or years for the situation to be resolved!

Moreover, when you consider that the average company is not a Fortune 500 and no longer has (multi-)million dollar budgets for SaaS, most of your purchases are going to be in the (low) six-figure range. This means that the vendor knows that the cost of any legal action, plus the losses that would be incurred by the organization taking that action, will dwarf the fees paid, which means the likelihood of any action coming the vendor’s way is minimal. (Plus, after all of the glowing recommendations you gave the vendor to the C-Suite upon selection, to their customers at the all-expenses-paid customer event at the fancy resort destination that was offered to you as a big new-name customer, and to new potential customers in reference calls when you were still enthralled by the shiny screen, they know you won’t want to come forward and admit how wrong you were.)

This means that a good portion of you will be screwed to some extent. Let’s consider the reality.

  • Once FinTech, and then ProcureTech, became hot, you had all of the top-performing salespeople from across enterprise tech move in — and not all of them are altruistic; in fact, some of them are as psychopathic as they come and will promise anything to get the deal signed, even if they know the vendor organization CANNOT deliver
  • Many providers have been capitalized at multiples of 7, 10, 15, or more by VC and PE firms looking for the next unicorn, are under pressure to reach ridiculous, and wholly unrealistic, sales targets, and will effectively overpromise to get sales and then underdeliver when the investors don’t allow them to hire enough support personnel because sales targets weren’t hit
  • There are over 700 providers in a space that offers less than 10 core modules. That’s almost 10 times the number of providers that are needed. Most will not reach or retain profitability and, thus, most will not survive. Some will go under; others will be acquired in fire sales or discount sell-offs by investors who cut their losses before they lose it all. Even if your vendor gets acquired, chances are the acquirer will gut it, support levels will significantly decrease, and new development will come to a standstill.
  • If the vendor needs the sale to get the bank loan, keep their jobs, or make payroll, even the best providers will assume they can figure it out later with money in the bank. But this won’t always happen, especially if they are behind on promises to other customers.

In other words, even if the salesperson and the provider had no ill intent, you are still likely to get screwed.

This means that the most important clause in the contract is …

Finally A Good Webinar on Gen-AI in ProcureTech …

… from SAP?!?

Yes, the doctor is surprised! In ProcureTech, SAP is not known for being on the leading edge. Its latest Ariba refresh is 3 to 6 years late. (Had it been released in 2019, before the intake and orchestration players started hitting the scene and siphoning off SAP customers with their ease of use and ability to integrate into the back-end for data storage, it would have been revolutionary. Had it been released in 2022, before these players really started to grow beyond the early adopters, it would have been leading. Now, no matter how good it is, SAP Ariba is going to be playing catch-up in the market for the next two years! This is because it’s been fighting not only to keep its current customers, but to grow when it now has suites, I2O [Intake-to-Orchestrate] providers, and mini-suites in the upper mid-market all chomping at its customer base!)

Most players in ProcureTech jumping on the Gen-AI Hype Train are just repeating the lies, damn lies, and Gen-AI bullcr@p that the big providers (OpenAI, Google, DeepSeek, etc.) are trying to shove down our collective throats, especially since these ProcureTech players don’t have real AI experts in house to know what’s real and what’s not. Given that SAP Procurement is not a big AI player, one would expect that, despite their best efforts, they might be inclined to take provider and partner messaging and run with it. But they didn’t.

In fact, they went one step further and engaged Pierre Mitchell of Spend Matters (A Hackett Group Company) in their webinar (now on demand), who is one of the few analysts in our space more-or-less getting it right (and trying to piece together a plan for companies to successfully identify, analyze, and implement AI in their ProcureTech operations). (Now, the doctor doesn’t entirely agree with all of his architecture or all of his viewpoints, but the effort and accuracy of Pierre’s work is leagues beyond anything else he’s seen in our space, and, if you’re careful and follow his models and advice properly, the risk is low. Moreover, you’re starting from sanity if you follow his guidance! More than can be said for the majority of AI approaches out there.)

When it was said that architecting the solution, the area around [the] business data cloud, and managing data and data models is really important (because AI has shown that we have all this amazingly powerful data out there, but we have to tap it, make it more structured, and make it useful), that, given current data quality, the use of those models right now needs to be limited to co-pilots and chatbots because we’re not ready to turn the keys over to the LLMs, and that they have to be wrapped in deterministic tooling, they were not only making clear the limitations of LLM technology, but making clear that they understand those limitations and that they have to do more than just plug in an LLM to deliver dependable, reliable value to their customers.

When even the leading LLM, ChatGPT, generates responses with incorrect information 52% of the time, that tells you just how unreliable LLM technology is! Moreover, it’s not going to get any better considering that OpenAI (and its peers) literally downloaded the entire internet (including illegally using all of the copyrighted data that had been digitized to date [until the Big Beautiful Bill that restricted Federal AI Regulation for 10 years was passed, retroactively making their IP theft legal]) to train their models, and the vast majority of data produced since then (which now accounts for half of the internet) is AI slop. (This means that you can only expect performance to get worse, and not better!) This means that you can’t rely on LLMs for anything critical or meaningful.

However, if you go back to the basics and focus on what LLMs are good for, namely:

  • large document search and summarization and
  • natural language processing and translation to machine friendly formats

then you realize these models can be trained with high accuracy to parse natural language requests and return machine-friendly program calls that execute reliable deterministic code, and then parse the programmatic strings returned and convert them to natural language responses. If you then use LLMs only as an access layer, and take the time to build up the cross-platform data integration, models, and insights a user will need in federated cubes and knowledge libraries, you can provide real value to a customer using traditional, dependable analytics, optimization, and Machine Learning (ML) in an interface that doesn’t require a PhD to use it!
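The access-layer pattern described above can be sketched in a few lines. Everything here (the `parse_request`/`execute`/`render_answer` names and the spend figures) is hypothetical; in a real deployment only the two translation steps would be LLM-backed, and they are stubbed below with trivial keyword matching and templating:

```python
# Minimal sketch of the "LLM as access layer" pattern: the language model
# only translates to/from structured calls; all computation is deterministic.

# Deterministic backend: a pre-built spend cube (category -> total spend).
SPEND_CUBE = {"laptops": 420_000.0, "keyboards": 18_500.0}

def parse_request(text: str) -> dict:
    """Translate a natural-language request into a structured call.
    (The one place an LLM belongs; stubbed here with keyword matching.)"""
    for category in SPEND_CUBE:
        if category in text.lower():
            return {"op": "total_spend", "category": category}
    return {"op": "unknown"}

def execute(call: dict) -> dict:
    """Reliable deterministic code -- no LLM involved."""
    if call["op"] == "total_spend":
        return {"category": call["category"], "total": SPEND_CUBE[call["category"]]}
    return {"error": "unsupported request"}

def render_answer(result: dict) -> str:
    """Convert the programmatic result back to natural language.
    (Again, an LLM's job in production; templated here.)"""
    if "total" in result:
        return f"Total {result['category']} spend is ${result['total']:,.2f}."
    return "Sorry, I couldn't map that request to a supported report."

print(render_answer(execute(parse_request("What is our spend on laptops?"))))
```

The point of the design is that a wrong translation produces an "unsupported request" message, never a wrong number, because the numbers only ever come from the deterministic layer.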

This is what they did, as they explained in their example of what should be done when your CFO asks for a breakdown of your laptop and keyboard spend to potentially identify opportunities to consolidate vendors. Traditionally, this request might take your business analyst days to compile across multiple systems, stakeholders, and spreadsheets, but if you have SAP Spend Control Tower with AI, it unifies data across multiple sources in the platform for you. Whether your purchases are coming through existing contracts, P-cards, expense reports, or any other channel, it federates the data by apply[ing] intelligent classifications to automatically categorize your purchases with standard UNSPSC codes, ensuring that items like your Dell XPS 15 and your MacBook Pro 16 are both properly classified as laptops despite the different naming conventions. Moreover, since they have also integrated with Dun & Bradstreet, you can easily consolidate your suppliers: rather than it looking like you’re purchasing items from three different subsidiaries, your purchases will align to the same parent company. This says they are using traditional categorizations, rules, and machine learning on the back end to build one integrated cube with summary reports, and all the LLM has to do is create an English summary, to which you can attach the supporting system-generated reports.
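As a rough illustration of that federation step (all item data, keyword rules, and parent mappings below are invented; real systems use trained ML classifiers and a licensed corporate-hierarchy feed rather than hard-coded lookups):

```python
# Hypothetical sketch: classify free-text line items to a standard UNSPSC
# code and roll subsidiaries up to a parent company, then build one cube.

# Tiny classification rules (real systems use trained ML classifiers).
UNSPSC = {"laptop": "43211503", "xps": "43211503", "macbook": "43211503",
          "keyboard": "43211706"}

# Parent-company rollup, as a corporate-hierarchy lookup would provide.
PARENT = {"Dell EMEA Ltd": "Dell Technologies",
          "Dell USA LP": "Dell Technologies",
          "Apple Retail UK": "Apple Inc"}

purchases = [
    {"desc": "Dell XPS 15", "supplier": "Dell EMEA Ltd", "amount": 1800.0},
    {"desc": "MacBook Pro 16", "supplier": "Apple Retail UK", "amount": 2500.0},
    {"desc": "Mechanical keyboard", "supplier": "Dell USA LP", "amount": 120.0},
]

def classify(desc: str) -> str:
    """Map an item description to a UNSPSC code via keyword rules."""
    for kw, code in UNSPSC.items():
        if kw in desc.lower():
            return code
    return "unclassified"

# One federated cube: (UNSPSC code, parent company) -> total spend.
cube: dict = {}
for p in purchases:
    key = (classify(p["desc"]), PARENT.get(p["supplier"], p["supplier"]))
    cube[key] = cube.get(key, 0.0) + p["amount"]

print(cube)
```

Note how the Dell XPS and the MacBook land under the same laptop code, and the two Dell subsidiaries roll up to one parent, which is exactly the consolidation the webinar example described.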

Moreover, this also says that if you need to source 500 laptops and 500 [external] keyboards with the goal of cut[ting] current costs by 15% from what you’ve been paying, it can automatically identify the target prices; identify the suppliers/distributors who have been giving you the best prices; automatically run predictive analytics to estimate the quotes you would get from awarding all of the business to one supplier (who would then be inclined to give better price breaks); and, if none of those looked like they’d generate the reduction, access its anonymized community data, identify other suppliers/distributors supplying the same laptops you typically buy, compute their average price reduction over the past three months, and identify those that should be invited to an RFX or Auction to increase competition and the chances of you achieving the target price reduction, while informing you of the price reduction it predicts (which might only be 10%, or 5%, if you are already getting better than average market pricing). And it will do all of this with a few clicks. You’ll simply tell the system what your demand is and what your goal is; all of these computations will be run, supplier and event (type) recommendations generated, and it will be one click to kick off the sourcing event.
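The target-price arithmetic behind that flow is simple enough to sketch (all prices and supplier quotes below are invented for illustration; a real system would predict the quotes from history and community data rather than take them as givens):

```python
# Back-of-envelope sketch of the target-price check described above.

goal_reduction = 0.15          # the 15% cost-cut goal
current_unit_price = 1200.0    # hypothetical: what you've been paying per laptop
demand = 500                   # units to source

target_price = current_unit_price * (1 - goal_reduction)

# Hypothetical predicted unit prices if each supplier got the full volume.
predicted_quotes = {"Supplier A": 1100.0, "Supplier B": 1080.0, "Supplier C": 1140.0}

best_supplier = min(predicted_quotes, key=predicted_quotes.get)
best_price = predicted_quotes[best_supplier]
predicted_reduction = 1 - best_price / current_unit_price

print(f"Target price: ${target_price:.2f}")
print(f"Best predicted quote: {best_supplier} at ${best_price:.2f} "
      f"({predicted_reduction:.0%} reduction vs the 15% goal)")
if best_price > target_price:
    # Incumbents alone won't hit the goal -> widen the field.
    print("Goal unlikely from incumbents; invite community-identified "
          "suppliers to an RFX/auction to increase competition.")
```

With these numbers the best incumbent only gets you a 10% reduction against a $1,020 target price, which is exactly the case where the system would recommend opening the event to additional suppliers.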

Moreover, the webinar said that if you think about this area around workflow and process orchestration, there’s no reason why you can’t take pieces of that, like the endpoints around intake or invoices or whatever, use AI there, and bake it in a controlled way into your processes. Because that’s the key: taking one tactically oriented process that consumes too much manual intervention at a time and using advanced tech (which need not be AI, by the way; modern Adaptive RPA [ARPA] is often more than enough) to improve it. Then, over time, stringing these together to automate more complex processes, where you can gate them to ensure exceptional situations aren’t automated without proper oversight. One little win at a time. And after a year it cumulatively adds up to one big win. (Versus going for a big-bang project, which always ends in a big bang that blows a hole in your operation that you might not be able to recover from.)

The only bad part of this webinar was slide 24, Spend Matters recommendation #1: “Aggressively Implement GenAI”!

Given that Gen-AI is typically interpreted as “LLM”, as per above, this is the last AI tech you should aggressively implement given its unreliability for anything but natural language translation and search and summarization. Moreover, any tech that is highly dynamic and emerging should be implemented with care.

What the recommendation should be is: aggressively implement AI, because now that we have the computational power and data that we didn’t have two decades (or so) ago, which was the last time AI was really hot, tried and true (dependable) machine learning and AI are now practical and powerful!

Now, in his LinkedIn post, Pierre asked what we’d like to see next in terms of research/coverage (regardless of venue). So I’m going to answer that:

Gen-AI LLM-Free AI Transformation!

Because you don’t need LLMs to achieve all of the value we need out of AI in ProcureTech and, to be honest, any back office tech. As I have been saying recently, everything I penned in the classic Spend Matters series on AI in Procurement (Sourcing, Supplier Management) Today, Tomorrow, and the Day After in the last decade, including “the day after”, was possible when I penned the series. It just wasn’t a reality because there were few AI experts in our space, data was lacking, and the blood, sweat, and tears required to make it happen were significant. We didn’t have readily available stacks, frameworks, and models for the machine learning, predictive analytics, and semantic processing required to make it happen. Vendors would have had to build the majority of this themselves, which would have been as much (or more) work than building their core offering. But it was possible. And with all the modern tech at our disposal, it’s now not only possible, but doable. There is zero need to embed an untested, unreliable LLM in an end-user product to provide all of the advantages AI can offer. (Or, if you don’t have the time to master traditional semantic tech for NLP, zero need to use an LLM for anything more than NLP.)

So, I’d like to see this architecture and explanation of how providers can roll out safe AI and how buying organizations can use it without fear of being another failure or economic disaster when it screws up, goes rogue, and orders 100,000 units of the wrong product!

Tomorrow Doesn’t Matter In Procurement. Only Today.

Stop racing towards a future that won’t happen, or running away from one you don’t believe in. It doesn’t matter. As per our prior posts this week, the doctor has been reading future-of-Procurement white papers for 20 years now, all of which have promised us radical change. This means that they should have started to come true 10 years ago. Not one did. Not ONE! The reality is that we can’t predict the future, and trying to do so just wastes time and effort. However, we can be vigilant about where things are today, learn the tools and techniques that can make us much more efficient in our job, identify those vendors who offer the tools backed by the right technology to enable us to be more effective, acquire and use those tools, and become at least five times more efficient in our job than the average Procurement employee.

For those who tuned out for a while, this is more-or-less Part 5 of the series we have been running this week inspired by the recent white paper by Jonathan O’Brien of Positive Purchasing and Guy Strafford of OneSupplyPlanet on the Functional ExtAInction Battle where the authors claim that AI might just lead to the extinction of Procurement as a business function. To get to the punchline, it won’t, but the non-stop bullcr@p AI Hype might! (Given how many C-Suites are blinded by the hype that is generated 24/7/365 by the A.S.S.H.O.L.E.)

In that series we told you that, despite a few false assumptions, the authors still got to the right answer, more or less. The conclusion that the only Procurement organizations that are going to survive are those that manage to automate and mostly eliminate the tactical, double down on the strategic, and find new value to bring to the business is the correct one. Moreover, those are the Procurement departments that will be rewarded and maintain more headcount than their peers because, after the massive losses from AI failures and the forthcoming AI market crash, the C-Suites who lead their businesses to survival will be those that realize the value of best-in-class Procurement People and invest in them.

However, that doesn’t mean the training budgets that disappeared two and a half decades ago are coming back. They aren’t. Since the C-Suites are still hoping for the day they can fire you, they won’t invest in you, which means that you need to get there on your own. It also means you need to start now. Start learning, start studying, start identifying very cost effective tools that can be put on a P-Card that will significantly improve a function and return value the quarter the tool is acquired (and before you get the third degree about that unexpected P-Card purchase). Real technological progress, with or without AI, comes from one little win at a time — for each task you do, identify the most time consuming tactical part of that task and automate it. Start with the tasks you do the most and continue until you’ve taken 80% out of all of the most time-consuming tactically oriented tasks you do on a monthly basis. When you reach that point you will find that you have not only digitized, but revolutionized, your function and reached the point where you have flipped the tables and are spending 80% of your time on strategic decision making and relationship building and only 20% on tactically oriented tasks — a percentage that will decrease over time as you improve the tools and end-to-end automation across functions.

Furthermore, no super powers are required. Just intelligence, the willingness to study late, get up early, roll up the sleeves and work hard until you sweat through your tears. Like all real progress, it’s hard at first, but it will pay off later — when you still have a job and are delivering above peers while only working reasonable hours.

Moreover, you won’t need deep software (or even system) architecture skills either. Just the ability to define what a process should be, how a tool should support it, and find that tool. You need to be a solution architect — leave the technical and system architecture skills to the experts. If the tool they are selling gets it right at low cost with low compute and high reliability, the architecture is probably such that you wouldn’t do any better.

And whatever you do, don’t waste time playing the paradigm game. Leave that to the influencers, who won’t last nearly as long as they think they will. Or to the consultants, who will be walked out the door and never invited back once the C-Suite realizes they flushed millions down the drain chasing an AI utopia that doesn’t exist. Just consistently get results, push those results in front of the C-Suite, and tell them that they can call it whatever they want, but Procurement is the function — and sometimes the ONLY function — that gets results.

AI Will Not Replace You. People Using AI Will Not Replace You. But People Who Correctly Embrace Digitization Will. (Part 4)

In the Functional ExtAInction Battle white paper, the authors repeat the ridiculous claim that we may see AGI within the next decade (despite their acknowledgement that it will need a gargantuan data processing infrastructure with an unachievable energy requirement equivalent to 34 new nuclear power stations by 2030 and 34 more by 2035, which still overlooks the requirement for an exponential increase in training data, which does not and will not exist, as we don’t even have enough data today for effective LLMs, which already stole every piece of data on the internet to get to the point where they fail almost 50% of the time on tasks they were specifically trained for, like ChatGPT, which generates responses with incorrect information 52% of the time! [And now that the internet is filled with as much AI slop as actual human-created content, performance is just getting worse.])

The reality is that we won’t. Every 10 years we see a resurgence in AI hype, and every 20 years it is a big one. This is the biggest AI craze since the early 80s, when Japan almost bet its entire IT economy on 5GL, which was supposed to allow a computer to solve a problem presented to it from constraints alone, i.e., a first attempt at … AGI! It didn’t happen then, and it’s not going to happen now. Until we crack intelligence and ingenuity, and how to effectively model them, all the processing power (and flawed data) in the world is not going to do that for us. Moreover, we will continue to outsmart the most advanced AIs in the world using toddler level thinking and hiding in boxes and bushes, like these marines did.

However, the C-Suite’s desire to believe in this is forcing digitization efforts, if not AI, and the teams that survive this current nightmare will be those that not only digitize under the guise of AI (since every vendor claims it, whether they really have it or not and since most C-Suites have no clue what AI is or isn’t), but in fact embrace next generation automation on steroids (like Adaptive RPA that learns from every exception, decision, and override to continue to decrease the need for human intervention over time). (The teams that fail to embrace modern tech will be sidelined and the teams that don’t resist the experimental AI being pushed on them by the overpriced consultancies brought in by the C-Suite will continue to contribute to the 95% failure statistic and possibly end their function entirely.)

Moreover, the teams that embrace appropriate digitization, like form-fit ARPA, will fill the evolutionary niche that the business needs, and that only Human Powered Procurement can fill. While the authors got the premise wrong (it’s not AI that you need to worry about, it’s the AI Marketing), and are overzealous about the emergence of AGI, they get the future right. (Which is the past, by the way, but more on this later!)

According to the white paper authors, the future is spend that needs human interaction (because it can change the game or presents existential risk to the business). This is what should be Procurement’s primary purpose. Procurement’s never had enough time to manage all spend, and, as the authors note, it shouldn’t be managing tactical spend. It should be automating it. If the spend is low volume, low risk, easily replaceable, etc., Procurement should define the best-practice strategies and processes and let modern tech (which doesn’t require AI*, by the way) entirely automate it. This goes for routine and non-routine spend, where, in the case of non-routine spend, the provider has best-practice templates culled from its experience and community intelligence. However, routine strategic spend will not be turned over to AI. It will be highly automated, but human experts will still vet the suppliers and verify the decisions before a contract is signed or a PO is sent out; still, a lot less time will be spent on strategic spend that is routine and usually doesn’t change much from year to year.

The world, and the technological underpinnings, will continue to evolve as they have for the last four decades. The pace will pick up a bit, but not much. Humans are naturally lazy and change resistant, which means that significant change typically requires a generation (or two). It’s never a “whole new world”, just a slightly different one. The only time humanity has ever undergone and grudgingly accepted such significant change is as the result of a significant natural or man-made disaster that has devastated entire cities and populations, and forced adaptation to survive. But rarely has the survival brought something better in the lifetime of those forced to undergo it! (Plus, the world has been a commercial hub since before history was recorded. We’ve always traded to survive, thrive, and satisfy our desires. It’s just that we’ve replaced food and trinkets with digital bits that represent them at a digital equivalent of their perceived monetary value. So whether you call the function in the business that manages that aspect of the outside world Procurement or the Commercial Hub [of the business] is irrelevant.)

So don’t fear a rapid change, it’s not going to happen. But prepare for a steady change, and you can keep up while your peers fall behind.

* When the doctor wrote his AI In Procurement (Sourcing, Supplier Management, etc.) Today, Tomorrow, and the Day After in the late 2010s (before all the Gen-AI bullcr@p), what he didn’t tell you was that everything he included in “the day after”, which is the majority of everything the Agentic AI providers are promising now, was already possible. It just required a lot more code, sweat, and tears on yesteryear’s stacks with a lot more templates and customized training data sets than most providers, or companies, had at their disposal at the time.