The MOST Important Clause in Your (Procure) Tech (SaaS) Contract (Part II)

In Part I we told you that

  • while you might think there is no single most important clause as there are a lot of important clauses, especially if you ask around,
  • liability or penalty clauses are quite important, or that
  • termination matters

the reality is that

  • there is a most important clause, and it’s not what you think,
  • liability is worthless if collecting costs more than you get, and
  • you can’t terminate if you don’t have another choice!

But this isn’t the worst of it! The worst of it is that, after signing the contract, there is a good chance you will be screwed to some extent, whether or not the provider intends it. Between:

  • psychopathic salespeople who will promise anything to sign the deal (and off to their next job before the reckoning comes),
  • investor owners that are going to limit/cut support when unreachable sales targets are not hit, forcing the C-Suite to pick and choose who to screw over,
  • the fact that your vendor will likely be acquired (because if it’s not, it’s likely to go out of business), and
  • a struggling vendor with the best of intentions will take on too much and be forced to leave some customers high and dry

the chances are that you are going to be screwed.

This means there is one clause that overrules them all:

IT’S MY DATA … AND I CAN, AND WILL, GET IT ANYTIME I WANT IT!

You might think it’s your data, and that you can get it anytime you want it, as there will be clauses around data protection, privacy, security, etc., as well as acknowledgements that you own your data, that it will be kept separate from competitors’ data, and that the provider will not use it except to serve you (which may include using limited anonymized portions of it in community data).

And you might think you can get your data anytime you want it because they will guarantee up time, allow you to export transactions and reports, and so on.

But ask yourself this. Of the hundreds (and possibly more than a thousand) of SaaS applications your organization currently uses, and has used throughout your career there, for how many could you do a complete, self-serve export of all of your data on demand? And by all of your data, I mean all of your data. Not just reports or summaries or core record subsets. In Sourcing, all suppliers and all related 360° data — all risk scores, compliance certificates, performance KPIs, related transactions, related bids, related events, product catalogues, tooling data, etc. In Procurement, all documents related to a transaction — not just the invoice but the purchase order, acknowledgement, goods receipt, credit note, etc.

When we say all of your data, we mean ALL of your data. Chances are, you can’t get it self-serve from your SaaS application. You might not even be able to get all of your data with help from the provider’s services personnel. For some applications, the only chance is if the developer does a relatively undocumented database export. And good luck with that!

This means three things.

  1. If the provider says they have no way for you to get all of your data at any time, you should not consider them.
  2. You must have:
    • a clause that allows you to export all of your data, self-serve, at any time, in a standardized format (it’s reasonable for the provider to charge a fee if we’re talking many GBs or TBs and you decide to export all of it on a regular basis, but, depending on the data velocity and volume, you should be able to do this at least once a quarter, month, or week for free),
    • a modified penalty clause with a significant penalty if you cannot do so by whatever date the baseline implementation is supposed to be completed,
    • a modified termination clause if the provider is unable to correct this by a certain time, and
    • a modified liability clause for the damages incurred, as you will have to find another solution and will have lost time and money implementing the provider’s solution.
  3. You must test the ability as soon as the initial import of all of your data is complete, and again in a few weeks once you create a whole lot of new data in the system (updated profiles, end-to-end sourcing events, thousands of new transactions with associated documents, etc.). We realize this will take a lot of time, but much less than trying to figure out what to do six to eighteen months down the road when the vendor fails (you) and you’re left high and dry.
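The completeness test in item 3 can be partially automated. A minimal sketch, assuming you can obtain per-record-type counts both from the live system and from the export — all record types, counts, and the function name here are illustrative, not any provider’s actual API:

```python
# Compare record counts in a full data export against the counts the
# live system reports, to flag record types that were silently dropped.
# All data here is illustrative; in practice both dicts would be built
# from the provider's export files and its reporting screens/API.

def find_export_gaps(system_counts: dict, export_counts: dict) -> dict:
    """Return {record_type: (in_system, in_export)} for every shortfall."""
    gaps = {}
    for record_type, in_system in system_counts.items():
        in_export = export_counts.get(record_type, 0)
        if in_export < in_system:
            gaps[record_type] = (in_system, in_export)
    return gaps

system_counts = {
    "suppliers": 1200, "purchase_orders": 45000, "invoices": 44100,
    "goods_receipts": 43000, "credit_notes": 800, "risk_scores": 1200,
}
export_counts = {
    "suppliers": 1200, "purchase_orders": 45000, "invoices": 44100,
    "goods_receipts": 43000, "credit_notes": 800,  # risk_scores missing!
}

gaps = find_export_gaps(system_counts, export_counts)
print(gaps)  # {'risk_scores': (1200, 0)} -> the export is NOT complete
```

Counts alone won’t prove the exported records are intact, but a shortfall like the one above is enough to trigger the penalty, termination, and liability clauses before you’re six months in.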

That way, if the provider

  • fails to complete the implementation and required integrations in a reasonable time (and you’re unable to adopt the system),
  • sells you something they don’t have and may not have within the timeframe of the initial agreement,
  • gets acquired by a larger vendor with no intent to support the solution longer than they feel it will take for their forced migration to a higher-priced solution you don’t want, or
  • serves you a notice that it is winding down operations

you can keep going. As long as you can export all of your data in a standard, documented format, you know that there are a dozen (if not dozens of) providers who will happily convert it to their format (for free) to win your business. Just be sure they will also agree to the same IT’S MY DATA … AND I CAN, AND WILL, GET IT ANYTIME I WANT IT! clause before selecting them!

The reality of the situation is that there is no unique capability in business data processing that can’t be, and isn’t, more-or-less replicated by dozens of other solutions. Sure, they have different UIs, add or subtract process steps, and use different data storage formats, but universal business processes are universal, there are dozens of ways to implement them, and dozens of ways to get around the software patents supposedly protecting them (which should be banned in the US, as they are in the EU). The next solution might not be as custom fit as the one you are forced to abandon, but it will work (as long as you have unhindered access to 100% of your data). That’s the point.

As long as you can always get your data, you’re never completely screwed. (And once you’ve switched, if the losses are still significant, then, if the C-Suite wants to pursue, you can let the lawyers have their day. You won’t be held ransom by a vendor holding your data hostage.)

The MOST Important Clause in Your (Procure) Tech (SaaS) Contract (Part I)

You might think there is no one most important clause, as there are a lot of important clauses, especially if you ask around.

  • In Procurement, you will want implementation in the promised timeframe
  • Finance will want holdbacks and penalties if functionality is not delivered or timeframes are not met
  • Risk Management will want clauses around cyber-security and privacy
  • Legal will be very concerned about governing venue, liability, and standard termination clauses,
  • etc.

And those are all important, but the reality is:

  • regardless of what’s in the contract, the solution will be implemented when it gets implemented, and delays will be blamed on your IT team, partners, etc., especially when they are actually the vendor’s fault
  • you have to prove it was the vendor’s fault to get any penalties enforced, and that will be very hard indeed
  • good clauses alone are not enough: if a cyber-breach or data-breach happens, your customers will still be coming after you
  • the legal venue usually isn’t that important; the only time liability typically comes into play is if customer data is fraudulently accessed as a result of the provider’s failure in security, or there is a massive prolonged system failure; and, no matter how bad the performance, the contract won’t be terminated unless there is outright fraud, because the organization still needs a system
  • etc.

which means that, while important, unless there was outright fraudulent representation (or serious negligent misrepresentation) in the signing of the contract, none of these clauses really matter. They aren’t protecting you nearly as much as you think they are, since any damages you would be awarded in court would be limited to fees paid, which could be dwarfed by the legal fees and the mounting losses while you waited months or years for the situation to be resolved!

Moreover, when you consider that the average company is not a Fortune 500, and no longer has (multi-)million budgets for SaaS, that means that most of your purchases are going to be in the (low) six figure range. This means that the vendor knows that the cost of any legal action that would arise plus the losses that would be incurred by the organization that takes action will dwarf the fees paid, and that means that the likelihood of any action coming the vendor’s way is minimal. (Plus, after all of the glowing recommendations you gave the vendor to the C-Suite upon selection, to their customers in the all-expenses paid customer event at the fancy resort destination that was offered to you as a big new name customer, and to new potential customers in reference calls when you were still enthralled by the shiny screen, they know you won’t want to come forward and admit how wrong you were.)

This means that a good portion of you will be screwed to some extent. Let’s consider the reality.

  • Once FinTech, and then ProcureTech, became hot, you had all of the top performing sales people from across enterprise tech move in — and not all of them are altruistic; in fact, some of them are as psychopathic as they come and will promise anything to get the deal signed, even if they know the vendor organization CAN NOT deliver
  • Many providers have been capitalized at multiples of 7, 10, 15, or more by VC and PE firms looking for the next unicorn and are under pressure to reach ridiculous, and wholly unrealistic, sales targets, and will effectively overpromise to get sales and then underdeliver when the investors don’t allow them to hire enough support personnel due to not hitting sales targets
  • There are over 700 providers in a space that offers less than 10 core modules. That’s almost 10 times the number of providers that are needed. Most will not make/retain profitability and, thus, most will not survive. Some will go under, others will be acquired in fire sales or discount sell offs by investors who cut their losses before they lose it all. Even if your vendor gets acquired, chances are the acquirer will gut it and support levels will significantly decrease (and new development come to a standstill).
  • If the sale is needed to get the bank loan, keep jobs, or make payroll, even the best providers will assume they can figure it out later with money in the bank, but this won’t always happen, especially if they are behind on promises to other customers.

In other words, even if the sales person and the provider had no ill intent, you are still likely to get screwed.

This means that the most important clause in the contract is …

Finally A Good Webinar on Gen-AI in ProcureTech …

… from SAP?!?

Yes, the doctor is surprised! In ProcureTech, SAP is not known for being on the leading edge. Its latest Ariba refresh is 3 to 6 years late. (Had it been released in 2019, before the intake and orchestration players started hitting the scene and siphoning off SAP customers with their ease of use and ability to integrate into the back-end for data storage, it would have been revolutionary. Had it been released in 2022, before these players really started to grow beyond the early adopters, it would have been leading. Now, no matter how good it is, SAP Ariba is going to be playing catch up in the market for the next two years! This is because it’s been fighting to not only keep its current customers, but grow, when it now has suites, I2O [Intake-to-Orchestrate] Providers, and mini-suites in the upper mid-market all chomping at its customer base!)

Most players in ProcureTech jumping on the Gen-AI Hype Train are just repeating the lies, damn lies, and Gen-AI bullcr@p that the big providers (OpenAI, Google, DeepSeek, etc.) are trying to shove down our collective throats, especially since these ProcureTech players don’t have real AI experts in house to know what’s real and what’s not. Given that SAP Procurement is not a big AI player, one would expect that, despite their best efforts, they might be inclined to take provider and partner messaging and run with it. But they didn’t.

In fact, they went one step further and engaged Pierre Mitchell of Spend Matters (A Hackett Group Company) in their webinar (now on demand) who is one of the few analysts in our space more-or-less getting it right (and trying to piece together a plan for companies to successfully identify, analyze, and implement AI in their ProcureTech operations). (Now, the doctor doesn’t entirely agree with all of his architecture or all of his viewpoints, but the effort and accuracy of Pierre’s work is leagues beyond anything else he’s seen in our space, and if you’re careful and follow his models and advice properly, low risk. Moreover, you’re starting from sanity if you follow his guidance! More than can be said for the majority of AI approaches out there.)

When it was said that architecting the solution, the area around [the] business data cloud, and managing data and data models is really important (because AI has shown that, hey, we have all this amazingly powerful data out there, but we have to tap it, make it more structured, and make it useful), that the data quality of what is coming out of those models right now means they need to be limited to co-pilots and chatbots (because we’re not ready to turn the keys over to the LLMs), and that they have to be wrapped in deterministic tooling, they were not only making clear the limitations of LLM technology, but making clear that they understand those limitations and that they have to do more than just plug in an LLM to deliver dependable, reliable value to their customers.

When even the leading LLM, ChatGPT, generates responses with incorrect information 52% of the time, that tells you just how unreliable LLM technology is! Moreover, it’s not going to get any better considering that OpenAI (and its peers) literally downloaded the entire internet (including illegally using all of the copyrighted data that had been digitized to date [until the Big Beautiful Bill that restricted Federal AI Regulation for 10 years was passed, retroactively making their IP theft legal]) to train their models, and the vast majority of data produced since then (which now accounts for half of the internet) is AI slop. (This means that you can only expect performance to get worse, not better!) All of which means that you can’t rely on LLMs for anything critical or meaningful.

However, if you go back to the basics and focus on what LLMs are good for, namely:

  • large document search and summarization and
  • natural language processing and translation to machine friendly formats

then you realize these models can be trained with high accuracy to parse natural language requests and return machine friendly program calls that execute reliable deterministic code and then parse the programmatic strings returned and convert them to natural language responses. If you then use LLMs only as an access layer, and take the time to build up the cross-platform data integration, models, and insights a user will need in federated cubes and knowledge libraries, you can provide real value to a customer using traditional, dependable, analytics, optimization, and Machine Learning (ML) in an interface that doesn’t require a PhD to use it!
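The access-layer pattern described above can be reduced to a minimal sketch. Here the “LLM” is stubbed out with a trivial keyword matcher (a real system would call an actual model for both translation steps), and all function names and spend figures are illustrative — the point is purely structural: the language layer only selects and renders deterministic calls, it never computes the answer itself:

```python
# Access-layer pattern: the language model only maps a natural-language
# request onto one of a fixed set of deterministic functions, and then
# renders the structured result back into English. It never computes
# the answer itself. The "model" below is a stand-in keyword matcher.

def total_spend(category: str) -> dict:
    # Deterministic backend call (stubbed with fixed data for the sketch).
    data = {"laptops": 1_250_000, "keyboards": 85_000}
    return {"category": category, "total": data.get(category, 0)}

TOOLS = {"total_spend": total_spend}

def parse_request(text: str) -> tuple[str, dict]:
    """Stand-in for the LLM: translate text into (tool_name, arguments)."""
    for category in ("laptops", "keyboards"):
        if category in text.lower():
            return "total_spend", {"category": category}
    raise ValueError("request not understood; ask the user to rephrase")

def render_response(result: dict) -> str:
    """Stand-in for the LLM: turn the structured result into English."""
    return f"Total {result['category']} spend is ${result['total']:,}."

def answer(text: str) -> str:
    tool, args = parse_request(text)   # NL -> machine-friendly call
    result = TOOLS[tool](**args)       # reliable deterministic code
    return render_response(result)     # machine result -> NL

print(answer("What did we spend on laptops?"))
# Total laptops spend is $1,250,000.
```

Because every number in the response comes from the deterministic tool and not from the model, a translation failure surfaces as “request not understood” rather than a confidently wrong answer — which is exactly the reliability property the paragraph above is arguing for.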

This is what they did, as they explained in their example of what should be done when your CFO asks for a breakdown of your laptop and keyboard spend to potentially identify opportunities to consolidate vendors. Traditionally, this request might take your business analyst days to compile across multiple systems, stakeholders, and spreadsheets, but if you have SAP Spend Control Tower with AI, they unify data across multiple sources in the platform for you. Whether your purchases are coming through existing contracts, P-cards, expense reports, or any other channel, they federate the data by apply[ing] intelligent classifications to automatically categorize your purchases with standard UNSPSC codes to ensure that items like your Dell XPS 15 and your MacBook Pro 16 are both properly classified as laptops, despite the different naming conventions. Moreover, since they have also integrated with Dun & Bradstreet, you can easily consolidate your suppliers. So rather than it looking like you’re purchasing items from three different subsidiaries, your purchases will align to the same parent company. This says they are using traditional categorizations, rules, and machine learning on the back end to build one integrated cube with summary reports, and all the LLM has to do is create an English summary, to which you can attach the supporting system-generated reports.
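A toy version of that federation step — rule-based classification to a common category code plus subsidiary-to-parent rollup — might look like the following. The keyword rules, supplier names, and the UNSPSC commodity code are illustrative, not SAP’s or D&B’s actual mappings:

```python
# Federate messy purchase records: classify free-text item names to a
# common UNSPSC category code and roll subsidiaries up to one parent
# company, so spend can be summed per (parent, category).
from collections import defaultdict

# Keyword -> UNSPSC commodity code (illustrative; 43211503 ~ notebook computers).
CATEGORY_RULES = {"xps": "43211503", "macbook": "43211503", "thinkpad": "43211503"}
# Subsidiary -> parent company (the D&B-style rollup, hard-coded here).
PARENT_COMPANY = {"Dell EMEA GmbH": "Dell", "Dell US LLC": "Dell",
                  "Apple Retail": "Apple"}

def classify(description: str) -> str:
    desc = description.lower()
    for keyword, code in CATEGORY_RULES.items():
        if keyword in desc:
            return code
    return "UNCLASSIFIED"

def federate(records: list[dict]) -> dict:
    """Sum spend by (parent supplier, UNSPSC category)."""
    totals: dict = defaultdict(float)
    for r in records:
        parent = PARENT_COMPANY.get(r["supplier"], r["supplier"])
        totals[(parent, classify(r["item"]))] += r["amount"]
    return dict(totals)

records = [
    {"supplier": "Dell EMEA GmbH", "item": "Dell XPS 15", "amount": 1800.0},
    {"supplier": "Dell US LLC", "item": "XPS 15 9530", "amount": 1750.0},
    {"supplier": "Apple Retail", "item": "MacBook Pro 16", "amount": 2400.0},
]
print(federate(records))
# {('Dell', '43211503'): 3550.0, ('Apple', '43211503'): 2400.0}
```

A production system would replace the keyword dictionary with trained ML classifiers and the hard-coded rollup with a live entity-resolution feed, but the shape of the computation — deterministic normalization feeding one cube — is the same.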

Moreover, this also says that if you need to source 500 laptops and 500 [external] keyboards with the goal of cut[ting] current costs from what you’ve been paying by 15%, it can automatically identify the target prices; identify the suppliers/distributors who have been giving you the best prices; automatically run predictive analytics to estimate the quotes you would get from awarding all of the business to one supplier (who would then be inclined to give better price breaks); and, if none of those looked like they’d generate the reduction, access its anonymized community data, identify other suppliers/distributors supplying the same laptops you typically buy, compute their average price reduction over the past three months, and identify those that should be invited to an RFX or Auction to increase competition and the chances of you achieving the target price reduction, while informing you of the price reduction it predicts (which might only be 10%, or 5%, if you are already getting better than average market pricing). And it will do all of this with a few clicks. You’ll simply tell the system what your demand is and what your goal is; all of these computations will be run, supplier and event (type) recommendations generated, and it will be one click to kick off the sourcing event.
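The first part of that arithmetic — deriving the target price and checking whether any incumbent’s historical pricing could plausibly reach it — is simple enough to sketch. All prices and supplier names below are illustrative, and the predictive pieces of a real system would be ML models rather than this min/average logic:

```python
# Given historical unit prices per supplier and a cost-reduction goal,
# compute the target price and flag which incumbent suppliers' best
# historical price could plausibly reach it. Illustrative data only.

def sourcing_targets(history: dict, goal_pct: float) -> dict:
    """history: supplier -> list of historical unit prices paid."""
    all_prices = [p for prices in history.values() for p in prices]
    current_avg = sum(all_prices) / len(all_prices)
    target = current_avg * (1 - goal_pct)
    # Incumbents whose best historical price already meets the target.
    candidates = {s: min(prices) for s, prices in history.items()
                  if min(prices) <= target}
    return {"current_avg": round(current_avg, 2),
            "target_price": round(target, 2),
            "candidates": candidates}

history = {
    "Supplier A": [1800.0, 1750.0, 1780.0],
    "Supplier B": [1500.0, 1480.0],
    "Supplier C": [1900.0],
}
print(sourcing_targets(history, 0.15))
# {'current_avg': 1701.67, 'target_price': 1446.42, 'candidates': {}}
```

In this example no incumbent’s best price reaches the 15% target, which is exactly the case where the system described above would turn to community data and recommend inviting outside suppliers to a competitive event.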

Moreover, the webinar was also right when it said that, if you think about this area around workflow and process orchestration, there’s no reason why you can’t take pieces of that, like the endpoints around intake or invoices or whatever, use AI there, and bake it into your processes in a controlled way. Because that’s the key: taking one tactically oriented process that consumes too much manual intervention at a time and using advanced tech (which need not be AI, by the way; modern Adaptive RPA [ARPA] is often more than enough) to improve it. Then, over time, stringing these together to automate more complex processes, where you can gate them to ensure exceptional situations aren’t automated without oversight. One little win at a time. And after a year it cumulatively adds up to one big win. (Versus going for a big-bang project, which always ends in a big bang that blows a hole in your operation that you might not be able to recover from.)

The only bad part of this webinar was slide 24, Spend Matters recommendation #1: “Aggressively Implement GenAI”!

Given that Gen-AI is typically interpreted as “LLM”, as per above, this is the last AI tech you should aggressively implement given its unreliability for anything but natural language translation and search and summarization. Moreover, any tech that is highly dynamic and emerging should be implemented with care.

What the recommendation should be is: aggressively implement AI, because now that we have the computational power and data that we didn’t have two decades (or so) ago, which was the last time AI was really hot, tried-and-true (dependable) machine learning and AI are now practical and powerful!

Now, in his LinkedIn post, Pierre asked what we’d like to see next in terms of research/coverage (regardless of venue). So I’m going to answer that:

Gen-AI LLM-Free AI Transformation!

Because you don’t need LLMs to achieve all of the value we need out of AI in ProcureTech and, to be honest, any back office tech. As I have been saying recently, everything I penned in the classic Spend Matters series on AI in Procurement (Sourcing, Supplier Management) today, tomorrow, and the day after in the last decade … including the day after, was possible when I penned the series. It just wasn’t a reality because there were few AI experts in our space, data was lacking, and the blood, sweat, and tears required to make it happen were significant. We didn’t have readily available stacks, frameworks, and models for the machine learning, predictive analytics, and semantic processing required to make it happen. Vendors would have had to build the majority of this themselves, which would have been as much (or more) work than building their core offering. But it was possible. And with all the modern tech at our disposal, now it’s not only possible, but doable. There is zero need to embed an untested, unreliable LLM in an end-user product to provide all of the advantages AI can offer. (Or, if you don’t have the time to master traditional semantic tech for NLP, zero need to use an LLM for anything more than NLP.)

So, I’d like to see this architecture and explanation of how providers can roll out safe AI and how buying organizations can use it without fear of being another failure or economic disaster when it screws up, goes rogue, and orders 100,000 units of the wrong product!

Tomorrow Doesn’t Matter In Procurement. Only Today.

Stop racing towards a future that won’t happen, or running away from one you don’t believe in. It doesn’t matter. As per our prior posts this week, the doctor has been reading future-of-Procurement white papers for 20 years now, all of which have promised us radical change. This means they should have started to come true 10 years ago. Not one did. Not ONE! The reality is that we can’t predict the future, and trying to do so just wastes time and effort. However, we can be vigilant about where things are today, learn the tools and techniques that can make us much more efficient in our job, identify those vendors who offer the tools backed by the right technology to enable us to be more effective, acquire and use those tools, and become at least five times more efficient in our job than the average Procurement employee.

For those who tuned out for a while, this is more-or-less Part 5 of the series we have been running this week inspired by the recent white paper by Jonathan O’Brien of Positive Purchasing and Guy Strafford of OneSupplyPlanet on the Functional ExtAInction Battle where the authors claim that AI might just lead to the extinction of Procurement as a business function. To get to the punchline, it won’t, but the non-stop bullcr@p AI Hype might! (Given how many C-Suites are blinded by the hype that is generated 24/7/365 by the A.S.S.H.O.L.E.)

In that series we told you that, despite a few false assumptions, the authors still got to the right answer, more or less. The conclusion that the only Procurement organizations that are going to survive are those that manage to automate and mostly eliminate the tactical, double down on the strategic, and find new value to bring to the business is the correct one. Moreover, those are the Procurement departments that will be rewarded and maintain more headcount than their peers because, after the massive losses from AI failures and the forthcoming AI market crash, the C-Suites who lead their businesses to survival will be those that realize the value of best-in-class Procurement People and invest in them.

However, that doesn’t mean the training budgets that disappeared two and a half decades ago are coming back. They aren’t. Since the C-Suites are still hoping for the day they can fire you, they won’t invest in you, which means that you need to get there on your own. It also means you need to start now. Start learning, start studying, start identifying very cost-effective tools that can be put on a P-Card and that will significantly improve a function and return value the quarter the tool is acquired (and before you get the third degree about that unexpected P-Card purchase). Real technological progress, with or without AI, comes from one little win at a time — for each task you do, identify the most time-consuming tactical part of that task and automate it. Start with the tasks you do the most and continue until you’ve taken 80% out of all of the most time-consuming tactically oriented tasks you do on a monthly basis. When you reach that point you will find that you have not only digitized, but revolutionized, your function and reached the point where you have flipped the tables and are spending 80% of your time on strategic decision making and relationship building and only 20% on tactically oriented tasks — a percentage that will decrease over time as you improve the tools and end-to-end automation across functions.

Furthermore, no super powers are required. Just intelligence, the willingness to study late, get up early, roll up the sleeves and work hard until you sweat through your tears. Like all real progress, it’s hard at first, but it will pay off later — when you still have a job and are delivering above peers while only working reasonable hours.

Moreover, you won’t need deep software (or even system) architecture skills either. Just the ability to define what a process should be, how a tool should support it, and find that tool. You need to be a solution architect — leave the technical and system architecture skills to the experts. If the tool they are selling gets it right at low cost with low compute and high reliability, the architecture is probably such that you wouldn’t do any better.

And whatever you do, don’t waste time playing the paradigm game. Leave that to the influencers, who won’t last near as long as they think they will. Or to the consultants, who will be walked out the door and never invited back once the C-Suite realizes they flushed millions down the drain chasing an AI utopia that doesn’t exist. Just consistently get results, push those results in front of the C-Suite, and tell them that they can call it whatever they want, but Procurement is the function — and sometimes the ONLY function — that gets results.