Category Archives: Best Practices

With Great Data Comes Great Opportunity!

In fact, it can quadruple your ROI from a major suite.

Not long ago, Stephany Lapierre posted that your team may only be realizing <50% of the ROI from your Ariba or Coupa investment, to which, of course, my response was:

50% of value on average? WOW!

Let’s break some things down.

A suite will typically cost 4X a leaner mid-market offering, which is often enough even for an enterprise just starting its Best-in-Class journey (a journey that will take at least 8 years, as per Hackett Group research from the 2000s).

Moreover, even if the enterprise can make full use of the suite it buys for 4X, at least 80% of the “opportunity” comes from just having a good process, technology, baseline capability and automation behind it. That says you’re paying 4X to squeeze an additional 20% worth of opportunity in the best case.

On average, it takes 2 to 3 years to implement a suite (on a 3 to 5 year deal). So maybe you’re seeing an average of 66% functionality over the contract duration.

As Stephany pointed out, bad data leads to

  • increased supplier discovery and management times
  • invoice processing delays and errors
  • increased risk and decreased performance insight

As well as

  • an inability to take advantage of advanced (spend) analytics
  • an inability to build detailed optimization models
  • decreased accuracy in cost modelling and market prediction

This is even more problematic! Why? These are the only technologies found to deliver year-over-year 10%+ savings! (This is where the extra value a suite can offer comes from, but only with good data. Otherwise, at most half of the opportunity will be realized.)

Thus, one can argue an average organization is only getting 66% of 25% of 80% of its investment against peers (based on two-thirds functionality over the contract duration, the 4X suite cost, and the baseline savings available from a basic mid-market application that instills good process and cost intelligence), plus 50% of 20% (as it can take advantage of at most half of the advanced functionality offered by the suite due to poor and incomplete data). In other words, at the end of the day, we’d argue an average company is only realizing 23% of the potential value from an opportunity perspective!
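The arithmetic behind that 23% figure can be checked with a quick back-of-the-envelope calculation (all percentages are the rough estimates above, not measured figures):

```python
# Back-of-the-envelope check of the "23% of potential value" claim.
# All figures are the article's rough estimates, not measured data.

functionality_realized = 0.66  # ~2/3 of functionality over the contract duration
suite_cost_multiple = 4        # a suite costs ~4X a leaner mid-market offering
baseline_share = 0.80          # opportunity from good process / baseline capability
advanced_share = 0.20          # extra opportunity from advanced suite features
advanced_realized = 0.50       # at most half is usable with poor, incomplete data

# Value per dollar spent, relative to a peer paying 1/4 for a mid-market tool:
base_value = functionality_realized * (1 / suite_cost_multiple) * baseline_share
advanced_value = advanced_realized * advanced_share
total = base_value + advanced_value

print(f"{total:.0%}")  # → 23%
```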

However, as one should rightly point out, the true value of a suite is not the value you get on the base, it’s the ROI on that extra spend that allows for 20% more opportunity than a customer can get from lesser peer ProcureTech solutions.

For example, let’s say you are a company with 1B of spend with a 100M opportunity.

If tackling 20M of that opportunity requires advanced analytics, optimization, and extensive end-to-end data, it’s likely that you’ll never see that with an average mid-market solution with limited analytics, no optimization, and only baseline transactional data. If the company paid an extra 1.5M over 3 years for this enhanced functionality, then the ROI on that is 13X, which is definitely worth it.
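That ROI figure follows directly from the illustrative numbers above:

```python
# ROI on the incremental suite spend, using the article's illustrative numbers.

spend = 1_000_000_000              # 1B of spend
opportunity = 100_000_000          # 100M total opportunity
advanced_opportunity = 20_000_000  # the 20M requiring advanced analytics/optimization
extra_cost = 1_500_000             # the extra 1.5M paid over 3 years for that capability

roi = advanced_opportunity / extra_cost
print(f"{roi:.1f}X")  # → 13.3X
```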

Moreover, if the suite supports the creation of enhanced automations, you could get more throughput per employee and realize the base 80M with half or one quarter of the workforce, which would lead to a lowering of the HR budget that more than covers the baseline cost.

However, ALL of this requires great data, advanced capability, and the in-house knowledge to use both. This is only the case in the market leaders. As a result, we’d argue that the majority of clients are only realizing about 25% of the suite’s potential — when sometimes the only thing standing in their way of realizing the rest is good data.

The Pundits Agree. Winter is Coming!

In a recent LinkedIn Post, THE PROPHET, a year late to the party (see the SI and Procurement Insights archives, and our Marketplace Madness post in particular), finally announced that Winter is Coming: The Great Procurement (and Broader) Legacy SaaS Rationalization and that it is going to be a very cold winter that will be Swift. Brutal. And very, very final.

There are too many companies that took too much funding at too high a valuation with nothing to show for it. PE firms will be dropping these companies faster than a hot potato into boiling lava pits to focus on the companies in their portfolio with current year-over-year growth, in hopes of making some of their losses back. Too many companies started without doing any research and literally built the 10th AP clone, SXM clone, RFX clone, etc. of a dozen already existing solutions. (Just check the mega map if you don’t believe me.) In a race to the bottom (which helps no one), they’ll lose due to their lack of a bank account as they slash prices too deep in hopes of winning customers. Too many applications took a silo focus and weren’t built Open-API-centric, so the hurdles of plugging them in will be too much or too costly, and no matter how good the tech is, they won’t get bought.

And then Agentic AI will thin what remains, though it will not lead the cull as THE PROPHET predicts, because these AI solutions still cost money, and sometimes a lot of it. For what they cost, you can hire a real expert and not a fake one, license a few augmented intelligence tools for a quarter of what these overhyped Agentic AI platforms are charging (because they raised, and wasted, too much cash and have to recoup it before they become the next hot potato dropped into the lava pit by their PE investors), and have a super-human employee who does the work of an entire team, error free, with little to no risk of that skilled employee getting you sued as a result of a conversation, installing a back door for hackers on your systems, shutting down your systems for days, costing you 10X on a purchase due to a dot transcription error, or increasing your internal fraud [link].

And for some of these companies, it’s already too late to pivot. For others, there are actions they can take. THE PROPHET offered some:

  • risk over-cutting
    (if done smartly)
  • consumption-based models
    (which will be more attractive to some potential customers)
  • challenge the team to earn their existence
    (but that doesn’t necessarily mean prompting GPT like a pro: you don’t want junkies)
  • redefine the sales org
    (a better playbook is key, and it needs to be differentiated)
  • Skip the Fairy Dust and Buzzwords
    (Hear! Hear! I’ve been shouting that for years! Unfortunately, I’m not sure most of the companies out there know how to do it! I know for a fact only the squirrels have been listening to me, and they are getting very tired of that rant. They like variety. Basically, it’s been a long, long time since marketing focussed on education and value, and since startups priced based on that.)

And that’s just the tip of the iceberg. SI has posted entire series on these topics.

Failures from those who raised too much and offer too little will be coming fast and furious. This needs to be repeated: you need to be very careful which vendor you buy from and what protections you have in place if they don’t make it. In particular, there must be a “we own our data” clause that gives you the right to all of your data and the right to export all of it into a standard file format at any time, and that specifies your data includes rules and workflow configurations, with the right to export those too. For example, in spend analysis, it’s not just the data, it’s the rules that create the cubes; in invoice processing, it’s the workflow and approval rules. You won’t be able to migrate to another system quickly if all you have is the transaction data; the data that defines the processes and rules is just as important. And if you can’t export all of your data, rules, templates, etc. at any time, then don’t buy the system!

However, while the app consolidation will be brutal, as will the renegotiations if you want to be one of the apps that make it (now that organizations are realizing they don’t need 17 apps for S&M and probably shouldn’t have 17 apps in any function, including Procurement), Agentic AI (especially at 20K a month, when you can hire a REAL person for 1/4 of that) will not replace people en masse; AI-enabled technology will. Teams will be cut and replaced by two individuals who can use next-generation augmented intelligence solutions that truncate months of research and analysis to a few days, allow strategic decisions to be made in hours, not weeks, and let shifts be made seemingly overnight, while eventually allowing 99% of all tactical data processing to be automated through evolving rules and workflows under expert guidance.

Moreover, at the end of the day, relationships are not built on 1s and 0s, and they are needed now more than ever. So not only will we have skilled technologists, but also skilled relationship managers. (While everyone else who does nothing but push e-paper 90% of the time or code spreadsheets will slowly be eliminated.) Of course, this means if you don’t understand optimization, analytics, statistics, game theory, economics, and logic, or you’re not an expert in relationship management, you’re screwed, but everyone had a chance to study STEM in University (and skip the woke liberal arts) and learn the technical skills for the first set of jobs.

So this also means if the platform is not enabling this next generation of employees to become more and more productive over time, its lifespan is probably short.

So focus your diligence efforts on solution acquisition if you don’t want your platforms disappearing out from under your virtual feet, and if you need help, call an expert!

It’s Not Just Public Procurement Offices That Should Avoid Tech Fads

A recent article over on State Tech Magazine boldly stated that State Procurement Offices Should Carefully Avoid Tech Fads. And the headline, and its author, were right. But it’s not just public procurement offices that should be avoiding tech fads; private sector offices should be avoiding them too. More on this later.

The author noted that Artificial Intelligence is everywhere these days, and that news, advertising [and marketing] may leave you feeling [more than a bit] pressured to join the crowd and be an early adopter. But, as the article points out, and as THE REVELATOR would also be quick to point out, successful IT procurement involves engaging with a comprehensive list of stakeholders, conducting thorough research, and planning implementation carefully, and, as THE REVELATOR reminds us on a regular basis, understanding what you need in the first place.

As the author notes, emerging technologies often present unforeseen challenges and novel issues that procurement offices must be aware of and prepared for. Failure to do due diligence can lead to embarrassing or costly results. Not only are failure rates with advanced tech projects extremely high, exceeding 85% according to Gartner, but, as the author points out, in the public sector technology breakdowns can have much more consequential impacts. The Air Canada lawsuit is just the first example of what is to come from the inevitable failures of AI not ready for prime time.

It’s not about the hype, it’s about the value the solution will provide, which includes, as the author notes, the total cost of ownership and longevity. The solution must fulfill the organizational need, not the hype. Otherwise, the total cost of ownership is high as no value is delivered while the longevity will be very limited.

But this should be just as true in the private sector. After all, a solution that could get you sued if it fails, that doesn’t solve the problem, that is worthless from the minute it is implemented, and that will paralyze you until a replacement is found and implemented is not something you should ever, ever want in private industry either. So don’t fall for the hype, and stay on the course that’s right — real solutions that solve real problems.

Data Governance is Essential to Good Data Management …

… so why is there still so little of it in most organizations?

Good data is becoming ever more essential to business and Procurement success, especially if you want to use any sort of predictive analytics or AI, yet most organizations have very little data governance, if they have any at all. With good data, you can get great insight into current operations, opportunities, and ordeals. Without good data, you have no clue what you’re buying or selling, what processes are going on at any point in time, or what problems are festering, about to explode and cause major issues.

But good data is a rarity in most organizations, getting rarer by the day due to rapidly increasing data volumes (in excess of 400 million terabytes of data being generated daily across the globe), lack of controls in legacy systems, poor data processes, and a lack of good IT talent with enough history to know what the data is, what it’s used for, and how to qualify it as good or bad.

Why? Because organizations are putting systems in place before understanding what data those systems will need, where it will live, how it will be validated, how it will be maintained, how it will be archived, and how it will eventually be retired.

In most organizations, when they need data for an analytics-based project, the current answer is to get a “data warehouse”, “data lake”, or “data lakehouse”; dump all the organizational data to that warehouse, lake, or lakehouse; possibly run a simple AI-cleansing/enrichment algorithm, and hope for the best. However, this is not governance, and, in fact, exacerbates the problem more than it solves it. Now there are two copies of bad data, no strategy for pushing back any data that is cleansed, and if the data is changed in the source system before any eventual synch with the data warehouse, which data is correct? Chances are neither record is fully accurate, and any synch has to be done at the field level, if you have enough data to validate which field is correct (as you can’t just use time stamps, because if some data was updated by AI and unvalidated, it may not be right).
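To make the field-level problem concrete, here is a minimal, hypothetical sketch (record layout and validation flags are invented for illustration) of why “latest timestamp wins” is unsafe once an unvalidated AI pass has touched the lake copy:

```python
# A minimal sketch of field-level reconciliation between a source system and a
# data lake copy. The record layout and validation flags are hypothetical; the
# point is that a naive "latest timestamp wins" rule is unsafe when some
# updates were made by an unvalidated AI-cleansing pass.

def reconcile_field(source, lake):
    """Pick the better value for one field, preferring human-validated data."""
    # A validated value beats an unvalidated one, regardless of timestamp.
    if source["validated"] and not lake["validated"]:
        return source["value"]
    if lake["validated"] and not source["validated"]:
        return lake["value"]
    # Both validated (or both not): only now is "latest wins" defensible,
    # and unvalidated winners should still be queued for manual review.
    return max(source, lake, key=lambda r: r["updated_at"])["value"]

src = {"value": "ACME Corp",  "validated": True,  "updated_at": 1}
lk  = {"value": "ACME Corp.", "validated": False, "updated_at": 2}  # AI-enriched
print(reconcile_field(src, lk))  # → ACME Corp
```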

Governance is not just maintaining data in systems as you use it, occasionally validating it against third party databases or by manual review, and occasionally enriching it.

Governance is

  • defining what data the organization needs for its various functions
  • defining what data will be collected
  • defining what systems it will be maintained in, and, if the data is in multiple systems, which system is master
  • defining which data fields are critical and how they will be validated
  • defining when and how critical fields will be revalidated
  • defining the process for any data migration from master systems

And doing all of it BEFORE

  • collecting the data
  • installing a new system
  • starting an analytics / AI project

NOT AFTER!
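As a hypothetical sketch (all field names and the example record are illustrative, not from any particular system), those definitions can even be captured in a machine-readable policy before the first system goes live:

```python
# A hypothetical sketch of recording governance decisions up front.
# All field names and the example record are illustrative only.

from dataclasses import dataclass

@dataclass
class DataElementPolicy:
    name: str                  # what data the organization needs
    collected_from: str        # how/where it will be collected
    systems: list              # which systems it will be maintained in
    master_system: str         # which system is master if it lives in several
    critical: bool             # is this a critical field?
    validation: str            # how critical fields will be validated
    revalidation_cadence: str  # when/how they will be revalidated
    migration_process: str = "export from master, transform, verify, load"

supplier_name = DataElementPolicy(
    name="supplier_legal_name",
    collected_from="supplier onboarding portal",
    systems=["SXM", "ERP", "AP"],
    master_system="SXM",
    critical=True,
    validation="match against the company registry",
    revalidation_cadence="annually, and on any M&A event",
)

# A policy is only coherent if the master is one of the maintaining systems.
assert supplier_name.master_system in supplier_name.systems
```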

But how many organizations do that? Most don’t even do a proper RFP (taken in by the FREE RFP scam), even though the route to good software (which is critical to maintaining good data) is an Affordable RFP.

Moreover, part of the RFP for any software solution should define the data management strategy as it impacts, and is impacted by, the solution.

Why Aren’t You Realizing the Full Value of Your Sourcing Efforts?

It’s been a well-known statistic, going all the way back to 2009, that at least 30% (and often 40%) of identified value in a sourcing event is never realized. That’s when Mickey North Rizza of AMR Research (acquired by Gartner in 2010 in an acquisition game of the 64,000,000 pyramid) published her classic 3-part series on Reaching Sourcing Excellence, with Part 1 titled How to Keep 30 Cents of Every Dollar Spent. The reality is that many leading organizations adopted strategic sourcing quickly during its first heyday in the mid-2000s, often before Procurement as a whole did, because of the huge savings opportunities identified by good reverse auction platforms (in markets where supply exceeded demand) and good sourcing optimization (regardless of market conditions), with sourcing optimization consistently identifying average savings of 12% (compared to reverse auctions, which saw significant drops every time they were applied to the same category). Yet most of these leaders who identified savings of 10% or more never saw half of the identified savings. This is because savings requires more than just identification and a signature on a contract: it requires execution!

Execution that, at a minimum, requires:

  • making sure you order on the contract
  • … on time to receive delivery on time using the preferred shipping method
  • making sure you receive defect-free goods that meet the spec before paying for them
  • making sure the amount you are billed is the amount as per the contract
  • … and that you are not billed for expediting fees or surcharges you DID NOT agree to
  • making sure you pay on time (to avoid penalties)
  • … and only ever pay for any good or service once (using an m-way match)
  • making sure you terminate or renegotiate before an evergreen renewal
  • … and that you have verified the supplier has all the certifications and insurances in place before placing an order or renewing the contract
  • … etc.
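The billing checks in that list are, at their core, an m-way match. A minimal three-way version (purchase order, goods receipt, invoice; the record shapes are hypothetical) might look like:

```python
# A minimal three-way match sketch: invoice vs. purchase order vs. goods
# receipt. Record shapes are illustrative; real systems match at the line-item
# level and also check for duplicate invoices and unagreed expediting fees.

def three_way_match(po, receipt, invoice, price_tolerance=0.0):
    """Return a list of billing issues; an empty list means the match passes."""
    issues = []
    if receipt["qty_accepted"] < invoice["qty_billed"]:
        issues.append("billed for more than was received defect-free")
    if invoice["unit_price"] > po["unit_price"] * (1 + price_tolerance):
        issues.append("billed above the contracted price")
    if invoice.get("surcharges", 0) > 0:
        issues.append("surcharges not agreed to in the PO")
    return issues

po      = {"unit_price": 10.00}
receipt = {"qty_accepted": 95}  # 5 units rejected as defective
invoice = {"qty_billed": 100, "unit_price": 10.50, "surcharges": 250}

for issue in three_way_match(po, receipt, invoice):
    print(issue)
```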

It comes back to the concept of the perfect order which must be

  • on time,
  • complete,
  • damage free,
  • correctly documented,
  • correctly billed, and
  • adherent to all contract terms

This is not easy to do unless you

  • have a good procurement system
  • have a (carrier that has a) good WIMS (Warehousing and Inventory Management) system

and, the part that most people miss,

  • have a good contract lifecycle management system that manages the contract execution post signing

And when you look at the majority of contract management systems, they tend to fall into three categories:

  • a glorified e-filing cabinet / document repository where you can store your contracts and search their metadata (and literally no better than what a high school student with Microsoft Access and minimal coding skills could build 20 years ago)
  • a contract creation system that will allow you to quickly draft contracts using:
    • contract templates, from your, or their, legal department, tagged by region and category they can be used for,
    • clause libraries and templates, possibly with multiple version support based on territories and categories, or, today
    • Gen-AI drafting of templates through specification of category, region, requirements, and risks that must be covered as well as e-versions of all previously signed contracts in the category, region, business requirement, or risk categories (which then need to be mildly to moderately edited by a Legal expert)
  • a signatory platform with negotiation support (version control, dynamic redlining, audit trails, etc.)

Which is all fine and dandy, and, well implemented, can make your Legal and Sourcing teams considerably more productive during the negotiation process, but it does diddly squat when it comes to actually helping you manage the contract execution. Now, you might think that you can do that in the Supplier Management system, because you’re ultimately managing a supplier, or the Risk Management system, because you’re ultimately managing a risk, and you can, to a point, specifically the point at which those systems allow you to define contract tasks. But none of these are set up to let you holistically manage a contract (contract 360, if you will). This is especially true if you have a master contract with a number of sub-contracts, and those sub-contracts have sub-contracts as well. That will be the case if you are buying off a contract tied to a GPO master contract or a holding company master contract (if your company is part of a group of companies), or in the construction / engineering / shipbuilding industries, where your main supplier will need to subcontract to a number of smaller suppliers for custom parts or services and your organization needs to manage that for regulatory or risk reasons.

In other words, the only contract lifecycle management solution that is truly valuable to Procurement is the solution that allows the contract to be managed from post signature to termination, helping the organization ensure all of the obligations are met and rights are received.
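As a rough illustration of what “post signature to termination” management entails (the data model is entirely hypothetical), the core is tracking obligations against due dates across the whole contract tree, sub-contracts included:

```python
# A hypothetical minimal model of post-signature contract management: each
# contract carries obligations with due dates, so nothing lapses silently,
# including obligations inherited from sub-contracts under a master agreement.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class Obligation:
    description: str
    due: date
    met: bool = False

@dataclass
class Contract:
    name: str
    obligations: list = field(default_factory=list)
    subcontracts: list = field(default_factory=list)  # contract 360: nested agreements

    def overdue(self, today):
        """All unmet obligations past due, across this contract and its sub-contracts."""
        items = [o for o in self.obligations if not o.met and o.due < today]
        for sub in self.subcontracts:
            items.extend(sub.overdue(today))
        return items

master = Contract("GPO master", [Obligation("verify supplier insurance", date(2025, 1, 1))])
master.subcontracts.append(
    Contract("custom parts sub", [Obligation("quarterly quality audit", date(2025, 3, 1))])
)
print(len(master.overdue(date(2025, 6, 1))))  # → 2
```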