Category Archives: Technology

To Manage Innovation, Governments Must Fix Procurement … And Take Care Where AI is Concerned!

A recent article on Civil Service World noted two things that attracted my attention:

  1. To manage innovation, governments must fix procurement
  2. Too often, contracts in AI do not give governments powers to investigate algorithms or the data they are trained on. As a result, they risk taking the blame when things go wrong without the means to find out why.

Public Procurement is expensive. Very expensive. It represents 12% of the annual GDP of an average developed economy, which is a huge amount of spend. Overspend in most departments of most jurisdictions is likely as bad as in the private sector, which, depending on the category, means 4% to 6% at a minimum (based on the results high-performing organizations see when implementing best-in-class processes and technology). That puts the waste at a minimum of roughly 1/2% of GDP annually, and given that most public sector projects exceed their initial budgets and timelines, we’d bet the overspend is double that: at least 1% of annual GDP. That’s a lot of waste: 770 Billion across the top 10 economies. Furthermore, that assumes all of the spend is necessary and well planned. (There are likely considerably more savings to be had through better demand planning, more operational efficiency, better project planning, etc. We’re just stating that the savings on committed spend alone is likely 10%.)
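To make the arithmetic behind that claim concrete, here is a back-of-the-envelope sketch. The combined top-10 GDP figure (~US$77 trillion) is an assumption used purely for illustration; the 12% procurement share and 4% to 6% overspend range come from the discussion above.

```python
# Back-of-the-envelope estimate of annual public procurement waste.
# All inputs are illustrative assumptions, not audited statistics.

TOP10_GDP_USD = 77e12          # assumed combined GDP of the top 10 economies (~US$77T)
PROCUREMENT_SHARE = 0.12       # public procurement as a share of GDP
OVERSPEND_LOW, OVERSPEND_HIGH = 0.04, 0.06  # typical category overspend range

# Overspend expressed as a share of total GDP
gdp_waste_low = PROCUREMENT_SHARE * OVERSPEND_LOW    # ~0.5% of GDP
gdp_waste_high = PROCUREMENT_SHARE * OVERSPEND_HIGH  # ~0.7% of GDP

# Doubling the low estimate (for budget/timeline overruns) gives ~1% of GDP
waste_usd = TOP10_GDP_USD * 0.01
print(f"{gdp_waste_low:.1%} to {gdp_waste_high:.1%} of GDP")
print(f"~${waste_usd / 1e9:,.0f}B across the top 10 economies")
```

Even under these conservative assumptions, the waste lands in the hundreds of billions annually.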

The article notes that despite the strategic importance of Procurement, it’s rarely seen as a priority. It’s more often treated as a standardized compliance function rather than a tool for strategic investment and, in some cases, has become synonymous with absurdity: an accumulation of rules so complex that even those administering them cannot interpret them creates the perverse incentive to do the least risky thing to avoid individual liability. As a result, governments end up buying obsolete technologies that make them vulnerable, because innovation evolves so rapidly, which forces them to buy more. The cycle repeats, budgets balloon, and public capabilities diminish.

And, unfortunately, public procurement is a brick-and-mortar process, still more suited to bulk-buying precisely describable goods, accounting for them, and moving on to the next purchase. Innovation is different: you do not know today what is going to be possible tomorrow, even when you are the one inventing the tech. While governments work in one-off projects, innovation is made of ever-changing, always-fleeting products.

Furthermore, those in charge of procuring these technologies are not technologists. Public procurement is professionalized in only 38% of OECD countries, so even if officials had the incentive to experiment, they would not have the expertise.

To combat this, the authors of the article propose that Procurement systems should be like good software: fluid, flexible, and constantly evolving. However, as they note, this will take more than changing rules. It will take talent who are experts in what they are buying. It will take the treatment of Procurement as a strategic function, with clear lines of advancement for all personnel (as studies have shown that even a marginal improvement in skill can yield significant reductions in costs, times, and contracting complexity). And it will take a federated data environment to make use of modern technology. (Especially if they want to use AI.)

This is just the start of what is necessary. There needs to be regular training. There needs to be specialization for different types of functions and purposes. There needs to be a rewrite of rules to focus on the right outcomes, not just a plethora of rules designed to prevent previously undesirable outcomes. There need to be clear paths from buyer to public organization CPO to department head, not just paths of advancement within the Procurement function. There needs to be a focus on what’s best for the public being served, not what best minimizes the risk to the buyer. And there needs to be a willingness to accept that there may be a few mistakes made here and there as new buyers learn the ropes, coupled with a willingness to weed out anyone who deliberately “makes a mistake” in order to give a contract to a supplier who is not the best fit (and does so in exchange for a kickback).

But most importantly, if they acquire AI technology, they also need to acquire the right to investigate the algorithms being used, the data they are trained on, and the results of prior training, as well as the right to inspect any changes to the algorithms, data, and training. Otherwise, you can never trust any AI technology you might want to acquire.

Because governments need to apply the most appropriate AI-enhanced technologies even more than the private sector does, but are the least likely to be able to use them properly.

Data Governance is Essential to Good Data Management …

… so why is there still so little of it in most organizations?

Good data is becoming ever more essential to business and Procurement success, especially if you want to use any sort of predictive analytics or AI, yet so few organizations have any meaningful data governance, if they have any governance at all. With good data, you can get great insight into current operations, opportunities, and ordeals. Without good data, you have no clue what you’re buying or selling, what processes are going on at any point in time, or what problems are festering, about to explode and cause major issues.

But good data is a rarity in most organizations, getting rarer by the day due to rapidly increasing data volumes (in excess of 400 million terabytes of data being generated daily across the globe), lack of controls in legacy systems, poor data processes, and lack of good IT talent with enough history to know what the data is, what it’s used for, and how to qualify it as good, or bad.

Why? Because organizations are putting systems in place before understanding what data those systems will need, where it will live, how it will be validated, how it will be maintained, how it will be archived, and how it will eventually be retired.

In most organizations, when they need data for an analytics-based project, the current answer is to get a “data warehouse”, “data lake”, or “data lakehouse”; dump all the organizational data into that warehouse, lake, or lakehouse; possibly run a simple AI-cleansing/enrichment algorithm; and hope for the best. However, this is not governance and, in fact, exacerbates the problem more than it solves it. Now there are two copies of bad data, no strategy for pushing any cleansed data back, and if the data is changed in the source system before any eventual sync with the data warehouse, which copy is correct? Chances are neither record is fully accurate, and any sync has to be done at the field level, if you even have enough data to validate which field is correct (you can’t just use timestamps, because if some data was updated by AI and never validated, it may not be right).
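A minimal sketch of what that field-level reconciliation looks like in practice; the record shapes and the supplier-record fields are hypothetical. The key point it illustrates: a field is only auto-resolved when one copy has actually been validated, never on timestamps alone.

```python
# Field-level reconciliation between a source system and a warehouse copy,
# where "last write wins" is unsafe because the newer value may have been
# written by an unvalidated AI enrichment. Illustrative sketch only.

def reconcile(source: dict, warehouse: dict, validated_fields: set) -> dict:
    """Merge two copies of a record field by field.

    A field is auto-resolved only if one copy is explicitly marked validated;
    otherwise it is flagged for review instead of trusting a timestamp.
    """
    merged, conflicts = {}, []
    for field in source.keys() | warehouse.keys():
        s, w = source.get(field), warehouse.get(field)
        if s == w:
            merged[field] = s                       # both copies agree
        elif (field, "source") in validated_fields:
            merged[field] = s                       # source copy was validated
        elif (field, "warehouse") in validated_fields:
            merged[field] = w                       # warehouse copy was validated
        else:
            conflicts.append(field)                 # neither copy is trustworthy
            merged[field] = None
    return {"record": merged, "needs_review": conflicts}

result = reconcile(
    {"name": "Acme Corp", "duns": "123456789", "city": "Toronto"},
    {"name": "ACME Corporation", "duns": "123456789", "city": "Torono"},
    validated_fields={("city", "source")},
)
print(result["needs_review"])  # → ['name']
```

Note that the conflicting `name` field ends up in a human review queue, which is exactly the work the dump-and-hope approach silently skips.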

Governance is not just maintaining data in systems as you use it, occasionally validating it against third party databases or by manual review, and occasionally enriching it.

Governance is


  • defining what data the organization needs for its various functions
  • defining what data will be collected
  • defining what systems it will be maintained in, and, if the data is in multiple systems, which system is master
  • defining which data fields are critical and how they will be validated
  • defining when and how critical fields will be revalidated
  • defining the process for any data migration from master systems

And doing it BEFORE

  • collecting the data
  • installing a new system
  • starting an analytics / AI project

NOT AFTER!
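The governance decisions above can be captured as a declarative manifest that exists before any system is installed. The entity, systems, field names, and validation methods below are hypothetical examples, not a standard schema:

```python
# A governance manifest for one data entity, defined BEFORE collection begins.
# All names (systems, fields, validation methods) are illustrative assumptions.

SUPPLIER_MASTER_GOVERNANCE = {
    "function": "Procurement",               # which function needs this data
    "entity": "supplier",                    # what data will be collected
    "systems": ["ERP", "SRM", "P2P"],        # where it will be maintained
    "master_system": "ERP",                  # which system is master
    "critical_fields": {                     # critical fields + (re)validation
        "tax_id":       {"validate": "registry_lookup",    "revalidate_days": 365},
        "bank_account": {"validate": "penny_test",         "revalidate_days": 180},
        "address":      {"validate": "postal_service_api", "revalidate_days": 365},
    },
    "migration": {                           # process for migration from master
        "direction": "master_to_replicas",
        "conflict_rule": "master_wins",
    },
}
```

Whether this lives in code, YAML, or an MDM tool matters far less than the fact that every question in the list above has an answer on day one.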

But how many organizations do that? Most don’t even do a proper RFP (taken in by the FREE RFP scam), even though the solution to good software (which is critical to maintaining good data) is an Affordable RFP.

Moreover, part of the RFP for any software solution should define the data management strategy as it impacts, and is impacted by, the solution.

Who Needs The Beef?

For those of you who have been following my rants, especially on intake-to-orchestrate (which really is clueless for the popular kids, as it doesn’t do anything unless you already have all the systems you need and just don’t know how to connect them), you’ll know that one of my big qualms, to this day, is Where’s the Beef?, because while the intake and orchestrate buns are nice and fluffy and likely very tasty, they aren’t filling. If you want a full stomach, you need the beef (or at least a decent helping of tofu, which, unless you are vegetarian, won’t taste as good or be quite as filling, but will give you the sustenance you need).

And you need the filling. Specifically, you need the part of the application that does something: that takes the input data (possibly properly transformed), applies the complex algorithms, and produces the output you need for a transaction or to make a strategic decision. That’s not intake-to-orchestrate, that’s not a fancy UI/UX, that’s not an agent that can perform transactional tasks that fall within scope, and that’s NOT a fancy bun. It’s the beef.

But, apparently, at least as far as THE PROPHET is concerned, (bio) re-engineering is going to eliminate the need for the beef. Apparently, the buns are going to have all the nutrients (or data processing abilities) you need to function and do your job.

In THE PROPHET‘s latest analogy, today’s enterprise technology burger consists of:

  • the patty: (not to be mistaken for the paddy) which combines enterprise technology and labour (which means it really should be the patty [labour] and the trimmings [technology] in this analogy)
  • the upper bun and the lower bun: which collectively provide you a way to cleanly get a grip on the patty

But tomorrow’s enterprise technology burger will consist of:

  • the upper bun: which will be replaced by a new type of technology that fuses co-pilots and agentic systems to power autonomous agents and replaces the patty [labour] and part of trimmings
  • the lower bun: which will represent the next generation data store and information supply chain and build in “self-healing” technology for data maintenance and replace the other part of the trimmings

… and that’s it. NO BEEF! Just two co-dependent buns that are destined to fuse into a roll … and not a very tasty one at that. Because this roll will, apparently, operate fully autonomously and never get anywhere near you, leaving you perpetually hungry.

Now, apparently, not all parts of the patty (with its complex amino acid chains and protein structures) will be capable of being (bio) re-engineered into the buns right away and the patty won’t disappear all at once, just shrink bit by bit over the next decade until there’s nothing left and the last protein structure is absorbed (or replaced by a good enough AI-generated facsimile — they can do that now too). In THE PROPHET‘s view, legacy systems of record (ERP/MRP, payment platforms, etc.) will be the last to be replaced, and those will survive along with the legacy labour to maintain them until they can finally be split up into components and absorbed into the bun.

In other words, in THE PROPHET‘s view, you don’t need the patty, and, more specifically, you don’t need (or even want) the beef. I have to argue this is NOT the case.

1. You Need the Beef

Thinking that the patty can be completely absorbed into the buns is what results from a lack of understanding of enterprise software architecture best practices and software development in general.

The best architecture we have, which took years to get to, is MVC, which stands for

  • Model: specifically, data model, which should be at the bottom (and could be absorbed into a data bun)
  • View: specifically, the UI/UX we interact with (and could be absorbed into a soft, warm, sweet smelling sourdough bun)
  • Controller: the core algorithms and data processing, which needs to be its own layer that supports the UX (and allows the UX to reconfigure the processing steps and outputs as needed) and can be cross-adapted to the best available data sources (which need to remain independent)
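The separation is easiest to see in code. This is a toy spend-analysis sketch, not any framework’s API; all class and method names are invented for illustration. The Controller holds the algorithms (the beef), and neither the Model nor the View could absorb it without destroying the separation:

```python
# A minimal MVC sketch (toy spend example); names are illustrative only.

class SpendModel:                       # Model: the data layer (lower bun)
    def __init__(self, invoices):
        self.invoices = invoices        # e.g. [{"supplier": ..., "amount": ...}]

class SpendController:                  # Controller: the algorithms ("the beef")
    def __init__(self, model):
        self.model = model

    def total_by_supplier(self):
        """Aggregate invoice amounts per supplier."""
        totals = {}
        for inv in self.model.invoices:
            totals[inv["supplier"]] = totals.get(inv["supplier"], 0.0) + inv["amount"]
        return totals

class SpendView:                        # View: the UI/UX layer (upper bun)
    def render(self, totals):
        return "\n".join(f"{s}: ${t:,.2f}" for s, t in sorted(totals.items()))

model = SpendModel([
    {"supplier": "Acme", "amount": 100.0},
    {"supplier": "Acme", "amount": 50.0},
    {"supplier": "Beta", "amount": 75.0},
])
view = SpendView()
print(view.render(SpendController(model).total_by_supplier()))
# Acme: $150.00
# Beta: $75.00
```

Swap the data source and the Controller still works; swap the UI and the Controller still works. Push the aggregation into either bun and you lose exactly that independence.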

Moreover, even Bill Gates, who predicts AI will have devastating effects across all industries, realizes that you can’t replace coders, energy experts, and biologists, and, by extension, jobs that require constantly evolving code, organic structures, and energy expertise to complete. So you will still need labour that creates, and relies on, highly specialized algorithms and expert interpretations of outputs to do their jobs. That also means that, in our field, strategic sourcing and procurement professionals cannot be replaced, but tactical AP clerks are on their way out as AP software automatically processes 99% to 99.9% of invoices with no human involvement, even those with missing data and errors, handling the return, correction, negotiation, etc. until all of the data matches and costs are within tolerance.
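A hedged sketch of the kind of tolerance check such AP automation performs when matching an invoice against a purchase order; the 2% tolerance, field names, and routing labels are illustrative assumptions, not any vendor’s actual logic:

```python
# Simplified invoice-vs-PO tolerance matching; all values are illustrative.

def match_invoice(invoice: dict, po: dict, tolerance: float = 0.02):
    """Flag for a human only when PO, price, or quantity drift exceeds
    tolerance; otherwise auto-approve with no human involvement."""
    issues = []
    if invoice["po_number"] != po["po_number"]:
        issues.append("po_mismatch")
    qty_drift = abs(invoice["qty"] - po["qty"]) / po["qty"]
    price_drift = abs(invoice["unit_price"] - po["unit_price"]) / po["unit_price"]
    if qty_drift > tolerance:
        issues.append("quantity_out_of_tolerance")
    if price_drift > tolerance:
        issues.append("price_out_of_tolerance")
    return ("auto_approve", []) if not issues else ("route_to_human", issues)

status, issues = match_invoice(
    {"po_number": "PO-1001", "qty": 100, "unit_price": 10.05},
    {"po_number": "PO-1001", "qty": 100, "unit_price": 10.00},
)
print(status)  # → auto_approve (0.5% price drift is within the 2% tolerance)
```

This is exactly the well-defined, tactical processing machines excel at, and exactly the opposite of the novel strategic decisions discussed below.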

2. You Want the Beef!

The whole point of modern architectures and engineering is to minimize legacy code / technical debt and maximize tactical data processing and system throughput (and have the system do as much thunking as possible, which is what it’s good at). If you try to push too much into the lower bun, you don’t have separation of data and processing, which means it’s almost impossible to validate the data: it’s not raw data you’re getting, but processed data, which means the system might be continually pushing wrong data to the outer bun, even with good data fed in, due to a bug deep in the transformation and normalization code. But your automatic checks and fail-safes would never catch it, because you’ve turned what should be a crystal (clear) box into a black box! If you try to push too much processing into the upper bun, you have to replicate common functionality across every agent and application, leading to a lot of replication and bloat that consumes too much space, uses too much energy, and makes the systems even harder to maintain than the legacy applications of today.

So while the burger of tomorrow might be different, with a much leaner, more protein-rich patty (with less sauce and fewer unhealthy trimmings), and the bread might be a super healthy, natural, yeast-free multi-grain flatbread, making for a smaller (and possibly less appetizing) burger from a surface view, it still needs to be a burger, and anyone who thinks otherwise has joined the pretty fly Gen-AI in hallucination land!

You Say You Want Success, But Do You?

This post is inspired by THE REVELATOR‘s inquiry where he asked Do You Really Want a Successful ProcureTech Initiative?

For the vast majority of you, the answer is a clear and resounding “YES” (with the possible exception of those of you who have been treated badly by your employer and want to use your last official act to stick them with an application that will make them as miserable as you are, but as far as I can tell, you are a very small minority — you didn’t get into Procurement expecting it to be easy, or to be a way to make friends).

However, you are only one cog in the ecosystem. Let’s look at the other cogs:

Vendor: as long as you keep renewing the SaaS subscription, the C-Suite at the vendor doesn’t care if they sold you a Ferrari (at a Ferrari price tag) but delivered a 2004 Mazda RX-8 …

Analyst Firm: as long as the big research subscriptions keep rolling in from the big vendors (who always feature at the top / upper right / frontal wave of their maps), the analyst firm doesn’t care if you succeed or not, and will not only happily push the hype the vendors want pushed, but happily blame you for not doing your research and not selecting the appropriate technology when you and your counterparts take their advice en masse and then contribute to the all-time-high project failure rates of 88% (two and a half decades of project failure)

Implementor: not really, because if you don’t swap out the solution at renewal time, where is their future revenue going to come from???

Big X who pushed the platform: Hell No! … they need to sell you projects to find bolt-ons, do custom additions, and tweak the process for years, as they need to keep their bench empty! (And some of these shops have over 100K junior consultants they have to keep busy. Moreover, they don’t make money training them on AI; they make money deploying them as your external support force. Remember, many of these shops are effectively the new Manpower, except they have to pay their consultants on the bench, whereas job placement agencies just had to place people to keep their government grants or get their placement fee!)

And since YOU don’t take the time to do your research and figure this out (including the fact that the Big X pushed the worst fit solution from their stable on you to keep their Gold/Platinum/Sycophant status with the solution provider), that’s why YOU keep failing. Even if the salesperson honestly wanted to sell you a win (and many don’t, and the doctor can say that confidently with over 25 years in Enterprise Software and he’s sure THE REVELATOR has some stories to tell here), that’s far from a guarantee that a win will happen.

If you truly want success, YOU have to define your processes, define your problem, find the right vendor, make the vendor contractually responsible for implementation success (whether they do it or use a third party) with delayed payment (where you don’t pay for a module until it is working and passes predefined tests) and early termination clauses, identify the gaps, identify the right niche consultancy (who doesn’t have a stadium of junior consultants) to help you identify add ons and processes to fill them, and define early out clauses in case of non-delivery! You have to do all the work the vendors, analysts, and consultants claim they do for you … because they don’t (or at least don’t do it in your best interest). And while the good ones (which may take you a while to find) will help you, YOU still have to take the lead!

And the doctor knows you don’t always have the time to do it all, which is why he keeps pushing Project Assurance where you hire a niche specialist to help you, one who is not a part of the big COGs that need never-ending projects from you to stay solvent, and only cares about helping you get everything in order for success. (After all, there are so few of these experts it is literally a case of too many companies, too little time. These people or small niche consultancies don’t have to worry about running out of work, and by the time they made it through all the current companies they could handle, it would be time for their initial clients to upgrade to next generation systems anyway — and the only way they’d be available for a future project is to ensure client success with every client they take on.)

As we indicated in our last two rants, you can no longer afford to be led by the Clueless vendors. It’s time you take your Procurement destiny into your own hands. It’s time for the Revenge of the Nerds!

Tech Won’t Solve Your Procurement Problems!

Probably not something you’d expect from a blog that was initially founded to educate you on best practices and best tech in Procurement, or from the doctor who has publicly reviewed close to 400 companies on Sourcing Innovation (and Spend Matters between ’16 and ’22), but it is something that needs to be said, and yelled loudly, now that everyone (analysts, influencers, marketers, etc.) is telling you this next generation of Gen-AI, Agentic, or AI-driven tech they are building will solve all your problems.

Because it won’t. In fact, it probably won’t solve any!

That’s because Procurement is NOT like other business functions. And while all business units are different, Procurement is different in a unique way: it has to constantly solve problems the business has not experienced yet. Sales just has to sell to the next N customers in the target customer base, who will be rather similar to the last N. Marketing is messaging this potential base, which is not changing its business overnight, or even year to year, isn’t rapidly advancing in its market understanding, and won’t recognize more than a subtle shift in the message. Moreover, you don’t have new mediums popping up every day. There’s print, radio, TV, skywriting, and web/social media. (We haven’t invented gamma radiation-based dream advertising yet!) Finance isn’t changing the rules of accounting, and even accounting standards like GAAP see only minor changes every couple of decades.

Not so in Procurement. It’s not just acquiring supply at the lowest cost, it’s sustaining supply at a cost that allows the organization to remain profitable, which is not simply repeating the last order to the current supplier when stock gets low. That’s because Procurement not only has to constantly deal with supplier capacities, raw material shortages, carrier capacities, occasional port strikes, occasional carrier and supplier failures, but unexpected natural disasters that wipe out entire yields of renewable raw materials, arbitrary sanctions and border closings making suppliers and routes unavailable, and completely unexpected trade wars sparking tariffs that can completely upset all the cost models you ever developed.

That means that every model you have built and every solution you have customized instantly becomes irrelevant. And you can’t use AI to tell you what to do because AI can only tell you what it has been trained to do, and it can only be trained on existing data which would be based on historical situations.

That means that tech cannot solve your Procurement problems.

That means that the only option you have, as Sourcing Innovation has been saying for months, is Human Intelligence (HI!). That means that only educated, experienced, skilled, and smart people can solve Procurement Problems.

This isn’t to say that you shouldn’t use tech. You most definitely should! Because most of what you do is tactical data processing that is well defined, for which there are configurable solutions that will allow the software to do the majority of it for you, and “AI” solutions that can be trained to learn from the exceptions you manually deal with to handle them automatically the next time.

But when it comes to strategic decisions, there is no Agentic AI that can solve a problem, especially one it, and you, haven’t seen before. You have to do that. If you’re smart, you’ll acquire all of the best knowledge summarization and analytics solutions that you can get your hands on, because they’ll automatically acquire, process, summarize, and graphically display all of the information available, which will help you make the right decision efficiently and effectively, so that you can react fast in a crisis with confidence, but it will still be you, the human, who has to make the (right) decision!

As an IBM slide deck stated in 1979:

A computer can never be held accountable, therefore a computer must never make a management decision.

Just because it can do a billion calculations a second and thunk better than you, that doesn’t mean it can think, because it can’t (and when Gen-AI claims to display a “chain of thought”, it is lying; it is a “chain of compute”, which is not thinking, just identifying patterns that typically follow other patterns in sequences it was “trained” on). Only you can. (Remember, if machines ever become intelligent, our best-case scenario is that they need us for bioelectric energy and create the Matrix, where we believe we are living a life free of machines. Otherwise, we’re probably looking at a SkyNet situation. It’s only logical, for many, many reasons.)