Category Archives: SaaS

Cross-Enterprise Part Data Synchronization Nightmare? Don’t Get Reactive, Get Creactives!

For those who have been following along, and who had access, the doctor first covered Creactives in 2021 on Spend Matters Pro (I: Overview and II: Deep Dive) and provided one of the first North American overviews of Creactives’ industry-leading AI Knowledge Engineering Platform for Material (& Service) Classification and Master Data Governance.

We’ll give a brief review of the foundations in this post, but focus on the enhancements which, honestly, have been made throughout the platform and force us to cover most of it again (but that’s a good thing for you).

Founded back in 2000 as a cost reduction consultancy to help global manufacturers in Automotive, Aerospace & Defense, Industrial Equipment & Machinery, High-Tech, Chemicals, Metal Production, Food and Beverage, Pharma, Energy Production, and Oil & Gas get a grip on their data, they were the first to realize that the largest problem multi-national manufacturers had was data synchronization across the enterprise. The major Source-to-Pay players like to gloss over this fact, because then it would become all too clear that there isn’t just one ERP (or PLM) to integrate to, one supplier master, and, more importantly, one part master, but dozens — at least one for, and in, each of the different markets served by the multinational, and typically in the language of that location.

As a result, global consolidation of demand for centralized sourcing of common parts or core materials becomes almost impossible since every ERP:

  • has its own record structure for a part/material
  • has its own categorization and GL-coding
  • shoves most of the core requirements in the description
  • and every department uses their own shorthand to cram a paragraph into 256 characters or less
  • usually in their own language
  • and they may or may not include the supplier in the description
  • etc.

This makes cross-organizational data mappings and harmonization at the part level almost impossible — and also explains why almost no vendors even attempt to tackle the problem, instead labelling it a “data cleansing and harmonization project to be done by consultancies before you implement our system”. Which, as we know, means it usually doesn’t get done, or doesn’t get done well, and then the sourcing/procurement/supply chain solutions these customers buy never deliver on the promises that were made (because all those promises assumed correct, complete, and universally harmonized data across the organization).

If the organization is lucky, it will stumble upon one of the supplier data cleansing, harmonization, and enrichment providers (and by that we mean Tealbook, Veridion, or ScoutBee) and at least get its supplier master in order. That will help it use one of the third party manufacturing supplier discovery platforms (like Find My Factory or PartFox) to cross-reference its database and at least figure out which suppliers are supplying its parts and materials, and which are capable of supplying more parts and materials across the organization, but that’s about it. It will still have to investigate the part records one by one to determine precisely what each part is, whether it is a duplicate record of another part (somewhere else in the category hierarchy or in another ERP), which suppliers can supply the part, which suppliers can supply a more-or-less equivalent substitute part, and what the differences are.

For a typical multi-national enterprise, which will have thousands of material groups across dozens of ERP instances (30 to 50 is NOT uncommon), this just doesn’t happen. At best, the organization will identify the top product lines, the top parts by cost or product in each product line, and undertake a focussed effort to try to identify commonality in suppliers and parts for only those parts across the organization, and stop the consulting engagement there. The hope is that this will cover 60% to 80% of the core direct spend, but since the mapping will only be 60% to 80% accurate at best, the organization will miss out on 1/2 to 2/3 of the direct sourcing and cost optimization opportunities that would be available to it if it achieved 95% to 98% mapping across all of the parts across the entire organization.

Creactives was developed over two decades to solve this specific challenge (which is a nightmare in most large multi-national enterprises), and that is precisely what it does. As was written four years ago, at its core, Creactives is a platform designed to properly identify, and classify, procurement items in enterprise master data to support the proper taxonomic classification, reporting and analysis within Procurement and other enterprise systems. It does this by way of custom designed ML and AI technology that has been developed over two decades (and which was developed long before the new generation of hallucinatory LLMs, which is why it actually works), which integrates proprietary dictionaries, semantic processing, linguistic identification, clustering, and deep learning technology in a highly specialized and optimized arrangement that takes advantage of a human in the loop to achieve very high accuracy in its classification. Creactives guarantees 95%, but typically achieves 98% (or more) for the majority of its clients.

This is not easy to do when, in an average organization, even something as simple as a ball bearing might appear in a dozen different organizational group categories (spare parts, MRO spare parts, electric motor spare parts, bearings and accessories, mechanical parts, steel parts, appliance repair, etc.), and then in variations of those material groups across dozens of ERP systems. Even when you get more complex, such as motors, where you’d think you’d have standardization across the enterprise, you still typically have a few different categories (electric, mechanical, appliance, etc.) and a few (to a few dozen) variations. When today’s appliances contain hundreds of parts, automotive/aerospace vehicles thousands, and electronics systems tens of thousands, a large multi-national enterprise will use hundreds of thousands of parts and have millions (and millions) of records in its databases, as it will have many (and sometimes dozens of) duplicates, a lot of equivalent variations, even more substitutions, and often these will be replicated across multiple suppliers the organization is doing business with. (When we hinted it’s a nightmare task, we weren’t joking.)

This is what the core of the Creactives Material & Services Master Data Governance product does in its TAM4 offering, and what powers its

  • data cleansing and enrichment service (which underlies not only its initial implementation and integration services to get the organization up and running but its ongoing data cleansing & enrichment service)
  • spend analysis
  • data assistants (including its SAP integration that ensures the user always selects the right part and its new part creator)

When a buyer first selects Creactives, the first thing it will do is:

  • integrate all of the organization’s ERP systems and bring all the data in (with source tracking)
  • create translations of all of the data into the working languages of the project (while maintaining the source data)
  • organize the records under the existing material groups
  • analyze all the data and assign the records to the lowest level UNSPSC categorization possible
  • use this to recommend the new material group structure (by identifying duplicate, poorly defined, or unused material groups)
  • identify as many of the (re)mappings as possible
  • create a sample list of mappings for verification (where it believes it has the mappings right) for the human in the loop
  • and create a representative list of mappings that need to be made (where one mapping will allow it to potentially map dozens of other parts based on that insight) for the human in the loop
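
To make the classify-or-queue step concrete, here is a minimal, hypothetical Python sketch. Everything in it — the `PartRecord` fields, the keyword dictionary, the threshold — is illustrative; Creactives’ real engine uses proprietary dictionaries, semantic processing, and deep learning rather than naive token overlap:

```python
from dataclasses import dataclass

@dataclass
class PartRecord:
    erp_id: str          # which ERP instance the record came from (source tracking)
    material_group: str  # the legacy material group in that ERP
    description: str     # free-text, often abbreviated, shorthand description

# Toy stand-in for a UNSPSC keyword dictionary; the real taxonomy has
# thousands of leaf categories and multilingual term lists.
UNSPSC_KEYWORDS = {
    "31171504": {"ball", "bearing"},    # bearings (toy entry)
    "26101100": {"motor", "electric"},  # electric motors (toy entry)
}

def normalize(description: str) -> set:
    """Lower-case and tokenize a shorthand description."""
    return {t.strip(".,;") for t in description.lower().split() if t}

def classify(record: PartRecord, threshold: float = 0.5):
    """Return (unspsc_code, score) if confident, else (None, score)
    so the record can be queued for human-in-the-loop review."""
    tokens = normalize(record.description)
    best_code, best_score = None, 0.0
    for code, keywords in UNSPSC_KEYWORDS.items():
        score = len(tokens & keywords) / len(keywords)
        if score > best_score:
            best_code, best_score = code, score
    if best_score >= threshold:
        return best_code, best_score  # auto-mapped
    return None, best_score          # needs a human verifier

record = PartRecord("ERP-DE-03", "ERSATZTEILE", "Ball bearing 6204 2RS steel")
code, score = classify(record)       # maps to the toy bearings category
```

The point of the sketch is the flow, not the matcher: normalize, score against candidate categories, auto-map above a confidence threshold, and queue everything else for the human in the loop.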

At this point the human will be able to see the initial mapping progress dashboard which will summarize the number of ERP instances, geographic coverage, languages, locations, original material groups, new material groups, UNSPSC (sub) categories, brands, codes, mappings, and computed accuracy. From here, the reviewers can drill into the data from any of the previously mentioned dimensions, review the line items, verify or correct, and more importantly, dive into the unmapped or the mappings needing verification, do the mappings and verifications, and kick off the next training cycle. This will continue until the desired accuracy is achieved (which will improve over time as new data comes into the system properly mapped and the system is retrained on a regular basis).
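
The “computed accuracy” figure on such a dashboard can be thought of as a sample-based estimate. A minimal sketch, assuming accuracy is estimated from a human-verified random sample of auto-mapped records (an assumption for illustration, not Creactives’ actual method):

```python
import random

def estimate_accuracy(verified_sample):
    """verified_sample: booleans, True = the human confirmed the mapping."""
    return 100.0 * sum(verified_sample) / len(verified_sample)

random.seed(7)  # fixed seed so the illustration is repeatable
# Simulate a 200-record verification sample drawn from a ~98%-accurate mapper.
sample = [random.random() < 0.98 for _ in range(200)]
accuracy = estimate_accuracy(sample)  # lands near 98, varies with the sample
```

As each retraining cycle corrects the queued records, the verified proportion — and hence the estimate — climbs toward the guaranteed level.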

Once the initial training cycle is complete, all of this data will be explored, managed, and enriched through their Material Master Governance product — TAM4. From here the buyer can explore categories, identify and merge/delete duplicates, associate substitutes, bulk upload new parts or supplier-part data, download part records for editing by the shop-floor team (who only know how to use Excel) and then upload those records again when modified, manage suppliers, monitor all of the integrations, and keep track of current workflow processes around training or data enrichment. The platform can also be used to manage all of the user’s associated tasks around (new) part creation/enrichment, approvals, duplicate management, (supplier) relationships, etc. And, of course, they can also dive into the standard spend and product reports.

Part creation and modification through the platform is trivial. To modify a part, the user can simply edit any field of any existing part, as well as all of the default language translations in any working language used by the organization (which could be dozens in a large multi-national), and the updates, once approved, will be pushed to all integrated systems. To create a new part, the user can find the closest part record in the system, copy it, add or remove fields as necessary, and change the field data as necessary. Creating minor variations or changes as a product design changes over subsequent iterations is extremely quick and easy, and since the platform can integrate with the PLM and bring in the associated diagrams and drawings, data can truly be harmonized across enterprise NPD, Sourcing, Manufacturing, and Supply Chain systems.

Since our last deep dive four years ago, the TAM platform has been enhanced significantly. The self-serve Vanessa interface (which consulting partners can use when clients want to continue to work with their preferred consulting firm on sourcing/supply chain/data management projects) has had workflow and usability improvements, and has reduced the amount of data a human-in-the-loop has to manually categorize for maximum mapping efficiency, by way of an increased focus on identifying the most representative records for training and classification purposes (through statistical similarity to other records). (They’ve also optimized their processes and can typically get a multi-national enterprise up and fully operational across its dozens of systems in three months. The data can then be fully maintained from that point on, through automated data cleansing and syncs on a weekly basis.)

The core analytics platform has been enhanced, and the category explorer, which is built on the core platform, allows the user to drill down and filter on any dimension at any time; the user can even do pattern searches on key (description) fields. It’s also been enhanced to allow users to identify records with missing fields, and categories which would most benefit from manual review and data record enhancement. They’ve also improved the dashboard interface summaries that allow a user to quickly get a high-level understanding of a category, including the attributes, material types, plant and country distribution, spend, suppliers, etc.

Duplicate management has been greatly enhanced, and a user can determine, for any category, subcategory, or part, the material groups or parts that partially match the filter, fully match, have active processes, the materials used, and the stock, consumption, and order amounts. Drilling into a stock part, the user can not only see the identified duplicates in the system but also the stock, consumption, and PO units and dollars for each, which gives insight into the savings potential from volume leverage and standardization across a right-sized group of suppliers. From here they can accept the duplicate (and the parts will be merged in the master), assign it for processing to someone else (if it looks like it might be a substitute rather than an exact duplicate and an expert is needed to classify it), or confirm one or more parts are distinct and not part of a duplicate group (that should be associated with the stock item in one or more of the organization’s ERPs).
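
As a rough illustration of how partial matches across ERPs might be clustered for review, here is a hypothetical sketch using plain string similarity; the real matching draws on attributes, brands, codes, and multilingual dictionaries rather than a single greedy ratio:

```python
import difflib

def norm(desc: str) -> str:
    """Normalize a description: lower-case, drop commas, sort tokens."""
    return " ".join(sorted(desc.lower().replace(",", " ").split()))

def duplicate_groups(records, threshold=0.75):
    """records: list of (record_id, description, stock_qty) tuples.
    Greedily cluster records whose normalized descriptions are similar."""
    groups = []
    for rec in records:
        for group in groups:
            anchor = group[0][1]  # compare against the group's first member
            if difflib.SequenceMatcher(None, norm(rec[1]), norm(anchor)).ratio() >= threshold:
                group.append(rec)
                break
        else:
            groups.append([rec])
    return groups

records = [
    ("ERP1-000912", "Bearing, ball 6204-2RS", 140),
    ("ERP7-88213",  "ball bearing 6204 2RS",   60),
    ("ERP3-55001",  "Electric motor 1.5kW",    12),
]
groups = duplicate_groups(records)
# total stock across a duplicate group hints at the volume-leverage opportunity
totals = [sum(qty for _, _, qty in g) for g in groups]
```

Summing stock and consumption across each candidate group is precisely what surfaces the standardization and volume-leverage opportunity the reviewer sees in the tool.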

The smart part creation/modification is also enhanced (and fully embedded in the tool) and allows a user to bring up similar parts by description, select one, add or subtract fields, update standard attributes (and the system has been trained on data sheets that specify standard part attributes for tens of thousands of categories and has all of these templates at its disposal), select from standard (enforced) values, carefully control the description field, in every language, that will be fed back to the ERP, and ensure the (new) part is properly categorized from the beginning. And, of course, the user can also upload/pull in all of the documents, diagrams, and models necessary to completely describe that part and store all of the part data in one central location.

As we said before in our summary of Vanessa, it’s the power tool an expert in multi-lingual direct material data classification needs to unify data across dozens of global ERP, MRP, and associated database instances and harmonize tens, or hundreds, of thousands of parts across those dozens of systems to enable more efficient NPD/NPI, sourcing, supply base rationalization, and supply chain design. And one that will likely not be equalled for years*. So if you need better cross-enterprise part and material masters, we cannot stress enough how important it is to include Creactives in your RFP/evaluation.

* As ForeStreet found out when they decided eight years ago to build an AI-backed supplier intelligence tool, it’s a very significant challenge and not one you can solve by just throwing a DLNN or LLM at it (and, if you try, you’ll end up with a situation that’s only worse). (Remember, it took ForeStreet 7 years to get the first version of its platform ready for wide market release, and supplier intelligence is a simpler problem. You shouldn’t be surprised that it took Creactives 18 years to get to self-serve and over 20 to get where they are today — which explains why they are currently alone in their category.)

Accept It! You ARE Selecting Obsolete Tech.

But that’s not necessarily a bad thing.

In a recent LinkedIn article, Joel said that digital procurement is like a pie eating contest, and while we’re not sure we agree, he made one valid point:

The system you select is already heading toward obsolescence the moment you go live.

But it’s worse than that!

1) It’s heading toward obsolescence from the minute the implementation starts … you have no idea of the technical debt in the systems you are being sold today, thanks to the build fast, scale faster, fix it later mentality infused by VCs and most PE firms!

2) It was probably obsolete when you selected it, especially if you chose a vendor who has been leading the same Gartner and Forrester maps for 10 years with no significant changes to their product or platform!

3) Even worse, chances are that the process you digitized makes you outdated anyway and keeps you that way — digitization is the best time for identifying not how things work, but how they should work to maximize efficiency and minimize risk (and that’s not, as we continually point out, jumping on the Gen-AI / Agentic AI bandwagon and being blinded by the hype).

4) Moreover, you really shouldn’t need different channels (i.e. completely different apps) to source, just different workflows and interfaces, but since most providers don’t do more than one category (among indirect, direct, services, capex projects, etc.), you likely need MORE apps. Furthermore, few suites have more than one or two modules that are truly best of breed (despite their claims), so if you don’t plan for the constant upgrades and bolt-ons … well … you won’t be ready when you have to select and implement one quickly, and then you’ll have even more obsolescence than you planned for.

That doesn’t mean that you should give up on modern tech because it’s all obsolete, because it’s not, and the good vendors recognize this and continually update their tech to minimize the obsolescence. It does mean that you need to be very careful when selecting your tech to find a solution that has minimal technical debt, is beyond where you are today with respect to the processes it supports, and is being continually enhanced by the vendor. If the vendor offers a truly best of breed solution, is beyond where you are today, and has a track record of keeping up with best practices, and best tech, it’s likely a good vendor.

Especially if the tech today is considerably enhanced against the tech it had two to three years ago (which you should be able to determine by looking up old demo videos, articles, independent reviews, etc.).

However, if you can’t tell any difference between the (mega) suite tech being pushed at you today vs. what the (mega) suite vendors were advertising five years ago, then you should probably stay away. Far, Far Away.

Wham Up Your Direct Sourcing with EffiGO!

EffiGO might not be a name you know, as they spent the first decade building, deploying, and growing primarily in India (where they have over 150 enterprise customers including some of the largest names in India in Construction, Manufacturing, CPG, Automotive, IT, Pharma, and Chemicals and have sourced and procured over 25 Billion in Spend), but they now have a growing presence across Asia, the Middle East, and are just starting to expand into Europe (with America coming soon).

However, it is now a name you should know because they built the system from the ground up to be a complete purchase requisition to invoice approval system with all of the key sourcing and procurement steps in between for indirect (and tail spend), direct, rate-card based services AND complex (project) procurements for their customers — whatever their customers needed. And the foundational “plan to pay” suite, from purchase requisition to ok-to-pay, can be obtained by a LMM or SE (Large MidMarket / Small Enterprise) at an annual license cost starting at 100K. Integrations (and they highly recommend integrating to the ERP, where they have integrated with most major ERPs multiple times, including, but not limited to, SAP, Oracle, Infor, and Dynamics), custom configurations, and services are extra, as with any other major player, but the license cost makes it affordable for the mid-market organizations who need a direct/complete sourcing solution.

The core of the EffiGO platform is broken into two main modules that cover the two main work streams:

Plan to PO

The Plan to PO component consists of the creation/acceptance of the Purchase Requisitions (which can be pushed from the ERP or manually created in the platform), the creation and execution of the sourcing events, the selection of the award, and the definition of the contract that orders will be made against.

Once a Purchase Requisition is pulled in from the ERP or manually created by a user in another organizational department, the user will see it in EffiGO and can pull it up, see all the details, edit those details (including, but not limited to the goods and services requested, the units, the delivery dates requested, the payment terms, etc.) or request an edit if they don’t have the authority, and approve it for sourcing.

With respect to core sourcing, the platform supports:

  • RFX – Quick
  • RFX – Full (with or without TechnoCommercial Evaluation)
  • Auction
  • Reorder (from a past RFX created in the last quarter)
  • Order from Catalog (for products where [rate] contracts are in effect)

RFX (and auction) creation starts by selecting one or more approved requisitions to kick off an RFX (or auction) process, selecting the event type, entering basic information (name, business unit, event owners, desired delivery locations, currency, etc.), and determining whether the event only requires commercial specifications and terms or detailed engineering/technical review and a weight-based award based on commercial terms and product/supplier review.

Note that the system will inform the buyer if one or more parts or items in one or more of the requisitions they select is either in inventory and/or already under contract and can just be fulfilled without going through a sourcing event.

Once the basic event criteria have been defined, and the items and quantities confirmed, the user is walked through the remaining configuration steps that include:

  • documentation – standard organizational terms and conditions, NDAs, and other project specific documents (which can be pulled in from a central library) or uploaded
  • price tables – the platform supports pre-configured bidding templates for different categories and products (that can be associated with any level of the product and service hierarchy they support), which can include non-price components, and the user just needs to select one
  • vendor selection – the buyer can search for vendors by group, category, location, etc. and add them one at a time or in groups
  • dates – clarification questions, bids, follow-ups (if requested), notifications, etc.
  • review criteria (techno only) – select the template that will be used for product/services/vendor review and scoring

Note that since the requisitions can be pushed in by the ERP, they can range from a requisition for a single item to a requisition for a complete bill of materials. Each item or part can be associated with its own cost breakdown table defined in the EffiGO platform, and each part can have its own associated documents, including drawings and detailed product specifications, which can be included in the ERP push, pulled in from the EffiGO library, or even pulled in from an (optional) PLM integration; the cost tables can also include service cost rate tables. To make bidding easy for the suppliers, the bid sheets can be pulled down into Excel (and then re-uploaded), and that can be done on a product or event basis (in which case the workbook will be multi-tab if different cost models are required for different parts and/or service rate cards).

If the sourcing event is being awarded on commercial terms only, then the application will select the lowest bids at the part, bundle (grouping), or RFQ level for award, and if the buyer approves, the award selection(s) can be output for offers, letters of intent, and contract negotiations, one per supplier. If the sourcing event is awarded on both commercial and technical criteria, the commercial components are auto-scored and the buyer scores the technical components, and then the award can be auto-computed in the application according to the award level.
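
The weighted techno-commercial award follows a common pattern, sketched hypothetically below; the weights, the lowest-bid-anchored commercial score, and the function names are illustrative, not EffiGO’s actual scoring templates:

```python
def award_ranking(bids, tech_scores, commercial_weight=0.6):
    """bids: {supplier: price}; tech_scores: {supplier: buyer score, 0-100}.
    Returns suppliers ranked best-first by weighted techno-commercial score."""
    lowest = min(bids.values())
    tech_weight = 1.0 - commercial_weight
    scored = {}
    for supplier, price in bids.items():
        commercial = 100.0 * lowest / price  # the lowest bid scores 100
        technical = tech_scores.get(supplier, 0.0)
        scored[supplier] = commercial_weight * commercial + tech_weight * technical
    return sorted(scored, key=scored.get, reverse=True)

bids = {"SupplierA": 100_000, "SupplierB": 90_000, "SupplierC": 120_000}
tech = {"SupplierA": 95, "SupplierB": 70, "SupplierC": 90}
ranking = award_ranking(bids, tech)  # A's technical edge beats B's lower price
```

This is why the weighting matters: at a 60/40 commercial/technical split, a technically stronger supplier can out-rank the lowest bidder.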

Once a contract has been signed, it can be uploaded with all of the terms and conditions defined (and all meta-data from an associated event can be associated with the contract), and custom completion requirements can be specified in the meta-data to make sure that all POs go out with those requirements (and they are not forgotten — more on this in our discussion of the PO to Pay module).

PO to Pay

The PO to Pay component consists of the creation of the purchase orders, the management of the purchase order and assurance of contract terms and conditions, the management of associated communications (acknowledgements, change requests, ASNs, etc.), the acceptance of the invoices against the orders, the processing and approvals, and the creation of an ok-to-pay push notification to the payment system.

When a buyer is ready to place an order, the buyer can create a purchase order:

  • off of an RFQ
  • off of one or more catalog items which may or may not be under contract (but are approved for purchase)

As with sourcing, if the buyer selects an item that is already in inventory or under contract (and can be requisitioned without any approvals), the system will inform the buyer.

As with any other system, a purchase order consists of items, units, approved pricing, delivery locations, dates, and other key pieces of information. Unlike other systems, the buyer can specify a full host of requirements that must be met before the PO can be issued, acknowledged, and dispatched against which include, but are not limited to:

  • whether an Ack(nowledgement) is required
  • whether acceptance is mandatory
  • whether an ABG (Advanced Bank Guarantee) is required
  • whether a [C]PBG ([Contract] Performance Bank Guarantee) is required
  • whether a LC (Letter of Credit) is required
  • whether the vendor needs to submit any technical documentation
  • whether the requesting buyer needs to provide the vendor with any instructions or documents
  • whether stage monitoring is required (and what the stages are; these can be selected from pre-configured or PLM lists)
  • whether transportation is in the scope of the buyer or vendor
  • whether the vendor is required to submit dispatch instructions
  • other potential organizational specific requirements around purchase orders (for certain products, services, or categories)
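
A hypothetical sketch of the pre-dispatch gate these flags imply: the PO carries its configured requirements, and dispatch is blocked until every required item has been satisfied (the field names below are illustrative, not EffiGO’s schema):

```python
# Requirement flags for one (hypothetical) PO; True means the item must be
# satisfied before the PO can be dispatched against.
PO_REQUIREMENTS = {
    "acknowledgement": True,
    "advance_bank_guarantee": True,
    "letter_of_credit": False,   # not required for this particular PO
    "vendor_tech_docs": True,
}

def dispatch_blockers(requirements: dict, satisfied: set) -> list:
    """Return required-but-unmet items; an empty list means OK to dispatch."""
    return [name for name, required in requirements.items()
            if required and name not in satisfied]

# The vendor has acknowledged and posted the bank guarantee, but the
# technical documentation is still outstanding:
blockers = dispatch_blockers(PO_REQUIREMENTS,
                             {"acknowledgement", "advance_bank_guarantee"})
```

Modeling each flag as a blocking requirement, rather than a note in the PO text, is what guarantees the conditions are never forgotten.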

When a vendor receives the purchase order, they also receive all of the associated documents and information provided by the buyer along with all the instructions they need to follow and requirements they need to meet to make a delivery AND get paid for it.

Once a vendor has dispatched (part of) a purchase order (which is also tracked against an RFQ to make sure that they never dispatch more units than they have been approved for), they can submit an invoice, which is associated with the order, which goes into an approval queue. Approval chains can be configured to be as simple, or complex, as needed, with as many steps as necessary.
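
The over-dispatch guard can be sketched as follows (a hypothetical simplification; EffiGO’s actual tracking runs per order line against the RFQ-approved quantities):

```python
class POLine:
    """One purchase order line with an approved quantity ceiling."""

    def __init__(self, approved_qty: int):
        self.approved_qty = approved_qty
        self.dispatched_qty = 0

    def dispatch(self, qty: int) -> bool:
        """Record a (partial) dispatch; reject any that would exceed approval."""
        if self.dispatched_qty + qty > self.approved_qty:
            return False  # blocked: cumulative units would exceed the approval
        self.dispatched_qty += qty
        return True

    def invoiceable_qty(self) -> int:
        """Invoices are only accepted against units actually dispatched."""
        return self.dispatched_qty

line = POLine(approved_qty=100)
first = line.dispatch(60)   # partial dispatch, accepted
second = line.dispatch(50)  # 60 + 50 > 100, rejected
```

Because invoices queue against recorded dispatches, the same ceiling that blocks over-shipment also blocks over-invoicing.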

Catalogs are buyer maintained. Suppliers can upload and submit catalogs to the buyer, but they don’t go live until approved by the buyer, who can accept or reject items and pricing. Once awards have been made and/or contracts have been signed after the issuance of a sourcing event, the buyers can create catalog items with the details and pricing, and mark them as under contract if a contract is signed or the rates are approved (if the supplier is willing to honour the quotes in the latter case).

Catalog items can have as many buyer standardized fields as needed to completely specify the item, which can be searched by type, category, supplier, location, status, and keywords against key fields. All items can be associated with their proper place in the organizational category hierarchy, which can be as deep as required. (Note that vendors can identify the categories they service up to Level 4 in their profile.)

Vendor Management

Required vendor information management is embedded throughout the process and included with both of the core modules; it covers vendor onboarding as well as ongoing information management, reviews, status updates (which can block a vendor on a category, unit, or organizational level), and insights (through the built-in reporting).

Vendors can be loaded from the ERP on implementation or created inside the platform. Vendor profiles in EffiGO consist of basic corporate details (type, corporate id, taxation registration, primary category, HQ, etc.), deep business details (registered and correspondence details, production locations, etc.), financial info, registration & certifications (statutory, documents, etc.), sustainability information, declarations, and audit log. Additional forms can be configured on implementation to capture any additional information that the buyer needs to track.

In addition, the buyer can maintain the vendor status and whether or not they are approved on a division, or even category basis. Unapproved vendors can be invited to events by an authorized user, but cannot be sent POs, or approved for payment.
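
This status gate can be sketched as a simple action-permission table (hypothetical statuses and action names; EffiGO’s real controls operate per division and category):

```python
# Hypothetical action-permission table keyed by vendor status. "blocked"
# mirrors the category/unit/organizational blocks mentioned above.
ALLOWED_ACTIONS = {
    "approved":   {"invite_to_event", "send_po", "approve_payment"},
    "unapproved": {"invite_to_event"},  # may bid, but no POs or payments
    "blocked":    set(),
}

def can(vendor_status: str, action: str) -> bool:
    """Check whether a vendor in the given status may perform an action."""
    return action in ALLOWED_ACTIONS.get(vendor_status, set())

ok_invite = can("unapproved", "invite_to_event")  # allowed by an authorized user
ok_po = can("unapproved", "send_po")              # blocked until approved
```
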

Vendor Portal

A vendor has their own portal to interact with the buyers on the EffiGO platform. While they will get email notifications of every sourcing event, change, award, contract offer, purchase order, information request, etc., many actions will need to be taken through their portal (for which they will get a direct link in the e-mail). This is because communications, acknowledgements, change requests, etc. need to be associated with the right event or purchase order, key documents need to be secure, and the organization needs to make sure invoices (with payment instructions) are not tampered with.

Summary

EffiGO is a very different kind of platform — one that was built from the ground up to serve manufacturing clients in Construction, CPG, Automotive, IT, Pharma, and Chemicals, and one that ended up being a direct-focussed system that can also handle indirect, services, and complex project procurements as well! It’s a name you don’t know but, if you have a mix of direct, service, and indirect needs, one you should know — especially if you are based in EMEA, where EffiGO is currently expanding!

Technobug
Technobug
Technobug
Technobug

It puts the boom-boom into my heart (hoo-hoo)
It sends my soul sky-high
When the PR starts
Technobug into my brain (yeah, yeah)
Goes bang-bang-bang
‘Til my keys do the same

But something’s bugging me
Something ain’t right
My best friend told me
What he did last night
When I was sleeping in my bed
I was dreaming
But I should’ve been Sourcing instead

Wake me up for EffiGO-go
Don’t leave me hanging on like a yo-yo
Wake me up for EffiGO-go
I don’t wanna miss it when we hit that high
Wake me up for EffiGO-go
‘Cause I’m not planning on Sourcing solo
Wake me up for EffiGO-go
Let’s get Sourcing tonight
I wanna hit that high, yeah yeah!

Why Are You Still Buying That Fancy New Piece of Software That

  • Could Get You Sued?
  • Increases The Chance You Will Be Hacked!
  • Could Result in a 100 Million Processing Error?
  • Could Shut Down Your Organization’s Systems for Days!
  • Helps Your Employees Commit Fraud?

If someone told you this when evaluating a piece of software, and asked if you wanted to buy it, I’m sure the vast majority of you would say HELL NO!

In which case I want you to please tell me, why are you all still riding the AI Hype Train, Buying, and Using Gen-AI everywhere?

It has already resulted in lawsuits and losses!
The Air Canada lawsuit over the Gen-AI chatbot is just one notable well publicized example.

AI systems are AI-coded, and AI code carries a much greater security risk
because it is generated from training repositories that contain large amounts of untested, unverified, and high-risk code — producing code so full of security holes it’s a hacker’s dream! (See this great piece in the ACM on The Drunken Plagiarists.)

AI systems negotiate on the data they have
and with a single decimal point error you could be paying 10X what you need to. Not to mention, they don’t always translate right. Remember, the experimental AI that DOGE used claimed an $8 Billion savings on an $8 Million contract!

Bad data generated by an AI system and fed into a legacy system with poor data validity checks can shut it down.
Plus, Gen-AI can also push out bad updates faster than any human can, and you can easily have your own CrowdStrike situation!

Now it’s being used by employees to generate fake receipts
that look so real that, if the employee does a few seconds of research (to get the restaurant info, current menu prices, tax code, etc.), you can’t distinguish the generated image from the real thing. And, before you say “Ramp solves this”, well, it only does if the employee is lazy (which, let’s face it, is human nature, so you’ll catch about 90% of it). But what happens when a user strips the metadata, which, FYI, can be as easy as taking a picture of the picture … oops! (And if you’re a hacker, running it through a metadata stripper/replacement routine is even easier, as you’re just hotkeying a background task.)

AI is good. Gen-AI has its [limited] uses. But unrestricted and unhinged mass adoption of untested, unverified AI for inappropriate uses is bad. So why do you keep doing it?

Especially since it’s now proven it’s worse for you than some illegal drugs! (Source)

Features ARE NOT Applications; But Applications Require Features!

THE PROPHET recently asked What Procurement Tech Product Categories Were Really Just Features All Along? Which is a great question, except he cheated.

He cheated with the first 5!

  • Supplier performance management
  • Supplier quality management
  • Supplier information management / supplier master data management
  • Supplier diversity
  • Supplier risk management (not supply chain risk!)

We’ve known for years it should be one Supplier 360 solution! (Even though no one offers that when you consider all of the elements that should be there. Heck, none of them even offer the 10 basic CORNED QUIP requirements … in fact, good luck finding a solution that offers 5 of those requirements among the 100+ supplier management solutions).

And he cheated again with the next 3!

  • Should cost / cost modeling (for procurement, not design engineers)
  • RFX and reverse auctions (when not bundled with broader capabilities or services)
  • Sourcing optimization

We’ve also known for years it should be cost-model and optimization backed sourcing (auction, RFX, hybrid, single source negotiation, etc.) … otherwise, it’s an incomplete solution. But only a fraction of the 80+ sourcing platforms offer true optimization (less than 10) and fewer still do extensive cost modelling. (Note that we are focused on modelling, not cost estimation — that requires data, and that can, and probably should, be a third-party data feed.)

And he was wrong on the last front.

Real Spend Analytics should be standalone. Wrapping restricts it! The modules you use should provide all the specific views you need, but the reason spend analysis quickly becomes shelfware in most organizations today is the same reason it became shelfware 20 years ago … once you exhaust the limits of the interface it’s wrapped in, it becomes useless. Go back to the series Eric and I wrote 18 years ago (which you still can, since Sourcing Innovation didn’t delete everything more than a decade old when it had to change servers in 2024, unlike Spend Matters when it did its site upgrade in 2023).

But Very, Very right in that features are not applications!

And very, very right in that too many start-ups are launching today as features (which will only survive if acquired and rolled up into existing applications and platforms), and not solutions. While apps dominate the consumer world, in business there is not always an app for that, and, frankly, there shouldn’t be. This focus on point-based apps is ridiculous. It’s not features, it’s functions. It’s not apps, it’s platforms. It’s not orchestration (and definitely not spend orchestration), it’s ecosystems!

Recent stats, such as those published by Spendesk, put the average number of apps a business uses at 371, with an average of 253 for SMBs and 473 for enterprise firms. WHAT. THE. F6CK? This is insane. How many departments does an average organization have? Less than 10. How many key functional areas? Less than 12. Often less than 10! How many core tasks in each function? Usually less than 6. That means, in the worst case, an enterprise might have 72 distinct critical tasks that could each justify their own application (but probably not). Which means SMBs have at least 3 times the apps they should have, mid-size organizations at least 5 times, and enterprises at least 7 times. That is insane! No wonder there are so many carbon copy SaaS optimizers (as we covered in our piece on sacred cows), because if you have that many SaaS apps, you have features, not applications. And you need to replace sets of these with functional applications that solve your core problems.
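If you want to sanity-check that back-of-the-envelope math yourself, a few lines suffice (a sketch: the app counts are the Spendesk averages quoted above, and the 72-task ceiling is 12 functional areas times 6 core tasks):

```python
# Sketch: reported app counts vs. a worst-case count of distinct
# critical tasks (12 functional areas x 6 core tasks each).
WORST_CASE_TASKS = 12 * 6  # = 72

app_counts = {"SMB": 253, "average": 371, "enterprise": 473}

for segment, apps in app_counts.items():
    ratio = apps / WORST_CASE_TASKS
    print(f"{segment}: {apps} apps = {ratio:.1f}x the worst-case need")
```

Roughly 3.5x for SMBs, 5.2x on average, and 6.6x for enterprises — hence “at least 3, 5, and 7 times” once you remember that 72 tasks almost certainly do not each deserve their own app.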

(And if you want to know how to prevent app sprawl, before buying yet-another-app, ask yourself “is this supporting a function that should be done on its own, or just a task that should be part of an existing function” … if the latter, it’s a feature, not an application, and if the application it should be part of does not have an upgrade/module that supports the task, then you have the wrong application and it’s time to replace it, not pointlessly extend the ecosystem!)