
With Great Data Comes Great Opportunity!

In fact, great data can quadruple your ROI from a major suite.

Not long ago, Stephany Lapierre posted that your team may only be realizing <50% of the ROI from your Ariba or Coupa investment, to which, of course, my response was:

50% of value on average? WOW!

Let’s break some things down.

A suite will typically cost 4X a leaner mid-market offering, which is often enough even for an enterprise just starting its Best-in-Class journey (a journey that will take at least 8 years, per Hackett Group research from the 2000s).

Moreover, even if the enterprise can make full use of the suite it buys for 4X, at least 80% of the “opportunity” comes from just having a good process, technology, baseline capability, and automation behind it. That means you’re paying 4X to squeeze out an additional 20% worth of opportunity in the best case.

On average, it takes 2 to 3 years to fully implement a suite (on a 3-to-5-year deal). So, with functionality ramping up over the rollout, maybe you’re seeing an average of 66% of the functionality over the contract duration.

As Stephany pointed out, bad data leads to

  • increased supplier discovery and management times
  • invoice processing delays and errors
  • increased risk and decreased performance insight

As well as:

  • an inability to take advantage of advanced (spend) analytics
  • an inability to build detailed optimization models
  • decreased accuracy in cost modelling and market prediction

This is even more problematic! Why? These are the only technologies found to deliver year-over-year 10%+ savings! (This is where the extra value a suite can offer comes from, but only with good data. Otherwise, at most half of the opportunity will be realized.)

Thus, one can argue an average organization is only getting 66% of 25% of 80% of its investment against peers (66% for the two-thirds functionality realized over the term, 25% for the 4X suite cost, and 80% for the baseline savings available from a basic mid-market application that instills good process and cost intelligence), plus 50% of the remaining 20% (as it is able to take advantage of at most half of the advanced functionality offered by the suite due to poor and incomplete data). In other words, at the end of the day, we’d argue an average company is only realizing 23% of the potential value from an opportunity perspective!
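
For the skeptics, the back-of-the-envelope arithmetic works out as follows (a minimal sketch; the percentages are the assumptions stated above, not measured figures):

```python
# Back-of-the-envelope: realized suite value vs. potential, per the assumptions above.

base_opportunity = 0.80       # share of opportunity from good process + baseline tech
advanced_opportunity = 0.20   # share only unlockable with advanced analytics/optimization

suite_cost_multiple = 4       # a suite costs ~4X a leaner mid-market offering
functionality_realized = 2/3  # ~66% average functionality over the contract duration
advanced_realized = 0.5       # at most half the advanced value realized with poor data

# 66% of 25% (the 1/4X cost-adjusted base) of 80%, plus 50% of 20%:
base_value = functionality_realized * (1 / suite_cost_multiple) * base_opportunity
advanced_value = advanced_realized * advanced_opportunity

print(f"base: {base_value:.1%} + advanced: {advanced_value:.1%} "
      f"= {base_value + advanced_value:.1%} of potential value")
# base: 13.3% + advanced: 10.0% = 23.3% of potential value
```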

However, as one should rightly point out, the true value of a suite is not the value you get on the base; it’s the ROI on that extra spend, which allows for 20% more opportunity than a customer can get from lesser ProcureTech solutions.

For example, let’s say you are a company with $1B of spend and a $100M opportunity.

If tackling $20M of that opportunity requires advanced analytics, optimization, and extensive end-to-end data, it’s likely that you’ll never see it with an average mid-market solution with limited analytics, no optimization, and only baseline transactional data. If the company paid an extra $1.5M over 3 years for this enhanced functionality, then the ROI on that spend is 13X, which is definitely worth it.
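
The incremental ROI math, using the same hypothetical numbers:

```python
extra_opportunity = 20_000_000  # savings only reachable with the advanced functionality
extra_suite_cost = 1_500_000    # extra paid for the suite over the 3-year term

print(f"ROI on the incremental spend: {extra_opportunity / extra_suite_cost:.1f}X")
# ROI on the incremental spend: 13.3X
```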

Moreover, if the suite supports the creation of enhanced automations, you could get more throughput per employee and realize the base $80M with half, or even one quarter, of the workforce, lowering the HR budget by more than the baseline cost of the suite.

However, ALL of this requires great data, advanced capability, and the in-house knowledge to use both, which is only the case at the market leaders. As a result, we’d argue that the majority of clients are only realizing about 25% of the suite’s potential, when sometimes the only thing standing in the way of realizing the rest is good data.

Enterprises have a Data Problem. And they will until they accept they need to do E-MDM, and it will cost them!

This was originally published on April 29, 2024. It is being reposted because MDM is becoming more essential by the day, especially since AI doesn’t work without good, clean data.

insideBIGDATA recently published an article on The Impact of Data Analytics Integration Mismatch on Business Technology Advancements which did a rather good job of highlighting all of the problems with bad integrations (which happen every day [and just result in you contributing to the half a TRILLION dollars that will be wasted on SaaS spend this year and the one TRILLION that will be wasted on IT services]), and an okay job of advising you on how to prevent them. But the problem is much larger than the article lets on, and we need to discuss that.

But first, let’s summarize the major impacts outlined in the article (which you should click through and read before continuing on with this article):

  • Higher Operational Expenses
  • Poor Business Outcomes
  • Delayed Decision Making
  • Competitive Disadvantages
  • Missed Business Opportunities

And then add the following critical impacts (and this is not a complete list by any stretch of the imagination) when your supplier, product, and supply chain data isn’t up to snuff:

  • Fines for failing to comply with filings and appropriate trade restrictions
  • Product seizures when products violate certain regulations (like RoHS, WEEE, etc.)
  • Lost Funds and Liabilities when incomplete/compromised data results in payments to the wrong/fraudulent entities
  • Massive disruption risks when you don’t get notifications of major supply chain incidents because the right locations and suppliers are not being monitored (multiple tiers down in your supply chain)
  • Massive lawsuits when data isn’t properly encrypted and secured and personal data gets compromised in a cyberattack

You need good data. You need secure data. You need actionable data. And you won’t have any of that without the right integration.

The article says to ensure good integration you should:

  • mitigate low-quality data before integration (since cleansing and enrichment might not even be possible)
  • adopt uniformity and standardized data formats and structures across systems
  • phase out outdated technology

which is all fine and dandy, but misses the core of the problem:

Data is bad (often very, very bad) because organizations don’t have an enterprise data management strategy. That’s the first step. Furthermore, this E-MDM strategy needs to define:

  1. the master schema with all of the core data objects (records) that need to be shared organization-wide
  2. the common data format (for IDs, names, keys, etc.) that every system will need to map to
  3. the master data encoding standard

With a properly defined schema, there is less of a need to adopt uniform data formats and structures across the enterprise systems, or to phase out outdated technology (which, if it’s the ERP or AP system, will likely not be possible), since the organization just needs to ensure that all data exchanges are in the common data format and use the master data encoding standard. (Uniformity will not always be possible anyway if an organization needs to maintain outdated technology, either because a former manager entered into a 10-year agreement just to be rid of the problem or because it would be too expensive to migrate to another system at the present time.)
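
To make the three artifacts concrete, here is a minimal sketch of what they might look like in practice. Everything here (the MasterSupplier record, the SUP- ID convention, the ERP field mappings) is a hypothetical illustration, not a prescribed standard:

```python
import json
from dataclasses import dataclass

# 1. Master schema: a core data object (record) shared organization-wide.
@dataclass
class MasterSupplier:
    supplier_id: str            # the common key every system must map to
    legal_name: str
    country_code: str           # ISO 3166-1 alpha-2
    duns_number: str | None = None

# 2. Common data format: one canonical ID scheme used in all exchanges.
def make_supplier_id(sequence: int) -> str:
    """Canonical supplier key, e.g. 'SUP-0000042' (hypothetical convention)."""
    return f"SUP-{sequence:07d}"

# 3. Master data encoding standard: say, UTF-8 encoded JSON for every exchange.
def encode_record(record: MasterSupplier) -> bytes:
    return json.dumps(record.__dict__, ensure_ascii=False).encode("utf-8")

# Each system keeps its internal format and maps to the master schema at the
# boundary, so outdated systems don't need to be rebuilt or phased out.
ERP_FIELD_MAP = {"VENDOR_NO": "supplier_id", "NAME1": "legal_name", "LAND1": "country_code"}

def from_erp(row: dict) -> MasterSupplier:
    return MasterSupplier(**{master: row[erp] for erp, master in ERP_FIELD_MAP.items()})

supplier = from_erp({"VENDOR_NO": make_supplier_id(42), "NAME1": "Acme GmbH", "LAND1": "DE"})
print(encode_record(supplier))
```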

Moreover, once you have the E-MDM strategy, it’s easy to flesh out the HR-MDM, Supplier/SupplyChain-MDM, and Finance-MDM strategies and get them right.

As THE PROPHET has said, data will be your best friend in procurement and supply chain in 2024 if you give it a chance.

Or, you can cover your eyes and ears and sing the same old tune that you’ve been singing since your organization acquired its first computer and built its first “database”:

Well …
I have a little data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

Oh, data, data, data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

It has nonstandard fields
The records short and lank
When I try to read it
The blocks all come back blank

I have a little data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

My data is so ancient
Drive sectors start to rot
I try to read my data
The effort comes to naught

Oh, data, data, data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

Follow the Money — To Find the Spigots that can Turn it Off!

A recent CPO Crunch article over on Procurement Leaders said to Follow the Money, as a focus on profit contribution can provide a starting point for improving supply chain transparency.

The article states that having knowledge of our suppliers is one thing, but it’s quite another to have a good understanding of who our suppliers’ suppliers are, not to mention those even further beyond, and that, in a complex, risk-riddled world, such visibility is crucial and can bring meaningful competitive advantage.

In other words, following the money can increase profitability by allowing you to optimize the flow. Which is true, but only half the picture.

The other half is how the flow can be diverted or stopped. There are two important things to remember about money flows. First, if these money flows present an opportunity for you, they present an opportunity for others: not just outright theft of money (or product), but skimming, fraudulent billings/overpayments/handling fees (or your goods don’t move), and even fraudulent goods substitution (with knockoffs). Second, if any input to any of these flows stops (beyond your visibility), the entire flow stops. And these flows could stop 6 levels down at the source.

For example, let’s say you are in medical device manufacturing or microwave-based manufacturing. Then you need thulium, which is one of the rarest rare earth elements in the world. If a mine closes, even temporarily, and that mine is the only source of supply into your raw material or component supplier (that produces your enclosed radiation source or manufacturing ferrites), what do you think is going to happen? Production will stop, and your inventory will disappear. Or what if you need a custom chip for the control system in your high-end electric car, and the one plant currently capable of producing it experiences a fire? (This HAS happened, and chip shortages have been responsible for production stoppages in MULTIPLE automotive lines. Just Google it.)

If your only production is in a country with geopolitical instability or deteriorating relations with your country, and borders (temporarily) close, what happens? And so on. If you don’t know the myriad ways the spigots can be turned off, it doesn’t matter how well you know, or optimize, the money flow. These days, it’s all about risk management, visibility, and quick reaction to get a spigot reopened if it gets turned off.

opstream: taking the orchestration dream direct!

INTRODUCTION

opstream was founded in 2021 to bring enterprise-level orchestration, previously only available to IT, to Procurement in a manner that supported all of Procurement’s needs, regardless of what those needs were. In the beginning, this meant allowing Procurement to not only create end-to-end workflows from request through requisition through reaping, but also to create workflows out into logistics tracking, inventory management, and supply chain visibility, as required. Workflows that incorporated not just forms, but processes, intelligence, and, most importantly, team collaboration.

In other words, at an initial analysis, not much different than most of the other intake-to-orchestration platforms that were popping up faster than the moles in whack-a-mole, especially since every platform was, and still is, claiming full process support, limitless integration, and endless collaboration while bringing intelligence to your Procurement function. (There was one big difference, and we’ll get to that later.)

However, unlike many founders who came from Procurement and assumed they knew everything that was needed, or who came from enterprise IT orchestration solutions and thought they knew everything Procurement needed, the founders of opstream admitted day one that they didn’t know Procurement, interviewed over 400 professionals during development, and realized that the one thing their orchestration platform had to support, if it was to be valuable to their target customers, was direct. And they were 100% right. Of the 700+ solutions out there in S2P+ (see the Sourcing Innovation MegaMap), less than 1/20th address direct in any significant capacity, and none of the current intake-to-orchestrate platforms were designed to support direct from the ground up on day one.

So what does opstream do?

SOLUTION SUMMARY

As with any other intake-to-orchestrate platform, the platform has two parts, the user-based “intake” and the admin-based “orchestrate”.

We’ll start with the primary components of the admin-based orchestrate solution.

Intake Editor

The intake editor manages the intake schemas that define the intake workflows, which are organized by request type. In opstream, an intake workflow will contain a workflow process for the requester, the approver(s), the vendor, and the opstream automation engine, as well as a section that defines the extent to which the approvers can impact the workflow.

The workflow builder is form based and allows the builder to create as many steps as needed, with as many questions of any type as needed, using pre-built blocks that can accelerate the process. Any and all available data sources can be used for request creation and validation, and any of these sources, from all integrated systems, can also be used in conditional logic validations. This logic can determine whether or not a step, or a question, is shown, what values are accepted, or how it influences a later question on the form.
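
Conceptually, a conditional step might be represented something like the following sketch. This is a hypothetical illustration of the idea, not opstream’s actual format (the builder is visual and its internal representation is not public):

```python
# Hypothetical intake step: only shown when the request crosses a threshold,
# with allowed values restricted by a live data source from an integrated system.
step = {
    "name": "security_review",
    "show_if": lambda request: request["estimated_spend"] >= 50_000
               and request["category"] == "software",
    "questions": [
        {
            "label": "Which approved vendor?",
            "type": "data_source",                       # values come from a live system
            "source": ("erp", "vendors", "vendor_name"), # system, object, column
            "restrict_to_source_values": True,           # form updates as the source does
        },
        {"label": "Will this vendor process personal data?",
         "type": "multiple_choice", "options": ["Yes", "No"]},
    ],
}

request = {"estimated_spend": 75_000, "category": "software"}
if step["show_if"](request):
    print(f"Step '{step['name']}' shown with {len(step['questions'])} questions")
```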

Building a workflow is as easy as building an RFX, as the user just selects from a set of basic elements such as a text block, numeric field, date/time object, multiple choice, data source, logic block, etc.

The data source element allows a user to select any object definition in the data source, select an associated column, and restrict values to entries in that column. The form will then dynamically update and adjust as the underlying data source changes.

In addition to having workflows that adjust as related systems and data sources change, the platform also has one other unique capability when it comes to building workflows for Procurement requests: it understands multiple item types. These include inventory item, non-inventory item, and service, which are understood by many platforms; other charge, a capability for capturing non-PO spend that only a few deep Procurement platforms understand; and, most importantly, assembly/bill of materials, an option the doctor hasn’t seen elsewhere (and which enables true direct support).

As long as the organization has an ERP/MRP or similar system that defines a bill of materials, the opstream platform can model that bill of materials and allow the administrators to build workflows where the manufacturing department can request orders, or reorders, against part, or all, of the bill of materials.

In addition, if the organization orders a lot of products that need to be customized, such as computer/server builds, 3D printer assemblies, or fleet vehicles, the admins can define special assembly / configurator workflows that can allow the user to specify every option they need on a product or service when they make the request.
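
As a rough illustration of why the assembly/bill-of-materials item type matters for direct, consider a reorder against part of a BOM mirrored from the ERP/MRP. All structures and part numbers below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class BomLine:
    part_number: str
    description: str
    qty_per_assembly: int

@dataclass
class Assembly:
    assembly_id: str
    lines: list[BomLine] = field(default_factory=list)

# BOM as it might be mirrored from the ERP/MRP (hypothetical data).
controller = Assembly("ASM-CTRL-100", [
    BomLine("CHP-001", "Control chip", 2),
    BomLine("PCB-010", "Main board", 1),
    BomLine("CON-220", "Connector kit", 4),
])

def reorder_request(assembly: Assembly, units: int, only_parts: set[str] | None = None):
    """Build a requisition for part, or all, of the bill of materials."""
    return [(line.part_number, line.qty_per_assembly * units)
            for line in assembly.lines
            if only_parts is None or line.part_number in only_parts]

# Manufacturing requests 500 units' worth of just the chips and connectors.
print(reorder_request(controller, 500, only_parts={"CHP-001", "CON-220"}))
# [('CHP-001', 1000), ('CON-220', 2000)]
```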

The approval workflows can be as sparse or as detailed as the request process, and can have as few or as many checks as desired, including verifications against budgets, policies, and data in any integrated system. As with any good procurement system, approvals can be sequential or parallel, restricted to users or opened to teams, and short-circuited by super-approvers.

In addition, workflows can also be set up for vendors to receive requests from the buyer, provide information, and execute their parts of the workflow, including providing integration information for their systems to enable automatic e-document receipt, transmission, update, and verification.

Finally, the automation workflow can be set up to automate the creation and distribution of complete requisitions for approval, the creation of complete purchase orders for deliveries, the receipt and acknowledgement of vendor order acknowledgements, advanced shipping notices, and invoices, and the auto-transmission of ok-to-pay and payment verifications.

But it doesn’t have to stop there. One big differentiator of the opstream platform, because it was built to be an enterprise integration platform at the core (as we’ve already hinted in our discussion of how, unlike pretty much every other intake/orchestrate platform, it supports assembly/bill of materials out of the box by integrating with ERPs/MRPs), is that it doesn’t have to stop at the “pay” in source-to-pay. It can pull in the logistics management/monitoring system to track shipments and inventory en route. It can be integrated with the inventory management system to track current inventory, help a Procurement organization manage requisitions against inventory, and guide buyers when inventory needs to be replenished. It can also integrate with quality management and service tracking systems to track the expected quality and lifespan of the products that come from inventory, and warn buyers if the quality or the number of service issues is increasing at the time of requisition or reorder.

Data Source Manager

opstream comes with hundreds of systems integrated out of the box, but it’s trivial for opstream to add more platforms as needed (as long as those platforms have an open API), as the opstream platform has built-in data model discovery capabilities and support for the standard web connection protocols. This means that adding a new data source is simply a matter of specifying the connection strings and parameters; the source will then be integrated and the public data source auto-discovered. The admin can then configure exactly what is available for use in the opstream solution, who can see/use what, and the sync interval.
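
In other words, onboarding a new source amounts to little more than the following configuration (a hypothetical sketch of the information involved; the acme_wms source, URL, and roles are placeholders, and opstream’s actual connector API is not public):

```python
# Hypothetical connector registration: connection details in, discovered model out.
new_source = {
    "name": "acme_wms",
    "protocol": "rest",                            # standard web connection protocol
    "base_url": "https://wms.example.com/api/v2",  # placeholder URL
    "auth": {"type": "oauth2", "client_id": "...", "client_secret": "..."},
    "sync_interval_minutes": 30,
}

# The platform would then introspect the open API to auto-discover the data model,
# after which an admin scopes exactly what is exposed, and to whom.
discovered_objects = ["shipments", "inventory_levels", "warehouse_locations"]
exposed_to_roles = {"shipments": ["buyer", "admin"], "inventory_levels": ["buyer"]}

print(f"{new_source['name']}: {len(discovered_objects)} objects discovered, "
      f"{len(exposed_to_roles)} exposed")
```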

Now we’ll discuss the primary components of the user-based “intake” solution.

Requests

The request screen centralizes all of the requests a user has access to, which include, but are not limited to, requests created by, assigned to, archived by, and departmentally associated with the user. Requests can be filtered by creator, assignee, status, request type, category, time left, date, and other key fields defined by the end-user organization.

Creating a new request is simple: the user selects a request type from a predefined list and steps through the workflow. The workflows can be built very intelligently such that whenever the user selects an option, all other options are filtered accordingly. If the user selects a product that can only be supplied by three vendors, only those vendors will be available in the requested-vendor dropdown. Alternatively, if a vendor is selected first, only the products the vendor offers will be available for selection. Products can be limited to budget ranges, vendors to preferred status, and so on. Every piece of information is used to determine what is, and is not, needed and to make it as simple as possible for the user. If the vendor or product is not preferred, and there is a preferred vendor or product, the workflow can be coded to proactively alert the user before the request is made. The buyer can also define any quotes, certifications, surveys, or other documentation required from the supplier before the PO is cut. (Then, once the requisition is approved, the vendor work stream will kick off.) And, once the vendor work stream is complete, the final approvals can be made, and the system will automatically send the purchase order and push the right documentation into the right systems.
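
The cascading filtering described above boils down to simple set logic. A minimal sketch (with hypothetical catalog data, not opstream’s implementation):

```python
# Hypothetical catalog: which vendors can supply which products.
catalog = {
    "laptop-14": {"VendorA", "VendorB", "VendorC"},
    "laptop-16": {"VendorA"},
    "dock-usb4": {"VendorB", "VendorC"},
}

def vendors_for(product: str) -> set[str]:
    """Selecting a product narrows the vendor dropdown to capable suppliers."""
    return catalog.get(product, set())

def products_for(vendor: str) -> set[str]:
    """Selecting a vendor first narrows the product list instead."""
    return {p for p, vendors in catalog.items() if vendor in vendors}

print(sorted(vendors_for("laptop-14")))  # ['VendorA', 'VendorB', 'VendorC']
print(sorted(products_for("VendorB")))   # ['dock-usb4', 'laptop-14']
```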

Vendors

Vendors provides the user with central access to all loaded organizational vendors, along with a quick summary of onboarding/active status, preferred status, number of associated products, number of associated requests, and total spend. Additional summary fields can be added as required by the buying organization.

Documents

Documents acts as a central document repository for all relevant vendor, product, and Procurement-related information, from request through receipt, vendor onboarding through offboarding, and product identification through end-of-life retirement of the final unit. Documents have categories; associated requests, vendors, and/or products; status information; dates of validity; and other metadata relevant to the organization. Documents can be organized into any categorization scheme the buying organization wants and can include compliance documents, insurance, NDAs, contracts, security reports, product specifications, product certifications, sustainability reports, and so on.

Analytics

The analytics component presents a slew of ready-made dashboards that summarize the key process, spend, risk, compliance, and supply chain inventory metrics that aren’t available in any individual platform. Right now, there is no DIY capability to build the dashboards, and all have to be created by opstream, but opstream can create new custom dashboards very quickly during rollout, and you can get cross-platform insights that include, but are not limited to, the following (see the sketch after this list):

  • process time from purchase order (e-Pro) to carrier pickup to warehouse arrival (TMS) to distribution time to the retail outlets (WIMS)
  • contract price (CMS) to ePro (invoice) to payment (AP) as well as logistics cost (invoice) to tax (AP) to build total cost pictures for key products relative to negotiated unit prices
  • risk pictures on a supplier that include financial (D&B), sustainability (Ecovadis), quality (QMS), failure rate (customer support), geolocation (supply chain risk), geopolitical risk (supply chain risk), transportation risk (OTD from the TMS), etc.
  • compliance pictures that pull data from the insurer, regulatory agencies, internal compliance department, and third party auditor
  • supply chain inventory metrics that include contractual commitment (CLM), orders (ePro), fulfillments (inventory management), current inventory (inventory management), commitments (ERP/MRP), etc.
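
For instance, a total-cost-versus-contract picture stitches together fields no single system holds. A minimal sketch of that kind of cross-system rollup (all system names, fields, and figures are hypothetical):

```python
# Hypothetical records pulled from three integrated systems for one product.
cms = {"sku": "PRD-7", "contract_unit_price": 10.00}                    # contract mgmt
ap  = {"sku": "PRD-7", "invoiced_unit_price": 10.45, "tax_per_unit": 0.85}  # AP
tms = {"sku": "PRD-7", "freight_per_unit": 1.20}                        # transportation

# Total landed cost per unit, and leakage vs. the negotiated contract price.
landed = ap["invoiced_unit_price"] + ap["tax_per_unit"] + tms["freight_per_unit"]
leakage = ap["invoiced_unit_price"] - cms["contract_unit_price"]

print(f"landed cost/unit: {landed:.2f}")                  # 12.50
print(f"price leakage vs. contract: {leakage:.2f}/unit")  # 0.45
```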

In addition, since all data is available through the built-in data bus, if the user wants to build her own dashboards, she can push all of the data into a (spend) analytics application to do her own analysis, and with opstream’s ability to embed third-party analytics apps (Power BI for now, more coming), the user can even see the analytics inside the opstream platform.

This is the second main differentiator of opstream a user will notice: the founders realized that not only is data key, so is integrated analytics, and they built a foundation to enable it.

Which leads us to the third and final differentiator, the one you don’t see: the data model. The data model is automatically discovered and built for each organization as their systems are integrated. Beyond a few core entities and core identifiers upon which the data model is automatically built and generated, opstream doesn’t fix a rigid data model that all pieces of data need to map to (or get left out of the system). This ensures that an organization always has full access to all of its integrated data upon which to do cross-platform analytics on process, spend, inventory, and risk.

CONCLUSION

opstream understands there is no value in intake or orchestration on its own, and that for the platform to be relevant to Procurement, it has to do more than just connect indirect S2P systems together. As a result, the company has built in support for direct, dynamic data model discovery, and integration with the end-to-end enterprise systems that power the supply chain, allowing an organization to go beyond simple S2P and identify value not identifiable in S2P systems alone. For these reasons, opstream should definitely be on your shortlist if you are looking for an integration/orchestration platform (to connect your last-generation systems to next-generation systems through the cloud) that will allow you to increase overall system value (vs. just increasing overall system cost).

Procurement Leaders Listen to Roxette!


How do you do (do you do) the things that you do?
No one I know could ever keep up with you
How do you do?
Did it ever make sense to you …

A recent article over on Procurement Leaders asks CPOs “why do you do?” and notes that a recent exercise the publication has been carrying out is to ask CPOs to share the value propositions they have in place for their function.

Procurement Leaders’ goal was to force extremely busy people to take a step back and think deeply about why they do what they do. What are the ultimate goals of those negotiations with suppliers? Why are they spending time building relationships with certain suppliers and not others? Where should scarce resources and investment dollars be spent? While a value proposition for a Procurement department is not an easy thing to produce, and is even more challenging to agree and implement, the provocation can allow a Procurement department to get back to strategy and think about how its decisions affect its stakeholders, suppliers, and the communities it does business in.

And while a Procurement department should understand its value proposition, because it helps it focus and relay its value, getting everyone in the organization to agree can be a very extensive and extremely time-consuming effort. Furthermore, there is the possibility that the “value proposition” ultimately agreed on is such a mish-mash of different viewpoints and demands that it adds absolutely no value whatsoever, just like a corporate “mission statement” when everyone gets to add their bit to it (and the end result is no different than what the Dilbert Mission Statement Generator used to generate).

However, if you look at the example questions Procurement Leaders quoted, you realize that while a vision might be a good goal, a better effort, or at least a better way to start, is to ask the C-Suite to outline its top goals for the year, and then for the Procurement organization to identify the best ways it can meet those goals. From there, it can identify which categories should be strategically sourced, which products or services are critical, and which suppliers are likely critical. Then, for each project, it can define the value and the goal, and not spend effort building relationships with suppliers of tactical products or services that can be just as easily obtained from the next three lowest-bid suppliers, instead spending time developing relationships with suppliers who are critical, even if the overall spend is low. For example, control chips in cars and power regulation systems are extremely critical and often only (capable of) being produced by a few suppliers due to highly specific requirements or proprietary natures. Compared to the costs of the steel, the transmission, the engine and/or the batteries, and even the tires, the total spend might not even register when the chips are only a couple of dollars each. But if a supplier failure, logistics delay, or raw material shortage shuts down your entire production line because you didn’t see a shortfall coming and didn’t either work with your supplier to build up an inventory or work with a backup supplier so production could be ramped up quickly, hundreds of millions of dollars in revenue could be at stake.

Furthermore, no effort should be spent “strategically” sourcing a product or category where the payback isn’t at least 3X the cost of the manpower required to do so. If an automated multi-round RFX with automated feedback or a reverse auction will get you 99% of the savings, and the last 1% won’t even pay for 3X the salary and overhead of the buyer, it’s just not worth it, especially if it prevents the organization from sourcing a lower-cost category with a 5% savings potential through better analysis and negotiation. Know the value, define the value, and only put effort in where there is real value to be gained. Otherwise, use appropriate automation or redefine categories and projects. (Definitely don’t go nuts and RFQ everything, because even the squirrels will know you’re nuts if you do. But maybe do some overarching sourcing or negotiation that you can just cut POs or one-time orders against for a year. Sometimes just negotiating for 20% off of lowest list price in a 30-day window [and carefully tracking and documenting those prices to prevent invoice overcharges] is enough to automate catalog orders.)
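
The rule of thumb in the preceding paragraph reduces to a one-line check (a sketch; the 3X multiplier comes from the text above, while the dollar figures are made-up inputs):

```python
def worth_strategic_sourcing(expected_extra_savings: float,
                             buyer_cost: float, multiple: float = 3.0) -> bool:
    """Strategically source only if incremental savings beat 3X the manpower cost."""
    return expected_extra_savings >= multiple * buyer_cost

# Hypothetical category: an auction gets 99% of a $400K opportunity; the last 1%
# is $4K, while the buyer's time on a manual event costs $15K fully loaded.
print(worth_strategic_sourcing(4_000, 15_000))    # False: automate it
print(worth_strategic_sourcing(120_000, 15_000))  # True: worth the manual effort
```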

And similar logic applies to all Procurement (related) activities. While machines can’t replace procurement professionals, they can take over the tasks where human intervention doesn’t add value. That’s the point. So think before you act, and act appropriately.