Category Archives: Decision Optimization

Coupa: Comprehensive Optimization Underlies Procurement Assurance: Coupa Supply Chain Solutions

We’ve never covered Coupa Supply Chain Solutions (for Design and Planning), formerly known as Llamasoft, here on Sourcing Innovation, but the doctor did contribute to some of the coverage over on Spend Matters, including the acquisition coverage (Functional Overview, Overlap Between Direct Procurement and Supply Chain, and Procurement, Finance, and Supply Chain Use Cases [Content Hub Subscription Required]) in 2020. Llamasoft / Coupa Supply Chain Design and Planning has also been covered more recently by Spend Matters’ Pierre Mitchell as part of his analysis of Coupa for Supply Chain Management overall. For those interested, with a Content Hub subscription, see his pieces on Can Coupa manage supply as well as spend?, Coupa’s journey from Business Spend Management to Supply Chain Management: Assessing progress on seven key dimensions, and From Spend to Supply — Coupa’s direct spend management progress.

Coupa Supply Chain Solutions consists of four main offerings:

  • Supply Chain Modeller: the core solution, which can be used offline on the desktop (Supply Chain Guru) as well as online in the cloud, and in which you build network, inventory, and transportation models for optimization and exploration through the dynamic reporting and dashboard creation module; note that the online version can process multiple “what-if” optimization models simultaneously
  • Supply Chain App Studio: the online solution which allows users to build custom interfaces to the underlying model that can be, if desired, custom designed for different user types (procurement, logistics, demand planners, etc.) and then shared with those users who can use the app for regular analysis and what-if optimization
  • Demand Modeller: for demand modelling and forecasting — not covered in this article
  • Supply Chain Prescriptions: uses machine learning and AI to identify savings opportunities from changes to transportation and inventory models, as well as to identify risk mitigation scenarios, based upon the current supply chain design

In this article we are going to primarily cover the capabilities of the Modeller / App Studio and the Prescriptions, which together form the core of their supply chain (design and planning) solution suite.

The Modeller has three primary components:

  • Model: where you build the models
  • Explore: where you build the what-if scenarios, which are then optimized
  • Results: the outputs of the what-if scenarios

Model building is quite easy. It’s simply a matter of selecting, or uploading, a set of data tables for each relevant supply chain entity. They can be pulled in from a relational database or from a CSV file in standard row-based column format. As long as the column headers have standard field names, the SCP solution can auto-detect which entity the table represents (warehouse, lane, transportation mode, etc.) and what data is provided on the entity. It understands all the common elements of supply chain modelling, their common names and representations, and the appropriate business rules, which is what enables all of the auto-mapping.

When you pull in a table and it does the mappings to the standard internal models, it also automatically analyzes and validates the data. It makes sure all entries are unique, that the key values required for the supported types of analysis are present (such as coordinates for warehouses, costs per distance for transportation modes, and stock levels and associated product requirements for inventory), and it flags any conflicting, missing, or likely erroneous data for user review and correction.
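
As an illustration of the idea only (this is not Coupa’s code; the entity signatures, field names, and validation rules below are invented), header-based entity detection and validation might look like this:

```python
# Hypothetical sketch of header-based entity detection and validation.
# Entity signatures and field names are invented for illustration.
import pandas as pd

# An entity "matches" if all of its required headers are present.
ENTITY_SIGNATURES = {
    "warehouse": {"name", "latitude", "longitude"},
    "transportation_mode": {"mode", "cost_per_distance"},
    "inventory": {"product", "location", "stock_level"},
}

def detect_entity(df):
    """Guess which supply chain entity a table represents from its headers."""
    headers = {c.strip().lower() for c in df.columns}
    for entity, required in ENTITY_SIGNATURES.items():
        if required <= headers:
            return entity
    return None  # unknown: ask the user to map the table manually

def validate_warehouses(df):
    """Flag rows with missing or out-of-range coordinates for user review."""
    return df[
        df["latitude"].isna() | df["longitude"].isna()
        | ~df["latitude"].between(-90, 90)
        | ~df["longitude"].between(-180, 180)
    ]

table = pd.read_csv("warehouses.csv")  # hypothetical upload
if detect_entity(table) == "warehouse":
    print(validate_warehouses(table))  # rows needing review and correction
```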

When you go to build a scenario, it understands what is required in the base underlying model and validates that all of the necessary data is present. If data is missing, it warns you and gives you a chance to provide the missing data. (Furthermore, as you add constraints to the scenario, the platform understands what data those constraints require and ensures that data is present as well before it tries to run the scenario.)

The application was designed for ease of use and speed, tailored to automate most of the model building process for standard network/inventory/transportation scenarios (including setting parameters and defaults) so that standard models can be built for analysis quickly and easily (and it is also quick and easy to change or override any default as needed).

Explore is where you build scenarios for what-if exploration.

Building scenarios is simple. You simply select the scenario requirements, or constraints, from a set of existing, or newly created, scenario items that define the parameters of the scenario. For example, for a network optimization, you might want to explore limiting the number of existing distribution hubs, or adding more proposed nodes, to see if you can reduce cost, carbon, or distribution time. For transportation, you might want to explore adding rail to a network that is currently all truck to see if you can decrease cost. For inventory, you might want to reduce the number of locations where safety stock for rarely used components is stored (so you can limit the number of locations with a low turn rate and minimize the warehouse size/footprint you need at those locations) and see what happens, and so on. Each scenario is built from a set of specifications that define the restrictions you want to enforce, which could even be a reduction in the current number of restrictions. These restrictions can be on any entity or relationship. One can also create scenarios to explore how the network will change under different circumstances, such as demand change, cost change, or disruption. Selecting is a simple point-and-click or drag-and-drop exercise.

Once you’ve created the scenario(s) of interest (remembering that you can optimize multiple scenarios simultaneously in the online version), you launch them by selecting the type of optimization (the “technology”), the sub-type of optimization (the “problem type”), and the horizon (the timeframe you want to analyze), and by optionally overriding the default parameters (if you don’t want to do a cost optimization but instead want to optimize carbon, service level, fulfillment time, risk, etc.). Then you run the scenario, and once the optimization engine works its math, you can explore the results.

The Model supports:

  • Network Optimization
    • Standard Network Optimization
    • … with Network Decomposition
    • … with Infeasibility Diagnosis
    • Greenfield Analysis
    • Cost-to-Serve Analysis
  • Inventory Optimization
    • Safety Stock
    • Safety Stock & Service Level
    • Safety Stock & Rolling Horizon
    • Safety Stock Infeasibility
    • Service Level
    • Rolling Horizon
    • Rolling Horizon Validation
    • Demand
  • Transportation Optimization
    • Transportation – Standard
    • Transportation – Interleaved
    • Transportation – Hub
    • Transportation – Periodic
    • Transportation – Backhaul
    • Transportation – Backhaul Matching
    • Driver Scheduling

In short, it’s a very extensive network, inventory, and transportation optimization modelling solution out of the box that makes it really easy for supply chain and procurement analysts to build scenarios and solve them against all of the traditional models (and variants) they would want to run. (And if your particular variant isn’t out of the box, the SCP team can code and add the variant into your deployment, as the underlying solution was built to allow for as many models as are needed, as well as unlimited scenarios on those models.) Note that, by default, the platform will always run the baseline scenario so you have a basis for comparison.
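
For readers who want to see what a network optimization actually looks like under the hood, here is a toy single-echelon facility location model in the same spirit, built with the open-source PuLP library on invented data (Coupa’s engine and model formats are, of course, far richer):

```python
# Toy network optimization (facility location): which warehouses to open,
# and how to serve customer demand, at minimum fixed plus shipping cost.
# All data is invented; PuLP/CBC stands in for a commercial engine.
import pulp

warehouses = ["DC_East", "DC_West"]
customers = ["C1", "C2", "C3"]
fixed_cost = {"DC_East": 100_000, "DC_West": 120_000}
ship_cost = {  # cost per unit shipped on each warehouse -> customer lane
    ("DC_East", "C1"): 2, ("DC_East", "C2"): 4, ("DC_East", "C3"): 5,
    ("DC_West", "C1"): 6, ("DC_West", "C2"): 3, ("DC_West", "C3"): 1,
}
demand = {"C1": 500, "C2": 700, "C3": 400}

prob = pulp.LpProblem("network_design", pulp.LpMinimize)
open_dc = pulp.LpVariable.dicts("open", warehouses, cat="Binary")
flow = pulp.LpVariable.dicts("flow", list(ship_cost), lowBound=0)

# Objective: fixed facility costs plus transportation costs
prob += (
    pulp.lpSum(fixed_cost[w] * open_dc[w] for w in warehouses)
    + pulp.lpSum(ship_cost[w, c] * flow[w, c] for (w, c) in ship_cost)
)
# Each customer's demand must be fully served
for c in customers:
    prob += pulp.lpSum(flow[w, c] for w in warehouses) == demand[c]
# No flow out of a closed warehouse (big-M on total demand)
for w in warehouses:
    prob += (
        pulp.lpSum(flow[w, c] for c in customers)
        <= sum(demand.values()) * open_dc[w]
    )

prob.solve()
for w in warehouses:
    print(w, "open" if open_dc[w].value() == 1 else "closed")
```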

Results, which are output in the form of results tables, can then be analyzed in table form (by selecting the output table), graph form (by accessing the graphs), map mode (by accessing the map), or as built-in or custom reports/dashboards that the analyst can create as needed. For every type of analysis in the system, SCP includes a default set of dashboards for exploring the data set, which adapt not only to the type and subtype of optimization that was run, but to the goal (objective function) as well. So if you did a cost optimization scenario, they summarize the costs. If you did a carbon optimization scenario, they summarize the carbon. If you did a service level optimization, they summarize the service level. If you did a carbon optimization relative to a maximum cost increase, they summarize the carbon and cost (and the relationship). In their platform, if you optimize one element or KPI, you can see the impact on all of the other costs and KPIs, as all of the associated data is also output for analysis.

There is an output table for all elements which can be analyzed in detail, but most users prefer graph or map view on the relevant data.

Views provide custom, tabular reports on the relevant fields of one or more tables, which can be exported to CSV or pushed to another application for planning purposes. For example, if the model was a network optimization model, you can create a view that outputs the new distribution centres and fulfilment lanes for the revised network and push that to the TMS (Transportation Management System). If it was a transportation optimization model, you can output a table that specifies the carrier and rate for each lane, or, if necessary, each lane-product combination, and push that into the TMS. If it was a safety stock optimization model, you can output the product, location, minimum stock levels, and reorder points and push that into the Inventory Management or ERP system. And so on. There are default views for cost, carbon, service level, demand, and inventory optimizations, along with drill-ins for relevant types of cost (site, production, by transportation type, etc.), but it is quite easy for a user to create a view on any table, or set of tables, with derived fields, using the view editor.

Graphs summarize the data in tables or views graphically, allowing for easy visual comparison. Select the scenario, select the data, select the graph type, and there’s your graph. They are most useful as components in dashboard summaries.

Maps provide a visual representation of the supply chain network — warehouses/distribution centers, customer locations, transportation lanes — overlaid on a real-world map with the ability to filter into particular supply chain network elements. There is a default map for the full network overview, and it can be copied and edited to just display certain elements.

Dashboards group relevant elements, such as a map of the current distribution network, a map of the optimized distribution network, a graphic summary of current distribution costs, a graphic summary of new distribution costs, and tabular (view-based) cost, carbon, and service level comparisons as the result of a supply chain network optimization scenario. These are typically custom built by the analyst to what is relevant to them.

Prescriptions, only available in the online version of Supply Chain Modeller, is based on the 22 years of experience the SCP team has in building and analyzing models, and uses advanced ML, simulation, and AI to automatically identify potential cost savings and risk reductions, presenting rank-ordered opportunities in each category, which you can drill into and explore. This solution automatically generates dozens (upon dozens) of scenarios and performs hundreds (or thousands) of analyses to automatically bring you actionable insights that you can implement TODAY to improve your network.

These savings will be grouped by type for easy exploration. For example, when it comes to cost savings, these will often be obtained by node skipping, mode switching, or volume consolidation — and the prescriptions module will summarize the prescriptions in each category, as well as summarizing the relative total savings of each category. A user can accept or reject each (sub)set of prescriptions, and then export all of the accepted prescriptions into new route definition records that can be imported into the TMS.

Note that the analysis that underlies the prescriptions is very detailed, and in addition to the prescriptions, the platform will also identify the top network factors that are impacting transportation costs, such as fleet distance, unique modes, certain carriers, country, etc.

When a user drills in, they see the complete details of the prescription, including the before and after. In the node skipping example, they will see the current distance, products, quantities, (total) weight and volume, and current rates, and then will see these in comparison to the new distance, new rates, and new costs. The old and new routes will be mapped side by side. The old and new lanes will be detailed.

The out-of-the-box network risk summary for revenue at risk is quite impressive. The platform is able to compute the overall network revenue AND network profit at risk based on single-sourced site-products, the % of flow quantity single sourced, average end-to-end service times, and impacted paths. It will then do analyses to identify potential risk mitigation improvements allowing for 5%, 10%, and 15% network change (based on how product flows through the network with the current design) and compute the corresponding change in revenue and profit at risk as a result of those changes, as well as the change in network cost. Usually the cost will increase slightly, but not always. For example, it could be that you could reduce the revenue at risk by 5% just through a supplier reallocation and network redesign, and, if you were really risk averse, it could be that a 1% increase in network cost could result in an 8% to 10% decrease in revenue, and profit, at risk. And that could be the cheapest supply chain insurance you can buy.
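
As a simplified illustration of the revenue-at-risk idea (the single-source rule and all figures below are assumptions for illustration, not Coupa’s actual risk model):

```python
# Simplified revenue-at-risk sketch: revenue flowing through single-sourced
# site-products is "at risk" if that single source fails. Data is invented.
from collections import defaultdict

flows = [
    # (product, site, supplier, annual_revenue)
    ("widget", "plant_a", "sup_1", 4_000_000),
    ("widget", "plant_b", "sup_1", 1_500_000),
    ("gadget", "plant_a", "sup_2", 2_000_000),
    ("gadget", "plant_a", "sup_3", 2_000_000),  # dual sourced
]

# Count suppliers per (product, site) to find single-sourced combinations
suppliers = defaultdict(set)
for product, site, supplier, _ in flows:
    suppliers[(product, site)].add(supplier)

revenue_at_risk = sum(
    rev for product, site, _, rev in flows
    if len(suppliers[(product, site)]) == 1
)
total_revenue = sum(rev for *_, rev in flows)
print(f"revenue at risk: {revenue_at_risk / total_revenue:.0%}")  # 58%
```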

Of course, you can drill into each model, the prescriptions, and the risk reduction with each individual change. It’s an extremely powerful tool.

Another thing that is really powerful in Coupa Supply Chain Solutions is the specific applications they can enable in the online App Studio, including the Cost-to-Serve App (just one example of the custom interfaces that can be built), which is one of the most complete dynamic dashboards for network insights that the doctor has ever seen. A summary can’t do it justice, but to whet your appetite (and to be sure you ask to see it in a demo): it has a full set of meaningful baseline KPIs, a visual network and flow summary, deep details on product costs and profitability, deep details on lanes and transportation costs, and so on. You can also quick-select a scenario to run and compare against the baseline in the app. It’s extremely well thought out.

Furthermore, you can build scripts in the App Studio to rebuild and run models on a schedule when you have a network in flux (because of disruptions, supply base changes, network changes as a result of prescriptions, etc.). And, of course, you can share these models and apps and dashboards with other analysts and democratize supply chain planning, easily enabling planners to analyze their own scenarios and make decisions collaboratively in a user-friendly App.

In short, Coupa has fulfilled the supply chain use cases we identified back in 2020 in Procurement, Finance, and Supply Chain Use Cases. It’s a great solution that you should check out, especially if you would like to have procurement and supply chain under one umbrella.

Even Forbes is Falling for the Gen-AI Garbage!

This recent article in Forbes on the Supply Chain Shift to Intelligent Technology is what inspired last week’s and this week’s rant because, while supply chains should be shifting to intelligent technology, the situations in which that is Gen-AI are still extremely rare (to the point that a blue moon is much more common). But what really got the doctor’s goat is the ridiculous claims as to what Gen-AI can do. Claims which are simultaneously maddening and saddening because, if they just left out Gen-AI, then everything they claimed is not only doable, but doable with fantastic results.

Of the first three claims, Gen-AI can only be used to solve one — and only partially.

Procurement and Regulatory Compliance
This is one example where a Closed Private Gen-AI LLM is half the battle — it can process, summarize, and highlight key areas of hundred-page texts faster and better than prior NLP tech. But it can’t tell you if your current contracts, processes, efforts, or plans will meet the requirements. Not even close. In fact, no AI can — the best AI can just indicate the presence or absence of data, processes, or tech that are most likely to be relevant, and then an intelligent human needs to make the decision, possibly only after obtaining appropriate expert Legal advice.
Manufacturing Efficiency
streamline production workflows? optimize processes? reduce errors? No, Hell No, and even the Joker wouldn’t make that joke! You want streamlining? You first have to do a deep process cycle time analysis, compare it to whatever benchmarks you can get, identify the inefficiencies, identify potential processes and tech for improvement, and implement them. Optimize processes? Detailed step by step analysis, identification of opportunities, expert process redesign, training, implementation, and monitoring. Reduce errors? No! People and tech do the processes, not Gen-AI — implement better monitoring, rules, and safeguards.
Virtual Supply Collaboration
A super-charged chatbot on steroids is NOT a virtual assistant. Now, properly sandwiched between classical AI and rules-based intelligence it can deal with 80% of routine inquiries, but not on its own, and it’s arguable if it’s even worth it when a well designed app can get the user to the info they need 10 times faster with just a couple of clicks. Supply chain communicating? People HATE getting a “robot” on a support line as much as you do, to the point some of us start screaming profanities at it if we don’t get a real operator within 10 seconds. Based on this, do you really think your supplier wants to talk to a dumb bot that has NO authority to make a decision (or, at least, should NEVER have the authority — though the doctor is sure someone’s going to be dumb enough to give the bot the authority … let’s just hope they can live with the inevitable consequences)?

And maybe if the article had stopped there the doctor would let it pass, but, first of all, it went on to state the following for “AI”, without clarifying that Gen-AI doesn’t fit in the process, leading us to conclude that, since the first part of the article is about Gen-AI, this part is too, and thus that it is totally wrong when it claims that:

“AI” understands dirty data
with about 70% accuracy where it counts IF you’re lucky; that’s about how accurate it is at identifying a supplier from your ERP/AP transaction records; an admin assistant will get about 98% accuracy by comparison
it can “confirm” inventories
all it can do is regurgitate what’s in the inventory system — that’s not confirmation!
it can identify duplicate materials
first it has to identify two records that are actually duplicates;
and how likely do you think this is with a supplier mapping accuracy of 70%?
it can identify materials to be shared among facilities
well, okay, it can identify materials that are used across facilities and could be located in a central location — but how useful is that? it’s not because, first of all, YOU ALREADY KNOW THIS, and, second, IT CAN’T DO SUPPLY CHAIN OPTIMIZATION — THAT’S WHAT A SUPPLY CHAIN OPTIMIZATION SOLUTION IS FOR! OPTIMIZATION!!! We’ll break it down syllabically for you so you know what to ask for. OP – TUH – MY – ZAY – SHUN!
it can recommend ideal storage locations
again, NO! This requires solving a very sophisticated optimization model it doesn’t have the data for, doesn’t know how to build, and definitely doesn’t know how to solve.
it can revamp outdated stocking policies
well, only the solution of a proper Inventory OPTIMIZATION Model that identifies the appropriate locations and safety stock levels can identify how these should be revamped
it can recommend order patterns by consumption and lead time
that’s classical curve fitting and trend projection

And, secondly, as the doctor just explained, most of what they were saying AI could do CAN’T be done with AI, and instead can only be done with analytics, optimization, and advanced mathematical models! (You know, the advanced tech (that works) that you’ve been ignoring for over two decades!)

The Gen-AI garbage is getting out of control. It’s time to stop putting up with it and start pushing back against any provider who’s trying to sell you this miracle cure silicon snake oil and show them the door. There are real solutions that work, and have worked, for two decades that will revolutionize your supply chain. You don’t need false promises and tech that isn’t ready for prime time.

Somedays the doctor just wishes he was the Scarecrow. Only someone without a brain can deal with this constant level of Gen-AI bullsh!t and not be stressed about the deluge of misinformation being spread on a daily basis! But then again, without a brain, he might be fooled by the slick salespeople that Gen-AI could give him one, instead of remembering the wise words of the True Scarecrow.

You Don’t Need Gen-AI to Revolutionize Procurement and Supply Chain Management — Classic Analytics, Optimization, and Machine Learning that You Have Been Ignoring for Two Decades Will Do Just Fine!

Open Gen-AI technology may be about as reliable as a career politician managing your Nigerian bank account, but somehow it’s won the PR war (since there is no longer any requirement to speak the truth or state actual facts in sales and marketing in most “first” world countries [where they believe Alternative Math is a real thing … and that’s why they can’t balance their budgets, FYI]) as every Big X is pushing Open Gen-AI as the greatest revolution in technology since the abacus. the doctor shouldn’t be surprised, given that most of the turkeys on their rafters can’t even do basic math* (yet profess to deeply understand this technology) and thus believe the hype (and downplay the serious risks, which we summarized in this article, where we didn’t even mention the quality of the results when you unexpectedly get a result that doesn’t exhibit any of the six major issues).

The Power of Real Spend Analysis

If you have a real Spend Analysis tool, like Spendata (The Spend Analysis Power Tool), simple data exploration will find you a 10% or more savings opportunity in just a few days (well, maybe a few weeks, but that’s still just a matter of days). It’s one of only two technologies that has been demonstrated, when properly deployed and used, to identify returns of 10% or more, year after year after year, since the mid 2000s (when the technology wasn’t nearly as good as it is today), and it can be used by any Procurement or Finance Analyst that has a basic understanding of their data.

When you have a tool that will let you analyze data around any dimension of interest — supplier, category, product — restrict it to any subset of interest — timeframe, geographic location, off-contract spend — and roll-up, compare against, and drill down by variance — the opportunities you will find will be considerable. Even in the best sourced top spend categories, you’ll usually find 2% to 3%, in the mid-spend likely 5% or more, in the tail, likely 15% or more … and that’s before you identify unexpected opportunities by division (who aren’t adhering to the new contracts), geography (where a new local supplier can slash transportation costs), product line (where subtle shifts in pricing — and yes, real spend analysis can also handle sales and pricing data — lead to unexpected sales increases and greater savings when you bump your orders to the next discount level), and even in warranty costs (when you identify that a certain supplier location is continually delivering low quality goods compared to its peers).
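
To make the roll-up / drill-down pattern concrete, here is a minimal sketch in pandas over a hypothetical transaction extract (the file name, all column names, and the boolean on_contract flag are assumptions, not any particular tool’s schema):

```python
# Minimal roll-up / drill-down sketch over a hypothetical AP extract.
import pandas as pd

spend = pd.read_csv("ap_transactions.csv")  # hypothetical extract

# Roll up spend by category and year, then compute year-over-year variance
rollup = spend.pivot_table(
    index="category", columns="year", values="amount", aggfunc="sum"
).fillna(0)
rollup["yoy_variance"] = rollup[2024] - rollup[2023]  # assumed year columns

# Drill down: off-contract spend in the category with the worst variance
worst = rollup["yoy_variance"].idxmax()
off_contract = spend[(spend["category"] == worst) & ~spend["on_contract"]]
print(off_contract.groupby("supplier")["amount"].sum().nlargest(10))
```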

And that’s just the Procurement spend … it can also handle the supply chain spend, logistics spend, warranty spend, utility and HR spend — and while you can’t control the HR spend, you can get a handle on your average cost by position by location and possibly restructure your hubs during expansion time to where resources are lower cost! Savings, savings, savings … you’ll find them ’round the clock … savings, savings, savings … analytics rocks!

The Power of Strategic Sourcing Decision Optimization

Decision optimization has been around in the Procurement space for almost 25 years, but it still has less than 10% penetration! This is utterly abysmal. It’s not only the only other technology that has been generating returns of 10% or more, in good times and bad, for any leading organization that consistently uses it, but the only technology that the doctor has seen consistently generate 20% to 30% savings opportunities on large multi-national complex categories that just can’t be solved with an RFQ and a spreadsheet, no matter how hard you try. (But a Big X will still claim they can, with the old college try, if you pay their top analyst’s salary for a few months … and at 5K a day, there goes three times any savings they identify.)

Examples where the doctor has repeatedly seen stellar results include:

  • national service provider contract optimization across national, regional, and local providers where rates, expected utilization, and all-in costs for remote resources are considered. With just an RFX solution, the usual approach is to go to all the relevant Big X Bodyshops and get their rate cards by role by location, with a base rate (expenses picked up by the org) and an all-in rate; calculate the expected local overhead rate by location; then, for each Big X – role – location, determine whether the Big X all-in rate or the Big X base rate plus their overhead is cheaper and select that as the final bid for analysis (see the sketch after this list); then mark the lowest bid for each role-location and determine the three top providers; then distribute the award between the three “top” providers in the lowest cost fashion; and, in big companies using a lot of contract labour, leave millions on the table because 1) sometimes the cheapest three will actually be the providers with the middle-of-the-road bids across the board and 2) for some areas/roles, regional, and definitely local, providers will often be cheaper — but since the complexity is beyond manageable, this isn’t done, even though the doctor has seen multiple real-world events generate 30% to 40% savings, since optimization can handle hundreds of suppliers and tens of thousands of bids and find the perfect mix (even while limiting the number of global providers and the number of providers who can service a location)
  • global mailer / catalog production —
    paper won’t go away, and when you have to balance inks, papers, printing, distribution, and mailing, it’s not always local, or one country in a region, that minimizes costs; it’s a very complex sourcing AND logistics distribution problem that optimizes costs … and the real-world model gets dizzying fast unless you use optimization, which will find 10% or more in savings beyond your current best efforts
  • build-to-order assembly — don’t just leave that to the contract manufacturer, when you can simultaneously analyze the entire BoM and supply chain, which can easily dwarf the above two models if you have 50 or more items, as savings will just appear when you do so
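
And here is the manual “final bid” computation described in the first bullet above, sketched with invented figures. Note that it stops exactly where the hard part begins: the cross role-location award interactions are what the optimization model handles and the manual process cannot:

```python
# Manual "final bid" computation: for each provider-role-location, take the
# cheaper of the all-in rate and the base rate plus local overhead.
# All figures (and the overhead factors) are invented for illustration.
overhead = {"nyc": 1.35, "des_moines": 1.18}  # assumed local overhead factors

bids = [
    # (provider, role, location, base_rate, all_in_rate)
    ("BigX_1", "developer", "nyc", 900, 1300),
    ("BigX_2", "developer", "nyc", 950, 1200),
    ("Local_1", "developer", "des_moines", 700, 800),
]

final_bids = {
    (provider, role, loc): min(all_in, base * overhead[loc])
    for provider, role, loc, base, all_in in bids
}

# Lowest "final bid" per role-location; picking the three cheapest providers
# this way ignores cross role/location interactions, which is exactly what a
# decision optimization model accounts for.
for key, rate in sorted(final_bids.items(), key=lambda kv: kv[1]):
    print(key, round(rate, 2))
```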

… and yet, because it’s “math”, it doesn’t get used, even though you don’t have to do the math — the platform does!

Curve Fitting Trend Analysis

Dozens (and dozens) of “AI” models have been developed over the past few years to provide you with “predictive” forecasts, insights, and analytics, but guess what? Not a SINGLE model has outdone classical curve-fitting trend analysis — and NOT a single model ever will. (This is because all these fancy-schmancy black box solutions do is attempt to identify the record/transaction “fingerprint” that contains the most relevant data and then attempt to identify the “curve” or “line” to fit it to, all at once, which means the upper bound is a classical model that uses the right data and fits to the right curve from the beginning, without wasting an entire power plant’s worth of energy powering entire data centers as the algorithm repeatedly guesses random fingerprints and models until one seems to work well.)

And the reality is that these standard techniques (which have been refined since the 60s and 70s), which now run blindingly fast on large data sets thanks to today’s computing, can achieve 95% to 98% accuracy in some domains, with no misfires. A 95% accurate forecast on inventory, sales, etc. is pretty damn good and minimizes the buffer stock, and lead time, you need. Detailed, fine tuned, correlation analysis can accurately predict the impact of sales and industry events. And so on.
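
For the record, classical curve-fitting trend projection is this simple (a linear fit on an invented demand history):

```python
# Classical curve-fitting trend projection: fit a low-order polynomial to a
# demand history and project it forward. Data is synthetic.
import numpy as np

months = np.arange(24)  # two years of monthly history
demand = 1000 + 15 * months + np.random.default_rng(7).normal(0, 30, 24)

coeffs = np.polyfit(months, demand, deg=1)  # fit a linear trend
trend = np.poly1d(coeffs)

future = np.arange(24, 30)  # project six months ahead
print(np.round(trend(future)))
```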

Going one step further, there exists a host of clustering techniques that can identify emergent trends in outlier behaviour as well as pockets of customers or demand. And so on. But chances are you aren’t using any of these techniques.
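
One such technique, sketched here with scikit-learn’s DBSCAN on synthetic data (just one of many possible choices): density-based clustering separates pockets of similar demand from the outliers, which come back with the label -1:

```python
# DBSCAN separates "pockets" of similar orders from outliers (label -1).
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(7)
# Two demand pockets plus a couple of outliers (order size vs. frequency)
pocket_a = rng.normal([100, 4], [10, 0.5], (50, 2))
pocket_b = rng.normal([500, 1], [40, 0.2], (50, 2))
outliers = np.array([[950.0, 9.0], [20.0, 12.0]])
points = np.vstack([pocket_a, pocket_b, outliers])

labels = DBSCAN(eps=30, min_samples=5).fit_predict(points)
print("clusters found:", set(labels) - {-1})
print("outliers:", points[labels == -1])
```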

So given that most of you haven’t adopted any of this technology that has proven to be reliable, effective, and extremely valuable, why on earth would you want to adopt an unproven technology that hallucinates daily, might tell off your sensitive employees with hate speech, and might even leak your data? It makes ZERO sense!

While we admit that someday semi-private LLMs will be an appropriate solution for certain areas of your business where large amounts of textual analysis are required on a regular basis, even these are still iffy today and can’t always be trusted. And the doctor doesn’t care how slick that chatbot is, because if you have to spend days learning how to expertly craft a prompt just to get a single result, you might as well just learn to code and use a classic open source Neural Net library — you’ll get better, more reliable results, faster.

Keep an eye on the tech if you like, but nothing stops you from using the tech that works. Let your peers be the test pilots. You really don’t want to be in the cockpit when it crashes.

* And if you don’t understand why a deep understanding of university level mathematics, preferably at the graduate level, is important, then you shouldn’t be touching the turkey who touches the Gen-AI solution with a 10-foot pole!

The Power of Optimization-Backed Sourcing is in the Right Sourcing Mix Across Scales of Size and Service

the doctor has been pushing optimization-backed sourcing since Sourcing Innovation started in 2006. There are a number of reasons for this:

  • there is only one other technology that has repeatedly demonstrated savings of 10% or more
  • it’s the only technology that can accurately model total cost of ownership with complex cost discounts and structures
  • it’s the only technology that can minimize costs while adhering to carbon, risk, or other requirements
  • it’s one of only two technologies that can analyze cost / risk, cost / carbon, or other cost / x tradeoffs accurately

However, the real power of optimization-backed sourcing is how it can not only give you the right product mix, but the right mix across scales. This is especially relevant when doing sourcing events for national or international distribution or utilization. Without optimization, most companies can only deal with suppliers who can handle international distribution or utilization. This generally rules out regional suppliers and always rules out local suppliers, some of whom might be the best suppliers of goods or services to the region or locality. While one may be tempted to think local suppliers are irrelevant because they will struggle to deliver the economy of scale of a regional supplier and will definitely never reach the economy of scale of a national (or international) supplier, unit cost is just one component of the total lifecycle cost of a product or service. There’s transportation cost, tariffs, taxes, intermediate storage, and final storage (of which more will be required, since you will need to make larger orders to account for longer distribution timelines), among other costs. So, in some instances, local and regional will be the overall lowest cost, and keeping them out of the mix increases costs (and sometimes increases carbon and risk as well), as the quick comparison sketched below illustrates.
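
Here is a toy landed-cost comparison across scales (every number is invented for illustration) showing why the lowest unit cost doesn’t always win:

```python
# Toy landed-cost comparison: lowest unit cost is not lowest total cost.
def landed_cost(unit, transport, tariff_pct, storage):
    """Total lifecycle cost per unit: price + freight + tariff + storage."""
    return unit + transport + unit * tariff_pct + storage

national = landed_cost(unit=10.00, transport=1.80, tariff_pct=0.05,
                       storage=0.90)  # bigger orders, more buffer storage
local = landed_cost(unit=11.20, transport=0.30, tariff_pct=0.00,
                    storage=0.25)    # higher unit price, short supply line

print(f"national: {national:.2f}/unit, local: {local:.2f}/unit")
# national: 13.20/unit, local: 11.75/unit -> local wins despite unit price
```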

When it comes to services, the right multi-level mix can lead to savings of 30% or more in an initial event. the doctor has seen this many times over his career (consulting for many of the strategic sourcing decision optimization startups) because, while the big international players can get competitive on hourly rates where they have a lot of resources with a skill set, when it comes to services, there are all-in costs to consider, which include travel to the client site and local accommodations. The thing with national and international services providers is that they tend to cluster all of their resources with a certain skill set in a handful of major locations. So their core IT resources (developers, architects, DBAs, etc.) will be in San Francisco and New York, their core Management consultants will be in Chicago and Atlanta, their core Finance Pros in Salt Lake City and Denver, etc. So if you need IT in Jefferson City, Missouri, Management in Winner, South Dakota, or accounting in Des Moines, Iowa, you’re flying someone in, putting them up at the highest star hotel you have, and possibly doubling the cost compared to a standard day rate.

However, simple product mix and services scenarios are not the only scenarios optimization-backed sourcing can handle. As per this article over on IndianRetailer.com, retailers need to back away from global sourcing and embrace regional (and even local) strategies for cost management, supply stability, and resilience. They are only going to be able to figure that out with optimization that can help them identify the right mix to balance cost and supply assurance, and when you need to do that across hundreds, if not thousands, of products, you can’t do that with an RFX solution and Microsoft Excel.

Furthermore, when you need to minimize costs when a price is fixed, like the price of oil or airline fuel, you need to optimize every related decision, like where to refuel, what service providers to contract with, how to transport it, etc. When it can cost up to $40,000 to fuel a 737 for a single flight (when prices are high), and you operate almost 7,000 flights per day with planes ranging from a Gulfstream that costs about $10,000 to refuel to a Boeing 747 that, in hard times, can cost almost $400,000 to refuel, you can be spending $60 Million a day on fuel as your fleet burns 10 Million gallons. Storing those 10 Million gallons, transporting those 10 Million gallons, and using that fuel to fuel 7,000 planes takes a lot of manpower and equipment, all of which has an associated cost. Hundreds of thousands of dollars in associated costs per day (on the low end), and tens of millions per year. Shaving off just 3% would save over a million dollars easy. (Maybe two million.) However, the complexity of this logistics and distribution model is beyond what any sourcing professional can handle with traditional tools, but easy with an optimization-backed platform that can model an entire flight schedule, all of the local costs for storage and fueling, all of the distribution costs from the fuel depots, and so on. (This is something that Coupa is currently supporting with its CSO solution, which has saved at least one airline millions of dollars. Reach out to Ian Milligan for more information if this intrigues you or how this model could be generalized to support global fleet management operations of any kind.)
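
A quick back-of-the-envelope check on those associated-cost numbers (the per-day figure is an assumed low-end value consistent with the text):

```python
# Back-of-the-envelope check of the fuel logistics arithmetic above.
associated_cost_per_day = 100_000  # assumed low-end handling/logistics cost
associated_cost_per_year = associated_cost_per_day * 365  # $36.5M: "tens of millions"
savings_at_3_pct = associated_cost_per_year * 0.03        # ~$1.1M: "over a million"

print(f"associated costs/year: ${associated_cost_per_year:,.0f}")
print(f"3% savings: ${savings_at_3_pct:,.0f}")
```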

In other words, Optimization-Backed Sourcing is going to become critical in your highly strategic / high spend categories as costs continue to rise, supply continues to be uncertain, carbon needs to be accounted for, and risks need to be managed.

COUPA: Centralized Optimization Underlies Procurement Adoption …

… or at least that’s what it SHOULD stand for. Why? Well, besides the fact that optimization is only one of two advanced sourcing & procurement technologies that have proven to deliver year-over-year cost avoidance (“savings”) of 10% or more (which becomes critical in an inflationary economy because while there are no more savings, negating the need for a 10% increase still allows your organization to maintain costs and outperform your competitors), it’s the only technology that can meet today’s sourcing needs!

COVID finally proved what the doctor and a select few other leading analysts and visionaries have been telling you for over a decade — that your supply chain was overextended and fraught with unnecessary risk and cost (and carbon), and that you needed to start near-sourcing/home-sourcing as soon as possible in order to mitigate risk. Plus, it’s also extremely difficult to comply with human rights acts (which mandate no forced or slave labour in the supply chain), such as the UK Modern Slavery Act, California Supply Chains Act, and the German Supply Chain Act if your supply chain is spread globally and has too many (unnecessary) tiers. (And, to top it off, now you have to track and manage your scope 1, 2, and 3 carbon in a supply chain you can barely manage.)

And, guess what, you can’t solve these problems just with:

  • supplier onboarding tools — you can’t just say “no China suppliers” when you’ve never used suppliers outside of China, the suppliers you have vetted can’t be counted on to deliver 100% of the inventory you need, or they are all clustered in the same province/state in one country
  • third party risk management — and just eliminate any supplier which has a risk score above a threshold, because sometimes that will eliminate all, or all but one, supplier
  • third party carbon calculators — because they are usually based on third party carbon emission data provided by research institutions that simply produce averages for a region / category of products (and might overestimate or underestimate the carbon produced by the entire supply base)
  • or even all three … because you will have to migrate out of China slowly, accept some risk, and work on reducing carbon over time

You can only solve these problems if you can balance all forms of risk vs cost vs carbon. And there’s only one tool that can do this: Strategic Sourcing Decision Optimization (SSDO), and when it comes to this, Coupa has the most powerful platform. Built on TESS 6 — Trade Extensions Strategic Sourcing — which Coupa acquired in 2017, the Coupa Sourcing Optimization (CSO) platform is one of the few platforms in the world that can do this. Plus, it can be pre-configured out-of-the-box for your sourcing professionals with all of the required capabilities and data already integrated*. And it may be alone from this perspective (as the other leading optimization solutions are either integrated with smaller platforms or with platforms with fewer partners). (*The purchase of additional services from Coupa or Partners may be required.)

So why is it one of the few platforms that can do this? We’ll get to that, but first we have to cover what the platform does, and more specifically, what’s new since our last major coverage in 2016 on SI (and in 2018 and 2019 on Spend Matters, where the doctor was part of the entire SM Analyst team that created the 3-part in-depth Coupa review, but, as previously noted, the site migration dropped co-authors for many articles).

As per previous articles over the past fifteen years, you already know that:

So now all we have to focus on are the recent improvements around:

  • “smart scenarios” that can be templated and cross-linked from integrated scenario-aware help-guides
  • “Plain English” constraint creation (that allows average buyers & executives to create advanced scenarios)
  • fact-sheet auto-generation from spreadsheets, API calls, and other third-party data sources;
    including data identification, formula derivation and auto-validation pre-import
  • bid insights
  • risk-aware functionality

“Smart Events”

Optimization events can be created from event templates that can themselves be created from completed events. A template can be populated with as little, or as much, as the user wants … all the way from simply defining an RFX survey, fact sheet, and baseline scenario to a complete copy of the event with “last bid” pricing and definitions of every single scenario created by the buyer. Also, templates can be edited at any time and can define specific baseline pricing: the last price paid by procurement, the last price in a pre-defined fact sheet that can sit above the event, and so on. The same goes for supplier lists: fixed supplier lists, all qualified suppliers that supply a product, all qualified suppliers in an area, no suppliers (where the user pulls from recommended suppliers), and so on. In addition to predefining a suite of scenarios that can be run once all the data is populated, the buyer can also define a suite of default reports to be run, and even emailed out, upon scenario completion. This is in addition to workflow automation that can step the buyer through the RFX and auto-respond to suppliers when responses are incomplete or not acceptable, or when uploaded spreadsheets or documents have hacked/cracked security, and so on. The Coupa philosophy is that optimization-backed events should be as easy as any other event in the system, and the system can be configured so they literally are.

Also, as indicated above, the help guides are smart. When you select a help article on how to do something, it takes you to the right place on the right screen while keeping you in the event. Some products have help guides that are pretty dumb and just take you to the main screen, not to the right field on the right sub-screen, if they even link into the product at all!

“Plain English” Constraint Creation

Even though the vast majority of constraints, mathematically, fall into three/four primary categories — capacity/allocation, risk mitigation, and qualitative — that isn’t obvious to the average buyer without an optimization, analytical, or mathematical background. So Coupa has spent a lot of time working with buyers, asking them what they want, listening to their answers and the terminology they use, and has created over 100 “plain English” constraint templates that break down into 10 primary categories (allocation, costs, discount, incumbent, numeric limitations, post-processing, redefinition, reject, scenario reference, and collection sheets), as well as a subset of the most commonly used constraints gathered into a “common constraints” collection. For example, the Allocation category allows for definition “by selection sheet”, “volume”, “alternative cost”, “bid priority”, “fixed divisions”, “favoured/penalized bids”, “incumbent allocations maintained”, etc. Then, when a buyer selects a constraint type, such as “divide allocations”, they will be asked to define the method (%, fixed amount), the division by (supplier, group, geography), and any other conditions (low risk suppliers if by geography). The definition forms are also smart and respond appropriately to each sequential choice.
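
To make that concrete, here is a hedged sketch of the mathematics a plain-English allocation constraint like “divide allocations: no supplier above 40% of volume” translates to (using the open-source PuLP library and invented data; the template names and internals are not Coupa’s):

```python
# What a "divide allocations" constraint means mathematically: a volume cap
# per supplier inside a cost-minimizing award model. Data is invented.
import pulp

suppliers = ["S1", "S2", "S3"]
demand = 10_000
price = {"S1": 9.50, "S2": 9.80, "S3": 10.10}

prob = pulp.LpProblem("award", pulp.LpMinimize)
alloc = pulp.LpVariable.dicts("alloc", suppliers, lowBound=0)
prob += pulp.lpSum(price[s] * alloc[s] for s in suppliers)  # minimize spend
prob += pulp.lpSum(alloc[s] for s in suppliers) == demand   # full award

# "Divide allocations: by volume, no supplier above 40%"
for s in suppliers:
    prob += alloc[s] <= 0.40 * demand

prob.solve()
print({s: alloc[s].value() for s in suppliers})  # S1/S2 capped, S3 remainder
```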

Fantastic Fact Sheets

Fact Sheets can be auto-generated from uploaded spreadsheets, as the platform will automatically detect the data elements (columns), their types (text, math, fixed response set, calculation), their mappings to internal system / RFX elements, and the records, as well as detect when rows/values are invalid and allow the user to determine what to do when invalid rows/values are found. Also, if a match is not high certainty, the fact sheet processor will indicate that the user needs to define it manually, and the user can, of course, override all of the default mappings, and even choose to load only part of the data. These spreadsheets can live in an event, or live above the event and be used by multiple events (so that company defined currency conversions, freight quotes for the month, standard warehouse costs, etc. can be used across events).

But even better, Fact Sheets can be configured to automatically pull data in from other modules in the Coupa suite and from APIs the customer has access to, which will pull in up to date information every time they are instantiated.

Bid Insights

Coupa is a big company with a lot of customers and a lot of data. A LOT of data! Not only in terms of the prices its customers are paying in their procurement of products and services, but in terms of what suppliers are bidding. This provides huge insight into current market pricing in commonly sourced categories, including, and especially, Freight! Starting with freight, Coupa is rolling out new bid pricing insights where a user can select the source, the destination, the type (frozen/wet/dry/etc.), and the size, and get the quote range over the past month/quarter/year (e.g. for ocean freight, select the source and destination country and the container size/type combo, as the mode defaults to container).
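
Under the hood, such an insight is essentially a filtered percentile query over historical bids. A minimal sketch over a hypothetical bid table (the file name, column names, date handling, and percentile choices are all assumptions, not Coupa’s schema):

```python
# Quote-range lookup over a hypothetical historical freight bid table.
import pandas as pd

bids = pd.read_csv("freight_bids.csv", parse_dates=["bid_date"])

lane = bids[
    (bids["origin_country"] == "CN")
    & (bids["dest_country"] == "US")
    & (bids["container"] == "40HC")
    & (bids["bid_date"] >= "2024-01-01")  # e.g. restrict to the past quarter
]
print(lane["rate_usd"].quantile([0.10, 0.50, 0.90]))  # the quote range
```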

Risk Aware Functionality

The Coupa approach to risk is that you should be risk-aware (to the extent the platform can make you risk aware) with every step you take, so risk data is available across the platform — and all of that risk data can be integrated into an optimization project and scenarios to reject, limit, or balance any risk of interest in the award recommendations.

And when you combine the new capabilities for

  • “smart” events
  • API-enabled fact sheets
  • risk-aware functionality

that’s how Coupa is the first platform that literally can, with some configuration and API integration, allow you to balance third party risk, carbon, and cost simultaneously in your sourcing events — which is where you HAVE to manage risk, carbon, and cost if you want to have any impact at all on your indirect risk, carbon, and cost.

It’s not just 80% of cost that is locked in during design, it’s 80% of risk and carbon as well! And in indirect, you can’t do much about that. You can only do something about the next 20% of cost, risk and carbon that is locked in when you cut the contract. (And then, if you’re sourcing direct, before you finalize a design, you can run some optimization scenarios across design alternatives to gauge relative cost, carbon, and risk, and then select the best design for future sourcing.) So by allowing you to bring in all of the relevant data, you can finally get a grip on the risk and carbon associated with a potential award and balance appropriately.

In other words, this is the year for Optimization to take center stage in Coupa and power the entire Source-to-Contract process. No other solution can balance these competing objectives. Thus, after 25 years, the time for sourcing optimization, which is still the best kept secret (and most powerful technology) in S2P, has finally come! (And it just might be the reason that more users in an organization adopt Coupa.)