Category Archives: Decision Optimization

You Don’t Need Gen-AI to Revolutionize Procurement and Supply Chain Management — Classic Analytics, Optimization, and Machine Learning that You Have Been Ignoring for Two Decades Will Do Just Fine!

Open Gen-AI technology may be about as reliable as a career politician managing your Nigerian bank account, but somehow it’s won the PR war (since there is no longer any requirement to speak the truth or state actual facts in sales and marketing in most “first” world countries [where they believe Alternative Math is a real thing … and that’s why they can’t balance their budgets, FYI]) as every Big X, Mid-Sized Consultancy, and the majority of software vendors are pushing Open Gen-AI as the greatest revolution in technology since the abacus. the doctor shouldn’t be surprised, given that most of the turkeys on their rafters can’t even do basic math* (yet profess to deeply understand this technology) and thus believe the hype (and downplay the serious risks, which we summarized in this article, where we didn’t even mention the quality of the results when you unexpectedly get a result that doesn’t exhibit any of the six major issues).

The Power of Real Spend Analysis

If you have a real Spend Analysis tool, like Spendata (The Spend Analysis Power Tool), simple data exploration will find you a 10% or more savings opportunity in just a few days (well, maybe a few weeks, but that’s still just a matter of days). It’s one of only two technologies that has been demonstrated, when properly deployed and used, to identify returns of 10% or more, year after year after year, since the mid-2000s (when the technology wasn’t nearly as good as it is today), and it can be used by any Procurement or Finance Analyst who has a basic understanding of their data.

When you have a tool that will let you analyze data around any dimension of interest — supplier, category, product — restrict it to any subset of interest — timeframe, geographic location, off-contract spend — and roll-up, compare against, and drill down by variance — the opportunities you will find will be considerable. Even in the best-sourced top spend categories, you’ll usually find 2% to 3%, in the mid-spend likely 5% or more, and in the tail likely 15% or more … and that’s before you identify unexpected opportunities by division (which isn’t adhering to the new contracts), geography (where a new local supplier can slash transportation costs), product line (where subtle shifts in pricing — and yes, real spend analysis can also handle sales and pricing data — lead to unexpected sales increases and greater savings when you bump your orders to the next discount level), and even in warranty costs (when you identify that a certain supplier location is continually delivering low quality goods compared to its peers).
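
To make “roll-up, compare, and drill down by variance” concrete, here’s a minimal, hypothetical sketch in Python/pandas (invented column names and numbers, not any particular tool’s schema) of the kind of slicing a real spend analysis tool automates and makes interactive:

```python
import pandas as pd

# Hypothetical spend cube: one row per invoice line (illustrative columns and values only)
spend = pd.DataFrame({
    "supplier": ["Acme", "Acme", "Bolt Co", "Bolt Co", "Acme", "Bolt Co"],
    "category": ["Fasteners", "Fasteners", "Fasteners", "Logistics", "Logistics", "Fasteners"],
    "region":   ["NA", "EU", "NA", "NA", "EU", "EU"],
    "year":     [2023, 2023, 2023, 2024, 2024, 2024],
    "amount":   [120_000, 80_000, 95_000, 40_000, 60_000, 110_000],
})

# Roll up spend by any dimensions of interest ...
by_cat_supplier = spend.pivot_table(
    index=["category", "supplier"], columns="year", values="amount",
    aggfunc="sum", fill_value=0,
)

# ... then compare periods and drill down by variance to surface opportunities
by_cat_supplier["variance"] = by_cat_supplier[2024] - by_cat_supplier[2023]
print(by_cat_supplier.sort_values("variance", ascending=False))
```

A real tool does this interactively, across far more dimensions and millions of rows, but the underlying operations are exactly these roll-ups, comparisons, and variance drill-downs.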

And that’s just the Procurement spend … it can also handle the supply chain spend, logistics spend, warranty spend, utility and HR spend — and while you can’t control the HR spend, you can get a handle on your average cost by position by location and possibly restructure your hubs during expansion to locations where resources are lower cost! Savings, savings, savings … you’ll find them ’round the clock … savings, savings, savings … analytics rocks!

The Power of Strategic Sourcing Decision Optimization

Decision optimization has been around in the Procurement space for almost 25 years, but it still has less than 10% penetration! This is utterly abysmal. It’s not only the only other technology that has been generating returns of 10% or more, in good times and bad, for any leading organization that consistently uses it, but the only technology that the doctor has seen consistently generate 20% to 30% savings opportunities on large, complex, multi-national categories that just can’t be solved with an RFQ and a spreadsheet, no matter how hard you try. (Though an expert consultant will still claim they can, given the old college try, if you pay their top analyst’s salary for a few months … and at, say, $5K a day, there goes three times any savings they identify.)

Examples where the doctor has repeatedly seen stellar results include:

  • national service provider contract optimization across national, regional, and local providers where rates, expected utilization, and all-in costs for remote resources are considered. With just an RFX solution, the usual approach is to go to all the relevant Big X and Mid-Sized Bodyshops and get their rate cards by role by location, both base rate (with expenses picked up by the org) and all-in rate; calculate the expected local overhead rate by location; then, for each provider-role-location combination, determine whether the all-in rate or the base rate plus overhead is cheaper and select that as the final bid for analysis; then mark the lowest bid for each role-location and determine the three top providers; then distribute the award between those three “top” providers in the lowest cost fashion; and, in big companies using a lot of contract labour, leave millions on the table because 1) sometimes the cheapest three overall will actually be the providers with middle-of-the-road bids across the board and 2) for some areas and roles, regional, and definitely local, providers will often be cheaper — but since the complexity is beyond manageable by hand, this isn’t done, even though the doctor has seen multiple real-world events generate 30% to 40% savings, since optimization can handle hundreds of suppliers and tens of thousands of bids and find the perfect mix (even while limiting the number of global providers and the number of providers who can service a location); a simplified model of this kind of event is sketched below
  • global mailer / catalog production —
    paper won’t go away, and when you have to balance inks, papers, printing, distribution, and mailing, it’s not always a local provider, or one country in a region, that minimizes costs; it’s a very complex combined sourcing AND logistics distribution problem that must be optimized … and the real-world model gets dizzying fast unless you use optimization, which will find 10% or more in savings beyond your current best efforts
  • build-to-order assembly — don’t just leave that to the contract manufacturer when you can simultaneously analyze the entire BoM and supply chain; the model can easily dwarf the two above if you have 50 or more items, and savings will just appear when you do so

… and yet, because it’s “math”, it doesn’t get used, even though you don’t have to do the math — the platform does!
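
To see why the spreadsheet approach above leaves money on the table, here’s a deliberately tiny, hypothetical sketch (Python with the open-source PuLP library; all providers, rates, and rules are invented) of an award model that covers every role-location at lowest cost while capping the number of providers used:

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

# Hypothetical all-in day rates by (provider, role, location) -- invented numbers
bids = {
    ("BigX1", "DBA", "Austin"): 1450, ("BigX2", "DBA", "Austin"): 1390,
    ("Local1", "DBA", "Austin"): 1150, ("BigX1", "Dev", "Denver"): 1200,
    ("BigX2", "Dev", "Denver"): 1180, ("Local2", "Dev", "Denver"): 990,
}
demand = {("DBA", "Austin"): 200, ("Dev", "Denver"): 400}   # days needed per role-location
providers = sorted({p for p, _, _ in bids})
max_providers = 2                                            # business rule

m = LpProblem("labour_award", LpMinimize)
award = {k: LpVariable(f"award_{'_'.join(k)}", lowBound=0) for k in bids}   # days awarded per bid
use = {p: LpVariable(f"use_{p}", cat=LpBinary) for p in providers}          # is a provider used at all?

m += lpSum(bids[k] * award[k] for k in bids)                 # minimize total cost
for rl, days in demand.items():
    m += lpSum(award[k] for k in bids if k[1:] == rl) == days    # cover every role-location in full
for k in bids:
    m += award[k] <= demand[k[1:]] * use[k[0]]               # can't award days to an unused provider
m += lpSum(use.values()) <= max_providers                    # limit the size of the supply base

m.solve(PULP_CBC_CMD(msg=False))
for k, v in award.items():
    if v.value():
        print(k, v.value())
```

A real event scales this to hundreds of providers, thousands of role-location cells, and rules on regional coverage, which is exactly the scale at which the manual “pick the three cheapest rate cards” approach starts losing millions.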

Curve Fitting Trend Analysis

Dozens (and dozens) of “AI” models have been developed over the past few years to provide you with “predictive” forecasts, insights, and analytics, but guess what? Not a SINGLE model has outdone classical curve-fitting trend analysis — and NOT a single model ever will. (This is because all these fancy-schmancy black box solutions do is attempt to identify the record/transaction “fingerprint” that contains the most relevant data and then attempt to identify the “curve” or “line” to fit it to, all at once, which means the upper bound is a classical model that uses the right data and fits to the right curve from the beginning, without wasting an entire power plant’s worth of energy running data centers as the algorithm repeatedly guesses random fingerprints and models until one seems to work well.)

And the reality is that these standard techniques (which have been refined since the 60s and 70s), which now run blindingly fast on large data sets thanks to today’s computing, can achieve 95% to 98% accuracy in some domains, with no misfires. A 95% accurate forecast on inventory, sales, etc. is pretty damn good and minimizes the buffer stock (and lead time) you need. Detailed, fine-tuned correlation analysis can accurately predict the impact of sales and industry events. And so on.
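
As a trivial, hypothetical illustration of the classical approach (Python/NumPy, invented demand history), fitting a simple curve to the data and extrapolating is fast, transparent, and doesn’t need a data centre:

```python
import numpy as np

# Hypothetical monthly demand history (units) -- invented numbers
demand = np.array([980, 1010, 1045, 1070, 1102, 1138, 1160, 1195, 1228, 1251, 1290, 1318])
months = np.arange(len(demand))

# Classical curve fitting: pick a simple model (here a linear trend) and fit it by least squares
coeffs = np.polyfit(months, demand, deg=1)
trend = np.poly1d(coeffs)

# Forecast the next quarter and measure the in-sample fit
future = np.arange(len(demand), len(demand) + 3)
forecast = trend(future)
mape = np.mean(np.abs((demand - trend(months)) / demand)) * 100
print(f"next 3 months: {forecast.round(0)}, in-sample MAPE: {mape:.1f}%")
```

Seasonality, saturation, and step changes call for other classical curves (exponential, logistic, seasonal decomposition), but the workflow (pick the right data, fit the right curve, validate) stays the same.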

Going one step further, there exists a host of clustering techniques that can identify emergent trends in outlier behaviour as well as pockets of customers or demand. And so on. But chances are you aren’t using any of these techniques.
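
For instance, here’s a minimal, hypothetical sketch (Python/scikit-learn, invented customer features) of using density-based clustering to find pockets of similar customers and flag outliers:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Hypothetical customer features: [annual volume, average order size] -- invented numbers
X = np.array([[1200, 40], [1150, 38], [1300, 45], [300, 12], [280, 10],
              [320, 14], [5000, 400], [80, 90]])

# DBSCAN finds dense pockets of similar customers and labels sparse points as outliers (-1)
labels = DBSCAN(eps=0.8, min_samples=2).fit_predict(StandardScaler().fit_transform(X))
for features, label in zip(X, labels):
    tag = "outlier" if label == -1 else f"cluster {label}"
    print(features, tag)
```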

So given that most of you haven’t adopted any of this technology that has proven to be reliable, effective, and extremely valuable, why on earth would you want to adopt an unproven technology that hallucinates daily, might tell off your sensitive employees with hate speech, and could even leak your data? It makes ZERO sense!

While we admit that someday semi-private LLMs will be an appropriate solution for certain areas of your business where large amounts of textual analysis are required on a regular basis, even these are still iffy today and can’t always be trusted. And the doctor doesn’t care how slick that chatbot is, because if you have to spend days learning how to expertly craft a prompt just to get a single result, you might as well just learn to code and use a classic open source Neural Net library — you’ll get better, more reliable results faster.

Keep an eye on the tech if you like, but nothing stops you from using the tech that works. Let your peers be the test pilots. You really don’t want to be in the cockpit when it crashes.

* And if you don’t understand why a deep understanding of university-level mathematics, preferably at the graduate level, is important, then you shouldn’t be touching the turkey who touches the Gen-AI solution with a 10-foot pole!

The Power of Optimization-Backed Sourcing is in the Right Sourcing Mix Across Scales of Size and Service

the doctor has been pushing optimization-backed sourcing since Sourcing Innovation started in 2006. There are a number of reasons for this:

  • there is only one other technology that has repeatedly demonstrated savings of 10% or more
  • it’s the only technology that can accurately model total cost of ownership with complex cost discounts and structures
  • it’s the only technology that can minimize costs while adhering to carbon, risk, or other requirements
  • it’s one of only two technologies that can analyze cost / risk, cost / carbon, or other cost / x tradeoffs accurately

However, the real power of optimization-backed sourcing is how it can not only give you the right product mix, but the right mix across scales. This is especially apparent when running sourcing events for national or international distribution or utilization. Without optimization, most companies can only deal with suppliers who can handle international distribution or utilization. This generally rules out regional suppliers and always rules out local suppliers, some of whom might be the best suppliers of goods or services to the region or locality. While one may be tempted to think local suppliers are irrelevant because they will struggle to deliver the economy of scale of a regional supplier and will definitely never reach the economy of scale of a national (or international) supplier, unit cost is just one component of the total lifecycle cost of a product or service. There’s transportation cost, tariffs, taxes, intermediate storage, and final storage (of which more will be required since you will need to make larger orders to account for longer distribution timelines), among other costs. So, in some instances, local and regional will be the overall lowest cost, and keeping them out of the mix increases costs (and sometimes increases carbon and risk as well).
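
A toy, hypothetical comparison (all figures invented) makes the point: once transportation, tariffs, and storage are added, the supplier with the higher unit price can still win on total landed cost.

```python
# Hypothetical per-unit cost build-up for the same item from three supplier types (all figures invented)
suppliers = {
    #                unit    freight  tariff  storage (bigger buffers for longer lead times)
    "international": {"unit": 9.40, "freight": 1.10, "tariff": 0.85, "storage": 0.65},
    "regional":      {"unit": 9.95, "freight": 0.40, "tariff": 0.00, "storage": 0.25},
    "local":         {"unit": 10.60, "freight": 0.15, "tariff": 0.00, "storage": 0.10},
}

for name, costs in suppliers.items():
    landed = sum(costs.values())
    print(f"{name:13s} landed cost per unit: {landed:.2f}")
# With these numbers the "regional" supplier wins on landed cost despite a higher unit price.
```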

When it comes to services, the right multi-level mix can lead to savings of 30% or more in an initial event. the doctor has seen this many times over his career (consulting for many of the strategic sourcing decision optimization startups) because, while the big international players can get competitive on hourly rates where they have a lot of resources with a given skill set, there are all-in costs to consider, which include travel to the client site and local accommodations. The thing with national and international services providers is that they tend to cluster all of their resources with a certain skill set in a handful of major locations. So their core IT resources (developers, architects, DBAs, etc.) will be in San Francisco and New York, their core Management consultants will be in Chicago and Atlanta, their core Finance Pros in Salt Lake City and Denver, etc. So if you need IT in Jefferson City, Missouri, Management in Winner, South Dakota, or accounting in Des Moines, Iowa, you’re flying someone in, putting them up at the highest-star hotel available, and possibly doubling the cost compared to a standard day rate.

However, simple product mix and services scenarios are not the only scenarios optimization-backed sourcing can handle. As per this article over on IndianRetailer.com, retailers need to back away from global sourcing and embrace regional (and even local) strategies for cost management, supply stability, and resilience. They are only going to be able to figure that out with optimization that can help them identify the right mix to balance cost and supply assurance, and when you need to do that across hundreds, if not thousands, of products, you can’t do that with an RFX solution and Microsoft Excel.

Furthermore, when you need to minimize costs where a price is fixed, like the price of oil or airline fuel, you need to optimize every related decision: where to refuel, which service providers to contract with, how to transport it, etc. When it can cost up to $40,000 to fuel a 737 for a single flight (when prices are high), and you operate almost 7,000 flights per day with planes ranging from a Gulfstream that costs about $10,000 to refuel to a Boeing 747 that, in hard times, can cost almost $400,000 to refuel, you can be spending $60 Million a day on fuel as your fleet burns 10 Million gallons. Storing those 10 Million gallons, transporting those 10 Million gallons, and using that fuel to fuel 7,000 planes takes a lot of manpower and equipment, all of which has an associated cost: hundreds of thousands of dollars per day (on the low end), and tens of millions per year. Shaving off just 3% would save over a million dollars easily. (Maybe two million.) However, the complexity of this logistics and distribution model is beyond what any sourcing professional can handle with traditional tools, but easy for an optimization-backed platform that can model an entire flight schedule, all of the local costs for storage and fueling, all of the distribution costs from the fuel depots, and so on. (This is something that Coupa is currently supporting with its CSO solution, which has saved at least one airline millions of dollars. Reach out to Ian Milligan for more information if this intrigues you, or to learn how this model could be generalized to support global fleet management operations of any kind.)
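
Back-of-the-envelope arithmetic on the associated handling costs alone (using the rough, illustrative figures above) shows why even a small percentage matters:

```python
# Rough, illustrative figures only, taken from the discussion above
associated_cost_per_day = 100_000                          # low-end daily fuel-handling/logistics cost
annual_associated_cost = associated_cost_per_day * 365     # roughly 36.5M per year
savings_at_3_percent = annual_associated_cost * 0.03       # roughly 1.1M per year
print(f"annual associated cost: ~${annual_associated_cost/1e6:.1f}M, "
      f"3% saving: ~${savings_at_3_percent/1e6:.2f}M")
```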

In other words, Optimization-Backed Sourcing is going to become critical in your highly strategic / high spend categories as costs continue to rise, supply continues to be uncertain, carbon needs to be accounted for, and risks need to be managed.

COUPA: Centralized Optimization Underlies Procurement Adoption …

… or at least that’s what it SHOULD stand for. Why? Well, besides the fact that optimization is only one of two advanced sourcing & procurement technologies that have proven to deliver year-over-year cost avoidance (“savings”) of 10% or more (which becomes critical in an inflationary economy because, while there may be no outright savings, negating the need for a 10% increase still allows your organization to maintain costs and outperform your competitors), it’s the only technology that can meet today’s sourcing needs!

COVID finally proved what the doctor and a select few other leading analysts and visionaries have been telling you for over a decade — that your supply chain was overextended and fraught with unnecessary risk and cost (and carbon), and that you needed to start near-sourcing/home-sourcing as soon as possible in order to mitigate risk. Plus, it’s also extremely difficult to comply with human rights acts (which mandate no forced or slave labour in the supply chain), such as the UK Modern Slavery Act, California Supply Chains Act, and the German Supply Chain Act if your supply chain is spread globally and has too many (unnecessary) tiers. (And, to top it off, now you have to track and manage your scope 1, 2, and 3 carbon in a supply chain you can barely manage.)

And, guess what, you can’t solve these problems just with:

  • supplier onboarding tools — you can’t just say “no China suppliers” when you’ve never used suppliers outside of China, when the suppliers you have vetted can’t be counted on to deliver 100% of the inventory you need, or when they are all clustered in the same province/state in one country
  • third party risk management — you can’t just eliminate any supplier with a risk score above a threshold, because sometimes that will eliminate all, or all but one, of your suppliers
  • third party carbon calculators — because they are usually based on third party carbon emission data provided by research institutions that simply produce averages for a region / category of products (and might overestimate or underestimate the carbon produced by your supply base)
  • or even all three … because you will have to migrate out of China slowly, accept some risk, and work on reducing carbon over time

You can only solve these problems if you can balance all forms of risk vs. cost vs. carbon, and there’s only one tool that can do this: Strategic Sourcing Decision Optimization (SSDO). When it comes to SSDO, Coupa has the most powerful platform. Built on TESS 6 — Trade Extensions Strategic Sourcing, which Coupa acquired in 2017 — the Coupa Sourcing Optimization (CSO) platform is one of the few platforms in the world that can do this. Plus, it can be pre-configured out-of-the-box for your sourcing professionals with all of the required capabilities and data already integrated*, and it may be alone from that perspective (as the other leading optimization solutions are integrated with either smaller platforms or platforms with fewer partners). (*The purchase of additional services from Coupa or Partners may be required.)

So why is it one of the few platforms that can do this? We’ll get to that, but first we have to cover what the platform does, and more specifically, what’s new since our last major coverage in 2016 on SI (and in 2018 and 2019 on Spend Matters, where the doctor was part of the entire SM Analyst team that created the 3-part in-depth Coupa review, but, as previously noted, the site migration dropped co-authors for many articles).

As per previous articles over the past fifteen years, you already know that:

So now all we have to focus on are the recent improvements around:

  • “smart events” that can be templated and cross-linked from integrated, context-aware help guides
  • “Plain English” constraint creation (that allows average buyers & executives to create advanced scenarios)
  • fact-sheet auto-generation from spreadsheets, API calls, and other third-party data sources;
    including data identification, formula derivation and auto-validation pre-import
  • bid insights
  • risk-aware functionality

“Smart Events”

Optimization events can be created from event templates that can themselves be created from completed events. A template can be populated with as little, or as much, as the user wants … all the way from simply defining an RFX survey, a fact sheet, and a baseline scenario to a complete copy of the event with “last bid” pricing and definitions of every single scenario created by the buyer. Templates can be edited at any time and can define the baseline pricing (a specific baseline price, the last price paid by Procurement, the last price in a pre-defined fact sheet that can sit above the event, and so on), as well as the supplier list (a fixed list, all qualified suppliers that supply a product, all qualified suppliers in an area, or no suppliers, with the user pulling from recommendations). In addition to predefining a suite of scenarios that can be run once all the data is populated, the buyer can also define a suite of default reports to be run, and even emailed out, upon scenario completion. This is in addition to workflow automation that can step the buyer through the RFX and auto-respond to suppliers when responses are incomplete or unacceptable, when uploaded spreadsheets or documents have compromised (hacked/cracked) security, and so on. The Coupa philosophy is that optimization-backed events should be as easy as any other event in the system, and the system can be configured so they literally are.

Also, as indicated above, the help guides are smart. When you select a help article on how to do something, it takes you to the right place on the right screen while keeping you in the event. Some products have help guides that are pretty dumb and just take you to the main screen, not to the right field on the right sub-screen, if they even link into the product at all!

“Plain English” Constraint Creation

Even though the vast majority of constraints, mathematically, fall into four primary categories — capacity, allocation, risk mitigation, and qualitative — that isn’t obvious to the average buyer without an optimization, analytical, or mathematical background. So Coupa has spent a lot of time working with buyers, asking them what they want, listening to their answers and the terminology they use, and has created over 100 “plain English” constraint templates that break down into 10 primary categories (allocation, costs, discount, incumbent, numeric limitations, post-processing, redefinition, reject, scenario reference, and collection sheets), as well as a subset of the most commonly used constraints gathered into a “common constraints” collection. For example, the allocation category allows for definition “by selection sheet”, “volume”, “alternative cost”, “bid priority”, “fixed divisions”, “favoured/penalized bids”, “incumbent allocations maintained”, etc. Then, when a buyer selects a constraint type, such as “divide allocations”, they will be asked to define the method (%, fixed amount), what to divide by (supplier, group, geography), and any other conditions (e.g. low-risk suppliers only, if dividing by geography). The definition forms are also smart and respond appropriately to each sequential choice.
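
Under the hood, each template ultimately resolves to ordinary constraints on the award. As a purely hypothetical illustration (not Coupa’s internal representation), a “divide allocations by geography, low-risk suppliers only” choice might expand into something like:

```python
# Hypothetical sketch of how a "plain English" template choice could expand into solver-ready
# constraints (illustrative data structures only; not Coupa's internal representation)

template_choice = {
    "type": "divide_allocations",
    "method": "%",                       # percentage split rather than a fixed amount
    "divide_by": "geography",
    "condition": "low_risk_suppliers_only",
    "split": {"NA": 50, "EU": 30, "APAC": 20},
}

def expand(choice, total_volume):
    """Turn the buyer's form choices into a list of formal allocation constraints."""
    constraints = []
    for group, pct in choice["split"].items():
        constraints.append({
            "scope": {"geography": group, "filter": choice["condition"]},
            "expression": "sum(allocation[s] for s in scope)",
            "operator": "==",
            "rhs": total_volume * pct / 100,
        })
    return constraints

for c in expand(template_choice, total_volume=1_000_000):
    print(c["scope"], c["operator"], c["rhs"])
```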

Fantastic Fact Sheets

Fact Sheets can be auto-generated from uploaded spreadsheets: the platform will automatically detect the data elements (columns), their types (text, math, fixed response set, calculation), their mappings to internal system / RFX elements, and the records — as well as detect when rows / values are invalid and allow the user to decide what to do when invalid rows / values are found. If a mapping is not high certainty, the fact-sheet processor will flag it for the user to define manually, and the user can, of course, override all of the default mappings — and even choose to load only part of the data. These spreadsheets can live in an event, or live above the event and be used by multiple events (so that company-defined currency conversions, freight quotes for the month, standard warehouse costs, etc. can be used across events).
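
A rough, hypothetical sketch of the kind of detection involved (Python/pandas; a real implementation is far more sophisticated and learns from user corrections over time):

```python
import pandas as pd

# Hypothetical uploaded fact sheet (invented data)
sheet = pd.DataFrame({
    "Supplier Name": ["Acme", "Bolt Co"],
    "Unit Price":    ["12.50", "11.90"],
    "Currency":      ["USD", "USD"],
    "Lead Time":     ["14", "bad value"],     # one suspect value to flag
})

# Hypothetical internal RFX elements to map onto
KNOWN_FIELDS = {"supplier name": "supplier", "unit price": "price", "currency": "currency"}

for col in sheet.columns:
    target = KNOWN_FIELDS.get(col.strip().lower())            # try to map to an internal element
    numeric = pd.to_numeric(sheet[col], errors="coerce")       # detect numeric vs. text columns
    col_type = "numeric" if numeric.notna().all() else "text"
    suspect = int(numeric.isna().sum()) if col_type == "text" and numeric.notna().any() else 0
    print(f"{col!r}: mapped to {target or 'UNMAPPED (ask the user)'}, "
          f"type={col_type}, suspect values={suspect}")
```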

But even better, Fact Sheets can be configured to automatically pull data in from other modules in the Coupa suite, and from APIs the customer has access to, so they pull in up-to-date information every time they are instantiated.

Bid Insights

Coupa is a big company with a lot of customers and a lot of data. A LOT of data! Not only in terms of the prices its customers are paying in their procurement of products and services, but in terms of what suppliers are bidding. This provides huge insight into current market pricing in commonly sourced categories, including, and especially, freight! Starting with freight, Coupa is rolling out new bid pricing insights where a user can select the source, the destination, the type (frozen/wet/dry/etc.), and the size (e.g. for ocean freight, the source and destination country and the container size/type combination, with the unit defaulting to a container), and get the quote range over the past month/quarter/year.
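
Conceptually, the insight is an aggregation over the pooled bid history. A minimal, hypothetical sketch (Python/pandas, invented bids):

```python
import pandas as pd

# Hypothetical pooled ocean-freight bid history (invented numbers)
bids = pd.DataFrame({
    "origin":    ["CN", "CN", "CN", "CN", "VN"],
    "dest":      ["US", "US", "US", "US", "US"],
    "container": ["40HC", "40HC", "40HC", "40HC", "40HC"],
    "quarter":   ["2024Q1", "2024Q1", "2024Q1", "2024Q1", "2024Q1"],
    "rate_usd":  [2100, 2350, 1980, 2600, 2450],
})

# Quote range for a selected lane / container type / period
lane = bids.query("origin == 'CN' and dest == 'US' and container == '40HC' and quarter == '2024Q1'")
print(lane["rate_usd"].describe()[["min", "25%", "50%", "75%", "max"]])
```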

Risk Aware Functionality

The Coupa approach to risk is that you should be risk-aware (to the extent the platform can make you risk aware) with every step you take, so risk data is available across the platform — and all of that risk data can be integrated into an optimization project and scenarios to reject, limit, or balance any risk of interest in the award recommendations.

And when you combine the new capabilities for

  • “smart” events
  • API-enabled fact sheets
  • risk-aware functionality

that’s how Coupa is the first platform that literally can, with some configuration and API integration, allow you to balance third party risk, carbon, and cost simultaneously in your sourcing events — which is where you HAVE to manage risk, carbon, and cost if you want to have any impact at all on your indirect risk, carbon, and cost.

It’s not just 80% of cost that is locked in during design, it’s 80% of risk and carbon as well! And in indirect, you can’t do much about that. You can only do something about the next 20% of cost, risk and carbon that is locked in when you cut the contract. (And then, if you’re sourcing direct, before you finalize a design, you can run some optimization scenarios across design alternatives to gauge relative cost, carbon, and risk, and then select the best design for future sourcing.) So by allowing you to bring in all of the relevant data, you can finally get a grip on the risk and carbon associated with a potential award and balance appropriately.

In other words, this is the year for optimization to take center stage in Coupa and power the entire Source-to-Contract process. No other solution can balance these competing objectives. Thus, after 25 years, the time for sourcing optimization, which is still the best-kept secret (and most powerful technology) in S2P, has finally come! (And it just might be the reason that more users in an organization adopt Coupa.)

Keelvar: Not satisfied with the hill, it’s trying to climb the mountain!

The last time we covered Keelvar on Sourcing Innovation was back in 2016, when we re-introduced Keelvar: An Optimization-Backed Sourcing Platform because it was The Little Engine that Could. (Its last deep dive on Spend Matters was also in 2016, in Jason Busch’s 3-part Vendor Analysis that the doctor consulted on, which can be found linked here in Part 1, Part 2, and Part 3: subscription required. With a subscription, you can also check out the What Makes It Great Solution Map Analysis.)

Since our last update, Keelvar has made considerable progress in a number of areas, but of particular relevance are:

  1. total cost modelling
  2. constraint definition for its optimization
  3. workflow-based event automation
  4. usability

After a basic overview of the software, the above four improvements are what we are going to focus on in this article, as they are the most relevant to sourcing-based cost savings identification.

Keelvar is an optimization-backed sourcing platform (for RFQs and Auctions) that can also support extensive sourcing automation, especially once a full-fledged sourcing event has been run and a template already exists (and approved suppliers have already been defined). We will start with a review of the sourcing platform.

The sourcing platform is designed to walk a user through a sourcing event step-by-step. Keelvar uses a 7-stage sourcing workflow that they break down as follows:

  1. Design: This is where the event is defined. In this stage you define the meta information (id, name, description, contacts, etc.), the schedule, the RFI, the bid sheet (as the application supports export to/import from Excel for Suppliers who can’t figure out how to use anything except Excel), the cost calculation per unit (for analysis, optimization, and reporting), and basic event settings, especially if using an auction.
  2. Invite: This is where you select suppliers for invitation.
  3. Publish: This is where you review the design and invite list and launch it.
  4. Bid: This is the bidding phase where suppliers place bids. The buyer can see bids as they come in, get reports on activity, and manage the event as needed (extend the deadline, answer questions, and distribute the responses to all suppliers).
  5. Evaluate: This is where the mathematical magic happens. In this step you define item/lot groups, bidder groups, and scenarios. (You need to define groups for risk mitigation and quality constraints, which are impossible to define in the platform otherwise.) Scenarios allow you to find the lowest cost options under different business rules, constraints, and goals.
  6. Analyze: This is where the user can apply detailed analytics across bids and scenarios to see the differences, gaps, supplier ranks, etc. in tabular or visual formats; do detailed analysis on the individual scenarios to understand what is driving the cost or the award; and even analyze the potential awards against RFI criteria submitted by the suppliers.
  7. Award: After doing the analysis and making their decision, this is where the buyer makes their award from either a solved scenario or a manual allocation.

So now that the basics are out of the way, let’s talk about total cost modelling. As per our summary above, that starts with the bid sheet. Either in the platform or, if you prefer, in Excel, you can define all of the cost components of interest (and even upload starting bid values from the current I2P/AP system and/or previous bid sheets). If you have an Excel sheet that breaks down the bid elements you want to collect, and the totals you want, in columnar format, with enough sample rows, you can just upload it, and the platform will not only differentiate the raw data columns from the bidder columns and map your column names to internal, mandatory, defined columns (for items, lanes, etc.), but also differentiate purchaser input columns (such as destination city, country, service/product, etc.) from bidder columns (origin city, country, lane cost, unit cost, tariffs/taxes, etc.), differentiate raw columns from formulas, extract the formulas, and even determine the default visibility to the bidder (who won’t see the formulas, especially if hidden offsets or weightings are used). The user can, of course, correct and override anything if needed, but for each sheet processed, the application learns the mappings (based on user overrides and corrections) and over time has a high success rate on import. Once the columns are defined, editing the column roles (purchaser vs. bidder, visibility, mandatory vs. optional, etc.) is very easy — you can simply toggle them.

In addition, and this is a major improvement over the early days (when there was no quality control on the coal being used to power that little engine), all of the inputs can be associated with one or more validation rules that can require an input to be completed, to come from a valid set, to be the same as related bid values, and so on. Out-of-the-box rules exist for easily defining uniform values across a column for a lot (if all items must come from, or go to, the same [intermediate] location, for example) and for requiring complete coverage on a group of lots (critical if a supplier must bid all or nothing on an item, a set of related items, a sub-assembly of a BoM, etc.). If those don’t work, you can use advanced conditional logic on any (set of) column(s) to ensure specific conditional rules are met, especially if a value or answer is dependent on another column or value. The conditional rule generator uses the formula builder, which supports all standard numeric operators and numeric columns, as well as string-based matching and type/value-based operators, for ensuring entries come from an appropriate set of values, possibly dependent on the non-numeric value defined in another column.
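
A simplified, hypothetical sketch of the kind of conditional validation rule described here (plain Python; not Keelvar’s actual rule engine):

```python
# Hypothetical bid rows and a conditional validation rule (illustrative only)
bids = [
    {"supplier": "Acme",    "incoterm": "DDP", "tariff_pct": 0.0,  "unit_price": 12.5},
    {"supplier": "Bolt Co", "incoterm": "FOB", "tariff_pct": None, "unit_price": 11.9},  # should fail
]

def validate(row):
    errors = []
    if row["unit_price"] is None or row["unit_price"] <= 0:
        errors.append("unit_price must be a positive number")
    # Conditional rule: if the supplier does NOT deliver duty paid, a tariff estimate is required
    if row["incoterm"] != "DDP" and row["tariff_pct"] is None:
        errors.append("tariff_pct is required when incoterm is not DDP")
    return errors

for row in bids:
    print(row["supplier"], validate(row) or "OK")
```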

In other words, because all cost elements can be defined, because arbitrary formulas can be used to define costs, and because rules can be created to ensure all cost elements are valid, the platform truly supports total cost modelling (which is one of the four pillars of Strategic Sourcing Decision Optimization [SSDO]).

For easy reference, the other three pillars are:

  • solid mathematical foundations, which we know Keelvar has from previous coverage;
  • what-if capability, which has been there since the beginning as Keelvar has always supported multiple scenarios;
  • sophisticated constraint definition and analysis — which was lacking in the past and which we will cover next.

Moving on to constraint definition, Keelvar has made considerable improvements both in the definition of bidder and lot groups and in the ability to define arbitrary limit constraints on arbitrary collections of bidders and lots/items. This allows it to address the four categories required for SSDO:

  • allocation: to define minimum, fixed, or maximum allocations for a supplier
  • capacity: to take into account supplier, lane, warehouse, or other capacity limits
  • risk mitigation/group-wise allocation: ensuring that the award is split across a group of suppliers to mitigate risk, that a supplier receives a minimum amount of a group of items to satisfy an existing contract, etc.
  • qualitative: to make sure a minimum, average, quality level, diversity goal (volume-wise) or other non-cost constraint is adhered to

Keelvar has always been great at capacity and allocation but, in the past, its ability to define risk mitigation/group-wise allocation was limited and its qualitative support was almost non-existent. But with proper definition of bidder and (item) lot groups, and the ability to define constraints on any numeric dimension (not just cost), one can now define the majority of foreseeable instances of both of these constraint types. You can create bidder groups by geography, and ensure each geography gets a minimum or maximum allocation. (And even though you couldn’t define a 20/30/50 split directly, if you bound each group between 20% and 50%, you know by basic logic that the cheapest will get 50%, the most expensive 20%, and the one in the middle 30%. If you wanted a 10/25/35/40 split, that would be a bit more difficult. But if you insist each group gets between 10% and 40%, logic dictates the two cheapest groups will get 40% each, leaving the two most expensive with 10% each, and a simple total-cost analysis then tells you which group should get 40%, which 35%, which 25%, and which 10%. Almost every other group-based allocation you would reasonably want to define would be straightforward, or close to it, with post-scenario analysis.)

Qualitative constraints such as diversity (by volume), quality (by unit), or sustainability (by unit) are also very straightforward to define. For diversity, simply group all the diverse suppliers and ensure they get a minimum percentage of the volume (or of the spend, if that’s your metric) to meet your goals. For quality, if every supplier has an internal quality rating, then for each quality level you can define a maximum allocation for that group to ensure a minimum overall quality level. (And if there were hard quality data by unit by supplier, you’d just define a hidden column in the bid sheet and place a limit constraint on quality instead of cost.) For sustainability (by unit), you’d simply group all the sustainably approved suppliers (instead of the diverse ones) and ensure they receive a minimum percentage.
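
To make the group-based allocation and diversity examples concrete, here’s a tiny, hypothetical sketch (Python with the open-source PuLP library; invented bids and groups) that bounds each geography’s share at 20% to 50% and guarantees a diverse supplier group at least 10% of the volume:

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, PULP_CBC_CMD

# Hypothetical unit bids, grouped by geography; Local2 is also a diverse supplier (invented data)
bids = {"Global1": 9.8, "Global2": 10.1, "Regional1": 10.4, "Local1": 10.9, "Local2": 11.2}
groups = {"global": ["Global1", "Global2"], "regional": ["Regional1"], "local": ["Local1", "Local2"]}
diverse = ["Local2"]
volume = 100_000

m = LpProblem("group_allocation", LpMinimize)
x = {s: LpVariable(f"x_{s}", lowBound=0) for s in bids}       # units awarded per supplier

m += lpSum(bids[s] * x[s] for s in bids)                      # minimize total cost
m += lpSum(x.values()) == volume                              # award the full volume

for members in groups.values():                               # every geography gets 20% to 50%
    m += lpSum(x[s] for s in members) >= 0.20 * volume
    m += lpSum(x[s] for s in members) <= 0.50 * volume

m += lpSum(x[s] for s in diverse) >= 0.10 * volume            # diversity goal: at least 10% of volume

m.solve(PULP_CBC_CMD(msg=False))
for s in bids:
    print(s, x[s].value())
```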

In addition, since we last covered Keelvar, they have incorporated soft-constraint support and made the definition thereof super easy. In the application, you can define a constraint as available to be relaxed if the total cost savings exceeds a certain value. That’s as easy (peasy) as it gets.

This takes us to workflow-based event automation. In the updated Keelvar platform, you can define a complete event workflow, and the platform will automate almost the entire event for you, handling everything until it’s time to allocate the award. Once you create an instance (which is as easy as selecting an event template to activate and defining just a few pieces of meta-data), it will auto-fill / update all of the remaining meta-data (since the last run, if it was previously run), extract the current, approved supplier list, automatically request approval from the category owner, publish the RFP (or launch the auction) on the predefined date, automatically send the invites out, collect (and validate) the bids (using the predefined validation rules), run the predefined scenarios when the bidding closes, kick off the predefined analyses and reports on those scenarios and package them up for the event owner (which can include exports), and take the buyer right to the award screen for scenario and/or manual allocation, where the user can make the award if ready, review an analysis, or jump back to a scenario, alter it slightly, re-run it, and then use that modified scenario for the award definition.

In terms of process definition, Keelvar has an integrated visual workflow editor where the user can compose the mandatory steps, conditional steps, and necessary approvals at each step (which could be the category owner, a manager if the estimated event value exceeds a threshold, etc.). Each step can link to an appropriate element which can be completely customized as needed.

However, the easiest way to define an event template, and the most effective way, is to instantiate one off of a completed RFP. The built in logic and machine learning can automatically generate a complete workflow-driven template off an RFP. It can define rules for filling in all definition fields off of a few key pieces of meta-data, define rules for identifying the (recommended) suppliers for future events (for one-click approval by the category owner), suggest publication dates and bidding timeframes, define all of the bid validation rules based on the bid-sheets and defined rules, create default scenario definitions, (re)create default bid/scenario analysis and visualization reports as well as rules to auto-package and distribute exports to the event owner, and even identify the recommended scenario for award allocation.

Once the event template is automatically extracted from the completed event, a user can review it in its entirety and edit whatever they want. And then they know when they next instantiate it, it will run flawlessly. (It’s automation. Not automated. And that’s the way it should be.)

Finally, when it comes to usability, if it’s not immediately obvious, usability has been enhanced throughout the platform. But it’s easier to see it than describe it. So if you want a modern optimization-backed sourcing platform, just get a demo and see it for yourself.

In closing, Keelvar is not just the last specialist optimization provider standing; they’re now one of the best. Let’s hope the next major enhancement tackles true Multi-Objective Strategic Sourcing Decision Optimization On Procurement Trends. (MOSS DO OPT!)

Logility “Starboard”: The Real-Time What-If Supply Chain Network Modeller that Every Sourcing Professional Should Have

Now, it’s true that this blog is focussed on Source-to-Pay, and it’s true that, as a result, we usually focus on Strategic Sourcing Decision Optimization and occasionally on logistics-focussed models and optimization solutions, as that’s what’s typically needed for a Sourcing Professional to make the optimum buy, but this time we’re going to make an exception.

Why is network modelling an exception (besides the fact that, as we told you yesterday when we said Don’t Overlook the Network, it has the absolute best return on investment across all supply chain applications)? Well, if you think about classic network modelling, it’s not something a sourcing professional would do, because it’s typically up to logistics and supply chain to maintain the network infrastructure that gets the product from the suppliers to the ports and warehouses, and then to the distribution centres, retail facilities, and (in drop-ship models) end consumers. It’s up to logistics to re-evaluate the supporting network infrastructure on a bi/tri-annual basis and determine if warehouses should be added, relocated, or dropped (on lease end); if ports should be changed (to reduce overall costs due to port fees, local carrier costs, or rail vs. truck options); if new carriers should be considered; and so on.

The reason this is typically only done on a bi/tri-annual basis is that it has traditionally been an arduous endeavour where you have to:

  • build a very detailed model of all
    • the supplier production facilities, ports, warehouses, distribution centers, manufacturing/assembly centers, and retail facilities
    • the lanes used
    • the modes used for each lane
    • the carriers used for each mode / lane combination
    • the LTL and FTL rates for each carrier
    • the drop-ship rates for direct-to-consumer
  • identify all of the products being purchased and
    • associate them with the appropriate suppliers
    • associate them with the appropriate lanes, modes, and carriers
    • associate them with the appropriate warehouses
    • associate them with the appropriate retail locations or drop-ship locations
  • collect all of the current rates, for every supplier-carrier-lane-mode option in use
  • then solve a current-state optimization problem to determine baseline costs, times to serve, carbon emissions, etc.
  • identify all of the potential port and warehouse locations you could (also) use
  • identify all of the new lanes that would create
  • identify all of the additional carriers that could be used
  • collect quotes for every lane-carrier-mode combination from the potential new options that might actually be used
  • then build an extended model that includes all options and feed in all of the data
  • then solve a full-state model to determine baseline costs, times to serve, carbon emissions, etc.
  • then determine the ranges for the number of ports, warehouses, distribution centers, carriers, time to serve, carbon emissions, etc. that are acceptable
  • solve a copy of the restricted full-state model to determine a new baseline cost
  • then make copies of the model and run analyses against different objectives until the model is acceptable and the costs (and time to serve, emissions, etc.) are reduced significantly enough to justify a network transformation exercise

and this endeavour would typically take three to six months, because it would take weeks to build the baseline models, months to collect the data, and weeks to build, solve, and analyze the models and come up with a new state that improved all the measures of interest, as well as the implementation plan to make it happen.
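
At its core, the baseline solve in the list above is a (very large) minimum-cost flow and facility model. A toy, hypothetical sketch of the idea (Python/PuLP; invented lanes, costs, and demand), routing product from suppliers through candidate warehouses to demand regions:

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

# Hypothetical network: suppliers -> candidate warehouses -> retail regions (all data invented)
supply = {"SupplierA": 600, "SupplierB": 500}
demand = {"RegionX": 400, "RegionY": 300, "RegionZ": 250}
warehouses = {"WH1": 20_000, "WH2": 25_000}                  # fixed (e.g. lease) cost if used
inbound = {("SupplierA", "WH1"): 4, ("SupplierA", "WH2"): 6,
           ("SupplierB", "WH1"): 7, ("SupplierB", "WH2"): 3}  # cost per unit per inbound lane
outbound = {("WH1", "RegionX"): 2, ("WH1", "RegionY"): 5, ("WH1", "RegionZ"): 6,
            ("WH2", "RegionX"): 6, ("WH2", "RegionY"): 2, ("WH2", "RegionZ"): 3}

m = LpProblem("network_baseline", LpMinimize)
fin = {k: LpVariable(f"in_{k[0]}_{k[1]}", lowBound=0) for k in inbound}
fout = {k: LpVariable(f"out_{k[0]}_{k[1]}", lowBound=0) for k in outbound}
open_wh = {w: LpVariable(f"open_{w}", cat=LpBinary) for w in warehouses}

m += (lpSum(inbound[k] * fin[k] for k in inbound)             # minimize lane costs ...
      + lpSum(outbound[k] * fout[k] for k in outbound)
      + lpSum(warehouses[w] * open_wh[w] for w in warehouses))  # ... plus fixed warehouse costs

total_demand = sum(demand.values())
for s, cap in supply.items():                                 # respect supplier capacity
    m += lpSum(fin[k] for k in inbound if k[0] == s) <= cap
for w in warehouses:                                          # balance flow and tie it to usage
    m += lpSum(fin[k] for k in inbound if k[1] == w) == lpSum(fout[k] for k in outbound if k[0] == w)
    m += lpSum(fout[k] for k in outbound if k[0] == w) <= total_demand * open_wh[w]
for r, d in demand.items():                                   # meet demand in every region
    m += lpSum(fout[k] for k in outbound if k[1] == r) == d

m.solve(PULP_CBC_CMD(msg=False))
print("baseline network cost:", m.objective.value())
```

The real models have orders of magnitude more nodes, lanes, modes, and cost terms, which is why they traditionally took months; the point of the tool discussed below is that the model is maintained continuously and copies re-solve quickly.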

But the problem with doing this bi/tri-annually is that you never know the impact of adding a new supplier or, more importantly, replacing a supplier of a significant product line or category where that supplier is in a completely different location, and possibly one that the last network design never took into account. Plus, the removal of a big supplier might cause a certain node (warehouse, distribution center, etc.) to be significantly under-utilized, resulting in unexpected overspend in certain parts of the distribution network.

But this knowledge is critically important to know before making a major sourcing decision that might change the supply base for a highly utilized product line or category — because the costs of the award will not be the expected costs. They will not be the unit or expected transportation costs used in the analysis that the award decision is based on, but will instead be those costs plus the fixed and variable losses incurred from underutilizing a sub-set of the network and/or overutilizing another sub-set of the network.

While this has always been the case, the belief was that nothing could traditionally be done about it. But if there were a tool that could

  • actively maintain the current network model
  • allow for copies to be created on the fly
  • allow for those copies to be easily modified, including
    • the addition or deletion of nodes (suppliers, ports, warehouses, distribution centers, retail locations, etc.)
    • the definition of new lanes
    • the addition of carriers and/or carrier modes
    • updated costs for every lane
  • solve those copies quickly and accurately

then a sourcing professional could have deep insight into whether their cost models and assumptions are correct, and logistics could update the network model (or at least the future state, if leases/contracts need to expire and new ones need to be signed), along with the overall sourcing, logistics, and supply chain costs, upon every award.

And this is what you can do with Logility Network Optimization, formerly Starboard Solutions (acquired in 2022) and exactly why we are making an exception and covering them.

With the Logility Network Optimization Solution (which really should be called Logility Starboard, for reasons that will soon become clear), a Sourcing Professional can:

  • instantly see a graphical view of the current global network
  • bring up reports that summarize all of the key data
  • drill in to node / carrier / supplier / port / warehouse / distribution / product / combination costs
  • create a copy of the current network with all relevant data
  • and then create a what-if baseline scenario where they can
    • add whatever they wish (through simple pop-up interfaces they can add nodes and relationships),
    • remove whatever they wish (by simply clicking on a node or searching for the entity or relationship and deleting it), and, most importantly,
    • change whatever they want in the network design through a simple drag-and-drop mechanism
  • they can then specify any constraints and goals, run an optimization, and see the new costs, and, most importantly, extract lanes / costs / variables of interest to populate into the TCO (total cost of ownership) calculations in their sourcing events

Logility Network Optimization can do this because it integrates with third-party platforms and constantly extracts current market quotes and market rates for all major global lanes, and can, when you change a design, automatically bring in those market rates and costs as a baseline for any lane / (generic) carrier / mode / volume combination you don’t already have a quote for. This not only provides a baseline rate (which might get better with a volume promise, negotiation, or current quote), but a statistically accurate one (especially if you just go with a generic carrier rate). (And if there is no quote for a lane, the platform is smart enough to build one up from lane segments, or tear one down from existing quotes, using statistically significant costs per distance and statistically significant base rates for just securing the transportation mode.)

Furthermore, because it is a true multi-tenant cloud solution that uses a distributed “serverless” model that can decompose tasks into subtasks that run in a massively parallel manner, such as data fetching, sub-model building, and even model solving (as many optimization models can be solved by solving sub-models on convex subspaces of the high-dimensional solution space), it can do all of this fast. And it’s just as accurate as the traditional, prior-generation tools, at a speed that is breakneck in comparison, even when it uses statistically significant calculated data.

Moreover, it’s very easy to define multiple constraints and weighted objectives. You can guarantee maximum times to serve / times to deliver (subject to minimums that cannot be improved upon) while balancing overall cost and carbon footprint (through a weighted objective). (It’s quite easy to define objectives in the platform, which has built-in pop-ups to solve for different goals: service time, emissions limit, cost, and best X, where X is a single dimension or a derived dimension that weights two or more other dimensions.) Or you can guarantee maximum times to serve and a fixed / x% carbon reduction while minimizing overall cost. Or you can keep the ports you know are stable and the warehouses with contracts you can’t break while allowing the rest of the delivery network architecture to shift to minimize overall costs.
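
For instance, a hypothetical weighted objective with a hard service-time guarantee might look like this (illustrative only, not Logility’s implementation):

```python
# Hypothetical weighted objective with a hard service-time guarantee (illustrative only)
w_cost, w_carbon = 0.7, 0.3          # buyer-chosen weights on normalized cost and carbon

def objective(scenario):
    # norm_cost / norm_carbon assumed pre-normalized to a 0-1 scale so the weights are comparable
    return w_cost * scenario["norm_cost"] + w_carbon * scenario["norm_carbon"]

def feasible(scenario, max_days_to_serve=5):
    return scenario["max_service_days"] <= max_days_to_serve   # hard constraint, not weighted

candidates = [
    {"name": "A", "norm_cost": 0.62, "norm_carbon": 0.40, "max_service_days": 4},
    {"name": "B", "norm_cost": 0.55, "norm_carbon": 0.70, "max_service_days": 6},  # too slow
]
best = min((s for s in candidates if feasible(s)), key=objective)
print(best["name"], round(objective(best), 3))
```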

The browser-based interface to Logility’s platform offers a graphically represented virtual twin of an organization’s network with high-level summary data (products, facilities, lanes, suppliers, customers, activities, and costs), easy scenario selection, and easy definition and modification of scenarios. It’s very easy to dive into the definition screens and see the suppliers, facilities, lanes, etc.; see/edit all of the data for any individual supplier, facility, or lane; add a new instance or delete one; and see the associated costs, times, emissions, etc. and the underlying calculations associated with a node or relationship in the network graph (which is stored in a graph database that allows for massive scalability).

It’s also very easy to dynamically generate comparison reports between scenarios that compare (activity) costs (across cost types, such as leases, handling, transport costs, rail costs, ocean freight, tariffs, etc.), (average) service times (by supplier, product, lane, etc.), carbon (by carrier, lane, product, supplier, etc.), and other metrics of interest. Furthermore, when a user is happy with a scenario, they can one-click generate and output a complete comparison / summary report deck to PowerPoint for executive reporting (across as many scenarios as they like).

To enhance usability — which is quite obvious out-of-the-box to anyone who understands the basics of modelling, optimization, and decision analysis — Logility has an integrated quick-tour to get started, a full multi-media course on the platform and the modelling that can be done, playbooks for particular problems and challenges, and weekly office hours where users can ask Logility pros questions and get answers in real time. Logility Network Optimization was designed from the bottom up for usability and success.

Logility Network Optimization is the perfect complement to optimization-backed sourcing platforms with bill of materials support. Buyers can model the potential changes that would result from awarding to a new supplier, not awarding to an existing supplier, or changing carriers or lanes; get expected transportation and tariff costs; augment the supplier quotes with this updated data in real time through a Logility API feed (from an identified scenario); run an optimization scenario on the full set of bids augmented with accurate total cost of ownership data; make an award; push that award back to Logility Network Optimization, which will update the network model in real time; create a new what-if; and see if the network model should be altered when the new supplier is brought fully on board. For the first time, an organization could have closed-loop sourcing, logistics, and network optimization in real time — a reality that was once as far away as the stars themselves (and why the platform takes you Starboard). It’s a powerful concept, and worth branching out beyond traditional Source-to-Pay providers and Strategic Sourcing Decision Optimization to achieve.