Category Archives: Decision Optimization

Introducing LevaData. Possibly the first Cognitive Sourcing Solution for Direct Procurement.

Who is LevaData? LevaData is a new player in the optimization-backed direct material prescriptive analytics space and, to be honest, probably the only player in that space. While Jaggaer has ASO and Pool4Tool, and its direct material sourcing is optimization-backed, and while it has VMI, it does not have advanced prescriptive analytics for selecting the vendors who will ultimately manage that inventory.

LevaData was formed back in 2014 to close the gaps the founders saw in the sourcing and supply management platforms they had been a part of over the previous two decades. They saw the need for a platform that provided visibility, analytics, insight, direction, optimization, and assistance, and that is what they set out to build.

So what is the LevaData platform? It is a sourcing platform for direct materials that integrates RFX, analytics, optimization, (should-) cost modelling, and prescriptive advice into a cohesive whole that helps a buyer buy better, and which, to date, has reduced costs (considerably) for every single client.

For example, the first year realized savings for a 5B server and network company that deployed the LevaData platform was 24M; for a 2.4B consumer electronics company, it was 18M; and for a 0.6B network company, it was 8M. To date, they’ve delivered over 100M of savings across 50B of spend to their customer base, and they are just getting started. This is due to the combination of efficiency, responsiveness, and savings their platform generates. Specifically, about 60% of the value is direct material cost reduction and incremental savings, 30% is responsiveness and being able to take advantage of market conditions in real time, and 10% is improved operational efficiency.

The platform was built by supply chain pros for supply chain buyers. It comes with a suite of analytics reports, but unlike the majority of analytics platforms, the reports are fine-tuned to bill of materials, component, and commodity intelligence. The reports can provide deep insight into not only costs by product, but costs by component and/or raw material, and can roll up and down bills of materials and raw materials to create insights that go beyond simple product or supplier reports. Moreover, on top of these reports, the platform can create cost forecasts and amortization schedules, track rebates owed, and calculate KPIs.
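To make the bill-of-materials roll-up concrete, here is a minimal sketch of how component costs aggregate up a BOM tree. The parts, quantities, and prices below are entirely invented for illustration; they are not LevaData data or code.

```python
# Illustrative roll-up of component costs through a made-up bill of
# materials: assemblies sum the rolled-up costs of their children,
# weighted by quantity, so cost insight exists at every level.
bom = {
    "server":      [("motherboard", 1), ("psu", 2)],
    "motherboard": [("cpu", 2), ("ram", 8)],
}
unit_costs = {"psu": 40.0, "cpu": 210.0, "ram": 25.0}

def rolled_cost(item):
    # Leaf components carry a raw unit cost; assemblies recurse.
    if item in unit_costs:
        return unit_costs[item]
    return sum(qty * rolled_cost(child) for child, qty in bom[item])

print(rolled_cost("motherboard"))  # component-level insight
print(rolled_cost("server"))       # product-level insight
```

The same recursion, run in reverse (distributing a raw material price change downward), is what lets a report answer "which products does this commodity spike hit hardest?".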

In order to provide the buyer with market intelligence, the application imports data from multiple market feeds, creates benchmarks, compares those benchmarks to internal market data, automatically creates competitive reports, and calculates the foundation costs for should-cost models.

And it makes all the relevant data available within the RFX. When a user selects an RFX, it can identify suppliers, identify current market costs, use forecasts and anonymized community intelligence to calculate target costs, and then use optimization to determine what the award split would be, subject to business constraints, and identify the suppliers to negotiate with, the volumes to offer, and the target costs to strive for.
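The award-split step described above can be sketched in miniature. All numbers here are invented, and a real platform would solve a much richer mixed-integer model; with only a demand, capacities, and a maximum-share rule, a cheapest-first allocation happens to be optimal and keeps the sketch self-contained.

```python
# Toy award split: cover 10,000 units from three bidders, respecting
# per-supplier capacity and a business rule that no supplier may win
# more than 60% of the demand. All bids and capacities are invented.
DEMAND = 10000
MAX_SHARE = 0.6 * DEMAND
bids = {"A": (9.50, 6000), "B": (9.80, 5000), "C": (10.10, 8000)}

award, remaining = {}, DEMAND
# Cheapest-first is provably optimal for this simple constraint set.
for name, (price, cap) in sorted(bids.items(), key=lambda kv: kv[1][0]):
    qty = min(cap, MAX_SHARE, remaining)
    award[name] = int(qty)
    remaining -= qty

total_cost = sum(award[n] * bids[n][0] for n in award)
print(award, total_cost)
```

The output of a real optimization run looks just like this: a volume per supplier plus the implied target cost to negotiate toward.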

It’s a first-of-its-kind application, and while some components are still basic (there is no lane or logistics support in the optimization model), missing (there is no ad-hoc report builder), or incomplete (such as collaboration support between stakeholders or a strong supplier portal for collaboration), it appears to meet the minimal requirements we laid out yesterday and could just be the first real cognitive sourcing application on the market in the direct material space.

There are No Economies of Scale … Just Economic Production Quantities

As the public defender likes to point out on a regular basis over on Spend Matters UK / Europe, economies of scale is a procurement myth. The idea that the more you buy, the bigger the discount you can get because the cost diminishes is a myth because, if it were true, then buying a large enough quantity would eventually drive the cost per unit close to zero.

But the reality is that there are always hard costs that cannot be reduced in the supply chain … particularly those components that involve human labour — product creation, product transportation, product component creation, product component transportation, raw material mining, raw material component transportation, security guards for storage, etc. — and facility leases, utility cost, taxes, etc.

And there are always limits to “economies of scale” production lot sizes. If the line can only do 60 units per hour, then the line can only do 2400 in a normal workweek, 4800 in a double-shift workweek, 7200 in a triple-shift workweek, and it maxes out at 10,080 a week … assuming no downtime (and most lines will require some maintenance). In this case, the major economies of scale are at 2400, 4800, and 7200 units, as this ensures that the labour cost (and facility costs) are spread over the maximum number of units.
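The arithmetic above can be checked in a few lines. The 60 units/hour rate comes from the example; the fixed weekly cost figure is hypothetical, added only to show why per-unit overhead bottoms out exactly at the full-shift quantities.

```python
# A 60 unit/hour line at various shift lengths (hours per week)
# reproduces the weekly quantities quoted in the text.
RATE = 60  # units per hour

weekly = {hours: RATE * hours for hours in (40, 80, 120, 168)}
print(weekly)  # {40: 2400, 80: 4800, 120: 7200, 168: 10080}

# With a hypothetical fixed weekly labour + facility cost, per-unit
# overhead is lowest when the shift's full quantity is produced:
fixed_weekly_cost = 48000.0        # invented figure
at_epq = fixed_weekly_cost / weekly[40]   # 20.0 per unit at 2400
off_epq = fixed_weekly_cost / 1800        # ~26.7 per unit at 3/4 load
print(at_epq, round(off_epq, 2))
```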

In other words, there are economic production quantities (EPQ) where the price per unit is minimized, and this is the optimal economy of scale.

So if you really want to minimize your costs, you can start by minimizing your supplier and carrier costs, which can be done by appropriately distributing the award across suppliers in economic production quantities that allow them to give you larger discounts (and still retain a reasonable margin). So how do you do that? Considering that each supplier has a different EPQ, each carrier has a different EPQ, and this varies by product (and plant location), how can you possibly figure out how to split the award in such a way that you enable suppliers to reduce their bids?

If you’re a regular reader of Sourcing Innovation, you know the answer. A decision optimization platform …
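To see why this is an optimization problem and not a spreadsheet exercise, here is a toy version: two suppliers whose prices improve with each full EPQ lot, subject to lot-count limits. Every figure is invented, and the exhaustive search below stands in for the mixed-integer solve a real decision optimization platform would perform across many suppliers, products, and carriers.

```python
from itertools import product

# Split a 12,000-unit award between two suppliers who both produce in
# 2400-unit economic lots. Per-unit price improves with each full lot
# awarded; each supplier can take only so many lots. Invented numbers.
DEMAND = 12000
EPQ = 2400
suppliers = {
    "A": {"base": 10.0, "disc": 0.15, "max_lots": 4},
    "B": {"base": 9.9,  "disc": 0.05, "max_lots": 5},
}

def unit_price(s, lots):
    # Each additional full EPQ lot earns a per-unit discount.
    return s["base"] - s["disc"] * lots

best = None
for lots_a, lots_b in product(range(suppliers["A"]["max_lots"] + 1),
                              range(suppliers["B"]["max_lots"] + 1)):
    qty_a, qty_b = lots_a * EPQ, lots_b * EPQ
    if qty_a + qty_b != DEMAND:
        continue  # only EPQ-aligned splits that exactly cover demand
    cost = (qty_a * unit_price(suppliers["A"], lots_a)
            + qty_b * unit_price(suppliers["B"], lots_b))
    if best is None or cost < best[0]:
        best = (cost, qty_a, qty_b)
print(best)  # cheapest EPQ-feasible split: (cost, units to A, units to B)
```

Note that the winning split is not obvious from the base prices alone; the lot-based discounts and lot-count caps interact, which is exactly why the combinatorics explode as suppliers, carriers, and plants are added.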

Is WalMart Going to Force Logistics Scheduling Optimization Mainstream?

Recently, Spend Matters pointed out that Retail Mega-Giant Wal-Mart is stepping up its pressure on suppliers to get fulfillment perfect or pay a fine. According to Bloomberg, the goal is to add 1 Billion to revenue by improving (desired) product availability at stores (as the average stock-out rate of 8% costs a mega-retailer like Wal-Mart an awful lot of money).

But it’s not just stock-outs costing Walmart money. It’s deliveries that don’t happen when they are expected to happen. If a delivery arrives late, then warehouse workers have to stay overtime to get the truck unloaded, and that costs Walmart at least time and a half for every hour the workers have to stay late (plus any hours they had to be paid to wait around, probably doing nothing, for the delivery). If a delivery arrives (a day) early, then regularly scheduled deliveries have to be pushed ahead, possibly contributing to overtime and payment for empty hours (when workers show up for their shift and there is no work to be done for two hours).

And if trucks are waiting in winter, the drivers are not only being paid to sit and wait, but are probably also idling their trucks to keep warm, burning fuel and bumping up costs. So the supplier is paying more to deliver, and passing that cost on to Walmart. When you think of how many early and late deliveries a mega-retailer like Wal-Mart must get, and you add up all the OT costs, empty-hour costs for warehouse workers and drivers, and additional fuel costs, that is a lot of money even before you take into account the potential losses from stock-outs.
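A rough cost sketch shows how quickly one late delivery adds up. The crew size, wage, and fuel rates below are invented for illustration only.

```python
# Back-of-the-envelope cost of a single late delivery, combining the
# three cost drivers discussed above: overtime unloading at time and a
# half, paid waiting hours, and the driver idling the truck. All rates
# are hypothetical.
def late_delivery_cost(hours_late, crew_size=4, base_wage=20.0,
                       idle_fuel_per_hour=3.5):
    overtime = hours_late * crew_size * base_wage * 1.5  # time and a half
    waiting = hours_late * crew_size * base_wage         # paid to wait
    idling = hours_late * idle_fuel_per_hour             # fuel burned
    return overtime + waiting + idling

print(late_delivery_cost(3))  # one truck, three hours late
```

Multiply a figure like this across thousands of deliveries a week and the motivation for fining imperfect fulfillment is clear.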

Bravo to Wal-Mart for trying to force more perfection into the supply chain and eliminate the considerable losses that come from imperfect orders. But how will the average supplier and/or carrier comply? Logistics scheduling can be a nightmare, and way too much for the average scheduler, or spreadsheet, to handle. But as we’ve indicated before, it is not too much for an appropriately defined optimization solution. It’s about time optimization got more respect, even if it starts with scheduling.
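One flavour of the structure a scheduling optimizer exploits: on a single unloading dock, sequencing trucks shortest-unload-first (the classic SPT rule) minimizes the total time trucks spend waiting plus unloading. This is a deliberately stripped-down illustration with invented times; real dock scheduling adds time windows, multiple docks, and crew shifts, which is where a full optimization model earns its keep.

```python
from itertools import permutations

# Three trucks with invented unload durations (hours) at one dock.
unload_hours = [3, 1, 2]

def total_completion(order):
    # Sum of each truck's completion time (its wait plus its unload).
    t, total = 0, 0
    for job in order:
        t += job
        total += t
    return total

# Shortest-processing-time order, checked against brute force.
spt = sorted(unload_hours)
brute_best = min(permutations(unload_hours), key=total_completion)
assert total_completion(spt) == total_completion(brute_best)
print(spt, total_completion(spt))
```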

And while optimization needs to be more universally applied, once a supplier or carrier gets comfortable with scheduling optimization, they’ll get more comfortable with optimization in general and move on to the adoption of decision optimization for logistics, and that’s just one step away from the application of decision optimization to high value / strategic events. And that’s, hopefully, only one step away from the universal application of optimization across all sourcing events.

So while this isn’t the most critical application of optimization for an average organization, it’s a great start and bravo to Wal-Mart for forcing suppliers and carriers to perform better in a manner that should force the eventual adoption of optimization.

And if you don’t like it, get over it. And if you don’t like Wal-Mart, remember, their dominance is all your fault.

The UX One Should Expect from Best-In-Class Optimization … Part IV

In our last post on The UX One Should Expect from Best-in-Class Optimization (Part III) we continued our foray into the world of optimization and how the requirements for a good user experience go far beyond the requirements for e-RFX and even best-in-class e-Auction platforms. (We will remind you that you can review the basic requirements for e-RFX and e-Auction in our four part series on Best-in-Class e-Sourcing and Best-in-Class e-Auctions, Part I, Part II, Part III, and Part IV.)

As per the previous entry in this series, the doctor cannot emphasize enough just how important sourcing optimization is, and is about to become, for any organization that wants to continue to not only find savings but identify crucial value. This is because, as stated in our last post, in many global jurisdictions, and in North America and the UK (where many global organizations are headquartered) in particular, savings are about to go up in smoke due to inflation, protectionist policies, insufficient supply of raw materials, and forthcoming breakdowns in trade agreements.

There’s no other solution that can not only identify year-over-year savings of 10% or more but also identify an award scenario that will allow the organization to realize those savings if the award scenario is adhered to. It’s getting hard to understand how organizations, getting more budget-stressed by the year, can continue to hold out from adopting the only solution that can help them.

Buyers have to put aside their fear of the maths and go out and get a solution ASAP. But not just any solution claiming to be a true strategic sourcing decision optimization platform, or even any true strategic sourcing decision optimization platform, will do. It has to be one that provides the right user experience.

So what is the right user experience? One that satisfies about a dozen different key usability requirements, of which we’ve already discussed three: true, powerful cost modelling; guided sourcing by way of system-assisted “what-if” support; and identification of the constraints, or sets thereof, that are preventing a solution when the model is over-constrained and a solution is needed.

But this, as you can guess, is not enough. It’s only a fraction of the key requirements that need to be satisfied in order to provide the necessary user experience that will drive each and every buyer to find the value that only a true best-in-class optimization solution can be used to find.

Another key requirement, but far from the last, is analytics-driven comparison reporting. You see, one thing that is critical in the identification and allocation of the proper award is, as the co-authors of this series identify in The UX One Should Expect from Best-In-Class Optimization Part IV over on Spend Matters Pro [membership required], effective visualization.

Why is effective visualization so critical? Consider that a power buyer’s job is to try to find the perfect scenario with the almost perfect award that maximally satisfies the constraints and the desires of all the stakeholders. This could require the comparison of dozens of alternatives that have about the same costs and about the same overall total constraint violation, but that have the potential to dissatisfy each group in significantly different ways. Without an effective analytics and visualization component that can help a user see differences that are mathematically minimal but far from minimal in the minds of the stakeholders, a user could spend days or weeks trying to decide on the right award variation among dozens of spreadsheets, when a good visualization solution can eliminate the less desirable candidates in minutes. It’s another key requirement of a good, modern user experience in sourcing optimization.
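Even before any charting, simple analytics can cull the comparison set. The sketch below, with invented scenario data, filters out award candidates that are dominated (no better on any metric, worse on at least one) across cost and per-stakeholder constraint violations, which is one of the first things a good comparison view makes visible.

```python
# Candidate award scenarios: (total cost, violations per stakeholder).
# All figures invented; three stakeholder groups for illustration.
scenarios = {
    "S1": (1000, [2, 0, 1]),
    "S2": (1005, [0, 1, 1]),
    "S3": (1010, [2, 1, 1]),   # worse than S2 on every metric
    "S4": (998,  [3, 2, 0]),
}

def dominates(a, b):
    # a dominates b if a is no worse on every metric and strictly
    # better on at least one (lower cost / fewer violations is better).
    av, bv = [a[0]] + a[1], [b[0]] + b[1]
    return all(x <= y for x, y in zip(av, bv)) and av != bv

frontier = {name for name, s in scenarios.items()
            if not any(dominates(t, s) for t in scenarios.values())}
print(sorted(frontier))  # only the non-dominated candidates remain
```

The surviving candidates still trade off against each other in stakeholder-specific ways, and that residual trade-off is exactly what the visualization layer has to make visible.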

But, as you can guess, this is still not all a platform needs to do. For a deeper dive on this, and other requirements, check out the doctor and the prophet‘s final instalment in our optimization UX series, The UX One Should Expect from Best-In-Class Optimization Part IV, over on Spend Matters Pro [membership required] and stay tuned for our upcoming deep-dive series on the required UX for spend analysis as this will not only help you select an optimization solution with a good visualization solution but an analytics solution with a good visualization component as well.

The UX One Should Expect from Best-In-Class Optimization … Part III

In our last post on The UX One Should Expect from Best-In-Class Optimization (Part II) we continued our foray into the world of optimization and how the requirements for a good user experience go well beyond the requirements for e-RFX and even best-in-class e-Auction platforms. (As for the basic requirements of any e-RFX or e-Auction platform, see our two-part series on Best-in-Class e-Sourcing Part I and Part II and our deeper dive into Best-in-Class e-Auctions Part I and Part II.)

the doctor cannot emphasize enough just how important sourcing optimization is, and is about to become, for any organization that wants to continue to not only find savings but identify crucial value; to be blunt, it is about to become unavoidable. This is because, especially in North America and the UK (where many global organizations are headquartered), savings are about to go up in smoke due to inflation, protectionist policies, insufficient supply of raw materials, and forthcoming breakdowns in trade agreements.

Considering that it’s one of only two advanced sourcing solutions that can be used to generate year-over-year savings, and value, in excess of 10%, and the only solution that can not only identify the potential but also identify an actual award scenario to realize the savings and/or value, organizations getting budget-stressed year-over-year will not be able to hold out much longer — at least not if they want to stay competitive.

But not just any platform will do; it has to be usable. Most buyers, who are afraid of, as the Brits like to say, the maths, are naturally afraid of such platforms, which they still fear are math-heavy and insanely difficult to use (like the first-generation platforms tended to be), and that fear will only be overcome if they see that a great user experience awaits. And we mean great.

What does that entail? About a dozen different usability requirements, of which we’ve already discussed two: true, powerful cost modelling and guided sourcing by way of system-assisted “what-if” support.

But this is just the beginning. Another core requirement, and one the vast majority of systems have missed in the past, is automatic identification of unsatisfiable constraints. In basic sourcing optimization events, as the doctor and the prophet pointed out over on Spend Matters Pro [membership required] in our latest article on What To Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 3), the “model” itself will only be lightly constrained, as procurement will want to limit the potential inputs to award scenarios (all of which impact price) to one or only a handful of constraints. Quite often these constraints are centred around capacity in potential award decisions. But as stakeholders gather and add more constraints, the potential to create a null award set increases. And if the scenario can’t be solved, the value of optimization can’t be realized.

Sometimes a human can easily figure out which constraints are preventing a solution, and sometimes, because there are so many, each only weakly constraining the award, and it’s actually a set of 6 to 10 working in unison making the scenario unsolvable, a human can’t. And if a human’s only option to find a reasonable solution is to delete constraints until the model solves, that’s not likely to generate the lowest cost award or the award that satisfies the stakeholders’ desired constraints to the maximum extent possible. The buyer or analyst really needs to know the exact constraints, or sets, causing the model to be unsolvable so she can modify, or remove, just those and present her stakeholders with the lowest cost / highest value award that violates the smallest number of desired constraints.
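One standard technique behind this feature is a "deletion filter" for an irreducible infeasible subset: drop each constraint in turn, and keep it in the conflict set only if removing it restores feasibility. The toy model below, with invented constraints that each just bound a single quantity, is a deliberately minimal sketch of the idea, not any vendor's implementation.

```python
# Toy deletion filter for an irreducible infeasible subset (IIS).
# Each invented constraint bounds one quantity x as (lo, hi); a set of
# constraints is feasible iff the intervals overlap.
constraints = {
    "budget":      (0, 100),    # x <= 100
    "min_volume":  (150, 999),  # x >= 150, clashes with budget
    "capacity":    (0, 500),
    "min_quality": (50, 999),
}

def feasible(names):
    los = [constraints[n][0] for n in names]
    his = [constraints[n][1] for n in names]
    return max(los) <= min(his)

def deletion_filter(names):
    # Try removing each constraint; if the rest is still infeasible
    # without it, it is not part of the conflict and stays removed.
    core = list(names)
    for n in names:
        trial = [c for c in core if c != n]
        if trial and not feasible(trial):
            core = trial
    return core

assert not feasible(constraints)  # the full model is over-constrained
print(deletion_filter(list(constraints)))  # the minimal conflict set
```

Reporting just that conflict set ("budget" versus "min_volume" here) is precisely what lets the buyer relax two constraints instead of blindly deleting constraints until something solves.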

But, as you can guess, that’s not all a platform needs to do. For a deeper dive, check out the doctor and the prophet‘s latest instalment in our optimization UX series, What To Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 3), over on Spend Matters Pro [membership required] and stay tuned for Part IV.