Category Archives: Decision Optimization

Embracing Complexity

Recently, Supply and Demand Chain Executive ran an article on “Embracing Complexity” that pointed out that supply networks are becoming increasingly extended and complex; that integration between companies and their trading partners is deepening at the systems and process levels; and that emerging technologies like radio frequency identification are producing ever-growing mountains of supply chain data. These and other factors threaten to overwhelm the systems that companies rely on to monitor and manage their flows of goods, and 20th-century systems may be inhibiting companies from moving toward a 21st-century supply chain.

In addition, it presented the insights of Lawrence Davis, a senior fellow at NuTech Solutions (acquired by Netezza Corp), into problems with current supply chain technologies. In short, he believes that contemporary solutions do not allow companies to optimize at the appropriate level of aggregation, and that companies should be able to use solutions to optimize across their sourcing and procurement, production, and distribution processes all at the same time; that software solutions that optimize based on deterministic assumptions about how long any given process will take produce “perfect” schedules that do not allow for breakdowns of machinery, traffic jams, defective parts, and other real-world events; and that stochastic simulations which employ embedded agents that follow the company’s business rules are required.

They got the problems right, but I’m not sure I agree with the proposed solutions. Here’s a short list of reasons why.

  1. Optimizing at the appropriate level of aggregation has always been a discipline-independent problem and we’ve always managed. It’s as much a process problem as a technology problem. It all comes down to using appropriate levels of abstraction that allow us to connect larger and larger problems. And it works. You don’t need to simultaneously optimize all of your categories and all of your lanes – a problem you can’t solve. You can optimize all of your buys using high-order freight approximations, then collectively optimize your freight costs and distribution network.
  2. Deterministic models can be run on approximations and ranges as well as on precise values. Yes, the results are still “perfect ranges”, but you can capture most of the likely outcomes. Moreover, none of the technologies proposed will capture every exception, and you’ll still need exception management.
  3. Stochastic simulations are a good methodology for determining what could go wrong, but the key is identifying a set of collaborative systems that can embed the company’s business rules – because, as I just said, the processes are as important, if not more so, than the technology.
  4. The technologies proposed – “genetic algorithms”, “evolutionary computation”, and “deterministic simulation” – are not silver bullets, just like ERP was not the silver bullet you needed to manage your supply chain. They have their uses, but they are not that much better than today’s technologies, if they are better at all (as they all have their drawbacks).
  5. You’ll never be able to optimize everything. For that, you’d need a model that accounts for everything (and first of all, we can’t model the market), then you’d need an expensive High Performance Computing Cluster with hundreds (or thousands) of processors and a significant amount of memory, and finally you’d need an algorithm that can take advantage of the highly parallel machines – and you’ll quickly find that most of today’s optimization technologies, or at least the sound and complete ones, do not have efficient massively parallel implementations.

It’s true we still have a long way to go in supply chain, and that we do have to embrace technology, but we have to be careful of over-relying on new technologies to solve all of our problems, particularly those whose drawbacks are as significant as the advantages they are being promoted for. Although some things change, some things will stay the same – and the constant is that, no matter what, we are going to need more brain power and good old-fashioned human ingenuity to get to the 21st-century supply chain.

One can wish it were otherwise but, speaking as a technologist and former academic who could spend countless posts educating you not only on “genetic algorithms”, “evolutionary computation”, and “deterministic simulation” but also on “fractal geometry” (the basis for NuTech’s logo), “chaotic dynamical systems”, and “complexity theory”, I can tell you it’s not the case. Technology is just a tool – the real solutions will come from the brains who can identify the problems, identify the process solutions, and then put the appropriate technology in place to back it up.

Advanced Sourcing is Where It’s At

Forget “Two Turntables and a Microphone” (e-Sourcing Forum, [WayBackMachine]). Advanced Sourcing is where it’s at, and Aberdeen just proved it again.

Regular readers, especially those who followed my summer series over on eSourcing Forum, will know that my favorite statistic to quote is Aberdeen’s finding (from their “Success Strategies in Advanced Sourcing and Negotiations: Optimizing Total Costs and Total Value for the Next Wave of e-Sourcing Savings” in June of 2005) that the application of optimization tools to analyze total costs, and of flexible bidding functionality to uncover creative supplier solutions, has enabled early adopters to identify an average incremental savings of 12% above those that basic, price-focused auctions alone have generated. This month, Aberdeen released the follow-up to this study, “The Advanced Sourcing and Negotiation Benchmark Report: The Art and Science of the Deal”, which found that enterprises employing advanced sourcing techniques are still identifying an average savings of 11.9% per sourcing event. Furthermore, best-in-class enterprises are identifying an average savings of 13.7% per event. Considering that savings from basic sourcing techniques tend to reach saturation after a handful of events, the fact that these companies are not only fighting off stagnation but still thriving is exemplary of the power of advanced sourcing and negotiation, which includes bid optimization, cost modeling, flexible bidding, life-cycle sourcing, and Total Cost of Ownership / Total Value Management scoring techniques.

That’s why I spend so much time on true decision optimization for strategic sourcing – which, as I’ve pointed out before, must include the capability to capture all fixed and variable real-world costs accurately (including flexible bidding, tiered bidding, and life-cycle cost support), to accurately model real-world constraints (which impacts cost modeling and TCO/TVM), and to accurately solve the model (using an optimization algorithm that is sound and complete). True decision optimization for strategic sourcing supports and complements all aspects of advanced sourcing and negotiations, and I’m sure Paul Martyn will have more to say on the topic over on CombineNotes [WayBackMachine] as CombineNet was a report sponsor. (I also expect David Bush will analyze some of the key findings over on e-Sourcing Forum as Iasta was also a report sponsor – so be sure to keep your eyes on that blog as well.)

In the meantime, if you haven’t yet started to use decision optimization in your high-value or strategic events – and statistics tell me that the vast majority of you have not – start evaluating and test-driving the solutions the market has to offer. After all, It Pays to be World Class.

Missing the Point … or … The Right Way to Handle Freight

Last week I summarized my comments on how Sometimes 80% is enough here on Sourcing Innovation. I did this for multiple reasons, chief among them that not everyone seems to get the point that, with regards to optimization, not only is 100% unattainable, but even striving for 100% is often ludicrous.

The reason for this is that you are never optimizing against actual data, but estimated data. Remember, when you are sourcing, you are sourcing against forecasted needs, on forecasted schedules, with forecasted shipment levels associated with forecasted freight costs. Your demand will probably vary slightly, and may vary significantly; your schedules will need to be accelerated or decelerated when demand spikes or drops; your shipment sizes will also vary with seasonal demand variations; and, with freight surcharges the norm these days, your freight rates will never be locked in stone. Thus, even an “optimal” solution is not optimal.

Moreover, striving for an optimal solution instead of settling for a (very) near-optimal solution may actually decrease the quality of your solution. For example, let’s say your supplier gives you a significant discount (in the form of a rebate) of 10% if you buy 60,000 units, and your anticipated demand is precisely 60,000 units. Let’s say you award the supplier the business, but your forecast was 5% high and you only buy 57,000 units, missing the rebate. Let’s also say that the second-cheapest supplier was priced only 3% above the discounted rate. In this situation, your search for the ultimate solution cost you roughly 7%!
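The arithmetic above can be checked with a quick sketch. All prices and the rebate structure are invented for illustration; supplier A lists at $1.00/unit with a 10% rebate at 60,000+ units, and supplier B (the runner-up) is priced 3% above A’s rebated unit rate:

```python
# Illustrative check of the rebate example above (all figures invented).

def supplier_a_cost(units, list_price=1.00, rebate=0.10, threshold=60_000):
    """Total cost from supplier A; the rebate applies only at or above the threshold."""
    total = units * list_price
    if units >= threshold:
        total *= (1 - rebate)
    return total

def supplier_b_cost(units, list_price=1.00):
    """Supplier B's price: 3% above A's rebated unit rate."""
    return units * list_price * 0.90 * 1.03

units_actual = 57_000  # the 60,000-unit forecast came in 5% high

cost_a = supplier_a_cost(units_actual)  # rebate missed: full list price
cost_b = supplier_b_cost(units_actual)

print(f"A (rebate missed): ${cost_a:,.2f}")
print(f"B (runner-up):     ${cost_b:,.2f}")
print(f"Penalty for chasing the rebate: {cost_a / cost_b - 1:.1%}")
```

Compounding the percentages gives a penalty of roughly 7.9%, in line with the roughly 7 points (10% minus 3%) the example describes.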

As another example, let’s say a certain carrier will beat every other carrier’s truckload rate by 10%, where the truckload rate applies if you fill 75% or more of the truck. Let’s also say that your expected shipment is 80% of a truckload, that a shipment below the 75% threshold costs 20% more than the average shipping cost across your other carriers, and that your shipment size varies significantly by season and promotion (because you are in the food service industry, for example). One week you’ll ship 80%, the next week 60%, and the week after 120%. Chances are good that, in reality, you will not be shipping at the truckload rate half the time, and will be paying, on average, 10% more. (If you pay 20% more half the time, you pay 10% more overall.)
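A back-of-the-envelope sketch of the scenario above, under simplifying assumptions of my own (a normalized baseline rate of 1.00 per truckload-equivalent, and the 120% week counted at the truckload rate for its full volume):

```python
# Back-of-the-envelope check of the seasonal-shipment example above.
# Assumptions (illustrative): baseline = average rate across the other
# carriers; the featured carrier charges 10% less when the truck is at
# least 75% full, and 20% more when it is not.

BASELINE = 1.00  # average cost per truckload-equivalent, other carriers

def featured_rate(fill):
    """Featured carrier's rate: 10% below average at >= 75% fill, else 20% above."""
    return 0.90 * BASELINE if fill >= 0.75 else 1.20 * BASELINE

weekly_fills = [0.80, 0.60, 1.20]  # the rotation described in the post

cost_featured = sum(fill * featured_rate(fill) for fill in weekly_fills)
cost_baseline = sum(fill * BASELINE for fill in weekly_fills)
savings = 1 - cost_featured / cost_baseline

print(f"Featured carrier: {cost_featured:.2f}")
print(f"Baseline:         {cost_baseline:.2f}")
print(f"Actual savings:   {savings:.1%}")  # far short of the headline 10%
```

Even under these generous assumptions, the realized savings land at roughly 3%, nowhere near the headline 10% discount – exactly the kind of gap between the “optimal” award and real-world outcomes the example is warning about.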

So this brings me back to the title of my post – the right way to handle freight. First of all, let’s note that when dealing with freight, you have one of five situations:

  • Freight is a small percentage of total spend, less than 20%
  • Freight is a moderate percentage of total spend, 20% to 40%
  • Freight is more or less equal to the product spend, 40% to 60% of total spend
  • Freight is a large percentage of total spend, 60% to 80%
  • Freight is a majority percentage of total spend, greater than 80%

The first case is the most important case. Why? Because it is this case that I find to be the most mishandled and misunderstood. I know for a fact that many corporations have thrown away millions, if not tens or hundreds of millions, of dollars because of their belief that freight optimization needs to be perfect even when it falls into this case, and have put off acquiring a decision optimization solution in hopes that the perfect solution will come along soon.

This is the case where the “sometimes 80% is enough” rule comes into play. If someone provides you with an optimization solution that can handle your buy almost perfectly but can only handle freight at 80% accuracy, don’t dismiss it as imperfect and pass up an opportunity to save millions just because it’s not perfect in your eyes. Do the math! If freight is at most 20% of your spend, and the freight portion of the solution is at least 80% accurate, then the solution computed by the optimizer will be at least 96% optimal. If freight is at most 10% of your spend, then the solution computed by the optimizer will be at least 98% optimal. If your non-optimization-assisted solution doesn’t even approach 90% of optimal, why would you pass up an opportunity to save an extra 6% to 9%? After all, as per my arguments above, I’d argue you are never going to achieve more than 98% (on average) in reality anyway! So don’t look for perfection when evaluating optimization solutions – chances are you will not find it (even though some solutions might come quite close), as optimization is still a maturing and improving technology.
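The “do the math” exercise above amounts to a spend-weighted average, which a few lines make explicit:

```python
# The "do the math" claim above, made explicit: overall solution quality
# is the spend-weighted average of how well each component is handled.

def overall_quality(freight_share, freight_accuracy, non_freight_accuracy=1.0):
    """Spend-weighted quality: non-freight handled (near) perfectly,
    freight handled at the given accuracy."""
    return (1 - freight_share) * non_freight_accuracy + freight_share * freight_accuracy

print(f"{overall_quality(0.20, 0.80):.0%}")  # freight 20% of spend -> 96%
print(f"{overall_quality(0.10, 0.80):.0%}")  # freight 10% of spend -> 98%
```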

What about the other cases? The fifth case, where freight is the majority of your spend, is also easy – you simply invert the problem: source the freight lanes, and treat the product buy the way you would otherwise treat freight.

The middle cases are harder. As for cases two and four, where freight is a moderate or large percentage of spend, the best way to handle them is to combine the categories with similar categories that can, or in all likelihood will, be shipped on the same trucks or in the same lanes – preferably categories where, in case two, freight is a significantly lower share (bumping the combined category back into the first case) or where, in case four, freight is a significantly higher share (bumping it up into the fifth case), since we already know how to handle those cases. Case three is the toughie: product cost and freight are almost equal. What do you focus on?

This is the case where you do enterprise-wide freight optimization: you optimize all of the product buys, amalgamate all of the freight requirements, and then optimize the freight. Unless, of course, your spend is significant enough, your pocketbook deep enough, and your patience long enough to throw CombineNet’s top-end optimization platform at the combined problem. (It really depends on your organization’s size – if you are a large organization, the cost of CombineNet should be inconsequential, especially considering the potential savings. If you are a small organization, the difference between the cost of the solution and the expected savings is not likely to be significant. If you are a mid-size organization, it depends on category size and characteristics.) On an ultra-high-end server, their platform can certainly handle most of the problems you can throw at it, but not all problems solve in less than a second – a large and complex enough problem will take minutes, hours, or even days regardless of how good your optimization platform is. (However, if it takes more than a few hours, chances are your model is not the right one.) Also, since their solution is not part of any suite where your data resides, there will be some integration time. (But that’s a small price to pay to save $$$!)
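The two-phase approach described above – optimize the product buys, amalgamate the freight requirements, then optimize the freight – can be sketched as follows. All suppliers, lanes, and rates are hypothetical, and a greedy cheapest-wins rule stands in for a real constrained solver:

```python
# Hypothetical two-phase sketch. Phase 1 awards each product to its
# cheapest supplier; phase 2 pools the resulting shipments by lane and
# awards each lane to its cheapest carrier. A real tool solves both
# phases as constrained optimizations; greedy selection is a stand-in.

# Phase 1: product buys (price per unit, by supplier).
product_bids = {
    "forks":  {"SupplierA": 0.50, "SupplierB": 0.55},
    "spoons": {"SupplierA": 0.45, "SupplierB": 0.40},
}
supplier_lane = {"SupplierA": "Shanghai-Chicago", "SupplierB": "Hamburg-Chicago"}
demand = {"forks": 10_000, "spoons": 8_000}

awards = {product: min(bids, key=bids.get) for product, bids in product_bids.items()}

# Phase 2: amalgamate freight requirements by lane ...
lane_volume = {}
for product, supplier in awards.items():
    lane = supplier_lane[supplier]
    lane_volume[lane] = lane_volume.get(lane, 0) + demand[product]

# ... then award each lane to its cheapest carrier (rate per 1,000 units).
carrier_rates = {
    "Shanghai-Chicago": {"CarrierX": 120.0, "CarrierY": 110.0},
    "Hamburg-Chicago":  {"CarrierX": 100.0, "CarrierY": 105.0},
}
freight_awards = {
    lane: min(carrier_rates[lane], key=carrier_rates[lane].get)
    for lane in lane_volume
}

print(awards)
print(lane_volume)
print(freight_awards)
```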

CombineNet VII: BoB’s Power Source

Yesterday I told you that not only could CombineNet support all of the basic cost and constraint categories required for true decision support, but that the model they generate accurately represents all of the costs and constraints they support and they can solve the model faster than all of their competitors the vast majority of the time. I also told you that today I’d highlight where this unique capability comes from.

Paul highlighted it in his recent post “Now that’s an Electric Engine …” on CombineNotes [WayBackMachine]. CombineNet’s ClearBox uses sophisticated tree-search algorithms to find the optimal allocation. Furthermore, the algorithms are ‘anytime algorithms’ – they provide better and better solutions during the search process. CombineNet has also invented a host of proprietary techniques in tree-search algorithms, including new branching strategies, custom cutting-plane families, cutting-plane generation and selection techniques, and machine learning methods for predicting what techniques will perform well on the instance at hand (for use in dynamically selecting a technique).
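To make the “anytime tree search” idea concrete, here is a minimal branch-and-bound sketch for a toy winner-determination problem (selecting non-overlapping bids of maximum value). This is not CombineNet’s algorithm – just an illustration of a search that keeps improving its incumbent solution and prunes branches that cannot beat it; the bids are invented:

```python
# Minimal "anytime" branch-and-bound: pick a set of non-overlapping bids
# maximizing total value. Production solvers add cutting planes, branching
# strategies, and learned heuristics on top of this skeleton.

# Hypothetical bids: (value, set of items the bid covers)
bids = [
    (6, {"A", "B"}),
    (5, {"B", "C"}),
    (4, {"C"}),
    (3, {"A"}),
]

best = {"value": 0, "chosen": []}

def search(i, taken_items, value, chosen):
    """Depth-first branch and bound over include/exclude decisions."""
    if value > best["value"]:          # new incumbent: the 'anytime' part --
        best["value"] = value          # the caller could read this mid-search
        best["chosen"] = list(chosen)
    if i == len(bids):
        return
    # Bound: even taking every remaining bid cannot beat the incumbent.
    if value + sum(v for v, _ in bids[i:]) <= best["value"]:
        return
    bid_value, items = bids[i]
    if not (items & taken_items):      # branch 1: include bid i if feasible
        search(i + 1, taken_items | items, value + bid_value, chosen + [i])
    search(i + 1, taken_items, value, chosen)  # branch 2: exclude bid i

search(0, set(), 0, [])
print(best)  # best non-conflicting bid set found
```

The crude “sum of remaining values” bound is where real solvers invest their effort: tighter bounds (e.g., from cutting planes) prune far more of the tree, which is exactly the kind of proprietary improvement the post describes.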

Even though, in principle, every model can be built and solved on an off-the-shelf optimizer, the reality is that we are dealing with NP-Complete problems, which means that solve time generally increases exponentially with problem size. This means that for any given solver and any given model class, there is a limit on the average model size that can be solved in any given time window. Although an efficient model formulation combined with an appropriately tuned off-the-shelf solver can solve a very decently sized problem, one must not ignore the fact that generic solvers use generic algorithms which are not always suited to the problem at hand. It is therefore likely that an appropriately designed custom algorithm could solve the model faster, if not significantly faster.

What is not as obvious is the degree of difficulty often associated with developing these custom algorithms for strategic sourcing decision optimization. The nature of these problems is that it is very hard to determine, for any given solution strategy and any given model instance, whether that strategy is more or less likely to solve the model faster than another similar strategy. That is the fundamental nature of NP-Completeness – if we knew how to predict it reliably, we’d likely be well on our way to resolving P vs. NP.

As highlighted in the post, CombineNet began developing its algorithms in 1997 and has 16 people working on them; they have tested hundreds of techniques to find those that shorten solve time for Expressive Commerce clearing problems. Some of the techniques are specific to market clearing, and others apply to combinatorial optimization more broadly. And that’s where the strength of CombineNet is – 10 years of research focused on determining how to solve, quickly and optimally, the combinatorial optimization problems that underlie strategic sourcing decision optimization. Everything else is just icing on the cake.

CombineNet VI: Strategic Sourcing Decision Optimization

We ended our last post by noting that what is important in strategic sourcing decision optimization is:

  1. The ability to support all of the relevant costs and cost tiers.
  2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.
  3. The ability to generate a model that accurately represents all of the relevant costs and constraints.
  4. The ability to optimally solve the model in a realistic time frame.

Other requirements, which should go without saying, are:

  • Solid Mathematical Foundations
  • What If? Capability

Today we are going to discuss each of these points in detail and explain where the true power of a BoB solution like CombineNet lies and what’s just confusing marketing hype.

1. The ability to support all of the relevant costs and cost tiers.

Fundamentally, in order to be a true solution for strategic sourcing decision optimization, the application has to support fixed and variable costs, and, furthermore, the application should allow those costs to be bid in a tiered or layered fashion or as discounts, so that a buyer can use the bidding structures that are natural to the commodity or industry they are in. This includes the ability to define unit costs, transportation costs, usage costs, and impact costs as well as sophisticated supplier discounts along the lines of “If you buy 10,000 forks, I’ll give you a discount on 10,000 spoons”.
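As an illustration of what tiered pricing and cross-item discounts look like as cost functions (the prices, tiers, and the fork/spoon discount below are all invented):

```python
# Illustrative cost functions for the bidding structures described above.

def tiered_unit_cost(qty, tiers):
    """All-units tiered pricing: the whole order is priced at the unit
    cost of the highest tier the total quantity reaches."""
    for threshold, unit_price in sorted(tiers, reverse=True):
        if qty >= threshold:
            return qty * unit_price
    return 0.0

# Forks: $1.00/unit below 5,000; $0.90 from 5,000; $0.80 from 10,000.
fork_tiers = [(0, 1.00), (5_000, 0.90), (10_000, 0.80)]

def basket_cost(forks, spoons, spoon_price=0.70):
    """Cross-item discount: buying 10,000+ forks earns 5% off the spoons."""
    spoon_cost = spoons * spoon_price
    if forks >= 10_000:
        spoon_cost *= 0.95
    return tiered_unit_cost(forks, fork_tiers) + spoon_cost

print(basket_cost(4_000, 1_000))   # no tier break, no cross-item discount
print(basket_cost(10_000, 1_000))  # tier break and spoon discount both apply
```

Note that the fork quantity changes the spoon cost – which is precisely why these structures have to be modeled and solved jointly rather than item by item.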

2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.

Fundamentally, the optimization solution must support, at a minimum, four basic categories of constraints: (a) capacity constraints, (b) flexible allocation, (c) risk mitigation, and (d) qualitative constraints in order for it to be a real strategic sourcing decision optimization product.

You need to take into account all of your supplier capacities, you need to be able to account for your current contracts and business rules, you need to ensure that sole-sourcing risks are addressed when required, and you need to be able to take into account your non-cost requirements such as quality, delivery time, durability, etc.
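A toy illustration of how three of the four constraint categories above show up in a model – capacity, risk mitigation, and qualitative floors – with a brute-force search standing in for a real solver (all data invented, prices in cents):

```python
# Toy brute-force allocation under (a) capacity, (c) risk mitigation, and
# (d) qualitative constraints. A real product would formulate this as a
# mixed-integer program rather than enumerate allocations.

demand = 100  # units to allocate

suppliers = {  # price per unit (cents), capacity (units), quality (0-10)
    "A": {"price": 100, "capacity": 80,  "quality": 9},
    "B": {"price": 90,  "capacity": 60,  "quality": 7},
    "C": {"price": 80,  "capacity": 120, "quality": 5},
}

MIN_QUALITY = 6.5   # qualitative: volume-weighted quality floor
MIN_SUPPLIERS = 2   # risk mitigation: never sole-source this category
STEP = 10           # allocation granularity for the enumeration

best = None
names = list(suppliers)
for a in range(0, demand + 1, STEP):
    for b in range(0, demand - a + 1, STEP):
        alloc = dict(zip(names, (a, b, demand - a - b)))
        # (a) capacity constraints
        if any(q > suppliers[s]["capacity"] for s, q in alloc.items()):
            continue
        # (c) risk mitigation: at least two suppliers carry volume
        if sum(q > 0 for q in alloc.values()) < MIN_SUPPLIERS:
            continue
        # (d) qualitative: weighted-average quality must clear the floor
        quality = sum(q * suppliers[s]["quality"] for s, q in alloc.items()) / demand
        if quality < MIN_QUALITY:
            continue
        cost = sum(q * suppliers[s]["price"] for s, q in alloc.items())
        if best is None or cost < best[0]:
            best = (cost, alloc)

print(best)  # cheapest allocation satisfying every constraint
```

Notice that the cheapest supplier cannot simply take the whole award: the quality floor and the no-sole-sourcing rule force volume to more expensive suppliers, which is exactly the trade-off a real constraint-aware optimizer resolves.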

3. The ability to generate a model that accurately represents all of the relevant costs and constraints.

Many solutions exist that let you define whatever you want, but under the hood the costs and constraints are simplified and only an approximate representation is used – which means the “optimal” award returned is optimal for a different model than the one you defined.

4. The ability to optimally solve the model in a realistic time frame.

This is actually a two-part requirement. The first requirement is that the system optimally solves the defined model, and not an approximation of the model. The second requirement is that the system solves the model in a realistic time frame. A small model should not take more than a few minutes. A medium-sized model should not take more than a few hours. A large model should solve overnight. Any longer and the usefulness of the solution is limited, especially when sourcing cycles are now completed in weeks, not months, and all that a buyer may have to make an award decision is a few days.

Furthermore, as I mentioned in my last post, where CombineNet really stands apart from the rest of the pack is:

3. Their ability to generate a model that accurately represents all of the relevant costs and constraints.

4. Their ability to optimally solve the model in a realistic time frame.

5. Their ability to solve larger models than the majority of their competitors.

Simply put, not only can they support all of the basic cost and constraint categories required for true decision support, but the model they generate accurately represents all of the costs and constraints they support and they can solve the model faster than all of their competitors the vast majority of the time. Furthermore, they have the capability to solve larger models than the vast majority of their competitors. And tomorrow we’ll discuss where this unique capability comes from and why that, and not Expressive-Bidding, Expressive-Commerce, Comprehensive Bidding, or whatever-you-want-to-call-it-today-bidding, is what makes CombineNet BoB.