Category Archives: Decision Optimization

CombineNet V: Expressive Bidding (in Combinatorial Optimizations)

I know I ended my last post indicating that my next post would put BoB in perspective by extolling the virtues of POE, but I’m getting really tired of CombineNet over-hyping Expressive Bidding, so I’m going to explain why Expressive Bidding and Expressive Commerce have nothing to do with the price of fish when we’re talking about BoB. (Don’t worry, this does not have to be a series of finite length, so I will eventually discuss the virtues of POE so that you can put the virtues of BoB in perspective.)

In Paul’s “2007 – The Year of the Supplier” post on CombineNotes [WayBackMachine], he says that Expressive Bids can include conditional (if/then) offers, volume discounts, packages of items (bundles), and other creative offers that take advantage of suppliers’ strengths and/or recent innovations, and that with Expressive Bidding, suppliers drive the inefficiencies out of their own business and share the savings with buyers, simultaneously strengthening strategic relationships for long-term supply chain efficiencies and competitive advantage.

First of all, there’s nothing here that you couldn’t do self-serve with MindFlow’s application back in 2000/2001, which was two years before CombineNet started using the terminology and filing for trademarks / copyrights / etc. Secondly, pieces of this functionality existed before that in Emptoris’ offering, FreeMarkets’ failed effort, i2’s early technology, etc. Thirdly, operations researchers have known how to do if-then constraints for at least two decades using the Theory of Logical Variables and its precursor instantiations. Fourthly, the same holds true for tiered bids (and every discount can be transformed to a tiered bid with a minimum buy if-then constraint and vice versa), which operations researchers have been accomplishing for even longer using piecewise-linear constraints. Fifthly, these bid styles existed long before the introduction of optimization technology, so there is fundamentally nothing new about what is being supported. Sixthly, they’re not the first company to come up with a wizard-like interface (although it looks like theirs may be better than most). I could go on, but you get the point.
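To ground the “Fourthly” point, here is a minimal pure-Python sketch (all prices, thresholds, and tiers invented purely for illustration, not anyone’s actual bid structure) of the equivalence between a volume discount and a tiered bid with a minimum-buy if-then condition: both are the same piecewise-linear cost function, just written two different ways.

```python
def discount_cost(qty, base=10.0, threshold=100, rebate=2.0):
    """Volume discount: pay `base` per unit, but an if/then rebate of
    `rebate` per unit kicks in once qty reaches the threshold."""
    unit = base - rebate if qty >= threshold else base
    return qty * unit

def tiered_cost(qty, tiers=((0, 10.0), (100, 8.0))):
    """Equivalent tiered bid: the unit price of the highest tier whose
    minimum-buy condition (qty >= tier minimum) is met applies to the
    whole quantity."""
    price = None
    for minimum, unit in tiers:
        if qty >= minimum:
            price = unit
    return qty * price

# The two formulations price every quantity identically.
assert all(discount_cost(q) == tiered_cost(q) for q in range(0, 500))
```

In a MIP formulation, the if/then condition is typically encoded with a binary indicator variable and a big-M constraint, which is exactly the logical-variables trick operations researchers have used for decades.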

I’m not saying that Expressive Bidding, Real-World Bidding, Comprehensive Bidding, or whatever-you-want-to-call-it-today bidding is not important. It is, because, without it, any optimization application with any degree of sophistication will be quite difficult to use. I’m just saying it’s not the greatest thing since sliced bread, which is what CombineNet’s marketing materials would lead you to believe.

What’s important is:

  1. The ability to support all of the relevant costs and cost tiers.
  2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.
  3. The ability to generate a model that accurately represents all of the relevant costs and constraints.
  4. The ability to optimally solve the model in a realistic time frame.

Where CombineNet really stands apart from the rest of the pack is with respect to:

  3. Their ability to generate a model that accurately represents all of the relevant costs and constraints.
  4. Their ability to optimally solve the model in a realistic time frame.
  5. Their ability to solve larger models than the majority of their competitors.

So tomorrow we’ll discuss these required capabilities and what BoB truly is, and, more importantly, what CombineNet’s offering really is and what is just annoying marketing hype.

CombineNet IV: BoB’s Unique Talents

Disclaimer: This blog, including this post, is not sponsored by CombineNet (acquired by Jaggaer). The author is not employed by, contractually engaged with, or affiliated with CombineNet. Any and all opinions expressed herein are solely those of the author. Furthermore, the opinions expressed herein should be contrasted with the opinions of other educated professionals before the reader forms his or her own opinion. Finally, the author is neither endorsing nor discouraging the use of CombineNet’s products or services – merely trying to spread awareness of the importance of optimization and the relative uniqueness of an offering like that of CombineNet. This disclaimer holds true for each post in this multi-part series and will be repeated.

Warning: This is a lengthy post.

In my last post, I outlined in some detail a problem that I felt not only required BoB (Best of Breed) but required CombineNet in particular for an optimal solution. What I did not convey is that not only are there other problems out there that I could have chosen, but there are a significant number of supply-chain related problems that often require BoB.

Today I am going to discuss six problems that generally require a BoB solution. This does not mean that you would necessarily require CombineNet (there are some other optimization vendors that can tackle a few of these), but that you would require a best of breed optimization solution (similar to that offered by CombineNet) to tackle these problems and be assured that the solution you achieved was optimal.

The problems I am going to discuss are:

  1. Distribution Network Design
  2. Large Combinatorial Problems
  3. Large Non-Homogeneous Logistics Problems
  4. Non-Traditional Sourcing Problems
  5. Very Large (Traditional) Sourcing Problems
  6. Regret Minimization Problems

Distribution Network Design

Most large retailers or distributors have large distribution networks – often dozens of locations throughout a single country or region. However, this is generally not optimal. For example, in a talk at INFORMS, a practitioner from APL Logistics described how they analyzed the distribution network used by JC Penney, which had almost 60 DCs (and cost over $330M/yr to operate). Using in-house proprietary meta-heuristic optimization algorithms, they deduced an optimal distribution network that had only 8 DCs, saved over $30M, and, on average, shaved over a day off of standard delivery times! What the presentation did not dwell on (since INFORMS is an OR conference) is that these problems are usually humongous and insanely difficult to model, let alone solve, with your average off-the-shelf optimization package (as bad as the multi-level make vs. buy problem discussed in my last post), and without a best of breed solution, your chances of finding the truly optimal solution are often slim.
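To give a feel for the structure of this problem, here is a toy network-design sketch (all costs invented; a real model adds capacities, service levels, multi-leg transport, and thousands of locations) that decides which candidate DCs to open by brute-force enumeration:

```python
from itertools import combinations

# Invented data: annual cost to operate each candidate DC, and the cost
# to serve each store from each DC.
FIXED = {"A": 50, "B": 60, "C": 40}
SERVE = {
    "A": {"s1": 10, "s2": 30, "s3": 25},
    "B": {"s1": 20, "s2": 10, "s3": 30},
    "C": {"s1": 30, "s2": 25, "s3": 10},
}
STORES = ["s1", "s2", "s3"]

def network_cost(open_dcs):
    """Fixed cost of the open DCs, plus each store served by its
    cheapest open DC."""
    fixed = sum(FIXED[d] for d in open_dcs)
    serve = sum(min(SERVE[d][s] for d in open_dcs) for s in STORES)
    return fixed + serve

def best_network():
    """Brute force: evaluate every non-empty subset of candidate DCs."""
    dcs = list(FIXED)
    return min(
        (network_cost(subset), subset)
        for k in range(1, len(dcs) + 1)
        for subset in combinations(dcs, k)
    )
```

Even this toy version must trade fixed DC costs against serving costs, and the number of subsets to examine doubles with every candidate location – which is why networks with dozens of candidate sites need serious algorithms rather than enumeration.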

Large Combinatorial Problems

Large pure combinatorial problems are much, much harder than large pure linear problems (which optimizers like IBM ILOG’s CPLEX can often cut through like a hot knife through butter on today’s high end machines) and significantly harder than general MIP problems. The reason is that these problems contain very large numbers of binary variables, and the best generic domain-independent techniques available are generally no better than greedy branch and bound; for even a thousand binary variables, the search space contains 2^1000 possible assignments (a mere forty binary variables already yield over a trillion). An example of a large combinatorial problem is a large marketplace auction where all the participants bid on fixed size lots which are non-decomposable. In other words, CombineNet’s (original) definition of an exchange.

How much better are best of breed solvers on large combinatorial problems? Let’s consider CombineNet’s best of breed solution customized for exchanges. According to CombineNet: “The resulting optimal tree search algorithms are often 10,000 times faster than the state-of-the-art general-purpose MIP solvers on the hard instances of real-world market clearing.” As I have expressed to CombineNet personnel directly, I doubt that this is the average case, but I know beyond a doubt that well defined algorithms can easily shave a factor of 100 or more off of solution time, and that this can be scaled up to almost 1000 on a multi-core machine with a smart parallel implementation. In other words, don’t always expect the best case, but the average case performance of BoB on these problems will demonstrate significant improvements.
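To see why exchange clearing is so hard, here is a toy winner-determination sketch (hypothetical bids; brute force over all bid subsets) for an exchange where participants bid a price for a non-decomposable bundle of lots:

```python
from itertools import combinations

# Invented bids: (bundle of lots, offered price). Bundles cannot be split.
BIDS = [
    ({"lot1", "lot2"}, 100),
    ({"lot2", "lot3"}, 90),
    ({"lot1"}, 55),
    ({"lot3"}, 50),
]

def clear_market(bids):
    """Select a set of non-overlapping bids that maximizes total revenue.
    Brute force: examine every one of the 2^len(bids) subsets."""
    best_value, best_subset = 0, ()
    for k in range(1, len(bids) + 1):
        for subset in combinations(range(len(bids)), k):
            lots = set()
            feasible = True
            for i in subset:
                if lots & bids[i][0]:   # two bids claim the same lot
                    feasible = False
                    break
                lots |= bids[i][0]
            if feasible:
                value = sum(bids[i][1] for i in subset)
                if value > best_value:
                    best_value, best_subset = value, subset
    return best_value, best_subset
```

With four bids this enumerates sixteen subsets; with a thousand bids it would face 2^1000, which is precisely why best of breed tree-search algorithms that exploit problem structure matter so much here.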

Large Non-Homogeneous Logistics Problems

This is similar to a variation of the multi-level make vs. buy problem discussed in the last post. However, in this situation you are trying to optimally bundle your deliveries across product and sourcing categories and choose the optimal carriers and distribution network independent of your suppliers. Given the non-uniform quotes (weight, volume, LTL, FTL), lot sizes, and various charges and surcharges often imposed by freight carriers, forwarders, loaders, unloaders, warehousers, etc., this problem can become really surly really fast on a large buy. Moreover, unlike the sourcing case, where logistics often makes up a low percentage of your spend and a high order approximation is more than sufficient, when you aggregate your logistics across multiple categories, even a fraction of a percentage point can become significant.

Non-Traditional Sourcing Problems

Most Platform Optimization Engines are optimized for traditional sourcing problems – this means that they are generally not optimized for non-traditional sourcing problems. (Why should they be? Most of the problems you face are traditional everyday sourcing problems.) But every now and again you might have a non-traditional sourcing problem. One example: cell phone plan optimization. Cell phone plans are expertly crafted to be as confusing as possible to make sure the carrier maximizes profit at your expense. If you’re a small company, it probably isn’t worth the hassle trying to figure it out and standardize on a common carrier plan – the manpower costs (and resulting therapy costs) will probably outweigh the savings. But if you are a large company, you can save hundreds of thousands of dollars, if not millions, with the right company-wide plan (which will probably consist of different sub-plans for different groups and individuals, but all on the same corporate contract). That’s why Soligence has a solution centered on cell phone plan optimization.

Another example, as conveyed to me by Paul Martyn himself (CombineNet’s Chief Marketing Officer and premier evangelist on the CombineNotes blog), is energy utilization optimization. If you are a large corporation that produces energy for your production operations and consumes it from the grid, whether you realize it or not, you have a sophisticated form of an energy trading problem. If you can produce enough extra energy to add to the grid, should you, and when? If you have the potential to store energy, then adding energy to the grid at peak hours and only siphoning off extra energy at non-peak hours could slash digits off of your energy bill. Furthermore, shifting your operations so that your maximum energy utilization occurs only at non-peak hours could also save you bags of money. Considering this isn’t a traditional energy trading model, traditional production model, or traditional operations research planning model, it’s easy to see why you might have to call on BoB.
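A tiny sketch of the peak-shifting idea (tariffs and loads invented for illustration; real tariffs add demand charges and more price bands) shows how moving flexible consumption off-peak cuts the bill:

```python
# Hypothetical two-band tariff: hours 8:00-19:59 are peak-priced.
PEAK_HOURS = set(range(8, 20))
PEAK_PRICE, OFFPEAK_PRICE = 0.30, 0.10   # $ per kWh

def bill(load_by_hour):
    """Daily energy bill for a 24-entry hourly load profile (kWh)."""
    return sum(
        kwh * (PEAK_PRICE if h in PEAK_HOURS else OFFPEAK_PRICE)
        for h, kwh in enumerate(load_by_hour)
    )

# 100 kWh of fixed base load every hour, plus 50 kWh/hour of flexible
# load that can run in any 12 hours of the day.
base = [100.0] * 24
naive = [b + (50.0 if h in PEAK_HOURS else 0.0) for h, b in enumerate(base)]
shifted = [b + (50.0 if h not in PEAK_HOURS else 0.0) for h, b in enumerate(base)]
```

Under these made-up rates the shifted schedule trims the daily bill from $660 to $540 – roughly 18% – with no change in total consumption; the real problem layers generation, storage, and grid sell-back decisions on top of this.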

Very Large (Traditional) Sourcing Problems

POE, especially a well designed and implemented POE that uses real optimization technology as its underpinnings, excels at traditional sourcing problems. It’s what POE was built for. That being said, POE is built on off-the-shelf optimizers, and off-the-shelf optimizers are designed for general purpose needs. That means that they will hit their breaking points before a custom designed best of breed solution, even though they improve every year. For example, leading solvers can easily crunch MIP problems with hundreds of thousands of variables and pure LP problems with millions of variables, but once your MIP problems contain millions of variables and your LP problems tens of millions, you’ll start to notice performance degradation, which can become rapid and considerable as you approach the underlying solver’s breaking point. When you encounter the odd problem that is this large, a best of breed solution that also integrates domain intelligence can save you a lot of time and rapidly increase your chance of finding the optimal business solution in the finite timeframe that you have to make a decision.

Regret Minimization Problems

Most of the time you know the problem you need to solve, the associated constraints, and the associated costs. But every now and again you don’t. For example, you need to rationalize your supply base but do not know the optimal number of suppliers. In most products, you have to choose a number, or run multiple scenarios with different numbers and then take the best one. Although this approach can be effective, it doesn’t help you understand why a certain solution is best or guide you to the best solution. A best of breed solution that manages the search algorithm can not only guide you to a potentially optimal solution but inform you of nearby solutions that are invalidated by your soft or uncertain constraints. This is very difficult for a platform optimization engine, since it generally cannot guide the search. The best it can do is run different scenario formulations, show you nearby answers, and identify constraint conflict sets. It’s a good approach, but for a strategic problem, the more you know, and the faster you know it, the better off you are.
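The “run multiple scenarios and take the best one” approach looks something like this toy sweep (the cost terms and coefficients are invented purely to illustrate the trade-off between volume leverage, per-supplier management overhead, and supply risk):

```python
def scenario_cost(n_suppliers):
    """Illustrative total cost of a supply base with n suppliers."""
    # Volume discounts erode as the buy is split across more suppliers.
    purchase = 1000.0 * (10.0 + 0.5 * n_suppliers)
    # Each additional supplier costs money to qualify and manage.
    overhead = 200.0 * n_suppliers
    # Concentrating on very few suppliers raises supply-risk exposure.
    risk = 3000.0 / n_suppliers
    return purchase + overhead + risk

# One scenario per candidate supplier count; keep the cheapest.
costs = {n: scenario_cost(n) for n in range(1, 11)}
best_n = min(costs, key=costs.get)
```

The sweep finds a best count, but it says nothing about why that count wins or how close the runners-up are – which is exactly the gap a search-guiding best of breed solution fills.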

By now, you’re probably asking: does BoB have the upper hand? Well, even though BoB can solve a slew of problems that POE cannot, and it’s always theoretically possible to tone down a solution whereas it’s not always theoretically possible to build one up, the reality is that, as I’ve said before, when it comes to optimization, one size does not fit all and using a sledgehammer on a finishing nail is not always effective. Furthermore, these are not your everyday problems. It often takes years to realize the savings from distribution network redesign; reverse auctions and sealed bid negotiations are generally not combinatorial exchanges; you do not redesign your transportation network after every sourcing event; most of your sourcing problems are traditional; most of your sourcing problems are of a manageable size with proper strategic sourcing; and if you don’t know what your constraints should be on a regular basis, you have deeper problems you need to solve first. That’s why an upcoming post will focus on POE, the everyday hero, in more detail. When combined with this post, you should have a better understanding of where each technology can be helpful to you.

Disclaimer II: Although this blog, including this post, is not sponsored by CombineNet and the author is not employed by, contractually engaged with, or affiliated with CombineNet, the author will report, in full disclosure, that CombineNet did allow the author to use one of their free registrations at INFORMS (of which they were a sponsor), and did buy the author lunch.

CombineNet Communiqué III: Differences

Disclaimer: This blog, including this post, is not sponsored by CombineNet (acquired by Jaggaer). The author is not employed by, contractually engaged with, or affiliated with CombineNet. Any and all opinions expressed herein are solely those of the author. Furthermore, the opinions expressed herein should be contrasted with the opinions of other educated professionals before the reader forms his or her own opinion. Finally, the author is neither endorsing nor discouraging the use of CombineNet’s products or services – merely trying to spread awareness of the importance of optimization and the relative uniqueness of an offering like that of CombineNet. This disclaimer holds true for each post in this multi-part series and will be repeated.

Yesterday we discussed CombineNet’s post “The Make vs. Buy Dilemma” and indicated that, although it was a very good post, it did not quite meet its goal, because it described a problem that could be more than adequately solved by way of a leading platform optimization engine. We also indicated that there are problems that the platform optimization engines cannot solve.

We noted that Paul was on the right path with the make versus buy example. This was primarily because there were three options at different levels of complexity:

  • source individual components (make)
  • source assembled components (buy)
  • source sub-assemblies (hybrid)

and each of these options could define a sophisticated model on its own.

Now we’ll shift from make vs. buy to logistics, and, specifically, transportation network design. Consider a buyer who needs to transport product from a supplier’s facility. The buyer often has at least three options: let the supplier transport the product from their facility to the buyer’s facility, let a third party transport the product from the supplier’s facility to the buyer’s facility, or have the supplier transport the product to a centralized distribution center and then have a preferred third party transport the product to the buyer’s facility. Sounds easy, doesn’t it? It sounds like something we could probably do by hand, since there will only be a small number of legs associated with each lane, each with a small number of costs, and, besides shipping costs, we only need to consider tariffs.

Now consider a buyer who needs to transport product from fifty supplier facilities. Instead of having at least three choices, you now have at least one hundred and fifty choices. Now you might be thinking that you could simply solve for each supplier location separately, which you could easily do with most platform optimization engines, but this is not likely to be the optimal solution since (a) there will likely be discounts from suppliers and freight carriers if sufficient volume is allocated and (b) if intermediate DCs are used, then products from the same category can be bundled and better rates obtained.

Of course, you might note that although most sourcing models either assume that the supplier is providing freight or that a single carrier is being used, at least one provider will soon offer a model where you can simultaneously associate multiple freight options with each product, treat an intermediate distribution center as just another option (with adjusted costs), and, with the proper formulation, costs, and adjustments, likely handle such a problem. Well, yes and no. It could handle a basic representation, but we haven’t considered bundling concerns (not all products from a centralized DC can be shipped on the same truck), multiple freight brackets and discounts (which could greatly increase the computational complexity), or the fact that the best solution in the design of an international transportation network might involve multiple levels of centralized distribution centers. Even though a platform optimization engine with a really good embedded model could probably handle bundling reasonably well by way of grouping and exclusion constraints, and freight brackets and discounts by way of appropriately implemented discounts and penalties, the reality is that your standard embedded sourcing model, or even logistics model, is not going to be able to fully handle a dynamic multi-level transportation network.

So does this mean that best of breed wins out? Not necessarily. You’re not going to redesign your transportation network for every sourcing event. In fact, you’re probably not going to make significant changes to your network more than every couple of years. In addition, most of the time your problems are not going to be anywhere near this level of complexity.

So does this mean that leading platform optimization engines are your best choice? Not necessarily. As I indicated in my first post, on a high value category, even a couple of extra points can mean millions of dollars.

The answer is that if you are a large company, you probably need to follow the advice of the SpendFool and use both – they are not mutually exclusive. Use your (super-charged) Honda-powered platform optimization engine from your leading Software-as-a-Service provider for your average sourcing event, since this is probably your highest ROI, but bring in the big Lear-powered jet for your high-value complex events with very large numbers of decisions that need to be made simultaneously, particularly those that involve network design or really complicated make versus buy decisions (where you are not just considering a seat but the sourcing of the BOM for an entire car – and once you know what you are going to make versus what you are going to buy, you can probably revert to the Honda-powered platform for the individual sourcing events). (As for how often you will need the Lear jet, that depends on your specific needs. If you’re not sure, I would start with the Honda and see how far it takes you, or, better yet, bring in a third party with expertise in optimization and market offerings to help you decide.) In my view, there’s plenty of room in the market, and plenty of need, for both BoB and POE because, when it comes to optimization, there really is no one-size-fits-all and the best you are going to find is a one-size-fits-most for any category of problems that you have.

Is the story over? Not by a mile. This series is titled CombineNet Communiqué and the next part of this series* will discuss their (public) solution offerings now that we’ve illustrated a scenario where a platform optimization engine might not be enough.

* After a brief hiatus.

CombineNet Communiqué II: Comparisons

Disclaimer: This blog, including this post, is not sponsored by CombineNet (acquired by Jaggaer). The author is not employed by, contractually engaged with, or affiliated with CombineNet. Any and all opinions expressed herein are solely those of the author. Furthermore, the opinions expressed herein should be contrasted with the opinions of other educated professionals before the reader forms his or her own opinion. Finally, the author is neither endorsing nor discouraging the use of CombineNet’s products or services – merely trying to spread awareness of the importance of optimization and the relative uniqueness of an offering like that of CombineNet. This disclaimer holds true for each post in this multi-part series and will be repeated.

Yesterday we recounted The Story To Date. In response to the statements of CombineNet’s representatives that their product is designed for “everything in the bucket sourcing”; that their optimization is easily accessible; that their speed allows for more scenarios to be run in the same time frame, increasing the probability of a successful event; that they are now a hybrid of BoB (Best of Breed) and POE (Platform Optimization Engine), offering their clients the best of both worlds with their preconfigured templates; and that even Jay Reddy, founder of MindFlow, openly acknowledged CombineNet’s optimization capabilities were without peer – I noted that the reality was, respectively: more or less; (much?) better than it was (but easy is a relative term); definitely; more or less (but that’s not necessarily a bad thing); and pseudo revisionist history.

Today I’m going to indirectly address these issues by tackling CombineNet’s post The Make vs. Buy Dilemma, which was in response to my comments on their Analytics Support Negotiations posting.

In this post, Paul Martyn endeavors to offer a specific example of make versus buy to illustrate the savings potential that an optimization-enabled sourcing process can unlock, and to demonstrate how Expressive Bidding unleashes savings and offers new insight into supply plans.

In this example, Paul uses the example of a seat assembly to illustrate that in addition to:

  • sourcing the individual components and assembling them (make)
  • sourcing the assembled components (buy)

one can take a hybrid approach where one considers

  • sourcing bundles of parts in combination with individual parts.

In his example, Paul illustrates a situation where sourcing individual parts (make) costs $129, sourcing the final product (buy) costs $120, but sourcing subassemblies, which may consist of the odd individual part, only costs $92.
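Paul’s totals come from his own component quotes, but the structure of the hybrid win can be sketched with invented prices: evaluate every make/buy split across the subassemblies, and a mixed strategy comes out cheapest.

```python
# Invented quotes: for each subassembly of the seat, the cost of
# "make" (source parts and assemble in-house) vs. "buy" (source it
# pre-assembled).
QUOTES = {
    "frame":   {"make": 40.0, "buy": 34.0},   # pre-assembled frame is cheaper
    "cushion": {"make": 22.0, "buy": 29.0},   # making the cushion is cheaper
}
ASSEMBLY = 8.0   # final assembly cost, incurred under every strategy

def strategy_cost(choices):
    """choices maps each subassembly to 'make' or 'buy'."""
    return ASSEMBLY + sum(QUOTES[s][c] for s, c in choices.items())

pure_make = strategy_cost({"frame": "make", "cushion": "make"})
pure_buy = strategy_cost({"frame": "buy", "cushion": "buy"})
hybrid = strategy_cost({"frame": "buy", "cushion": "make"})
```

With these made-up numbers the hybrid (64) beats both pure make (70) and pure buy (71), mirroring the shape of Paul’s $129 / $120 / $92 result: the savings come from picking the best option per subassembly rather than committing to one strategy for the whole product.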

Although this was a very good post, and one of the clearest posts out there on the power of optimization, it did not quite meet its goal because this is a problem that could be more than adequately solved by way of a leading platform optimization engine, such as Iasta’s, although it would take three scenarios instead of one (and possibly a few extra milliseconds of solve time), and a problem that could have been solved with a single scenario using an unreleased version of MindFlow’s optimizer, the former leader in the platform optimization engine category. In other words, it might take the right approach, a little creativity, or a little extra work, but some make vs. buy analysis can be done with platform optimization engines.

Does this mean that you don’t need best of breed? Not necessarily. There are problems that the platform optimization engines cannot solve, though these tend to fall into two categories: deep and complex, or highly specific. As it was defined, this was not one of them. But Paul was closing in on the right path, and tomorrow we will discuss a problem that you would be (very) hard pressed to solve using a platform optimization engine.

CombineNet Communiqué I: The Story to Date

Disclaimer: This blog, including this post, is not sponsored by CombineNet (which was acquired by Jaggaer). The author is not employed by, contractually engaged with, or affiliated with CombineNet. Any and all opinions expressed herein are solely those of the author. Furthermore, the opinions expressed herein should be contrasted with the opinions of other educated professionals before the reader forms his or her own opinion. Finally, the author is neither endorsing nor discouraging the use of CombineNet’s products or services – merely trying to spread awareness of the importance of optimization and the relative uniqueness of an offering like that of CombineNet. This disclaimer holds true for each post in this multi-part series and will be repeated.

For those of you following the sourcing blogosphere, you’ll know that I’ve been giving CombineNet a bit of a (good-spirited) hard time lately on Spend Matters, e-Sourcing Forum (ESF) [WayBackMachine], and here on SI, but I’m just trying to poke and prod them into educating the sourcing community as a whole, since I believe that decision optimization is still not well understood overall, and optimization is a much more involved topic than most people realize. After all, they have what should be the only real optimization blog out there, CombineNotes.

If you haven’t, I would highly recommend you read the spirited debates over on Spend Matters that resulted from the following posts:

Old News Keeps Flowing*
What Do Rubik’s Cube and Expressive Bidding Have in Common*
An Optimization Knock-Down!*

as well as the following CombineNet posts:

  • Project Zander and Comprehensive Network Design
  • CombineNet – The Allocation Company
  • Analytics Support Negotiations
  • Expressive Commerce and the Long Tail
  • Perspectives in Puzzling
  • The Make vs. Buy Dilemma
  • ‘Right Tool for the Job’ Sourcing

If you’re new to SI and missed my Optimization series over on ESF, you can still review part I, part II, part III, and part IV, and if you missed it, my first post on Decision Optimization is still available in the archives.

For those of you who’ve read the posts, and just want a quick recap, here it is.

I fear that optimization is not well understood and that much needs to be done to educate different users on the strengths and weaknesses of differing methodologies and solutions. There are upsides and downsides – the Rubik’s cube is simultaneously the best and worst analogy I’ve seen yet – the apparent complexity is that of a Rubik’s cube, but the actual complexity is far greater.

With respect to optimization, there is a sharp distinction between the problem, model, and solution (algorithm) and confusing them can be dangerous. If the model, and the modeling capabilities of the tool, are not appropriate to the problem, or not useable by the end user, it does not matter how good the solver (or solution algorithm) is. (MindFlow proved this point.)

There are embedded POE (Platform Optimization Engine) solutions, based on COTS (Commercial Off-The-Shelf) optimizers, BoB (Best of Breed) solutions with custom (proprietary) solution algorithms, and solutions that merge the two. The best solution comes down to the problem at hand and the tool’s modeling capabilities – a better algorithm does not necessarily imply a better solution, although it will probably reach one faster. The model is key. Furthermore, there is a cost associated with pure speed – when it comes to optimization, you can only have any two of deep, fast, and accurate.

Depending on the problem, a better model may not save you more money – it depends on how fine-grained the data you have available is and whether or not the model’s constraint representation abilities can support more advanced costing models. If your data is coarse-grained, chances are the additional savings provided by a BoB solution will be negligible (less than 1%), if any. If your embedded solution limits the cost factors (i.e. forces you to combine unit, usage, and/or transportation costs, for example), then a fine-grained model may save you a couple of percentage points if you have detailed transportation and/or usage costs at multiple freight brackets. If your embedded solution does not support sophisticated discounts or bundle-based costing, then a Best of Breed solution may save you significantly more, and the quoted range of 5% to 10% is very realistic. (However, if your embedded solution does support discounts and bundle-based costing, then a best of breed solution may not save you more than a few percentage points, if that much.)

CombineNet has some of the deepest models out there, and they are one of the few companies with proprietary solution algorithms (which require a lot of brain power and development time). I thoroughly believe that their logistics models are best-in-class, but I’m not entirely sure that they are “without peer” when it comes to sourcing, primarily because of what I said in the last paragraph. It depends on the problem and, since this is business, the ROI.

Compared to many of its “peers”, CombineNet has a price tag that is directly correlated to its capability – quite high! (Based on quotes I have heard, one event could cost you as much as a year of unlimited events from an on-demand provider for a small team of sourcing professionals.) For some problems, I strongly believe that, in the words of the SpendFool, a Honda engine will do the job just as well as a Lear engine, and if we are talking about a spend in the $10M range, then I doubt the price tag is worth it. But if we are talking $100M+ spend on a very complicated category where you need to do an embedded make-vs-buy analysis (which I’ll elaborate on in a forthcoming post), I might actually advise you to spend money hand over fist on CombineNet, because even an extra percentage point will generate a significant ROI for you. (For example, even if it only saved you two percent on a $100M category, that’s an extra two million dollars to apply against your bottom line!)

My comments, as you might have guessed, did not go unanswered. According to CombineNet: their product is designed for “everything in the bucket sourcing”; their optimization is easily accessible; their speed allows for more scenarios to be run in the same time frame, increasing the probability of a successful event; they are now a hybrid of BoB and POE, offering their clients the best of both worlds with their preconfigured templates; and even Jay Reddy, founder of MindFlow, openly acknowledged CombineNet’s optimization capabilities were without peer.

The reality, respectively: more or less; (much?) better than it was (but easy is a relative term); definitely; more or less (but that’s not necessarily a bad thing); and pseudo revisionist history. But these are topics for the forthcoming posts in this series, so I’ll leave you with a quote from the SpendFool:

This stuff (i.e. POE & BoB Optimization) isn’t mutually exclusive! Nothing wrong with using the consultants with big brains and tools to solve the really strategic problems AND also using mass deployed tools from ERP and/or SaaS vendors. Pick the right tool for the job. Anything else would be foolish.

Thanks SpendFool!

Looking forward to your comments on my (next) posts!

* All posts prior to 2012 were removed in the Spend Matters site refresh in June, 2023.