Monthly Archives: January 2007

aPriori

Last week, in his Spend Management Goes Upstream series, Jason presented the basics of the “aPriori Philosophy”* on Spend Matters [WayBackMachine]. About the same time, I was lucky enough to meet with them at their Concord, MA headquarters while I was in the Boston area.

I must say that I am very impressed with aPriori’s solution, and I am convinced that it is unique. The reality is that if you’re a best-in-class company that has already implemented technology to support the full strategic sourcing cycle, including spend analysis, decision optimization, and compliance (in addition to the old standards of e-RFX and e-Auction), then your only chance for significant cost savings is to attack the design phase – where the majority of your costs are baked in!

This is precisely where the aPriori solution comes into play. If you’re buying direct materials from a contract manufacturer, you now have a solution for understanding precisely what you should be paying, based upon computable geometric (physical) cost drivers and related non-geometric (part-related) costs. The reality is that the current market value for a part is often nowhere close to what you should be paying. For example, a sales representative from a new supplier is not incentivized to give you the best deal; he’s incentivized to get the best deal he can for his company. And a supplier that’s always made a certain part a certain way might not realize that new technology or materials would allow them to make that part significantly cheaper with a different process. In the latter case, the problem is primarily a lack of insight.

This lack of insight is precisely what aPriori’s tool was designed to address. The application interfaces directly with your CAD program and interrogates the solid model to extract the geometric cost drivers. From these, aPriori automatically determines all the process routings that could be used to make the part, computes the cost of each step based upon standard machine, material, and labor rates, and computes the total per-unit cost of the part by factoring in non-geometric cost drivers such as production volume, the selected supplier or internal factory, and the exact routing and machines used.
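
To make the mechanics concrete, here’s a minimal sketch of that kind of cost roll-up. All routings, rates, and volumes below are invented for illustration; this is in no way aPriori’s actual model.

```python
# A toy version of the costing logic described above: given a candidate
# routing (a sequence of process steps implied by the part geometry), roll
# up machine, material, and labor costs, amortize setup over the production
# volume, and compare routings on a per-unit basis.
# All names and numbers are hypothetical.

MATERIAL_COST_PER_PART = 2.40   # assumed raw material cost
PRODUCTION_VOLUME = 10_000      # a non-geometric cost driver

# step = (cycle_minutes, machine_rate_per_hour, labor_rate_per_hour, setup_cost)
ROUTINGS = {
    "stamp_then_machine": [(0.5, 90.0, 40.0, 5000.0), (2.0, 60.0, 45.0, 1500.0)],
    "cast_then_finish":   [(1.0, 75.0, 40.0, 12000.0), (0.8, 55.0, 45.0, 800.0)],
}

def per_unit_cost(steps, volume):
    unit = MATERIAL_COST_PER_PART
    for cycle_min, machine_rate, labor_rate, setup in steps:
        unit += (cycle_min / 60.0) * (machine_rate + labor_rate)  # variable cost
        unit += setup / volume                                    # amortized setup
    return unit

# Report routings from cheapest to most expensive at this volume.
for name, steps in sorted(ROUTINGS.items(),
                          key=lambda r: per_unit_cost(r[1], PRODUCTION_VOLUME)):
    print(f"{name}: {per_unit_cost(steps, PRODUCTION_VOLUME):.2f} per unit")
```

Note how the production volume alone can flip the answer: a routing with high setup cost but low cycle time wins at high volumes and loses at low ones.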

This application allows design and manufacturing engineers to understand the cost of a part before they finalize a specification, so they can evaluate different options and make the best price-performance decision. But this is not the coolest feature. The coolest feature is that the application is built on mechanistic process models of factories, which allow you to configure the application to understand any physical part, factory, or supplier you want to analyze and produce an accurate costing model. Once you have built the mechanistic process model, the solution applies its built-in computational geometry algorithms to determine the most cost-effective construction methodology guaranteed to produce the exact part you need.

The aPriori solution is truly a significant advancement in cost-based design technology. As such, not only will I be blogging about it again in the future, but I’ve also invited aPriori to submit a few guest posts, complementing their forthcoming posts on Spend Matters, detailing some of the advancements in their platform and how these advancements will help your organization save a significant amount of money without sacrificing quality or unnecessarily stressing your supplier relations.

Sometimes 80% is enough … (when employing Decision Optimization)

Over on Spend Matters [WayBackMachine] today, Jason put up a short post titled “Why 80% is not enough”*. Of course, I couldn’t leave this one alone, since sometimes 80% is enough. The comments, which could form a post in themselves, are reprinted below.

Jason:

Obviously not an Optimization Post! Because, with optimization, often 80% is enough! Let’s say you’re operating at 90% efficiency. That leaves you 10% to go. If you can close 80% of that gap, getting to 98%, and do it affordably (and increase cash flows and profit in the process), that is downright phenomenal regardless of the industry you’re in!!!

For another example, let’s say your primitive spreadsheet model will give you an award allocation that is 80% optimal, and that your Platform Optimization Engine (POE), bundled with your cutting-edge e-sourcing suite, will give you a solution that’s 96% optimal (closing 80% of the remaining gap). Let’s say this event is for a 10M spend on which a fully optimal award would save 1M, that the amortized cost of the POE for this event is 20K, and that the cost of a one-time use of a Best-of-Breed (BoB) solution, guaranteed to achieve 100% of optimality, is 200K. The POE nets you an extra 16% of 1M minus 20K, or 140K. However, BoB won’t net you anything, since the extra 20% of 1M, or 200K, “saved” by BoB is just enough to cover its cost, leaving you with a net savings of 0.
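
For the skeptical, a quick sanity check of the arithmetic in both examples, using the hypothetical figures above:

```python
# Example 1: at 90% efficiency, closing 80% of the remaining 10% gap.
print(round(0.90 + 0.80 * (1.00 - 0.90), 2))        # -> 0.98

# Example 2: a 10M event where a fully optimal award would save 1M.
potential_savings = 1_000_000
poe_net = potential_savings * 16 // 100 - 20_000    # extra 16 points of the gap, 20K cost
bob_net = potential_savings * 20 // 100 - 200_000   # extra 20 points of the gap, 200K cost
print(poe_net, bob_net)                             # -> 140000 0
```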

In other words, when you’re talking about Optimization, Often 80% is Enough.

For more insight into decision optimization (particularly as it relates to strategic sourcing), Part VII of my CombineNet Series went up over on Sourcing Innovation today.

Dan:

Precisely. I was referring to a single event, initiative, etc.

The reality is thus: Sometimes 80% is enough, and Sometimes 80% is not even close. It’s all relative.

If you’ve got 80% of spend under management, that’s a big win. (Most companies struggle to get 50%!) But if your total spend is only 80% of optimal, that’s a huge loss.

The truth is the following:
* you need to get as much spend under management as possible
* you need to actively manage it (contract management, compliance management, spend analysis and maverick spend identification, invoice verification, etc.)
* you need to improve your operations constantly
* but you should be satisfied with improvements that close 80% of the gap – you’ll never be perfect at anything, and getting 80% closer to optimal from where you are today is a huge cost savings. Moreover, if you haven’t already, you’ll quickly find out that the 80/20 rule holds in technology too – emerging best-of-breed solutions that get you 80% closer are readily affordable (e.g. Procuri & Iasta vs. Emptoris & Ariba), but solutions that go beyond that 80% improvement are very costly.

So take your 80% win in each category with emerging best-of-breed on-demand offerings, and wait for improvements to come along to get you 80% closer again in the next iteration. (And if they don’t, it’s on-demand; junk it and move to the next on-demand solution if that’s what it takes.)

Consider this: let’s say I’m spending 500M on acquisitions, and it will cost roughly 10M in on-demand software, systems, and labor to reduce that to 430M, whereas it will cost 40M for high-end best-of-breed products and consulting armies to get that spend down to 400M. Which should I choose? The 10M (80%) solution, as it saves me just as much as the 40M (100%) solution: 60M net. (500M − 430M − 10M = 500M − 400M − 40M = 60M.)

And that’s why I preach TVM – Total Value Management – based strategic sourcing efforts. Don’t just look at your inbound cost, or even your total cost of ownership to produce the product; look at your outbound costs as well. (You have to in turn ship your products to your clients, poor quality will generate returns that eat away your savings, etc.)

* All posts prior to 2012 were removed in the Spend Matters site refresh in June 2023.


CombineNet VII: BoB’s Power Source

Yesterday I told you not only that CombineNet can support all of the basic cost and constraint categories required for true decision support, but also that the model they generate accurately represents all of the costs and constraints they support, and that they can solve the model faster than their competitors the vast majority of the time. I also told you that today I’d highlight where this unique capability comes from.

Paul highlighted it in his recent “Now that’s an Electric Engine …” post on CombineNotes [WayBackMachine]. CombineNet’s ClearBox uses sophisticated tree-search algorithms to find the optimal allocation. Furthermore, the algorithms are ‘anytime algorithms’: they provide better and better solutions as the search progresses. CombineNet has also invented a host of proprietary tree-search techniques, including new branching strategies, custom cutting-plane families, cutting-plane generation and selection techniques, and machine learning methods for predicting which techniques will perform well on the instance at hand (for use in dynamically selecting a technique).
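
To make the ‘anytime’ idea concrete, here’s a minimal sketch: a depth-first branch-and-bound on a toy knapsack instance, standing in for a market-clearing problem (the real thing is vastly more sophisticated, and this is my illustration, not CombineNet’s code). The point is the behaviour: every improved incumbent is reported the moment it is found, so the search can be stopped at any time and still hand back the best allocation seen so far.

```python
# Anytime depth-first branch-and-bound for a 0/1 knapsack (toy instance).
def knapsack_anytime(values, weights, capacity):
    n = len(values)
    # Sort items by value density so the fractional bound below is valid.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]

    best = {"value": 0, "items": []}

    def bound(i, cap):
        # Optimistic bound: fill remaining capacity greedily, allowing a
        # fractional piece of the last item (the LP relaxation of the rest).
        total = 0.0
        while i < n and w[i] <= cap:
            total += v[i]
            cap -= w[i]
            i += 1
        if i < n:
            total += v[i] * cap / w[i]
        return total

    def search(i, cap, value, items):
        if value > best["value"]:  # new incumbent: report it immediately
            best["value"], best["items"] = value, list(items)
            print(f"improved incumbent: value={value}, items={sorted(items)}")
        if i == n or value + bound(i, cap) <= best["value"]:
            return  # prune: even the optimistic bound can't beat the incumbent
        if w[i] <= cap:  # branch 1: take item i
            search(i + 1, cap - w[i], value + v[i], items + [order[i]])
        search(i + 1, cap, value, items)  # branch 2: skip item i

    search(0, capacity, 0, [])
    return best

print(knapsack_anytime(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))
```

On this instance the search reports incumbents of 60, 160, 180, and finally the optimum 220 – interrupt it anywhere along the way and you still have a usable answer. Better branching strategies and cutting planes, of the kind CombineNet describes, shrink how much of this tree must be explored.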

Even though every model can be built and solved on an off-the-shelf optimizer, the reality is that we are dealing with NP-Complete problems, which means that worst-case solve time, as far as anyone knows, increases exponentially with problem size. This means that for any given solver and any given model class, there is a limit on the average model size that can be solved in any given time window. Although an efficient model formulation combined with an appropriately tweaked off-the-shelf solver can solve a very decently sized problem, one must not ignore the fact that generic solvers use generic algorithms which are not always optimized for the problem at hand. It is therefore likely that an appropriately designed custom algorithm could solve the model faster, if not significantly faster.

What is not as obvious is the degree of difficulty often associated with developing these custom algorithms for strategic sourcing decision optimization. The nature of these problems is that it is very hard to determine, for any given solution strategy and any given model instance, whether that strategy is more or less likely to solve the model faster than another similar strategy. That’s the fundamental nature of NP-Completeness. If we knew how to do it, we’d likely have shown these problems are in P.

As highlighted in the post, CombineNet began developing its algorithms in 1997, and it has 16 people working on them: “We have tested hundreds of techniques to find those that shorten solve time for Expressive Commerce clearing problems. Some of the techniques are specific to market clearing, and others apply to combinatorial optimization more broadly.” And that’s where the strength of CombineNet is – 10 years of research focused on determining how to quickly and optimally solve the combinatorial optimization problems that underlie strategic sourcing decision optimization. Everything else is just icing on the cake.

CombineNet VI: Strategic Sourcing Decision Optimization

We ended our last post by noting that what is important in strategic sourcing decision optimization is:

  1. The ability to support all of the relevant costs and cost tiers.
  2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.
  3. The ability to generate a model that accurately represents all of the relevant costs and constraints.
  4. The ability to optimally solve the model in a realistic time frame.

Other requirements, which should go without saying, are:

  • Solid Mathematical Foundations
  • What If? Capability

Today we are going to discuss each of these points in detail and explain where the true power of a BoB solution like CombineNet lies, and what’s just confusing marketing hype.

1. The ability to support all of the relevant costs and cost tiers.

Fundamentally, in order to be a true solution for strategic sourcing decision optimization, the application has to support fixed and variable costs, and it should allow those costs to be bid in a tiered or layered fashion or as discounts, so that a buyer can use the bidding structures that are natural to their commodity or industry. This includes the ability to define unit costs, transportation costs, usage costs, and impact costs, as well as sophisticated supplier discounts along the lines of “If you buy 10,000 forks, I’ll give you a discount on 10,000 spoons”.
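
By way of illustration, here’s a minimal sketch of how all-units tiered pricing and the fork/spoon conditional discount above might be represented and evaluated when costing a candidate award. The data structures and numbers are mine, not CombineNet’s, and real bid schemas are far richer.

```python
# All-units tiered pricing: the unit price of the deepest tier reached
# applies to the entire quantity. (Incremental/graduated tiers, where each
# tier prices only its own band, are the other common variant.)
FORK_TIERS = [(0, 1.00), (5_000, 0.90), (10_000, 0.80)]  # (min_qty, unit_price)

def tiered_unit_price(tiers, qty):
    price = tiers[0][1]
    for min_qty, unit_price in tiers:   # tiers assumed sorted by min_qty
        if qty >= min_qty:
            price = unit_price
    return price

def award_cost(forks, spoons):
    cost = tiered_unit_price(FORK_TIERS, forks) * forks
    spoon_price = 0.70
    if forks >= 10_000:                 # conditional cross-item discount:
        spoon_price = 0.60              # "buy 10,000 forks, spoons get cheaper"
    return cost + spoon_price * spoons

print(award_cost(forks=10_000, spoons=10_000))   # -> 14000.0
print(award_cost(forks=4_000,  spoons=10_000))   # -> 11000.0
```

The optimizer’s job, of course, is the reverse direction: choosing the award quantities, across all suppliers at once, given bids expressed this way.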

2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.

Fundamentally, the optimization solution must support, at a minimum, four basic categories of constraints: (a) capacity constraints, (b) flexible allocation, (c) risk mitigation, and (d) qualitative constraints in order for it to be a real strategic sourcing decision optimization product.

You need to take into account all of your supplier capacities, you need to be able to account for your current contracts and business rules, you need to ensure that sole-sourcing risks are addressed when required, and you need to be able to take into account your non-cost requirements such as quality, delivery time, durability, etc.
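
As a minimal sketch of what those four categories look like in practice, here’s a toy single-item event modelled in PuLP (an open-source Python modelling library). The suppliers, prices, capacities, and quality scores are all invented, and a real event would be far larger.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

demand   = 1_000
price    = {"A": 9.50, "B": 9.80, "C": 10.20}   # unit prices (hypothetical)
capacity = {"A": 700,  "B": 600,  "C": 1_000}   # (a) supplier capacities
quality  = {"A": 0.92, "B": 0.97, "C": 0.88}    # (d) qualitative scores

prob = LpProblem("toy_sourcing_award", LpMinimize)
x = {s: LpVariable(f"units_{s}", lowBound=0) for s in price}    # award size
u = {s: LpVariable(f"used_{s}", cat=LpBinary) for s in price}   # supplier used?

prob += lpSum(price[s] * x[s] for s in price)           # objective: total cost

prob += lpSum(x.values()) == demand                     # meet full demand
for s in price:
    prob += x[s] <= capacity[s] * u[s]                  # (a) capacity, tied to use
    prob += x[s] >= 50 * u[s]                           # (b) rule: 50-unit minimum lot
    prob += x[s] <= 0.6 * demand                        # (c) risk: max 60% share
prob += x["B"] >= 200                                   # (b) rule: incumbent contract floor
prob += lpSum(u.values()) >= 2                          # (c) risk: never sole-source
prob += lpSum(quality[s] * x[s] for s in price) >= 0.93 * demand  # (d) min average quality

prob.solve()
print({s: x[s].varValue for s in price}, "cost:", value(prob.objective))
```

With these numbers the solver awards 600 units to A and 400 to B at a total cost of 9,620 – the cheapest award that respects the capacity, minimum-lot, incumbent, share, dual-source, and quality constraints all at once.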

3. The ability to generate a model that accurately represents all of the relevant costs and constraints.

Many solutions exist that let you define whatever you want, but under the hood the costs and constraints are simplified and only an approximate representation is used. For example, a tool that replaces a tiered price schedule with a single blended unit price will systematically misprice any award that lands near a tier boundary.

4. The ability to optimally solve the model in a realistic time frame.

This is actually a two-part requirement. The first requirement is that the system optimally solves the defined model, and not an approximation of the model. The second requirement is that the system solves the model in a realistic time frame. A small model should not take more than a few minutes. A medium-sized model should not take more than a few hours. A large model should solve overnight. Any longer and the usefulness of the solution is limited, especially when sourcing cycles are now completed in weeks, not months, and all a buyer may have to make an award decision is a few days.

Furthermore, as I mentioned in my last post, where CombineNet really stands apart from the rest of the pack is:

3. Their ability to generate a model that accurately represents all of the relevant costs and constraints.

4. Their ability to optimally solve the model in a realistic time frame.

5. Their ability to solve larger models than the majority of their competitors.

Simply put, not only can they support all of the basic cost and constraint categories required for true decision support, but the model they generate accurately represents all of the costs and constraints they support, and they can solve the model faster than all of their competitors the vast majority of the time. Furthermore, they can solve larger models than the vast majority of their competitors. Tomorrow we’ll discuss where this unique capability comes from, and why that – and not Expressive-Bidding, Expressive-Commerce, Comprehensive Bidding, or whatever-you-want-to-call-it-today-bidding – is what makes CombineNet BoB.

CombineNet V: Expressive Bidding (in Combinatorial Optimizations)

I know I ended my last post indicating that my next post would put BoB in perspective by extolling the virtues of POE, but I’m getting really tired of CombineNet over-hyping Expressive Bidding, and so I’m going to explain why Expressive Bidding and Expressive Commerce have nothing to do with the price of fish when we’re talking about BoB. (Don’t worry, this does not have to be a series of finite length, so I will discuss the virtues of POE eventually, so that you can put the virtues of BoB in perspective.)

In Paul’s “2007 – The Year of the Supplier” post on CombineNotes [WayBackMachine], he says that Expressive Bids can include conditional (if/then) offers, volume discounts, packages of items (bundles), and other creative offers that take advantage of suppliers’ strengths and/or recent innovations, and that, with Expressive Bidding, suppliers drive the inefficiencies out of their own business and share the savings with buyers, simultaneously strengthening strategic relationships for long-term supply chain efficiencies and competitive advantage.

First of all, there’s nothing here that you couldn’t do self-serve with MindFlow’s application back in 2000/2001, two years before CombineNet started using the terminology and filing for trademarks / copyrights / etc. Secondly, pieces of this functionality existed before that in Emptoris’ offering, FreeMarkets’ failed effort, i2’s early technology, etc. Thirdly, operations researchers have known how to do if-then constraints for at least two decades using the Theory of Logical Variables and its precursor instantiations. Fourthly, the same holds true for tiered bids (and every discount can be transformed into a tiered bid with a minimum-buy if-then constraint, and vice versa), which operations researchers have been handling for even longer using piecewise-linear constraints. Fifthly, these bid styles existed long before the introduction of optimization technology, so there is fundamentally nothing new about what is being supported. Sixthly, they’re not the first company to come up with a wizard-like interface (although it looks like theirs may be better than most). I could go on, but you get the point.
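
For the record, here’s the textbook big-M construction for an if-then offer (my notation, from the standard OR playbook – not any vendor’s formulation). To model “if a supplier is awarded at least Q forks, a discount of d per spoon applies to its spoons”, with integer fork quantity x_f, spoon quantity x_s, and upper bounds M and M' on each:

```latex
% If-then bid condition via the standard big-M trick.
% y is a binary indicator; z carries the discounted spoon quantity.
\begin{align*}
  x_f &\ge Q\,y            && \text{($y$ can be $1$ only if the fork award reaches $Q$)} \\
  x_f &\le (Q - 1) + M\,y  && \text{($x_f \ge Q$ forces $y = 1$, for integer $x_f$)} \\
  \text{spoon cost} &= p\,x_s - d\,z, \quad z \le x_s, \quad z \le M'\,y
                           && \text{(discount claimable only when $y = 1$)} \\
  y &\in \{0,1\}, \quad z \ge 0
\end{align*}
```

Tiered bids are handled the same way – one indicator per tier and a piecewise-linear cost term – which is exactly the point: these constructions have been sitting in the operations research toolkit for decades.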

I’m not saying that Expressive Bidding, Real-World Bidding, Comprehensive Bidding, or whatever-you-want-to-call-it-today-bidding is not important – it is, because without it, any optimization application with any degree of sophistication will be quite difficult to use – just that it’s not the greatest thing since sliced bread, which is what CombineNet’s marketing materials would lead you to believe.

What’s important is:

  1. The ability to support all of the relevant costs and cost tiers.
  2. The ability to support all of the fundamental constraint types required for true strategic sourcing decision optimization.
  3. The ability to generate a model that accurately represents all of the relevant costs and constraints.
  4. The ability to optimally solve the model in a realistic time frame.

Where CombineNet really stands apart from the rest of the pack is with respect to:

3. Their ability to generate a model that accurately represents all of the relevant costs and constraints.
4. Their ability to optimally solve the model in a realistic time frame.
5. Their ability to solve larger models than the majority of their competitors.

So tomorrow we’ll discuss these required capabilities and what BoB truly is, and, more importantly, what CombineNet’s offering really is and what is just annoying marketing hype.