Category Archives: eSourcing Forum

Six Sigma I: An Introduction

Originally posted on the e-Sourcing Forum [WayBackMachine] on Friday, 1 September 2006

The design of responsive supply chains is becoming a priority for companies that must continually reduce costs and streamline processes to remain competitive in a dynamic global market, one where companies that do not anticipate challenges and prepare for the unexpected can vanish almost overnight. This means that an organization today must pay closer attention to the business environment, work more closely with its customers, analyze performance data more effectively, and avoid disasters.

Six Sigma, often described as a relentless quest for perfection through the disciplined use of fact-based, data-driven decision making, is one methodology that companies can use to make their supply chains more responsive, foster innovation, and improve quality across the board. Properly applied, it ultimately leads to lower costs, greater profits, higher customer satisfaction rates, and better earnings per share for shareholders, which is why Six Sigma has caught on at a number of large enterprises.

But it’s no light undertaking – it requires massive commitment from the CEO down, considerable amounts of training and application, master black belts, black belts, and green belts, and operational changes across the board. In addition, it often requires new mindsets, new methodologies, new technologies, new performance metrics, and, most importantly, new incentives. Which leads one to ask, if you’re not a large enterprise with a large spend, large budgets, and the time to transform, is it worth it?

That’s not an easy question to answer. If you just look at the average results, you might think the answer is a resounding yes. But no decision is ever that easy, especially when you do not really understand what something is and cannot separate the facts from the hype.

After all, we are dealing with a methodology that is promoted as a revolution capable of fixing everything wrong with your corporation, with the possible exception of the clog in the lunch-room sink. (No wait, it has a process to fix that too!) Just one article I read recently said Six Sigma will help you with:

  • Globalization
  • Mergers and Acquisitions (M&A)
  • Supply Chain Design and Planning
  • Customer Relationship Management (CRM)
  • Brand Building
  • New Product Development (NPD)
  • Sustainable Growth
  • Innovation Management
  • Risk Management
  • and much, much more by
    • accelerating a company’s learning cycle
    • speeding up research phases
    • speeding up redesign
    • enabling rapid information exchange
    • … and so on.

To fully understand the breadth of this claim, imagine if a smart-looking guy in a suit walked into your office and said “I’m the architect, foreman, electrician, plumber, steel worker, heavy equipment operator, drywaller, and project manager for that new office building you’re thinking of. With just a few humble assistants, I’m going to take on that massive project all by myself and finish it faster, better, and at less cost than any of the multi-billion-dollar contracting firms you are currently considering.” That’s comparable to the breadth of the claim that Six Sigma often appears to be making.

Theoretically, a universally applicable, best-practice generic methodology could be applied to each of your business functions, but just how helpful is it going to be if it is that generic? That’s one of the questions we are going to try to answer this weekend. To do that, we are first going to focus on its potential contributions to quality and innovation, since these are two common qualities that define great operations across organizational functions. Sunday we will discuss its potential value to strategic sourcing.

However, first of all, we are going to define what Six Sigma really is. Sigma measures variation; more specifically, it is the standard deviation of a data set. Six Sigma refers to a distance of six standard deviations between the mean value of the data set and the nearest specified tolerance limit. A Six Sigma process is one that produces at most 3.4 defects per million opportunities (in the long run).
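The defect rates behind these definitions can be checked with a few lines of Python. A small sketch (not part of the original post): the classic 3.4-defects-per-million figure assumes the conventional 1.5-sigma long-term drift of the process mean, so it corresponds to a one-sided 4.5-sigma tail; a perfectly centered six-sigma process would be far better still.

```python
from statistics import NormalDist

def defects_per_million(k: float) -> float:
    """Expected defects per million opportunities when the nearest
    tolerance limit sits k standard deviations from the process mean
    (one-sided normal tail)."""
    return (1 - NormalDist().cdf(k)) * 1_000_000

# A perfectly centered six-sigma process: well under 0.01 DPM
print(round(defects_per_million(6), 4))

# With the conventional 1.5-sigma long-term shift, the effective
# distance to the limit is 4.5 sigma, giving the classic figure:
print(round(defects_per_million(4.5), 1))  # ~3.4 defects per million
```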

Furthermore, Six Sigma is concerned with the Rolled Throughput Yield of a system, which ensures that the measurement applies to the finished product and not just a single step in the process. The rolled throughput yield is calculated by multiplying the yields of each step. For example, a system with five steps and only a 99% yield at each step would produce a rolled throughput yield of 0.99^5 = 0.951, or 95.1%. Thus, a Six Sigma process is one with a rolled throughput yield of at least 99.99966%, or almost six nines of reliability.
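The rolled throughput yield calculation above is just a product of per-step yields, which makes it easy to sketch (illustrative code, not from the original post):

```python
from math import prod

def rolled_throughput_yield(step_yields):
    """Yield of the whole process = product of the per-step yields."""
    return prod(step_yields)

# Five steps at 99% each: 0.99^5, i.e. only ~95.1% end to end
five_steps = [0.99] * 5
print(round(rolled_throughput_yield(five_steps), 3))  # 0.951

# Per-step yield a five-step process needs to hit a 99.99966% RTY:
# each individual step must be dramatically better than the target.
required = 0.9999966 ** (1 / 5)
print(round(required, 7))  # ~0.9999993 per step
```

This is the point of the rolled throughput measure: for a multi-step process to deliver six-sigma quality on the finished product, every step must be held to an even tighter standard.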

In short, a Six Sigma methodology is one that is designed to eliminate variation from a process to ensure consistent results every time.


For more information on Six Sigma, see the “Six Sigma: Improve Supply Chains through Methodology” wiki-paper over on the e-Sourcing Wiki [WayBackMachine].

Optimization IV: POE or BoB?

Originally posted on the e-Sourcing Forum [WayBackMachine] on Monday, 28 August 2006

When choosing an optimization solution, you generally have two choices: a Platform Optimization Engine that is integrated into a strategic sourcing suite (such as Iasta’s Bid Optimization 1.0, which runs on top of industry-standard solvers such as ILOG’s CPLEX) or a standalone Best of Breed product (such as CombineNet’s customized optimization platforms, based on proprietary solution algorithms customized for certain problem domains and models). The question is: which is best for you?

As you probably have guessed, it turns out there is no easy answer to this question. It really depends on the problem you need to solve, the complexity of the model or modeling capabilities you need to solve it, the size of the problem, and the effectiveness of generic optimization approaches such as linear programming, mixed integer (linear) programming, and constraint-based programming on your problem. It also depends on what you are looking for in a solution – do you want ease of use or flexibility, low cost or (potentially) high value, off-the-shelf or customized, etc?

Generally speaking, POE is an easy-going, relatively inexpensive date while BoB is a serious, expensive commitment. However, whereas POE is only comfortable in familiar situations, BoB can adapt rather well to new situations with very little effort. POE is often backed up by an extensive support group whereas BoB is a relative loner, with only a small group of close friends there to support him.

In other words, since POE is part of a package deal, the relative cost of POE’s services is usually much (much) less than the cost of BoB’s services. However, POE can generally only solve the problems that fit into the pre-defined set of models that POE comes with. In contrast, BoB can often adapt to handle new models, and new variations thereof, as time goes on.

Since POE makes use of the services provided by third-party optimization engines, POE gets better not only when the platform improves, but also when the third-party services improve. On the other hand, BoB only gets better when the provider supporting BoB gets better.

Since POE is generally designed to solve a small range of problems, POE’s user interface is usually customized to those problems in such a way as to make their definition easy and obvious to even the most inexperienced of analysts. On the other side of the fence, BoB, designed to handle a wide range of complex problems, is often quite difficult to use and often requires a lot of education and experience if you want optimal results.

So where do you start? I’d recommend listening to the advice of POP, the Practiced Optimization Practitioner, who knows that the majority of your optimization problems in a particular area (generally between 60% and 80%) can usually be approximated very well and solved near-optimally by POE, and that POE is an easier, quicker, cheaper entry point into optimization than the often anti-social BoB. Therefore, you should start with POE on the problems POE is best suited for, get comfortable with optimization, and get some quick hits. Once you understand optimization, you can then dive into your more complex problems and determine whether or not you can use POE, the kind of results you can expect, and whether you should bring in heavy-hitting BoB to tackle those problems POE cannot handle, or may not do well on.

In the future, I think you’ll see the market leaders using both solutions – POE for most of their day-to-day sourcing problems, but BoB for complicated make-or-buy, logistics heavy, or non-standard sourcing-allocation problems where large amounts of savings are there for the taking. (Large being the key decision factor on whether or not you engage BoB, since BoB can easily be 10 times as expensive as POE and you need to approach everything from a value-based ROI perspective.) This means that companies like Iasta (POE) and CombineNet (BoB) should both have a very bright future, since I predict that Iasta’s forthcoming Decision Optimization 2.0 offering will be best-of-breed in its category (on-demand integrated strategic sourcing optimization) and CombineNet is already best-of-breed in many ways in its category (stand-alone logistics, multi-variate make-buy decisions, and non-standard complex-sourcing).


For a more in-depth discussion of decision optimization, what it is, what it is not, how it enables decision support, the benefits it provides, and strategies for success, see the “Strategic Sourcing Decision Optimization: The Inefficiency Eliminator” wiki-paper over on the e-Sourcing Wiki [WayBackMachine].

Optimization III: Why its Time is Finally Here

Originally posted on the e-Sourcing Forum [WayBackMachine] on Sunday, 27 August 2006

Friday we noted the effectiveness of decision optimization and how it can enable early adopters to identify average incremental savings of 12% above those that basic, price-focused auctions alone have generated, according to Aberdeen’s recent “Success Strategies in Advanced Sourcing and Negotiations: Optimizing Total Costs and Total Value for the Next Wave of e-Sourcing Savings” in June of 2005. Yesterday we discussed the factors that combined to downplay decision optimization’s importance to a successful e-Sourcing process. Today we will discuss how the market is changing and why decision optimization will soon take its rightful place at the heart of strategic sourcing initiatives.

Four major factors are currently combining to elevate the importance of decision optimization to strategic sourcing. Briefly they are:

  • Extensive use of e-Auctions over the last 3-5 years by early adopters.
  • Optimization Technology has evolved.
  • Solution Providers are integrating optimization into their platforms.
  • Solution Providers are recognizing that there needs to be a significant amount of sophistication under the hood.

(1) Extensive use of e-Auctions over the last 3-5 years by early adopters and market leaders has sucked all of the fat out of supplier margins, rationalized the supply base, and streamlined the process to the point where there is essentially no more money to be saved through auctions alone.

(2) Optimization technology, like all technologies undergoing an evolution, has become more accessible and easier to use. It no longer takes a heavily trained PhD to use today’s optimization products; a BSc with minimal training can be up and running with today’s tools in a few hours, and comfortable within a few days. (It still takes a PhD, or a team of PhDs, to build the product, but that’s why market leaders are hiring very well-educated and experienced teams.)

(3) Leading Solution Providers are recognizing that decision optimization needs to be an integrated part of an e-sourcing platform that supports the full range of e-sourcing activities and are incorporating these capabilities into their product suites.

(4) Leading Solution Providers are recognizing that a minimal amount of sophistication is required in the model and are building this into their products. For example, whereas Iasta’s Bid Optimization 1.0 product contains the basic concept of ship-tos by way of allocation groups, their forthcoming product, among other capabilities, will eventually support ship-froms and discounts, which in turn support freight lanes and volume-based pricing.

I see a very bright future for decision-optimization-enabled e-sourcing platforms, both for solution providers and for their clients, who will now have the opportunity to maintain double-digit savings in their sourcing events for years to come. But I’d like to know what you think. Who’s using it now, who’s planning to use it, and who’s not? And why?


For a more in-depth discussion of decision optimization, what it is, what it is not, how it enables decision support, the benefits it provides, and strategies for success, see the “Strategic Sourcing Decision Optimization: The Inefficiency Eliminator” wiki-paper over on the e-Sourcing Wiki [WayBackMachine].

Optimization II: Why it was Relegated to the Shadows

Originally posted on the e-Sourcing Forum [WayBackMachine] on

Yesterday I pointed out a recent report from Aberdeen, published in June of last year and entitled “Success Strategies in Advanced Sourcing and Negotiations: Optimizing Total Costs and Total Value for the Next Wave of e-Sourcing Savings”, which determined that the application of optimization tools to analyze total costs, and of flexible bidding functionality to uncover creative supplier solutions, has enabled early adopters to identify average incremental savings of 12% above those that basic, price-focused auctions alone have generated. I also discussed the fact that, despite this result, optimization still is not used regularly across the board.

I also indicated that the lack of use of optimization across the board is likely the result of a number of factors that have historically combined to downplay the appeal of decision optimization, which has often been viewed as too complicated to bother with unless absolutely necessary.

Specifically, I believe the lack of adoption of decision optimization across the board is the result of four key factors.

  1. Early e-Auctions generated amazing returns.
  2. Initial optimization offerings were hard to use and harder to understand.
    (In my view, MindFlow fell into this category.)
  3. Many solution providers attempted to side-step the complexities by toning down their options.
  4. A lack of integrated solutions on the marketplace.

I will now discuss each of these in more detail.

(1) Early e-Auctions generated amazing returns!

Many auctions generated double-digit returns, often in excess of 20%! This caused auction technology to be over-hyped as a technology for cost savings. As a former employee of an early provider of strategic sourcing solutions, I saw both the results and the buzz they generated. However, these results cannot be maintained indefinitely! Even if a supplier has a bloated margin of 100%, the most it will be able to give up while remaining viable is typically in the 60% to 80% range. In other words, after three events, there are no more margins to trim.
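The arithmetic behind that claim can be sketched with purely illustrative numbers (my own, not from any study): if each auction event captures a roughly fixed share of whatever excess margin remains, the savings decay geometrically and are essentially exhausted after three events.

```python
# Illustrative numbers only: start with a hypothetical 100 points of excess
# margin and assume each auction event captures ~70% of whatever remains.
margin = 100.0
capture_rate = 0.70

for event in range(1, 5):
    saved = margin * capture_rate   # savings realized in this event
    margin -= saved                 # excess margin left for the next event
    print(f"event {event}: saved {saved:.1f} points, {margin:.1f} points left")
```

By the third event, less than ten points of the original hundred remain to be trimmed, and by the fourth there is effectively nothing left, regardless of the exact capture rate you assume.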

(2) Initial optimization offerings on the e-sourcing marketplace were hard to use and even harder to understand.

By its very nature, optimization, which is grounded in complex mathematics, is hard. Very hard. And many products had user interfaces to match. But while the underlying technology may be sophisticated, that does not mean the end product has to be! Your car is a perfect example. Modern cars have very complex integrated mechanical, electrical, and electronic systems, yet the user interface for an automatic is a gear shift (park, neutral, drive), a steering wheel, a gas pedal, and a brake. Decision optimization should be the same: mind-boggling under the hood but as easy as e-mail through the UI. Next-generation systems will be.

(I believe this is one of the reasons that decision-optimization-only companies like MindFlow never caught on beyond a few large CPG and food-service companies. Even though there was a time when MindFlow’s self-service optimization capability could not be matched in the sourcing marketplace*, the product could cause the average user significant consternation, and I believe only sophisticated sourcing professionals with extensive training could take full advantage of the solution.

*I’m sure a few individuals at CombineNet would disagree with this statement, but one thing I repeatedly heard from customers and prospects about their early solution offerings was that you needed one of their PhDs to run it for you. However, I should note that for certain areas this is definitely no longer the case with some of their recent releases.)

(3) Many solution providers attempted to side-step the complexities by initially toning down their offerings.

The proclaimed market leaders in the sourcing space provide us with examples. Whereas MindFlow built an extensive model (7+ logical dimensions) with ship-tos, ship-froms, built-in lane support, complex cost structures, etc., some of the leaders went with simple point-based bid solutions: Bid 1 from Supplier 1 for Item X, Bid 2 from Supplier 2 for Item X, Bid 1 from Supplier 1 for Item Y, Bid 2 from Supplier 2 for Item Y, and so on, with a couple of limit or allocation constraints. Although these products turned out to be much easier to use, they did not provide enough sophistication to model the real-world supply chains and constraints of the companies that needed optimization the most! Interestingly enough, I believe that this is one of the reasons MindFlow lasted so long when many other optimization-based start-ups no longer exist (independently). MindFlow’s early market may have been small, but it was one of the pioneers of true decision optimization technology and one of the few companies to offer the real power multinational CPG and food-service companies needed to accurately model their sourcing scenarios. (In comparison, CombineNet was one of the few companies that could handle their purely logistical models.)
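To make the point-based model concrete, here is a toy sketch with hypothetical bids and a hypothetical one-item-per-supplier allocation constraint (my illustration, not any vendor’s actual engine, which would use a proper MIP solver rather than brute-force enumeration):

```python
from itertools import product

# Hypothetical point bids: bids[item][supplier] = unit price
bids = {
    "X": {"S1": 10.0, "S2": 9.5},
    "Y": {"S1": 4.6,  "S2": 4.0},
}

def best_award(bids, max_items_per_supplier):
    """Enumerate every item-to-supplier assignment and keep the cheapest
    one that respects a simple allocation constraint."""
    items = list(bids)
    best_cost, best_alloc = float("inf"), None
    for choice in product(*(bids[item] for item in items)):
        counts = {}
        for supplier in choice:
            counts[supplier] = counts.get(supplier, 0) + 1
        if max(counts.values()) > max_items_per_supplier:
            continue  # violates the allocation constraint
        cost = sum(bids[item][s] for item, s in zip(items, choice))
        if cost < best_cost:
            best_cost, best_alloc = cost, dict(zip(items, choice))
    return best_cost, best_alloc

# Unconstrained, the cheapest supplier sweeps both items; capping each
# supplier at one item forces a split award at a slightly higher cost.
print(best_award(bids, max_items_per_supplier=2))  # (13.5, {'X': 'S2', 'Y': 'S2'})
print(best_award(bids, max_items_per_supplier=1))  # (14.0, {'X': 'S1', 'Y': 'S2'})
```

Even this trivial example shows why a couple of limit or allocation constraints matter: the constrained optimum is a different award than the naive cheapest-bid-per-item answer. What it cannot show is exactly what the simple point-based products lacked: lanes, freight, discounts, and the other dimensions of a real supply chain model.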

(4) A lack of integrated solutions on the marketplace.

Many of the early providers of decision optimization only offered decision optimization. Furthermore, those companies that did offer other solutions typically weren’t best in class, especially from a usability perspective. However, leading sourcing professionals know that decision optimization is most effective when it is part of an integrated e-enabled strategic sourcing process and relatively ineffective on its own. (At least one of Iasta’s forthcoming solution briefs will elaborate more on this.) Decision optimization needs cost data (which results from auctions, possibly sealed-bid), qualified award possibilities (which result from RFx and supplier scorecards), and an understanding of the supply chain strategy and the appropriate commodity market (which results from spend analysis and proper processes). On its own, its capabilities are limited; integrated into an end-to-end e-sourcing platform, its capabilities are virtually endless.

Fortunately, market conditions are changing and I believe that the industry as a whole will not only be ready for this amazing technology very soon, but be hungry for it, especially when it is properly integrated into an e-Sourcing platform that provides best-of-breed technologies that support the end-to-end e-Sourcing process. The reasons I have for this forthcoming shift, and the reasons why some companies are working hard to build a best-of-breed decision optimization offering that is tightly integrated into an end-to-end e-Sourcing suite, will be illuminated in tomorrow’s post.


For a more in-depth discussion of decision optimization, what it is, what it is not, how it enables decision support, the benefits it provides, and strategies for success, see the “Strategic Sourcing Decision Optimization: The Inefficiency Eliminator” wiki-paper over on the e-Sourcing Wiki [WayBackMachine].

Optimization I: A Powerful Tool

Originally posted on the e-Sourcing Forum [WayBackMachine] on Friday, 25 August 2006

Even before Aberdeen came out with its “Success Strategies in Advanced Sourcing and Negotiations: Optimizing Total Costs and Total Value for the Next Wave of e-Sourcing Savings” in June of last year, some of us already knew that decision optimization was the future of strategic sourcing. Moreover, their finding that the application of optimization tools to analyze total costs, and of flexible bidding functionality to uncover creative supplier solutions, has enabled early adopters to identify average incremental savings of 12% above those that basic, price-focused auctions alone have generated was no surprise to those of us who had been developing such technology, and monitoring its implementation success, for many years. That’s why innovative sourcing companies like Iasta (your e-Sourcing Forum blog sponsor) were already working on Bid Optimization capabilities (with version 1.0 released in December of last year) and focused optimization companies like CombineNet have been pursuing improved optimization technologies and algorithms for over a decade.

And as you read this, I can tell you that Iasta is investing heavily in the research and development of Decision Optimization 2.0, which it expects to complete by the end of the year. Decision Optimization 2.0 will be based on the theory of Total Value Management (TVM) and continue to run on market-leading solvers such as ILOG’s CPLEX. TVM models attempt to go beyond the capabilities of LCO (Landed Cost Optimization) and TCO (Total Cost Optimization) models by capturing the value, and not just the cost, of an award. They support qualitative constraints, to allow you to ensure the award will meet your physical requirements (durability, reliability, timeliness, low defect rate, etc.), and allow you to capture the impact costs associated with an award (such as marketing value, low return rates, and high customer satisfaction) through constraints, fixed costs, and adjustments. TVM is the next logical progression in sourcing cost modeling (and an extension of the TCO modeling capabilities that were found in many previous-generation modeling tools, including MindFlow). But I digress.

Many innovative service and solution companies in the e-sourcing marketplace have been betting for the last five years (or so) that optimization is the wave of the future, but the vast majority have met with limited success (often surviving by M&A, like MindFlow, as pointed out by David in a post earlier this year) and many more are out of business.

Furthermore, the companies that have succeeded have done so primarily due to acquisitions and other strengths. For example, Ariba acquired many of its customers from FreeMarkets, and its customers praise it for its market knowledge and end-to-end platform capabilities that support integrated best-practice processes from start to finish. Emptoris essentially doubled its customer base through the diCarta merger, acquired many of its initial customers through its auction capabilities, and retained them through its own end-to-end platform, beefed up by many acquisitions over the years. i2 just isn’t a name I regularly hear in any sentence that contains “strategic sourcing” and “decision optimization”, and many companies that have survived, like SCA Technologies, are still relatively small in terms of customer base.

MindFlow, once acknowledged by analyst and research groups including AMR and the Aberdeen Group as the provider with the most comprehensive self-service platform for decision optimization, especially for CPG and food service, thanks to its complex modeling capabilities and numerous constraint categories, is now virtually non-existent since the acquisition. In fact, when you get right down to it, the only company that has been around for the long haul and succeeded on optimization alone is CombineNet, and it has historically made most of its inroads in logistics and transportation, not strategic sourcing award allocation (although its customer base and focus are broadening). Indeed, in a recent web search, the only significant 2006 news I could find in the strategic sourcing optimization arena is the January announcement that CombineNet is partnering in a joint venture with the University of Pittsburgh Medical Center to provide advanced sourcing solutions to the healthcare industry, as per this press release.

It is well known that many of the market leaders, and many of the big companies, are using decision optimization as a critical part of their strategic sourcing processes. But, considering that leaders typically make up less than 20% of the market (and that I know for a fact that not all market leaders are using decision optimization), it’s fair to ask: “Who else is using decision optimization?”

More importantly, when research has shown that repeated auctions on the same category quickly lead to diminishing returns, and often to net-zero returns after only three or four auctions, and that decision optimization often leads to incremental savings of 12% above and beyond other savings opportunities, why aren’t more companies, especially mid-market enterprises, making regular, constructive use of this technology?

I believe the lack of use of optimization across the board is the result of numerous factors that have combined to downplay the appeal and importance of this technology over the last few years when concentrated efforts should have been made to introduce this technology as an overall component of any value-based strategic sourcing process. Tomorrow, I will discuss those factors and Sunday I will discuss why I think the time of decision optimization for sourcing analytics is finally here.


For a more in-depth discussion of decision optimization, what it is, what it is not, how it enables decision support, the benefits it provides, and strategies for success, see the “Strategic Sourcing Decision Optimization: The Inefficiency Eliminator” wiki-paper over on the e-Sourcing Wiki [WayBackMachine].