Category Archives: Decision Optimization

How Do We Accelerate the Adoption of Optimization and Analytics? Part I

Every organization that adopts Strategic Sourcing saves time and money, and protects its reputation from the damage that typically results from poor, tactical buying. But any organization that adopts Advanced Sourcing processes and platforms saves more. A lot more. Only advanced sourcing, based on analytics and optimization, saves an organization an average of 10%+ year after year after year, even when traditional sourcing methods fail.

But despite this, adoption of modern analytics platforms and optimization-backed sourcing platforms is still minimal, even though second generation platforms have existed for about ten years and third generation platforms, which can do more than first generation systems ever imagined and can be used by even the most junior buyer, have been hitting the scene for the last couple of years. There’s no reason these systems are not in every leading Supply Management organization, and in every organization that wants to become one.

Why aren’t these systems, which can deliver an ROI not only many times their cost but many times that of every other system, being adopted?

Well, there are still the rampant myths that they are hard to use, require a PhD, and are only applicable to complex strategic categories, but anyone who does even a bit of research will realize that these myths only had (a shred of) validity with respect to first generation systems. There is also the belief that they are unaffordable (as first generation systems required high six figures, if not seven), but again, research will illustrate that powerful systems are available in the five figure range, and best in class systems, which support the organization end to end, can be obtained on an enterprise basis in the low six figures (and often deliver eight figures of value year after year, a 100X return). But what’s the real reason, and how do we overcome it?

If we want to really accelerate adoption, we have to figure out the critical roadblock. Last year, the prophet, in his post on “brainstorming how to accelerate the adoption of sourcing optimization” suggested the answer resided in:

  • simplifying the non-power user experience,
  • providing dynamic global/geo analysis from a visibility and risk perspective,
  • including greater API-based connectivity to back-end systems,
  • providing decision guidance as to the best models to use and scenarios to run, and
  • allowing for the sharing of models, scenarios, and best practice guidance between users

suggesting that the real reasons were

  • perceived complexity,
  • lack of visualization beyond cost tables,
  • lack of integration,
  • lack of guidance, and
  • lack of collaboration.

But, just like the myths, these reasons don’t apply to modern optimization-backed platforms, which make it easy to import and export data (for file-based integration, which is all that is needed); visualize data on a map and against constraints and identified risks; share models and scenarios; use pre-packaged cost and constraint templates (which is guidance); and walk a user through the advanced sourcing process using a wizard.

So what’s the problem? Why isn’t the adoption being accelerated? We’ll address this in Part II.

True Savings Can Only Be Identified through Multi-Factor Optimization

A recent guest post from a vendor-employed guest contributor over on Spend Matters said to “Calculate Your True Savings Using Predictive Analytics”. While the doctor agrees that predictive analytics can often give you a good data point on projected savings, the reality is that it is not always as accurate as you would like to believe and typically does not capture your best savings opportunities.

Why? Before we discuss the guest post, which did have some good points, we have to note that most predictive analytics algorithms work by trending and computing statistics on historical or market data. While this can be highly accurate (95%+) the majority of the time (95%+), market data is only historical: it typically does not include data points on innovations not yet introduced or announced, detailed cost breakdowns behind consumer / market prices, or operational insights into hidden inefficiencies whose correction can do more than shave a few points off the top.

Going back to the post, the author states that if you use a Savings Regression Analysis (SRA) model, based on multivariate regression of past-realized savings for a given subcategory, to compute the savings potential under current market conditions, the computed target will be realistic, achievable, and likely to mirror what you will actually achieve (regardless of the savings targets you set).

And this statistically based model will work if it is the same buyer (group) employing the same strategy on the same supply base under similar conditions. But what happens if a new buyer comes in and totally redefines the demand and the market strategy, or the market suddenly shifts from supply shortage to supply surplus, or new production technologies revolutionize production and trim overhead 20%? In these situations, this type of model will be significantly off.
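To make the SRA idea concrete, here is a minimal sketch of such a multivariate regression in Python. The event history, the feature choices (market price index, supplier count, volume), and every number are invented for illustration; a real model would be fit on your own past-realized savings for the subcategory.

```python
import numpy as np

# Hypothetical past events for one subcategory (invented numbers):
# features are [market price index, supplier count, volume in k-units].
X = np.array([
    [1.02, 5, 40.0],
    [0.98, 7, 55.0],
    [1.10, 4, 30.0],
    [0.95, 8, 60.0],
    [1.05, 6, 45.0],
])
y = np.array([0.0364, 0.0466, 0.0260, 0.0520, 0.0370])  # realized savings

# Fit savings = b0 + b1*index + b2*suppliers + b3*volume by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Project the savings target under current market conditions.
current = np.array([1.00, 6, 50.0])
target = coef[0] + coef[1:] @ current
print(f"projected savings target: {target:.1%}")
```

Note the caveat above in code form: the fit only knows the regimes it has seen, so a new buyer strategy or a shortage-to-surplus flip puts `current` outside the historical data and the projection quietly degrades.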

Now, anything you can do to better predict savings is a positive, because, as the author points out, this allows for

  • better cash flow management (as you will better know your costs)
  • time to market optimization (as you will know the best time to source if you have leeway)
  • goal setting (as you won’t be trying to achieve the impossible)
  • performance management (as you can track against a realistic goal)

But while predictive analytics give a good data point, the best data point comes when you use your market intelligence to build good should cost models, use optimization to minimize transportation, incidental storage and sales, and even taxation costs (when sourcing globally), and use six sigma analysis to see if there is any opportunity to take cost out of a supplier’s production overhead. Going into this level of detail may show that while the product cost is likely to increase 1% this year (which explains why the predictive software says only 2% savings should be expected after heavy negotiations), an extensive analysis could reveal that a transportation network redesign could shave 3% and lean process improvements at your supplier could shave another 2%, meaning a cost reduction of up to 7% could be achieved with the right footwork (which is something the predictive model will never tell you). So use the predictive algorithms to establish a baseline, but never, ever stop there.
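To illustrate just the transportation piece of that deeper analysis, here is a toy landed-cost minimization sketched as a linear program with `scipy`. The two suppliers, three plants, and all costs, capacities, and demands are invented; a real model would add storage, tax, and risk terms.

```python
from scipy.optimize import linprog

# Hypothetical landed costs per unit (product + freight + duty) for
# 2 suppliers shipping to 3 plants (invented numbers).
cost = [4.10, 4.60, 5.00,   # supplier A to plants 1..3
        4.40, 4.20, 4.70]   # supplier B to plants 1..3
capacity = [800, 700]       # max units each supplier can provide
demand = [400, 500, 300]    # units each plant needs

# Capacity constraints: units shipped by each supplier <= its capacity.
A_ub = [[1, 1, 1, 0, 0, 0],
        [0, 0, 0, 1, 1, 1]]
# Demand constraints: units arriving at each plant == its demand.
A_eq = [[1, 0, 0, 1, 0, 0],
        [0, 1, 0, 0, 1, 0],
        [0, 0, 1, 0, 0, 1]]

res = linprog(cost, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
              bounds=[(0, None)] * 6)
print("optimal landed cost:", res.fun)
print("award:", res.x)
```

The optimal award splits the third plant’s volume across both suppliers because the cheaper supplier runs out of capacity, which is exactly the kind of trade-off a spreadsheet comparison misses.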

Only an Optimization-Backed Sourcing Platform will Answer a Buyer’s SOS


We all know the importance of a good Sourcing Platform to power our Procurement Value Engine. But even after multiple posts (on Sourcing Innovation) and (white) papers on the topic, one still might not be convinced that an optimization-backed sourcing platform is truly necessary. If the organization is still getting reasonably good results from its (last-generation) sourcing suite, has a large number of templates, workflows, and processes configured for its key/strategic categories, and has a consultancy/service provider that handles its tougher events (and they use an optimization-powered platform for those few really complex or high-dollar categories), it might think that everything is fine. And the reality is that everything is fine … until it isn’t!

from How Optimization-Backed Sourcing Platforms Save Our Souls . . . Or At Least Our Backsides


One has to understand that disruptions don’t only occur in the supply chain after the contract is signed, they occur during the sourcing process, and a significant disruption can result in an evergreen contract renewing at above market prices (which is bad) or a contract expiring and the organization left with insufficient inventory and no source of supply in a tight market (which is worse). And even if the disruption doesn’t result in an evergreen renewal or a (costly) inventory stock-out (that shuts down a production line), it can still result in increased costs, increased risks, and missed opportunities.

Sourcing events need to go smoothly, but in a typical sourcing platform, as many of you know, that’s not always the case. Sometimes suppliers change the rules, sometimes the rules just change, and everything, as they say, quickly goes to hell in a handbasket.

For example, all of a sudden at the 11th hour, a fire breaks out or a border closes, and a supplier offers you a backup location, or pulls out entirely, and you need to bring in a supplier at a new location. Your transportation bids are useless, your risk profile is unusably skewed, and maybe even your whole event setup is useless, and you have to start over. And this is just one of a dozen scenarios that can flip an average buyer’s world upside down on an average sourcing platform.

But if you had a flexible optimization-backed sourcing platform, instead of going back to square one, you’d just keep on truckin’, because these platforms are designed, from the ground up, to support dynamic, complex cost models, dozens of what-if scenarios, and ever-changing real-world requirements. They are made for change.

A factory and its associated lanes disappear? No problem: they are removed from the model with a single click. A new one is added? No problem: define the associated end points, the lanes are automatically populated, and a partial bid survey can be resent to all incumbent suppliers for revised bids. These are then loaded into the model, amalgamated with current bids, and the model is re-solved. No starting from scratch, no new RFPs, no new model structure. Just a few simple changes, a few new bids, and everything keeps on going like nothing ever happened.
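Conceptually, that kind of incremental edit looks like the following sketch; the bid data, lane names, and the cheapest-supplier-per-lane “award” are invented stand-ins for what a real platform does with a full optimization model.

```python
# Hypothetical in-memory model: bids keyed by (supplier, lane), cost per unit.
bids = {
    ("SupplierA", "Shanghai->Rotterdam"): 2.10,
    ("SupplierB", "Shanghai->Rotterdam"): 2.25,
    ("SupplierA", "Gdansk->Rotterdam"):   0.90,
    ("SupplierB", "Gdansk->Rotterdam"):   0.85,
}

def drop_factory(bids, origin):
    """Remove every lane that starts at a closed factory."""
    return {k: v for k, v in bids.items() if not k[1].startswith(origin + "->")}

def merge_bids(bids, new_bids):
    """Amalgamate revised bids from a partial re-survey with current ones."""
    return {**bids, **new_bids}

def solve(bids):
    """Toy award: cheapest supplier per remaining lane."""
    best = {}
    for (supplier, lane), cost in bids.items():
        if lane not in best or cost < best[lane][1]:
            best[lane] = (supplier, cost)
    return best

# The Gdansk factory closes; a new Hamburg origin is surveyed instead.
model = drop_factory(bids, "Gdansk")
model = merge_bids(model, {("SupplierB", "Hamburg->Rotterdam"): 0.95})
award = solve(model)
```

Dropping the closed factory and merging the revised bids are pure data operations; nothing about the event structure has to be rebuilt before re-solving.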

And this is only one way optimization-backed sourcing platforms save a buyer’s behind. For more, check out the doctor‘s latest paper on How Optimization-Backed Sourcing Platforms Save Our Souls . . . Or At Least Our Backsides, sponsored by Trade Extensions, and realize that if you don’t have one, you need a proper sourcing platform today.

Millions Saved. Pennies Spent. Why Won’t They Learn?

Trade Extensions recently released a new set of case studies chronicling half a dozen sourcing projects it completed over the last couple of years for its Fortune 500 clients. They showed, on average, savings of 10% or more, ranging from 500K on a 5.5M category to 28M on a 200M category. All of these companies saved tens of millions (or more) while spending only in the six figure range for the Trade Extensions solution, which means that for every penny spent, a dollar was saved.

It is not just the magnitude of the savings that is significant, though; it is the breadth of the impact. The air freight example not only identified a savings potential of 42%, with a realized savings of 21% (once the company took risk, performance, and preferred vendors into account), but also identified a scenario that improved service levels and reduced risks while still delivering that 21% savings.

The compliance reporting example helped an organization that, due to the scale of its operations, took five days to analyze the output of its Transportation Management System (TMS). The retrospective analysis was reduced to a proactive operations step that automatically executed in 30 minutes or less, allowing the organization, for the first time, to ensure its product movements were consistent with the awarded contract scenario.

In the full truck load and global packaging examples, the companies were able to rationalize the supply bases by 25% to 40% while reducing cost and at least maintaining service levels and risk (if not increasing service and decreasing risk).

And yet these examples are rare. Every year, many organizations as large as, or larger than, these continue to spend close to, if not, seven figures on their first generation sourcing or source-to-pay platforms while generating savings that, instead of being in the 10% or more range, are in the 2% to 3% range. The organization is essentially spending dollars to save dollars, which does not make good economic sense, especially when a modern optimization-backed sourcing platform can be run alongside existing supply management systems and used as appropriate to generate 3X to 5X the savings and value the organization would otherwise obtain.

So while the leaders have learned, why won’t the laggards learn?

Keelvar: The Little Engine that Could

In case you haven’t guessed, this post is about The Little Engine That Could, which not only got up the big hill but, after scaling it, decided to follow the tracks up to Alaska, tackle and climb Mount McKinley (also known as Denali), the highest mountain in the United States at 6,190.5 meters (20,310 feet), and not stop until it reached the summit.

For those of you who missed our prior posts, namely Keelvar: Strange Name. Uncommon Results., Keelvar: Are They Right for You, and Re-introducing Keelvar, An Optimization-Backed Sourcing Platform, Keelvar is the newest, and still the smallest, entrant in the strategic sourcing decision optimization game, and one of the few (correction: two) vendors to provide a fully integrated optimization-backed sourcing platform (with integrated RFX and e-Auctions). It has been making great strides since it spun out of the 4C research laboratory (in the Department of Computer Science) at University College Cork a mere four years ago in 2012. Since then, it has been advancing faster than all of its peers except Trade Extensions and has emerged as a top contender in the provision of optimization-backed sourcing platforms. In fact, as hinted at in an upcoming Pro piece, the doctor expects that Keelvar will grow faster than 4 of its 5 competitors over the next few years.

So what’s so great about this little upstart? The first thing to note is the ease of use of the platform, which embeds a simple-to-follow seven-step best practice sourcing process that literally guides even the most junior of buyers through the most complex events the platform can handle, while the side-bar navigation makes it a breeze to quickly access any step in the process. (The tried-and-true best-practice methodology is strikingly similar to what MindFlow used back in the day, but MindFlow never had such an easy to use, clean, and modern interface.)

The second thing is the speed of improvement. Since SI last reviewed the platform last fall, a number of considerable enhancements have been made that go well beyond usability. Extensive supplier self-service has been added: all the buyer has to do is invite one supplier rep, and that rep can create the supplier organization’s records, add users, give them appropriate, fine-grained read/edit rights to documents and bids, and manage the entire response and bid process without any buyer involvement whatsoever. Single-sheet smart-load has been developed, which allows the platform to detect field types, field status, and other relevant information without the user having to define a lot of meta-data or use the cell-based encoding required by other platforms. And parametric bidding is in quality assurance.
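As a rough illustration of what single-sheet smart-load has to do, here is a deliberately crude sketch of field-type inference on a tiny CSV sheet; the type rules and the sample data are invented, and any production implementation is far more sophisticated.

```python
import csv
import io

def infer_type(values):
    """Guess a column's field type from its values (a simplified sketch)."""
    def is_num(v):
        try:
            float(v.replace(",", ""))
            return True
        except ValueError:
            return False
    if all(is_num(v) for v in values if v):
        return "numeric"
    if all(v.upper() in {"Y", "N", "YES", "NO"} for v in values if v):
        return "boolean"
    return "text"

def smart_load(sheet):
    """Read a single sheet and tag each column with an inferred type."""
    rows = list(csv.reader(io.StringIO(sheet)))
    header, data = rows[0], rows[1:]
    return {h: infer_type([r[i] for r in data]) for i, h in enumerate(header)}

sheet = "Item,Unit Price,Incumbent\nWidget,4.25,Y\nGadget,3.10,N\n"
print(smart_load(sheet))
```

The point is that the buyer uploads one flat sheet and the platform works out the meta-data, rather than requiring the cell-by-cell encoding older platforms demand.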

Parametric bidding is, in a word, cool. Often in the acquisition of fleets, computers, cell phones, etc., the buyer doesn’t know the exact configuration details that are desired until the last minute. In this situation, the buyer has to either create a huge number of potential configurations for bidding or pick a few and hope for the best. With parametric bidding, the supplier can bid on a base configuration and define all of the options they offer against that configuration, as well as the price increments (or decrements) for each option. When the final configurations are selected, the system automatically calculates the appropriate costs (and discounts) from the parametric sheet for the optimization model, with no effort at all required of the user. This is a feature that is just not seen in first generation sourcing platforms. Watch for it.
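The costing side of parametric bidding is simple arithmetic over the supplier’s sheet; here is a minimal sketch with a hypothetical laptop bid (the base price and all option deltas are invented):

```python
# Hypothetical parametric bid: base configuration price plus per-option deltas.
base_price = 620.00  # supplier's bid on the base laptop configuration
option_deltas = {
    "ram_16gb":     45.00,   # increment over the base 8 GB
    "ssd_512gb":    60.00,
    "3yr_warranty": 55.00,
    "no_webcam":   -10.00,   # decrement for a removed feature
}

def configured_price(base, deltas, chosen_options):
    """Price a final configuration from the parametric sheet."""
    return base + sum(deltas[opt] for opt in chosen_options)

# The buyer settles on the final configuration at the last minute.
price = configured_price(base_price, option_deltas,
                         ["ram_16gb", "3yr_warranty", "no_webcam"])
print(f"unit price for the optimization model: {price:.2f}")
```

The value is that the supplier prices the option space once, up front, and the buyer can cost any late-breaking configuration for the optimization model without a re-bid.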

Keelvar, which was first named a SpendMatters company to watch last year (and which will soon be covered in depth on Pro by the doctor and the public defender), is a company you should be keeping a really close eye on. Optimization-backed sourcing platforms are the future, and right now they are one of only two providers with a single, integrated, end-to-end solution. We may see more in the future (with BravoSolution working on integrating its two product lines, SciQuest’s acquisition of CombineNet, and Determine’s acquisition of Selectica), but Keelvar (and Trade Extensions) have an early lead that gets larger every day their competitors work on integration (as opposed to innovation).

Now, you’re probably worried about adoption, because first generation platforms were, for the most part, so damn hard to use (to put it bluntly), but second generation optimization-backed sourcing platforms are actually quite easy to use and focused on adoption. For more information on how to get Higher Adoption, check out the linked white paper. And for more information on Keelvar, we recommend checking out their new, open Keelvar support portal.