Optimization Backed Sourcing Platform … Or Bust Part II


This is the second part of a five-part series that revises and ties together key ideas outlined last year on Sourcing Innovation across multiple posts. Regular readers will be familiar with much of the content, but the integrated perspective should help cement the ideas for regular readers and new readers alike.


This post is largely based on It’s Not a Suite, It’s Just Sourcing, Part II.

In our last post we made the rather bold claim, which is probably going to irk a lot of vendors, that it’s NOT a Suite, It’s JUST Sourcing. SI likes vendors that are trying to build solutions to solve their customers’ pain points, and has chronicled the efforts of many over the years, so it isn’t doing this series to be irksome. SI is doing this series because it’s not 2005 anymore, it’s 2016, and the nature of, and need for, Sourcing has changed as global trade has become more complicated, supply chains have lengthened, risks have increased, and sourcing has become more complex. Today, sourcing absolutely has to be more strategic, and Suite Sourcing is NOT Strategic Sourcing. In today’s post, we’re going to begin to clarify why.

Our last post outlined a hypothetical, but realistic, example in the high-tech space, discussing a typical primary sourcing event for a company that assembled custom-built high-end workstations for software developers and engineers. We started by discussing the primary factors that the Sourcing analyst was likely to identify as well as two strategies the analyst was likely to take. This led to a perceived event progression and a plan that looked like it was easily executable in your average modular sourcing suite. We did this to make it clear why many companies fall for the fallacy that you can attack sourcing in a step-wise fashion using a modular suite, and, as a result, why some vendors still believe that a modular suite is the way to go. The reality is that, at a quick glance, it does look like this is the right approach and that there is no reason to question it, even though there is a big reason: the approach is wrong.

The reason is that, in reality, the event is not going to go as planned.

Specifically, it will not be an analysis followed by an RFP followed by a single auction / optimization analysis followed by a push into the contract management system. One or more, with emphasis on the more, of the following will happen:

  • the RFX will come back and some of the requested bid fields will be empty because the supplier is no longer producing the product
  • the RFX will come back and there will be new products that the buyer did not know about with new bids (and new interdependencies to be mapped)
  • the logistics carriers will come back with quotes much higher than expected and/or a logistics carrier or 3PL will withdraw (due to overcommitments) and lanes will vanish
  • stakeholders or key customers will change requirements after the RFX has been issued, and the buyer will have to go back and ask for prices on next-generation products, which might still be in final design stages
  • the baseline optimization will come back with completely unexpected results, and once the analyst uses spend analysis to dive in, she will find a number of outliers in the incumbent bid and realize that she has to go back and ask for verified or corrected data
  • the auction will end with three suppliers almost equal on baseline scoring and extensive analysis will be needed to determine which supplier gets 50%, which supplier gets 30%, and which supplier gets 20% in the 50/30/20 split dictated by the stakeholders to minimize risk

In these situations, respectively:

  • the analyst will have to identify a larger supply base and send the RFX to more suppliers
  • the analyst will have to research the new products and decide whether to accept them or not and then, possibly, ask the supply base to bid on (comparable) products in a revised RFX
  • the analyst will have to invite more carriers to bid and consider alternate lanes, possibly from secondary (air)ports to secondary (air)ports
  • the analyst will have to create revised specs and go back to the supply base for additional prices and options
  • the analyst will have to backtrack to the spend analysis step on the submitted data, followed by a request for bid verification and a repeat of the optimization on revised data
  • the analyst will have to go back to the analysis step to identify which bid components were strongest for each supplier and then compare that to existing supplier scorecards (to determine likelihood of on-time delivery, quality guarantees, price consistency, etc.), as sketched just after this list
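
To make that last step concrete, here is a minimal sketch, in Python, of how an analyst might break the near-tie by weighting scorecard criteria and mapping the resulting ranking onto the stakeholder-mandated 50/30/20 split. The supplier names, weights, and ratings are invented placeholders, not figures from the example event, and a real platform would feed these factors into the optimization model itself rather than into a post-hoc script.

```python
# Hypothetical sketch: break a near-tie between three suppliers using
# weighted scorecard criteria, then map the ranking to the mandated
# 50/30/20 award split. All names, weights, and ratings are invented.

# Stakeholder weights for the scorecard criteria (sum to 1.0).
weights = {"on_time_delivery": 0.4, "quality": 0.35, "price_consistency": 0.25}

# Scorecard ratings (0-100) for the three near-tied suppliers.
scorecards = {
    "Supplier A": {"on_time_delivery": 92, "quality": 88, "price_consistency": 81},
    "Supplier B": {"on_time_delivery": 85, "quality": 93, "price_consistency": 90},
    "Supplier C": {"on_time_delivery": 88, "quality": 86, "price_consistency": 95},
}

def weighted_score(card):
    """Collapse a supplier scorecard into a single weighted score."""
    return sum(weights[criterion] * rating for criterion, rating in card.items())

# Rank suppliers by weighted score, best first.
ranking = sorted(scorecards, key=lambda s: weighted_score(scorecards[s]), reverse=True)

# Stakeholder-dictated volume split to minimize risk.
split = [0.50, 0.30, 0.20]

for supplier, share in zip(ranking, split):
    print(f"{supplier}: {weighted_score(scorecards[supplier]):.1f} -> {share:.0%} of volume")
```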

In other words, the event is not going to go as planned and it’s not going to be a sequential progression from analysis to RFX to auction/optimization to award. Moreover, most events are going to see multiple occurrences of the above hiccups and require an almost random workflow that uses all of the sourcing capabilities of a suite multiple times.

Moreover, the transitions back and forth will need to be seamless. If an analyst has to push data out of the optimization “module” into the “analysis” module for detailed data and outlier analysis, then push the data, with insights, back into the “RFX” module for revised RFX data collection, and then push the revised RFX data back into the “optimization” module for revised analysis, only to find out that the lane cost is coming out higher than expected in the preferred award, indicating that there is still an additional opportunity if logistics costs can be lowered, then this “modular” workflow quickly becomes a nightmare.
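
The point of “seamless” is that analysis, RFX revision, and optimization all operate on the same event data rather than exporting and re-importing it. The following is a purely conceptual sketch in Python, with invented step names and fields, of a sourcing event as one shared record that a workflow loop keeps cycling through until no step flags a need to backtrack; it illustrates the idea, not any vendor’s actual platform design.

```python
# Conceptual sketch only: model the sourcing event as ONE shared record that
# every step reads and writes, instead of data exported and re-imported
# between separate "modules". Step names and fields are invented.

from dataclasses import dataclass, field

@dataclass
class SourcingEvent:
    bids: dict = field(default_factory=dict)       # supplier -> bid data
    outliers: list = field(default_factory=list)   # flagged by analysis
    rfx_revision: int = 1                          # bumped when respecified
    scenarios: list = field(default_factory=list)  # what-if optimization runs
    next_step: str = "analyze"                     # where the workflow goes next

def analyze(event):
    # Outlier/spend analysis over the same record the optimizer will use.
    event.outliers = [s for s, bid in event.bids.items() if bid.get("suspect")]
    event.next_step = "revise_rfx" if event.outliers else "optimize"

def revise_rfx(event):
    # Re-issue the RFX for corrected data; no export/import round-trip needed.
    event.rfx_revision += 1
    for supplier in event.outliers:
        event.bids[supplier]["suspect"] = False    # assume corrected bids come back
    event.outliers = []
    event.next_step = "analyze"

def optimize(event):
    # Record a what-if scenario; a real step would invoke the solver here.
    event.scenarios.append(f"baseline on RFX rev {event.rfx_revision}")
    event.next_step = "done"

steps = {"analyze": analyze, "revise_rfx": revise_rfx, "optimize": optimize}

event = SourcingEvent(bids={"Supplier A": {"suspect": True}, "Supplier B": {}})
while event.next_step != "done":                   # a loop, not a straight line
    steps[event.next_step](event)
print(event.rfx_revision, event.scenarios)
```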

Plus, in this situation, the analyst will have to do an in-depth analysis of the logistics cost to determine if costs can be lowered simply by inviting more carriers to bid, analyzing primary and secondary lanes, or doing something progressive like using the organization’s sourcing expertise to help a provider lower their overhead with better insurance rates, communication plans, and office & computer supplies from the organization’s GPO contract. Then, after this analysis has been done, which will likely take the form of multiple what-if optimizations using various cost models, the analyst will have to go back to the RFP, issue the revised RFP with more options to current and new suppliers, push the data back into the optimization module and continue.
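
To make the what-if idea concrete, here is a minimal sketch, assuming purely hypothetical freight rates, of how an analyst might compare logistics cost scenarios (current carriers on the primary lane, an expanded carrier pool, a secondary lane) before deciding whether a revised RFP is worth issuing. The scenario names and numbers are invented for illustration and are not part of the example event; in practice each scenario would be a full what-if run of the cost model inside the optimizer, not a flat rate comparison.

```python
# Hypothetical what-if comparison of logistics cost scenarios.
# Rates are invented; in a real event each scenario would be a full
# what-if run of the cost model inside the optimizer.

units = 10_000                 # annual unit volume to move
unit_price = 412.00            # preferred award unit price (illustrative)

scenarios = {
    "current carriers, primary lane": 18.50,        # freight cost per unit
    "expanded carrier pool, primary lane": 16.25,
    "secondary port, secondary lane": 14.90,
}

baseline = None
for name, freight_per_unit in scenarios.items():
    landed = units * (unit_price + freight_per_unit)  # total landed cost
    if baseline is None:
        baseline = landed                             # first scenario = status quo
    savings = baseline - landed
    print(f"{name}: ${landed:,.0f} landed, ${savings:,.0f} vs. current")
```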

In a modern sourcing project, one cannot separate data collection from cost modelling from analysis from bidding from optimization — it is all one integrated sourcing process that lathers, rinses, and repeats until the solution is found and the event is done. And any provider that thinks you can separate pieces out and take a modular, piecemeal approach and build up to a suite, one module at a time, is still living in 2005 and should be approached with caution. It’s not a suite, it’s just sourcing. And, as indicated in our previous post, and as will be discussed in more detail in a future post, it’s not optimization, it’s strategic sourcing.