Category Archives: Decision Optimization

The UX One Should Expect from Best-In-Class Optimization … Part II

In our last post on The UX One Should Expect from Best-In-Class Optimization we began our foray into the world of optimization and how the requirements for a good user experience go well beyond the requirements for e-RFX and even best-in-class e-Auction. (As for the basic requirements of any e-RFX or e-Auction platform, see our two-part series on Best-in-Class e-Sourcing Part I and Part II and our deeper dive into Best-in-Class e-Auctions Part I and Part II.)

In our post, where we noted that last Thursday over on Spend Matters Pro [membership required] the doctor and the prophet posted the first article in our four-part series on What to Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 1), we indicated that optimization is important, very important, and about to become even more important as savings go up in smoke due to inflation, protectionist policies, and insufficient supply of raw materials. (And, as we noted, that’s why Coupa spent a lot of its IPO proceeds to buy Trade Extensions. They might not understand what they bought, or how to use it, or where it fits in, but they saw the future and wanted to get in the game early enough to have some time to — hopefully — figure it out before their competition.)

Plus, as we noted, it’s the only advanced sourcing solution that has been demonstrated, repeatedly, by analysts and providers alike, to provide year-over-year returns over 10% when properly applied. Plus, unlike spend analysis, which only identifies the high-level savings opportunities (which can only be captured if appropriate events are undertaken, possibly based on optimization), optimization produces the exact award scenario required to generate the savings. And, often, this will be a scenario that will never, ever be identified by any human, even if given enough time to generate dozens, or even hundreds, of spreadsheets.

In our last post, we noted that one of the core requirements of such a platform is powerful cost modelling, as true calculation, and optimization, of the cost of goods sold requires complex breakdowns and formulas because, in practice, even the most “vanilla” or simplest of products have fixed costs and variable costs that change at different production levels. And in addition to the fixed and variable costs associated with product creation, there are also import / export tariffs, taxes, logistics costs, utilization costs, warranty, return, and disposal costs, and a host of other category-specific costs.

But that’s just one core component of the platform. Another, as explained in the doctor and the prophet‘s second piece on What To Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 2) [membership required] is guided sourcing by way of system-assisted “what-if” support.

Cost modelling is indeed a powerful tool, especially when compared to a system that doesn’t have it, but arguably the real power of a strategic sourcing decision optimization tool lies in the ability to generate, analyze, and compare what are called “what if?” scenarios. This statement holds true both when analyzing cost and when analyzing risk, as well as the broader resilience components of sourcing award/allocation decisions.

As the co-authors note, even expert sourcing optimization users commonly look to collect data and apply constraints centred on near-term thinking — and award decisions. But when viewed in true context (i.e. an awarded supplier’s performance over the term of the contract), it is rarely the lowest cost and resulting business allocation scenario that brings the greatest value to the organization. Rather, it is the scenario that is the most resilient in the face of unpredictability.

Moreover, even minor variations resulting from different initial or future award considerations can have drastic impacts on costs. And it is only through such scenario analysis that a slightly higher cost award decision today (or in the future) could end up delivering far greater organizational value. Users can now think through all such potential scenarios, so it should be common practice to use the capability to test hunches and/or quantify potential risks … which can only be done with the right sourcing optimization platform with a modern, appropriate, user experience.
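To make the resilience point concrete, here is a minimal what-if sketch in Python. Everything in it is invented for illustration (the suppliers, prices, split ratio, and spot-market premium are hypothetical): a single-source award looks cheaper up front, but a split award wins on cost the moment the primary supplier is disrupted and uncovered demand must be bought at a premium.

```python
# A minimal "what-if" sketch (hypothetical data): compare a lowest-cost
# single-source award against a split award under a disruption scenario.

# Unit prices quoted by two hypothetical suppliers for 1,000 units
prices = {"supplier_a": 10.00, "supplier_b": 10.50}
demand = 1000

# Scenario 1: award everything to the cheapest supplier
single_source_cost = demand * min(prices.values())

# Scenario 2 ("what if supplier A is disrupted mid-contract?"):
# a 70/30 split costs slightly more up front ...
split_cost = 0.7 * demand * prices["supplier_a"] + 0.3 * demand * prices["supplier_b"]

# ... but if supplier A fails and its share of demand must be re-bought
# at a 40% spot-market premium, the split award wins on total cost.
spot_price = prices["supplier_a"] * 1.4
disrupted_single = demand * spot_price                                       # all 1,000 units at spot
disrupted_split = 0.3 * demand * prices["supplier_b"] + 0.7 * demand * spot_price

print(single_source_cost, split_cost)     # 10000.0 vs 10150.0 up front
print(disrupted_single, disrupted_split)  # 14000.0 vs 12950.0 under disruption
```

The slightly higher-cost award today (150 more) is the far greater value the moment the disruption scenario materializes, which is exactly the kind of trade-off only scenario comparison surfaces.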

But this is just another piece of the puzzle. Stay tuned for Part III.

The UX One Should Expect from Best-In-Class Optimization … Part I

In our last four posts, we dove deep, really deep, into the basic requirements for any modern e-Negotiation platform (which we defined as e-RFX and/or e-Auction) and then dove deeper into the additional requirements for any e-Auction platform that claims to be a modern, best-in-class, e-Auction platform in the year 2017 (not 2007, where some seem to be stuck — but we won’t talk about them). [See Best-in-Class e-Sourcing Part I, Best-in-Class e-Sourcing Part II, Best-in-Class e-Auctions Part I, and Best-in-Class e-Auctions Part II.] But we excluded optimization for a reason, because the requirements for optimization go beyond — way beyond — the requirements for even the most intense set of requirements for the most advanced e-Auction platform (which can support bills of materials and constraints).

Last Thursday, over on Spend Matters Pro [membership required], the doctor and the prophet posted the first article in our four-part series on What to Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 1) (that’s right, 4-part series), where we begin our deep dive into what best-in-class sourcing optimization looks like and how form follows function from a design perspective.

First of all, optimization is important — and about to become even more important as savings go up in smoke due to inflation, protectionist policies, and insufficient supply of raw materials. Why do you think Coupa spent a healthy chunk of its IPO proceeds to purchase Trade Extensions? (They might not understand what they bought, or how to use it, but they saw the future and wanted to get in the game early enough to have some time to, hopefully, figure it out before their competition.)

Secondly, it’s the only advanced sourcing solution that has been demonstrated, when properly applied, to generate year-over-year returns over 10% (and an average of 12% according to two of the first back-to-back studies conducted by Aberdeen last decade). Plus, unlike spend analysis, which only identifies the high-level savings opportunities (which can only be captured if appropriate events are undertaken, possibly based on optimization), optimization produces the exact award scenario required to generate the savings.

Third, and most important, optimization identifies opportunities for both savings and value generation that no other platform can. Opportunities and supply chain designs no human, even with thousands of spreadsheets, will ever identify. It’s the foundation for a new sourcing age, but one that will only happen if optimization gets adopted, which will only happen if it provides a great user experience (as that’s the only thing that will overcome the math-based fear that surrounds it).
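For a flavour of what finding “the exact award scenario” means mechanically, here is a toy sketch with entirely hypothetical bids and an invented business rule, brute-forcing the cheapest award of three items across two suppliers subject to a split-of-business constraint. Real optimization platforms solve this with mixed-integer programming at scales no spreadsheet or exhaustive search can touch; the toy only shows the shape of the problem.

```python
from itertools import product

# Tiny brute-force illustration (hypothetical data): find the cheapest
# award of 3 items across 2 suppliers, subject to the business rule
# "at least two suppliers must win something".
bids = {  # bids[item][supplier] = unit price
    "item1": {"s1": 5.0, "s2": 6.0},
    "item2": {"s1": 6.0, "s2": 6.5},
    "item3": {"s1": 4.0, "s2": 4.2},
}
items = list(bids)
suppliers = ["s1", "s2"]

best = None
for awards in product(suppliers, repeat=len(items)):
    if len(set(awards)) < 2:  # constraint: no single-supplier sweep
        continue
    cost = sum(bids[i][s] for i, s in zip(items, awards))
    if best is None or cost < best[0]:
        best = (cost, dict(zip(items, awards)))

print(best)  # (15.2, {'item1': 's1', 'item2': 's1', 'item3': 's2'})
```

Note that the unconstrained cheapest award (everything to s1, at 15.0) violates the rule, and the optimizer finds the cheapest compliant swap automatically; with dozens of items, hundreds of suppliers, and stacked constraints, that search space is exactly the territory humans with spreadsheets never fully explore.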

So what does such a platform need? Many, many things (as we dive into in What To Expect from Best-in-Class Sourcing Optimization Technology and User Design (Part 1) [membership required]), but one thing it requires is powerful cost modelling.

You see, true calculation, and optimization, of the cost of goods sold requires complex breakdowns and formulas because, in practice, even the most “vanilla” or simplest of products have fixed costs and variable costs that change at different production levels. For example, there are fixed costs to set up and start a production line, and then variable costs at each production level depending on the raw resources, energy, and manpower required to produce the product. And that’s just the beginning. You also have to worry about import / export tariffs, taxes, logistics costs, utilization costs, warranty, return, and disposal costs, and a host of other category-specific costs. If the costs aren’t modelled accurately, they can’t be optimized accurately. A great optimization platform thus supports flexible and powerful user-defined cost models that break down costs as needed, to levels where individual elements can be optimized when possible.
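As a sketch of what such a user-defined cost model might look like (every number below is hypothetical), consider a landed-cost function with a fixed setup cost, volume-tiered variable costs, a tariff, and per-unit freight:

```python
# A sketch of a simple user-defined cost model (all numbers hypothetical):
# landed cost = fixed setup + tiered variable cost + tariff + freight.

def landed_cost(units, setup=5000.0, tariff_rate=0.05, freight_per_unit=0.30):
    """Total cost of goods for a production run of `units`."""
    # Variable unit cost drops at higher production levels (volume tiers)
    if units < 1000:
        unit_cost = 12.00
    elif units < 10000:
        unit_cost = 10.50
    else:
        unit_cost = 9.75
    production = setup + units * unit_cost
    tariff = tariff_rate * production   # import duty on production value
    freight = freight_per_unit * units
    return production + tariff + freight

# Per-unit cost falls sharply with volume, which is why the breakdown,
# not just a flat unit price, has to be modelled and optimized.
print(landed_cost(500) / 500)      # 23.4 per unit
print(landed_cost(20000) / 20000)  # 10.8 per unit
```

Each element (the tiers, the tariff rate, the freight) is a separate lever the optimizer can work on, which is the whole point of breaking the cost down rather than bidding a single number.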

And this is just the beginning.

Stay tuned for Part II in our series.

… And Trade Extensions is No More!

As of Thursday, one could look up a Form 8-K filing on the SEC site from May 3, 2017 that simply stated that Coupa had completed its acquisition of Trade Extensions, now called Coupa Advanced Sourcing for those of you on the ball (and watching TE profiles on LinkedIn for updates as well). SI expects you’ll see a formal press release early this week.

While SI completed its initial analysis shortly after the announcement, it’s going to hold off publishing until after Coupa Inspire to see if Coupa’s inspiration changes the doctor‘s mind. :-) # Look for a deep analysis the week of the 22nd.

(For speculators, you can check out SI’s historical writings on M&A in general and its posts on the importance of cultural conformity in partnerships and then balance these views with the simple fact that only one* acquisition of an optimization platform provider has succeeded in the Sourcing/Procurement space to date, and probably take a guess as to the doctor‘s current view. But it would be only a guess.)

*Tigris was swallowed by VerticalNet; CombineNet shrivelled in SciQuest, now Jaggaer; Mindflow was killed by Emptoris (which was in turn butchered by IBM, whose initial foray into optimization was so bad that they ended up giving it all away for free in COIN-OR) and the founders of Algorhythm subsumed their optimization capabilities into their rapid application development platform Applifire! Only the VerticalNet acquisition by BravoSolution was a success, and likely only because the BravoSolution model required keeping VerticalNet more-or-less intact as the US operation of the global BravoSolution organization (as there was essentially no US presence at the time).

#Or at least lets him focus on one analysis in particular (as his analysis is actually a bifurcated analysis that depends on decisions and directions taken over the next year … will make for a very long blog series as is … )

Box Nation

… most of what America is now is just boxes going back and forth …
Stewie, Family Guy, Season 15, Episode 18

Seth MacFarlane is extremely insightful when he chooses to be. We not only have boxes on pallets in containers going back and forth between countries but we have boxes in trucks going back and forth between local warehouses, stores, postal outlets, and consumer residences … it’s a boxes in, boxes out society. And it doesn’t matter how much we optimize the boxes coming in if the boxes going out still cost too much.

The point is, you can’t just optimize the inbound supply chain when the outbound supply chain consists of lots of small deliveries that will eat considerably into the savings you worked so hard to generate. In order to keep costs down, you have to optimize these little boxes as well.

This means that you not only need to optimize:

  • packaging costs
  • (outbound) distribution costs
  • insurance costs

But you shouldn’t do separate sourcing events, because packaging is used inbound and outbound. Plus, distribution inbound and outbound uses trucks … and while inbound might typically use big trucks and outbound might typically use small trucks, not only is the situation sometimes reversed, but the same carriers often have big trucks and little trucks and the more volume you can source, the better the deal you can get.

And then there is insurance. While the insurance inbound will likely be of the supply chain variety, and insurance outbound will likely be small carrier insurance / goods insurance, it doesn’t mean that both policies can’t be sourced from the same provider, and that you can’t get a better deal by sourcing them simultaneously.
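A back-of-the-envelope illustration of why combining inbound and outbound volume in one event can pay off, assuming a carrier with invented volume-tiered truckload rates (the tiers and load counts are hypothetical):

```python
# Hypothetical illustration: carriers often quote volume-tiered rates,
# so combined inbound + outbound volume reaches a cheaper tier than
# either sourcing event would on its own.

def rate_per_load(loads):
    """Carrier's tiered price per truckload (invented tiers)."""
    if loads >= 1000:
        return 430.0
    if loads >= 500:
        return 460.0
    return 500.0

inbound, outbound = 600, 600  # annual truckloads, hypothetical

separate = inbound * rate_per_load(inbound) + outbound * rate_per_load(outbound)
combined = (inbound + outbound) * rate_per_load(inbound + outbound)

print(separate, combined)  # 552000.0 vs 516000.0: one event saves 36,000
```

The same tiering logic applies to packaging and insurance: split the volume across separate events and you may never reach the tier the combined volume would earn.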

In other words, if you really want to save money and achieve sourcing success in Box Nation, you have to consider all the boxes, not just the inbound ones. And if you want to be successful, use optimization. Check the archives (linked) for more info.

Are We About to Enter the Age of Permissive Analytics?

Right now most of the leading analytics vendors are rolling out, or considering the roll out of, prescriptive analytics, which goes one step beyond predictive analytics and turns those predictions into actionable insights: actions the organization could take in order to take advantage of the likely situation suggested by the predictive analytics.

But this won’t be the end. Once a few vendors have decent prescriptive analytics solutions, one vendor is going to try and get an edge and start rolling out the next generation of analytics and, in particular, permissive analytics. What are permissive analytics, you ask? Before we define them, let’s take a step back.

In the beginning, there were descriptive analytics. Solutions analyzed your spend and / or metrics and gave you clear insight into your performance.

Then there were predictive analytics. Solutions analyzed your spend and / or metrics and used time-series, statistical, or other algorithms to predict likely future spend and / or metrics based on current and historical spend / metrics, and presented the likely outcomes to you in order to help you make better decisions.

Predictive analytics was great as long as you knew how to interpret the data, what the available actions were, and which actions were most likely to achieve the best business outcomes given the likely future trend in the spend and / or metrics. But if you didn’t know how to interpret the data, what your options were, or how to choose the option most in line with the business objectives, the predictions alone didn’t help you much.

The answer was, of course, prescriptive analytics, which combined the predictive analytics with expert knowledge that not only prescribed a course of action but indicated why the course of action was prescribed. For example, if the system detected rising demand within the organization and predicted rising cost due to increasing market demand, the recommendation would be to negotiate for, and lock in, supply as soon as possible using either an (optimization-backed) RFX, an auction, or a negotiation with incumbents, depending upon which option was best suited to the current situation.

But what if the system detected that organizational demand was falling, but market demand was falling faster, there would be a surplus of supply, and the best course of action was an immediate auction with pre-approved suppliers (which were more than sufficient to create competition and satisfy demand)? And what if the auction could be automatically configured, suppliers automatically invited, ceilings automatically set, and the auction automatically launched? What if nothing needed to be done except approve, sit back, watch, and auto-award to the lowest bidder? Why would the buyer need to do anything at all? Why shouldn’t the system just go?

If the system was set up with rules that defined behaviours that the buyer allowed the system to take automatically, then the system could auto-source on behalf of the buyer and the buying organization. The permissive analytics would not only allow the system to automate non-strategic sourcing and procurement activities, but do so using leading prescriptive analytics combined with rules defined by the buying organization and the buyer. And if the prescriptive analytics included a machine learning engine at the core, the system could learn buyer preferences for automated vs. manual vs. semi-automated handling and even suggest permissive rules (that could, for example, allow the category to be re-sourced annually as long as the right conditions held).
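As a sketch of what such permissive rules might look like in practice (the rule schema, thresholds, and field names below are invented for illustration, not taken from any actual platform):

```python
# A sketch of "permissive" rules (hypothetical schema): the buyer grants
# the system permission to auto-launch an auction only when every
# condition in a rule holds; anything else falls back to manual review.

RULES = [
    {
        "action": "auto_auction",
        "max_spend": 250_000,         # never auto-source strategic spend
        "max_risk": "low",            # only low-risk categories
        "min_approved_suppliers": 3,  # enough competition to auto-award
    },
]

RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def permitted_action(category):
    """Return the automated action the rules permit, or 'manual_review'."""
    for rule in RULES:
        if (category["spend"] <= rule["max_spend"]
                and RISK_ORDER[category["risk"]] <= RISK_ORDER[rule["max_risk"]]
                and category["approved_suppliers"] >= rule["min_approved_suppliers"]):
            return rule["action"]
    return "manual_review"

print(permitted_action({"spend": 80_000, "risk": "low", "approved_suppliers": 5}))
# auto_auction
print(permitted_action({"spend": 80_000, "risk": "high", "approved_suppliers": 5}))
# manual_review
```

The key design point is that the default is always manual review: the system only acts where the buyer has explicitly granted permission, which is what separates permissive analytics from blind automation.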

In other words, the next generation of analytics vendors are going to add machine learning, flexible and dynamic rule definition, and automation to their prescriptive analytics and the integrated sourcing platforms and take automated buying and supply chain management to the next level.

But will it be the right level? Hard to say. The odds are they’ll make significantly fewer bad choices than the average sourcing professional (with accuracy likely climbing toward 98% over time), but, unlike experienced and wise sourcing professionals, they won’t detect when an event happens in left field that totally changes the dynamics and makes a former best-practice sourcing strategy moot. They’ll detect and navigate individual black swan attacks but will have no hope of detecting a coordinated black swan volley. However, if the organization also employs risk management solutions with real-time event monitoring and alerts, ties the risk management system to the automation, and forces user review of higher-spend / higher-risk categories put through automation, it might just work.

Time will tell.