Category Archives: Decision Optimization

Procurement Trend #08. Lifecycle TCO

Five anti-trends remain. We can count them on one hand but, like LOLCat, we feel more compelled to provide stupid examples of how backwater the futurists really are when they offer up trends that anyone who bothered to poke their head over their cubicle wall ten years ago would have noticed. However, we’ll leave their humiliation to LOLCat, who has obviously received very little enjoyment from this series but still found time to point out that LOLCats have been sustainable at least since the first corrugated cardboard box was created, and instead focus on blasting the myths the futurists continue to propagate.

So why do these Rip van Winkles keep pushing upon us trends from yesteryear? Besides the fact that some of them obviously spent the best part of the last few decades napping, it’s probably because they look around, see the laggard organizations still caught in the muck, and assume they can still sell last decade’s snake oil in today’s marketplace. Why do they think Lifecycle TCO is today’s cure?

  • the supply management lifecycle in a typical company has been expanding
    for decades

    and cost models rarely keep up

  • once the margin has been taken out of the unit cost and the landed cost,
    the definition of cost has to expand to realize savings

    but most companies that claim to be looking at TCO are still looking at T-CAP

  • the most out-of-control costs are typically where you’re not looking

    and that’s the way, uh-huh, uh-huh, they* like it

So what does this mean to you?

Cost Models Have to Expand

Right now, most companies that claim to be focussed on Total Cost of Ownership (TCO) are really only focussed on Total Cost of Acquisition and Production (T-CAP). They are merely focussed on landed cost and costs associated with production (waste, etc.) and distribution and aren’t looking up the supply chain to energy, labour, and raw material costs and forward to maintenance, service, warranty and return costs or even further forward to reclamation, recycling, and disposal (related) costs. Every cost has an impact and any sudden increase or decrease can completely change the model.
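To make the T-CAP versus TCO distinction concrete, here is a minimal Python sketch, with entirely hypothetical cost buckets and figures, of how a supplier that wins on landed cost can lose once the cost model is expanded:

```python
# A minimal sketch of expanding a cost model from T-CAP to TCO.
# All cost categories and figures are illustrative, not from the post.

def t_cap(costs):
    """Total Cost of Acquisition and Production: landed cost plus
    production and distribution costs only."""
    return sum(costs[k] for k in ("unit", "freight", "duty",
                                  "production_waste", "distribution"))

def tco(costs):
    """Total Cost of Ownership: T-CAP plus upstream impact costs and
    downstream maintenance, warranty, and end-of-life costs."""
    upstream = sum(costs[k] for k in ("energy_surcharge", "carbon_credits"))
    downstream = sum(costs[k] for k in ("maintenance", "warranty_returns",
                                        "disposal"))
    return t_cap(costs) + upstream + downstream

# Two hypothetical suppliers: B wins on landed cost but loses on TCO.
supplier_a = {"unit": 100.0, "freight": 8.0, "duty": 2.0,
              "production_waste": 1.0, "distribution": 4.0,
              "energy_surcharge": 1.0, "carbon_credits": 0.5,
              "maintenance": 6.0, "warranty_returns": 2.0, "disposal": 1.5}
supplier_b = {"unit": 95.0, "freight": 9.0, "duty": 2.0,
              "production_waste": 2.0, "distribution": 4.0,
              "energy_surcharge": 4.0, "carbon_credits": 3.0,
              "maintenance": 10.0, "warranty_returns": 5.0, "disposal": 2.0}

print(t_cap(supplier_a), t_cap(supplier_b))  # 115.0 112.0 -> B looks cheaper
print(tco(supplier_a), tco(supplier_b))      # 126.0 136.0 -> A wins on TCO
```

The award flips once the upstream and downstream buckets are included, which is exactly why a sudden increase in any one cost element can completely change the model.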

Out of Control Costs Have to be Found

Wherever they are. Typically, a company heavily focussed on optimization will be focussed on T-CAP but will not look at the expected warranty and return costs associated with switching to a lower-cost supplier, or will not break down the supplier’s quote to realize that the energy costs are much higher than expected and likely to rise rapidly in the region where two potential suppliers are currently located.

Cost Control Measures Have to Be Implemented

Once the cost models are expanded and the out-of-control costs are identified, cost control measures are defined and implemented, and performance against them is tracked. If the out-of-control costs are energy costs, then the organization might decide to implement its own renewable power plant (such as a solar farm or wind farm) for fixed-plant energy requirements. A sourcing project is undertaken to source the plant and then, once it’s up and running, additional projects are undertaken to control maintenance costs, etc. Year-over-year costs are tracked to ensure the expected savings on a production-cost-per-megawatt basis are realized so that the organization will see its ROI within a defined period of time.
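The payback tracking described above boils down to simple arithmetic. A hedged sketch, with made-up numbers for the plant cost, grid rate, and production rate:

```python
# A hedged sketch of tracking ROI on an in-house renewable plant.
# All figures are hypothetical.

def payback_years(capex, grid_cost_per_mwh, own_cost_per_mwh, annual_mwh):
    """Years until realized per-MWh savings recover the capital outlay."""
    annual_savings = (grid_cost_per_mwh - own_cost_per_mwh) * annual_mwh
    if annual_savings <= 0:
        return float("inf")  # no savings -> the investment never pays back
    return capex / annual_savings

# e.g. a $12M solar farm, grid power at $90/MWh, own production at $30/MWh,
# and 40,000 MWh of fixed-plant demand per year:
years = payback_years(12_000_000, 90, 30, 40_000)
print(round(years, 1))  # 5.0
```

Tracking actual production cost per MWh year-over-year against the assumed rate is what tells you whether the defined payback period will actually be met.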

Piece of Cake, eh?

So You Think You’ve Mastered Strategic Sourcing Decision Optimization?

Well, the doctor has news for you. You haven’t. In fact, you’re not even close.

You might be applying at least baseline optimization to the majority of your high-dollar and/or strategic categories. You might be in the Hackett Group Top 8%. You might be building Billion Dollar sourcing models. You might be years ahead of your peers. But the reality is that when it comes to true strategic sourcing decision optimization (SSDO) mastery, you’re not even close.

With the exception of the two e-CHAOS vendors, the doctor interacts and/or works with all of the remaining vendors who offer true strategic sourcing decision optimization (which isn’t a hard thing to do as there are only seven*1 [7] vendors in total with a solution that meets the minimum requirements as set forth in the wikipaper), knows the depth of the projects these vendors have supported, and can say with confidence that the best of the best have barely mastered the basics of optimization 2.0. Barely. And optimization 3.0 is on the way.

[  As a history lesson, optimization 1.0 was circa 2000 when the first solutions that minimally met the four basic requirements of solid mathematical foundations (MILP), true cost modelling, constraint analysis, and what if? capability hit the market. Most of these were basic, supporting only supplier – product – customer DC mappings; unit and transportation costs and then one level of discounts or rebates; capacity, allocation, and min/max supplier selection constraints for very basic risk mitigation; and manually created what-if scenarios. In addition, maximum model size was limited, large models took hours to days to solve, and setting up and importing all of the data from multiple bid sheets across multiple spreadsheets often took days.

Then, circa 2005 to 2007, as a result of a considerable increase in computing power, algorithmic improvements, and domain knowledge, a few solutions started to improve rapidly and we hit the beginnings of optimization 2.0. The platforms evolved to make full use of the theory of logical variables in the MILP solvers; they supported multiple supplier locations, product substitutions, and differential costs by lane*2; they let a buyer define costs by way of a cost model with as few or as many factors as desired, at multiple tiers and with volume- or spend-based discounts; they offered a full plethora of allocation, capacity, and risk mitigation constraints that could define required and desired splits, address risk mitigation, or mandate awards to a set of products, suppliers, and/or regions, etc.; and they could automatically generate what-if scenarios based on automatically adding or dropping previously defined or newly defined constraints, historical versus current pricing models, and other factors. In addition, import and export was streamlined from RFX, Auction, spreadsheet templates, and ERP systems (where standard transportation and overhead pricing was kept). State-of-the-art report generators and OLAP capability were integrated so that not only could you generate scenario reports and comparative reports across scenarios, but you could also dive in to see what was driving the savings against the current sourcing strategy and, more importantly, what was driving the costs compared to the unconstrained baseline scenario (and zero in on which business rules might be too costly).  ]
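The constraint-analysis and baseline-comparison ideas above can be sketched in a few lines. This is an illustrative toy, not any vendor’s solver: a brute-force enumeration stands in for the MILP, and all supplier names, costs, and capacities are made up.

```python
# Toy constraint analysis: compare a risk-mitigated award scenario
# against the unconstrained baseline. Pure-Python enumeration stands
# in for a real MILP solver; all figures are hypothetical.
from itertools import product

DEMAND = 100  # units to award
suppliers = {"S1": {"cost": 10.0, "cap": 100},
             "S2": {"cost": 11.0, "cap": 60},
             "S3": {"cost": 12.5, "cap": 80}}

def best_award(min_suppliers=1, max_share=1.0):
    """Cheapest feasible split of DEMAND (in steps of 10 units) subject to
    capacity, a minimum supplier count, and a maximum share per supplier."""
    names = list(suppliers)
    best = None
    for alloc in product(range(0, DEMAND + 1, 10), repeat=len(names)):
        if sum(alloc) != DEMAND:
            continue
        award = dict(zip(names, alloc))
        if any(q > suppliers[s]["cap"] for s, q in award.items()):
            continue  # capacity constraint
        if sum(q > 0 for q in alloc) < min_suppliers:
            continue  # minimum supplier-count (risk mitigation) constraint
        if any(q > max_share * DEMAND for q in alloc):
            continue  # maximum-share (allocation split) constraint
        cost = sum(q * suppliers[s]["cost"] for s, q in award.items())
        if best is None or cost < best[0]:
            best = (cost, award)
    return best

unconstrained = best_award()                            # baseline scenario
mitigated = best_award(min_suppliers=2, max_share=0.7)  # risk-mitigation rules
print(unconstrained[0], mitigated[0])
print("cost of the business rules:", mitigated[0] - unconstrained[0])
```

Comparing the constrained scenario against the unconstrained baseline is exactly the "what is this business rule costing us" question the scenario reporting described above is meant to answer.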

The reality is that the average best-in-class organization is only doing T-CAP strategic sourcing decision optimization, and is still far from achieving TCO. Basically, when the average organizations build their cost models, they are focussed on the costs of acquisition and production (and distribution) of the goods they are buying. They’re not incorporating downstream maintenance, service, and return costs and not considering end-of-life reclamation, recycling, and disposal. Nor are they breaking the acquisition cost models down to determine the upstream impact costs associated with the supplier or production method. For example, if the supplier runs its factory on dirty coal and the company has pledged carbon neutrality and has to buy carbon credits to achieve that goal, or the working conditions in the factory are unhealthy (and the factory would be closed down if it were in America), which adds more fuel to the fire of the CSR activists and costs your organization brand value, these costs also need to be considered. As a result, the organization is capping its potential return from optimization. Not only is the organization not achieving TCO, but it’s nowhere close to achieving TVM (Total Value Management), which is what it has to achieve if it wants to realize true optimization 2.0 mastery and move on to optimization 3.0.

And the average organization is not even thinking about the more advanced opportunities that the next generation 3.0 capabilities will enable. Right now, the leading strategic sourcing decision optimization vendors are integrating new capabilities in the new versions of their products that are currently in development, with some basic 3.0 capabilities already released! The convergence of big data, advanced analytics, and decision optimization into a single platform is enabling a host of new capabilities that the average organization has not yet envisioned, including the 6 next-generation advanced sourcing optimization capabilities outlined in Sourcing Innovation’s new white paper on Optimization, What Comes Next (registration required), sponsored by Trade Extensions (which is one of the vendors working hard to give you tomorrow’s optimization solution today).

Companies that master the 6 next-generation advanced sourcing optimization capabilities described in Optimization, What Comes Next (registration required), will not only be the first to master optimization 2.0, but will be the first to enter the world of optimization 3.0 and find savings and cost avoidance opportunities that they never even knew existed.

Are you ready to crank the amp and take it to 11? If so, download Optimization, What Comes Next (registration required) today!

*1 search the SI archives if you don’t know who the seven are

*2 the MindFlow Model, which was recognized as the first SSDO model to support multi-line item optimization back in 2000, actually supported this level of modelling back in 2000, an average of five-plus years before the majority of SSDO solution providers did

Optimization, Do You Know What Comes Next?

Admit it. You don’t. You have no clue. You’re still struggling with the basics of model building, data collection, constraint definition, scenario comparison, and, most importantly, when you should use optimization (always, but not necessarily to make the award decision). It’s math, it’s hard, you weren’t trained, and you’re already struggling to keep up with the increasing demands placed upon your Supply Management organization by the C-Suite, stakeholders, and suppliers.

But you should. Costs, including labour and energy, are rising across the board. Inflation is about to make its way back with a vengeance. Markets are uncertain and in some places unstable. It’s getting to the point where all that optimization on a category optimized in the last three years is going to do is limit cost increases to inflation. Unless the model is expanded, the category is redefined, or an innovative approach is taken, there will be no more savings to be found.

As a result, you have to take your optimization to the next level. You have to go from T-CAP (Cost of Acquisition and Production/Utilization/Distribution) to true TCO (Total Cost of Ownership) as you work your way to TVM (Total Value Management). You have to learn how to take your modelling and analysis skills to the next level. And you have to be innovative in your application of optimization.

And what does that innovation look like? To find out, download Sourcing Innovation’s new Illumination white-paper on Optimization, What Comes Next (registration required). Sponsored by Trade Extensions, this white paper presents six ways you can merge big data, analysis, and optimization to take strategic sourcing decision optimization (SSDO) to the next level and find savings and cost avoidance opportunities you would never have imagined even a year ago.

The reality is that even if you’re Hackett Group top 8%, regularly applying SSDO to high-dollar and/or strategic categories, and building nine-figure, and even ten-figure, sourcing events on optimization, you’ve barely mastered the basics of optimization 2.0. Since the doctor regularly interacts (and sometimes works) with five of the seven providers of SSDO technology (and the two e-CHAOS hold-outs have not upgraded their platforms substantially in years), including the true market leaders, he knows the extent of the projects they have supported and knows that even the most advanced organizations are just scratching the surface of optimization 2.0. But these providers are now taking their products to the next level and have started releasing the foundations of 3.0 capability, with lots more to come over the next few years. The organizations that adopt, and master, these 3.0 capabilities first will be decades ahead of their peers in supply mastery. Decades.

So, if you want to be one of this decade’s Supply Management leaders, download Sourcing Innovation’s new white-paper on Optimization, What Comes Next (registration required) and start preparing for the future of optimization today. It will be worth your while and when you start applying these techniques, you won’t be disappointed.

When it Comes to an Event, How Big is Too Big?

1 Category?
5 Categories?
25 Categories?

10 Commodities?
50 Commodities?
100 Commodities?

50 Lanes?
500 Lanes?
5000 Lanes?

It depends. How much can you handle at one time?

If you’re sourcing with optimization, the bigger the better. Tackle as many categories at a time as overlap with at least one other category, especially if you are dealing with physical goods that are coming from common locations. The way you save on logistics costs is to minimize the number of trucks, which you do by combining as many shipments as possible so as to minimize the number of LTL shipments.
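The truck-minimization argument can be sketched as follows; the categories, origins, pallet counts, and truck capacity are all hypothetical:

```python
# A toy sketch of the consolidation argument: combining categories that
# ship from the same origin cuts the number of part-filled (LTL) trucks.
# Volumes and the truck size are hypothetical.
from math import ceil

TRUCK_CAPACITY = 26  # pallets per full truckload

# pallets per category per origin
shipments = {"electronics": {"Shenzhen": 14, "Shanghai": 9},
             "appliances":  {"Shenzhen": 10, "Shanghai": 8},
             "furniture":   {"Shenzhen": 5}}

def trucks_per_category(shipments):
    """Source each category alone: one set of trucks per category per origin."""
    return sum(ceil(p / TRUCK_CAPACITY)
               for cat in shipments.values() for p in cat.values())

def trucks_combined(shipments):
    """Source categories together: pool pallets by origin before loading."""
    by_origin = {}
    for cat in shipments.values():
        for origin, pallets in cat.items():
            by_origin[origin] = by_origin.get(origin, 0) + pallets
    return sum(ceil(p / TRUCK_CAPACITY) for p in by_origin.values())

print(trucks_per_category(shipments), trucks_combined(shipments))  # 5 3
```

Five part-filled trucks collapse to three fuller ones, and the bigger the overlapping event, the more such consolidations the optimizer can find.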

The same logic applies if you are dealing with multiple service categories that can sometimes be provided by the same contract or temporary labour agencies. For example, engineers and software developers are often offered by the same specialist agencies; warehouse, janitorial, security, and other unskilled labour are often obtainable from the same agency; and certain other agencies specialize in legal, accounting, and similar trade professions.

Tackling them all as one mega-project doesn’t mean that you have to negotiate with them all simultaneously or that you have to create massive RFXs, Auctions, or bid-sheets. There’s nothing stopping you from organizing your sourcing events so that each category is being sourced simultaneously by a different team member, co-ordinated so that all of the bids come in simultaneously for a round of global optimizations to determine if there is any overlap in transportation or supply base that would suggest a (temporary) combination of categories or a splitting of transportation into a separate project.
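The co-ordination step above, checking incoming bids for supply-base overlap, can be sketched as follows, with made-up category and supplier names:

```python
# A small sketch of the co-ordination step: once all bids are in, check
# which concurrently sourced categories share bidding suppliers, flagging
# candidates for a (temporary) combined optimization run.
# Category and supplier names are hypothetical.

bids = {"fasteners": {"AcmeCo", "BoltWorks", "GlobalFix"},
        "stampings": {"AcmeCo", "MetalPro"},
        "packaging": {"BoxKing", "WrapIt"}}

def overlapping_pairs(bids):
    """Return category pairs with at least one common bidding supplier."""
    cats = sorted(bids)
    return [(a, b, sorted(bids[a] & bids[b]))
            for i, a in enumerate(cats) for b in cats[i + 1:]
            if bids[a] & bids[b]]

for a, b, shared in overlapping_pairs(bids):
    print(f"consider combining {a} + {b}: shared suppliers {shared}")
```

The same intersection test works on lanes instead of suppliers, which is how you would spot transportation that should be split out into its own project.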

Optimization isn’t just doing the best job you can on the event, it’s defining the right event in the first place. Sometimes the best way to do this is to look at a number of categories simultaneously when they are each in the middle of a sourcing project and see if the definition and split really is the right one. If the mega-optimization suggests something different, re-define the categories and events and continue the right way.

Just make sure that, when the event notice goes out, you inform suppliers/carriers that the bids will be multi-round and that the scope of the transportation requirements might be increased or decreased after initial bid analysis and further category definition. In the services case, make clear that this is a preliminary request for information and rate cards, and that the suppliers should inform you of other services they can offer (and their standard rates), as you may, if the option exists, expand your requests during the final RFQ/negotiation phase. You want to be above board during the entire process.

In other words, a project is only too big when it exceeds the ability of your current team to manage it simultaneously. If the numbers involved make someone fidgety, then it’s time they shape up or you find someone with a stronger backbone. If the tool you are using says it’s too big, then it’s time to get a new tool. It’s not too big if it doesn’t exceed your current potential, which for many leading sourcing organizations is well beyond what they think it is (as a result of sourcing providers with limited skill sets assuming that just because they can’t handle it, their client can’t handle it). There are teams out there that can handle Billion Dollar sourcing projects and tools that support them. That’s about as big as it gets.

So, as Big Data Promoters like to say, Think Big!

Optimize, don’t Compromise!

Continuing on our theme of analysis and optimization: every e-Sourcing suite on the market will support your organization in its sourcing activities, but not every product will allow your organization to optimize its sourcing activities.

Optimization requires advanced sourcing capability, and advanced sourcing requires the ability to analyze data, not just collect and report on data.

This means that, at the very least, you will require:

  • true spend analysis,
  • true category analysis,
  • true cost-based bidding, and/or
  • true bid optimization.

Without at least one of these capabilities, you’ll never optimize your spend. So don’t even bother to try without them.