Category Archives: Decision Optimization

The 6 Days of X-asperation: Day 4 – Questions to ask your Decision Optimization Vendor

Just like we did in the X-emplification series, we’re going to continue with Decision Optimization as we tackle the generic questions that you should be asking every vendor, and the types of answers you should be expecting.

1. What do I have to do to get a good handle on how to make effective use of this technology, and for an organization of my size, how long is it going to take?

The first thing you have to do is get a good understanding of what strategic sourcing decision optimization is, what it can do for you, and, most importantly, what data you’re going to need. I strongly suggest you read the wiki-paper authored by yours truly if you haven’t already. The wiki-paper will tell you:

  • The requirements of a true decision optimization system
    which will ensure you don’t get taken in by a cheap imitation decision support system
  • The basic capabilities a decision optimization system should have to be truly useful for strategic sourcing decision optimization
    which will ensure that the decision optimization system you select is most appropriate for the problems your team faces as sourcing professionals
  • The basic requirements for success when using a decision optimization system
    which include good forecasts, appropriate cost breakdowns, and knowledge of the required, vs. desired, business rules
  • Ten strategies for success
    which will help you get the most out of every strategic sourcing decision optimization project

Then you need to figure out what your potential return is from strategic sourcing decision optimization. Although every project will benefit, the reality is that decision optimization is still relatively expensive technology to buy, and the amount of work involved in these projects can be considerably more than an auction. This sometimes requires the time of senior professionals, which can add up. If you’re a small or mid-size company whose largest sourcing project is 10M, and you only expect to save 3% (primarily on reduced freight and inventory by way of better allocations, because it’s a buyer’s market and auctions work very well), then, considering that the cost of buying, maintaining, and using a good solution starts in the mid six figures a year, it’s probably not for you. However, if you’re a Global 3000 company with a dozen or more sourcing projects in the 50M to 500M+ range, a possibility of savings of 5% to 15% per project, and total potential savings in the 50M to 150M range in the first two to three years, then you should identify the right optimization system for you and start using it as soon as possible.

Once you’ve decided optimization is a useful technology, and one you should be using, you need to review the categories that you will be sourcing in the next 12 months, and then rank them by dollar amount and complexity. The projects that appear in the top half of both lists will be good candidates for strategic sourcing decision optimization. (Note that if your annual spend is in excess of 1B, the doctor can tell you right now that properly applied decision optimization technology will generate ROI for you.)
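As a sketch of that shortlisting step, here’s how the ranking might look in code – the categories, spends, and complexity scores below are purely illustrative:

```python
# Hypothetical shortlisting of optimization candidates: rank the next
# 12 months of sourcing categories by spend and by complexity, and keep
# the categories that fall in the top half of BOTH rankings.
# All category names and scores are invented for illustration.

categories = {
    "Ocean Freight":   {"spend_m": 120, "complexity": 9},
    "Packaging":       {"spend_m": 45,  "complexity": 7},
    "Office Supplies": {"spend_m": 5,   "complexity": 2},
    "Resins":          {"spend_m": 80,  "complexity": 4},
    "Temp Labor":      {"spend_m": 30,  "complexity": 8},
    "Janitorial":      {"spend_m": 8,   "complexity": 3},
}

def top_half(key):
    ranked = sorted(categories, key=lambda c: categories[c][key], reverse=True)
    return set(ranked[: len(ranked) // 2])

candidates = top_half("spend_m") & top_half("complexity")
print(sorted(candidates))
```

High-spend but simple categories stay with e-Auctions; high-spend, high-complexity categories are where the optimizer earns its keep.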

Armed with the potential projects, you need to devise appropriate cost breakdowns for each of the goods and services under consideration, identify other relevant non-cost and qualitative factors, and prepare the appropriate surveys and RFPs/RFQs so that you can get projects underway relatively quickly. Optimization only achieves significant returns if done right – and this requires that you get accurate bids and cost breakdowns where the cost of the good or service is separated from the freight cost, and any relevant costs such as duties, differential costs of waste and returns, and discounts are taken into account. The extra preparation is definitely worth it when you consider that studies done by Aberdeen in 2005 and 2007 (as referenced in the wiki-paper) found that organizations that employ advanced sourcing methods based on decision optimization save an organization, on average, 12% above and beyond what can be saved in an e-Auction or basic sourcing project.
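To make the cost breakdown concrete, here’s a minimal sketch of a landed-cost calculation with illustrative numbers – the point being that unit price, freight, duty, waste, and discounts must be modeled as separate components, not lumped into one bid price:

```python
# Sketch of a landed-cost breakdown for a single bid line. All figures
# are illustrative. Unit price alone is not the cost -- freight, duty,
# expected waste/returns, and discounts each need their own term so the
# optimizer can reason about them separately.

def landed_cost(units, unit_price, freight_per_unit, duty_rate,
                waste_rate, volume_discount):
    goods = units * unit_price
    freight = units * freight_per_unit
    duty = goods * duty_rate
    # expected cost of scrapped/returned units, priced at landed cost
    waste = (goods + freight + duty) * waste_rate
    return goods + freight + duty + waste - volume_discount

cost = landed_cost(units=10_000, unit_price=4.50, freight_per_unit=0.35,
                   duty_rate=0.05, waste_rate=0.02, volume_discount=1_500)
print(round(cost, 2))
```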

The amount of time it takes really depends on the skill level of the people you have. They have to wrap their minds around decision optimization for strategic sourcing, understand what it really is, how best to use it, and how to approach decision optimization sourcing projects and data collection to get the most bang for their buck. If they are junior buyers, it could easily take them a few months to truly grasp the basic concepts, and chances are they will never be able to take full advantage of the tool until you upgrade their sourcing skill level (through an appropriate certification program such as the ISM CPSM or Next Level Purchasing (now the Certitrek NLPA) SPSM, for example). If they are senior buyers, they should be able to grasp the basics and re-design the RFXs for the first project within a couple of weeks.

2a. How much functionality is my organization realistically going to be using in 12 months?

Your senior buyers should be using all of the functionality in the strategic sourcing decision optimization tool within 6 – 12 months. The situation now isn’t as it was when these tools were first hitting the marketplace 7 years ago (at which time the UI alone was so complicated you needed a graduate degree just to understand it). A good tool has a clear UI and good data import capability that allows you to specify the categories and items under consideration, the suppliers who can supply those items, the locations where you need those items (which may be done by way of groupings), and the cost breakdowns (at least by adjusted unit cost and freight cost). The tool should be able to import data from an appropriately formatted Excel worksheet, or from an e-RFX or e-Auction module if it is integrated into a sourcing suite. Furthermore, modern software allows each type of constraint that can be defined to be clearly delineated, and step-by-step wizards exist to help you define the constraint appropriately.

Your intermediate buyers should be able to master the basic constraints in this time-frame, and be well on their way to improving their sourcing decisions and relative skill levels.

Even though the tools have improved significantly, strategic sourcing decision optimization, by its very nature, requires a more advanced skill level than other tools in the e-Sourcing suite and your junior buyers may not be up to the challenge. You will need to provide them the training they need to upgrade their sourcing competence to an intermediate level before you can expect them to master the tool, even though your technologically savvy junior buyers will be able to get a reasonable grip on the basics of setting up a scenario and defining simple constraints in a rather short time frame. You have to remember that the use of strategic sourcing decision optimization is advanced sourcing, and this requires more than just a friendly tool – it requires buyer skill.

2b. How much functionality do I really need?

When it comes to model development and solver capability, as much as you can get. This is still a developing technology, and even though you can achieve considerable savings above and beyond an e-Auction just with what’s out there today, there’s still a long way to go.

When it comes to add-ons, it depends on what the company is offering you as an add-on. If it’s services, then, considering you should have guidance on your first few projects, you should strongly consider them if they’re reasonably priced. If it’s custom integration services for your RFX or e-Auction platform, then, assuming these are the right RFX and e-Auction platforms to be using and the integration is priced competitively, this is also worth considering. However, if the add-on is an enhanced solver module, I’d ask why this isn’t part of the base offering (as it should be).

One thing that is important to note: if it’s not easy to load the data into the tool, it likely won’t be used at all. Thus, it’s important to make sure that not only is the import or ETL tool included as part of the basic functionality, but that the import functionality is also easy to use.

2c. And how does this functionality solve my #1 pain today, which is X?

If you’re looking at strategic sourcing decision optimization, chances are you are seeing diminishing savings from your sourcing projects and need a way to improve returns. What you’re looking for here is an answer not based on technical competence, but on sourcing experience. You want the vendor to tell you that their product has been applied successfully by companies in a number of verticals on a number of categories and that, based upon their experience in and around your industry, they expect that you will be able to save in the 5% to 15% range on a well-defined set of categories. You want to know that they have the experience to help you select the right categories to start with that will help you get some quick wins and support for the new technology.

3. How much training is my team going to require to effectively use the software? How long is it going to take them to absorb this training?

It should not take more than a week to get your intermediate and senior buyers up to speed on how to use the tool. However, the training is not really going to be absorbed at a deep level until your professionals apply the tool on a few projects, which should be done under the guidance of an experienced professional who can ensure that your team is tackling the project in an optimal manner. Thus, it will probably take a few months, at the minimum, for your senior buyers to truly master the tool.

4. How much is this software REALLY going to cost me in the first year and each subsequent year?

Although real strategic sourcing decision optimization has been around for almost seven years, it only became usable in the last few, and due to its relatively low adoption rate to date, and continued development, it’s still a reasonably new offering. You should expect to be paying in the mid six figures per year, depending on the power of the solution and your hardware and solver license requirements. (Most platforms are built on top of industry-leading solvers, such as ILOG’s CPLEX, which can run 25K to 50K per license. Plus, you need high-end servers if you want to build large models and have them solve relatively quickly. Thus, even an on-demand offering is going to be pricey if you want dedicated solvers and hardware, which you could need if you have large models or intend to use the platform significantly.)

Furthermore, since this technology is still emerging (like real spend analysis), updates should be regular and maintenance will be higher than for e-RFX and e-Auction, so you are probably looking at maintenance (for behind the firewall or ASP solutions) in the 20% range.

Installation should not be time consuming, and should not require more than a few days of consulting. (On-demand should be free if you’re using a basic service that uses shared optimization resources, but if you are asking for dedicated resources, you should expect to pay for some consulting time as a dedicated resource will need to be deployed to get this done.)

5. You say you care about your customers and that you are going to provide great service. Prove it!

Ask for references. Talk to them. If the vendor has an upcoming user meeting or conference, ask to go to it. Ask for examples of results their customers have achieved on the platforms recently, and how they can help you achieve the same. But most importantly, ask them if they’ll help you with your initial pilot project at a reasonable consulting rate and see what kind of results they deliver – with their tool.

6. Can I take it for a test drive or a short term lease?

Considering that this software is usually either web-based or a fat client that runs on your desktop, there shouldn’t be any problem for your provider to set you up with a single instance, or copy, for you to use on a pilot project – which they should be comfortable with you undertaking at a low consulting rate – equal to the cost of the consultant that guides you through the pilot project.

7. Can I buy it or implement it in pieces?

Just like you should ultimately buy the entire e-RFx or e-Auction tool functionality up-front, you should buy the entire functionality of the strategic sourcing decision optimization tool up-front as well. However, I’d hold off on buying dedicated hardware and solver resources until you’re ramped up and ready to maximize usage of such resources, as a single dedicated high-end machine with a dedicated CPLEX license will cost you (well) over 50K a year in additional cost. If you’re maxing out your solver, dedicated resources can be worth it when you consider the ROI that accompanies strategic sourcing decision optimization. But if the hardware is just sitting there, that money is better spent on consulting services to help you get up to speed on how to maximize use of the tool.

The 12 Days of X-emplification: Day 11 – Supply Chain Optimization

On Day 2 we talked about strategic sourcing decision optimization, the technology you need to make the right buy given the myriad of constraints you have to adhere to and the large number of costs and bids you need to take into account. Today we’re going to talk about supply chain optimization – the process of optimizing your supply chain, or distribution network, to minimize costs and maximize value.

Even though the only way to truly get the optimal buy every time is to use the optimal supply chain, the reality is that you can’t transform your supply chain overnight for every bid. The realities are that it takes time to acquire, lease, or dispose of distribution centers and warehouses, that you have contracts in place with suppliers and carriers for anywhere from three months to three years in a typical organization, and that changing global distribution patterns requires time to research the regulatory, documentation, and taxation requirements of different countries and trade zones. Thus, when it comes to strategic sourcing, the best you can do is optimize the buy within the supply chain you have available to you today. However, if you can improve the supply chain, then you can reduce your costs and save even more across all of your buys.

Supply chain optimization is something you should do on a regular basis. Whereas in the past, experts would say that it is something you should do only once every five, seven, or ten years – today it is something you should do every year! Today’s optimization solutions are a lot more powerful than they were ten years ago and allow you to build much more sophisticated models, which are now usually solved in hours compared to the weeks that were once required for models of this magnitude.

Even though it probably doesn’t make sense to buy and sell manufacturing and distribution center assets every year, there’s nothing stopping you from modeling the cost associated with such a sale, or modeling the cost of breaking or failing to renew a lease, of each asset you have if new options present themselves, such as alternative low-cost distribution centers or the possibility to sell a manufacturing center to an outsourced contract manufacturer who might be able to manage it more cost effectively. Today’s solutions can model all of the costs associated with acquiring, running, and disposing of an asset in your global distribution network, and can help you truly identify what the optimal network is for you at any given time for any given period of operation. (Thus, every year you can redo the analysis and assume that the network is only going to remain stable for the next year.) You can also tell a good supply chain network optimization solution that certain aspects of the network aren’t allowed to change and that certain aspects of the network must change and have it tell you whether or not your current network is optimal or if you should consider making some changes.

So how do you identify the right supply chain modeling and optimization solution? As with any other technology, you ask the right questions. The following questions should be enough to get you started and help you separate the real solutions with the power you need from the imposters.

1. Can the solution model your supply chain as is?

This is a question you need a resounding yes to. How do you know how much a potential network redesign is going to save you if you don’t even know how much your current network design is costing you? This brings us to …

2. Can it derive a cost baseline?

Once you’ve modeled your current network, the solution should be able to run the model and tell you how much your network should be costing you. (If your current network is actually costing you significantly more, then either you have some inefficiencies in your processes to work out or you have not accurately modeled your network and need to revise or expand your model.)

3. Can the solution support the construction of a model depicting a desired state?

If you have a solution in mind, you should be able to construct that solution and derive a cost baseline for it. Similarly, you should be able to define your own modification of a suggested network design and derive a cost baseline for that modification. After all, it’s not about the lowest cost solution, it’s about the highest value solution – and that’s not necessarily the solution with the lowest cost today, but the network design with the lowest expected cost, and highest value, over the expected lifetime of the network.

4. Can it derive an estimated cost of any model you specify under a projected range of activity?

The reality is that any given solution is only optimal for the specific (set of) demand value(s) and the specific (set of) cost(s) that the model is defined on. However, you’re optimizing your network for a future period of time, where demands are only forecasts that could change. Thus, you want a solution that also has simulation capabilities and the ability to run multiple models under multiple demand scenarios and cost differentials to allow you to come up with a network plan that is robust and most likely to save you money over the range of scenarios that are most likely to occur.
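A minimal sketch of that robustness idea, assuming a made-up two-design, three-scenario comparison where an expected-cost calculation stands in for a full simulation run:

```python
# Illustrative robustness check: evaluate two candidate network designs
# under three demand scenarios and pick the lowest EXPECTED cost, rather
# than the winner in the base-case forecast alone. All figures invented.

designs = {
    # annual fixed cost, variable cost per unit shipped
    "two-DC network":   {"fixed": 4_000_000, "var": 11.0},
    "three-DC network": {"fixed": 5_500_000, "var": 9.0},
}
scenarios = {800_000: 0.25, 1_000_000: 0.50, 1_200_000: 0.25}  # demand -> probability

def expected_cost(design):
    return sum(p * (design["fixed"] + design["var"] * demand)
               for demand, p in scenarios.items())

best = min(designs, key=lambda name: expected_cost(designs[name]))
print(best, expected_cost(designs[best]))
```

A real tool would run full network models per scenario, but the principle is the same: optimize over the range of likely futures, not a single forecast.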

5. Can it allow you to drill down into the expected cost differential between two models and determine why?

It’s not enough to know that one network design is expected to cost 2M more than another, you also need to know why, especially if the more expensive network design is the one you’d prefer. If you know that most of the costs are associated with lease payments, then you know that if you could negotiate a lower lease price, you could end up with a network design that you like and that is only slightly more than the lowest cost solution. If such a design also has lower risks, then it has a higher value and you can choose it.

6. Can it help you optimize your supply chain improvement investments?

Converting from one network design to another will incur a lot of upfront costs associated with asset acquisition, lease, and disposal, as well as penalties if you have contracts in place that you need to back out of early. These upfront costs need to be covered somehow, and if you only have a fixed amount of capital available for supply chain improvements, you want the model to be able to take that into account and the solution to provide you with different, near-optimal, improvement possibilities that are within your budget today.

7. Can it model the impact of fixed asset disposal or cost reduction on projected service levels? inventories? greening?

When optimizing your network, it’s not just about cost and risk, it’s also about service optimization, inventory optimization, supply chain greening, and a slew of other initiatives. It’s important that such a solution not only allow you to specify all of your constraints, but allow you to calculate whether or not you’re trading service level or inventory risk or carbon credits for that cost reduction.

8. Can the solution support sensitivity analysis?

Building on the last question, if the system tells you a certain network design is likely to reduce your projected service levels by 1%, you want to know how much money is required to bring that down to any threshold between 0 and 1%. Maybe you only have to sacrifice 25% of your maximum savings opportunity to achieve a service level decrease of only 0.1%. That could be a good trade-off – a 0.1% decrease in projected service levels is much better than a 1% service level decrease, especially when it costs you only 25% of your maximum savings potential to achieve a projected service decrease that’s ten times better than the projected service decrease that you would be stuck with if you went with the greedy solution.
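The arithmetic of that trade-off is simple enough to sketch, assuming a hypothetical 2M maximum savings opportunity:

```python
# Back-of-the-envelope version of the trade-off described above.
# The 2M maximum savings figure is an assumption for illustration.

max_savings = 2_000_000
greedy_service_drop = 0.01   # 1% projected service drop at full savings
tuned_service_drop = 0.001   # 0.1% drop if we give back 25% of savings
savings_given_up = 0.25 * max_savings

retained_savings = max_savings - savings_given_up
improvement_factor = greedy_service_drop / tuned_service_drop
print(retained_savings, improvement_factor)
```

Keeping 1.5M of a 2M opportunity while cutting the service hit by a factor of ten is the kind of answer sensitivity analysis should surface automatically.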

The 12 Days of X-emplification: Day 3 – Optimization

As I highlighted in Questions to Ask your Optimization Vendor, not all optimization vendors are equal … and, more importantly, not all vendors that claim to have decision optimization even have it! Thus, not only is it important to know what to look for when searching for a true strategic sourcing decision optimization solution, it’s also vital to know what to ask and what answer you want to hear.

In this post I’m going to cover five key questions that you should ask of every vendor you are considering. Some of these overlap the questions I x-emplified in my previous post (which you should re-read), and a few of them are new. Of all of the topics I am covering, this is probably one of the least understood – and since this is one of the few technologies with the capability to allow you to reduce your spend year-over-year when properly applied – this situation has to change.

1. Does the application satisfy the four pillars of strategic sourcing decision optimization?

As outlined in the Strategic Sourcing Decision Optimization wiki-paper on the e-Sourcing Wiki [WayBackMachine], the four pillars of strategic sourcing decision optimization are:

  • Sound & Complete Mathematical Foundations
    such as simplex algorithms and branch-and-bound;
    many simulation and heuristic algorithms do not guarantee analysis of every possible solution (sub)space given enough time, and, thus, are not complete in mathematical terms
  • True Cost Modeling
    many bidders bid tiered bids, discounts, and fixed cost components – the model must be capable of supporting each of these bid types
  • Sophisticated Constraint Analysis
    At a minimum, the model must be able to support reasonably generic and flexible constraints in each of the following four categories

    • Capacity / Limit
      allowing an award of 200K units to a supplier who can only supply 100K units does not make for a valid model
    • Basic Allocation
      you should be able to specify that a supplier receives a certain amount of the business, and that business is split between two or more suppliers in feasible percentage ranges
    • Risk Mitigation
      let’s face it – supply chains today are all about risk management, and you should be able to force multiple suppliers, geographies, lanes, etc. to mitigate those risks without specifying specific suppliers, geographies, lanes, etc. to take advantage of the full power of decision optimization
    • Qualitative
      A good model considers quality, defect rates, waste, on-time delivery, etc.
  • What-if Capability
    The strength of decision optimization lies in what-if analysis. Keep reading.
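To make the four constraint categories concrete, here’s a toy brute-force award model (a real tool would use a MILP solver built on simplex and branch-and-bound; all suppliers, prices, and thresholds below are invented):

```python
# Toy two-supplier, single-item award illustrating the four constraint
# categories: capacity/limit, basic allocation, risk mitigation, and
# qualitative. Brute force over 10K-unit split increments stands in for
# a real solver. Every number here is made up.

demand = 100_000
suppliers = {
    "A": {"unit_cost": 10.0, "capacity": 120_000, "quality": 0.98},
    "B": {"unit_cost": 11.0, "capacity": 60_000,  "quality": 0.95},
}

def feasible(award):
    if sum(award.values()) != demand:
        return False
    for s, units in award.items():
        if units > suppliers[s]["capacity"]:              # capacity / limit
            return False
    if any(0 < u < 0.20 * demand for u in award.values()):
        return False                                      # allocation: min 20% share if used
    if sum(1 for u in award.values() if u > 0) < 2:       # risk mitigation: dual supply
        return False
    avg_q = sum(suppliers[s]["quality"] * u for s, u in award.items()) / demand
    return avg_q >= 0.96                                  # qualitative: quality floor

def cost(award):
    return sum(suppliers[s]["unit_cost"] * u for s, u in award.items())

best = min(
    ({"A": a, "B": demand - a} for a in range(0, demand + 1, 10_000)),
    key=lambda aw: cost(aw) if feasible(aw) else float("inf"),
)
print(best, cost(best))
```

Note how the dual-supply and minimum-share rules push business to the more expensive supplier; that gap is exactly the “cost of a constraint” that what-if analysis quantifies.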

2. Does the application support the creation and comparison of multiple what-if scenarios?

The true power of decision optimization does not lie in the ability to find a solution to one model, but in the ability to create different models that represent different eventualities (as this will allow you to home in on a robust and realistic solution), to create different models off a base model plus or minus one or more constraints (as this will help you figure out how much a business rule or network design constraint is really costing you), and to create models under different pricing scenarios (to find out what would happen if preferred suppliers decreased prices or increased supply availability).

3. Does the application automatically identify the most constraining and costly constraints?

Let’s face it, not every constraint has a significant impact on the optimal solution, if it even has an impact at all. Restricting the highest cost supplier to 30% of the total award is unnecessary if the supplier is not going to get any award. However, restricting the lowest cost supplier to 20% of the award could be the most restrictive constraint in the scenario, as the supplier would get 80% of the award otherwise.
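The 20% example above can be sketched as code – a greedy cheapest-first allocation stands in for a real solver (and is optimal for this simple linear, capacitated case), and all prices and volumes are invented:

```python
# Toy illustration: restrict the lowest-cost supplier to 20% of the
# award, detect that the rule is binding (the award sits at its bound),
# and compute the exact cost of the rule by re-solving without it.

def optimize(costs, demand, max_units=None):
    """Greedy cheapest-first allocation; max_units caps named suppliers."""
    max_units = max_units or {}
    award, remaining = {}, demand
    for s in sorted(costs, key=costs.get):
        cap = min(remaining, max_units.get(s, remaining))
        award[s] = cap
        remaining -= cap
    return award

costs = {"A": 10.0, "B": 11.0}      # A is the low-cost supplier
demand = 100_000

# business rule: supplier A limited to 20% of the total award
constrained = optimize(costs, demand, {"A": int(0.20 * demand)})
unconstrained = optimize(costs, demand)

def total(award):
    return sum(costs[s] * u for s, u in award.items())

binding = constrained["A"] == int(0.20 * demand)   # A sits exactly at its bound
rule_cost = total(constrained) - total(unconstrained)
print(binding, rule_cost)
```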

The solution should identify, in order of decreasing impact, which constraints are having the greatest effect on the optimal solution and, at the very least, provide a range estimate of how much the constraint is costing you in the model. Determining the constraints that significantly impact a scenario can be done deterministically – they are at their bounds. Determining the constraints that impact a scenario moderately can be done through a deterministic comparison with the optimal solution to the “unconstrained model” (where only supplier capacities, demands, and cost constraints are included). The rest of the constraints then impact the model slightly or not at all. Calculations that take into account the differences between the optimal solution to the model and the optimal solution to the unconstrained model can be used to provide a reasonable estimate of the cost of any particular constraint. Furthermore, an exact cost associated with the removal of any constraint subset can be determined by optimizing the modified model. This brings us to …

4. Does the application support the automatic creation and solution of relaxed and perturbed scenarios?

After the constraints with the most significant impact, particularly from a cost (or risk) perspective, have been identified, it’s only logical that you want to know not only how much they are costing you, but how much a relaxation (as opposed to a removal) of the constraint would save you. For example, if you allocated 30% of an award to a new vendor vs. 20%, what would you save? The reality is that you really want to understand not just the cost, but the “cost per unit” of the constraint. If you have allocation splits, you want to know the effect of minor and moderate changes to the splits. If you have limit constraints, you want to know how much you could save with increased capacity (and, thus, whether the company should be making an investment in new technology or more production lines, or entering into a strategic partnership with a key supplier to lock up more capacity). If you have qualitative constraints, could you save more if vendors increased their quality by 10% across the board (which is equivalent to allowing a 10% decrease in quality level in the model)?

For each constraint type, it’s pretty easy to come up with a standard set of “perturbations” that you would want to analyze using what-if analysis. The application should support standard perturbation templates that you can use to set up an over-lunch (or overnight) run against a well-formed what-if scenario, generating a variance report that tells you not only which constraint relaxations would save you the most, but how much a per-unit perturbation against each constraint would save you – letting you home in on an award allocation that will have the lowest total cost of ownership over the life of the contract, and not just on the day you run the what-if scenario.
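A perturbation sweep over a single split constraint might look like this sketch, with invented prices and an incumbent-share floor relaxed in ten-point steps:

```python
# Sketch of a perturbation template: relax an incumbent-share floor
# ("incumbent must keep at least X% of the award") step by step and
# report savings versus the 80% baseline. Prices and demand are invented.

incumbent_price, challenger_price, demand = 12.0, 10.5, 100_000

def award_cost(min_incumbent_share):
    inc = int(min_incumbent_share * demand)   # incumbent held at its floor
    return incumbent_price * inc + challenger_price * (demand - inc)

baseline = award_cost(0.80)
report = {pct: baseline - award_cost(pct / 100) for pct in (80, 70, 60, 50)}
print(report)
```

The linear structure here makes the “cost per unit” of the constraint obvious (1,500 per percentage point relaxed); real models won’t be linear, which is why the template should re-solve at each step.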

5. Does the application support make-vs-buy and arbitrary product substitution?

If you’re only sourcing indirect, you might not care about make-vs-buy, but you should care about product substitution. Let’s say you’re a major player in the food service industry who caters to the average Joe’s love of pizza. Since few pizzas are made without tomato sauce, you’re going to need a lot of it. But guess what – if you ask a supplier for “sauce” they’re going to say “how much”, “what type of packaging”, and “refrigerated or frozen”? Chances are there are 10 different “products” for you to choose from. And it’s not as easy as just saying “whatever is cheapest” or “I’m standardizing on form factor 27” because “whatever is cheapest” will vary by production plant (some types of packaging will be cheaper in some countries than others, some plants have newer technology for certain kinds of packaging, and packaging weights and volumes determine shipping costs). Furthermore, the availability of products is probably going to vary across locations. The way to get the lowest cost is to allow a supplier to bid ALL products that can meet your needs (and, of course, account for any variances in processing costs through adjustments). Thus, you want your strategic sourcing decision optimization solution to support arbitrary product substitution.
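A toy version of substitution-aware awarding – every SKU, plant, and price here is hypothetical – simply picks, per location, the cheapest of the SKUs that can satisfy the need:

```python
# Toy substitution-aware award: several hypothetical "sauce" SKUs satisfy
# the same need, landed costs vary by plant, and the cheapest SKU differs
# by location -- so suppliers should be allowed to bid every viable SKU.

bids = {  # landed cost per unit, by plant, for each substitutable SKU
    "plant_US": {"sauce_frozen_pail": 1.90, "sauce_canned": 2.10},
    "plant_EU": {"sauce_canned": 1.80, "sauce_aseptic_bag": 1.95},
}

choice = {plant: min(skus, key=skus.get) for plant, skus in bids.items()}
print(choice)
```

Standardizing on a single SKU up front would have forced one plant onto its second-cheapest option; substitution lets the optimizer decide per location.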

However, if you’re sourcing direct, you’re really going to want make-vs-buy analysis. Let’s go back to Mr. Martyn’s automotive seat example. Do you source the seat? Do you source seat components and assemble them? If so, how do you define a “component”? Or do you source all the raw parts and assemble them? The reality is that even in this simple problem, you have over 1000 different options. Thus, creating a model where you source the finished components, creating a model where you source specific sub-components, and creating a model where you source all the parts, and doing a comparison report on their optimal solutions just doesn’t cut it. You might find that you save 10% by sourcing the components and having a third party assemble them, but who’s to say that there isn’t a different configuration of components that would let you save 20%? If you gave your suppliers the flexibility to choose their own components and the third party assemblers the opportunity to bid on the components they think they could assemble into the final product most cost effectively, who knows what innovation they might identify? In a make-vs-buy scenario, you can’t assume you know what the proper subset of models to analyze is. You really need to analyze ALL the options available to you.

the doctor Goes Mental On Optimization Myths

In my last post, I went mental on three of the most dangerous myths out there with respect to e-Auctions. In this post, I’ll attack some of the more brain-dead myths that are out there with respect to optimization. For more great information on decision optimization, I recommend checking out the e-Sourcing Wiki [WayBackMachine] paper (Strategic Sourcing Decision Optimization: The Inefficiency Eliminator) that was originally authored by the doctor, the doctor‘s joint podcast with Next Level Purchasing (Parts I and II and Free Transcript with Editorial Notes brought to you by Sourcing Innovation and Next Level Purchasing [now the Certitrek NLPA]), and the posts in the Decision Optimization category here on this blog.

Myth 1: I need a PhD to use optimization!
It used to be the case that you needed an advanced degree to use the overly complicated command-line tools that represented the first generation of commercially available optimization products, but that hasn’t been true for quite some time. Today, companies like Emptoris (acquired by IBM, sunset in 2017) and Iasta (acquired by Selectica, merged with b-Pack, rebranded Determine, acquired by Corcentric) offer very simple wizard-driven user interfaces that can be driven by any business analyst or sourcing professional, and which are often easier to use than your ERP or BI tools!

It’s literally as simple as selecting the relevant auction or RFx data, defining your demands for each item at each distribution center, identifying invalid freight lanes, specifying supplier capacity restrictions, identifying any business rules (such as dual-supply, 70-30 split) and defining any discounts or “preferred award” valuations (if I buy from Quality Delivered, the joint marketing campaign will be worth $10K). Then you click “optimize” and the optimal award for your scenario is spit out. If you don’t like it, you can copy the scenario, add or remove some constraints, and see what your idea of an optimal award is costing you and make the smart decision.
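Those steps map naturally onto a declarative scenario specification; the structure and field names below are hypothetical, not any vendor’s actual format, but they show how little input a modern tool really needs:

```python
# Hypothetical declarative scenario spec mirroring the wizard steps:
# demands per item per DC, invalid freight lanes, supplier capacities,
# business rules, and preferred-award valuations. Illustrative only.

scenario = {
    "demand": {("widget", "DC-East"): 50_000, ("widget", "DC-West"): 30_000},
    "invalid_lanes": [("SupplierB", "DC-West")],   # lanes that cannot be awarded
    "capacity": {"SupplierA": 60_000, "SupplierB": 60_000},
    "rules": [{"type": "dual_supply"}, {"type": "split", "shares": (0.7, 0.3)}],
    "valuations": [{"supplier": "Quality Delivered", "bonus": 10_000}],
}

total_demand = sum(scenario["demand"].values())
print(total_demand, len(scenario["rules"]))
```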

Myth 2: I can’t afford optimization! It’s too expensive!
Having been involved in this industry for a while, I know that early solutions were very expensive, usually starting in the seven-figure range for an average company. But that’s true for every new generation of technology, software or hardware: it’s costly at first, as companies need to recoup their massive R&D investments, but it gets cheaper over time. Today, a mid-size company can get a true enterprise-quality strategic sourcing decision optimization solution for its sourcing department starting in the quarter-million to half-million range, and this will include a (shared) CPLEX license and (shared) dedicated hardware resources if it uses an on-demand model such as that offered by Iasta.

Myth 3: My problem’s too large / complex for optimization.
Again, the technology has come a long way in the last decade. Not only can massive problems (which couldn’t have been solved in months a decade ago) now be solved in a matter of hours, but the types and quantity of constraints available have greatly increased. If your model is truly humongous, just remember that both CombineNet (acquired by Jaggaer) and Algorhythm regularly solve problems that take millions of variables and hundreds of thousands of equations to specify. As for complexity, even the solution from relative newcomer Iasta (which has adopted a best-to-market strategy) supports the four basic categories of constraints required for true decision optimization (capacity, allocation, risk mitigation, and qualitative), flexible discounts that will let you implement just about any cost structure you can devise, and freight costs for a true total landed cost model (as you can also define adjustments to capture utilization costs).

Myth 4: We’re very sophisticated when it comes to e-Auctions. We’re not going to save enough money with optimization to make it worthwhile.
Wayne Campbell said it best when he said “and perhaps monkeys will fly out of my butt!”. Although it’s theoretically possible that you could be making the perfect buy – every time – without any decision optimization, in reality, the chance that this is true is about as close to 0 as you can get. I’ve NEVER encountered a situation where optimization didn’t provide a company with a cost savings opportunity in any moderately complex bid (and, these days, what bid is not moderately complex?). Furthermore, Aberdeen found, in both its 2005 study AND its 2007 study, that advanced sourcing and negotiation methods, which include decision optimization, save a company an average of 12% beyond what it would save just using e-Auctions. That’s a lot of cash you’re leaving on the table.

There are more myths, but this is a good start – and hopefully enough to convince you to check these solutions out. Decision optimization for strategic sourcing is worth the investment.

Algorhythm and the Optimization Rhythm in India

Recently, I had the pleasure of a couple of conversations with Ajit Singh, the Founder and Director of Algorhythm, a company in Pune, India with significant expertise in Optimization and Supply Chain Modeling. They have their own optimization engine, a set of front-ends for different types of supply chain models that can be used by anyone with modeling skills, and significant experience in helping large global multinationals with significant supply chain network design and optimization problems. Basically, they’re India’s CombineNet, but with one distinction: every model they build, including custom models, can be executed and modified completely by the client through an extension of their easy-to-use Windows-based front end, so you are not tied to their services. In comparison, although CombineNet has done a great job over the past few years of building stand-alone products and interfaces, it’s still often the case that custom models are only available through their services model.

Algorhythm has the capabilities to attack both strategic and tactical supply chain problems from an optimization and simulation perspective. They have sophisticated models for strategic planning, including inventory optimization, distribution network design, and manufacturing network design, and for tactical execution, including production planning, logistics planning, and supply network execution.

They also have specialized solutions for oil, steel, and packaging as well as having a considerable amount of experience in creating models for manufacturers and distributors. Major clients include Unilever (Hindustan Unilever, Unilever Plc. UK, and Unilever China), Thyssen Krupp, Hindustan Petroleum, and Parle Products among dozens of others. Their manufacturing and distribution network design models often save their clients 3-5%. Remember that we’re talking production models here – not sourcing models, so this is actually quite good. In terms of efficiency, their production planning and scheduling models often halve throughput time and inventory carrying requirements – which is also very good. Furthermore, we’re not talking small models here – Parle, for example, ships 50K trucks per year per SKU from hundreds of factories to thousands of wholesalers.

It’s quite easy to build a model in their products, which they call Prorhythm (for production-planning models), Netrhythm (for network-planning models), and Logrhythm (for logistics-planning models), and which run on top of their Xtra Sensory optimization engine. They’ve thought through what the model is, what the core elements that make it up are, what the costs are, and what measures you might want to optimize. Building a model is simply a matter of defining all the relevant entities (factories, lines, outputs, inputs, etc. in production planning), the associated costs (material, labor, overhead, etc.), the measure(s) you want to optimize (cost, throughput, etc.) and their priority / weighting if there are multiple, and the constraints. The system assumes all relationships between related entities are valid unless you specify them as invalid (and permits groupings for easy constraint definition). It also groups constraints in a “constraint file” so you can easily run the same model against different constraint sets. Basically, it’s built to build models the way the doctor would build them.
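The entity-plus-constraint-file idea can be sketched in a few lines of Python. To be clear, this is an invented illustration of the pattern, not Algorhythm’s actual API; the factory names, costs, and caps are all made up, and the greedy “solver” stands in for a real optimization engine:

```python
# Illustrative sketch only (invented names and numbers, not Algorhythm's
# actual API): the model is a set of entities with associated costs, while
# the constraints live in separate "constraint files" so the same model can
# be re-run against different rule sets.
model = {
    "factories": {"F1": {"labor": 2.0, "overhead": 1.0},
                  "F2": {"labor": 1.5, "overhead": 1.8}},
    "material_cost": 4.0,   # per-unit input cost, same everywhere
    "demand": 100,          # units to produce
}

constraint_sets = {                     # per-factory capacity caps
    "baseline": {"F1": 60, "F2": 100},
    "no_F1":    {"F1": 0,  "F2": 100},  # what-if: F1 offline
}

def cheapest_plan(model, caps):
    """Greedy fill from the lowest per-unit-cost factory up; greedy is
    optimal for this one-item toy, but a real engine uses an optimizer."""
    per_unit = {f: model["material_cost"] + c["labor"] + c["overhead"]
                for f, c in model["factories"].items()}
    remaining, plan, cost = model["demand"], {}, 0.0
    for f in sorted(per_unit, key=per_unit.get):
        qty = min(remaining, caps[f])
        plan[f] = qty
        cost += qty * per_unit[f]
        remaining -= qty
    return plan, round(cost, 2)

for name, caps in constraint_sets.items():
    print(name, cheapest_plan(model, caps))
```

The point of keeping the caps in a separate structure is exactly the “constraint file” trick described above: one model, many rule sets, one comparison.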

Since there is no single optimal solution when you’re optimizing against multiple objectives (it’s almost always impossible to precisely normalize each measure to a uniformly distributed 0-1 interval that can then be weighted exactly as you intend), they also support simulation. You can tell the optimizer to construct a set number of models, equally distributed around the desired optimization point, and it will automatically create and run all of the variants, which you can then compare to see how slight changes impact solutions and goals.

It’s a great offering, and the people are quite knowledgeable. If you have a tough optimization problem, be sure to check them out. They might surprise you.