Category Archives: Decision Optimization

Keelvar: Are They Right For You?

As a preamble, in today’s post we’re not going to discuss whether or not optimization is right for you, because the answer is an unqualified “it is”: there is no vertical that cannot benefit from an appropriate optimization solution that supports the right model. If you are a 100M+ company, you should be using optimization. Maybe not on all categories, because you don’t source categories where the return doesn’t exceed the cost of the effort, but certainly on the large and strategic ones.

Instead, as a follow-up to yesterday’s post, we are going to discuss whether or not Keelvar is right for you.

Right now, most of the companies that use optimization are at the high end of the market, with a few leading companies at the high end of the mid-market dabbling in it. Furthermore, most of the optimization solution vendors out there are focussing on this market. As a result, as per a previous post, most of the mid-market is not using optimization because they see it as too costly and too difficult. Seeing this, Keelvar decided that what was needed was a solution focussed entirely on the mid-market and, more specifically, on the lower end of the mid-market. That’s the solution they built.

This has advantages, in that they have a large market they can go after, and disadvantages, in that the simplifications required to make the solution usable by that market limit its flexibility and power for large problems that require complex models and powerful solver capabilities. But since the high-end market already has good solutions, that’s okay. Consider the different needs of the lower mid-market, the higher mid-market, and the global multi-nationals that need almost customized solutions, as well as the different needs of well-staffed and well-educated Supply Management organizations and poorly-staffed organizations that need to augment their solutions with a lot of services. There is still plenty of room in the market for a new entrant, as even the six market segments just defined (three tiers, each with and without services) aren’t adequately covered by the current players.

So if you’re in the lower end of the mid-market and you’re ready to start optimizing your sourcing, you should head on over to Keelvar’s site and check them out. The solution might just work for you, and with event pricing starting in the low five figures and unlimited annual licenses starting in the extremely low six figures, it won’t take long to see an ROI — especially since Keelvar makes optimization affordable on an event basis on categories as low as 500K to 1M, and on an unlimited basis on categories as low as 100K to 250K (because if you have an unlimited license, why not use it on every event — it doesn’t take long for 10K savings to add up!).

Keelvar: Strange Name. Uncommon Results.

In our last post we talked about a new entrant to the Strategic Sourcing Decision Optimization arena that was about to take up the education gauntlet. That new entrant is Keelvar. A spin-out from the 4C research laboratory in the Department of Computer Science at University College Cork that raised 750K Euros in 2012, it was formed as a Software-as-a-Service (SaaS) company to help purchasers establish a balanced and cost-effective outcome between large and small suppliers, which can be critical to indigenous industry. It does this by way of a price-gathering mechanism that supports the communication of creative ways in which waste can be removed, helping government departments and multinational companies reduce costs. (Source: Silicon Republic)

It does this by augmenting its auction and RFX-based technology platform with true strategic sourcing decision optimization technology, but doing so in such a way as to hide the inherent complexity from the average buyer. Unlike many competing solutions on the market, the Keelvar UI is designed using a wizard-based workflow that guides the user in the setup of a combinatorial auction that is then solved using an optimization engine that uses a mix of solver technology developed at the 4C research laboratory and commercial solvers.

The solution walks the user through a simple process that even an average buyer can handle. The user only needs to:

  • Define the event type. The solution comes with a number of pre-configured event types, each of which has appropriate corresponding bid templates.
  • Select the products and services and define the lots (contracts). The bidding sheets can be auto-generated off of these templates.
  • Select the suppliers. Who will be bidding? The system then sends out the appropriate auto-generated bid sheets, tagged to each supplier.
  • Accept the bids. When the bids are returned, the user has to specify which bids are accepted and are to be used in the scenario.
  • Define the constraints. For each pre-configured event type, the system supports a number of pre-defined constraints. These include:
    • Supplier Limits (Risk Mitigation)
      where the buyer specifies a minimum or maximum number of suppliers
    • Award Splits (Allocation and Capacity)
      where the buyer can dictate that a supplier, or the winning suppliers, get an award split, minimum, or maximum award
    • Quality/Delivery Requirements (Qualitative)
      where, if the model is a freight model, the suppliers can specify lead times and the buyer can insist upon a maximum lead time, etc.
  • Run the Scenarios. The solution then runs the unconstrained scenario and the constrained scenario, and outputs a report that summarizes the constrained scenario cost, the number of bidders, and how much more it costs than the unconstrained scenario.
  • Define What-If Scenarios (Optionally). The user can specify constraints to add or drop, run the scenario again, and compare it to the previous (and the unconstrained) scenario.
  • Output a full award report, once the user is happy with a scenario.
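For the mathematically curious, the unconstrained vs. constrained comparison in the scenario step can be sketched in a few lines of Python. (The suppliers, lots, and prices below are invented for illustration, and a brute-force search stands in for the MILP solver a real engine would use.)

```python
from itertools import combinations

# Invented example data: bids[supplier][lot] = bid price for that lot.
bids = {
    "SupplierA": {"Lot1": 100, "Lot2": 110, "Lot3": 120},
    "SupplierB": {"Lot1": 105, "Lot2": 100, "Lot3": 115},
    "SupplierC": {"Lot1": 115, "Lot2": 105, "Lot3": 100},
}
lots = ["Lot1", "Lot2", "Lot3"]

def award(allowed):
    """Award each lot to the cheapest supplier in 'allowed'; return (cost, awards)."""
    awards = {lot: min(allowed, key=lambda s: bids[s][lot]) for lot in lots}
    return sum(bids[awards[lot]][lot] for lot in lots), awards

# Unconstrained scenario: every accepted bidder may win.
unconstrained_cost, _ = award(list(bids))

# Constrained scenario (a Supplier Limit): at most 2 suppliers may win business,
# found here by brute force over the allowed supplier subsets.
constrained_cost, constrained_awards = min(
    (award(list(subset)) for r in (1, 2) for subset in combinations(bids, r)),
    key=lambda result: result[0],
)

print(unconstrained_cost)                     # 300: cost with no supplier limit
print(constrained_cost)                       # 305: cost with at most 2 winners
print(constrained_cost - unconstrained_cost)  # 5: the "cost" of the constraint
```

The gap between the two numbers is exactly what the report surfaces for the buyer: how much the risk-mitigation constraint costs relative to the unconstrained optimum.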

It’s as easy to use as an auction tool, which is something that cannot be said for many of the optimization solutions out there, and no math or understanding of optimization is required. Plus, it’s a true SSDO solution, as it is based on solid mathematical foundations (the scenario can be built and solved as a MILP model), supports true cost models (some of the templates allow different cost factors to be defined), supports reasonably sophisticated constraints (enough to meet the minimum requirements of a SSDO solution), and has what-if capability. It’s definitely not the most sophisticated or powerful tool out there, but it doesn’t need to be.

For your average mid-size company at the lower end of the range, the solution gets the job done, and does it in a way that the buyer can understand. There are thousands upon thousands of companies out there right now that don’t need more than this.

So if you’re in the lower end and you’re ready to start optimizing your sourcing, you should head on over to Keelvar’s site and check them out. The solution might just work for you.

When It Comes to Optimization, You Need Every Insight You Can Get!

Even though it’s been almost a decade since Strategic Sourcing Decision Optimization (SSDO) became not only readily available but affordable (especially when one considers that back-to-back Aberdeen studies in the noughts demonstrated that advanced sourcing, which is based on optimization, saved an average of 12% per event, which means that companies that employed optimization on large categories often saw an ROI after their first event), most mid-size and larger companies aren’t using it. In fact, most mid-size and larger companies haven’t even tried it!

Why is this? There is a laundry list of reasons, but the most important are probably:

  • misinterpretation and misinformation: There is still a lack of understanding about what optimization is and how important it is to your strategic sourcing efforts. A lot of people believe that optimization is only for the largest categories, the most complex categories, companies with complicated manufacturing supply chains, etc. This is not true. Optimization is relevant to every strategic sourcing project, small and large. The only question is how important it is: does the event revolve around the optimization, or does the optimization revolve around the event?
  • fear: Because it’s misunderstood and, more importantly, because it is math, it is feared. (It’s important to remember that fewer than 1 in 7 American adults are “proficient” at math. This means that while your senior analysts with a strong Operations Research (OR) background will be hesitant about optimization, your average buyer will be, to borrow a colloquialism, scared sh!tl3ss. And, unwilling to admit this fear, he will do everything he can to come up with dozens of excuses as to why optimization is not applicable to your problem or why other methods will perform better.) Moreover, because the misinformation out there doesn’t tell you that the good solutions handle all the math for you, and that all you need to do is specify the demands and the constraints (and their priorities, if not all constraints can be simultaneously satisfied), people avoid (strategic sourcing decision) optimization when they should be embracing it.
  • cost: Optimization solutions used to be expensive. Very expensive. Back when there were only a couple of known solution providers (in the e-CHAOS pack), and sourcing suites started in the six figures, optimization solutions, even for a single event, were six figures, and sometimes seven for unlimited use. If you weren’t guaranteed a high six-figure return off of your first event, and a high seven-figure return over the course of the year, this was a big risk to take. But that was then, and this is now. Today, optimization solutions start in the lower end of the five-figure range, and unlimited annual licenses start in the lower end of the six-figure range. And their power and performance is at least ten times what it was a decade ago. Models that used to run for hours now solve in minutes, and an analyst can run dozens of what-if scenarios in a day, quickly getting to the best price-value trade-off for the organization.
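To see how quickly today’s price points pay for themselves, consider a back-of-the-envelope calculation. (The 12% is the Aberdeen average cited above; the 25K event price is an assumed “low five figures” price point for illustration, not any vendor’s actual quote.)

```python
avg_savings_rate = 0.12      # average advanced-sourcing savings per the Aberdeen studies
category_spend = 1_000_000   # a single 1M category (illustrative)
event_price = 25_000         # assumed "low five figures" per-event price

savings = category_spend * avg_savings_rate
print(savings)               # 120000.0 saved on a single event
print(savings / event_price) # 4.8x return on the very first event
```

Even at half the average savings rate, a single mid-size category pays for the event license with room to spare, which is the whole point of the then-versus-now comparison.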

So how do we get optimization into the hands of the masses, and more importantly into your hands (if your colleagues are holding your organization back)?

We deal with the roadblocks we discussed.

How do we deal with the roadblocks?

We start with education. We educate people that they don’t have to be a math whiz (because the math whiz is only needed to build the solution, not to use it), that a strategic sourcing decision optimization solution isn’t hard to use, that it doesn’t cost a lot, and that it does generate a return. And we hit them on all fronts. Third Party, Provider, and Practitioner.

To date, it’s been mainly third parties, and, unfortunately, mainly SI, spreading the message of optimization. But now we have a few providers working hard to spread the message as well. BravoSolution, who has been kind enough in the past to sponsor SI to help with this effort (and who offered you an Illumination on The Future Of Optimization), has been working hard to spread the messages of Optimization, Analysis, and the integration thereof, in what they call High Definition Sourcing, for a few years now. A new provider in the SSDO arena, which we’ll announce shortly and which is the first new provider to offer a true SSDO solution since Iasta back in the 2007-2008 timeframe, is also taking up the challenge.

And Trade Extensions, who has also been kind enough to sponsor SI to help with this effort, and who has also been providing industry leading optimization solutions and education for a few years now, has just doubled down on the education effort, starting with a new INSIGHTS series focussed entirely on optimization. Consisting of a series of nine interviews with Founder, Chairman, and Optimization Guru Arne Andersson and CEO, Freight Trader, and Master Buyer Garry Mansell, this series will attempt to burn away the fog on optimization, make it a standard part of your sourcing suite, and lay the foundation for a series of follow-up educational offerings which will include white-papers and webinars on the subject.

Because optimization is for everyone, not just the 1%!

Are You Doing It Wrong?

If you’ve been following the media, you know that we have reached a point where most major business publications are now putting focus on Supply Chain as your top risk and your top opportunity.

You also know that these same publications, and the solution providers that follow and reference them, have been preaching the following solutions to not only tame the risk but also increase the opportunity.

Comprehensive Category Management

Spot buying individual categories at market lows, or even running reverse auctions at opportune times, is not category management. And for that matter, neither is an event that covers the entire category. At this point you probably think that the doctor is losing it a little, because how could it not be category management if you are addressing the whole category?

It’s simple. Category Management isn’t just about grouping all seemingly related items and running an event; it’s about grouping items with related characteristics that allow them to be sourced effectively under the same strategy. For example, while it might make theoretical sense to group printers, ink, and paper together, because you use them together, from a sourcing point of view ink and paper often go better with office supplies, and printers with hardware (you can probably get them thrown in for free with a server purchase). But that’s just the start. If you source a lot of metal parts, you should probably group them by primary metal, since the price of steel, aluminum, etc. will largely dictate their prices, and it might even make sense not only to source all of the parts from the same supplier but to buy the metal on behalf of the supplier, using your better negotiating power and/or credit rating.

Supply Chain Risk Monitoring

Natural and man-made disasters devastate supply chains when they result in raw material or product unavailability for weeks or months. When a company doesn’t understand its dependence on a single source, or the risks that single source is subject to, it can, figuratively speaking, get caught with its pants down.

As a result, most leading companies in the Risk Management arena are now tracking and monitoring their tier 1 supply base not only for missed deliveries but for late shipment dates, inquiring immediately when something ships late. However, by the time a shipment is late, it’s often too late to go to another source if the reason for the lateness is the lack of an important raw material. So the smarter companies also ask their suppliers to let them know when their suppliers miss a delivery. This is better, but sometimes still too late. You need to track the primary sources of the raw material and their ability to produce: not only the companies, but their locations. All natural and man-made disasters in the region are then evaluated for impact, and if the producer of the primary raw material or part is potentially at risk, leading companies make sure, or ask their tier 1 supplier to make sure, that the raw material or product can still be delivered on time. If it can’t, they immediately seek a secondary source (or lock up available supply pre-emptively), not two weeks after the tier 1 supplier required the raw material to meet the commit date.

Big Data

The only buzzword on par with big data is cloud. According to the converted, or should I say the diverted, better decisions are made with better data — the more data the merrier. This sounds good in theory, but most algorithms predict demand, acquisition cost, projected sales prices, etc. based on trends. But these days the average market life of a CPG product, especially in electronics or fashion, is six months or less, and the reality is that there just isn’t enough data to predict meaningful trends. Similarly, every disruption impacts the cost, and these disruptions are as unpredictable as future sales predicted using trend models with insufficient data.

You use all of the data available to validate your operations, procurement, and financial situation, not to blindly predict future sales or prices. An over-reliance on big data is often more dangerous than having no data at all.

Anticipatory Demand Planning is Good, but Anticipatory Shipping?

SI can believe that Amazon patented a Method and System for Anticipatory Package Shipping (US Patent 8615473) but can’t believe it would use this for more than a small number of items. Nor does it believe the system would be implemented as outlined in the patent as filed, at least in the short term.

It took Amazon 7 years to turn its first profit, and while Prime is currently very profitable for Amazon (which makes $78 more in profit per year from the average Prime customer than from a non-Prime customer, according to CIRP’s market research – Source: Wired), those margins would drop substantially if Amazon started shipping tens, or hundreds, of thousands of packages a year that no one wanted. Amazon does have an efficient distribution network and probably has the absolute best deals with postal and courier services that can be papered, but every shipment costs money and every unnecessary shipment eats into profit. Returns cut into profit margins enough; how much would returned shipments to nowhere cost?

Thanks to big data, predictive analytics is getting better by the day, but it’s still hit and miss at a granular level. While it’s pretty easy to use correlation data across a large customer base to predict that you are likely to desire an item, it’s harder to predict whether or not you’d actually buy it, and if you would, at what price point, assuming you don’t already own the product in question. (It’s always telling the doctor he wants books and media he already owns.)

As a result, any predictive analytics at the individual consumer level are going to be hit-and-miss at best. Predictive analytics work best across a large consumer base with a lot of data where one can predict that, on average, 5 in 100 people who match a profile will buy the product from Amazon.

And, from Amazon’s viewpoint, the best use of the predictive analytics is on new releases, as the bulk of sales in many of its categories, and books and media in particular, are in the weeks immediately following a new product release. With the right data and the right algorithms, it can not only predict how many units it is likely to sell against its current customer base, but if the demand is enough, how many in each region that is associated with each distribution center and how the orders will likely track over time on a daily basis.

In this situation, and only this situation, would anticipatory shipping, and in particular anticipatory packaging, make sense in the short term. For example, if Scott Adams were to release a new Dilbert book and Amazon predicted 200,000 copies would be sold in the first 3 weeks, and expected that it would get 50,000 of those sales, pre-packaging 40,000 for shipment and then distributing those across its DCs, such that each DC received a number of books proportionate to the expected sales in the area it serves, would be a good idea. All Amazon would have to do to speed up shipment would be to slap the delivery address on the boxes as the orders came in and have them ready to go in the next pickup for local delivery.
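The pre-distribution step in that example is simple proportional apportionment, which can be sketched in a few lines of Python. (The DC regions and demand shares are invented; a largest-remainder pass handles rounding so the allocated units sum exactly to the pre-packaged total.)

```python
# Invented regional shares of expected first-week demand.
dc_share = {"East": 0.40, "Central": 0.25, "South": 0.20, "West": 0.15}
units = 40_000  # pre-package fewer units (40K) than the 50K forecast, to be safe

# Proportional allocation, with a largest-remainder pass so the total is exact.
raw = {dc: units * share for dc, share in dc_share.items()}
alloc = {dc: int(v) for dc, v in raw.items()}
shortfall = units - sum(alloc.values())
for dc in sorted(raw, key=lambda d: raw[d] - alloc[d], reverse=True)[:shortfall]:
    alloc[dc] += 1

print(alloc)  # {'East': 16000, 'Central': 10000, 'South': 8000, 'West': 6000}
```

In practice the shares themselves would come from the predictive model, and would be re-run daily as actual orders start tracking (or not tracking) against the forecast.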

In the future, once the system is fine-tuned and its delivery partners have the technology to replace a unique delivery address identifier with a specific delivery address on-the-fly, Amazon can pre-ship a set number of these pre-packaged items to the local post office or delivery company every day, which can, in turn, load those packages onto the appropriate courier truck each morning as the addresses in the system are updated with consumer delivery addresses sent over by Amazon upon each purchase.

But not everyone would get faster shipping service. In order to prevent too many unnecessary shipments and losses, Amazon would have to err on the side of caution, pre-package (and pre-ship) fewer units of an item than it expected to sell, and restrict anticipatory shipping and packaging to only those items expected to have a large sales volume. In most cases, the best Amazon will do is optimize the distribution of inventory across its warehouses. However, this can still take a day (or two) off of the average delivery time, so it’s still a good start.

Any differing opinions?