Category Archives: Spend Analysis

Optimize, don’t Compromise!

Continuing on our theme of analysis and optimization, every e-Sourcing suite on the market will support your organization in its sourcing activities, but not every product will allow your organization to optimize its sourcing activities.

Optimization requires advanced sourcing capability, and advanced sourcing requires the ability to analyze data, not just collect and report on data.

This means that, at the very least, you will require:

  • true spend analysis,
  • true category analysis,
  • true cost-based bidding, and/or
  • true bid optimization.

Without at least one of these capabilities, you’ll never optimize your spend. So don’t even bother to try without them.

If I Succeed in Destroying Dashboards and Razing Report Writers, What Next?

In yesterday’s post, where I responded to the smart alecks, I noted that, once dashboards are destroyed and report writers are razed, there were about a half-dozen logical next steps that could be taken to improve today’s spend analysis solutions, even if that solution was BIQ.

Should-cost modelling, award optimization based on historical data and business rules, and federation across related data sets for deeper dives are pretty obvious. Are there somewhat less obvious advancements we should also be thinking of?

Of course. One rung up the ladder, three of them are:

Predictive Modelling

Once you have should-cost modelling, the next logical step is predictive modelling. Use historical data to extract pricing trends and predict likely future prices for the commodity. Use this not only to determine the best time to (re)source the category but also, with deep-dive analysis, to determine the best strategy.
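
As a toy sketch of that first step (the prices and the choice of a simple linear trend are illustrative assumptions, not a recommended forecasting model), extracting a trend from historical prices and projecting it forward might look like:

```python
import numpy as np

# Hypothetical monthly average prices paid for a commodity (per unit)
months = np.arange(12)  # 0 = twelve months ago, 11 = last month
prices = np.array([2.10, 2.12, 2.15, 2.20, 2.18, 2.25,
                   2.30, 2.28, 2.35, 2.40, 2.38, 2.45])

# Fit a simple linear trend to the historical prices
slope, intercept = np.polyfit(months, prices, 1)

# Project the trend three months ahead to estimate likely future prices;
# a rising trend suggests sourcing (and locking in pricing) sooner
future = np.arange(12, 15)
forecast = slope * future + intercept
print([round(p, 2) for p in forecast])
```

A real implementation would account for seasonality and commodity-index data rather than a straight line, but the principle is the same: the transaction history you already have is the raw material for the forecast.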

Optimize Supplier Relationships

Once you have optimized all of the awards based on historical data and business rules, you also have the optimal allocation by supplier. Once you have the optimized set of awards for each supplier, you can optimize the re-order schedule, shipping arrangements, and even production and sourcing schedules on behalf of the suppliers and take costs out one level down in the supply chain. Helping your suppliers help you goes a long way to building good supplier relationships and increasing supplier performance.

Simultaneous Drill Across Multiple Data Sets

Once you have true federation, you want to split the screen and update the views to only contain the relevant data in each data set as you drill down through the data. Going back to our previous example, you start in the Payment cube drilling into the goods receipts associated with the wonky widgets, then switch to the Order History cube to find the initial requisitions, but when you drill on the user in the second cube, the first cube is updated to contain only those goods receipts associated with the user. The user can drill through either cube to find the data she wants, whichever is easiest, and both cubes update. She doesn’t have to go back and forth.
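
A minimal sketch of that linked-view behaviour, with plain Python lists standing in for real cubes (all records and field names here are hypothetical):

```python
# Two "cubes" that share a key (the requisitioning user): a drill applied
# to either view filters both views at once.

payments = [  # goods-receipt / payment records
    {"receipt": "GR1", "supplier": "WonkyWidgets", "user": "alice", "amount": 500},
    {"receipt": "GR2", "supplier": "WonkyWidgets", "user": "bob",   "amount": 750},
    {"receipt": "GR3", "supplier": "SolidCo",      "user": "alice", "amount": 300},
]

orders = [  # order-history / requisition records
    {"req": "PO1", "user": "alice", "item": "widget"},
    {"req": "PO2", "user": "bob",   "item": "widget"},
]

def drill(cubes, **criteria):
    """Apply the same filter criteria to every cube; a cube that lacks
    a criterion field is left unfiltered on that field."""
    return [
        [row for row in cube
         if all(row.get(k, v) == v for k, v in criteria.items())]
        for cube in cubes
    ]

# Drilling on user "bob" in either view narrows both views simultaneously
pay_view, order_view = drill([payments, orders], user="bob")
print(pay_view)    # only bob's goods receipts
print(order_view)  # only bob's requisitions
```

The point of true federation is exactly this shared-key propagation: the analyst drills wherever it is easiest, and every linked view stays in sync.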

These are just a few more things that can be done, and all would simplify the life of an analyst. More to come at a later time, but first, this time I’m going to insist that you tell me what you would do. ;-)

If I Succeeded in Destroying Dashboards, How Else Would I Improve Spend Analysis?

The smart alecks are correct — technically, destroying dashboards is not adding anything to spend analysis, so I didn’t actually provide a way to improve spend analysis technology, just the results you get from using it.

So if I succeeded and dashboards bit the dust, what would I do? (Besides banning integration points for report writers for all OLAP-based spend analysis products?*) Good question. Especially since there’s about a half dozen logical next steps.

Three things that would be useful if you had a true spend analysis product like Opera’s BIQ would be to:

  • Integrate Easy Should-Cost Modelling Capability
    This way you can define a cost breakdown for a product or service you are looking to source and have the tool automatically generate an expected cost based upon current data, as well as a price range, with confidence, based upon low, average, and high prices paid for the raw materials, energy, labour, etc. (provided that the should-cost model permitted base-cost definitions for any cost components you weren’t buying that were bought entirely by your supplier)
  • Optimized Awards Based on Historical Data and Business Rules
    You don’t have to send out an RFX to get base market pricing if you are already buying a product; it’s in your transaction store. Nor do you have to run a complex event to determine the lowest cost providers for a market basket. Moreover, if you are buying commodity products and services with list prices, and all your suppliers do is give you a discount of X% for a guaranteed award, you don’t really need optimization to determine the lowest cost as it’s just a simple formula against current pricing. And if your only business rule is a 2- or 3-way split, it’s just the 2 or 3 lowest cost suppliers with the appropriate risk mitigation. In this situation, it would be easy for spend analysis tools to build in some simple optimization capability to tell you your lowest cost buy, and if it’s close to your should-cost model, you can just cut a contract without going through a time-consuming sourcing event.
  • True Federation across Related Data Sets
    Most spend analysis tools are only capable of working on one cube built on one data classification at a time. This means that even though a user can pick the drill dimension order, only one set of data can be viewed at one time. But sometimes you want to drill into greater detail (such as who requisitioned all those widgets from the wonky supplier), and that’s not in the transaction file — so you need another cube with more detail on the invoice (history). Then you drill in on the augmented AP (cube) data until you get to the invoices associated with the supplier, switch over to the new cube, and drill down to the line items of interest and retrieve the requisitioners. Another situation is where you are getting a lot of warranty returns, and you want to figure out what batches the returned items are in so you can determine whether or not the batches were bad and it will be cheaper to do a mass replacement (by just putting out a recall) than dealing with one breakdown at a time. In this case, you need to drill into the warranty cube and then branch over into the invoice cube to get the batch numbers associated with the appropriate goods receipts that are associated with the invoice.
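
To make the "simple formula" award concrete, here is a toy sketch of a discount-off-list award with a 2-way split (supplier names, list price, volume, and discounts are all hypothetical):

```python
# Each supplier offers a discount off the current list price for a
# guaranteed award; the only business rule is a 2-way split across the
# lowest-cost suppliers for risk mitigation.

list_price = 10.00  # current list price per unit
volume = 10_000     # annual units required

suppliers = {       # supplier -> discount offered for a guaranteed award
    "SupplierA": 0.12,
    "SupplierB": 0.08,
    "SupplierC": 0.15,
    "SupplierD": 0.10,
}

# Effective unit cost is just the list price less the discount
unit_cost = {s: list_price * (1 - d) for s, d in suppliers.items()}

# Award the two lowest-cost suppliers, splitting the volume evenly
awarded = sorted(unit_cost, key=unit_cost.get)[:2]
split_cost = volume / 2 * sum(unit_cost[s] for s in awarded)
print(awarded, round(split_cost, 2))
```

No solver required: when the rules are this simple, the "optimal" award is a sort and a sum, which is exactly why a spend analysis tool could offer it without a full sourcing event.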

These are just a few things that can be done, and all would simplify the life of an analyst. More to come at a later time but first, what would you do?

* If you don’t know why, you don’t know your spend analysis product limitations!

In What Way Would I Improve Spend Analysis?

When it comes to spend analysis there is at least one particularly powerful tool out there that will meet the majority of the needs of any organization and probably at least one tool that will do, with elbow grease, just about any analysis an analyst can think of. Since businesses have wanted reports and analytics since the days of the first spreadsheets, analysis tools are always advancing and most are beyond the ability of the average user to fully utilize their functionality.

So, given this fact, how would I improve spend analysis? And given that this question may imply that I may only make one improvement, just what would that improvement be? Especially since most tools don’t do (true) federation, don’t support full reg-ex (regular expressions), don’t understand semantics, and don’t run fast enough on large data sets — indicating that, as a PhD in CS with deep expertise in analysis, modelling, optimization, and semantics, there are theoretically a number of advancements I could bring to the table if I put my mind to it?

Despite the plethora of options available, today there is only ONE thing I would do to improve spend analysis. I’d make it impossible to do anything but spend analysis. Specifically, I’d make it illegal to include dashboarding capability in any (spend) analysis product.

Why would I do such a thing? Besides the fact that I’ve been ranting since 2007 that dashboards are dangerous and dysfunctional, I would do such a thing because, among other things, they give you a false sense of security that, if mismanaged, could be so grave that, like the myth of Nero, you would fiddle while the factory burned.

Why would I ditch the dashboards and make it a crime punishable by any fate one could devise that was worse than death to include any capability whatsoever designed to support a dashboard? Because I just read this post on Purchasing Insight on “the inordinate cost of poor spend analytics” that said that it’s reckoned that more than 50% of businesses employ between 2 and 5 people to prepare and create procurement dashboards and spend reports. This is ludicrous. (No, not Ludacris.) If these people are senior analysts, then a large organization is spending more than 500,000 a year on salary and overhead to create dangerous and dysfunctional dashboards that spit out shiny spend reports that, after being analyzed the first time for inefficiencies, provide zero value to the organization. Once the report is analyzed, the inefficiency identified, and the problem corrected, and once this is verified in the next report, no subsequent report is going to tell the analyst, or management, anything new.

As SI has said again and again, the value of spend analysis is actually doing spend analysis, again and again, testing new hypotheses every time they pop into the analyst’s head. Yes, most hypotheses will yield nothing, but that’s not important because it only takes one insight to yield 100,000 worth of savings. If the tool is flexible, powerful, and configured appropriately, the user will be able to explore dozens of different analyses in a week, and if even one yields 10,000 in savings, that’s an (amortized) ROI of (at least) 5X. Spend analysis is analysis. Not dashboards and reports.

So if you really want to improve spend analysis — ditch the dashboards and focus your talent on real analysis. Otherwise, just download a free reporting engine off the internet. You’ll get the same worthless result, without forking out six figures for a tool you’re not really using.

Last Day for Free Sourcing and Procurement Papers from Spend Matters!

As per a recent post over on Spend Matters UK, today is the last day that a set of recent Spend Matters white-papers, made available by sponsors, is free to practitioners before it goes back behind the pay-wall. This set in particular contains three pieces authored or co-authored by Pierre Mitchell and two authored or co-authored by Thomas Kase. Pierre, most recently of Hackett Group fame, is well known for his thought leadership around Supply Management best practices, and Thomas, who has worked for a number of providers, is known for his deep expertise in SPM/SRM solutions. When you can get your hands on their work for free, it’s not something you should pass up.

The papers in particular that are about to become “pro” access only are:

  • How to Justify Spend Analysis to Finance/IT When There’s No Clear ROI
    by Pierre Mitchell
  • Write Better RFPs — How to Get What You Want (and Need) From Suppliers
    by Thomas Kase
  • Metadata Explained: What it Means for Spend Analytics, Supply Risk, Supplier Performance, and More
    by Thomas Kase, Pierre Mitchell, and Jason Busch
  • Procurement Analytics: How to Plan (and Optimize) Your Process
    by Pierre Mitchell

Regular readers of SI know the utter importance of Spend Analysis, a subject which has garnered hundreds of posts on SI over the years. As one of the only two technologies that have been repeatedly shown to provide an organization year-over-year double-digit ROI when properly used, your organization cannot afford to be without it! As such, the last thing you want is to be roadblocked by finance when there is no readily apparent savings opportunity (as you need the solution to clean your data to find the savings opportunity). In his piece, Pierre gives you ten hints for getting your project approved, which can happen quickly if the project is presented appropriately. Always remember: the faster you get the system, the faster you can centralize and cleanse your data and find opportunities, and the faster you can start saving.

Just this week SI was continuing its RFX rants, which started back in 2007, about how the vast majority of provider RFPs suck and how you won’t get good results unless you write your own (using the provider RFP as a checklist of some key elements that need to be included, but in a way that makes sense to your organization). In Thomas’ paper he discusses some key elements of the RFP creation process that can make the difference between success and failure in your efforts.

Everyone talks about metadata, but not a lot of people really understand what they are talking about. In the collaborative piece between Thomas, Pierre, and Jason, the authors provide a discussion of metadata and how metadata aggregation can paint a picture not readily available from the individual elements. They then go on to demonstrate how the proper analysis of metadata can yield risk analyses and opportunity assessments that cannot otherwise be performed and that can be very beneficial to the business from day one. It’s another example of why your organization needs good data, and tools to process and mine that data, if it wants a true twenty-first century supply chain.

In the last piece on Procurement Analytics by Pierre, he notes that for an analytics project to be successful, you need the right scope. The scope is all of supply management, not just the tactical procurement function. All data collected from the first RFX during the sourcing process through the last on-contract procurement to the final warranty return needs to be collected and stored in one central or federated database so that an analyst can look at all relevant data, not just purchase data. It’s not just how much you paid, but how much you were supposed to pay, what you paid for, and whether a different categorization would be more beneficial to your organization. And until you make an effort to centralize, or at least centralize on a common schema even if the data is scattered, you won’t even know what transformations and cleansings need to be done.

If you haven’t downloaded these yet, don’t miss your very last opportunity to do so. These are some great pieces with content that you should know, so read up!