Monthly Archives: January 2007

Forecasting

No doubt about it – despite being critical for effective business planning, accurate forecasting is complex and challenging, and remains elusive for many organizations. However, as a recent issue of APICS Magazine points out in its article “Outlook Warm and Sunny”, one can create good forecasts through the proper combination of judgmental and statistical methodologies and use them to identify new market opportunities, anticipate future demand, schedule production effectively, and reduce inventories.

What’s interesting about this article is its acknowledgment of the well-known fact that neither technique on its own can be very effective. Most of us lack the ability to accurately judge future demand due to limitations in human cognitive abilities, the restricted amounts of information we have at our disposal, and unknown causal relationships. Similarly, statistical forecasts are limited by the models they are based on. Although a statistical model is much more accurate than any intuitive model we could come up with, it is built on assumptions and causal relationships which may change over time. The best example of a statistical model gone bad is Nike’s $400M failure in 2000 due to demand forecasting software. Nike relied exclusively on automated forecasts without any judgmental checks, but the newly implemented models were not yet fine-tuned and accurate enough to be deployed in a fully automated mode.

The best forecasts are those that leverage the strengths of both judgmental methods and statistical methods. However, as the author points out, well-established rules must be followed in order to effectively combine these techniques.

The following table summarizes the strengths and weaknesses of each approach.

Judgmental Forecasts

Strengths:
  • Responsive to latest environmental changes
  • Can include “inside” information
  • Can compensate for “one-time” or unusual events

Weaknesses:
  • Human cognitive limitations
  • Biases

Statistical Forecasts

Strengths:
  • Objective
  • Consistent
  • Can process large amounts of data
  • Can compute many variables and complex relationships

Weaknesses:
  • Slow to react to changing environments
  • Only as good as model formulation and available data
  • Can be costly to model “soft” information
  • Require technical understanding
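The complementary strengths above suggest a simple way to combine the two approaches. Below is a minimal sketch, assuming a simple exponential smoothing baseline for the statistical side and a cap on the size of the judgmental override; the demand figures, smoothing constant, and 20% cap are all illustrative assumptions, not values from the article.

```python
# Toy combination of a statistical forecast with a bounded judgmental
# override. All parameters below are illustrative assumptions.

def exponential_smoothing(history, alpha=0.3):
    """One-step-ahead forecast via simple exponential smoothing."""
    forecast = history[0]
    for demand in history[1:]:
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecast

def combined_forecast(history, judgmental_adjustment=0.0, cap=0.20):
    """Statistical baseline plus a judgmental adjustment, expressed as a
    fraction and capped at +/- cap to limit the impact of human bias."""
    baseline = exponential_smoothing(history)
    adjustment = max(-cap, min(cap, judgmental_adjustment))
    return baseline * (1 + adjustment)

demand_history = [100, 104, 98, 110, 107]  # illustrative monthly demand
# A manager who knows of an upcoming promotion adds 10% to the baseline:
print(combined_forecast(demand_history, judgmental_adjustment=0.10))
```

The cap reflects the article’s warning about bias: the override can shift the forecast, but cannot swamp the statistical baseline.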

According to the article, judgmental and statistical forecasts can be combined in different ways to take advantage of their individual strengths, but the most popular method in practice appears to be the “managerial override”, in which managers adjust the statistical forecast directly. Managerial adjustments can often improve forecast accuracy by incorporating information not available to the statistical model. However, if performed incorrectly, they can reduce accuracy due to inherent human bias. Thus, established rules should be followed for effective adjustments.

The rules outlined by the article are the following:

Only practitioners with domain knowledge should adjust statistical forecasts.
Judgmental adjustment is more likely to improve accuracy when the adjustment is based on domain knowledge. Generally, only domain practitioners will be aware of the relevant contextual information that should be used to adjust a forecast.
Adjust statistical forecasts when there are known changes in the environment.
The adjustment should compensate for specific events not captured by the statistical model or time series. It should not be based just on intuition or bias.
Structure the judgmental adjustment process.
Use a documented or computationally consistent methodology. This will allow you to repeat successes and ensure that failures are caught, corrected, and not repeated.
Document all judgmental adjustments made and measure forecast accuracy.
Records must be kept of all adjustments made and the reasons for them, and the accuracy of the resulting forecasts must be measured so the process can be improved over time and the underlying statistical models updated when relevant observations are made.

When good, quantifiable historical data is available, reliance should be placed primarily on statistical forecasts. Judgment should be used to adjust the forecast only when domain practitioners know of relevant contextual events or information not contained in the model.
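The “structure and document” rules can be sketched in a few lines: record every override with its reason, then score both the statistical and adjusted forecasts against actuals. The record fields and the MAPE accuracy metric are my illustrative choices, not prescribed by the article.

```python
# Minimal sketch of documenting judgmental adjustments and measuring
# their effect on forecast accuracy. Fields and metric are assumptions.

adjustment_log = []

def record_adjustment(period, statistical, adjusted, reason, who):
    """Log every override with its justification, per the article's rules."""
    adjustment_log.append({
        "period": period, "statistical": statistical,
        "adjusted": adjusted, "reason": reason, "adjusted_by": who,
    })

def mape(forecasts, actuals):
    """Mean absolute percentage error of a forecast series."""
    errors = [abs(f - a) / a for f, a in zip(forecasts, actuals)]
    return 100 * sum(errors) / len(errors)

record_adjustment("2007-02", 1000, 1150,
                  reason="known promotional event not in the model",
                  who="category manager")

actuals = [1100]
stat_mape = mape([e["statistical"] for e in adjustment_log], actuals)
adj_mape = mape([e["adjusted"] for e in adjustment_log], actuals)
print(stat_mape, adj_mape)  # compare: did the override help or hurt?
```

Comparing the two error figures over time tells you whether your overrides are adding information or just adding bias, and flags patterns that should be fed back into the statistical model.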

Emptoris Update

Unfortunately, Emptoris (acquired by IBM, sunset in 2017) was not among the companies I was able to meet with during my whirlwind tour of Boston, but that didn’t stop me from trying to find out why they were all so busy. Apparently, those days were filled with meetings and company events to kick off the new year.

The official report I was able to solicit went something along the lines of the following.

The week culminated with a company-wide meeting and post-holiday party where Emptoris essentially gave their employees a company update and a cause to celebrate. The major bullet points were:

  • Last year was Emptoris’ best year ever, with each quarter booking more sales and revenue than the quarter before.
  • Emptoris had more deals than Ariba.
  • Emptoris has succeeded in swiping at least one marquee Ariba customer.
  • Emptoris webinars are in hot demand … with their upcoming webinar already clocking in at over three thousand registrants.
  • Their consulting group and India operations are increasing rapidly to support demand.
  • Key verticals for Emptoris are financial services, CPG, and pharmaceutical.
  • A number of significant product enhancements are planned for the week ahead.

However, I’m sure the unofficial transcript of what transpired really was more in line with the following.

Early in the week:

Pinky: Gee, Brain, what are we going to do this year?
Brain: The same thing we do every year, try to take over the (sourcing) world!
Pinky: Narf! Zort! How are we going to do that Brain?
Brain: As you know, people in today’s dollar-driven economy are obsessed with anything that they think will fatten their own paychecks. My plan is to promise everyone who buys our system shares in our new sourcing savings investment plan which is funded by 10% of all of our sales. The sourcing savings investment fund will use our new real-time optimization technology to quickly buy and sell volatile shares at the lead end of the spikes, before everyone else decides to cash out, and aggregate value through the sheer volume of trades. This plan should net us billions before everyone else figures out that accelerated buying and selling short can make you billions and everyone else starts doing it and our algorithm fails miserably and we lose it all – but it won’t matter, because by then everyone will be using our software and we’ll rule the sourcing world!!!
Pinky: And how are we going to convince the employees this is a good plan?
Brain: You mean my loyal subjects that already address me as “Your Highness”? That’s easy, we just tell them all that they get a cut of the investment plan at the end of every quarter based on sales volume.
Pinky: That sounds fantastic Brain! Poit!
Brain: Yes, and failing that, I just filled my brand new playbook with backup plans that are sure to work if this one doesn’t work as spectacularly as I planned!

Access Problems?

Some of my southern colleagues have indicated that they have been having problems accessing my blog lately.  I myself have had problems at times, since I host on a remote server, but given the nature of the internet, and the bad weather we’ve had around North America lately, brief outages are to be expected for any site without vast financial resources to pay for replication services.

If you ever have trouble with this blog, or another blog, here is a workaround that might help you (especially if it’s just a routing issue).  Try accessing the blog (be it Sourcing Innovation, Spend Matters, eSourcing Forum, or Supply Excellence, for example) through Technorati or ProcureIQ – since they both pick up our blog posts regularly.

[Remember, this post is from January 20, 2007, and, as such, it should be no surprise that neither Technorati nor ProcureIQ (no longer in existence) pick up these blogs anymore, especially since only this blog and Spend Matters remain from the original Procurement(-related) blogs.]

There’s No Spend Analysis without the Slice ‘N’ Dice

When I was in Boston, I was lucky enough to spend the better part of a day with Eric Strovink of BIQ and to have a few extended conversations with individuals at some of the local consulting firms that specialize in sourcing. I am now more than convinced that any tool that mandates a single cube, or makes it difficult to change the cube, is not a spend analysis tool but merely a spend data warehouse with built-in canned reporting (and, if you’re really lucky, limited ad-hoc capabilities).
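To make the distinction concrete, here is a toy sketch of the “slice ’n’ dice” idea: the same transactions can be aggregated over any combination of dimensions, effectively creating a new cube per query rather than locking you into a single fixed cube. The data and dimension names are invented for illustration.

```python
# Toy illustration of ad-hoc cube creation: group the same transactions
# by any combination of dimensions. Data and field names are invented.

from collections import defaultdict

transactions = [
    {"supplier": "Acme", "category": "Packaging", "region": "East", "spend": 120},
    {"supplier": "Acme", "category": "Packaging", "region": "West", "spend": 80},
    {"supplier": "Bolt", "category": "Logistics", "region": "East", "spend": 200},
    {"supplier": "Bolt", "category": "Packaging", "region": "East", "spend": 50},
]

def build_cube(rows, dimensions, measure="spend"):
    """Aggregate a measure over an arbitrary set of dimensions --
    effectively a new cube per call, with no fixed schema required."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dimensions)
        cube[key] += row[measure]
    return dict(cube)

# Two different "cubes" from the same data, created on demand:
by_supplier = build_cube(transactions, ["supplier"])
by_cat_region = build_cube(transactions, ["category", "region"])
print(by_supplier)  # {('Acme',): 200.0, ('Bolt',): 250.0}
print(by_cat_region)
```

A tool that fixes the dimension set at load time can only ever answer the questions it was configured for; the ability to regroup on the fly is what turns a spend warehouse into spend analysis.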

Not that there’s anything wrong with a centralized spend warehouse with a consistent view of your total spend, especially one that integrates multiple internal and external data sources and allows you to drill down and understand your spend at a detailed level. Of all the e-Sourcing software tools, it is the one most likely to make your CFO do backflips, especially if it has good reporting (and this is a big if – not all spend analysis tools on the market do), since it makes it really easy for the CFO to tell the CEO where the money is going and comply with all those pesky reporting requirements.

However, the value of such a tool to you as a purchasing agent is quite limited. It’s true that the first time you use it you’ll save big-time, especially if it’s the first time you have visibility into the majority of your spend, but the reality is that this is the only time you’ll see such significant savings. After you’ve identified all of the low-hanging fruit surfaced by the system’s single view, analyzed each instance of over-spending, and taken corrective action, you’ll find that you’re unable to identify additional savings. The system will then function as a glorified data warehouse that you use once a quarter to create those reports for your CFO and to check that your teammates are buying off the negotiated contracts – something you could do almost as well with your existing ERP system, a significantly cheaper Business Intelligence / OLAP tool like Business Objects or COGNOS, and some grunt work.

Remember, I’m not saying that traditional spend analysis systems like those provided by e-Sourcing providers like Procuri (acquired by Ariba, acquired by SAP) and Emptoris (acquired by IBM, sunset in 2017) are without value – if you do not have a good, integrated data warehouse that unifies your various accounting, purchasing, and inventory systems into a single view of your spend, or a good reporting system to produce all of the reports your CFO needs, then you’ll find these systems very valuable. However, it’s important to understand that the primary value of these systems is in the total spend visibility they provide from a financial viewpoint, not the spend analysis capability you really require to identify potential overspending and cut costs – thanks to the single organizational view they are built on, you’ll only be able to do the latter once. (In other words, you’ll save big when you first implement the system, but future savings will be limited to your ability to quickly catch and stop maverick spend.) So, if you need a system to consolidate your spend data, produce the tedious reports required by all of the new financial reporting requirements, and give you some basic across-the-board spend visibility – or, more importantly, a spend data warehouse that integrates with the rest of your e-Sourcing suite – be sure to check these systems out, but understand what they are really worth to you before you sign the check.

To help you understand where these systems fall short of true spend analysis – and why you need to be able to dynamically create multiple cubes on the fly that support dynamic dimensions, meta-aggregation, cross-dimensional roll-ups, and even federated data sets – I’m happy to report that Eric Strovink has agreed to co-author a series of posts outlining what real spend analysis is, how it differs from basic spend visibility, what it does for you, and why you need to get there. Stay tuned. (This series is bound to be as informative as my CombineNet series which, when combined with Paul’s informative posts and rebuttals, is probably one of the best non-marketing-filtered sources of information out there on decision optimization.)