Daily Archives: January 25, 2010

New and Upcoming Events from the #1 Supply Chain Resource Site

The Sourcing Innovation Resource Site, always immediately accessible from the link under the “Free Resources” section of the sidebar, continues to add new content on a weekly, and often daily, basis — and it will continue to do so.

The following is a short selection of upcoming webinars and events that you might want to check out in the coming weeks:

Webcasts (date & time, title, sponsor):

10:00 GMT-08:00 (AKDT/PST): "Leveraging Oracle BI" (Sponsor: Oracle)
11:00 GMT-08:00 (AKDT/PST): "Effective Procurement Engagement Within Marketing Spend" (Sponsor: Sourcing Interests Group)
11:00 GMT-05:00 (CDT/EST): "Managing Supplier Performance and Compliance Risk" (Sponsor: Aravo)
13:00 GMT-05:00 (CDT/EST): "Recovery 2010: The Rise of Contract Talent" (Sponsor: Fieldglass)
14:00 GMT-05:00 (CDT/EST): "Visual Spend Analytics: Take Your Spend Analysis to the Next Level" (Sponsor: Tableau Software)
11:00 GMT-05:00 (CDT/EST): "Using Demand Management to Drive Procurement Excellence" (Sponsor: Ariba)
15:00 GMT-08:00 (AKDT/PST): "Oracle Supply Chain Webcast" (Sponsor: Oracle)
14:00 GMT-05:00 (CDT/EST): "Lean Quality for Pharma and Consumer Product Manufacturers: Saving Millions With Rapid Testing" (Sponsor: Celsis)
12:00 GMT-05:00 (CDT/EST): "Commodity Management for Supply Chain Professionals" (Sponsor: PMAC)

Conferences (dates, title, location, sponsor):

Beginning 2010-Feb-22: Institute of Business Forecasting: Supply Chain Forecasting & Planning Conference, Phoenix, Arizona, USA (North America)
Beginning 2010-Feb-23: Logistics and Supply Chain Management 2010, Orlando, Florida, USA (North America)
Beginning 2010-Feb-23: Cold Chain Storage and Distribution Conference, London, England, UK (Europe) (Sponsor: Arena International Events)
Beginning 2010-Feb-24: eWorld Purchasing and Supply, Birmingham, England, UK (Europe)
Beginning 2010-Feb-25: Clean Energy Power 2010, Stuttgart, Germany (Europe) (Sponsor: Simba Media GmbH)
Beginning 2010-Feb-25: Spend Management Day, Los Angeles, California, USA (North America)

They are all readily searchable from the comprehensive Site-Search page. So don’t forget to review the resource site on a weekly basis. You just might find what you didn’t even know you were looking for!

And continue to keep a sharp eye out for new additions!

Spend Analysis II: Why Data Analysis Is Avoided

Today’s post is from Eric Strovink of BIQ.

If I have learned one thing during my career as a software developer and software company executive, it’s this: contrary to what I believed when I was a know-it-all 20-something, there are a lot of clever people in the world. And clever people make smart decisions (for example, reading this blog, which thousands do every day).

One of those decisions is the decision NOT to perform low-probability ad hoc data analysis. It’s a sensible decision. Sometimes it’s based on empirical study and hard-won experience, and sometimes it’s a gut feel; but either way, the decision has a strong rational basis. It’s just not worthwhile.

A picture is helpful:

[Chart: expected value of an ad hoc analysis, plotted against the analysis time in days and the probability of success]

The above shows the expected value of an ad hoc analysis of a $100K savings opportunity. On the X axis is the number of days required to prepare and analyze the data; on the Y axis, the probability that the analysis will be fruitful. I chose a $700 opportunity cost per analyst-day; choose your own number, it doesn’t really matter.

Note that the graph is mostly “underwater”; that is, the expected value of the analysis is largely negative. Unless the probability of success is quite high, or the time taken to perform the analysis is quite low, it’s simply not a good plan to undertake it.
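The arithmetic behind the chart can be sketched in a few lines. This is a minimal illustration using the post's own numbers ($100K opportunity, $700 opportunity cost per analyst-day); the function names are mine, not from the original:

```python
# Expected value of an ad hoc analysis: the payoff weighted by its
# probability, minus the cost of the analyst-days spent on it.
# Figures ($100K opportunity, $700/day) are the post's illustrative numbers.

def expected_value(p_success, analyst_days, savings=100_000, daily_cost=700):
    """EV of spending `analyst_days` chasing a `savings` opportunity
    that pays off with probability `p_success`."""
    return p_success * savings - analyst_days * daily_cost

def break_even_probability(analyst_days, savings=100_000, daily_cost=700):
    """Minimum probability of success at which the analysis breaks even."""
    return analyst_days * daily_cost / savings

for days in (1, 5, 20, 60):
    print(f"{days:3d} days: break-even p = {break_even_probability(days):.1%}, "
          f"EV at p=10% = ${expected_value(0.10, days):,.0f}")
```

At a 10% chance of success, the analysis is already underwater after about two weeks of effort, which is the "mostly underwater" region of the chart.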

We are all faced, from time to time, with the decision of whether to explore a hunch. But an analyst can only work about 220 days per year, and sending a key analyst off on a wild goose chase could be a serious setback. It's a risky decision, so we don't make it, and our hunches remain hunches forever.

But what if it weren't risky at all?

Nothing can be done about the "probability that the analysis will be fruitful"; that's fixed. But plenty can be done about the "number of days required to prepare and analyze data." Suppose a dataset could be built in 5 minutes and analyzed in under an hour. That would turn the expected value of speculative analysis sharply positive, and suddenly it would be a very good idea indeed to perform ad hoc analysis of all kinds.
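A quick sanity check of that claim, reusing the same illustrative numbers ($100K opportunity, $700 per analyst-day; the one-hour figure of 0.15 analyst-days is my rough assumption):

```python
# If building and analyzing the dataset takes about an hour (~0.15 of an
# analyst-day), the expected-value arithmetic flips.
SAVINGS = 100_000
DAILY_COST = 700

def expected_value(p_success, analyst_days):
    return p_success * SAVINGS - analyst_days * DAILY_COST

hour = 0.15  # roughly one hour of a working day (an assumed figure)

# The hunch only needs to pay off about 1 time in 1,000 to break even.
break_even = hour * DAILY_COST / SAVINGS
print(f"break-even probability: {break_even:.2%}")

for p in (0.01, 0.05, 0.25):
    print(f"p = {p:.0%}: EV = ${expected_value(p, hour):,.0f}")
```

Even a one-in-a-hundred long shot has a positive expected value at that speed, which is why fast dataset construction changes the decision entirely.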

And that’s good news. Because there is a ton of useful data floating around the average company that nobody ever looks at. Top of the list? Invoice-level detail, from which all kinds of interesting conclusions can be drawn. Try this experiment: acquire invoice-level (PxQ) data from a supplier with whom you have a contract. Dump it into an analysis dataset, and chart price point by SKU over time. Chances are, like most companies, you’ll find something very wrong, such as prices all over the map for the same SKU (ironically, sometimes this happens even if you have an e-procurement system that’s supposed to prevent it). If you have a contract, only one of those prices is correct; the rest are not, and represent money on the table that you can recover trivially.
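The experiment above can be sketched with nothing more than the standard library. This is a hypothetical illustration: the column names, sample invoice rows, and contract price are all invented for the example, not taken from any real dataset:

```python
# Flag SKUs billed at more than one unit price in invoice-level (PxQ) data,
# and estimate the overcharge versus an assumed contracted price.
import csv
import io
from collections import defaultdict

# Hypothetical invoice extract; in practice this would come from a file.
sample = """sku,date,unit_price,quantity
A100,2010-01-05,4.10,500
A100,2010-01-19,4.85,200
A100,2010-02-02,4.10,350
B200,2010-01-12,9.99,100
"""

prices = defaultdict(list)
for row in csv.DictReader(io.StringIO(sample)):
    prices[row["sku"]].append(
        (row["date"], float(row["unit_price"]), int(row["quantity"])))

CONTRACT_PRICE = {"A100": 4.10}  # assumed contracted price per SKU

for sku, rows in prices.items():
    distinct = sorted({p for _, p, _ in rows})
    if len(distinct) > 1:  # prices all over the map for the same SKU
        contract = CONTRACT_PRICE.get(sku)
        overcharge = sum((p - contract) * q
                         for _, p, q in rows if contract and p > contract)
        print(f"{sku}: {len(distinct)} distinct prices {distinct}; "
              f"recoverable vs. contract: ${overcharge:,.2f}")
```

Charting price point by SKU over time is one `groupby`-and-plot away in any analysis tool; the point is that the dataset itself takes minutes to build, not weeks.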

Of course, please don’t spend weeks or months on the exercise, because then it won’t pay off. Instead, find yourself a data analysis tool with which you can move quickly and efficiently — or a services provider who can use the tool efficiently for you (and thus make a contingency-based analysis worthwhile for both of you). Bottom line: if you can’t build a dataset by yourself, in minutes, you’ll end up underwater, just like the graph.

Next Installment: Crosstabs Aren't "Analysis"

Previous Installment: It's the Analysis, Stupid
