And even if I didn’t know the start and end dates, such dates are easily deducible by recent blog activity. On the day ISM started, daily hits dropped by about a third. Now that ISM has been over for a day, hits are up. Way up – by over 50% relative to the norm as the hordes of practitioners lost in the glaring lights of Vegas find their way back onto the internet and back to their favorite blogs. But I am curious – did any other bloggers notice such a significant effect on blog hits from this event? And has any other event had such a significant effect on sourcing and procurement blog readership?
Daily Archives: May 10, 2007
Spend Analysis: What Purchasing.com Got Wrong
Purchasing’s recent article, “How to Select a Sourcing Strategy,” wasn’t the only article that just didn’t make the cut in my book. Their “ABCs of Spend Analysis” article missed the point as well. However, knowing that Eric Strovink of BIQ would also be taken aback by this article, I invited him to shed some light on what the article missed. So, without further ado, here’s Eric’s guest post on What Purchasing.com Got Wrong.
Occasionally an article crosses my desk that seems well-written and insightful, such as
Purchasing.com’s “The ABCs of Spend Analysis” — but only if I’m willing to accept
assumptions with which I can’t agree.
“A: Acquire the data skills”
Wait, stop right there. In my view, it shouldn’t be necessary to
“acquire data skills” in order to manipulate and report on spend data.
This limits usage to a fraction of the business users who otherwise could
deeply improve their understanding of what’s going on. Any requirement to
dump data out of a spend analysis system and import it into Microsoft
Access or any other database management system is a glaring indictment
of the spend analysis system itself. It’s supposed to be “spend analysis,”
for goodness’ sake, so where’s the “analysis”? Similarly, a requirement
for SQL skills or other IT expertise in order to construct a report
is just as much an indictment of the spend analysis system.
We should not allow tool limitations to dictate that business users
must become IT experts in order to analyze their data. That’s like saying
drivers must become mechanics before they can use the interstate highway system.
“B: Bring the data together”
One could hope that “ETL (Extract, Transform, Load)” would not be
the theme of this page, but of course it is. In fact, the only letter
that’s relevant in this tired acronym is “T” (for “Transform”). If
you can’t load data into your spend analysis system, then find one
that makes it easy. If you can’t dump data out of your ERP or accounting
system(s) in some reasonable flat file format (like .csv), you didn’t
try very hard. Every accounting system I’ve dealt with in the last
four years, old or new, has a perfectly reasonable and simple method
for dumping its data, almost always a method that requires no IT
assistance at all.
Transformation is necessary in order to coerce data from incompatible
systems into a common format. A good transformation tool should be able
to move any field from one column to any other; create new fields;
eliminate fields; and create any output field as a function (including
string, math, and logical functions like “IF”) of any number of
input fields. It should be able to save the transform and apply it
again to subsequent data loads. And the transformation tool should be,
you guessed it, operable by ordinary business users, not just by IT types.
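To make the point concrete, here’s a minimal sketch (in Python, with made-up field names, rules, and sample data) of what a transformation like the one described above can look like when it’s declared as data rather than written as SQL or ETL code:

```python
import csv
from io import StringIO

# A transform is just a list of (output_field, function-of-row) pairs.
# Declaring it as data means it can be saved and reapplied to next
# month's extract unchanged -- no database or IT work required.
TRANSFORM = [
    # move/rename a column, normalizing it along the way
    ("supplier",   lambda r: r["Vendor Name"].strip().upper()),
    # coerce a text amount into a number
    ("amount_usd", lambda r: float(r["Amt"])),
    # an "IF"-style logical function of an input field
    ("category",   lambda r: "OFFICE" if "STAPLES" in r["Vendor Name"].upper() else "OTHER"),
    # any field not listed here (e.g. the raw ledger code) is eliminated
]

def apply_transform(rows, transform):
    """Coerce raw rows from one system into the common spend format."""
    return [{out: fn(row) for out, fn in transform} for row in rows]

# A raw .csv extract, as it might arrive from an accounting system
raw = list(csv.DictReader(StringIO(
    "Vendor Name,Amt,GLCode\n"
    "Staples Inc ,120.50,6100\n"
    "Dell,899.00,6200\n"
)))

clean = apply_transform(raw, TRANSFORM)
```

The point of the sketch: every capability listed above (move a field, create a field, eliminate a field, compute an output as a string/math/logical function of inputs) fits in a saved mapping that a business user could maintain.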
“C: Change the way you source”
Wait, should we just plow ahead and start sourcing? It turns
out that accounts payable-level spend analysis doesn’t really
show you very much, and this section of the article reinforces
the point. “We realized we were spending more with Supplier K
than we had previously thought, and this gave us more leverage
in negotiations.” OK, but do you know whether contract terms
were met? Was the supplier overcharging? What were the exact
buy points and quantities with Supplier Q for commodity X? Why
didn’t a contract with Supplier Y result in the savings we projected?
Problem is, an A/P level cube only peels back the first layer
of the onion. You’ve gained a reasonable idea of what was bought,
who bought it, and who supplied it, and that’s important. However,
spend analysis is an iterative process of first identifying macro
behaviors, and then zooming in using micro analysis on a
commodity-specific basis. The high level cube gives you an
indication of what might be wrong — too many suppliers, or
too few; too high a spend rate given [number of employees] or
[size of business]; too much off-contract spend. But that’s
all it gives you. You don’t really know, for example, whether
off-contract spend (Fred down in Order Entry buying a Dimension
800 from Dell) is an inferior price point to the company’s VAR
contract for ThinkPads — or whether Fred actually got a better deal.
If the high-level cube hints at a possibility for cost reduction,
should you run right out and start running sourcing events? Maybe
not. For one thing, it’s sometimes difficult to determine
whether high spend is a demand issue or a supply issue, and
sourcing won’t touch the demand side. Sourcing events are
politically disruptive, can take many months to implement,
and can upset long-term supplier relationships unnecessarily.
It’s entirely possible that quietly confronting an incumbent
supplier with a detailed analysis of buy patterns from
invoice-level data can not only change behavior immediately,
but also return money to the bottom line from overcharges. And,
it certainly will tell you if you have a demand problem. If,
at the end of the day, sourcing is still required, fine; but in
many cases it’s not. Few incumbent vendors want to go through
a sourcing exercise and potentially lose the business; they’d
rather meet Fred’s price point.
Thanks Eric! I could not have said it better myself.