Category Archives: Spend Analysis

Efficient Sourcing In Marketing, Part I

Three years ago, CIPS and the IPA came out with their report on “Magic and Logic: Re-defining sustainable business practices for agencies, marketing, and procurement” in their attempt to change the game and get that sacred cow, the marketing budget, under control. It was an insightful report, as I noted in my two-part series on Magic & Logic (Part I and Part II), and a great first attempt at carving up the sacred cow.

Then, two years ago, Efficio entered the game with their paper on “The Creative Challenge: Driving Efficiencies in Marketing Procurement”. This report, which covered some of the key challenges involved in initiating collaboration between marketing and procurement, as well as some of the typical savings levers that can be used to negotiate savings anywhere from 3% to 50%, provided an 8-step approach to driving efficiencies in Marketing Procurement. As per my posts on The Creative Challenge (Part I and Part II), it was a good starting process and a great second attempt at serving that sacred cow on a platter.

Since then, I’ve been waiting for another paper to complete the trilogy and, hopefully, provide us with the ultimate approach to Marketing Procurement. And while it certainly isn’t the ultimate approach, Booz & Co.’s recent attempt, “Efficient Sourcing In Marketing”, is a good end to the trilogy. As noted in the introduction, the following all-too-common scenario speaks volumes about the sourcing side of marketing at large companies.

The large retail bank’s approach to buying marketing-related services and materials was typical. On direct marketing efforts, decentralized business units worked with advertising agencies of their choice — agencies usually chosen on the basis of demonstrated capabilities, their understanding of the nuances of the individual businesses, and the personal relationships they had built over time. The relative cost was hard to compare, as each of the bank’s business units negotiated its own agreements with its marketing partners. Pricing was usually project-based, with no standardization from one business unit to another, even when it involved universally used items, such as envelopes, mailing inserts, and postcards, or when units shared the same vendors. By ignoring these costs, which can represent a quarter of many companies’ total purchasing outlay, the company is leaving huge sums of money on the table — as much as 40% to 50% in some cases. This can easily mean tens of millions of dollars of savings at many large companies in the CPG, Pharmaceutical, or Automotive sectors that rely heavily on marketing. These savings can be reinvested in more successful campaigns, truly allowing marketing to do more with less when efficient strategic sourcing comes to the table, provided both departments collaborate to an unprecedented degree.

This will require adherence to a six-step process, which I’ll address in Part II, but the good news is that the payoff can materialize quickly. Often, merely creating more visibility into a supplier’s relationships across a firm and discussing the level of business with the supplier can elicit more favourable pricing. Furthermore, the appropriate identification of savings targets for different types of services can lead to rapid savings. The report gives an example of a CPG company that targeted 8% savings on creative services, 16% savings for less complicated services (that could be done in-house or by lower-cost resources), and an 18% savings through the adoption of a preferred set of enterprise-wide vendors. Overall, the company reduced cost by 42%, saving $10 Million on what was a $25 Million spend!
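To make the arithmetic concrete, here is a minimal sketch (in Python, with invented spend allocations, since the report doesn’t break out spend by category) of how per-category savings targets roll up into a blended savings figure:

```python
# Roll up per-category savings targets into a blended savings figure.
# The spend allocations below are invented for illustration; only the
# target percentages come from the report's CPG example.
def blended_savings(categories):
    """categories: list of (spend, savings_rate) tuples."""
    total_spend = sum(spend for spend, _ in categories)
    total_saved = sum(spend * rate for spend, rate in categories)
    return total_saved, total_saved / total_spend

categories = [
    (10_000_000, 0.08),  # creative services: 8% target
    (8_000_000,  0.16),  # less complicated services: 16% target
    (7_000_000,  0.18),  # preferred enterprise-wide vendors: 18% target
]

saved, rate = blended_savings(categories)  # blended rate depends on the mix
```

Whatever the actual mix of spend was, the blended rate is just a spend-weighted average of the targets, so it can never exceed the highest target; a 42% overall reduction therefore presumably reflects additional levers (consolidation, demand management) on top of these three rate targets.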


Purchasing 0.3

Is Purchasing Magazine trying to give me a heart attack? Isn’t it enough that they refuse to acknowledge the presence of Sourcing Innovation (which, as you know, is one of the few blogs that brings you real supply management content you can use day-in, day-out, six days a week, every week), a blog they dropped from their “News from the Web” feed years ago (when I first ripped apart one of their sloppy articles)? After reading a few of their recent articles, my blood is boiling!

That’s right! That bullcr@p that Spend Analysis is expensive (see last Thursday’s post) is just the tip of the iceberg. And even though many of the quoted individuals had good advice to share, in the end, Purchasing’s recent article on “Purchasing 3.0” is just as bad and filled with absurdities … which start on the first line! (If Purchasing had their way, we’d regress to Purchasing 0.3!)

Have you used Social Networking to build supplier relationships?
I Hope Not! Since all Facebook is good for is Facebook parties that result in “Million dollar homes being trashed” (Metro.co.uk) …
If you want to build drinking-buddy relationships, then yes, Facebook will work great … but what you want is productive and professional relationships where you can work together to make each of your businesses better.

Are you sure you’re using Excel effectively?
You can’t use Excel to manage your supply chain! How many fracking times do I have to say it? Spreadsheets are bad strategy, prevent innovation, and cost you billions! You’re better off using an etch-a-sketch like the dork in It’s All About the Pentiums (2:54 mark). (And just because it’s still all about the pentiums, baby, that doesn’t mean it should be!)

Do you, uh, Tweet?
Are you kidding me? Hasn’t Twitter Turned Too Many Into Twits already? It appears that Twitter has already made twits out of at least 3 in 10 students! The only things that should go “tweet” are Tweety Bird and Rockin’ Robin (Muppet Version).

With the prevalence of ERP systems in large companies, more purchasing professionals … should be focusing on developing advanced database skills.
Uhhm, no. Purchasing professionals should be focused on learning advanced data analysis skills. This is not quite the same as learning advanced database skills. Purchasing managers don’t need to know how to configure, manage, scale, back-up, restore, and replicate databases … that’s what DBAs are for. Purchasing managers need to know how to use today’s spend analysis tools, which require them to learn how to build and manipulate cubes through dimension-driven UIs, not how to optimize 4-level nested SQL statements … that’s what the tools do! (And frankly, even your average CS graduate would have a hard time optimizing 4-level nested SQL statements across multiple tables, if they could even write them in the first place!)

The article also promotes the new Microsoft Online Services
which will only work if everyone on the team is using a supported version of Windows. And even then, it might not work. (Furthermore, even though they claim that LiveMeeting works on Safari and Firefox on Mac, even if your system meets all the requirements listed, it often doesn’t.) Mac is around 15% of the market and growing, Linux is on the rise (especially in Netbooks), and a number of organizations still use AIX and UNIX based platforms (which could become popular again if thin-client desktops [e.g. SunRays] take off). Not everyone is in the Microsoft eco-system anymore. (And the majority of supply management systems are NOT built on .NET.)

And then the last paragraph indicates that mobile devices are the answer to requisition approval (to keep projects moving), commodity price updates, and procurement communication!
This is probably the most dangerous message of all, because now we’re in Yes, … but territory. Not all requisitions can be approved on a 3×5 screen. What if there are 20 (or more) line items? What if your system flagged 5 as off-contract? What if it’s an unusual request for a significant amount? You’re going to need more data to make the right decision than you’re ever going to fit on that itsy bitsy teeny weeny tiny Blackberry Storm or Curve. It’s one thing to approve a new laptop or mobile phone for an employee that needs it right away to continue working, but another to approve an order of 10,000 units of SKU XYZ123, when the contract is for ZYX321! Why is the order off contract? Oversight? Have requirements changed? Or is your supplier out of ZYX321 and you need an acceptable substitution right away? And what good is a commodity price update if you can’t see the history and the trend graphs? Unless you’ve already done the analysis and figured out that you should buy when it hits 75 or sell if it hits 100, because you’re hedging risk on the commodities markets, that update is useless. And communicating in 140-character tweets? That would just make you a Twit!
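If you were to encode that judgment, the gate in front of mobile approval might look something like this sketch (the thresholds, field names, and `mobile_approvable` helper are all mine, purely for illustration):

```python
# Illustrative gate: only simple, clean, low-value requisitions are
# routed to a mobile device; everything else waits for a full screen.
def mobile_approvable(req):
    return (len(req["lines"]) <= 3               # few line items to review
            and req["off_contract_lines"] == 0   # nothing flagged off-contract
            and req["total"] <= 5_000)           # not a significant amount

# A laptop for an employee who needs it right away: fine for mobile.
simple_req = {"lines": ["laptop"], "off_contract_lines": 0, "total": 1_800}

# 20 line items, 5 flagged off-contract, a large total: needs real review.
bulk_req = {"lines": ["SKU"] * 20, "off_contract_lines": 5, "total": 250_000}
```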

Let me finish by saying that I’m so glad that you, dear reader, are an educated, informed, and intelligent individual who would drop this blog from your feeds faster than a hot potato if I ever published anything as ridiculous as what Purchasing and other publications are getting away with these days!


Purchasing Gets it Wrong Again: Spend Analysis IS Cheap.

A recent article in Purchasing on “what $100K buys in spend analysis software” has me jumping up and down again (their 2007 article on the “ABCs of Spend Analysis”, which was beautifully dissected by Eric Strovink in What Purchasing.com Got Wrong, had me fuming for weeks). According to this new article, being able to analyze spend is critical (which it is), but it isn’t cheap and price tags start at $100K — and buyers may have to pay more for insight into new opportunities for sourcing and consolidation. WTF?!?!?!

Allow me to say that again. What the frack? It is cheap! Pricing starts at $36K/year for the most powerful spend analysis tool on the market. That’s significantly less than the $100K price tag they list. $64,000 less. (I guess that’s the real $64,000 question!) A one-year single-user license for BIQ is only $36,000. It includes unlimited utilization by your senior analyst and all of the new features described in their last press release, including nodal and transactional computed measures, dynamic reference filters, a super-fast 64-bit loader, and the ability to drill down on 50M transactions in real-time on your laptop. (You might need a quad-core with 16 GB of memory for that size dataset, but those are pretty cheap these days.) And, you can get a 100-user license for much less than $100K/year, even if you pay by the month with the option to quit at any time.

As usual, it’s obvious that Purchasing.com’s research consisted of a simple web search, product description screen-scrapes, and a quick call for pricing, as opposed to the in-depth web demos that I insist on before Sourcing Innovation will even acknowledge that a product exists. And the results are dismal. While Ariba, Bravo, CVM, Etesius, Global e-Procure, Ketera, SAP, and Zycus all have spend analysis solutions, they are not equal. Iasta’s is actually built on third parties (BIQ and Spend Radar), FieldGlass is limited to services, Insight is a services organization which, to the best of my knowledge, still uses third party tools, and big players like Emptoris (which just announced faster reloads and data warehouse restructuring time) and new SaaS players like Rosslyn Analytics (which are trying to take a cloud-based approach) are surely annoyed at being wholly (and unaccountably) ignored.

You’re better off starting with a Google search and visiting individual vendor sites than reading this article.

Once again.


Bravo: Analysis and Supplier Performance Management for Contract Compliance

Last month, I told you how BravoSolution Collaboratively Optimized Its Way onto the doctor’s Short List. Today, I’m going to discuss their (Spend) Analysis, Supplier Performance Management, and Contract Compliance Solutions to give you a broader view of their solution suite.

To get straight to the point: their spend analysis (console) solution, which takes a standard reporting-based approach and includes over 60 standard report templates, is nothing special. But their analysis administration tool, the Transformation Designer, is one of the most powerful administration interfaces I’ve seen in a web-based analysis solution. Most providers tout their “leading auto-cleansing, auto-classification, and auto-enrichment” solutions as if they’re the be-all-and-end-all, but those who truly understand spend analysis realize that you can’t auto-cleanse, auto-classify, and auto-enrich everything, no matter how many rules are in your repository or how many billions of transactions your provider has classified. (And those who fall for that line are lucky if mapping accuracy even approaches 80%.)

Every company is different, every department is different, every employee is different, and every transaction is different. That’s why you have 19 different representations of IBM in your supplier master. Furthermore, you don’t buy the same SKUs from the same suppliers with every order. So even if you had a “perfect” set of automatic mapping rules today, they’d be broken tomorrow. You have to continually manage and maintain your data or your reports will be useless. And Bravo’s Transformation Designer allows your data administrator(s) to do all that.

Bravo’s Transformation Designer allows you to select your data sources, define the raw data tables, capture the raw data fields, profile the data, and define custom mapping and transformation rules on the data before it populates your repository. You can also define a bevy of checks (null, range, date, acceptable values, duplicate, etc) and define your rules based on transformations (that can use substrings, calculations, and lookups). The rules can be layered, with higher priority rules taking precedence and lower priority rules kicking in when there are no higher priority rules. (So you can start with the classic “secret sauce” of map the vendors, map the GL codes, map the vendors + GL codes, and map the exceptions and have the rules applied in reverse order.)
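The layering behaviour can be illustrated with a toy rule engine (this is my own sketch, not Bravo’s implementation; the vendors, GL codes, and categories are invented):

```python
# Toy priority-layered mapping: layers are ordered highest priority first,
# so an exception rule wins over vendor+GL, which wins over GL, then vendor.
def classify(txn, rule_layers):
    """Each layer is a list of (predicate, category) pairs."""
    for layer in rule_layers:
        for predicate, category in layer:
            if predicate(txn):
                return category
    return "UNCLASSIFIED"

rules = [
    # exceptions (highest priority)
    [(lambda t: t["vendor"] == "IBM" and "laptop" in t["desc"], "IT Hardware")],
    # vendor + GL code
    [(lambda t: t["vendor"] == "IBM" and t["gl"] == "6100", "IT Consulting")],
    # GL code only
    [(lambda t: t["gl"] == "6100", "Professional Services")],
    # vendor only (lowest priority / broadest)
    [(lambda t: t["vendor"] == "IBM", "IT")],
]

txn = {"vendor": "IBM", "gl": "6100", "desc": "consulting engagement"}
```

Here `classify(txn, rules)` lands on the vendor + GL rule because it outranks the broader fallbacks; authoring the classic “secret sauce” just means writing the layers in reverse order of specificity.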

In addition to supporting your standard “knowledge base” of auto-classification rules (which includes mappings and families for tens of millions of suppliers and even more standard items), which you can use to start your mapping journey, it also supports automated text classification methods based on advanced statistical algorithms. A proper combination of all three rule types — knowledge base for standard vendor and GL code mappings, statistical rules for automated mapping of unrecognized transactions (that can be mapped with high statistical accuracy), and custom hand-coded direct mapping rules for the exceptions — will get you very high classification accuracy with very little work. Especially since you can use their rules engine to quickly identify exceptions and define direct mapping rules that take care of them. And any time you identify a mis-mapping, you can define a new rule to re-map it. (New rules are immediately added to an asynchronous mapping queue and the queue is processed continuously, which allows for near real-time updates. No waiting for the monthly refresh.)

The Analysis Console also works on their supplier performance data. BravoSolution is a mature provider of SPM, having offered a solution since 2001. While it might not be broader or deeper than any of the newer pure-play SPM solutions (SupplierSoft, Aravo, Hiperos, etc.), they have a history of successful implementations. (And more importantly, how deep does an SPM solution need to be anyway? As long as it captures data, calculates metrics, allows you to create month-over-month, quarter-over-quarter, year-over-year, and trend reports on the metrics, allows you to share that data with your supplier(s), and allows you to collaborate on action plans in a virtual collaboration environment, what else is critical to your average organization just starting on an SPM journey?) With regards to SPM, Bravo has your bases covered. It’s nothing fancy, but it will more than get the job done.

This brings us to Contract Compliance. Their solution can automatically load cleansed GPO contract data, normalize the data based on supplier families and parent-child company organization relationships, and extract line-items and SKUs. If you integrate with your purchasing system, the solution will automatically match purchases to contracts and flag exceptions. It also supports deep integration with your e-Procurement system and can be used to identify contracts, price levels, and exceptions before a PO is issued. But the best part is the deep integration currently being developed between the Analytics Console, SPM, and Contract Compliance modules. You’ll be able to analyze contract compliance against supplier performance at any time, over any time period, and see if you’re getting the value you expected from the contract — and then use this information at contract renewal / resourcing time.
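The purchase-to-contract matching step can be sketched as follows (the data, prices, and function names are hypothetical; the real solution matches on normalized supplier families and extracted line items):

```python
# Hypothetical compliance check: look up each purchase by (supplier, SKU)
# and flag items with no covering contract or an above-contract price.
contracts = {("Acme Corp", "ZYX321"): 12.50}  # (supplier, SKU) -> unit price

def check_purchase(supplier, sku, unit_price):
    contracted_price = contracts.get((supplier, sku))
    if contracted_price is None:
        return "OFF-CONTRACT"     # no contract covers this item
    if unit_price > contracted_price:
        return "PRICE-EXCEPTION"  # billed above the contracted rate
    return "COMPLIANT"
```

Run before the PO is issued, the same lookup is what would let an e-Procurement integration surface the right contract and price level up front rather than flagging the exception after the fact.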


Spend Analysis V: User-Defined Measures, Part 2

Today’s post is from Eric Strovink of BIQ.

Sometimes you want and need control over how (and when) measures are calculated. Such measures are termed “user-defined” measures.

As we saw previously, there are two kinds of user-defined measures:
(1) post-facto computations that are performed after transactional roll-up (the usual definition), which we’ll consider here, and
(2) those that are performed during transactional roll-up, which were covered in Part 1.

In the above example, the gray “Ref” columns are filtered on commodity count/spend in Q1 2003, and the normal column is filtered on commodity count/spend in Q1 2004. If we then additionally filter on four business units, we can see the quarter-on-quarter comparison for just those business units.

You can see that two filter operations are occurring every time the dataset is filtered: one for the regular filter above, and one for the reference filter, modified by non-conflicting filter operations. This “dynamic” reference filtering can be quite powerful, since the relationship between the two periods is now available at any filter position in the dataset.

Now, let’s add a post-rollup (“nodal”) computed measure that calculates the %difference between these columns. The code reads like this:

$%Diff$ = ($Amount$-$RefFilter4.Amount$)/$RefFilter4.Amount$ * 100;

Now, if we sort top down by %difference, we can see quite clearly the quarter-on-quarter difference sorted from worst to best, considering just the four cost centers above.

This percentage difference is available to all analysis modules, because it is calculated at every node, not just at the nodes that are currently being displayed.
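For readers who want the mechanics outside of BIQ’s syntax, here is a rough Python analogue of that nodal measure, computed at every node of a (wholly invented) hierarchy rather than only at the displayed ones:

```python
# For each node we hold the regular-filter amount (Q1 2004) and the
# reference-filter amount (Q1 2003), then derive %Diff at every node.
def pct_diff(amount, ref_amount):
    """Analogue of ($Amount$ - $RefFilter4.Amount$) / $RefFilter4.Amount$ * 100."""
    return (amount - ref_amount) / ref_amount * 100.0

# node -> (current amount, reference amount); the data is invented
nodes = {
    "Office Supplies": (90_000.0, 120_000.0),
    "IT Hardware":     (150_000.0, 100_000.0),
    "Travel":          (55_000.0, 50_000.0),
}

diffs = {name: pct_diff(cur, ref) for name, (cur, ref) in nodes.items()}
worst_to_best = sorted(diffs, key=diffs.get)  # ascending: worst decline first
```

Because the measure is evaluated at every node, a sort (or any other analysis module) can use it immediately at whatever level of the hierarchy you happen to be viewing.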

Next installment: Meta Dimensions; but I’ll take a few weeks off before diving back in!

Previous Installment: User-Defined Measures, Part I
