Daily Archives: February 5, 2008

The 6 Days of X-asperation: Day 3 – Questions to ask your Spend Analysis Vendor

Just like we did in the X-emplification series, we’re going to continue with Spend Analysis as we tackle the generic questions that you should be asking every vendor, and the types of answers you should be expecting.

1. What do I have to do to get a good handle on how to make effective use of this technology, and for an organization of my size, how long is it going to take?

You need to be aware of the data that you need and where it is located. You should also have a good handle on how long it’s going to take to get permission to access the data and how long it’s going to take to classify the data. But most importantly, you need executive support to get the various subsidiaries and business units to elevate the priority of your request.

To answer this question, you first need to know how many systems the organization is using, as there are likely multiple accounting systems involved in any reasonably large organization. The data feed from each system will need to be coerced (or transformed) into a common, all-inclusive record format. At a minimum, you will need supplier, cost center, GL classification, currency, amount, and any relevant dates. You should also include item description, PO number, legal entity, business unit, country, payment method, and any other descriptive fields that are available to you. Finally, be sure to get any ancillary data that links to the records, such as the supplier or GL master, as this data can provide additional information, such as MWBE status, as well as the “names” that go with the non-descriptive “codes” commonly used by accounting systems. Note that a good spend analysis tool will provide a scriptable translation facility (the “T” in ETL) that makes the transformations of all of your data into a common record format easy to define and repeat.
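As a rough sketch, the per-system transformation into a common record format might look like the following; the source column names (VENDOR_NAME, CC, GL_ACCT, and so on) and the common field list are illustrative assumptions, not any particular system’s actual schema:

```python
import csv

# The common, all-inclusive record format (illustrative field list).
COMMON_FIELDS = ["supplier", "cost_center", "gl_code", "currency",
                 "amount", "invoice_date", "description", "po_number",
                 "business_unit", "country"]

# Hypothetical column mapping for one source system's A/P dump;
# each accounting system gets its own mapping into the common format.
SYSTEM_A_MAP = {
    "VENDOR_NAME": "supplier",
    "CC": "cost_center",
    "GL_ACCT": "gl_code",
    "CURR": "currency",
    "AMT": "amount",
    "INV_DT": "invoice_date",
}

def to_common(row, mapping):
    """Coerce one source record into the common record format."""
    out = {f: "" for f in COMMON_FIELDS}
    for src, dst in mapping.items():
        out[dst] = row.get(src, "").strip()
    return out

def transform(in_path, out_path, mapping):
    """Run one system's dump file through its mapping, repeatably."""
    with open(in_path, newline="") as f_in, \
         open(out_path, "w", newline="") as f_out:
        writer = csv.DictWriter(f_out, fieldnames=COMMON_FIELDS)
        writer.writeheader()
        for row in csv.DictReader(f_in):
            writer.writerow(to_common(row, mapping))
```

Because the mapping is just data, re-running the transform when a corrected dump arrives is a one-line call, which is the “easy to define and repeat” property you want from the tool’s translation facility.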

With cooperation from the business units, and allowing for some corrective feedback, it should only take a few days to acquire the data feeds and derive the necessary transformations. The creation of an initial dump script for each system, once you locate the person who understands how the data is organized in that system, shouldn’t take more than a day or so. In practice, the actual wall time may be somewhat longer, as there will always be a business unit with “something better to do” than dump data for you, and that’s why you start by getting executive support: to ensure that the wall time doesn’t drag on unnecessarily.

Once the data is extracted and transformed, it shouldn’t take more than a few hours to load the data into the spend analysis tool, derive the key dimensions such as supplier, cost center, GL code, date, etc., and get your first view of spend. The view will not be a perfect one, because you still need to build the commodity dimension that defines what was actually purchased, but it will still have some value – as you will have a picture of total spend by supplier, cost center, etc.
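That first view of spend is essentially a roll-up along a single dimension. A minimal illustration, using made-up records in the common format (field names assumed):

```python
from collections import defaultdict

# Illustrative records already coerced into the common format.
records = [
    {"supplier": "Acme",   "cost_center": "100", "gl_code": "6000", "amount": 1200.0},
    {"supplier": "Acme",   "cost_center": "200", "gl_code": "6000", "amount": 800.0},
    {"supplier": "Globex", "cost_center": "100", "gl_code": "7000", "amount": 500.0},
]

def spend_by(dimension, rows):
    """Roll up total spend along one dimension (supplier, cost_center, ...)."""
    totals = defaultdict(float)
    for r in rows:
        totals[r[dimension]] += r["amount"]
    return dict(totals)
```

Calling `spend_by("supplier", records)` gives total spend by supplier, and the same function with `"cost_center"` or `"gl_code"` gives the other first-pass views, even before the commodity dimension exists.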

Fortunately, data mapping and dimension familying are well-understood exercises. The secret sauce of mapping is “map the GL codes … map the suppliers … map the GL codes and suppliers”, and this can be done by most organizations in just a few days, and quicker still if you take an 80-20 approach and begin by classifying the top 1000 suppliers and top 1000 GL codes. (And you don’t need an “automatic classifier” — which still needs to be checked anyway as “IBM” could be International Business Machines or Iggy’s Beachside Market — to do it!)
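The “map the GL codes … map the suppliers” recipe can be sketched as a small rules engine, with an 80-20 helper to pick which suppliers or GL codes to classify first. The rule tables below are purely illustrative:

```python
from collections import Counter

# Hypothetical mapping rules: generic GL-code rules first, then
# supplier-specific rules that override them for the exceptions.
GL_RULES = {"6000": "Office Supplies", "7000": "IT Hardware"}
SUPPLIER_RULES = {"Iggy's Beachside Market": "Catering"}

def classify(row):
    """Map one record to a commodity: supplier rules beat GL rules."""
    return (SUPPLIER_RULES.get(row["supplier"])
            or GL_RULES.get(row["gl_code"])
            or "Unclassified")

def top_n(rows, key, n=1000):
    """80-20 helper: the n highest-spend values of a field (e.g. suppliers)."""
    spend = Counter()
    for r in rows:
        spend[r[key]] += r["amount"]
    return [k for k, _ in spend.most_common(n)]
```

Because the rules live in plain tables, re-classifying after a correction is just editing an entry and re-running, which is exactly why a rules-driven approach is quick to iterate and easy to audit.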

But remember, this is just the first A/P spend cube. If you really want to derive maximum value from your spend analysis tool, you’ll want to build a lot of different spend cubes using the data that’s lying around your organization, starting with PxQ invoice data that’s just begging to be analyzed. That’s why you’ll want to develop in-house capability to build cubes, so your analysts can use the spend analysis system to perform dozens of specialty analyses. That’s why it’s important to make sure your spend analysis system is accessible to your analysts, because relying on third parties, or on the spend analysis system vendor, to build cubes for you quickly becomes expensive in a many-cubes scenario.

2a. How much functionality is my organization realistically going to be using in 12 months?

It comes down to the tool and the user. If it’s a real spend analysis tool, versus just a spend reporting tool tacked on to a data warehouse, your power users will be using most of the functionality almost immediately, while regular users just use the cubes (yes, that’s cubes in plural) and reports prepared by the power users. If it’s simply canned reporting on a data warehouse, you won’t be using any of it in a year as you’ll have identified and cleaned up all of the low-hanging fruit within 3 to 6 months.

2b. How much functionality do I really need?

The ability to build your own cubes from arbitrary data sources, to classify and re-classify the data on the fly with a rules engine, to create ranged dimensions, and to slice the data any way you see fit. Canned reports, data enrichment, and other peripheral features, while nice, are mostly just sales tools and “icing” that you’ll outgrow quickly.
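A ranged dimension simply buckets a continuous field, such as invoice amount, into bands you can slice on. A minimal sketch, with band edges that are illustrative only:

```python
# Band edges for a ranged dimension over invoice amount (illustrative).
RANGES = [(0, 1_000, "< 1K"),
          (1_000, 10_000, "1K-10K"),
          (10_000, 100_000, "10K-100K"),
          (100_000, float("inf"), "> 100K")]

def range_bucket(amount):
    """Assign an amount to its band; credits/reversals fall outside."""
    for lo, hi, label in RANGES:
        if lo <= amount < hi:
            return label
    return "negative"
```

Tagging every record with its bucket turns “invoice size” into a first-class dimension, so you can, for example, see what fraction of transactions are small-dollar invoices that cost more to process than they’re worth.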

2c. And how does this functionality solve my #1 pain today, which is X?

If you’re looking at spend analysis and have never built an A/P spend cube, your number one pain point is probably that your spend is out of control. Thus, you want a solution that’s going to do more than just build a few canned reports, because otherwise you’ll be in the exact same position a year after implementing the system: still spending with vendors you thought you had terminated, still carrying off-contract spending you don’t know about, and still spending across business units with the same vendor in a non-amalgamated, or non-leveraged, way that was never identified to begin with.

If you already have an A/P spend cube, the number one pain point is likely to be that your vendors are not performing to contract or that your contracts are not returning the savings that were predicted. In order to get to the bottom of these issues, you have to understand both the demand side and the invoice side, which requires building cubes with more detailed, commodity-specific data. You may find, for example, that the office supplies contract that was so carefully negotiated has been neatly side-stepped by the vendor through unreasonable pricing of off-contract items. You may find that your “best price” contract for PCs shows an absolutely flat 12-month price curve for the exact same SKU, even though you know that PCs always depreciate 25-30% over such a time period. And so on.
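Once commodity-level data is in a cube, spotting that flat PC price curve is a simple per-SKU check. The threshold below (flagging anything that drops less than half the expected depreciation) is an illustrative assumption:

```python
# Flag SKUs whose price stays flat even though the category (e.g. PCs)
# is known to depreciate 25-30% per year; the cutoff is illustrative.
def suspiciously_flat(monthly_prices, expected_annual_drop=0.25):
    """monthly_prices: ~12 unit prices for one SKU, oldest first."""
    if len(monthly_prices) < 2:
        return False
    actual_drop = 1 - monthly_prices[-1] / monthly_prices[0]
    # Well short of the expected depreciation -> worth investigating.
    return actual_drop < expected_annual_drop / 2

flat_sku = [1000.0] * 12                          # same price all year
fair_sku = [1000.0 - 25 * m for m in range(12)]   # steady depreciation
```

Running the check over every SKU under a “best price” contract surfaces the candidates for a pointed conversation with the vendor.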

Either way, you need a solution that’s going to do more than just identify the low hanging fruit, because otherwise you’ll be in the exact same position a year after implementing the system. You need to know that it has the flexibility to cube, slice, and dice spend any way you can imagine so that you can find savings opportunities above and beyond those that can be identified with canned reports and “just one” cube.

3. How much training is my team going to require to effectively use the software? How long is it going to take them to absorb this training?

Your team needs to be shown how to build a basic spend cube, import data, map data using rules and overlays, create ranged and/or rolled-up dimensions, create reports and graphs and maps, and drill down by multiple dimensions. Not all users will need all of this training, but your up-and-coming sourcing professionals and power users will. This training should take at least a week, and preferably two – where the second week includes guided mapping, cubing, and analysis of your data.

4. How much is this software REALLY going to cost me in the first year and each subsequent year?

Real spend analysis, versus just spend reporting tied to a spend data warehouse, is a relatively new offering. Expect to pay high five to low six figures a year for this functionality alone. If you’re also buying a data warehouse, or buying “automated classification” (if such functionality really exists), then expect to pay more. For on-demand solutions, or installed solutions priced with an on-demand model, maintenance should be (close to) zero; for other solutions, figure a higher maintenance cost than for e-RFx and e-Auction – closer to 20% than to 10%.

If we’re just talking pure analysis functionality, then installation should be free if it is on-demand, or be no more than a day of consulting if we’re talking behind the firewall or hosted ASP. However, if you’re also buying a data warehouse, then, depending on how many systems you have and how many transactions are in each system, and whether you want automated integration, it could be days, or weeks, or even months, to load and classify the data and get the automated integration paths up and running between all of the various systems. If we’re talking (real) enrichment using (real) third party data, add more time still.

If you turn to a third party or to the vendor for services, you should ensure that you can bring those services in-house in the future. Ideally, you should partner with a services vendor who will do “walk along” training, so that internal resources can become familiar with the system while real work is progressing in parallel. Services vendors should offer “a la carte” pricing, not just fancy promises and a fixed (large) price.

Finally, you shouldn’t be forced to sign a long-term contract with any software vendor, and you shouldn’t do so voluntarily until you’ve had a chance to really use the product. There should be a way (ask them!) to structure the contract such that you can stop using the software at any time. And, you should make sure that there’s a way to dump out and preserve all of the data you’ve organized with the spend analysis system, so that you can easily move that data to another platform, be it another spend analysis system or BI system or just a general-purpose database management system.

5. You say you care about your customers and that you are going to provide great service. Prove it!

Ask for references. Talk to them. If the vendor has an upcoming user meeting or conference, ask to go to it. Ask for examples of results their customers have recently achieved on the platform, and how they can help you achieve the same. But most importantly, ask them if they’ll help you with your initial pilot project at a reasonable consulting rate and see what kind of results they deliver – with their tool.

6. Can I take it for a test drive or a short term lease?

Considering that this software is usually either web-based or a fat client that runs on your desktop, there shouldn’t be any problem for your provider to set you up with a single instance, or copy, for use on a pilot project. They should be comfortable with you undertaking that pilot at a low consulting rate, equal to the cost of the consultant who guides you through it.

7. Can I buy it or implement it in pieces?

Just like you should buy the entire e-RFx or e-Auction tool functionality up-front, you should buy the entire analysis tool functionality up-front, but if the vendor also offers warehouse and automated classification and integration platforms, you should be able to buy, and add, them in pieces. (You might think that you need a data warehouse upon which to run a spend analysis tool, but just remember that most BI tools come with their own internal, basic, database functionality and that a good tool will allow you to import the relevant data dumps from each of your current databases, integrate them into a single cube, and run the reports you need.)

However, you should write the contract such that you can choose to drop add-on modules, services, or other functionality later, without penalty. Too many software and services contracts contain “poison pills” such as guaranteed services payments or guaranteed maintenance payments. Don’t sign them.