Monthly Archives: May 2018

What’s Your Supply Management Priority?

Supply Management Mastery is an elusive goal. As SI has been documenting for years, in order to master supply management, you have to manage a slew of Source to Pay processes as well as related Operational, Finance, and Risk processes.

But this is not easy when you consider the many steps involved in even source to pay. Spend Analysis. Opportunity Analysis. Requirements Definition. e-Negotiation. Strategic Sourcing Decision Optimization. Contract Negotiation Management. Catalog Creation. Guided Buying. Purchase Order Management. Invoice Management. Supplier Management. Risk Management. Product Management. And so on.

You have to master all of them, but you can't work on them all at once. You have to set priorities and narrow your focus to, at most, your top three. And even then, you might not be able to tackle all three if each would require a separate system.

So what’s your priority?

Spend Analysis gives you insights, but you have to be able to act on them. That requires e-Negotiation, SSDO, contract management, etc.

Opportunity Analysis goes beyond just spend to determine if your opportunities are spend related, supply base related, process related, or otherwise.

Requirements Definition helps crystallize organizational needs and helps the buyer zero in on what really matters. But those requirements then have to be translated into good contracts and statements of work.

e-Negotiation helps capture all of the back-and-forth between the parties so that the organization can build supplier profiles and take advantage of them, provided the organization has deep supplier master data management.

SSDO can find the optimal cost allocation across suppliers, products, and carriers and delivers an average savings year over year that exceeds 10%. But it requires deep models and lots of data. And where does that data come from? Typically from e-Negotiation.
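To make the idea concrete, here is a minimal sketch of the kind of allocation problem SSDO solves, stripped down to a single item with linear costs and per-supplier capacity limits. The supplier names and figures are invented for illustration; real SSDO engines use mixed-integer programming over far richer cost models (freight, tiers, discounts, carriers), but for this simplified case a greedy fill by unit cost is optimal.

```python
# Toy sketch of strategic sourcing decision optimization (SSDO):
# allocate demand for one item across suppliers to minimize total
# cost, subject to each supplier's capacity. Illustrative data only.

def allocate(demand, suppliers):
    """suppliers: list of (name, unit_cost, capacity) tuples."""
    award, remaining = {}, demand
    # fill from the cheapest supplier upward until demand is met
    for name, cost, capacity in sorted(suppliers, key=lambda s: s[1]):
        if remaining <= 0:
            break
        qty = min(capacity, remaining)
        award[name] = qty
        remaining -= qty
    if remaining > 0:
        raise ValueError("insufficient capacity across suppliers")
    return award

suppliers = [("A", 10.0, 600), ("B", 9.0, 500), ("C", 11.0, 1000)]
award = allocate(1000, suppliers)
prices = {name: cost for name, cost, _ in suppliers}
total = sum(qty * prices[name] for name, qty in award.items())
print(award, total)  # B is filled first at 9.0, then A covers the rest
```

Even this toy version shows why SSDO needs the data e-Negotiation collects: without real unit costs and capacities per supplier, there is nothing to optimize over.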

Contract negotiation management is great for creating great contracts. But you need product details, SOWs, risk management and liability clauses, and other data.

Catalog management software is great, as long as you have a supplier management portal to manage the supplier the catalog comes from.

Guided buying is even better, but only if you have solutions to guide the buyer to that capture the majority of organizational spend. Guided buying that only works against an incomplete catalog is more of a frustration than a solution.

Purchase Order Management can eliminate a lot of paper, provided there are catalog, sourcing, etc. systems to integrate with to auto-generate those POs on buyer actions.

Invoice Management systems are great, as long as you have POs, contracts, goods receipts, and other documents to m-way match against! Otherwise, they just collect e-paper that still has to be manually reviewed. (And in the average organization, that still typically results in them being printed.)
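The "m-way match" above is easiest to see in its most common form, the three-way match of invoice against purchase order and goods receipt. The sketch below is a simplified illustration with made-up field names, not any particular vendor's matching engine; real systems add tolerances per category, partial receipts, and more documents.

```python
# Simplified three-way match: invoice lines checked against PO
# quantities/prices and goods-receipt quantities. Illustrative only.

def three_way_match(po, receipt, invoice, price_tol=0.01):
    """po/invoice: item -> (qty, unit_price); receipt: item -> qty.
    Returns a list of exception strings needing manual review."""
    exceptions = []
    for item, (inv_qty, inv_price) in invoice.items():
        if item not in po:
            exceptions.append(f"{item}: no matching PO line")
            continue
        po_qty, po_price = po[item]
        received = receipt.get(item, 0)
        if inv_qty > received:
            exceptions.append(f"{item}: invoiced {inv_qty} > received {received}")
        if abs(inv_price - po_price) > price_tol * po_price:
            exceptions.append(f"{item}: price {inv_price} vs PO {po_price}")
    return exceptions

po = {"widget": (100, 2.50)}
receipt = {"widget": 90}
invoice = {"widget": (100, 2.75)}
print(three_way_match(po, receipt, invoice))  # over-billed qty and price
```

The point of the post stands out in the code: without the `po` and `receipt` documents, there is nothing to match against, and every invoice falls through to manual review.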

Supplier Management is great for managing information, relationships, and performance, provided there are networks and portals to collect the data from, and internal systems to create and manage scorecards that define performance improvements.

Product Management is key to understanding the product and category dynamics, but then you need category management strategies to map to.

And, these days, instantiations and realizations of risk can wipe out the savings from 10 sourcing projects, so risk management is paramount, but detecting and monitoring for risks requires a slew of systems internal and external and lots of data.

In other words, every system is great, but generally only if you have one or more systems to collect the data it runs on or supplement key functionality.

Which again raises the question: what are your priorities? Otherwise, you'll never know where to start.

GDPR – still avoiding the problem? (Part V)

Today’s guest post is from Tony Bridger, an experienced provider of Procurement Consulting and Spend Analysis services across the Commonwealth (as well as a Lean Six Sigma Black Belt) who has been delivering value across continents for two decades. He is currently President of UK-based TrainingWorx Ltd, a provider of a wide range of Procurement and Analytic business training programs (inc. GDPR, spend analysis, project management, process improvement, etc.) and focussed short-term consulting solutions. Tony can be contacted at

In our last post we noted that those with extensive risk management experience know that avoidance is a key strategy for risk minimisation.

We also noted that this may well be a very feasible option for those analytics suppliers outside of the European Union.

The GDPR actively supports the anonymisation approach:

“The principles of data protection should … not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable. This Regulation does not therefore concern the processing of such anonymous information, including for statistical or research purposes.”

Removing or replacing data elements also satisfies another element of the Regulation – pseudonymisation:

“the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.” (Article 4).

Credit card or card account numbers can be used to identify a person – many card systems encrypt or hash the card number if expense managers are used. Once again, it pays to do the data homework.
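As a rough sketch of what the Article 4 definition looks like in practice, the snippet below replaces a card number with a keyed hash (HMAC), so the token can no longer be attributed to a person without the key, which is kept separately under its own technical and organisational controls. The key and card number are invented for illustration; this is one common pseudonymisation pattern, not a claim about how any particular card system does it.

```python
# Pseudonymise card numbers with a keyed hash (HMAC-SHA256).
# The key must be stored separately from the dataset (Article 4).
import hashlib
import hmac

SECRET_KEY = b"kept-separately-not-in-the-dataset"  # illustrative key

def pseudonymise_card(card_number: str) -> str:
    # same card always maps to the same token, so spend can still be
    # grouped per card without the dataset revealing the card number
    digest = hmac.new(SECRET_KEY, card_number.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

token = pseudonymise_card("4111111111111111")  # test card number
print(token)
```

Because the mapping is deterministic, analytics on the pseudonymised data still works (grouping, trending), while re-identification requires the separately held key.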

The salvation for many spend analytics providers is to encourage the client to set up data extract routines that eliminate these types of personal data.

However, that still leaves us with the less easily manageable data component of personal data buried within invoice line descriptions or other ERP free text fields.
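A first automated pass over those free-text fields can at least flag the obvious cases before the line-by-line manual review. The sketch below scans invoice descriptions for structured personal-data patterns (card numbers, e-mail addresses); the patterns are illustrative and will not catch names or less structured personal data, which is exactly why the manual check remains.

```python
# Rough first-pass scan of free-text invoice descriptions for
# structured personal-data patterns. Illustrative patterns only;
# names and unstructured personal data still need manual review.
import re

CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")          # 13-16 digit runs
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")    # e-mail addresses

def flag_lines(descriptions):
    """Return indices of lines that look like they contain personal data."""
    return [i for i, text in enumerate(descriptions)
            if CARD.search(text) or EMAIL.search(text)]

lines = [
    "Laptop stand x2",
    "Reimburse J. Smith, card 4111 1111 1111 1111",
    "Contact bob@example.com re: delivery",
]
print(flag_lines(lines))  # expect [1, 2]
```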

Once GDPR becomes recognised as the “new paradigm”, analytics providers are likely to claim that they have all sorts of (chargeable) capability to remove or anonymise this data. This is more likely to revert to a line-by-line manual check as opposed to anything technically complex or ground-breaking.

There is nothing intrinsically wrong with this approach. It may be time consuming, but it will follow the usual pattern of spend analytics data management. The first stage of the dataset build is historical data construction. If all historical spend data is checked and anonymised, then the monthly refresh data is much lower in volume – and the patterns where personal data may exist will have already made their presence known.

Vendors and clients are therefore taking all reasonable precautions with the data. If the data can have all personal elements removed, then GDPR does not apply. The “shotgun approach” for web providers is to use full access encryption…but this could be prohibitive in cost terms.

So, what is the risk? Spend data with personal data content has to align with the Regulation both within the EU — and transferring data outside of the EU. The use of surgical data techniques can reduce the risk and perhaps even reduce the data to non-personal in nature.

The alternative option is to leave the personal data and adhere to the range of controls that are required to manage that information. We have yet to cover these controls in any detail.

As we will discuss in a later post, staff, employee, and personal data may also be subject to consents – a considerably more complex issue under GDPR. With new elements like the right to be forgotten, it may be simpler just to remove the data components.

No one said this was going to be easy.

Thanks, Tony.

GDPR – avoiding the problem? (GDPR Part IV)

Today’s guest post is from Tony Bridger, an experienced provider of Procurement Consulting and Spend Analysis services across the Commonwealth (as well as a Lean Six Sigma Black Belt) who has been delivering value across continents for two decades. He is currently President of UK-based TrainingWorx Ltd, a provider of a wide range of Procurement and Analytic business training programs (inc. GDPR, spend analysis, project management, process improvement, etc.) and focussed short-term consulting solutions. Tony can be contacted at

For those with extensive risk management experience, avoidance is a key strategy for risk minimisation.

For those analytics suppliers outside of the European Union, this may well be a very feasible option. If we assume that spend data could contain P Card holder names, personal data in staff reimbursements and personal details in invoices – what are the avoidance options?

A myriad of options exists that analytics providers can deploy to avoid the personal data problem in risk terms. The first, and most obvious, option (and the least acceptable) is to refuse to take data from clients that may contain personal data. However, the old adage applies that “some will always take the business, and someone will always do it cheaper”. It’s also not tenable under the GDPR: the fact that the client says the data is “personal data free” may not stand up if a breach occurs.

There is an old English adage that simply states that “you can’t eat a horse at one sitting”. If we start to break the problem up into manageable components, the potential issues become less intimidating.

One of the major areas of concern is P Card data. In the UK, many local councils and authorities publish their P card data for public access (in Excel files) on their websites – but with no personal cardholder data. It really focuses on the core question – does the client really need the name of the cardholder/Card number – or is the supplier spend the key focus? If the card data is extracted post reconciliation (if an Expense Manager is used for card management), the data will contain a cost centre. If the cost centre structure is loaded as a hierarchy it can be relatively easy to see where spend is occurring within the organisation – but not who incurred the cost.
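The cost-centre roll-up described above can be sketched in a few lines: P-Card transactions carrying only a cost centre (no cardholder name or card number), aggregated up a cost-centre hierarchy so you can see where spend occurs without knowing who incurred it. The hierarchy and amounts below are made up for illustration.

```python
# Roll P-Card spend up a cost-centre hierarchy without any
# cardholder data. Illustrative hierarchy and transactions.

hierarchy = {            # child cost centre -> parent unit
    "CC-1010": "Operations",
    "CC-1020": "Operations",
    "CC-2010": "Finance",
}

transactions = [         # (cost_centre, supplier, amount) - no cardholder
    ("CC-1010", "Staples", 120.00),
    ("CC-1020", "Staples", 80.00),
    ("CC-2010", "Dell", 950.00),
]

def rollup(transactions, hierarchy):
    """Total spend per parent unit; unknown cost centres are bucketed."""
    totals = {}
    for cc, _supplier, amount in transactions:
        parent = hierarchy.get(cc, "Unassigned")
        totals[parent] = totals.get(parent, 0.0) + amount
    return totals

print(rollup(transactions, hierarchy))  # Operations: 200.0, Finance: 950.0
```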

The second key area is staff reimbursements. Many companies still set staff up as vendors to pay reimbursements. This spend too is quite insightful and may deliver several sourcing opportunities. However, it still leaves the personal data in the file that may be extracted from the ERP. For this element of the data, it may be far simpler to create a data mechanism that identifies those vendor master entries on the client ERP with a data flag of some kind. For statutory tax reporting purposes, many corporate clients are required to account for reimbursements for staff (for taxation purposes, e.g. Fringe Benefits). So, if the client can remove staff names or attributable identifiers, then that will eliminate or avoid the data issue. In effect, there is the possibility that the problem can be eliminated in the client extract, but you must ask the client more about how they are extracting their data and guide them as to how they can better manage it for GDPR compliance, to prevent getting data you don’t want.

In many respects, spend analysis providers have had it really easy up until now. They simply give the client a data extract request, the client provides what they can, and the provider builds the dataset. GDPR for EU clients makes this process less simple from 25th May. Why?

To be continued!

Thanks, Tony.

One Hundred and Fourteen Years Ago

The United States took over, and began, construction of the Panama Canal. Then, a little over ten years later, it was completed, and for the first time ships could travel between the mid-Atlantic and mid-Pacific in less than a day, a journey that previously took at least 10 days, and typically two to three weeks (depending on how fast the ship was and the weather), as the canal saves ships a 7,872 mile voyage.

It revolutionized ocean freight and although we now take it for granted, it was a historic achievement.

Is TCO a No Go Without Optimization?

At this point in time, very few people are still in the stone ages of Supply Management and buy on price per unit (PPU) alone, the first level of sourcing value. However, there are still a number of buyers in a number of organizations who buy solely on landed cost, or total cost of acquisition (TCA): the sum of price per unit, transportation, duty, tariff, temporary storage, and other costs that are incurred from the time an order is placed until the time the product is received. These organizations are still in the dark ages of Supply Management and need to find the light very, very quickly (especially with Trump Nation and Brexit on the way). And while most modern Supply Management organizations attempt to buy on total cost of ownership (TCO), the third level of sourcing value, not all succeed.

TCO is the most commonly used metric today by analysts, consultants, vendors, and (I’m sorry to say) bloggers alike. It is designed to be a comparative cost metric that quantifies the overall cost of each acquired unit from a direct, indirect, and quantifiable market perspective, taking a broader look at the cost of a product across acquisition, utilization, and delivery. In addition to the landed costs, it typically also considers indirect utilization, supplier switching, and transaction costs, as well as cost adjustments for quality, waste, and brand power (if your supplier has a brand that increases the selling price of the product you create with the component).

TCO is designed to capture the ‘true cost’ of a product (or service) from a supplier and does a much better job of helping you to compare apples-to-apples when determining the best buy for your organization. And even though it’s not the ultimate metric, as that’s total value management (TVM), the next level (and pinnacle) of sourcing value measurement, you cannot apply TVM until you have mastered TCO (which is a big component of TVM just like total cost of acquisition is a big component of TCO), and you can’t master TCO until you can model it.
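A toy model makes the apples-to-apples point concrete: a supplier that wins on price per unit can lose once landed and utilization costs are modelled. All figures below are invented for illustration, and the cost components are a simplified subset of what a real TCO model would carry.

```python
# Toy TCO model: landed cost plus per-unit utilization adjustments.
# Illustrative figures only; real models carry many more components.

def landed_cost(s):
    return s["unit_price"] + s["freight"] + s["duty"]

def tco(s):
    # landed cost plus expected defect/rework cost and switching
    # cost amortized to a per-unit figure
    return landed_cost(s) + s["defect_cost"] + s["switching_cost"]

a = {"unit_price": 10.00, "freight": 1.00, "duty": 0.50,
     "defect_cost": 0.40, "switching_cost": 0.00}
b = {"unit_price": 9.50, "freight": 1.40, "duty": 0.70,
     "defect_cost": 1.80, "switching_cost": 0.25}

# b wins on price per unit, but a wins once TCO is modelled
print(round(landed_cost(a), 2), round(tco(a), 2))
print(round(landed_cost(b), 2), round(tco(b), 2))
```

And this is exactly why modeling matters: the "cheapest" bid on paper is not the cheapest buy, and without a model (and an optimizer over it) you cannot see that, let alone act on it.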

But most sourcing solutions don’t let you model TCO. And the few that do don’t let you optimize it. That’s why it’s important, when selecting a strategic sourcing solution, to get an optimization-backed solution with support for deep cost models and, preferably, bills of material. They might still be few and far between, but a few more hit the market in the past year, and we expect more will be coming due to the power, and utility, of such solutions.

So is TCO a no-go without optimization? Not necessarily, but it sure is a lot harder to do without optimization.