Virtual Procurement Centers of Excellence: Will We Ever Realize Them?

Three years ago we told you that Virtual Procurement Centers of Excellence were The Next Level of Complex Direct Procurement and that your sourcing platform should enable them.

But it’s three years later, and we still only have a handful of S2P platforms that can properly support bills of materials for direct sourcing. (In fact, you don’t even need all your fingers to count them!)

Add to this the fact that ERP integration is often minimal, support for BoM modification and should-cost modelling is limited, or there is no support for integrating price indices or market intelligence, and it’s still a pretty sorry state of affairs.

Especially since true value will only be realized with proper insight not only into bill of material costs, but also into what the bill of materials should look like. (Maybe the steel being used is inferior, the rare earth metal component could be reduced with a better design, etc.)

In other words, you need a platform that supports not only full ERP integration, BoM modelling and management, and deep should-cost modelling, but also up-to-date market intelligence. This should not be limited to commodity feeds (as these are not global or available for all commodities), but should also draw on community intelligence, especially around labour rates and energy costs in a region.
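
To make this concrete, here is a minimal, hypothetical sketch of what a BoM-driven should-cost roll-up looks like when commodity feeds and community-sourced labour and energy rates are wired in. All of the index names, rates, and BoM fields below are illustrative assumptions, not any particular vendor’s model.

```python
# A minimal, hypothetical sketch of a BoM-driven should-cost roll-up.
# Index names, rates, and the BoM structure are illustrative only.
from dataclasses import dataclass

@dataclass
class BomLine:
    part: str
    commodity: str        # e.g. "steel.hrc", "ndfeb.magnet"
    weight_kg: float      # material content per unit
    labour_hours: float   # direct labour per unit
    region: str           # where the part is made

# Market intelligence: commodity price feeds (USD/kg), assumed values
commodity_index = {"steel.hrc": 0.85, "ndfeb.magnet": 62.00}

# Community intelligence: regional labour and energy rates, assumed values
labour_rate = {"cn.east": 6.50, "mx.north": 5.10}     # USD per hour
energy_adder = {"cn.east": 0.12, "mx.north": 0.09}    # USD per labour hour proxy

def should_cost(line: BomLine, overhead_pct: float = 0.18) -> float:
    """Roll up material + labour + energy, then apply an overhead assumption."""
    material = line.weight_kg * commodity_index[line.commodity]
    labour = line.labour_hours * labour_rate[line.region]
    energy = line.labour_hours * energy_adder[line.region]
    return round((material + labour + energy) * (1 + overhead_pct), 2)

bom = [
    BomLine("rotor housing", "steel.hrc", 2.4, 0.30, "mx.north"),
    BomLine("drive magnet", "ndfeb.magnet", 0.05, 0.10, "cn.east"),
]
print(sum(should_cost(line) for line in bom))  # unit should-cost estimate
```

The same model is what makes the redesign question answerable: change the steel grade or the magnet content in a BoM line and the should-cost estimate updates with it.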

But we only have one S2P platform with real budding community intelligence, and its support for direct is relatively non-existent compared to some peers.

So the question is, are we ever going to realize them? For all of the reasons we gave three years ago, and then some, we need Virtual Procurement Centers of Excellence for Direct, backed by Market and Community Intelligence, but they still seem to be in the distant future.

So what do you think? Are the current S2P players going to up their game to where we need them to be? Or do we need a new breed of players to come out of the shadows and show the market what is needed?

Still No Single Starting Point for Your Supply Management Journey!

Just as there is no one platform and no one workflow for Supply Management, let alone S2P (although the vendors claiming to have such seem to be increasing by the day), there is no one starting point. The only way to make progress is to define the core workflow, identify a set of overlapping/integrating systems to achieve that workflow, identify vendors that can provide these systems with enough configurability to support the necessary variances, and then select the vendors that best meet overall organizational needs and move forward.

The starting point, with the ever increasing complexity of systems, processes, and global supply chains, is more organization dependent than ever. One has to take all of the following questions, and corresponding answers, into account … and more!

  • What does the organization have now for systems and processes?
  • Where is the organization in its Supply Management journey?
  • What is the talent profile — what is its average and collective IQ, EQ, and TQ?
  • What are the organization’s biggest pain points?
  • What are the organization’s top pressures?
  • What is the organization’s budget? Can it be extended? Leveraged?
  • What resources does the organization have available to support implementation and change management?
  • What resources and programs do its current, and prospective, vendors have to help?
  • What professional organizations and associations can it lean on for support?
  • What leading research and advice can it access?
  • And so on.

It’s tough. Typically, an organization makes the jump when it’s desperate to get savings, and typically, when doing a systems buy, the organization will focus on the system that is advertised to identify the biggest return. In Supply Management, that’s a true strategic sourcing system that supports complex sourcing, as only decision optimization and spend analysis technologies have been repeatedly found to identify year-over-year savings in excess of 10%, with everything else delivering single digits. (This system will support complex sourcing workflows, decision optimization, and at least industry-average analytics, as well as a solid supplier master / SIM capability. Contracts can be external.)

But identification is not realization. As per a classic AMR series on reaching sourcing excellence, an average organization without the proper processes and systems to support contract implementation will only capture 60 to 70 cents of every dollar of negotiated savings at the end of the day.
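
To make the leakage concrete, here is a back-of-the-envelope illustration. The addressable spend figure is an assumption; the 10% identification and 60 to 70 cent capture rates are the ones cited above.

```python
# Back-of-the-envelope illustration of identified vs. realized savings.
# Spend figure is assumed; the 10% identification and 60-70 cent capture
# rates are the ones cited in the text.
spend = 100_000_000                      # addressable spend, USD (assumed)
identified = spend * 0.10                # 10% identified through sourcing
for capture in (0.60, 0.70):
    realized = identified * capture
    print(f"capture {capture:.0%}: realized ${realized:,.0f} "
          f"(leakage ${identified - realized:,.0f})")
```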

If the organization is not set up to capture savings, it has to start simple: processes, e-Procurement/I2P, and SRM to get suppliers on board with processes and programs that will allow it to capture data and ensure the suppliers deliver the value they promise without constant monitoring by the buyer. If the organization is set up to capture savings, but can’t identify any, it has to look at more complex platforms or options. However, regardless of the answers to the above questions, it should start simple and work its way up the technology and process complexity ladder. The key to success will be adoption, and that means not overwhelming those who will be required to adopt the new systems and processes.

A Single Version of Truth!

Today’s guest post is from the spend master himself, Eric Strovink of Spendata.

An oft-repeated benefit of data warehouses in general, and spend analysis systems specifically, is the promise of “a single version of truth.” The argument goes like this: in order to take action on any savings initiative, company stakeholders must first agree on the structure and organization of the data. Then and only then can real progress be made.

The problem, of course, is that truth is slippery when it comes to spend data. What, for example, is “tail spend”? Even pundits can’t agree. Should IT labor be mapped to Professional Services, HR, or Technology? For that matter, what should a Commodity structure look like in the first place? Can anyone agree on a Cost Center hierarchy, when there are different versions of the org chart due to acquisitions, dotted-line responsibilities, and other (necessary) inconsistencies?

What tends to happen is that the “single version of truth” ends up being driven by a set of committee decisions, resulting in generic spending data that is much less useful than it could be. Spend analysts uncover opportunities by creating new data relationships to drive insights, not by running displays or reports against static data. So, when the time comes to propose savings initiatives, the very system that’s supposed to support decision-making is less useful than it should be; or worst-case, not useful at all.

Questions and Answers: Metadata

Do we have preferred vendors? Do buyers and stakeholders agree on which vendors are preferred? Which vendors are “untouchable” because of long-term contracts or other entanglements? For that matter, with which vendors do we actually have contracts, and what do we mean by “contract”? Are there policies, or policy gaps, that work against a particular savings initiative, such as a lack of centralized control over laser printer procurement or the absence of a policy on buying service contracts? Can we identify and annotate opportunities and non-opportunities, by vendor or by Commodity?

The answers to these (and many other) questions produce “metadata” that needs to be combined with spend data in order to inform the next steps in a savings program. The nature of this metadata is that it’s almost certainly inaccurate when first entered. We’ll need to modify it, pretty much continually, as we learn more; for example, finding out that although John may have dealt with Vendor X and has correctly indicated that he’s dealt with them, it’s actually Carol who owns the relationship. We may also determine that the Commodity mapping isn’t helpful; network wiring, for example, might need to belong with IT, not Facilities.
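
As a rough illustration of what that metadata looks like in practice, here is a hypothetical sketch of vendor metadata layered onto spend records and then corrected as the team learns more. The vendor names, fields, and amounts are all made up.

```python
# A hypothetical sketch of vendor metadata layered onto spend records,
# then corrected as the team learns more (names and fields are illustrative).
spend = [
    {"vendor": "Vendor X", "commodity": "Facilities > Wiring", "amount": 48000},
    {"vendor": "Vendor X", "commodity": "Facilities > Wiring", "amount": 12500},
]

vendor_meta = {
    "Vendor X": {"owner": "John", "preferred": False, "contract": "unknown"},
}

# Later we learn Carol actually owns the relationship, and that network
# wiring belongs with IT, not Facilities, so the metadata is revised.
vendor_meta["Vendor X"]["owner"] = "Carol"
commodity_remap = {"Facilities > Wiring": "IT > Network Infrastructure"}

enriched = [
    {**row,
     "commodity": commodity_remap.get(row["commodity"], row["commodity"]),
     **vendor_meta.get(row["vendor"], {})}
    for row in spend
]
print(enriched)
```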

Alternative Truths

As we add more and more metadata to the system — information that is critical to driving a savings program — we encounter the need to refine and reorganize data to reflect new insights and new information. Data organization is often quite purpose-specific, so multiple different versions of the data must be able to be spawned quickly and coexist without issues. This requires an agile system with completely different characteristics than a centralized system with an inflexible structure and a large audience. In essence, one must learn to become comfortable with alternative truths, because they are essential to the analysis process.
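
As a hypothetical sketch of what “alternative truths” means in practice, consider two roll-ups of the same transactions, one on the committee’s cost center hierarchy and one on a purpose-specific commodity scheme, coexisting side by side. The data and hierarchies are illustrative only.

```python
# Two "alternative truths" built from the same transactions: a finance view
# by cost center and an analyst view by a purpose-specific commodity scheme.
# Both coexist; neither replaces the other. Data is illustrative only.
from collections import defaultdict

transactions = [
    {"cost_center": "HR",  "commodity": "IT Labour", "amount": 30000},
    {"cost_center": "IT",  "commodity": "IT Labour", "amount": 45000},
    {"cost_center": "OPS", "commodity": "Wiring",    "amount": 12000},
]

def rollup(rows, key):
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["amount"]
    return dict(totals)

finance_view = rollup(transactions, "cost_center")   # committee's hierarchy
analyst_view = rollup(transactions, "commodity")     # purpose-specific cut
print(finance_view, analyst_view)
```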

So what happens to the centralized spend analysis system, proudly trotted out to multiple users, with its “single version of truth”? Well, it chugs along in the background, making its committee members happy. Meanwhile, the real work of spend analysis must be (and is) done elsewhere.

Thanks, Eric!

Relevant Content is Still a Major Cornerstone of Any Compliance Effort

Not long ago we asked if you, or Ecovadis, could solve the compliance challenge before it cost your organizations tens or hundreds of millions of dollars. The biggest reasons for lack of compliance are still lack of knowledge, policy, visibility, analysis, and procurement technology and the fixes are still knowledge, policy, and appropriate technology.

One of those technologies is a Procurement Marketplace that can steer (or force) buyers to buy the right products from the right (and approved) suppliers (which can be an integrated catalog management solution that takes advantage of your supplier master, community intelligence provided by the vendor, and integrated risk information from third parties).
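
As a rough sketch of what that steering looks like, consider a catalog filter that only surfaces items from suppliers who are both approved in the supplier master and below a risk threshold fed by third-party data. The supplier names, risk scores, and threshold below are purely illustrative assumptions.

```python
# A hypothetical sketch of marketplace "steering": catalog items are only
# shown when the supplier is approved and below a risk threshold.
# Supplier names, scores, and the threshold are illustrative.
catalog = [
    {"item": "nitrile gloves", "supplier": "Acme Safety",   "price": 8.40},
    {"item": "nitrile gloves", "supplier": "GreyMarket Co", "price": 7.90},
]

supplier_master = {"Acme Safety": {"approved": True},
                   "GreyMarket Co": {"approved": False}}
third_party_risk = {"Acme Safety": 22, "GreyMarket Co": 71}  # 0-100, lower is better

def buyable(entry, max_risk=40):
    supplier = entry["supplier"]
    approved = supplier_master.get(supplier, {}).get("approved", False)
    risk = third_party_risk.get(supplier, 100)
    return approved and risk <= max_risk

print([e for e in catalog if buyable(e)])  # only the compliant option surfaces
```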

Another of these technologies is still supply chain visibility technology that lets a company monitor what is going on in the supply chain and evaluate a potential supply base before making a decision.

A third technology, and one we should not forget about, is import/export/trade management software that helps the organization identify the regulations it must comply with, collect the necessary information, produce the required documents, make sure the documents get to the proper authorities complete and on-time, and track all of the associated certifications and insurance certificates that go with the products and the supply base.

A good trade solution will address, at a minimum, import/export requirements, ECCN (Export Control Classification Number), customs security programs, FTA/FTZ/SEZ (Free Trade Agreements / Free Trade Zones / Special Economic Zones), country of origin, DPS (denied party screening), entry visibility, and HS (Harmonized System) / HTS (Harmonized Tariff Schedule) codes, and keep up with the never-ending onslaught of tariff changes and temporary product bans that are a result of the trade war. Essentially, it will help a company determine all of the export requirements and all of the import requirements, produce the necessary documentation, and track its product from country of origin to the destination country.

In order for this solution to work, it needs a lot of content. Namely:

  • import/export regulations for all of the countries being sourced from, sourced through, and shipped to
  • US ECCN database
  • requirements for programs such as C-TPAT, PIP, and AEO
  • Free Trade Agreements between all of the relevant countries
  • database of all FTZs / SEZs in the relevant countries
  • HS schedules for all of the relevant countries and/or mappings to and from country-specific schedules
  • Denied parties lists for the relevant countries
  • Direct feeds to updated denied product lists for real-time updates
  • Direct feeds to updated tariffs for real-time updates
  • Early warning of products under consideration for bans/tariffs and real time flagging

Don’t overlook these last three. They are new, and many of the traditional solutions on the market won’t have the capability. When a single ban or tariff spike can lay ruin to your best-laid plans, be prepared.
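
As a minimal, hypothetical sketch of what those last three feeds enable, consider flagging purchase order lines against updated denied-party lists, tariff changes, and a watch list of products under review. The feed contents, HS codes, and party names below are invented for illustration; a real solution would consume live government and intelligence feeds.

```python
# A hypothetical sketch of flagging purchase-order lines against updated
# denied-party and tariff-change feeds. Feed contents, HS codes, and party
# names are illustrative only.
denied_parties = {"Badco Trading Ltd"}                 # from a denied-party feed
tariff_changes = {"8501.31": 0.25}                     # HS code -> new duty rate
watch_list = {"8541.43"}                               # products under ban/tariff review

po_lines = [
    {"supplier": "Badco Trading Ltd", "hs_code": "8501.31", "value": 120000},
    {"supplier": "Acme Motors",       "hs_code": "8541.43", "value": 80000},
]

for line in po_lines:
    flags = []
    if line["supplier"] in denied_parties:
        flags.append("DENIED PARTY")
    if line["hs_code"] in tariff_changes:
        flags.append(f"TARIFF {tariff_changes[line['hs_code']]:.0%}")
    if line["hs_code"] in watch_list:
        flags.append("UNDER REVIEW")
    print(line["supplier"], line["hs_code"], flags or "clear")
```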

The Platform is Becoming Ever More Important …

In Monday’s post, we quoted an excerpt from Magnus’ interview on Spend Matters, where he noted how important it is to start with the most important capabilities / modules and build out towards a full S2P suite (because he knows, as well as the doctor does, that a big bang approach typically results in a big explosive bang that usually takes your money and credibility with it). If you examine this closely, you see that you need to select not only the right starting solution, but a starting solution that can grow.

This requires a platform approach from the get-go. It doesn’t need to underlie the starting modules; it doesn’t need to underlie the ending modules; it just needs to underlie the suite you want to put together. It can be part of an application you already have or a third-party application you buy later. But it has to exist.

The simple fact of the matter is that you can’t put together an integrated solution that supports an integrated source-to-pay workflow if you don’t have a platform to build it on. And you can’t patch it together with endpoint integrations over whatever APIs happen to be available; that just lets you push data from one point into another, or pull it from one point to another. That’s not an integrated solution, which requires an integrated workflow; it’s just data integration. And while that is a start, it’s not enough, especially when there is no one-size-fits-all category strategy or source-to-contract or procure-to-pay workflow, even for the smallest of organizations with the simplest of needs.
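
A rough sketch of the distinction, under the assumption of a simple publish/subscribe platform: the first function below is what an endpoint integration amounts to (push data at an API and stop), while the platform version lets downstream modules react to the same award event as part of one workflow. Event names and handlers are illustrative, not any vendor’s API.

```python
# A hypothetical contrast between point-to-point data pushing and a
# platform-orchestrated workflow. Event names and handlers are illustrative.

# Endpoint integration: one system pushes data at another, and that's it.
def push_award_to_erp(award: dict) -> None:
    # POST the award to the ERP's API endpoint; no shared process state (stub)
    ...

# Platform approach: modules publish and subscribe to shared workflow events,
# so one award automatically drives the contracting and procurement steps.
handlers: dict[str, list] = {}

def subscribe(event: str, handler) -> None:
    handlers.setdefault(event, []).append(handler)

def publish(event: str, payload: dict) -> None:
    for handler in handlers.get(event, []):
        handler(payload)

subscribe("sourcing.award", lambda a: print("create contract for", a["supplier"]))
subscribe("sourcing.award", lambda a: print("enable catalog / PO flip for", a["supplier"]))
publish("sourcing.award", {"supplier": "Acme Motors", "category": "motors"})
```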

So before you select any solution, the first thing you have to do is make sure it is built on, or works with, a true platform … otherwise, you may find as you undertake your S2P journey that a component you selected early on does not fit the bill and you have to repeat steps … which is something you really can’t afford to do.