Monthly Archives: March 2018

Will the Trade Wars Be Good for Advanced Sourcing?

Trump is imposing tariffs. China is retaliating. And this is just the beginning. As a result, supply risk and the need for spend forecasting are finally becoming real. But are they becoming real enough for organizations to take action? It's hard to say. But one thing we do know is that the only way organizations can move forward is to better understand not only the risks, but also the costs.

What are the risks? Many. What are the costs? Significant. And how can you assess either? In the first case, you need to monitor the news, gauge the sentiment of the responses to that news, crowd-source some predictions, and run advanced analytics on all this data to determine the probability of something happening, and sticking.
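To make the first case concrete, here is a minimal sketch of one way to aggregate crowd-sourced predictions, weighting each forecaster by historical accuracy. All forecaster names and figures are hypothetical, and a real platform would use far more sophisticated aggregation:

```python
# A minimal sketch of aggregating crowd-sourced tariff predictions,
# weighting each forecaster by past accuracy (all names and data hypothetical).

def aggregate_predictions(predictions, past_accuracy):
    """Accuracy-weighted average of probability estimates.

    predictions: {forecaster: estimated probability the tariff sticks}
    past_accuracy: {forecaster: historical hit rate, 0..1}
    """
    total_weight = sum(past_accuracy[f] for f in predictions)
    return sum(p * past_accuracy[f] for f, p in predictions.items()) / total_weight

crowd = {"analyst_a": 0.80, "analyst_b": 0.60, "analyst_c": 0.70}
accuracy = {"analyst_a": 0.9, "analyst_b": 0.5, "analyst_c": 0.7}
print(round(aggregate_predictions(crowd, accuracy), 3))
```

The more accurate forecasters pull the blended probability toward their estimates, which is the crude intuition behind weighting any crowd-sourced signal.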

And in the second case, you build should-cost models with current data, and projected data, to determine the impact of a tariff on the total cost of ownership of the product. This means that a simple RFX or auction platform is just not enough; an organization needs a platform with deep should-cost modelling and the ability to create what-if should-cost models based on projected and anticipated changes.
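As a rough illustration of such a what-if model, consider a unit-level landed-cost calculation where the only variable changed between scenarios is a projected ad valorem duty rate. All cost figures here are purely illustrative:

```python
# Hypothetical should-cost what-if: unit total cost of ownership with and
# without a projected tariff (all figures illustrative).

def landed_cost(material, labor, overhead, freight, duty_rate):
    """Unit TCO = production cost + freight + ad valorem duty on customs value."""
    customs_value = material + labor + overhead
    return customs_value + freight + customs_value * duty_rate

baseline = landed_cost(material=42.0, labor=11.0, overhead=7.0, freight=3.5, duty_rate=0.0)
scenario = landed_cost(material=42.0, labor=11.0, overhead=7.0, freight=3.5, duty_rate=0.25)
print(baseline, scenario, scenario - baseline)
```

Even this toy model shows why a tariff of 25% on a thin-margin product can be existential: the duty lands on the full customs value, not the margin.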

But even that’s not enough. If the projected increases are significant, then the organization will, at the very least, need to reallocate global supply chains to ensure that products, which are currently sourced from multiple suppliers and/or locations, are being exported from and imported into the most cost-effective locales the organization has access to. And if this is not enough to keep costs under control, then the organization may even need to source from additional suppliers (in different locations) or re-source the entire category (to the extent possible).
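The reallocation logic can be sketched as a toy cheapest-lane-first fill. A real optimization-backed platform would model this as a min-cost flow or mixed-integer program with many more constraints; the lane costs and capacities below are hypothetical:

```python
# Toy reallocation sketch: fill demand from the lowest landed-cost lanes first,
# respecting per-lane capacity (lane costs and capacities hypothetical).
# A real sourcing optimizer would solve this as a min-cost flow / MILP.

def reallocate(demand, lanes):
    """lanes: list of (landed_cost_per_unit, capacity) tuples.
    Returns (total_cost, units_allocated_per_lane), greedy cheapest-first."""
    allocation = [0] * len(lanes)
    total = 0.0
    for i, (cost, cap) in sorted(enumerate(lanes), key=lambda x: x[1][0]):
        take = min(demand, cap)
        allocation[i] = take
        total += take * cost
        demand -= take
        if demand == 0:
            break
    return total, allocation

total_cost, alloc = reallocate(1000, [(10.0, 400), (12.0, 800), (9.0, 300)])
print(total_cost, alloc)
```

Greedy filling is only optimal for this single-product, single-constraint toy; the moment discounts, multi-product bundles, or lane minimums enter the picture, you need a true optimization engine.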

But it’s hard to figure all of this out without an optimization-backed sourcing platform. Hopefully this is the kicker that is needed to get these powerful analytical platforms into the hands of more Sourcing and Procurement organizations, as these platforms are desperately needed and reduce spend on analyzed categories by an average of 10%+ year-over-year, making their ROI immense.

Alas, only time will tell. But if bankruptcy could be on the line (when a tariff wipes out the entire profit margin), maybe this time these platforms will finally take hold.

Where are You on Your Master Data Journey?

You want to get cost under control. Maybe even save. You need to ensure compliance. You need to satisfy the auditors. You want to know the risks you face. And the risks you could face. All laudable goals, but all goals that are unobtainable without … you guessed it … data.

More specifically, clean, rich, up-to-date, relatively complete data … which, likely, resides in multiple systems, duplicated across each. This makes data centralization, which is necessary for any of these initiatives, complicated, and often difficult. And it’s not just a matter of comparing last-update dates, especially since some systems track the last update at the record level, not the data element level.

Plus, how do you know which parts of which records can be combined? Especially when they conflict or don’t line up. Without an appropriate master data management strategy, and a system that can handle master data management across multiple, loosely related, supply management and enterprise systems, it can be downright impossible to complete any initiative that spans more than a few dozen providers or categories. And even that is an effort.

But MDM is not easy to define, and even less easy to implement. First of all, which system do you use for master data when there is an argument for multiple systems that store a record, such as a supplier record, to be the master data system? Secondly, once you do identify the master system, how do you manage, and approve, updates … and how do you ensure they get synced to the right systems at the right time? Third, how do you integrate all the data into a single, even if only virtual, record so that you can run a spend report? A compliance report? A risk report? An audit report?
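To illustrate the third question, here is a minimal field-level "survivorship" merge that builds a virtual golden record by keeping, for each data element, the value with the latest element-level timestamp. The records, field names, and timestamps are all hypothetical, and real MDM survivorship rules weigh source trust, not just recency:

```python
# Minimal field-level survivorship merge: for each data element, keep the value
# with the latest element-level timestamp across systems (records hypothetical).

def merge_records(records):
    """records: list of {field: (value, iso_timestamp)} from different systems.
    Returns a virtual golden record {field: value}."""
    golden = {}
    for record in records:
        for field, (value, ts) in record.items():
            # ISO-8601 date strings compare correctly as plain strings
            if field not in golden or ts > golden[field][1]:
                golden[field] = (value, ts)
    return {f: v for f, (v, ts) in golden.items()}

erp = {"name": ("Acme Corp", "2018-01-10"), "address": ("1 Main St", "2017-06-01")}
srm = {"name": ("Acme Corporation", "2017-11-02"), "address": ("9 King St", "2018-02-20")}
print(merge_records([erp, srm]))
```

Note how the golden record mixes fields from both systems, which is exactly why record-level last-update dates are not enough.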

The point is that it’s not as easy as selecting a system, proclaiming it your MDM, and believing the implementor that your MDM problems will be solved in a few months. Some companies that aren’t heavily focused on, and involved with, the initiative take years to integrate systems and arrive at a nearly clean set of master data.

So before you march forward on your next data-intensive initiative, maybe you should step back, ask yourself where you are on your data management journey, and give an honest answer.

RPA: Are We There Yet?

Nope. Not even close. And a recent Hackett study proves it.

Earlier this month, The Hackett Group released a point of view on Robotic Process Automation: A Reality Check and a Route Forward where they noted that while early initiatives have produced some tangible successes, many organizations have yet to scale their use of RPA to a level that is making a major impact on performance, likely because RPA has come with a greater-than-expected learning curve.

Right now, mainstream adoption of RPA is 3% in Finance, 3% in HR, 7% in Procurement, and 10% in GBS (Global Business Services). Experimentation (referred to as limited adoption) is higher: 6% in HR, 18% in Finance, 18% in Procurement, and 29% in GBS. But it’s not that high, especially considering that, given the steep learning curve for the average organization, a number of these organizations will not continue past the experiment.

Due to the large amount of interest, Hackett is predicting that, within 3 years, RPA will be mainstream in 11% of HR organizations (a 4X increase), 30% of Procurement (a 4X increase), 38% of Finance (a 12X increase), and 52% of GBS (a 5X increase), with increases in experimentation as well. Experimentation will definitely increase due to the hotness of the topic, but mainstream adoption will require success, and, as Hackett deftly notes, successful deployment has certain key prerequisites:

  • digital inputs
  • structured data
  • clear logical rules
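These prerequisites can be screened programmatically when triaging a backlog of automation candidates. A minimal sketch, with entirely hypothetical task profiles:

```python
# Screen candidate tasks against the three RPA prerequisites noted above
# (task profiles entirely hypothetical).

PREREQUISITES = ("digital_inputs", "structured_data", "clear_rules")

def rpa_ready(task):
    """A task is an RPA candidate only if it meets all three prerequisites."""
    return all(task.get(p, False) for p in PREREQUISITES)

invoice_matching = {"digital_inputs": True, "structured_data": True, "clear_rules": True}
supplier_negotiation = {"digital_inputs": True, "structured_data": False, "clear_rules": False}
print(rpa_ready(invoice_matching), rpa_ready(supplier_negotiation))
```

A three-way repetitive matching task passes the screen; a judgment-heavy negotiation does not, which is the whole point of the prerequisites.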

And when the conditions are right, organizations:

  • realize operational cost benefits
  • have fewer errors and more consistent rule application
  • benefit from increased productivity
  • are able to refocus talent on higher-value work
  • strengthen auditability for key tasks
  • have enhanced task execution data to analyze and improve processes

But this is not enough for success. Hackett prescribes three criteria for success, which they define as:

  • selecting the right RPA opportunities
  • planning the journey
  • building an RPA team or COE

and you’ll have to check out Robotic Process Automation: A Reality Check and a Route Forward for more details, but is this enough?

Maybe, maybe not. It depends on how good the RPA team is, how good it is at identifying appropriate use cases for RPA, and how good it is at implementing them successfully. Success breeds success, but failure eliminates the option of continued use of RPA, at least until a management changeover.

The Hidden Value of SI Association

What’s the value of SI Sponsorship? Quite a lot — you can contact the doctor <at> sourcinginnovation <dot> com to find out more, but here are two interesting statistics that SI has not advertised until now:

1/3 of all companies that have had a commercial relationship with SI have merged or been acquired and are now bigger, more successful entities

2/3 of all companies that have sponsored SI have merged or been acquired and are now bigger, more successful entities

In other words, companies associated with SI have become so successful, they become very attractive to not only customers, but peers and PE firms! They have options!

To the best of SI’s knowledge, NO OTHER INDEPENDENT PUBLICATION or Independent Analyst Entity has had this success rate.

Can SI take all the credit?  Probably not, but the doctor is quite confident that the implied correlation is this — if you are visionary enough to focus on building your market (and work with SI to achieve that), instead of just trying to steal your competitor’s customers, you’re probably going to succeed.  And SI is one of the partners that is going to help you … greatly.

(And if you’re too big to merge or be acquired, don’t worry!  SI can still help you create new markets for your products.  And then you too can be happy.)

Trade is Getting Complicated. Trade Agreements More So. Are Your Contracts Up to Snuff?

It’s difficult enough to create contracts that specify what both parties want, but with the shifting global landscape, crumbling trade agreements, new ones rising to take their place, and new regulations cropping up all the time that companies need to adhere to just to do business in their home country, it’s almost impossible.

How do you define contracts that keep up? And, more importantly, how do you figure out which of your contracts are not up to par, and where they are falling short?

In the first case, you constantly monitor government sites, associations, and news sites for mention of new regulations and requirements to adhere to them. Then you process the news, make sense of the new requirements, and find some experts to help you understand the best way to contractually deal with the new rules.

In the second case, you need to be able to quickly analyze a contract and determine if there are clauses to address the regulations. But if it’s a 50 page contract, that’s not a quick effort. And if you have 1,000 of them? 10,000 of them? How can you even attempt to do that?

Manually, you can’t. You need tech that can identify which contracts are likely lacking one or more clauses addressing one or more regulations, and bring them to your attention in order of priority. Advanced semantic technology that can understand documents, identify deficiencies, and suggest potential fixes.
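As a crude illustration only, a keyword-based gap scan might look like the following. Real semantic contract analytics goes far beyond keyword matching, and the regulations and trigger phrases below are entirely hypothetical:

```python
# Naive keyword-based clause-gap scan: flag contracts that contain no trigger
# phrase for a required clause area (clause areas and phrases hypothetical).
# Real contract analytics uses semantic NLP, not substring matching.

REQUIRED_CLAUSES = {
    "tariff_change": ["change in duties", "tariff adjustment", "trade remedy"],
    "data_privacy": ["data protection", "personal data", "privacy"],
}

def find_gaps(contract_text):
    """Return the clause areas with no matching trigger phrase in the contract."""
    text = contract_text.lower()
    return [area for area, phrases in REQUIRED_CLAUSES.items()
            if not any(p in text for p in phrases)]

sample = "Seller shall comply with applicable Data Protection laws."
print(find_gaps(sample))
```

Run across 10,000 contracts, even a scan this naive produces a prioritized review queue, which is precisely the capability the advanced tools deliver with far fewer false negatives.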

And a few companies understand that, which is why you see the likes of LawGeex and LegalSifter rising up to challenge Seal with a new take on contract analytics. Because, one way or another, once you reach a certain point on your sourcing journey, you’re going to need this technology.