
S2C Decision Tree …

Over on Purchasing Insight, Pete Loughlin ran a great post on the “build or buy decision tree for Purchase-to-Pay” that should not be overlooked because it gives every organization a very simple answer that even the most luddite of C-Suites can understand … NO!

You do NOT build a P2P system in-house. In fact, you should NOT have been building or maintaining a P2P system in-house since the early part of the last decade, and with so many suite providers to choose from now, the fact that some organizations are still even considering building a P2P solution is almost inconceivable in and of itself.

As Pete Loughlin clearly states, when facing the build-or-buy question you first need to ask yourself if the problem you are trying to address is new, uniquely different, or so rare that a suitable solution doesn’t exist already. And the only reason you’d build in-house is if you could honestly answer yes. In the days when there were only a couple of solutions, and they only worked well with ERPs or indirect purchases, there might have been good reasons to say yes, but now that there are dozens of options that can be focused on indirect, services, direct, or the whole kit and caboodle, the only reason you’d say yes is if you were completely unaware of what has happened in the space in the last 20 years, and if that is the case, you really shouldn’t be making the decision.

However, the reason SI is drawing this to your attention is not just because you shouldn’t be building P2P in-house, but because you shouldn’t be building S2C, and most definitely shouldn’t be building S2P (or any component thereof), in-house either! But the real reason SI is bringing this to your attention is that the flowchart doesn’t stop there … it continues. Not only should you NOT build in-house, but you should not formalize the shortlist in-house without the help of an expert advisory partner. There are 100s of companies out there, and just shortlisting SAP Ariba, Coupa, and Oracle is not the right answer, and it’s even worse if you shortlist Basware, Coupa, Oracle, and ScanMarket for S2P. While these are all great providers in their own right, they are not all S2P suites, and it’s not an apples-to-apples comparison. And when it comes to best-of-breed solutions, the doctor has seen even worse shortlists!

This is one of the reasons the doctor worked on the development of SolutionMap. By creating a custom profile, it can be used to identify the companies that best match an organization’s needs on the tech axis, which allows the organization to shortlist the right vendors to invite to the RFI: vendors that can meet basic tech needs and be compared in an apples-to-apples fashion … allowing the organization to focus on finding the provider that can best serve the organization overall and match its culture, versus focusing on basic check-the-box technology features just to find out 2 of the 3 shortlisted providers don’t even meet the basics. (And this usually ends up with the organization having to go with the vendor that’s left versus selecting the vendor that’s best.)
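
To make the idea concrete, here is a minimal sketch of the kind of weighted requirements-matching a custom profile enables. It is not SolutionMap’s actual (proprietary) methodology; the requirement weights, vendor names, and capability scores below are all hypothetical.

```python
# Illustrative only: not SolutionMap's methodology. A simple weighted-profile
# match against hypothetical vendor capability scores (0-100 per requirement).

REQUIREMENTS = {            # hypothetical custom profile weights (sum to 1.0)
    "catalog_management": 0.30,
    "invoice_automation": 0.25,
    "supplier_portal":    0.20,
    "analytics":          0.15,
    "integration_apis":   0.10,
}

VENDORS = {                 # hypothetical capability scores
    "Vendor A": {"catalog_management": 85, "invoice_automation": 90,
                 "supplier_portal": 70, "analytics": 60, "integration_apis": 80},
    "Vendor B": {"catalog_management": 60, "invoice_automation": 95,
                 "supplier_portal": 90, "analytics": 80, "integration_apis": 65},
    "Vendor C": {"catalog_management": 40, "invoice_automation": 50,
                 "supplier_portal": 55, "analytics": 95, "integration_apis": 90},
}

def weighted_fit(scores: dict, weights: dict) -> float:
    """Weighted average of a vendor's scores against the profile."""
    return sum(scores[req] * w for req, w in weights.items())

# Rank vendors by fit and keep those above a minimum threshold.
shortlist = sorted(
    ((name, weighted_fit(s, REQUIREMENTS)) for name, s in VENDORS.items()),
    key=lambda pair: pair[1], reverse=True,
)
for name, fit in shortlist:
    if fit >= 70:
        print(f"{name}: weighted fit {fit:.1f} -> invite to RFI")
```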

Platforms in 2020

Last week we talked about analyst predictions for analytics in 2020, most of which were just statements of the obvious, wishful thinking, or some combination thereof, but there was one prediction in particular that stood out … the one that was 100% correct. Namely, the prediction that companies will continue failing analytics and AI transformations.

Considering that most companies don’t have a good grip on analytics, and an even worse grip on AI (what it really is, and how to judge whether a company truly has some level of Artificial Intelligence, be it Assisted, Augmented, Cognitive, or Autonomous Intelligence, or is just using Applied Indirection in its marketing), this prediction is a safe bet.

But Analytics is just one aspect of technology that an average company is going to be interested in, and if the company is not looking for a best-of-breed analytics vendor, it is looking for a platform. So what’s in store for platforms in 2020?

Well, as usual, more of the same-old same-old, but there might be a few pinpoints of light in the near future. However, first, let’s discuss what’s going to happen for sure.

1) The M&A Mania is going to continue … and accelerate.
Workday’s (almost) ridiculous multiple for Scout (based upon current revenue) is going to make everyone hungry for acquisitions to keep up.

2) CLM and Analytics will be focal points.
Contract Management is the buzz, and while most organizations still don’t quite understand how to really extract value from it, no one wants to be left behind.
Similarly, AI is weaving its way into analytics, and while most vendors don’t have what the market thinks they have, it’s bringing analytics back into the limelight.

3) Mega-Acquirers (large companies and PE firms) will be all-in with suite mania.
If they don’t have sourcing, supplier management, contract management, analytics, e-Procurement with catalog management, invoice management, and payment management capability, they will be out to acquire the missing pieces as fast as possible to check all the major boxes and claim equivalency with Coupa, Ivalua, etc.
If they have the main pieces, they will be looking for ancillary pieces to increase the value and differentiate from the competition, along the lines of T&E management, BoM management for direct sourcing, quality management for direct, optimization and what-if analysis, freight “broker” platform integration for (near) real-time rates and accurate total cost bids, etc.

But this is no surprise … it’s just an acceleration of what we’re seeing now.

So will anything be new?

1) “Chat-bots” will be put to work.
They will slowly transform from interactive help systems to actual assistants that will take commands and implement standard actions across the application. “Create an RFP for all off-contract products and products that will be off-contract in 90 days in the office supplies category” will find the template, find the products, identify the minimum information needed (release date, initial supplier pool, etc.) and ask for it, and create an RFP ready to be finalized and sent out (using naming conventions, standard definitions of incumbents, etc.).
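
As a rough illustration of what such a command-driven assistant has to do under the hood (parse the request, resolve the product scope, flag the information it cannot infer), here is a toy sketch. The regex-based parsing, product data, and field names are all hypothetical; a real assistant would sit on top of the platform’s contract and catalog APIs.

```python
import re
from datetime import date, timedelta

# Hypothetical in-memory "contract" data; a real assistant would query the platform.
PRODUCTS = [
    {"sku": "OS-001", "category": "office supplies", "contract_end": date(2020, 1, 15)},
    {"sku": "OS-002", "category": "office supplies", "contract_end": date(2020, 6, 30)},
    {"sku": "IT-001", "category": "it hardware",     "contract_end": date(2020, 2, 1)},
]

def handle_command(command: str, today: date = date(2020, 1, 20)) -> dict:
    """Parse a 'create an RFP ...' command and assemble a draft RFP."""
    match = re.search(r"off-contract in (\d+) days.*?in the (.+?) category", command, re.I)
    if not match:
        raise ValueError("command not understood")
    horizon = today + timedelta(days=int(match.group(1)))
    category = match.group(2).lower()

    # Products already off contract, or off contract within the horizon.
    in_scope = [p["sku"] for p in PRODUCTS
                if p["category"] == category and p["contract_end"] <= horizon]

    # Fields the assistant cannot infer are flagged so it can ask the user.
    return {
        "template": f"rfp-{category.replace(' ', '-')}",
        "products": in_scope,
        "missing_info": ["release_date", "initial_supplier_pool"],
        "status": "draft - awaiting user input",
    }

print(handle_command(
    "Create an RFP for all off-contract products and products that will be "
    "off-contract in 90 days in the office supplies category"))
```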

2) “Predictive” Analytics will start to be integrated cross platform.
But don’t get too excited … for the most part it will be traditional trend algorithms or open-source models that have been found to typically work on that type of data, with little to no machine learning, but it will be a step in the right direction.
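
For a sense of what “traditional trend algorithms” means in practice, here is a minimal sketch: an ordinary least-squares trend fit on monthly spend, extrapolated one period ahead. The spend figures are made up, and there is no machine learning anywhere in it.

```python
# A plain least-squares trend fit on monthly spend: "predictive" analytics of
# the traditional kind. The spend figures are hypothetical.

monthly_spend = [102_000, 98_500, 110_200, 107_800, 115_300, 119_900]

def linear_trend(values):
    """Ordinary least-squares fit of y = a + b*x over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values)) \
            / sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept, slope

a, b = linear_trend(monthly_spend)
next_month = a + b * len(monthly_spend)   # extrapolate one period ahead
print(f"trend: {b:+.0f}/month, next-month forecast of {next_month:,.0f}")
```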

3) “MDM” will be bandied around like it’s the new acronym candy.
And while platforms will make progress in terms of managing all of the data that flows through them, their ability to push data back to source systems and manage master data across systems is still a while off. MDM will stay in the hands of ERP and highly specialized vendors for a few years to come.
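
To show the gap, here is a heavily simplified sketch of one slice of the MDM problem: merging duplicate supplier records from multiple systems into a single golden record with basic survivorship rules. The records and rules are hypothetical, and the genuinely hard part (pushing the merged record back to each source system) is only noted in a comment.

```python
# Hypothetical records for the same supplier, pulled from different systems.
records = [
    {"source": "ERP", "name": "ACME Corp",        "duns": "123456789",
     "address": "1 Main St",            "updated": "2019-11-02"},
    {"source": "P2P", "name": "Acme Corporation", "duns": None,
     "address": "1 Main Street, Suite 4", "updated": "2020-01-10"},
    {"source": "SRM", "name": "ACME Corp.",       "duns": "123456789",
     "address": None,                   "updated": "2018-06-21"},
]

def golden_record(recs):
    """Field-level survivorship: prefer the most recently updated non-null value."""
    by_recency = sorted(recs, key=lambda r: r["updated"], reverse=True)
    merged = {}
    for field in ("name", "duns", "address"):
        merged[field] = next((r[field] for r in by_recency if r[field]), None)
    merged["sources"] = [r["source"] for r in recs]
    return merged

master = golden_record(records)
print(master)
# Pushing `master` back to each source system (the part the post says is still
# a while off) would require write access and cross-system identifiers,
# which are omitted here.
```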

While not an in-depth discussion of the trends that will continue or the trends that will start, it’s a good start.

20 Analytics Predictions from the “Experts” for 2020 Part II

In our last post we started reviewing 20 analytics predictions being peddled by the major analytics futurists and analytics sites. Why? Because while overly optimistic futurist authors rarely get it right, their predictions do point out two things: what should be done (and isn’t getting done), and where the space needs to go.

And even though 19 of these anti-predictions won’t (fully) come to pass this year, we started reviewing them one by one to give you a reality check and indicate what is likely coming sooner rather than later, and what is still a pipe dream. Most of the predictions we reviewed yesterday were those that fell into the “aren’t happening” or “aren’t really happening at all” (because they are more of the same old, same old) buckets, but today we get to some that will start to materialize, and the one, yes one, that is 100% true and that you need to be fully aware of.

So settle in and let’s finish this.

AI becomes more mainstream

Well, acceptance of AI will continue to become more mainstream, but actual AI won’t, considering that most “AI” providers are actually providers of “Artificial Indirection” and have no AI at all, not even at the level of “Assisted Intelligence”. Most providers of “AI” are just providers of RPA (robotic process automation) at best, and a configurable rules engine at worst.
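
To make the distinction concrete, here is roughly what a “configurable rules engine” amounts to: hand-written condition/action pairs evaluated in order against each transaction, with no learning anywhere. The invoice-routing rules below are purely illustrative and not any vendor’s product.

```python
# Illustrative: a configurable rules engine is just data-driven if/then logic.
# Rules are (condition, action) pairs; nothing is learned from outcomes.

RULES = [  # hypothetical invoice-routing rules, evaluated in order
    (lambda inv: inv["amount"] > 50_000,          "route_to_cfo_approval"),
    (lambda inv: inv["supplier_risk"] == "high",  "hold_for_risk_review"),
    (lambda inv: not inv["po_match"],             "flag_for_manual_match"),
    (lambda inv: True,                            "auto_approve"),  # default
]

def process(invoice: dict) -> str:
    """Return the action of the first rule whose condition matches."""
    for condition, action in RULES:
        if condition(invoice):
            return action
    return "no_action"

print(process({"amount": 12_000, "supplier_risk": "low", "po_match": True}))
# -> auto_approve: deterministic, hand-configured, and not "intelligent" at all.
```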

Multi-hybrid

A few vendors are offering multi-hybrid analytics solutions, and a few more will, but there will be nothing new. It will be one solution for integrated in-platform analytics, another for do-it-yourself analytics, and possibly an in-house developed third for database management and cube construction. But there’s going to be no significant changes here — most practitioners are going to use what their vendors give them.

Analytics will become usable by business analysts

Well, this one is half true. With recent advances in user interfaces and usability, it will become more usable … but … only to the better half of the business analysts … and … only with training. And this is where this particular prediction fails. Training has been high on the priority list for a decade, and it’s also been high on the “cut when budgets need trimming” list for a decade as well. There will be little to no training as per the norm, so only the most dedicated will self-learn and use it.

Data governance takes centre stage

This prediction is likely to come true sooner than you might think, but not in 2020. Until there is a big cost associated with the lack of data governance, it’s going to be like training: high on the priority list, but never getting centre stage. This will only change when a lack of governance risks a huge fine, or a large organization loses a major court case with a large judgement that was the result of a lack of governance (which resulted in data exposure) and that could have happened to any organization.

AI ethics standards will emerge

We all wish this would happen, but as with data governance, until a large organization loses a discriminatory court case as a result of an AI decision, and the court holds the organization responsible for that AI decision, no one is going to put any real effort, beyond lip service, into AI ethics. At least not from a vendor perspective. A few lawyers hungry to make a name for themselves might, but that’s about it.

Analytics will hit the C-Suite

Reset the Woody Woodpecker laugh track. If the average business analyst is not going to get much more involved with analytics, then you can bet the average C-Suite executive is not going to get much more involved either. They might get better reports and dashboards, but that’s it.

Intelligent assistants that connect the dots will become more pervasive

This is another half-truth. “Intelligent assistants” that allow a user to interact with the application in natural language, and especially English, will continue to infiltrate S2P platforms, but as to connecting the dots … not likely. That will require true embedded machine learning technology, and that’s still far away for the average provider.

Open source is going down the drain thanks to cloud platforms

This is yet another half-truth. While it is true that as more and more providers lock into a cloud platform (such as Azure, AWS, or Google Cloud) they will lock into whatever analytics are provided in the platform, this is not going to stop open source efforts, although uptake may taper off for a while.

Effective implementation will continue to be a challenge

This is mostly true. Effective implementation will continue to be a challenge for the majority of organizations, and only a few best-of-breed providers will see the challenge of effective implementations decrease. As data continues to proliferate, especially considering the average quality of data, analytics will continue to get more challenging on the whole.

And now, finally, the one prediction the doctor found that is 100% accurate.

Companies will continue failing analytics & AI transformations

This is absolutely true. Considering that analytics requires good data, and AI requires lots of good data, good algorithms, and experts to guide the algorithms, while most companies have poor data, poorer algorithms, and a dearth of experts, and often rely on vendors who peddle applied indirection, the doctor expects a big uptick in failures until the space educates itself on what AI truly is, what the levels are, what is actually out there, and who is actually offering it.

For details on what the levels are, and what is coming, keep your eyes on SI and SM, and if your organization has been smart enough to subscribe, check out the doctor’s pieces over on Spend Matters Pro on AI in Supplier Discovery, Sourcing, Optimization, Procurement, and Supplier Management.

AI in Procurement: [Spend Matters Pro subscription required]
Today Part I,
Today Part II
Tomorrow Part I,
Tomorrow Part II,
Tomorrow Part III
The Day After

AI in Sourcing: [Spend Matters Pro subscription required]
Today
Tomorrow Part I,
Tomorrow Part II
The Day After

AI in Sourcing Optimization: [Spend Matters Pro subscription required]
Today
Tomorrow
The Day After Part I,
The Day After Part II

AI in Supplier Discovery: [Spend Matters Pro subscription required]
Today
Tomorrow
The Day After

AI in Supplier Management: [Spend Matters Pro subscription required]
Today Part I,
Today Part II
Tomorrow Part I,
Tomorrow Part II
The Day After

20 Analytics Predictions from the “Experts” for 2020 Part I

Guess how many will be 100% accurate?

(We’ll give you a hint. You only need one hand. You won’t need your thumb. And you’ll probably have fingers to spare.)

the doctor has been scouring the internet for the usual prediction articles to see what 2020 won’t have in store. Because if there is just one thing overly optimistic futurist authors are good at, it’s pointing out what won’t be happening anytime soon, even though it should be.

This is not to say they’re all bust — some will materialize eventually and others indicate where a turning point may be needed — but they’re definitely not this year’s reality (and maybe not even this decade’s).

So, to pump some reality into the picture, the doctor is going to discuss the 19 anti-predictions that are taking over mainstream Net media … and then discuss the 1 prediction he found that is entirely 100% accurate.

In no particular order, we’ll take the predictions one by one.

Performance benchmarks will be replaced by efficiency benchmarks

This absolutely needs to happen. Performance benchmarks only tell you how well you’ve done, not how well you are going to do in the future. The only indication of that is how well you are doing now, and that is best measured by efficiency. But since pretty much all analytics vendors are just getting good at performance benchmarks and dashboards, you can bet efficiency benchmarks are still a long way off.
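
A quick, hypothetical sketch of the difference: performance benchmarks summarize what was achieved, while efficiency benchmarks measure output per unit of time or effort right now. The event data and metric definitions below are illustrative only.

```python
# Illustrative contrast between a performance benchmark (what you achieved)
# and an efficiency benchmark (how well you are working right now).
# The event data is made up; the metric definitions are just examples.

events = [  # hypothetical sourcing events
    {"spend": 1_200_000, "savings": 96_000,  "cycle_days": 45, "fte_hours": 120},
    {"spend":   800_000, "savings": 40_000,  "cycle_days": 30, "fte_hours":  80},
    {"spend": 2_500_000, "savings": 150_000, "cycle_days": 60, "fte_hours": 200},
]

# Performance: backward-looking totals.
total_savings = sum(e["savings"] for e in events)
savings_rate  = total_savings / sum(e["spend"] for e in events)

# Efficiency: output per unit of effort/time, a better hint of future results.
savings_per_hour = total_savings / sum(e["fte_hours"] for e in events)
avg_cycle_days   = sum(e["cycle_days"] for e in events) / len(events)

print(f"performance: {savings_rate:.1%} savings rate ({total_savings:,} total)")
print(f"efficiency:  {savings_per_hour:,.0f} savings per FTE hour, "
      f"{avg_cycle_days:.0f}-day average cycle time")
```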

IoT becomes queryable and analyzable

… but not in real-time. Right now, the best that will happen is that the signals will get pushed into a database on a near-real-time schedule (which will be at least daily), indexed on a near-real-time basis (at least daily), and support meaningful queries that can provide real, usable, actionable information that will help users make decisions faster than ever before (but not yet in real-time).
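
A minimal sketch of that batch pattern, assuming hypothetical sensor data and a scheduled load job: readings are bulk-inserted into an indexed table and queried the next day, which is useful, but near-real-time rather than real-time.

```python
import sqlite3

# Scheduled load job: bulk-insert the overnight extract into an indexed table.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings (
    device_id TEXT, metric TEXT, value REAL, recorded_at TEXT)""")
db.execute("CREATE INDEX idx_device_time ON readings (device_id, recorded_at)")

# Pretend this batch arrived from the overnight extract of an IoT platform.
batch = [
    ("truck-17", "temp_c", 4.2, "2020-01-14T23:10:00"),
    ("truck-17", "temp_c", 9.8, "2020-01-14T23:40:00"),  # cold-chain excursion
    ("truck-09", "temp_c", 3.1, "2020-01-14T23:15:00"),
]
db.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", batch)

# Next-day query: which devices exceeded the temperature threshold?
rows = db.execute("""SELECT device_id, MAX(value) FROM readings
                     WHERE metric = 'temp_c' GROUP BY device_id
                     HAVING MAX(value) > 8""").fetchall()
print(rows)  # actionable, but a day late: near-real-time, not real-time
```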

Rise of data micro-services

Data micro-services will continue to proliferate, but does this mean that they will truly rise, especially in a business (or Procurement) context? The best that will happen is that more analytics vendors will integrate more useful data streams for their clients to make use of (market data, risk data, supplier data, product data, etc.), but real-time micro-service subscriptions are likely still a few years off.
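
For illustration, here is the “integrate a useful data stream” pattern in miniature: enriching internal spend records with an external supplier-risk feed. The risk service is hypothetical and stubbed with canned data so the sketch stays self-contained.

```python
# Sketch of enriching internal spend records with an external feed.
# The risk-score "service" is hypothetical and stubbed locally.

RISK_SERVICE_STUB = {  # pretend responses from a risk-data micro-service
    "ACME Corp": {"score": 72, "trend": "worsening"},
    "Globex":    {"score": 35, "trend": "stable"},
}

def fetch_risk(supplier: str) -> dict:
    """In production this would be an HTTP call to the data provider's API."""
    return RISK_SERVICE_STUB.get(supplier, {"score": None, "trend": "unknown"})

spend = [
    {"supplier": "ACME Corp", "annual_spend": 1_400_000},
    {"supplier": "Globex",    "annual_spend":   600_000},
]

# Join the external risk data onto each internal spend row.
enriched = [{**row, **fetch_risk(row["supplier"])} for row in spend]
for row in enriched:
    print(row)
```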

More in-memory processing

In-memory processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where more vendors will spend big on memory and move to all in-memory processing, or abandon it.

More natural-language processing

Natural language processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where more vendors will dive in any faster, or abandon it. It’s the same-old, same-old.

Graph analytics

Graph analytics will continue to worm its way into analytics platforms, but this won’t be the year it breaks out and takes over. Most vendors are still using traditional relational databases … object databases are still a stretch.
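
As a small taste of what graph analytics adds over flat reporting, here is a toy multi-tier exposure walk over a made-up supply network; this kind of reachability question is awkward to answer in a purely relational model.

```python
# Multi-tier supplier exposure via a breadth-first walk of a relationship graph.
# The supply network below is entirely hypothetical.
from collections import deque

SUPPLY_GRAPH = {  # each key sources from the suppliers in its list
    "Us":              ["Tier1-A", "Tier1-B"],
    "Tier1-A":         ["Tier2-X", "Tier2-Y"],
    "Tier1-B":         ["Tier2-Y"],
    "Tier2-X":         [],
    "Tier2-Y":         ["Tier3-RareEarth"],
    "Tier3-RareEarth": [],
}

def downstream_exposure(node: str) -> set:
    """All suppliers reachable from `node`, i.e. everything it indirectly depends on."""
    seen, queue = set(), deque([node])
    while queue:
        for nxt in SUPPLY_GRAPH.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A disruption at Tier2-Y touches both tier-1 suppliers -- hard to see in a flat table.
print(downstream_exposure("Us"))
```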

Augmented analytics

The definition of augmented analytics is a system that can learn from human feedback and provide better insights and/or recommendations over time. While we do have good machine learning technology that can learn from human interaction and optimize (work)flows, when it comes to analytics, good insights come from identifying the right data to present to the user and, in particular, data that extends beyond organizational data, such as current market rates, supplier risk data, product performance data, etc.

Until we have analytics platforms that are tightly integrated with the right market and external data, and machine learning that learns not just from user workflows on internal data, but external data and human decisions based on that external data, we’re not going to have much in the way of useful augmented analytics in spend analysis platforms. The few exceptions in the next few years will be those analytics vendors that live inside consultancies that do category management, GPO sourcing, and similar services that collect meaningful market data on categories and savings percentages to help customers do relevant opportunity analysis.
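
As a toy illustration of the feedback loop augmented analytics would need, here is a sketch that re-ranks insight types based on which ones a user actually acts on. It is deliberately minimal, ignores the external market data discussed above, and all names and numbers are hypothetical.

```python
# Toy feedback loop: insight types the user acts on get shown first next time.
from collections import defaultdict

weights = defaultdict(lambda: 1.0)   # prior relevance per insight type

def record_feedback(insight_type: str, acted_on: bool, lr: float = 0.2):
    """Nudge the weight up when the user acts on an insight, down when ignored."""
    weights[insight_type] *= (1 + lr) if acted_on else (1 - lr)

def rank(insights):
    """Order candidate insights by the learned relevance of their type."""
    return sorted(insights, key=lambda i: weights[i["type"]], reverse=True)

# Simulated usage: the buyer keeps acting on maverick-spend alerts, ignores others.
for _ in range(5):
    record_feedback("maverick_spend", acted_on=True)
    record_feedback("price_variance", acted_on=False)

candidates = [{"type": "price_variance", "msg": "Unit price up 8% at ACME"},
              {"type": "maverick_spend", "msg": "12% of laptops bought off-contract"}]
print([c["msg"] for c in rank(candidates)])
```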

Predictive analytics

As with natural language processing, predictive analytics will continue to be the same-old same-old predictions based on traditional trend analysis. There won’t be much ground-breaking here, as only the vendors that are working on neural networks, deep learning, and other AI technologies will make any advancements, and the majority of these vendors are not (spend) analytics vendors.

Data automation

RPA is picking up, but like in-memory processing and semantic technology, it’s not going to all of a sudden become mainstream, especially in analytics. Especially since it’s not just automating input and out-of-the-box reports that is useful, but automating processes that provide insight. And, as per our discussion of augmented analytics, insight requires external data integrated with internal data in meaningful trends.

No-code analytics

Cue the Woody Woodpecker laugh track please! Because true analytics is anything but no-code. It’s lots and lots and lots of code. Hundreds and thousands and hundreds of thousands of lines of code. Maybe the UI makes it easy to build reports and extract insights with point-and-click and drag-and-drop, and allows an average user to do it without scripting, but the analytics provider will be writing even more code than you know to make that happen.

Come back tomorrow as we tackle the next ten.

2020 Is Here. Will We Ever Get 20/20 Vision into Our Technology Providers?

AI. Virtual Reality. Augmented Intelligence. Big Data. Autonomous Software. The Futurists are in a prediction frenzy and throwing around these words not only like everyone understands them, but like every provider has them.

Very few providers actually have these technologies, but the sad reality is that very few providers aren’t claiming to have them. Obviously, this is a problem. A big problem. Because only a small percentage of the providers claiming to have these technologies actually have them, making it hard for anyone to see the big picture.

But we need to, and we need to see it clearly. Very clearly. Because, as we have indicated many times, there is a lot more applied indirection out there than artificial intelligence. Similarly, it’s not really virtual reality unless it’s immersive, and while a lot of gamers might immerse all of their focus into their games, most games are not truly immersive. It’s not augmented intelligence unless the application intelligently provides a recommendation, and associated process, that is at least as good as you would come up with and, preferably, as good as a human expert would. It’s not even close to being Big Data unless the application is capable of processing and working with more data than can fit in memory on an average server. (Big Data is a moving target; what was big in 2000 is small today.) And it’s not autonomous unless the application is capable of doing, on its own, processes that would normally require a human, with the exception of truly exceptional situations (as it should be able to handle most exceptions, especially if the exception was handled before).
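
As a rough illustration of that “more data than fits in memory” bar, here is a sketch that streams a large transaction file row by row and keeps only running aggregates; the file name and columns are hypothetical.

```python
# Stream a large transaction file and keep only running aggregates, instead of
# loading the whole dataset into memory. File name and columns are hypothetical.
import csv
from collections import defaultdict

def spend_by_category(path: str) -> dict:
    """Aggregate spend per category while holding one row in memory at a time."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["category"]] += float(row["amount"])
    return dict(totals)

# spend_by_category("transactions_50gb.csv") would work on a file far larger
# than RAM; a tool that must load everything into a browser tab would not.
```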

The reality is that while software is going to get more automated, and usability is going to continue to improve, we’re not going to see real AI for a while. The “Big Data” that most applications will be capable of handling will continue to be limited to user machine / browser memory. Virtual Reality is a ways off. Augmented Reality will continue to advance, but primarily in gaming.

But depending on what you are looking for, you likely don’t need AI, don’t need “big data”, don’t need autonomous software, and definitely don’t need virtual reality. You just need a system that allows you, with some simple RPA, to digitize paper processes, automate common processes, and improve productivity.

And it would be nice if we could get some real 20/20 vision into what vendors actually have and what you really need.

But that might still be a pipe dream.