
Tailoring in 2020 …

Even though bespoke tailoring didn’t come into vogue in the UK until the early 1700s, the modern art of tailoring is an age-old practice that dates back to at least the 1200s, when skilled garment makers would make custom attire for the royalty in their realms. These tailors would provide a “made to measure” service that would ensure that each garment, original and unique to each customer, would fit perfectly.

Tailors understood the value of service, so the question is, why don’t your platform providers? They all promise the perfect fit, but most don’t deliver. Why?

Well, there is a slew of reasons. Many providers claiming to be Procurement 3.0 are actually still delivering hacked-together 2.0 solutions with limited capability and even more limited customization. But will this change?

With some providers it will, especially those whose platforms have true 3.0 foundations and who are completing that journey with hopes of someday embarking on the Procurement 4.0 journey (which right now is unobtainable* despite the proclamations of the futurists).

Platforms will not only be more configurable, but they will be configurable by you and, more appropriately, as they get more complete, and smarter, they will begin to adapt to you. Smart assistants will learn your grammar and usage patterns and immediately guide you to what you ask for. Augmented intelligence will provide the insights you need where you need them … not 3 reports and 6 drill-downs away from where you need the insight.

Basically, what we are saying is, now that we are into the third decade of stand-alone best-of-breed Procurement technology, it’s time that the technology works for you. No longer should you be burdened with technology that makes you work for it. So when you are looking for a platform, look for one offered by a tailor, not by a one-size-fits-all milling machine.

* For reasons that we’ll discuss in the near future …

S2C Decision Tree …

Over on Purchasing Insight, Pete Loughlin ran a great post on the “build or buy decision tree for Purchase-to-Pay” that should not be overlooked because it gives every organization a very simple answer that even the most Luddite of C-Suites can understand … NO!

You do NOT build a P2P system in-house. In fact, you should NOT have been building or maintaining a P2P system in-house since the early part of the last decade — but with so many suite providers to choose from now, the fact that some organizations are still even considering building a P2P solution is almost inconceivable in and of itself.

As Pete Loughlin clearly states, when facing the build-or-buy question you first need to ask yourself whether the problem you are trying to address is so new, so uniquely different, or so rare that a suitable solution doesn’t already exist. And the only reason you’d build in-house is if you could honestly answer that no suitable solution exists. In the days when there were only a couple of solutions, and they only worked well with ERPs or indirect purchases, there might have been good reasons to give that answer, but now that there are dozens of options that can be focused on indirect, services, direct, or the whole kit and caboodle, the only reason you’d say no suitable solution exists is if you were completely unaware of what has happened in the space in the last 20 years — and if that is the case, you really shouldn’t be making the decision.

However, the reason SI is drawing this to your attention is not just because you shouldn’t be building P2P in-house, but because you shouldn’t be building S2C and, most definitely, shouldn’t be building S2P (or any component thereof) in-house either! And the decision tree doesn’t stop there … it continues. Not only should you NOT build in-house, but you should not formalize the shortlist in-house without the help of an expert advisory partner. There are hundreds of companies out there, and just shortlisting SAP Ariba, Coupa, and Oracle is not the right answer — and it’s even worse if you shortlist Basware, Coupa, Oracle, and ScanMarket for S2P. While these are all great providers in their own right, they are not all S2P, and it’s not an apples-to-apples comparison. And when it comes to best-of-breed solutions, the doctor has seen even worse shortlists!

This is one of the reasons the doctor worked on the development of SolutionMap — by creating a custom profile, it can be used to identify the companies that best match an organization’s needs on the tech axis, which allows the organization to shortlist the right vendors to invite to the RFI: vendors that can meet basic tech needs and be compared in an apples-to-apples fashion … allowing the organization to focus on finding the provider that can best serve the organization overall and match its culture, versus focusing on basic check-the-box technology features just to find out that 2 of the 3 shortlisted providers don’t even meet the basics. (And this usually ends up with the organization having to go with the vendor that’s left versus selecting the vendor that’s the best.)
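
To make the general idea concrete, here is a minimal sketch of profile-weighted shortlisting. Every capability name, weight, and vendor score below is a hypothetical illustration invented for this example — it is not the actual SolutionMap methodology or data:

```python
# Hypothetical profile-weighted vendor shortlisting (illustrative only;
# not the actual SolutionMap methodology or data).

# The organization's custom profile: how much each capability matters (0-10).
profile = {"sourcing": 9, "analytics": 7, "contracts": 4, "e-procurement": 2}

# Analyst scores per vendor on each capability (0-5), entirely made up.
vendors = {
    "Vendor A": {"sourcing": 5, "analytics": 3, "contracts": 4, "e-procurement": 1},
    "Vendor B": {"sourcing": 2, "analytics": 5, "contracts": 2, "e-procurement": 5},
    "Vendor C": {"sourcing": 4, "analytics": 4, "contracts": 3, "e-procurement": 3},
}

def fit_score(scores: dict, weights: dict) -> float:
    """Weighted average of capability scores, normalized back to the 0-5 scale."""
    total_weight = sum(weights.values())
    return sum(scores[cap] * w for cap, w in weights.items()) / total_weight

# Rank vendors by fit with THIS organization's profile, not a generic average.
for name, scores in sorted(vendors.items(),
                           key=lambda kv: fit_score(kv[1], profile), reverse=True):
    print(f"{name}: {fit_score(scores, profile):.2f}")
```

The point is that the ranking changes with the profile, so the shortlist reflects the organization’s needs instead of a one-size-fits-all league table.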

Platforms in 2020

Last week we talked about analyst predictions for analytics in 2020, most of which were just statements of the obvious, wishful thinking, or some combination thereof, but there was one prediction in particular that stood out … the one that was 100% correct: the prediction that companies will continue to fail at analytics and AI transformations.

Considering that most companies don’t have a good grip on analytics — and an even worse grip on AI: what it really is, and how to judge whether a company truly has some level of Artificial Intelligence (be it Assisted, Augmented, Cognitive, or Autonomous Intelligence) or is just using Applied Indirection in its marketing — this prediction is as safe as they come.

But analytics is just one aspect of technology that an average company is going to be interested in, and if the company is not looking for a best-of-breed analytics vendor, it is looking for a platform. So what’s in store for platforms in 2020?

Well, as usual, more of the same-old same-old, but there might be a few pinpoints of light in the near future. However, first, let’s discuss what’s going to happen for sure.

1) The M&A Mania is going to continue … and accelerate.
Workday’s (almost) ridiculous multiple for Scout (based upon current revenue) is going to make everyone hungry for acquisitions to keep up.

2) CLM and Analytics will be focal points.
Contract Management is the buzz, and while most organizations still don’t quite understand how to really extract value from it, no one wants to be left behind.
Similarly, AI is weaving its way into analytics, and while most vendors don’t have what the market thinks they have, it’s bringing analytics back into the limelight.

3) Mega-Acquirers (large companies and PE firms) will be all-in with suite mania.
If they don’t have a sourcing, supplier management, contract management, analytics, e-Procurement with catalog management, invoice management, and payment management capability, they will be out to acquire any of those pieces as fast as possible to check all the major boxes and claim equivalency with Coupa, Ivalua, etc.
If they have the main pieces, they will be looking for ancillary pieces to increase the value and differentiate from the competition along the lines of T&E Management, BoM management for direct sourcing, Quality Management for Direct, Optimization and What-if Analysis, Freight “Broker” platform integration for (near) real-time rates and accurate Total Cost bids, etc.

But this is no surprise … it’s just an acceleration of what we’re seeing now.

So will anything be new?

1) “Chat-bots” will be put to work.
They will slowly transform from interactive help systems to actual assistants that will take commands and implement standard actions across the application. “Create an RFP for all off-contract products and products that will be off-contract in 90 days in the office supplies category” will find the template, find the products, identify the minimum information needed (release date, initial supplier pool, etc.) and ask for it, and create an RFP ready to be finalized and sent out (using naming conventions, standard definition of incumbents, etc.).
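
As a toy illustration of that assistant pattern — parse the command, pull the matching products, notice which required fields the command didn’t supply, and ask for them — here is a minimal sketch. The contract data, the regex “intent model”, and the field names are all hypothetical stand-ins for what a real platform would do with proper NLP and its own APIs:

```python
import re
from datetime import date, timedelta

# Hypothetical contract data: (product, category, contract end date or None if off-contract).
CONTRACTS = [
    ("Copy Paper", "office supplies", date.today() + timedelta(days=45)),
    ("Toner",      "office supplies", date.today() + timedelta(days=400)),
    ("Staplers",   "office supplies", None),
]

def handle_command(command: str) -> dict:
    # A real assistant would use an NLP intent model; a regex stands in here.
    m = re.search(r"off-contract in (\d+) days in the (.+) category", command)
    if not m:
        raise ValueError("intent not recognized")
    horizon, category = int(m.group(1)), m.group(2)
    cutoff = date.today() + timedelta(days=horizon)

    # Products already off-contract, or going off-contract within the horizon.
    items = [p for p, c, end in CONTRACTS
             if c == category and (end is None or end <= cutoff)]

    # Draft the RFP, then ask for the minimum information the command lacked.
    rfp = {"category": category, "items": items,
           "release_date": None, "supplier_pool": None}
    missing = [k for k, v in rfp.items() if v is None]
    print(f"Drafted RFP for {items}. Please provide: {', '.join(missing)}")
    return rfp

handle_command("Create an RFP for all off-contract products and products that will "
               "be off-contract in 90 days in the office supplies category")
```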

2) “Predictive” Analytics will start to be integrated cross platform.
But don’t get too excited … for the most part it will be traditional trend algorithms, or open-source models that have been found to typically work on that type of data, with little to no machine learning — but it will be a step in the right direction.
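
For a sense of what “traditional trend algorithm” means in practice, here is a minimal sketch: an ordinary least-squares line fit to (made-up) monthly spend figures, extended three months forward. No machine learning involved:

```python
import numpy as np

# Twelve months of (invented) spend figures, in $K.
monthly_spend = np.array([102, 98, 110, 115, 109, 121, 118, 125, 130, 127, 134, 140])

months = np.arange(len(monthly_spend))
slope, intercept = np.polyfit(months, monthly_spend, 1)  # least-squares trend line

# "Predict" the next quarter by extending the line.
future = np.arange(len(monthly_spend), len(monthly_spend) + 3)
forecast = slope * future + intercept
print(f"Trend: {slope:+.1f}K/month; next 3 months: {np.round(forecast, 1)}")
```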

3) “MDM” will be bandied around like it’s the new acronym candy.
And while platforms will make progress in terms of managing all of the data that flows through them, their ability to push data back to source systems and manage master data across systems will still be a while off. MDM will stay in the hands of ERP and highly specialized vendors for a few years to come.

While this is not an in-depth discussion of the trends that will continue or the trends that will emerge, it’s a good start.

20 Analytics Predictions from the “Experts” for 2020 Part I

Guess how many will be 100% accurate?

(We’ll give you a hint. You only need one hand. You won’t need your thumb. And you’ll probably have fingers to spare.)

the doctor has been scouring the internet for the usual prediction articles to see what 2020 won’t have in store. Because if there is just one thing overly optimistic futurist authors are good at, it’s pointing out what won’t be happening anytime soon, even though it should be.

This is not to say they’re all a bust — some will materialize eventually and others indicate where a turning point may be needed — but they’re definitely not this year’s reality (and maybe not even this decade’s).

So, to pump some reality into the picture, the doctor is going to discuss the 19 anti-predictions that are taking over mainstream Net media … and then discuss the 1 prediction he found that is 100% accurate.

In no particular order, we’ll take the predictions one by one.

Performance benchmarks will be replaced by efficiency benchmarks

This absolutely needs to happen. Performance benchmarks only tell you how well you’ve done, not how well you are going to do in the future. The only indication of that is how well you are doing now, and this is best measured by efficiency. But since pretty much all analytics vendors are just getting good at performance benchmarks and dashboards, you can bet efficiency is still a long way off.

IoT becomes queryable and analyzable

… but not in real-time. Right now, the best that will happen is that the signals will get pushed into a database on a near-real-time schedule (which will be at least daily), indexed on a near-real-time basis (again, at least daily), and made to support meaningful queries that can provide real, usable, actionable information that will help users make decisions faster than ever before (but not yet in real-time).
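
In other words, the pattern will look less like streaming analytics and more like the sketch below — a nightly batch load into an indexed table that supports fast, if day-old, queries. The schema, device names, and readings are invented for illustration:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device TEXT, ts TEXT, temp_c REAL)")

def nightly_batch_load(rows):
    """The 'at least daily' push: bulk insert, then make sure the index exists."""
    db.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    db.execute("CREATE INDEX IF NOT EXISTS idx_dev_ts ON readings (device, ts)")
    db.commit()

nightly_batch_load([
    ("reefer-07", "2020-01-14T03:00", -19.5),
    ("reefer-07", "2020-01-14T09:00", -14.2),  # suspicious warm-up
    ("reefer-12", "2020-01-14T03:00", -20.1),
])

# Actionable (if day-old) question: which cold-chain units drifted above -18C?
for row in db.execute("SELECT device, ts, temp_c FROM readings WHERE temp_c > -18"):
    print(row)
```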

Rise of data micro-services

Data micro-services will continue to proliferate, but does this mean that they will truly rise, especially in a business — or Procurement — context? The best that will happen is that more analytics vendors will integrate more useful data streams for their clients to make use of — market data, risk data, supplier data, product data, etc. — but real-time micro-service subscriptions are likely still a few years off.

More in-memory processing

In-memory processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where more vendors will spend big on memory and move to all in-memory processing, or abandon it.

More natural-language processing

Natural language processing will continue to increase at the same rate it’s been increasing for the last decade. No more, no less. We’re not at the point where vendors will dive in any faster, or abandon it. It’s the same-old, same-old.

Graph analytics

Graph analytics will continue to worm its way into analytics platforms, but this won’t be the year it breaks out and takes over. Most vendors are still using traditional relational databases … even graph databases are still a stretch.

Augmented analytics

The definition of augmented intelligence is a system that can learn from human feedback and provide better insights and/or recommendations over time. While we do have good machine learning technology that can learn from human interaction and optimize (work)flows, when it comes to analytics, good insights come from identifying the right data to present to the user and, in particular, data that extends beyond organizational data, such as current market rates, supplier risk data, product performance data, etc.

Until we have analytics platforms that are tightly integrated with the right market and external data, and machine learning that learns not just from user workflows on internal data, but external data and human decisions based on that external data, we’re not going to have much in the way of useful augmented analytics in spend analysis platforms. The few exceptions in the next few years will be those analytics vendors that live inside consultancies that do category management, GPO sourcing, and similar services that collect meaningful market data on categories and savings percentages to help customers do relevant opportunity analysis.
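
When that learning loop does show up, the core of it can be as simple as the sketch below: track which insights users accept or dismiss, and re-rank accordingly. The insight names and the smoothing are hypothetical; a real system would also fold in the external market data discussed above:

```python
# Hypothetical insight catalog with accept/dismiss counters.
insights = {
    "maverick-spend-office":  {"shown": 0, "accepted": 0},
    "price-above-market-mro": {"shown": 0, "accepted": 0},
    "duplicate-invoices":     {"shown": 0, "accepted": 0},
}

def record_feedback(key: str, accepted: bool):
    insights[key]["shown"] += 1
    insights[key]["accepted"] += int(accepted)

def usefulness(stats: dict) -> float:
    # Laplace-smoothed acceptance rate, so unseen insights still get surfaced.
    return (stats["accepted"] + 1) / (stats["shown"] + 2)

# Simulate a week of analyst reactions.
record_feedback("duplicate-invoices", True)
record_feedback("duplicate-invoices", True)
record_feedback("maverick-spend-office", False)

# Insights the organization actually acts on float to the top.
for key, stats in sorted(insights.items(),
                         key=lambda kv: usefulness(kv[1]), reverse=True):
    print(f"{key}: {usefulness(stats):.2f}")
```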

Predictive analytics

As with natural language processing, predictive analytics will continue to be the same-old same-old: predictions based on traditional trend analysis. There won’t be much ground-breaking here, as only the vendors that are working on neural networks, deep learning, and other AI technologies will make any advancements — but the majority of these vendors are not (spend) analytics vendors.

Data automation

RPA is picking up, but like in-memory processing and semantic technology, it’s not going to all of a sudden become mainstream, especially in analytics — especially since it’s not just the automation of input and out-of-the-box reports that is useful, but the automation of processes that provide insight. And, as per our discussion of augmented analytics, insight requires external data integrated with internal data to surface meaningful trends.

No-code analytics

Cue the Woody Woodpecker laugh track, please! Because true analytics is anything but no-code. It’s lots and lots and lots of code. Hundreds and thousands and hundreds of thousands of lines of code. Maybe the UI makes it easy to build reports and extract insights with point-and-click and drag-and-drop, and allows an average user to do it without scripting, but the analytics provider will be writing even more code than you know to make that happen.

Come back tomorrow as we tackle the next ten.

2020 Is Here. Will We Ever Get 20/20 Vision into Our Technology Providers?

AI. Virtual Reality. Augmented Intelligence. Big Data. Autonomous Software. The Futurists are in a prediction frenzy, throwing around these words not only as if everyone understands them, but as if every provider has them.

Very few providers actually have these technologies, but the sad reality is that very few providers aren’t claiming to have them. Obviously, this is a problem. A big problem. Because the providers that claim to have these technologies and actually have them are only a small percentage — making it hard for anyone to see the big picture.

But we need to — and we need to see it clearly. Very clearly — because, as we have indicated many times, there is a lot more applied indirection out there than artificial intelligence. Similarly, it’s not really virtual reality unless it’s immersive, and while a lot of gamers might immerse all of their focus into their games, most games are not truly immersive. It’s not augmented intelligence unless the application intelligently provides a recommendation, and associated process, that is at least as good as the one you would come up with and, preferably, as good as a human expert’s. It’s not even close to being Big Data unless the application is capable of processing and working with more data than can fit in memory on an average server. (Big Data is a moving target — what was big in 2000 is small today.) And it’s not autonomous unless the application is capable of carrying out, on its own, processes that would normally require a human, with the exception of truly exceptional situations (as it should be able to handle most exceptions, especially if the exception has been handled before).
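
On the Big Data point in particular, the basic test is whether the application can work on data it cannot hold in memory all at once. The sketch below illustrates the streaming pattern with a tiny synthetic file — memory use is bounded by one row plus the running totals, not the dataset, which is what lets the same code scale past RAM:

```python
import csv
import os
import tempfile
from collections import defaultdict

# Build a small synthetic spend file (stands in for one far too big for RAM).
path = os.path.join(tempfile.gettempdir(), "spend.csv")
with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["supplier", "amount"])
    writer.writerows([("Acme", 120.0), ("Globex", 75.5), ("Acme", 33.1)] * 1000)

# Aggregate by streaming one row at a time; only the totals live in memory.
totals = defaultdict(float)
with open(path, newline="") as f:
    for row in csv.DictReader(f):
        totals[row["supplier"]] += float(row["amount"])

print(dict(totals))
os.remove(path)
```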

The reality is that while software is going to get more automated, and usability is going to continue to improve, we’re not going to see real AI for a while. The “Big Data” that most applications will be capable of handling will continue to be limited to user machine / browser memory. Virtual Reality is a ways off. Augmented Reality will continue to advance, but primarily in gaming.

But depending on what you are looking for, you likely don’t need AI, don’t need “big data”, don’t need autonomous, and definitely don’t need virtual reality. You just need a system that allows you, with some simple RPA, to digitize paper processes, automate common processes, and improve productivity.

And it would be nice if we could get some real 20/20 vision into what vendors actually have and what you really need.

But that might still be a pipe dream.