Category Archives: Technology

20 Analytics Predictions from the “Experts” for 2020 Part I

Guess how many will be 100% accurate?

(We’ll give you a hint. You only need one hand. You won’t need your thumb. And you’ll probably have fingers to spare.)

the doctor has been scouring the internet for the usual prediction articles to see what 2020 won't have in store. Because if there is just one thing overly optimistic futurist authors are good at, it's pointing out what won't be happening anytime soon, even though it should be.

This is not to say they’re all bust — some will materialize eventually and others indicate where a turning point may be needed — but they’re definitely not this year’s reality (and maybe not even this decade’s).

So, to pump some reality into the picture, the doctor is going to discuss the 19 anti-predictions that are taking over mainstream Net media … and then discuss the 1 prediction he found that is entirely 100% accurate.

In no particular order, we’ll take the predictions one by one.

Performance benchmarks will be replaced by efficiency benchmarks

This absolutely needs to happen. Performance benchmarks only tell you how well you've done, not how well you are going to do in the future. The only indication of that is how well you are doing now, and that is best measured by efficiency. But since pretty much all analytics vendors are only just getting good at performance benchmarks and dashboards, you can bet efficiency benchmarks are still a long way off.
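To make the distinction concrete, here is a minimal sketch (in Python, with entirely invented figures and field names) of a performance benchmark versus an efficiency benchmark: one looks backward at totals, the other at the current rate of output per unit of effort.

```python
# Minimal sketch of the performance-vs-efficiency distinction.
# All figures and field names are hypothetical, for illustration only.

last_year_events = [
    {"savings": 120_000, "hours_spent": 300},
    {"savings": 80_000,  "hours_spent": 150},
    {"savings": 200_000, "hours_spent": 500},
]

this_quarter_events = [
    {"savings": 30_000, "hours_spent": 40},
    {"savings": 25_000, "hours_spent": 35},
]

# Performance benchmark: how well you've done (a backward-looking total).
performance = sum(e["savings"] for e in last_year_events)

# Efficiency benchmark: how well you are doing now (savings per hour of
# sourcing effort in the current period), a better hint of future results.
current_savings = sum(e["savings"] for e in this_quarter_events)
current_hours = sum(e["hours_spent"] for e in this_quarter_events)
efficiency = current_savings / current_hours

print(f"Performance (last year's total savings): ${performance:,}")
print(f"Efficiency (current savings per sourcing hour): ${efficiency:,.0f}/hr")
```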

IoT becomes queryable and analyzable

… but not in real time. Right now, the best that will happen is that the signals will get pushed into a database on a near-real-time schedule (read: daily at best), indexed on a near-real-time basis (again, daily at best), and support meaningful queries that provide real, usable, actionable information to help users make decisions faster than ever before (but still not in real time).
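For illustration, that "near-real-time" pipeline tends to look in practice like a scheduled batch job along the lines of the following sketch. Table, device, and metric names are all hypothetical, and SQLite is used purely as a stand-in for whatever database a vendor actually runs.

```python
import sqlite3

# Hypothetical batch of IoT signals accumulated since the last run.
signals = [
    ("press-04", "temperature_c", 81.2, "2020-01-06T02:10:00Z"),
    ("press-04", "vibration_mm_s", 4.7, "2020-01-06T02:10:00Z"),
    ("conveyor-12", "temperature_c", 44.9, "2020-01-06T02:11:00Z"),
]

conn = sqlite3.connect("iot_signals.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS signal (
        device_id TEXT, metric TEXT, value REAL, observed_at TEXT
    )
""")

# The daily (not real-time) load step.
conn.executemany("INSERT INTO signal VALUES (?, ?, ?, ?)", signals)

# The near-real-time (again, daily at best) indexing step.
conn.execute("CREATE INDEX IF NOT EXISTS idx_signal_device ON signal (device_id, metric)")
conn.commit()

# Only now does the data become meaningfully queryable, e.g. to spot
# devices that ran hot yesterday: useful and actionable, but not real-time.
rows = conn.execute("""
    SELECT device_id, MAX(value) AS peak_temp
    FROM signal
    WHERE metric = 'temperature_c'
    GROUP BY device_id
    HAVING peak_temp > 80
""").fetchall()
print(rows)
```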

Rise of data micro-services

Data micro-services will continue to proliferate, but does this mean that they will truly rise, especially in a business — or Procurement — context? The best that will happen is that more analytics vendors will integrate more useful data streams for their clients to make use of — market data, risk data, supplier data, product data, etc. — but real-time micro-service subscriptions are likely still a few years off.
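In practice, that kind of integration usually means polling a provider's feed on a schedule and caching the result, rather than holding a true real-time micro-service subscription. A rough sketch of the batch version, with a hypothetical endpoint and field names:

```python
import json
import requests  # assumes the requests package is installed

# Hypothetical market-data micro-service endpoint, not a real URL.
MARKET_FEED_URL = "https://example.com/api/v1/commodity-prices"

def refresh_market_data(cache_path="market_cache.json"):
    """Poll the feed once (e.g. from a daily cron job) and cache it locally.
    This is batch integration, not a real-time subscription."""
    response = requests.get(MARKET_FEED_URL,
                            params={"commodities": "steel,copper"},
                            timeout=30)
    response.raise_for_status()
    with open(cache_path, "w") as f:
        json.dump(response.json(), f)
    return response.json()

def enrich_spend_line(spend_line, market_data):
    """Attach the cached market price to an internal spend record so an
    analyst can compare paid price against the market."""
    commodity = spend_line.get("commodity")
    spend_line["market_price"] = market_data.get(commodity)
    return spend_line
```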

More in-memory processing

In-memory processing will continue to increase at the same rate it's been increasing for the last decade. No more, no less. We're not at the point where more vendors will spend big on memory and move to all in-memory processing, or abandon it.

More natural-language processing

Natural language processing will continue to increase at the same rate it's been increasing for the last decade. No more, no less. We're not at the point where more vendors will dive in any faster or abandon it. It's the same-old, same-old.

Graph analytics

Graph analytics will continue to worm its way into analytics platforms, but this won’t be the year it breaks out and takes over. Most vendors are still using traditional relational databases … object databases are still a stretch.
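For the curious, the kind of question a graph model answers naturally (and a relational schema answers only with recursive joins) is a multi-tier one, e.g. "which sub-tier suppliers ultimately feed this product?". A minimal sketch using the networkx library and invented supplier names:

```python
import networkx as nx  # assumes networkx is installed

# Hypothetical multi-tier supply relationships: edge A -> B means A supplies B.
supply_chain = nx.DiGraph()
supply_chain.add_edges_from([
    ("RareEarthCo", "MagnetWorks"),
    ("MagnetWorks", "MotorAssembly"),
    ("SteelMill", "MotorAssembly"),
    ("MotorAssembly", "FinalProduct"),
])

# In a graph store this is a one-line traversal; in SQL it is a recursive
# join. Here: every supplier, at any tier, behind FinalProduct.
all_tiers = nx.ancestors(supply_chain, "FinalProduct")
print(sorted(all_tiers))
# ['MagnetWorks', 'MotorAssembly', 'RareEarthCo', 'SteelMill']
```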

Augmented analytics

The definition of augmented analytics is a system that can learn from human feedback and provide better insights and/or recommendations over time. While we do have good machine learning technology that can learn from human interaction and optimize (work)flows, when it comes to analytics, good insights come from identifying the right data to present to the user and, in particular, data that extends beyond organizational data, such as current market rates, supplier risk data, product performance data, etc.

Until we have analytics platforms that are tightly integrated with the right market and external data, and machine learning that learns not just from user workflows on internal data, but external data and human decisions based on that external data, we’re not going to have much in the way of useful augmented analytics in spend analysis platforms. The few exceptions in the next few years will be those analytics vendors that live inside consultancies that do category management, GPO sourcing, and similar services that collect meaningful market data on categories and savings percentages to help customers do relevant opportunity analysis.
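To make the idea of "learning from human decisions on internal plus external data" concrete, here is a deliberately toy sketch: opportunities are scored from internal overspend and external market movement, and the scoring weights are nudged by buyer accept/reject feedback. Every name and number is invented.

```python
# Toy sketch of an "augmented" feedback loop: recommendations are scored from
# internal spend data PLUS external market data, and the scoring weights are
# adjusted by human accept/reject feedback. All data is made up.

opportunities = [
    {"category": "Steel",  "internal_overspend": 0.12, "market_decline": 0.08},
    {"category": "Resins", "internal_overspend": 0.05, "market_decline": 0.15},
]

weights = {"internal_overspend": 0.5, "market_decline": 0.5}

def score(opp):
    return sum(weights[k] * opp[k] for k in weights)

def record_feedback(opp, accepted, learning_rate=0.1):
    """Nudge the weights toward the signals present in accepted opportunities
    and away from those in rejected ones: the crude core of 'learning from
    human decisions'."""
    direction = 1 if accepted else -1
    for k in weights:
        weights[k] += direction * learning_rate * opp[k]

# A buyer accepts the resin opportunity (driven by external market data)...
record_feedback(opportunities[1], accepted=True)
# ...so future rankings lean slightly more on market movement.
ranked = sorted(opportunities, key=score, reverse=True)
print([o["category"] for o in ranked], weights)
```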

Predictive analytics

As with natural language processing, predictive analytics will continue to be the same-old, same-old predictions based on traditional trend analysis. There won't be much ground-breaking here, as only the vendors that are working on neural networks, deep learning, and other AI technologies will make any advancements — but the majority of these vendors are not (spend) analytics vendors.
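To be clear about what "traditional trend analysis" amounts to: most so-called predictive analytics in this space is little more than fitting a line to historical price or spend points and extrapolating. A minimal sketch with invented monthly prices:

```python
import numpy as np

# Hypothetical monthly unit prices for a category.
months = np.arange(12)
prices = np.array([102, 101, 103, 104, 106, 105,
                   107, 109, 108, 110, 112, 111], dtype=float)

# Classic trend analysis: fit a straight line and extrapolate.
slope, intercept = np.polyfit(months, prices, deg=1)
next_quarter = [slope * m + intercept for m in range(12, 15)]

print(f"Fitted trend: {slope:.2f}/month")
print("Naive 3-month forecast:", [round(p, 1) for p in next_quarter])
```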

Data automation

RPA is picking up, but like in-memory processing and semantic technology, it's not going to all of a sudden become mainstream, especially in analytics. Especially since what is useful is not just automating data input and out-of-the-box reports, but automating processes that provide insight. And, as per our discussion of augmented analytics, insight requires external data integrated with internal data in meaningful trends.
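The difference between automating a report and automating an insight is roughly the difference between re-running last month's totals and automatically flagging categories where the organization is paying above an externally sourced market rate. A toy sketch of the latter, with all data invented:

```python
# Toy sketch: an automated job that blends internal spend with an external
# price index and surfaces an insight, not just a report. All data invented.

internal_spend = [
    {"category": "Steel",  "avg_paid_price": 720.0,   "annual_volume": 1_500},
    {"category": "Copper", "avg_paid_price": 6_050.0, "annual_volume": 300},
]

# Pretend this came from a market-intelligence feed refreshed overnight.
market_index = {"Steel": 655.0, "Copper": 6_100.0}

THRESHOLD = 0.05  # flag anything more than 5% above market

for line in internal_spend:
    market = market_index.get(line["category"])
    if market is None:
        continue
    premium = (line["avg_paid_price"] - market) / market
    if premium > THRESHOLD:
        exposure = (line["avg_paid_price"] - market) * line["annual_volume"]
        print(f"{line['category']}: paying {premium:.0%} above market "
              f"(~${exposure:,.0f}/yr opportunity)")
```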

No-code analytics

Cue the Woody Woodpecker laugh track, please! Because true analytics is anything but low-code. It's lots and lots and lots of code. Hundreds and thousands and hundreds of thousands of lines of code. Maybe the UI makes it easy to build reports and extract insights with point-and-click and drag-and-drop, and allows an average user to do it without scripting, but the analytics provider will be writing even more code than you know to make that happen.
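To make that point concrete, even the simplest drag-and-drop report builder hides a code generator behind the click. A grossly simplified, hypothetical sketch of what might sit behind one point-and-click report:

```python
# Grossly simplified sketch of what a "no-code" report builder does behind
# the scenes: turn a point-and-click spec into code (here, SQL). Real
# platforms do vastly more: validation, joins, security, caching, etc.

report_spec = {                      # what the user "built" by clicking
    "table": "spend",
    "group_by": ["category", "supplier"],
    "measures": {"amount": "SUM"},
    "filters": {"fiscal_year": 2020},
}

def spec_to_sql(spec):
    measures = ", ".join(f"{fn}({col}) AS {col}_{fn.lower()}"
                         for col, fn in spec["measures"].items())
    group_cols = ", ".join(spec["group_by"])
    where = " AND ".join(f"{col} = {val!r}" for col, val in spec["filters"].items())
    return (f"SELECT {group_cols}, {measures} FROM {spec['table']} "
            f"WHERE {where} GROUP BY {group_cols}")

print(spec_to_sql(report_spec))
# SELECT category, supplier, SUM(amount) AS amount_sum FROM spend
# WHERE fiscal_year = 2020 GROUP BY category, supplier
```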

Come back tomorrow as we tackle the next ten.

2020 Is Here. Will We Ever Get 20/20 Vision into Our Technology Providers?

AI. Virtual Reality. Augmented Intelligence. Big Data. Autonomous Software. The Futurists are in a prediction frenzy and throwing around these words not only like everyone understands them but every provider has them.

Very few providers actually have these technologies, but the sad reality is that very few providers aren't claiming to have them. Obviously, this is a problem. A big problem. Because the percentage of providers that both claim to have these technologies and actually have them is small — making it hard for anyone to see the big picture.

But we need to — and we need to see it clearly. Very clearly — because, as we have indicated many times, there is a lot more applied indirection out there than artificial intelligence. Similarly, it's not really virtual reality unless it's immersive, and while a lot of gamers might immerse all of their focus into their games, most games are not truly immersive. It's not augmented intelligence unless the application intelligently provides a recommendation, and an associated process, that is at least as good as you would come up with and, preferably, as good as a human expert would. It's not even close to being Big Data unless the application is capable of processing and working with more data than can fit in memory on an average server. (Big Data is a moving target — what was big in 2000 is small today.) And it's not autonomous unless the application is capable of executing, on its own, processes that would normally require a human, with the exception of truly exceptional situations (as it should be able to handle most exceptions, especially exceptions that have been handled before).
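For what it's worth, one honest test of whether an application handles data bigger than memory is whether it processes the data in chunks (or distributes it) rather than loading it whole. A minimal sketch of out-of-core aggregation over a hypothetical transactions.csv, using pandas:

```python
import pandas as pd  # assumes pandas is installed

# Out-of-core aggregation: process a file too large for memory in chunks,
# keeping only the running aggregate in memory. File and columns are
# hypothetical.
totals = {}

for chunk in pd.read_csv("transactions.csv", chunksize=1_000_000):
    partial = chunk.groupby("category")["amount"].sum()
    for category, amount in partial.items():
        totals[category] = totals.get(category, 0.0) + amount

print(sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10])
```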

The reality is that while software is going to get more automated, and usability is going to continue to improve, we’re not going to see real AI for a while. The “Big Data” that most applications will be capable of handling will continue to be limited to user machine / browser memory. Virtual Reality is a ways off. Augmented Reality will continue to advance, but primarily in gaming.

But depending on what you are looking for, you likely don’t need AI, don’t need “big data”, don’t need autonomous, and definitely don’t need virtual reality. You just need a system that allows you, with some simple RPA, to digitize paper processes, automate common processes, and improve productivity.

And it would be nice if we could get some real 20/20 vision into what vendors actually have and what you really need.

But that might still be a pipe dream.

Virtual Procurement Centers of Excellence: Will We Ever Realize Them?

Three years ago we told you that Virtual Procurement Centers of Excellence were The Next Level of Complex Direct Procurement and that your sourcing platform should enable this.

But it’s three years later, and we still only have a handful of S2P platforms that can properly support bill of materials for direct sourcing. (In fact, you don’t even need all your fingers!)

Add to this the fact that, on most platforms, either ERP integration is minimal, support for modification and should-cost modelling is limited, or there is no support for integrating price indices or market intelligence, and it's still a pretty sorry state of affairs.

Especially since true value is only going to be realized with not only proper insight into bill of materials costs, but also insight into what the bill of materials should look like. (Maybe the steel being used is inferior, maybe the rare earth metal content could be reduced with a better design, etc.)

In other words, you need a platform that not only supports full ERP integration, BoM modelling and management, and deep should-cost modelling, but also up-to-date market intelligence. This should not be limited to commodity feeds (as these are not global or available for all commodities), but should also use community intelligence, especially around labour rates and energy costs in a region.
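At its simplest, a should-cost model is just a roll-up of the bill of materials priced against current market intelligence (commodity indices, regional labour and energy rates), compared against what you are being quoted. A toy sketch, with every figure invented:

```python
# Toy should-cost roll-up: BoM quantities priced against market intelligence.
# Every figure here is invented, purely for illustration.

bom = [  # part, commodity, kg per unit
    {"part": "housing",  "commodity": "aluminium", "kg": 1.8},
    {"part": "rotor",    "commodity": "steel",     "kg": 2.4},
    {"part": "windings", "commodity": "copper",    "kg": 0.6},
]

market = {  # hypothetical index prices, $/kg
    "aluminium": 1.80, "steel": 0.75, "copper": 6.10,
}
labour_cost_per_unit = 4.50   # from regional / community intelligence
energy_cost_per_unit = 1.20
overhead_and_margin = 0.18    # assumed 18% uplift

material_cost = sum(line["kg"] * market[line["commodity"]] for line in bom)
should_cost = (material_cost + labour_cost_per_unit + energy_cost_per_unit) \
              * (1 + overhead_and_margin)

quoted_price = 19.50
print(f"Should-cost: ${should_cost:.2f} vs quoted ${quoted_price:.2f} "
      f"({(quoted_price - should_cost) / should_cost:+.0%} vs model)")
```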

But we only have one S2P platform with real budding community intelligence, and its support for direct is relatively non-existent compared to some peers.

So the question is, are we ever going to realize them? For all of the reasons we gave three years ago, and then some, we need Virtual Procurement Centers of Excellence for Direct, backed by Market and Community Intelligence, but they still seem to be in the distant future.

So what do you think? Are the current S2P players going to up their game to where we need them to be? Or do we need a new breed of players to come out of the shadows and show the market what is needed?

The Platform is Becoming Ever More Important …

In Monday’s post, we quoted an excerpt from Magnus’ interview on Spend Matters where he noted how important it is to start with the most important capabilities / modules and build out towards a full S2P suite (because he knows as well as the doctor does that a big bang approach typically results in a big explosive bang that usually takes your money and credibility with it). If you examine this closely, you see that you need to select not only the right starting solution, but a starting solution that can grow.

This requires a platform approach from the get-go. It doesn’t need to underlie the starting modules, it doesn’t need to underlie the ending modules, it just needs to underlie the suite you want to put together. It can be part of an application you already have or a third party application you buy later. But it has to exist.

The simple fact of the matter is that you can’t put together an integrated solution that supports an integrated source-to-pay workflow if you don’t have a platform to build it on. And you can’t patch it together with endpoint integrations using whatever APIs happen to be available — that just enables you to push data from one point into another … or pull it from one point to another. That’s not an integrated solution, which requires an integrated workflow; it’s just data integration. And while that is a start, it’s not enough. Especially when there is no one-size-fits-all category strategy, or source-to-contract or procure-to-pay workflow, for even the smallest of organizations with the simplest of needs.

So before you select any solution, the first thing you have to make sure of is that it is built on, or works with, a true platform … otherwise, you may find as you undertake your S2P journey that a component you selected early does not fit the bill and you have to repeat steps … which is something you really can’t afford to do.

You Wouldn’t Let Your Banker Pick Out Your Job …

So why do you let a systems implementor / integrator choose your Sourcing / Procurement system???

And while you might initially believe that this analogy is far-fetched, the reality is that it hits very close to home. While a banker is the right partner to help you manage your money, he or she is probably the worst person to figure out the right job for you, given that he or she doesn’t really know you. Similarly, while your preferred implementation / integration partner is probably the best company out there to implement the platform that will control the majority of your organizational spending, chances are that partner has no knowledge of the true breadth of your Procurement processes and no clue what the right kind of system for the organization would be. And as a result, just like a banker might steer you towards a job you’d fail miserably at (and lose, leaving you without a pay cheque), an implementor / integrator might steer you towards a system that will not work at all for your organization, and cost your organization millions in the process.

Furthermore, this is also true for any consultancy that has partnerships with a select group of source-to-pay vendors. In fact, taking advice from any of the consultancies that have partnerships with a select group of source-to-pay vendors is MORE risky than an implementation partner without any relationships. Why? Because these consultancies, by way of their partnerships, tend to ONLY recommend their partners because:

  1. that’s all they tend to implement, and know, and
  2. their partnerships provide them with referral fees, guaranteed services, and / or higher margins (and the senior partners at these consultancies mandate that these options are always recommended)

So, if your preferred consulting partner only has relationships with platforms that are primarily for indirect S2P, but your organization is primarily direct S2P, your organization’s chances of getting a good recommendation are zero. That’s right. Zero! (Even worse than a generic systems implementor with no knowledge of the space doing a Google search, coming up with five vendors, and making a random recommendation — at least then you have a 20% chance of getting a good recommendation!)

In other words, if you want a good recommendation, you have to ask a neutral third party, like an analyst firm, a niche consultancy which does not do implementations (and has no partnerships), or a consultancy that uses third party evaluations to provide you with the best recommendations it can, leaving aside any partnerships the consultancy might have. (For example, such a consultancy could license Spend Matters Customer Maps, which are Solution Maps with custom personas defined specific to the client needs, to help your organization identify the best fits and then help your organization with the RFIs to identify the best-of-the-best).

Otherwise, the doctor can pretty much guarantee you’re always going to be recommended vendors A and B (and maybe C) in North America and vendors X and Y (and maybe Z) in Europe … even though there are 8 S2P platforms and dozens of best-of-breed solution providers that might be right for you (as Solution Map ranks over 50 and plans to add many more over time). [Not that A, B, C, X, Y, and Z aren’t good in the right situation — but in S2P, one-size does not fit all — especially when you consider direct vs indirect, product vs service, head vs tail spend, strategic process requirements, optimization and analytics needs, automation, etc. — and the fact that some providers never get recommended even though for certain industries they are usually the best choice.]

So again, unless you want a quick way to triple your losses, don’t let an implementor choose your S2P platform. You choose it, and as per a recent piece of the doctor‘s over on Spend Matters, you take what you want!