
Whenever you find yourself on the side of the majority …

… it is time to pause and reflect.
Mark Twain

As far as the doctor is concerned, nothing could be closer to the truth. SI has never striven to be #1 in breadth, readership, coverage, etc. for this very reason: it has no interest in joining a majority that is willing to accept the status quo, the inefficiencies, the lack of progress, and, most importantly, the lack of innovation.

He’s also not interested in joining the ranks of analysts who think you can do an in-depth vendor review from a PowerPoint presentation and a few customer references. SI has never covered a vendor without a live demo and never will. (At least this explains why so many tragic quadrants and product grave reports are so out of whack.)

Nor is he willing to allow “sponsored posts”. Such a slippery slope. Maybe you insist that the posts be written by the experts, and the sponsor agrees, but then you get a bait and switch: the expert was too busy, so the social media coordinator wrote the post instead; it’s regurgitated drivel, but a spot was paid for, and if you don’t post it, they threaten breach of contract.

Or attend half a dozen me-too conferences every spring and every fall and deliver the same old speech o’er and o’er again. Some analysts and bloggers like the junket circuit, but most of the people who are there time and time again fall into two categories: those who like to hear themselves talk and those who are trying to talk themselves into a new job.

Nor does he feel like toeing the same-old same-old line put forth by all of the heavyweights that have been selling the same sauce for over a decade. (He has nothing against big companies that keep innovating, but, unfortunately, a few that sold out to big enterprise software companies really haven’t innovated anything since they did. Likely because integrations at these giants take so long that, by the time you can get back to innovation, the boat has been commandeered by quicker competition who snuck aboard during the night and sailed it out of the harbour.)

And he definitely won’t sign an NDA to get a demo. (After all, how could you cover anything once you signed an NDA?) The rules are simple on SI — an open demo is a requirement for coverage. The vendor doesn’t have to show anything they don’t want to or answer any questions they don’t want to (but if too much is kept secret and not enough is shown to convey the value, then the chances of coverage aren’t great, as SI always wants to cast a vendor in a positive light if there is value in the solution for a subset of the market), but they remain secretive at their own risk.

Is this alone enough to keep the doctor and SI out of the majority and on the innovation path? Hard to say, but that is where the doctor wishes to stay. And he hopes that you’ll stay here too and that you will …

I try not to get involved in the business of prediction …

… It’s a quick way to look like an idiot.
Warren Ellis

A new year is but six weeks away … and you know what that means (besides holiday frenzy, too much turkey, broken resolutions, and the Times Square ball drop) … prediction time is right around the corner.

Snow isn’t even on the ground yet (and remember that the doctor lives in the Great White North) and already we’re seeing “prediction” articles for 2017. For example, Labels & Labelling, trying to beat the rush, posted their predictions for 2017, quoting industry pros who predicted (surprise) more automation, more web-based customer interaction, the continual decline of the printing industry (traditional, not home-printer manufacturing), rapid change, and so on. But this followed a prediction piece from CIO on cloud computing trends on the 2nd of November, which should matter to us all now that almost all supply management software offerings are multi-tenant cloud instances. And that was over a month after Ardent Partners launched its tech and innovation outlook report for 2017 at the end of September.

Of course, they couldn’t get the jump on the price forecasters, like Metal Miner and Spend Matters, who have been posting price outlooks (such as the plastic resin price outlook) for a while now. Or the freight rate forecasters (including CIPS) that have been predicting rates since the quarter started as well.

And it’s driving the doctor nuts. the doctor hates predictions. Most of the good ones rest on the assumption that people are logical and will do the logical thing. They won’t. Not because they don’t want to, but because they work for organizations that don’t always make logical, or even informed, decisions (as per yesterday’s post). If the CFO doesn’t believe the ROI claim, the COO doesn’t believe the efficiency improvement claim, or the CEO just doesn’t like the sound of it … it won’t happen. So even though it might seem logical that this will be the year this proven, successful, time-saving, value-generating solution will take off … it might, or, more likely, it might not.

Then there are the fake futurists who make crazy predictions (like this will be the year the printing press will die, or radio advertising will end, or the entire factory will be automated) just to get attention. For example: air freight will kill ocean freight, railroads will rise again, on-premise ERP will finally die, etc. We all know this ain’t gonna happen. Then there are the slightly non-obvious, non-fantastic, but boring predictions whose chances are near 50/50 and that just reiterate the same-old same-old until they come true. (But who wants to read the same-old, same-old again? As we’ve clearly indicated in our Future of Procurement series, it’s the same old sh!t over and over again. Please Kill It! [NSFW])

In other words, those who make predictions have two choices — be crazy, and look like an idiot, if they want to be read, or state the obvious, and bore their audience to sleep. (Let’s just hope their audience doesn’t have KLS, Kleine-Levin Syndrome. While it’s likely triggered by infection, medical science is not 100% sure.)

Like LOLCat, there are only two predictions the doctor can get behind. The first is that put forward by the public defender, who last year noted that, no matter what, all predictions will be wrong. The second, as astutely noted by the maverick, is that, regardless of what the false futurists predict, we will forever be in the year of the Chief Buzzword Officer. Ugh.

the doctor doesn’t mind looking like an idiot, especially if that’s what it takes to spread the truth, but please, please, please don’t ask for predictions. Please let them rest in peace!

Informed Decision-Making …

… comes from a long tradition of guessing and then blaming others for inadequate results.
Scott Adams

And this summarizes how most decisions are made in most companies, especially in the C-Suite. Why? Because, historically, organizations didn’t have much in the way of good data; most of the data that was available (by the time it was assembled, analyzed, and reported on) was outdated; and any decision made on that data was iffy if the organization was in a fast-moving business.

So, as a result, the best executives learned to “read the tea leaves”, listen to third parties, ignore them, and then go with whatever their gut was telling them … especially if their gut was right more often than wrong (and that’s how they got to their position). (And then, if something didn’t work out, they blamed the underlings who gave them the analysis that supported their gut feeling.*)

But in today’s fast-moving, hyperconnected world, where, for every unsatisfied customer, there are three more companies waiting to jump in and satisfy that customer, there’s no time for slip-ups from bad gut-feel decisions. There’s no room for guessing.

And, with so many software applications today that can process more data than an organization needs (it’s not about how big a data set you can get, but how big a data set makes sense) in real time, every organization should be making decisions based upon good, extensive data and likely possibilities. No data set is ever complete, and no trend or data-based prediction is ever perfect, but when there are so many systems that can bat .950 while most of the best seat-of-the-pants executive decision makers struggle to bat .500, why isn’t the average organization using one of these systems for all decisions?

For starters, every organization should have a good spend analytics system that is capable of working on all spend, and all numeric spend-related data, and that can compute trend lines. The organization should know which products are taking off, which are nearing the end of their life-cycle, and which are flat. And it should also be able to analyze market data to see how raw material cost and availability are trending.
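
To make the trend-line requirement concrete, here is a minimal sketch of the kind of computation such a system performs; the category names, spend figures, and the 2% flatness threshold are all hypothetical placeholders, not any particular vendor’s method.

```python
# Minimal sketch: least-squares trend line per spend category.
# All categories, figures, and thresholds below are hypothetical.

def trend_slope(monthly_spend):
    """Least-squares slope of spend over time (dollars per month)."""
    n = len(monthly_spend)
    x_bar = (n - 1) / 2
    y_bar = sum(monthly_spend) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(monthly_spend))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return num / den

def classify(slope, average, threshold=0.02):
    """Rising/declining if the monthly change exceeds 2% of average spend."""
    if slope > threshold * average:
        return "taking off"
    if slope < -threshold * average:
        return "declining"
    return "flat"

categories = {
    "packaging":   [110_000, 115_000, 121_000, 126_000, 133_000, 140_000],
    "office":      [42_000, 41_500, 42_300, 41_900, 42_100, 41_800],
    "print media": [90_000, 84_000, 79_000, 72_000, 66_000, 60_000],
}

for name, spend in categories.items():
    slope = trend_slope(spend)
    print(f"{name}: {classify(slope, sum(spend) / len(spend))} ({slope:+,.0f}/month)")
```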

But it needs to do more than just analyze organizational spend data, buying trends, and commodity markets. It also needs to analyze market trends in various product lines, including those it does not (yet) produce, and predict how well a new or altered product might sell based upon sales of similar product lines. So not only does the organization need a spend analytics solution, it needs a predictive analytics solution as well. Every regular reader knows that the doctor does not believe in AI and that decisions should not be handed over blindly to a system, but the suggestions of a good system with a good track record, properly configured, should be carefully evaluated, much more so than the gut-feel of a random executive, especially when the batting average of these systems is almost double. They’ll miss occasionally, but a good analysis will reveal that (and why, which allows the system to be improved), and, most importantly, analytic effort will be focussed where it makes sense to focus, not on random areas to support random hypotheses with no foundation in reality (which is where a lot of effort goes in guess-work-run organizations).
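
As a sketch of what “predict from similar product lines” can look like in its simplest form, here is a similarity-weighted analog forecast; the product names, launch curves, and weights are invented for illustration, and real predictive analytics solutions are considerably more sophisticated.

```python
# Minimal sketch: analog-based forecast for a new product line from the
# launch curves of similar existing lines. All data here is hypothetical.

# Monthly unit sales for the first six months after each analog's launch,
# paired with a (made-up) similarity weight to the proposed product.
analogs = {
    "widget-classic": ([1200, 1900, 2600, 3100, 3300, 3400], 0.9),
    "widget-mini":    ([800, 1400, 2100, 2500, 2800, 2900], 0.6),
    "widget-pro":     ([500, 900, 1500, 1900, 2200, 2300], 0.5),
}

total_weight = sum(weight for _, weight in analogs.values())
forecast = [
    sum(sales[m] * weight for sales, weight in analogs.values()) / total_weight
    for m in range(6)
]

for month, units in enumerate(forecast, start=1):
    print(f"month {month}: ~{units:,.0f} units")
```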

* The really successful executives always asked for multiple analyses until they had data that supported their decision, just in case.

It’s Not What You Pay a Man …

… but what he costs you that counts.
Will Rogers

Will Rogers was born in a time when many businesses were vertically integrated, controlling everything from the extraction of the raw material from the mine to the final delivery of the end product to the consumer, and they succeeded or failed on the caliber of man they hired.

But if he were alive today, I bet he’d be saying:

It’s not what you pay a vendor, but what the vendor costs you that counts.

All vendors of software and services cost you — and they typically cost much more than the license fees or consulting hours they bill you for. We’ll start with a services provider. Besides the myriad of expenses they will bill you for (that will be just within tolerance), there will also be the costs of supervising the resources, evaluating the deliverables, participating in regular review meetings, monitoring the relationship, and so on — and all of these take up time, which carries a huge opportunity cost.

But this is nothing compared to what a software/platform vendor will cost you.

A vendor touting the virtues of on-premise software will not only charge you an installation fee, a license fee, and an (on-site) maintenance fee, but will also charge you regular (emergency) upgrade fees, change fees, and so on. On top of that, there will be the costs of the supporting software it needs (databases, middleware, etc.), the hardware it needs to run on, the training to use and support the software, and so on. If the vendor is an ASP, these costs will all be rolled up and hidden in a monthly service fee that will also include the personnel costs to manage the instance and a portion of the overhead cost of the facility. And if the vendor is SaaS, there will still be a single fee, but since the facility is multi-tenant, it will be less.
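
To see why the delivery model changes the bill, here is a minimal back-of-the-envelope total-cost sketch; every figure is a hypothetical placeholder to be replaced with real quotes, not a market benchmark.

```python
# Minimal sketch: five-year cost comparison across delivery models.
# All fees and cost figures below are hypothetical placeholders.

YEARS = 5

def on_premise_tco():
    install = 50_000
    license_fee = 200_000                 # up-front, perpetual
    annual_maintenance = 0.20 * license_fee
    hardware_and_middleware = 80_000      # servers, databases, middleware
    annual_internal_support = 60_000      # admins, training, upgrades
    return (install + license_fee + hardware_and_middleware
            + YEARS * (annual_maintenance + annual_internal_support))

def asp_tco():
    # Single-tenant hosting: the same costs, rolled up and hidden in the fee.
    monthly_fee = 18_000
    return YEARS * 12 * monthly_fee

def saas_tco():
    # Multi-tenant: facility and personnel costs are shared, so it's less.
    monthly_fee = 9_000
    return YEARS * 12 * monthly_fee

for name, cost in [("on-premise", on_premise_tco()),
                   ("ASP", asp_tco()),
                   ("SaaS", saas_tco())]:
    print(f"{name}: ${cost:,.0f} over {YEARS} years")
```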

But regardless of the platform, there will still be other costs — for instance, most of today’s sourcing and procurement platforms don’t deliver value unless the prerequisites are met. For some platforms, this means connectivity to other systems. For others, this means good data … and lots of it. Data that typically resides in a myriad of other systems, in various states of incompleteness and incorrectness, and that needs to be centralized, corrected, completed, and enriched — an effort that can cost thousands of man-hours and hundreds of thousands of dollars for some organizations. And if the system is relatively worthless until most of that data is loaded (as with spend analysis, for example), then the organization will have to spend many times the system cost to get any real value.
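
Here is a minimal sketch of that “centralize, correct, complete, enrich” step, applied to a pair of supplier records from two hypothetical source systems; a real cleansing project covers millions of records and far messier variants, but the steps are the same.

```python
# Minimal sketch: match and merge supplier records from two source systems.
# All supplier records below are hypothetical.
from difflib import SequenceMatcher

erp_records = [
    {"name": "Acme Industrial Supply", "country": "US", "duns": "123456789"},
    {"name": "Global Widgets Ltd.", "country": "", "duns": None},
]
ap_records = [
    {"name": "ACME INDUSTRIAL SUPPLY INC", "country": "US", "duns": None},
    {"name": "Global Widgets Limited", "country": "GB", "duns": "987654321"},
]

def normalize(name):
    # Correct: strip punctuation and suffix noise so variants can be matched.
    stripped = name.upper().replace(".", "").replace(",", "").strip()
    for suffix in (" INC", " LTD", " LIMITED", " LLC"):
        if stripped.endswith(suffix):
            stripped = stripped[: -len(suffix)]
    return stripped

def same_supplier(a, b, threshold=0.85):
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

# Centralize and complete: fill each record's gaps from its duplicate.
merged = []
for erp in erp_records:
    for ap in ap_records:
        if same_supplier(erp["name"], ap["name"]):
            merged.append({k: erp[k] or ap[k] for k in erp})

for record in merged:
    print(record)
```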

The same goes for systems that require templates and libraries to be useful — like contract management systems. If the authoring feature doesn’t simplify matters until a few hundred templates and a few thousand clauses are created, indexed, and cross-indexed, countless hours from paralegals and lawyers will be needed to make it useful.

In other words, when it comes to vendors, it’s not what you pay, it’s what they cost. And if the return doesn’t outweigh the cost by at least a factor of 3, think twice.
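
In arithmetic terms, the rule of thumb looks like this; the figures are hypothetical, and total_cost must be the all-in cost just described (fees, data work, training, oversight), not the sticker price.

```python
# Minimal sketch: the factor-of-3 rule of thumb. Figures are hypothetical.

def worth_it(expected_annual_benefit, years, total_cost, factor=3):
    return expected_annual_benefit * years >= factor * total_cost

total_cost = 540_000          # e.g. the 5-year SaaS figure sketched earlier
annual_benefit = 400_000      # projected savings and efficiency gains

print(worth_it(annual_benefit, 5, total_cost))  # True: 2.0M >= 1.62M
```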

The First Rule of Any Technology …

… used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.
Bill Gates

So many companies forget this in their rush to implement new S2C / P2P / S2P systems after finally getting budget approval. If you just automate what you have, you’ll simply amplify your mess and your problems.

Take sourcing. If, all of a sudden, a buyer can go from 50 mini-RFX events to 250 mini-RFX events, that is not always a good thing. What if the buyer always uses the suppliers she favours, who recommend custom SKUs for every project? In this situation, all that will happen is that the buyer will proliferate SKUs throughout the system, often adding SKUs for products that were already supplied by another supplier (the one the stockroom clerk ordered from) that was not invited to the RFQ because it was not one of the buyer’s favoured suppliers.

And while this theoretically increases Spend Under Management (SUM), as it gets the spend into the system, it really just increases Spend Under Record (SUR): instead of properly managing the spend — which in this case would have resulted in SKU standardization instead of proliferation, and category-based supplier rationalization based on a collective stakeholder scorecard rather than just buyer preference — all the buyer did was add more chaos to the spend.
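
Here is a minimal sketch of the kind of standardization check that proper spend management would run to catch that proliferation; the SKUs, descriptions, and the 0.8 similarity cutoff are all hypothetical.

```python
# Minimal sketch: flag near-duplicate SKU descriptions entered under
# different SKUs. All catalog entries below are hypothetical.
import re
from difflib import SequenceMatcher
from itertools import combinations

catalog = {
    "SKU-10231": "Nitrile glove, blue, powder-free, large, box/100",
    "SKU-20984": "glove, nitrile, powder free, blue, large, 100/box",
    "SKU-33471": "safety glasses, clear, anti-fog",
}

def canonical(description):
    # Sort the tokens so word order and punctuation don't hide duplicates.
    tokens = [t for t in re.split(r"[^a-z0-9]+", description.lower()) if t]
    return " ".join(sorted(tokens))

for (sku_a, desc_a), (sku_b, desc_b) in combinations(catalog.items(), 2):
    score = SequenceMatcher(None, canonical(desc_a), canonical(desc_b)).ratio()
    if score >= 0.8:
        print(f"likely duplicate SKUs: {sku_a} / {sku_b} (similarity {score:.2f})")
```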

As another example, take invoice processing. As the purchasing wizard regularly laments over on Purchasing Insight, many organizations still think invoice automation means OCR and automatic field extraction based on keywords or relative location in the document. This in a time when most suppliers have EDI or the ability to send some form of standard XML, and when just about every decent e-Procurement or Source-to-Pay platform allows smaller suppliers without these abilities to “PO-flip” to an invoice. Some platforms even allow virtual printer drivers to be distributed (for Windows and Mac) that let a supplier “print” an invoice from their AR software straight to the e-Procurement platform. With so many options that don’t require error-prone OCR, why would anyone in their right mind* even consider it?
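
For the avoidance of doubt about what a “PO-flip” is, here is a minimal sketch: the invoice is derived from the purchase order already in the platform, so nothing is ever scanned. The field names and structures are hypothetical, not any particular platform’s API.

```python
# Minimal sketch: flip a purchase order into a draft invoice.
# The PO, field names, and figures below are hypothetical.
from datetime import date

purchase_order = {
    "po_number": "PO-2016-04412",
    "supplier_id": "S-0883",
    "currency": "USD",
    "lines": [
        {"line": 1, "sku": "SKU-10231", "qty": 40, "unit_price": 12.50},
        {"line": 2, "sku": "SKU-33471", "qty": 10, "unit_price": 6.75},
    ],
}

def po_flip(po, invoice_number):
    """Flip a PO into a draft invoice; the supplier only confirms/adjusts."""
    lines = [dict(l, amount=l["qty"] * l["unit_price"]) for l in po["lines"]]
    return {
        "invoice_number": invoice_number,
        "po_number": po["po_number"],      # guarantees a clean two-way match
        "supplier_id": po["supplier_id"],
        "currency": po["currency"],
        "invoice_date": date.today().isoformat(),
        "lines": lines,
        "total": sum(l["amount"] for l in lines),
    }

invoice = po_flip(purchase_order, "INV-8873")
print(invoice["total"])  # 567.5
```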

Before automating anything, be sure to do a formal process review, identify any areas that are inefficient and any areas that could be improved by technology, and then define what the processes should look like. Only then do you automate. And be sure to measure whether or not the automation is delivering the planned results. This means you should have, and be reading, throughput/efficiency metrics before the conversion and throughput/efficiency metrics after the conversion. And the metrics should move in the right direction. If they don’t, stop and figure out why. Automation should help, not hinder.
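
A minimal sketch of that before/after check follows; the metric names, baselines, and post-conversion numbers are hypothetical.

```python
# Minimal sketch: compare throughput/efficiency metrics before and after
# an automation conversion. All metrics and values below are hypothetical.

baseline = {  # measured before the conversion
    "invoices_per_fte_per_day": 35,
    "avg_cycle_time_days": 12.0,
    "exception_rate": 0.18,
}
post_automation = {  # measured after the conversion
    "invoices_per_fte_per_day": 80,
    "avg_cycle_time_days": 4.5,
    "exception_rate": 0.22,   # worse! stop and figure out why
}
# Direction each metric should move: +1 means higher is better.
direction = {
    "invoices_per_fte_per_day": +1,
    "avg_cycle_time_days": -1,
    "exception_rate": -1,
}

for metric, before in baseline.items():
    after = post_automation[metric]
    improved = (after - before) * direction[metric] > 0
    status = "improved" if improved else "REGRESSED -- investigate"
    print(f"{metric}: {before} -> {after} ({status})")
```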

* We know, we know. Many MBAs aren’t always in their “right mind”. 😉