Category Archives: Technology

AI: Applied Indirection Part II

In yesterday’s post we told you that many companies touting AI were not actually selling Artificial Intelligence, or even anything remotely similar (including, but not limited to, Autonomous Intelligence, Augmented Intelligence, Assisted Intelligence, and/or Amplified Intuition). They were, in fact, using the buzz-acronym to accomplish applied indirection and sell you 90s tech in a shiny new wrapper, proffering yesterday’s miracle cure for all of your current woes.

The only difference between the 90s solutions and today’s is that today’s look nicer, run faster (but that’s mainly due to the exponential increase in computing power), and have more automation built in. But RPA — Robotic Process Automation — is NOT AI. It’s just a rules engine and a workflow used to automate common tasks under typical conditions.
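
For the skeptical, here is a minimal sketch of what most “RPA” boils down to under the hood, assuming a generic invoice-routing task (the task, field names, and thresholds are all illustrative, not any vendor’s actual API): a list of if/then rules walked in order.

```python
# Minimal sketch of what most "RPA" amounts to: a hand-coded rules engine
# driving a fixed workflow. All names and thresholds are illustrative.

def route_invoice(invoice):
    """Apply fixed if/then rules to decide what happens to an invoice."""
    rules = [
        (lambda inv: inv["amount"] < 1000 and inv["po_match"], "auto_approve"),
        (lambda inv: inv["amount"] < 10000, "manager_review"),
        (lambda inv: True, "escalate_to_procurement"),  # default catch-all
    ]
    for condition, action in rules:
        if condition(invoice):
            return action

print(route_invoice({"amount": 450, "po_match": True}))    # auto_approve
print(route_invoice({"amount": 5000, "po_match": False}))  # manager_review
```

No learning, no adaptation: change the conditions and the “robot” breaks.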

So how do you tell the difference between Applied Indirection and real (WEAK) AI? Well, first you think about what AI means, apply a little common sense, and ask some good questions.

Let’s start with thinking about what AI really means. AI typically stands for Artificial Intelligence, and the definition of AI in its strongest form is machine intelligence, where the machine can acquire knowledge, learn, apply it, and adapt to new, previously unencountered real-world situations in a general manner, just as a human would do. If you think about it, no machine can do that, and no machine is even close. So there’s really no such thing as AI (and won’t be for decades).

At the same level of complexity is Autonomous Intelligence, which is Artificial Intelligence that is capable of acting on its own, without any human interaction. Since true Artificial Intelligence doesn’t exist, it should be obvious that Autonomous Intelligence, an AI agent that can work in complete isolation from human interaction, doesn’t exist either (outside of living beings on our planet).

At the next level down we have Augmented Intelligence, where we don’t define a platform as being intelligent, but as capable of providing knowledge and insight that we can use to complement and enhance our intelligence and make us faster and better at the tasks we are performing. At this level, there are tools that exist for well-defined tasks, but they are few and far between. While there are a lot of systems that allow us to do our jobs faster and better, they don’t augment our intelligence. For a system to truly be an augmented intelligence system, it must augment our intelligence, propose actions that we were not aware of (and would not come up with ourselves given a little bit of time), and make us smarter over time. Very few systems do that, even when limited to very specific tasks.

Going down a level, we have Assisted Intelligence, where we don’t define a platform as intelligent, but as capable of using the knowledge and insight it has to complement and enhance our daily performance of tasks by helping us do them faster, better, or both. Like augmented intelligence platforms, they should be able to prescriptively suggest actions or workflows, but we don’t require that they be capable of identifying anything we wouldn’t identify ourselves in the course of our jobs.

The big difference between augmented and assisted is best seen by example. A platform that analyzes market data and dynamics and recommends one of a pre-set of sourcing strategies is generally just assisted intelligence. In comparison, consider a platform that not only pulls in market feeds but scours the web for public pricing, articles on supply / demand (im)balance, third party audits, and reports on recent events and other data not pushed through integrated feeds; creates multiple pricing and availability projections; runs those projections through multiple models; and then recommends you extend the current agreement and buffer stock three months of supply (because an earthquake in China just closed down the mines that supply a significant amount of the rare earth metals used in your product, so supply is likely to become scarce and pricing to rise in six weeks). That would be an augmented intelligence platform, because even though you could do web searches to find updated public pricing, supply projections, third party audits, and natural disaster reports, there’s no guarantee you’re going to find the report on the local Chinese news feed (one that won’t get picked up by an English news feed for two weeks because China downplayed the effect of the earthquake) when you only read English.
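
To make the distinction concrete, here is a minimal sketch of the shape of such a pipeline. Every function, data value, and threshold is a hypothetical stand-in (the post names no product or API); the point is only the structure: many signal sources in, multiple projections, one concrete recommendation out.

```python
# Hypothetical skeleton of the augmented-intelligence pipeline described
# above; every function is an illustrative stub, not a real product.

def gather_signals(commodity):
    # Integrated market feeds plus unstructured sources the buyer would
    # likely miss (local-language news, audits, disaster reports).
    return {
        "market_feeds": [102.5, 104.1, 103.8],  # USD/kg price history
        "news_events": ["earthquake closes rare-earth mines"],
    }

def build_projections(signals):
    # Multiple pricing/availability scenarios, not a single point estimate.
    base = signals["market_feeds"][-1]
    shock = 1.4 if signals["news_events"] else 1.0
    return {"baseline": base, "supply_shock": base * shock}

def recommend(projections):
    # Turn the scenario spread into a concrete sourcing action.
    if projections["supply_shock"] / projections["baseline"] > 1.25:
        return "extend current agreement and buffer three months of stock"
    return "re-source at market"

print(recommend(build_projections(gather_signals("neodymium"))))
```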

In other words, there are some assisted intelligence tools out there (that help you do your job better and faster, but aren’t going to do anything you can’t, or come up with anything you wouldn’t if you just spent five minutes thinking about it), and a few augmented intelligence platforms for specific tasks, but there are no autonomously intelligent, artificially intelligent, or cognitive platforms on the market. If someone is trying to sell you one, they are using the marketing technique of applied indirection to sell you modern silicon snake oil.

You have been warned!

AI: Applied Indirection

You read that right. AI at most companies is not Artificial Intelligence. It’s not Autonomous Intelligence, Augmented Intelligence, Assisted Intelligence, or even Amplified Intuition. In reality, it is marketers taking Green Day’s AI a little too literally (and treating everyone like an American Idiot*) and repackaging old tech with a new label.

You see, most of what the Marketing Mad Men are trying to sell as AI are just old-school statistical algorithms in a brand-new wrapper. And the only reason these technologies are finally hitting the market and getting good results is the sheer amount of processing power and data we have at our disposal — because dumb algorithms (which is what they are) only work well when you have a lot of processing power, a lot more data, and the power plant to run that processing power 24/7 at 99% capacity across dozens, if not hundreds, of trial parameterizations until you find something that, well, just works.
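
To make “trial parameterizations” concrete, here is a minimal sketch using scikit-learn (our choice of library for illustration; the post names none): a brute-force grid search that fits the same 1990s-era model dozens of times until one configuration, well, just works.

```python
# What "dozens of trial parameterizations until something works" looks
# like in practice: brute-force grid search over a 1990s-era model.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# 4 x 4 x 2 = 32 parameterizations, each cross-validated 3 times:
# 96 model fits to find one that "just works".
grid = GridSearchCV(
    SVC(),
    {"C": [0.1, 1, 10, 100],
     "gamma": [1e-3, 1e-2, 1e-1, 1],
     "kernel": ["rbf", "poly"]},
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```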

But it’s not intelligence. It’s advanced curve fitting, regression, k-means clustering, support vector machines, and other statistical inference techniques that existed in SAS in the 1990s. Except now, the curve fitting is nth degree polynomial, advanced trigonometric, geometric, n-dimensional, stepwise, and adaptive. The regression is nonlinear, non-parametric, stepwise, and much more robust … and accurate, because you can process millions of data points if you have them. The k-means is not clustering around one or two dimensions, but one or two dozen if necessary, in a large multi-dimensional space — and the clusters can be of arbitrary n-dimensional geometric shapes using kernel machines. The support vector machines are not just based on primal, dual, and kernel classification with a bit of gradient descent, but enhanced with multi-class support vectors, advanced regression, and transduction (to work with partially labelled data). And so on.
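
As a rough illustration, assuming numpy and scikit-learn as stand-ins for the SAS of old, the entire toolbox just described fits in a dozen lines; none of it learns anything beyond the statistics it is asked to compute.

```python
# The same statistical toolbox SAS shipped in the 1990s, in a few lines
# of modern Python; nothing here is "intelligent".
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# nth-degree polynomial curve fitting
x = np.linspace(0, 10, 200)
y = 3 * x**2 - x + rng.normal(0, 5, x.size)
coeffs = np.polyfit(x, y, deg=2)             # recovers roughly [3, -1, 0]

# k-means clustering in an n-dimensional space
points = rng.normal(size=(300, 12))          # 12 dimensions, not 2
labels = KMeans(n_clusters=4, n_init=10).fit_predict(points)

# kernel support vector machine for arbitrary cluster shapes
clf = SVC(kernel="rbf").fit(points, labels)
print(coeffs.round(1), clf.score(points, labels))
```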

And don’t think there’s anything new about “deep neural networks” either. They are just the multi-level neural networks that were commonplace in the 1990s, with more levels, more nodes per level, and more advanced statistical classification functions in each node, trying to figure out how to extract patterns from unclassified data in order to classify and structure it. They happen to get better results because they can work on millions of data points instead of thousands, and do tens of millions of calculations and re-calculations instead of tens of thousands. And that’s the only reason they get better results “out of the box”. There is absolutely nothing better or more advanced about the core technology. Nothing. It’s still as dumb as a doorknob, no matter how whizz-bang the marketers make it out to be.
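
A minimal sketch of the point, using scikit-learn’s 1990s-style multi-layer perceptron (an assumed stand-in; no particular framework is implied): the entire difference between “shallow” and “deep” here is one tuple of layer sizes.

```python
# A "deep neural network" is the 1990s multi-layer perceptron with more
# layers and more nodes per layer; one tuple is the entire difference.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=30, random_state=1)

shallow = MLPClassifier(hidden_layer_sizes=(16,),
                        max_iter=2000, random_state=1)
deep = MLPClassifier(hidden_layer_sizes=(128, 128, 128, 128),
                     max_iter=2000, random_state=1)

print(shallow.fit(X, y).score(X, y))
print(deep.fit(X, y).score(X, y))
```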

And at the end of the day, the “active” part of the neural network is a fraction of the overall network (which means as much as 90% of the computation is wasted), and if that active part can be identified and abstracted, you typically end up with a small neural network no bigger than the ones being used twenty years ago, which, even if more than three or four layers deep, can probably be redesigned as a three-or-four layer network. (See the recent article on the MIT research, for example.) [But if you’ve studied advanced mathematical systems, this is not an unexpected result. Over-dumbification has always led to unnecessary processing and inferior results. Of course, over-smartification also leads to ineffective algorithms, because data, typically produced by humans, is not perfect either, and we need to account for this as well, detect small perturbations, and deal with them. But it’s always better to be thoughtful in our design than to just brute force it.]
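
A hedged sketch of the pruning intuition (illustrative magnitude pruning only; the actual MIT work is more involved and is not reproduced here): zero out the smallest-magnitude weights and observe how little of the network was doing the work.

```python
# Illustrative magnitude pruning: zero out the smallest weights and see
# how small the "active" network really is. Toy example, not the MIT method.
import numpy as np

rng = np.random.default_rng(42)
W = rng.normal(size=(256, 256))              # one layer's weight matrix

threshold = np.quantile(np.abs(W), 0.90)     # keep only the top 10% of weights
pruned = np.where(np.abs(W) >= threshold, W, 0.0)

active = np.count_nonzero(pruned) / W.size
print(f"{active:.0%} of weights remain active")  # ~10%
```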

In other words, many modern marketing madmen in enterprise software have become the new snake-oil salesmen, often selling simple statistical packages for a million dollars or raising tens of millions for yesterday’s tech in a shiny new wrapper. But it’s not intelligent, or even intuitive, by any stretch of the imagination.

That’s not to say that there isn’t technology that can qualify as assisted intelligence (and maybe even augmented intelligence in special cases), just that the majority of what’s being pushed your way isn’t.

So how do you know if you are among the majority being subjected to Applied Indirection or among the minority being offered a solution with true Assisted Intelligence capabilities? Stay tuned as we discuss this topic more in depth in the weeks to come …

* It’s much preferable to be a Canadian Idiot. We’re nicer and the “AI” marketers don’t bother us as much.

92 Years Ago Today …

The last Ford Model T rolls off the production line, ending a production run that lasted almost 19 years and produced over 16.5 million units.

The Ford Model T, colloquially known as the Tin Lizzie, is iconic, as it is generally regarded as the first affordable automobile, the car that brought the automobile to the common middle class American. This is, in part, why it was named the most influential car of the 20th century (as it is synonymous not only with the rise of the middle class but with the modernization of America). Moreover, even ninety-two years later, it is still the ninth best selling car of all time.

It was with the Model T that Ford pretty much perfected the modern American production line that revolutionized entire industries. The car should not be forgotten.

Can You Even Identify Savings to Realize?

A few weeks ago we sort of put the cart before the horse when we noted that Realizing Those Savings is No Easy Feat, because many organizations will undertake sourcing events, cut contracts, but then fail to realize 30% to 40% or more of the expected savings (and this has been the case since AMR’s classic studies on savings realization over a decade ago, well before they were bought and absorbed by Gartner).

So even though it’s sort of putting the cart before the horse to put an infrastructure in place to capture savings, without such an infrastructure, identified savings won’t be realized. So it’s really not a bad idea to start with Procurement platforms that capture savings, because you need them.

However, today we’re going to assume you have such an infrastructure in place and ask the question: even if you do, can you identify real savings? It’s a lot harder than you think. It’s not the lowest cost. Or the lowest landed cost. It’s the lowest total cost of ownership … over the product lifetime, which could be years if you offer a warranty. Because not only are there warranty costs, there are return logistics costs as well!

But it’s not easy to capture all of the relevant costs in an RFI, nor is it easy to build models that can accurately capture total lifetime cost of ownership in Excel. That’s why the doctor has been promoting optimization-backed sourcing platforms for years — only those platforms can accurately compute lifetime costs and allow for the right fact-based negotiations and award decisions.
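
A worked toy example of the point, with all figures invented for illustration: once warranty and return logistics costs are included, the supplier with the lowest unit price is not necessarily the one with the lowest lifetime total cost of ownership.

```python
# Hedged sketch of why "lowest price" is not "lowest total cost of
# ownership"; every figure below is made up for illustration.
def lifetime_tco(unit_price, freight_and_duty, units, warranty_rate,
                 warranty_cost_per_claim, return_logistics_per_claim):
    landed = (unit_price + freight_and_duty) * units
    claims = units * warranty_rate
    warranty = claims * (warranty_cost_per_claim + return_logistics_per_claim)
    return landed + warranty

# Supplier A: cheapest unit price, weak quality record
a = lifetime_tco(10.00, 1.50, 100_000, 0.06, 18.00, 7.00)
# Supplier B: pricier unit, far fewer warranty claims
b = lifetime_tco(10.80, 1.20, 100_000, 0.01, 18.00, 7.00)

print(f"A: ${a:,.0f}  B: ${b:,.0f}")  # B wins despite the higher unit price
```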

But it’s not just cost that needs to be considered, it’s value and service levels. You need customers to want your products, and you need delivery times you can depend on. But value and service guarantees cost money, and in inflationary markets, that means costs just go up and up.

If market prices are increasing, and you need to improve service levels and add more value-based features to appease customers, can you even identify savings?

The answer, without the right platforms that allow you to look at your costs holistically and find ways to minimize them beyond just a price-based bid, is that you can’t … at least not after the first time you’ve “strategically sourced” a product. Additional savings will come from better category definition and alignment, smarter network design, better inventory management and aligned inventory levels, and up-sell opportunities from more appropriate, sustainable sourcing selections.

And that will require the right upstream technology that will include the following:

  • supplier discovery to identify the right suppliers
  • optimization-backed sourcing to make the right value-based decisions
  • supplier management to make sure the relationship and performance can be managed
  • risk management to identify, monitor, and mitigate potential disruption risks
  • analytics to analyze past, current, and ongoing price and KPI performance
  • CLM to manage the contract and its obligations, and to identify the time for renewal, renegotiation, or termination

And that’s why you see a proliferation of Strategic Procurement Technology Suites, and why the doctor has teamed up with Spend Matters to analyze them. Platforms are becoming key to identifying real, sustainable savings — but only if they are the right ones for the customer base they are installed in.

Don’t Underestimate The Importance of Workflow …

One of the big reasons that most vendors haven’t done anything, even four years after the doctor told you about the Procurement Damnation of Project Management (as per my post last week, read here), is that they have little or no workflow management, a capability that we told you is vital to the modern platform in our post three weeks ago, where we were Digging into the S2P Tech Foundations.

Workflow is more than just a platform’s ability to guide a buyer through the application to complete a specific task. Workflow is the ability of the platform to be adapted, and to adapt, to the processes an organization needs to support, and the ability of the platform to support management of those processes and the projects that create them.

This is something a large majority of software application developers don’t get, and, as a result, something a large majority of applications don’t have. And it’s something that needs to be baked in at the foundations of an application, or the application will never have good workflow capability.

Why is there so little? Because classic application design philosophy, inspired by the waterfall model of software development, has been:

  • identify a problem
  • define the problem
  • translate into requirements
  • detail into functional specifications
  • build a software solution that implements the functional specifications
  • iteratively test and debug until stable enough for release

And the agile philosophy didn’t change much. The only difference is that instead of attacking the full extent of the problem and translating the full problem into requirements, you focus on a core piece of the problem, translate it into requirements, flesh them out, and build; then you go back and extend the core, extract the new requirements, flesh those out, build the new pieces and integrate them into the existing solution, and so on.

No thought was given as to how to create a set of self-contained units that could be strung together in a workflow to solve bigger problems, which is key to providing a platform that allows the workflow to be extended and altered as the organization changes over time.
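
For contrast, here is a minimal sketch (in Python, with purely illustrative step names) of the alternative being argued for: self-contained units with a uniform interface that can be strung together, reordered, and extended as data rather than hard-coded control flow.

```python
# Sketch of "self-contained units strung together in a workflow": each
# step is an independent unit with a uniform interface, so steps can be
# reordered, inserted, or swapped without rebuilding the application.
from typing import Callable

Step = Callable[[dict], dict]

def validate(ctx: dict) -> dict:
    ctx["validated"] = bool(ctx.get("supplier"))
    return ctx

def approve(ctx: dict) -> dict:
    ctx["approved"] = ctx["validated"] and ctx.get("amount", 0) < 50_000
    return ctx

def notify(ctx: dict) -> dict:
    ctx["notified"] = True
    return ctx

def run_workflow(steps: list, ctx: dict) -> dict:
    for step in steps:             # the workflow is data, not hard-coded
        ctx = step(ctx)
    return ctx

# One organization's process; another could insert a "legal_review" step
# here without touching any existing unit.
print(run_workflow([validate, approve, notify],
                   {"supplier": "Acme", "amount": 12_000}))
```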

And if you’ve invested five to ten years in a platform that has been profitable, do you really want to go back to square one and build it up from foundations the proper way? Especially if you think you can still make money on what you have? Probably not.

And looking at the bigger picture, that’s the state of Procurement 2.0, and why we need new, evolutionary platforms if we are ever going to realize the full extent of Procurement 3.0. But that’s another post.