Apparently one of the best presentations at Coupa Inspire earlier this month was one by IBM on Procurement Transformation and Big Data. (Needless to say, this does not inspire the doctor.) Ouch! Big Data is good, but only if it’s a big bucket of relevant data, and that’s not usually the case. Usually it’s a big bucket of random data where only some of the data is relevant and the statistical significance is low. This is bad because if an organization takes Big Data as gospel, it can be led down the wrong track. (And we’ll get back to this.)
And given the examples that the prophet is presenting in his post on “When Watson Meets Procurement” on Spend Matters, the doctor is a bit worried. Why? Let’s take them one by one.
Parsing unstructured data to extract “soft facts” and information from news feeds and social media to line up against traditional risk management data feeds to drive a new level of supply risk management intelligence.
Okay, this is smart, because it can identify potential problems, but it’s not necessarily all that useful. For example, let’s say it detects a few dozen instances of consumer unrest due to product defects. If the product is one with a warranty, chances are your customer service department already has a few dozen warranty claims on file. No new information. Let’s say it detects a few hundred instances of consumer distress because one of your suppliers was using slave labour – but that came from a news story that was already picked up by your supply chain visibility and risk monitoring system. Nothing unexpected here. All you can really pick up on is general consumer sentiment, and only the sentiment of the consumer base that is online, which skews toward the consumers who are unhappy with the product, since people who are unhappy are more likely to complain than people who are happy are to go online and leave good reviews.
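To make the point concrete, here is a minimal sketch, in Python, of this kind of soft-fact monitoring, assuming a hypothetical feed of social posts and an in-house list of known warranty claims; once the signals already captured by customer service and the risk monitoring system are subtracted, there is usually nothing left.

```python
# Minimal sketch: naive "soft fact" extraction from a social feed, assuming
# hypothetical post records and a hypothetical in-house warranty-claims list.
# The point: most of what it surfaces is already known to customer service.

from collections import Counter

NEGATIVE_TERMS = {"defect", "broken", "recall", "refund", "slave labour"}

def flag_posts(posts):
    """Return posts that mention at least one negative term."""
    flagged = []
    for post in posts:  # post: {"product": str, "text": str}
        text = post["text"].lower()
        if any(term in text for term in NEGATIVE_TERMS):
            flagged.append(post)
    return flagged

def net_new_signals(flagged_posts, known_issues):
    """Drop signals already captured by warranty claims or risk feeds."""
    known = {(i["product"], i["issue"]) for i in known_issues}
    counts = Counter()
    for post in flagged_posts:
        for term in NEGATIVE_TERMS:
            if term in post["text"].lower() and (post["product"], term) not in known:
                counts[(post["product"], term)] += 1
    return counts  # usually near-empty once known issues are subtracted

posts = [{"product": "widget", "text": "My widget is broken, total defect!"}]
claims = [{"product": "widget", "issue": "defect"},
          {"product": "widget", "issue": "broken"}]
print(net_new_signals(flag_posts(posts), claims))  # Counter() -> nothing new
```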
Ask Watson about Procurement that can leverage natural language processing to extract data buried in contracts, documents, and other organizational systems such as AP.
Okay, this is kind of smart too, but it’s not so much big data processing as natural language processing and query formation. It’s no different than implementing a meta interface that parses a query and translates it into a format appropriate for each system that may contain related Procurement data. Yes, the number of systems that could contain related information magnifies the data magnitude problem, but since you can search each system separately and then integrate only the relevant results, this is really not that much of a big data problem.
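To illustrate, here is a minimal sketch of such a meta interface, assuming hypothetical adapters for a contract repository and an AP system; each adapter translates the parsed query into its own system’s search format, and only the relevant hits are merged, so no single system ever has to chew through "big data".

```python
# Minimal sketch of a meta interface that parses a natural-language question
# and fans it out to hypothetical per-system adapters.

import re

def parse_query(question):
    """Very naive NL parsing: pull out a supplier name and a document type."""
    supplier = None
    match = re.search(r"with ([A-Z][\w ]+?)(?:\?|$)", question)
    if match:
        supplier = match.group(1).strip()
    doc_type = "contract" if "contract" in question.lower() else "invoice"
    return {"supplier": supplier, "type": doc_type}

# Hypothetical in-house data stores standing in for real systems.
CONTRACTS = [{"counterparty": "Acme Corp", "expiry": "2016-12-31"}]
INVOICES = [{"vendor": "Acme Corp", "amount": 12500}]

def search_contracts(q):
    """Adapter: translate the query into the contract repository's filter."""
    if q["type"] != "contract":
        return []
    return [r for r in CONTRACTS if q["supplier"] in r["counterparty"]]

def search_ap(q):
    """Adapter: translate the query into the AP system's lookup."""
    if q["type"] != "invoice":
        return []
    return [r for r in INVOICES if q["supplier"] in r["vendor"]]

def ask(question):
    q = parse_query(question)
    if not q["supplier"]:
        return []
    hits = []
    for adapter in (search_contracts, search_ap):  # fan out, merge relevant hits
        hits.extend(adapter(q))
    return hits

print(ask("What contracts do we have with Acme Corp?"))
# -> [{'counterparty': 'Acme Corp', 'expiry': '2016-12-31'}]
```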
Build My Briefing, Watson, which aggregates information about a Procurement entity (category, supplier, etc.) into an auto-generated deliverable for anyone who needs it (for sourcing, supplier review, etc.)
Okay, this is also smart, but it’s not big data. This is just aggregating data from multiple systems and shoving it into a pre-built template. It’s just a reporting engine on steroids.
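For example, here is a minimal sketch of the kind of template-filling involved, assuming hypothetical spend, risk, and contract records pulled from in-house systems:

```python
# Minimal sketch of the "Build My Briefing" point, using hypothetical in-house
# data sources: it is aggregation plus a pre-built template, not big data.

SPEND = {"Acme Corp": {"2015_spend": 1_200_000, "categories": ["packaging"]}}
RISK = {"Acme Corp": {"score": "B", "open_incidents": 1}}
CONTRACTS = {"Acme Corp": {"expiry": "2016-12-31"}}

BRIEFING_TEMPLATE = """Supplier Briefing: {name}
  Spend (2015):    ${spend:,}
  Categories:      {categories}
  Risk score:      {risk} ({incidents} open incident(s))
  Contract expiry: {expiry}
"""

def build_briefing(name):
    """Pull one record from each system and pour it into the template."""
    return BRIEFING_TEMPLATE.format(
        name=name,
        spend=SPEND[name]["2015_spend"],
        categories=", ".join(SPEND[name]["categories"]),
        risk=RISK[name]["score"],
        incidents=RISK[name]["open_incidents"],
        expiry=CONTRACTS[name]["expiry"],
    )

print(build_briefing("Acme Corp"))
```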
the doctor would like to see Big Data used in Procurement to solve a problem that could not have been solved otherwise, but he hasn’t seen it yet. The reality is that, as he has been saying for years, Big Brains Will Win in the End.