Reporting is Not Analysis — And Neither Are Spreadsheets, Databases, OLAP Solutions, or “Business Intelligence” Solutions

… and one of the best explanations the doctor has ever read on this topic (which he has been writing about for over two decades) was just published over on the Spendata blog on Closing the Analysis Gap. Written by the original old grey beard himself (who arguably built the first standalone spend analysis application back in 2000, and then redefined spend analysis not once but twice, in two subsequent start-ups that built two entirely new analytics applications taking a completely different, more in-depth approach), it’s one of the first articles to explain why every general purpose solution you’re currently using to try to do analysis doesn’t actually do true analysis, and why you need a purpose-built analysis solution if you really want to find results and, in our world, do some Spend Rappin’.

We’re not going to repeat the linked article in its entirety, so we’ll pause for you to go read it …

 

… we said, go to the linked article and read it … we’ll wait …

 

READ IT! Then come back. Here’s the linked article again …

 

Thank you for reading it. Now we’ll continue.

As summarized by the article, we have the following issues:

Tool | Issue | Resolution | Loss of Function
--- | --- | --- | ---
Spreadsheet | Data limit; lack of controls/auditability | Database | No dependency maintenance; no hope of building responsive models
Database | Performance on transactional data (even with expert optimization) | OLAP Database | Data changes are offline only & tedious; what-if analysis is non-viable
OLAP Database | Interfaces, like SQL, are inadequate | BI Application | Schema freezes to support existing dashboards; database read only
BI Application | Read-only data and limited interface functionality | Spreadsheets | Loss of friendly user interfaces and data controls/auditability

In other words, the cycle of development from stone-age spreadsheets to modern BI tools, which was supposed to take us from simple calculation capability to true space-age mathematical analysis using the full breadth of mathematical techniques at our disposal (both built-in and through linkages to external libraries), has instead taken us back to the beginning to start the cycle anew, devouring itself like an Ouroboros.


[Image: an Ouroboros devouring its own tail. Source: Wikipedia]

Why did this happen? The usual reasons. Partly because some of the developers couldn’t see a resolution to the issues when they were first developing these solutions, or at least a resolution that could be implemented in a reasonable timeframe; partly (and sometimes mostly) because vendors were trying to rush a solution to market (to take your money); and partly (and sometimes largely) because the marketers keep hammering the message that what they have is the only solution you need, until all the analysts, authors, and columnists repeat the same message to the point they believe it. (Even though the users keep pounding their heads against the keyboard when given a complex analysis assignment they just can’t do … without handing it off to the development team to write custom code, or cutting corners, or making assumptions, or whatever.) [This could be an entire rant on its own about how the rush to MVP and marketing mania sometimes causes more ruin than salvation, but considering volumes still have to be written on the dangers of dunce AI, we’ll have to let this one go.]

The good news is that we now have a solution you can use to do real analysis, and this is much more important than you think. The reality is that if you can’t get to the root cause of why a number is what it is, it’s not analysis. It’s just a report. And we don’t care if you can drill down to the raw transactions that the analysis was derived from; that’s not the root cause, that’s just supporting data.

For example, “profit went down because warranty costs increased 5%” is not helpful. Why did warranty costs go up? Just being able to trace down to the transactions, where you see that 60% of that increase is associated with products produced by Substitional Supplier, is not enough (and in most modern analysis/BI tools, that’s all you can do). Why? Because that’s not analysis.
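(To make the steps that follow concrete, we’ll sketch each one in Python with pandas. These are rough, generic illustrations with invented figures, suppliers, and column names, not anyone’s actual implementation.) Here’s the attribution step that takes you from “warranty costs went up” to “one supplier accounts for 60% of the increase”:

```python
import pandas as pd

# Invented warranty cost figures; supplier and column names are
# placeholders, not pulled from any real system.
costs = pd.DataFrame({
    "year":     [2022, 2022, 2022, 2023, 2023, 2023],
    "supplier": ["Substitional Supplier", "Acme", "Best Parts"] * 2,
    "warranty_cost": [100_000, 80_000, 60_000, 130_000, 92_000, 68_000],
})

by_supplier = costs.pivot_table(index="supplier", columns="year",
                                values="warranty_cost", aggfunc="sum")
by_supplier["increase"] = by_supplier[2023] - by_supplier[2022]
by_supplier["share_of_increase"] = (
    by_supplier["increase"] / by_supplier["increase"].sum()
)
print(by_supplier.sort_values("share_of_increase", ascending=False))
```

With these made-up numbers, Substitional Supplier accounts for 60% of the increase … and, as the rest of this post argues, that’s where the work begins, not where it ends.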

Warranty costs increasing 5% is the inevitable result of something that happened. But what happened? If all you have is payables data, you need to dive into the warranty claim records to see. That means you need to pull in the claim records, pull out the products and original customer order numbers, and look for commonalities or trends in that data. Maybe after pulling all this data in you see that, of the 20 products you are offering (where each would account for 5% of the claims if all things were equal), there are 2 products that account for 50% of the claims. Now you have a root cause of the warranty spend increase, but not yet a root cause of what happened, or how to do anything about it.
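A rough sketch of that claims-concentration step, with simulated claims deliberately skewed so two of twenty hypothetical products dominate:

```python
import numpy as np
import pandas as pd

# Simulated claim records: 20 hypothetical products, with the draw
# skewed so two of them dominate, as in the scenario above.
rng = np.random.default_rng(42)
products = [f"P-{i:02d}" for i in range(1, 21)]
weights = np.full(20, 0.5 / 18)      # the other 18 products split 50%
weights[[6, 12]] = 0.25              # P-07 and P-13 get ~25% of claims each
claims = pd.DataFrame({
    "claim_id": range(400),
    "product":  rng.choice(products, size=400, p=weights),
})

share = claims["product"].value_counts(normalize=True)
expected = 1 / len(products)         # 5% each if all things were equal
print(share[share > 2 * expected])   # products claiming an outsized share
```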

To figure that out, you need to pull in the customer order records and the original purchase order records and link the product sent to the customer with a particular purchase order. When you do this, and find out that 80% of those claims relate to products purchased on the last six monthly purchase orders, you know the products that are the problem. You also know that something happened six months or so ago that caused those products to be more defective.
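Sketched with invented stand-ins for the claim, customer order, and purchase order tables (real extracts would have far more columns), the linking step might look like this:

```python
import pandas as pd

# Invented stand-ins for the three tables described above.
claims = pd.DataFrame({
    "claim_id": [1, 2, 3, 4, 5],
    "order_no": ["SO-101", "SO-102", "SO-103", "SO-104", "SO-105"],
})
orders = pd.DataFrame({      # customer order -> fulfilling purchase order
    "order_no": ["SO-101", "SO-102", "SO-103", "SO-104", "SO-105"],
    "po_no":    ["PO-2301", "PO-2302", "PO-2303", "PO-2303", "PO-2206"],
})
pos = pd.DataFrame({
    "po_no":   ["PO-2206", "PO-2301", "PO-2302", "PO-2303"],
    "po_date": pd.to_datetime(["2022-06-01", "2023-01-01",
                               "2023-02-01", "2023-03-01"]),
})

linked = claims.merge(orders, on="order_no").merge(pos, on="po_no")
cutoff = linked["po_date"].max() - pd.DateOffset(months=6)
recent = (linked["po_date"] > cutoff).mean()
print(f"{recent:.0%} of claims trace to POs from the last six months")
```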

Let’s say both of these products are web-enabled remote switch control boxes that your manufacturing clients use to remotely turn on and off various parts of their power and control systems (for lighting, security monitoring, etc.), and you also have access, in the PLM system, to the design, bill of materials (BOM), and tier 2 suppliers, and know a change takes 30 to 60 days to take effect. So you query the tier 1 BOM from 6, 7, 8, and 9 months ago and discover that 8 months ago the tier 2 supplier for the logic board changed (and nothing else changed) for both of these units. Now you are close to the root cause and know it is associated with the switch in component and/or supplier.
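Assuming you can pull monthly BOM snapshots out of the PLM system into tables (all names here invented), the comparison is a straightforward diff:

```python
import pandas as pd

# Invented monthly BOM snapshots for one unit; in practice you'd pull
# these from the PLM system's reporting extract or API.
bom = pd.DataFrame({
    "snapshot_month": ["2022-09"] * 3 + ["2022-10"] * 3,
    "component":      ["logic board", "enclosure", "power supply"] * 2,
    "tier2_supplier": ["OldBoard Co", "CaseCorp", "PowerParts",
                       "NewBoard Co", "CaseCorp", "PowerParts"],
})

before = bom[bom["snapshot_month"] == "2022-09"].set_index("component")
after  = bom[bom["snapshot_month"] == "2022-10"].set_index("component")

# Series.compare returns only the rows that differ between snapshots:
# here, just the logic board's tier 2 supplier.
print(before["tier2_supplier"].compare(after["tier2_supplier"]))
```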

At this point you’re not sure if the logic board is defective, the tier 1 supplier is not integrating it properly, or the specs aren’t up to snuff, but since you have figured out this was the only change, you know where to look. Now you can dive in deep to figure out the exact issue, and work with the engineering team to see if it can be addressed.

You continue with your analysis of all available data across the systems and, after diving in, you see that, despite the contract requiring that any changes be signed off by the local engineering team only after they do their own independent analysis to verify the product meets the specs and all quality requirements, the engineering team, who signed off on the specs, did not sign off on the quality tests, which were never submitted. You can then place a hold on all future orders for the product, get on the phone with the tier 1 supplier and insist they expedite 10 units of the logic board by air freight for quality testing, and get on the phone with engineering to make sure they independently test the logic boards as soon as they arrive.
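That sign-off gap is exactly the kind of check you can automate once the approval records are pulled in; an illustrative sketch against an invented change-approval log:

```python
import pandas as pd

# Invented change-approval log; the real check would run against whatever
# system records engineering sign-offs on supplier/component changes.
approvals = pd.DataFrame({
    "change_id":    ["CHG-017", "CHG-017", "CHG-042", "CHG-042"],
    "signoff_type": ["spec", "quality_test"] * 2,
    "signed_off":   [True, True, True, False],
})

signoffs = approvals.pivot(index="change_id",
                           columns="signoff_type",
                           values="signed_off").astype(bool)

# Flag changes where the specs were approved but the quality tests weren't.
print(signoffs[signoffs["spec"] & ~signoffs["quality_test"]])
```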

Then, when the product, which is designed for 12V power inputs, arrives and the engineers do their stress tests, they discover that the logic board, which was spec’ed to handle voltage spikes up to 15V (because some clients power backup systems off of battery backups that run off of chained automotive batteries), actually burns out at 14V. You have your root cause. You can then force the tier 1 supplier to go back to the original board from the original supplier, or find a new board from the current supplier that meets the spec … and solve the problem. [And while it’s true you can’t assume that all of the failure increases were due to the logic board without examining each and every unit of each and every claim, in this situation, statistically, most of the increase in failures will be due to this change, as it was the only one.]
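The final confirmation is then just a comparison of the test results against the spec; a trivial sketch with made-up readings:

```python
import pandas as pd

# Made-up stress test readings; the 15V tolerance is the spec from the
# scenario above, the failure voltages are invented.
SPEC_SPIKE_TOLERANCE_V = 15.0
tests = pd.DataFrame({
    "unit":            list(range(1, 11)),
    "failure_voltage": [14.0, 13.9, 14.1, 13.8, 14.0,
                        14.2, 13.7, 14.0, 13.9, 14.1],
})

failing = tests[tests["failure_voltage"] < SPEC_SPIKE_TOLERANCE_V]
print(f"{len(failing)} of {len(tests)} units burn out below the "
      f"{SPEC_SPIKE_TOLERANCE_V:.0f}V spec")
```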

In other words, true analysis means being able to drill into raw data, bring in any and all associated data, do analysis and summaries of that data, drill in, bring in related data, and repeat until you find something you can tie to a real world event that led to something that had a material impact on the metrics that are relevant to your business. Anything less is NOT analysis.