Today’s post is by Eric Strovink of BIQ.
I’ve suggested previously in this series that analysis doesn’t have to be done by an applied mathematician; the key is to gain insight from the data. Sometimes that insight does require rigorous statistical analysis or modeling, to be sure. Much more often, though, one simply needs to examine the laundry, and the dirty socks stand out without any mathematical legerdemain.
Examining the laundry requires data manipulation. This usually takes the form of data warehousing, i.e., classic database management technology, extended in the case of transactional data to OLAP (“Online Analytical Processing”), SQL and/or MDX, and reporting languages and tools. The problem is that business data analysts typically have insufficient IT skills to wield these tools effectively; and when they do have the skills, they seldom have the time. Thus, ad hoc analysis of data remains largely aspirational.
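To make that concrete, here is a minimal sketch of what ad hoc examination can look like when the tooling gets out of the way. It uses pandas on a few invented transaction rows (the column names and figures are illustrative, not from any real extract); a plain rollup and a sort are enough to expose the anomaly.

```python
import pandas as pd

# Toy transaction data standing in for an A/P extract; columns are assumptions.
txns = pd.DataFrame({
    "supplier":   ["Acme", "Acme", "Globex", "Globex", "Initech", "Initech"],
    "category":   ["Office"] * 6,
    "unit_price": [2.10, 2.15, 2.05, 2.12, 4.80, 4.95],
    "qty":        [500,  450,  700,  650,  400,  420],
})
txns["spend"] = txns["unit_price"] * txns["qty"]

# A plain rollup: total spend and average unit price per supplier.
rollup = txns.groupby("supplier").agg(
    total_spend=("spend", "sum"),
    avg_price=("unit_price", "mean"),
)
print(rollup.sort_values("avg_price", ascending=False))
# Initech pays more than double the going rate for the same category --
# the "dirty sock" is visible from a simple sort, no statistics required.
```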
Custom data warehouses have value for organizations. ERP systems are a good example. But the data warehouse is a dangerous partner. It is not the source of all wisdom. It cannot possibly contain all the useful data in the enterprise. Warehouse vendors have trouble admitting this. For example, for years ERP sales types claimed that all spending was already tracked and controlled by the ERP system, so there was no need for a specialized third-party “spend analysis” system. These days all the major ERP vendors offer bolt-on spend analysis.
Spend analysis has the same issue. It introduces yet another static data warehouse (an OLAP warehouse this time), along with data mapping tools that are typically not provided to the end user. As above, the data warehouse is a dangerous partner. It is not the source of all wisdom. It cannot possibly contain all the useful spend data in the enterprise. Spend analysis is not just A/P analysis; it can’t be done with just one dataset; and it’s not a set of static reports.
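As an illustration of why one dataset isn’t enough, here is a sketch that merges two invented feeds, an A/P extract and a p-card extract, under a hand-built supplier-family mapping. Everything here (column names, suppliers, amounts) is hypothetical; real mapping tools use rules and fuzzy matching, but the principle is the same: a supplier’s true footprint only appears once the feeds are combined.

```python
import pandas as pd

# Two hypothetical spend feeds with different column conventions.
ap = pd.DataFrame({
    "vendor_name": ["ACME CORP", "Globex Inc"],
    "amount":      [125000.0, 88000.0],
})
pcard = pd.DataFrame({
    "merchant": ["Acme Corporation", "Initech LLC"],
    "charge":   [14250.0, 9900.0],
})

# A crude mapping step -- the part vendors often keep away from end users.
# Here it's a hand-built lookup; real tools use rules and fuzzy matching.
family = {"ACME CORP": "Acme", "Acme Corporation": "Acme",
          "Globex Inc": "Globex", "Initech LLC": "Initech"}

ap_n    = pd.DataFrame({"supplier": ap["vendor_name"].map(family),
                        "spend": ap["amount"]})
pcard_n = pd.DataFrame({"supplier": pcard["merchant"].map(family),
                        "spend": pcard["charge"]})

combined = pd.concat([ap_n, pcard_n]).groupby("supplier")["spend"].sum()
print(combined)  # Acme's true footprint appears only after both feeds merge
```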
Once an opportunity is identified, more analysis is required to decide how to award business optimally. The Holy Grail of sourcing optimization has been a tool approachable by business users, but that goal has proved elusive. The good news is that “guided optimization” is now available from multiple vendors at reasonable price points. Although optimists (mostly experts at optimization) have argued for several years now that optimization is easy enough for end users to use without guidance, I take the practical view that it doesn’t really matter whether that’s true. As long as optimization is available at a reasonable price, with or without a services component, the savings it delivers are worthwhile.
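For a flavor of what such a tool does under the hood, here is a toy award-allocation model using scipy’s linprog. The prices, capacities, and demand are invented; a real sourcing optimization model would layer business rules (minimum awards, supplier count limits, switching costs) on top of the same skeleton.

```python
from scipy.optimize import linprog

# A toy award-allocation LP, the kind of model a "guided optimization"
# tool builds for you. Prices, capacities, and demand are invented.
prices     = [9.50, 10.10, 11.00]   # per-unit bid from suppliers A, B, C
capacities = [600,  500,   1000]    # max units each supplier can deliver
demand     = 1200                   # units the buyer must award

# Minimize total cost; one equality constraint forces full coverage.
res = linprog(
    c=prices,
    A_eq=[[1, 1, 1]], b_eq=[demand],
    bounds=[(0, cap) for cap in capacities],
    method="highs",
)
print(dict(zip("ABC", res.x)), "total cost:", res.fun)
# Fills the two cheapest suppliers to capacity, then spills to C.
```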
By no means is this series an exhaustive review of data analysis. For example, interesting technical advances such as the Predictive Model Markup Language (PMML) are enabling predictive analytics to be bundled into everyday business processes. Scenario analysis is also a powerful tool for painting a picture of potential futures based on changes in behavior. But the vendors of these technologies must either make them accessible to end users or offer affordable services around them; otherwise they will remain exotic and inaccessible.
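Scenario analysis, at its simplest, is structured what-if arithmetic. The sketch below projects a category’s spend under three invented scenarios; the baseline figure and the growth, price, and compliance assumptions are all hypothetical.

```python
# A bare-bones scenario analysis: project next year's category spend under
# different assumptions about behavior. All figures are invented.
baseline_spend = 2_000_000.0

scenarios = {
    "status quo":           {"volume_growth": 0.03, "price_change":  0.02, "compliance_gain": 0.00},
    "renegotiate contract": {"volume_growth": 0.03, "price_change": -0.04, "compliance_gain": 0.00},
    "drive on-contract buying":
                            {"volume_growth": 0.03, "price_change":  0.02, "compliance_gain": 0.05},
}

for name, s in scenarios.items():
    projected = (baseline_spend
                 * (1 + s["volume_growth"])
                 * (1 + s["price_change"])
                 * (1 - s["compliance_gain"]))
    print(f"{name:26s} -> ${projected:,.0f}")
```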
The bottom line is that analysis tools must be accessible to end users. It must be easy and fast to build datasets and gain insight from them. Optimization software should automatically perform sensitivity analysis for you, as the doctor has advocated. Ad hoc analysis should be the rule, not the exception. Analysis should not require vendor or IT support; if it does, it likely won’t happen.
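As a crude stand-in for built-in sensitivity analysis, the sketch below re-solves the toy award model from earlier with each supplier’s capacity nudged upward and reports the change in total cost. A real tool would expose shadow prices directly; brute-force re-solving makes the same point without extra machinery.

```python
from scipy.optimize import linprog

# Brute-force sensitivity on the toy award model: re-solve the LP with
# each supplier's capacity nudged and watch the total cost move.
prices     = [9.50, 10.10, 11.00]
capacities = [600,  500,   1000]
demand     = 1200

def award_cost(caps):
    res = linprog(c=prices, A_eq=[[1, 1, 1]], b_eq=[demand],
                  bounds=[(0, c) for c in caps], method="highs")
    return res.fun

base = award_cost(capacities)
for i, name in enumerate("ABC"):
    bumped = list(capacities)
    bumped[i] += 100  # what if this supplier could deliver 100 more units?
    print(f"+100 units at {name}: saves ${base - award_cost(bumped):,.2f}")
# Extra capacity at the cheap suppliers displaces expensive units at C;
# extra capacity at C saves nothing, because C is not the binding constraint.
```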
The more you look, the more savings you will find; and when you walk into the CFO’s office waving a check, you will get attention as well as the resources to find even more.