Webinars This Week from the #1 Supply Chain Resource Site

The Sourcing Innovation Resource Site, always accessible from the link under the “Free Resources” section of the sidebar, continues to add new content on a weekly, and often daily, basis, and it will keep doing so.

The following is a short selection of webinars THIS WEEK that might interest you:

2010-Nov-16, 00:00 GMT/WET
Supply Chain Continuity: A Risk Management Imperative in a Global Economy
Sponsor: Avalution Consulting

2010-Nov-17, 12:00 GMT/WET
A New Decade for Smarter Supply Chain Management
Sponsor: Supply Chain Digest

2010-Nov-17, 8:00 GMT-08:00/AKDT/PST
Pfizer Finds the Formula to Cut both MRO Inventory and Downtime Risk
Sponsor: IHS

2010-Nov-17, 14:00 GMT-05:00/CDT/EST
7 Ways to Break the Cost Barrier of Trade Promotion Management
Sponsor: MEI

2010-Nov-18, 11:30 GMT-08:00/AKDT/PST
Yardi Procure to Pay: Featuring Yardi PAYscan and Site Stuff
Sponsor: Yardi

2010-Nov-18, 13:00 GMT-05:00/CDT/EST
Sourcing Marketing: Key Success Factors
Sponsor: Global eProcure

2010-Nov-18, 14:00 GMT-05:00/CDT/EST
Achieving Effective Inventory Management
Sponsor: Second Foundation Consulting

2010-Nov-18, 13:30 GMT-05:00/CDT/EST
Provider Score Card: 5 Common Sense Tests to Foster Competition
Sponsor: Health Decisions Inc.

They are all readily searchable from the comprehensive Site-Search page.

Analytics VI: Conclusion

Today’s post is by Eric Strovink of BIQ.

I’ve suggested previously in this series that analysis doesn’t have to be done by an applied mathematician; the key is to get insights about data. Sometimes those insights do require rigorous statistical analysis or modeling, to be sure. Much more often, though, one simply needs to examine the laundry, and the dirty socks stand out without any mathematical legerdemain.

Examining the laundry requires data manipulation. This usually takes the form of data warehousing, i.e., classic database management technology, extended in the case of transactional data to OLAP (“Online Analytical Processing”), SQL and/or MDX, and reporting languages and tools. Problem is, business data analysts typically have insufficient IT skills to wield these tools effectively; and when they do have the skill, they seldom have the time. Thus, ad hoc analysis of data remains largely aspirational.
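
To make the point concrete, here is a minimal sketch, in Python with pandas, of the kind of ad hoc, OLAP-style slicing an analyst might want to run directly against a flat export of transactional data. The column names, vendors, and figures are hypothetical placeholders, not anything from a real system.

```python
# A minimal sketch of ad hoc, OLAP-style slicing over a flat transaction export.
# Column names (vendor, category, month, amount) are hypothetical.
import pandas as pd

transactions = pd.DataFrame([
    {"vendor": "Acme Corp", "category": "MRO",       "month": "2010-09", "amount": 12500.00},
    {"vendor": "Acme Corp", "category": "MRO",       "month": "2010-10", "amount":  9800.00},
    {"vendor": "Globex",    "category": "Packaging", "month": "2010-09", "amount": 22100.00},
    {"vendor": "Globex",    "category": "MRO",       "month": "2010-10", "amount":  4300.00},
])

# OLAP-style "cube": spend by category and month, with totals
cube = pd.pivot_table(
    transactions,
    values="amount",
    index="category",
    columns="month",
    aggfunc="sum",
    fill_value=0,
    margins=True,   # adds an "All" row and column of totals
)
print(cube)

# Drill down: which vendors drive MRO spend?
print(transactions[transactions["category"] == "MRO"]
      .groupby("vendor")["amount"].sum()
      .sort_values(ascending=False))
```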

Custom data warehouses have value for organizations. ERP systems are a good example. But the data warehouse is a dangerous partner. It is not the source of all wisdom. It cannot possibly contain all the useful data in the enterprise. Warehouse vendors have trouble admitting this. For example, for years ERP sales types claimed that all spending was already tracked and controlled by the ERP system, so there was no need for a specialized third-party “spend analysis” system. These days all the major ERP vendors offer bolt-on spend analysis.

Spend analysis has the same issue. It introduces another static data warehouse, an OLAP data warehouse, along with data mapping tools that are typically not provided to the end user. As above, the data warehouse is a dangerous partner. It is not the source of all wisdom. It cannot possibly contain all the useful spend data in the enterprise. Spend analysis is not just A/P analysis; it can’t be done with just one dataset; and it’s not a set of static reports.
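
As an illustration of the data mapping that such tools perform internally but rarely expose, here is a hedged sketch of a rule-based vendor-to-category mapping. The keywords, taxonomy, and vendor names are invented; a real mapping involves thousands of rules, enrichment data, and exception handling.

```python
# A toy vendor-to-category mapping; rules and vendor names are hypothetical.
mapping_rules = {
    "acme":    "MRO > Industrial Supplies",
    "globex":  "Packaging > Corrugated",
    "initech": "IT > Software",
}

def categorize(vendor_name: str) -> str:
    """Return the first category whose keyword appears in the vendor name."""
    name = vendor_name.lower()
    for keyword, category in mapping_rules.items():
        if keyword in name:
            return category
    return "Unmapped"   # unmapped spend is usually where new analysis starts

ap_records = ["ACME CORP #1042", "Globex Packaging LLC", "Umbrella Holdings"]
for record in ap_records:
    print(f"{record:25s} -> {categorize(record)}")
```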

Once an opportunity is identified, more analysis is required to decide how to award business optimally. The Holy Grail of sourcing optimization has been a tool that is approachable for business users; but this goal has proved to be elusive. The good news is that “guided optimization” is now available from multiple vendors at reasonable price points. Although optimists (mostly experts at optimization) have argued for several years now that optimization is easy enough for end users without guidance, I take the practical view that it doesn’t really matter whether that’s true or not. As long as optimization is available at a reasonable price, whether it has a services component or not, the savings it delivers are worthwhile.
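
For a sense of what an award decision involves, here is a toy sketch rather than a real optimization engine: it simply enumerates split percentages between two hypothetical suppliers whose unit prices include volume breaks, and keeps the cheapest split that satisfies a dual-source policy. The demand, prices, and policy limits are made up for illustration.

```python
# A toy award-split search, not a real sourcing optimization engine.
# Demand, prices, volume breaks, and the dual-source policy are hypothetical.
DEMAND = 100_000  # units

def unit_price_a(units):
    return 2.10 if units >= 60_000 else 2.25   # volume break at 60k units

def unit_price_b(units):
    return 2.05 if units >= 80_000 else 2.30

best = None
for pct_a in range(20, 81, 5):                 # dual-source: 20%-80% to each
    units_a = DEMAND * pct_a // 100
    units_b = DEMAND - units_a
    cost = units_a * unit_price_a(units_a) + units_b * unit_price_b(units_b)
    if best is None or cost < best[0]:
        best = (cost, pct_a)

cost, pct_a = best
print(f"Award {pct_a}% to A, {100 - pct_a}% to B; total cost ${cost:,.2f}")
```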

By no means is this series an exhaustive review of data analysis. For example, interesting technical advances such as Predictive Model Markup Language (PMML) are enabling predictive analytics to be bundled into everyday business processes. Scenario analysis is also a powerful tool for painting a picture of potential futures based on changes in behavior. But the vendors of these technologies must either make them accessible to end users or offer affordable services around them; otherwise they will remain exotic and inaccessible.
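
As a flavour of scenario analysis, the minimal sketch below projects category spend under a few hypothetical behavioral scenarios; the baseline figures and adjustment factors are placeholders, not real data.

```python
# A minimal scenario-analysis sketch; baselines and factors are hypothetical.
baseline_spend = {"MRO": 1_200_000, "Packaging": 800_000, "IT": 450_000}

scenarios = {
    "status quo":       {"MRO": 1.00, "Packaging": 1.00, "IT": 1.00},
    "5% demand growth": {"MRO": 1.05, "Packaging": 1.05, "IT": 1.00},
    "consolidation":    {"MRO": 0.92, "Packaging": 0.88, "IT": 1.00},  # fewer, bigger suppliers
}

for name, factors in scenarios.items():
    total = sum(baseline_spend[cat] * factors[cat] for cat in baseline_spend)
    print(f"{name:18s} projected spend: ${total:,.0f}")
```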

The bottom line is that analysis tools must be accessible to end users. It must be easy and fast to build datasets and gain insight from them. Optimization software should automatically perform sensitivity analysis for you, as the doctor has advocated. Ad hoc analysis should be the rule, not the exception. Analysis should not require vendor or IT support; if it does, it likely won’t happen.
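
As a rough illustration of the kind of sensitivity check that should happen automatically, the sketch below varies one hypothetical supplier's quote and reports where a single-source award decision flips. Real tools would sweep every cost driver and constraint, not one price.

```python
# A toy sensitivity check: vary supplier B's quote and see where the award flips.
# Demand and baseline quotes are hypothetical.
DEMAND = 100_000
PRICE_A = 2.10          # supplier A's quote
PRICE_B_BASE = 2.15     # supplier B's baseline quote

for change_pct in range(-10, 11, 2):
    price_b = PRICE_B_BASE * (1 + change_pct / 100)
    winner = "A" if PRICE_A * DEMAND <= price_b * DEMAND else "B"
    print(f"B quote {change_pct:+3d}%: winner {winner}")
```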

The more you look, the more savings you will find; and when you walk into the CFO’s office waving a check, you will get attention as well as the resources to find even more.

Previous: Analytics V: Spend “Analysis”
