Daily Archives: March 4, 2011

When Will Analysis Be Ubiquitous?

It seems that analysts across the board are finally recognizing the need for good data analysis. For example, this recent article in Industry Week on mastering complexity and driving out complication notes that what most manufacturers are missing today is an adequate way to analyze and interpret collected data in terms of its potential impacts and risks on the business. Add this to all the articles preaching the need for spend analysis to get direct, indirect, and logistics costs under control, and one sees that the need for analysis is now ubiquitous.

And yet only a small number of companies have solid analytical tools. Most companies still don’t have basic spend analysis applications that allow them to see where their organization’s spend is going, and far fewer have Enterprise Manufacturing Intelligence (EMI) applications. But until there are data analysis applications across the supply chain, significant cost-saving opportunities are going to go unidentified. So when will analysis be ubiquitous? How many more years are we going to have to wait?

Probabilistic Chips ARE NOT Going to Improve Your Supply Chain Software

It simultaneously amuses and scares me every time non-technical folk decide to write about a new piece of technology and how it is going to revolutionize whatever domain they regularly write about. The latest example is this recent piece over on Supply Chain Brain that says one should look for breakthrough technologies this year.

The article, after correctly noting that adoption of ERP systems led to:

  • Stumbling
    as the lack of depth in ERP planning functionality failed to deliver integrated planning
  • Failed Promises
    as the ERP is not the de facto data model for the enterprise, or even the end-to-end supply chain
  • A Lack of Agility
    as ERP has failed to deliver the sensing capabilities that would drive supply chain agility

went on to say that 2011 would be the year when breakthrough technologies using probabilistic chip logic, parallel processing for near-real time response, and artificial intelligence would hit the market. WHAT THE FRACK? Are you kidding me? Did Supply Chain Brain really publish this? Was it written by the Scarecrow from the Wizard of Oz? Let’s examine these technologies in more detail.

  • Parallel Processing
    Your average solution is already taking advantage of this. It’s called a multi-core chip, which has been standard in every server for years now. Sure, most applications are not written to be multi-threaded or to take advantage of multiple cores, but, in order to let developers handle increasing complexity and code-sprawl, most applications are written as multiple modules that run as their own processes, and the OS will balance those processes among the cores to speed up overall performance.
  • Artificial Intelligence
    We have been promised (true) AI for 55 years and it has never materialized. What makes you think the next 55 are going to be any different? And how are supply management applications going to deliver a technology that does not even exist yet?
  • Probabilistic Chip Logic
    Obviously the author has no fracking clue what PCL is. Because if the author did, the author would know that PCL, by its very definition, CANNOT improve computational results. In fact, what PCL actually does is WORSEN computational results. (It would make a decision optimization model worthless, since optimization requires millions of calculations and the propagated error would soon be so large that no accuracy would be left.) In some applications, like video and audio compression and decompression, small losses in accuracy are not only acceptable but often unnoticeable, and it turns out that if you are willing to accept those small losses, you can do the computations with significantly less energy.
    Most of the voltage required by a computer chip is used to overcome the electrical “noise” created by constantly moving electrons in the chip materials. If this noise is not drowned out by a high enough voltage, then the chip may not be able to accurately determine whether an electron flowed through one of its transistors. (Chips produce their bits, 0s and 1s, by measuring the absence or presence of an electron in a transistor.) If the voltage is decreased, the signal-to-noise ratio decreases and the probability of registering an incorrect bit increases. It turns out that the nature of electricity means that voltage (and energy) requirements can be significantly decreased if one is willing to accept a bit being mis-read x% of the time, which for some applications (like video and audio signal processing) only results in a small loss of precision.
    Thus, the use of a probabilistic chip can decrease your energy requirements (and the corresponding operational costs of your computing machinery), but it cannot improve the processing accuracy of any application using it (although it can speed the chips up slightly, since lower voltage means the chips can run faster without overheating). And, at least for now, one will get (considerably) more speed from parallel processing.
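The error-propagation point above can be illustrated with a toy simulation. This is a sketch, not a model of any real probabilistic chip: the bit-flip probability, the number of low-order bits affected, and the workload are all made-up assumptions chosen for illustration. It runs a million additions through a "probabilistic" adder that occasionally mis-reads low-order bits of the result, then compares the total against the exact sum:

```python
import random

def noisy_add(acc, x, p=1e-4, bits=8):
    """Add x to acc, then flip each of the low 'bits' bits of the
    result with probability p -- a crude stand-in for a probabilistic
    adder that mis-reads bits in exchange for lower voltage.
    (p and bits are illustrative assumptions, not real chip specs.)"""
    acc += x
    for b in range(bits):
        if random.random() < p:
            acc ^= (1 << b)  # flip bit b of the running sum
    return acc

random.seed(42)  # make the toy run repeatable
N = 1_000_000
exact = 0
noisy = 0
for i in range(N):
    x = i % 100
    exact += x
    noisy = noisy_add(noisy, x)

rel_err = abs(noisy - exact) / exact
print(f"exact sum: {exact}")
print(f"noisy sum: {noisy}")
print(f"relative error after {N:,} additions: {rel_err:.2e}")
```

Note that the drift is small here because only low-order bits are flipped; but an optimization routine comparing candidate solutions whose objective values differ by less than that drift would effectively be choosing the "better" answer by coin flip, which is why propagated error ruins decision optimization long before it ruins video decoding.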