Monthly Archives: July 2008

Kick-Starting Your Environmental Compliance Program

As a recent Industry Week article points out, with new environmental regulations popping up around the globe, ‘non-compliance’ is a ‘non-option’. Furthermore, if you’re a manufacturer, it’s not just your finished products that you have to be concerned with. You have to be sure that any new energy products are environmentally friendly, that chemicals used in the production process are non-hazardous and non-toxic, and that any waste products are properly disposed of.

In order to ensure you are in compliance, you need an environmental compliance program. But where do you start? Industry Week recently ran a great article that outlined what you need to do to Create an Effective Environmental Compliance Program.

  1. Document
    Make sure you have the data you need to demonstrate REACH and RoHS compliance.
  2. Analyze
    Be sure to analyze the effectiveness of any processes you already have in place. Things to consider:
    • resource requirements and associated overheads
    • current response times to customer requests
    • time lost and costs incurred responding to audits
    • costs of errors and unintentional non-compliant shipments that slip through
  3. Self-Assessment
    Consider the implications if you are found to be non-compliant.
  4. Appoint a Champion
    Someone has to spearhead the effort.
  5. Evaluate, Evaluate, Evaluate
    Compare the different solutions, their advantages and disadvantages, and select the best one.
  6. Implement
    Once you have identified the solution that has the best balance between functionality, flexibility, reporting, ease-of-use, implementation, commitment, and cost, you need to implement it.

Supply Chain Finance Slowly Takes Hold

It was nice to see the recent article by Henry Ijams of PayStream Advisors, Inc. in Supply & Demand Chain Executive on emerging payment and discount paradigms in the supply chain and the benefits of working capital optimization, which include:

  • paper reduction
  • liquidity injection into the supply chain
  • supply chain risk reduction
  • purchase-to-pay automation financing

The point of the article is that supply chain finance is finally starting to take hold, which is good, even if most people still confuse “discounts” with “finance”. Not that discounts are bad; appropriately used, they’re quite good. Appropriately defined early payment discounts save the buyer money, as the buyer pays a lower price, and save the supplier money, as the supplier doesn’t have to borrow at a higher rate. It’s a win-win.
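The arithmetic behind that win-win is worth making concrete. As a sketch (the figures below are illustrative, not from the article), here is the standard annualization of an early payment discount such as “2/10 net 30”, i.e. give up 2% of the invoice to be paid 20 days early:

```python
def annualized_discount_rate(discount_pct, discount_days, net_days):
    """Effective annualized return from taking an early-payment discount.

    For '2/10 net 30': the buyer keeps 2% of face value in exchange for
    paying (net_days - discount_days) = 20 days early.
    """
    d = discount_pct / 100.0
    return (d / (1.0 - d)) * (365.0 / (net_days - discount_days))

# A 2/10 net 30 discount is worth roughly 37% annualized to the buyer
print(f"{annualized_discount_rate(2, 10, 30):.1%}")
```

At roughly 37% annualized, the discount beats almost any short-term rate the buyer could earn elsewhere, which is exactly why “appropriately defined” matters: the discount only helps the supplier if its own cost of borrowing is higher still.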

The key to supply chain finance, as correctly noted in the article, is end-to-end e-Procurement with full automation when there is no discrepancy in the m-way match (between purchase order, goods receipt, invoice, and, if available, contract) and quick discovery and alerts when something doesn’t match (so the error can be corrected and payment approved before the discount window expires).
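A minimal sketch of such an m-way match follows; the document fields, SKU, and price tolerance are hypothetical, and a real system would also match line items against the contract where one exists:

```python
from dataclasses import dataclass

# Hypothetical document records for a single line item.
@dataclass
class Doc:
    sku: str
    quantity: int
    unit_price: float

def m_way_match(po, receipt, invoice, price_tol=0.01):
    """Return a list of discrepancies; an empty list means auto-approve."""
    issues = []
    if receipt.quantity != po.quantity:
        issues.append(f"receipt qty {receipt.quantity} != PO qty {po.quantity}")
    if invoice.quantity != receipt.quantity:
        issues.append(f"invoice qty {invoice.quantity} != receipt qty {receipt.quantity}")
    if abs(invoice.unit_price - po.unit_price) > price_tol:
        issues.append(f"invoice price {invoice.unit_price} != PO price {po.unit_price}")
    return issues

po = Doc("WIDGET-7", 100, 4.50)
receipt = Doc("WIDGET-7", 100, 4.50)
invoice = Doc("WIDGET-7", 100, 4.75)

# A non-empty list triggers an alert so the error can be corrected before
# the discount window expires; an empty list releases payment automatically.
print(m_way_match(po, receipt, invoice))
```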

However, supply chain finance is more than just discounts. It’s better forecasting. Better inventory optimization. Better financing options. A full suite of capital and cash management tools. Collaborative problem solving. And more importantly, it’s not shifting inventory to suppliers or increasing days payable outstanding (DPO). For a detailed discussion of supply chain finance, I would suggest that you check out the wiki-paper. I think it would be worth your time.

Design Cost Out with Akoya

Last year, I gave you a formal introduction to Akoya in my post Ahoya, Akoya and their unique solution for reducing direct material spend, which was described by their co-founder and president Brett Holland in his posts on Getting Ahead of the Product Cost Management Curve and Taking Control of Cost Management for Engineered Direct Materials over on Spend Matters. (With additional information to be found in their short paper on why analytically derived should-cost information is critical for improving product margins.)

When I was back in the mostly windy city recently, I had a chance to catch up with Akoya and discuss their new product offering, currently in beta with a few select customers, which is designed to complement their existing product offering and help organizations save even more on their manufactured part purchases.

Their current product offering, Category Workbench, allows a company to identify which parts it is likely paying too much for, using a technique Akoya calls “competitive banding”. By extracting identifying features and product composition information from part designs, Category Workbench can automatically group parts into categories by common design elements and raw material composition and extrapolate average prices. Using this information and market costs, it can also statistically extrapolate expected prices for each part in a category and identify those parts currently being sourced below market price, at market price, and above market price. This allows the company to identify the parts that present savings opportunities through re-sourcing, re-negotiation, or re-design. With this information in hand, a company knows not only where its sourcing teams should direct their sourcing efforts, but where its engineering teams should direct their redesign efforts. This is very important because re-design and re-costing of even a simple part through a system like MTI Systems’ Costimator or Apriori’s Virtual Product Environment* can take a design engineer the better part of a day at the low end, and a few days at the high end, and an engineer’s time is very expensive. A company that sources thousands of direct parts can only attack a few hundred parts in a given year, and the wrong choice (based simply on a spend analysis by volume, supplier, or cost) can cost the company more money than the redesign effort will save.
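Akoya has not published the statistics behind competitive banding, but the core idea can be sketched with a simple within-band z-score test. The band names, prices, and the 1.2 cut-off below are illustrative assumptions, not Akoya’s actual method:

```python
from statistics import mean, stdev

# Hypothetical part records: (part_id, competitive_band, current_price).
parts = [
    ("P1", "steel-bracket", 1.10), ("P2", "steel-bracket", 1.05),
    ("P3", "steel-bracket", 1.95), ("P4", "steel-bracket", 1.00),
    ("P5", "alum-housing", 7.40), ("P6", "alum-housing", 7.55),
]

def flag_overpriced(parts, z_cut=1.2):
    """Flag parts priced well above the average for their band."""
    bands = {}
    for pid, band, price in parts:
        bands.setdefault(band, []).append((pid, price))
    flagged = []
    for band, members in bands.items():
        prices = [price for _, price in members]
        if len(prices) < 3:
            continue  # too few parts in the band to estimate a spread
        mu, sigma = mean(prices), stdev(prices)
        for pid, price in members:
            if sigma > 0 and (price - mu) / sigma > z_cut:
                flagged.append(pid)
    return flagged

print(flag_overpriced(parts))  # P3 is priced well above its band
```

The flagged parts are the candidates for re-sourcing, re-negotiation, or re-design; everything at or below its band average is left alone.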

Akoya’s next product, Designer Workbench, will allow a company to dive into the parts where Category Workbench indicates a potential savings opportunity and determine the potential extent of that opportunity in minutes instead of hours, or even days. In addition to a number of new search, costing, and CAD data support features, which I plan to dive into at a later date after the product is generally available, one of the significant new features is a new “What If” Analysis Tool. Using market pricing and price data from other products in the competitive band, this tool will allow an engineer to quickly create virtual variants with different features, treatments, processes, and raw materials and calculate estimated costs in real time. In an average of less than 15 minutes, a design engineer can iterate through a number of options and identify not only a lower cost alternative, but the high-level design features that the lower cost alternative needs to have. In beta tests with an existing client, the estimated costs produced by the solution have been found to be accurate to 95% or better (when compared with detailed Costimator analyses, which take an average of 6 hours for the customer in question). Needless to say, these are some amazing results, and I suspect that a forward-thinking company that properly utilized this solution in conjunction with complementary solutions would see incredible returns. But that’s also a subject for a later post.

* Although re-design and re-costing through Apriori’s Virtual Product Environment can be done in a matter of minutes once the environment is configured and appropriate cost information entered, setting up one of these environments usually takes days, and often requires the assistance of Apriori personnel. 
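To make the “What If” iteration concrete, here is a toy sketch under an assumed additive cost model (material cost times mass, plus per-part process and treatment adders). All of the names and rates are invented for illustration; Akoya’s actual model, which draws on market pricing and competitive-band data, is certainly more sophisticated:

```python
from itertools import product

# Illustrative rate tables (not real market data).
materials = {"cold-rolled steel": 0.90, "aluminum 6061": 2.40}   # $/kg
processes = {"stamping": 0.35, "CNC milling": 1.80}              # $/part
treatments = {"none": 0.00, "zinc plating": 0.25, "anodize": 0.40}

def estimate(material, process, treatment, mass_kg):
    """Assumed additive cost model for one virtual variant."""
    return materials[material] * mass_kg + processes[process] + treatments[treatment]

def what_if(mass_kg):
    """Enumerate all virtual variants and rank them by estimated cost."""
    variants = [
        (round(estimate(m, p, t, mass_kg), 2), m, p, t)
        for m, p, t in product(materials, processes, treatments)
    ]
    return sorted(variants)

# The engineer scans the cheapest few variants for a viable redesign
for cost, m, p, t in what_if(0.5)[:3]:
    print(f"${cost:.2f}  {m} / {p} / {t}")
```

Even this toy version shows why the approach is fast: the engineer evaluates dozens of variants per second instead of re-costing each candidate design by hand.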

Lies, Damn Lies, and Statistics

Hopefully you caught The Brain’s much needed lesson in statistics back in January, as it was very informative. (If you didn’t, you can still go back and read it. Heck, even if you did, it probably wouldn’t be a bad idea to go back and read it again.)

The reason I’m pointing it out again is that I just noticed that Knowledge@Wharton put out a great summary of some of the key points in their article on The Use — and Misuse — of Statistics: How and Why Numbers Are So Easily Manipulated. Even though they had to go and use what is, in my view, the waste-of-time, waste-of-print, and waste-of-breath story of Roger Clemens and his alleged (ab)use of steroids as a backdrop, they still made some great points regarding statistics, which need to be reiterated every now and again (because it seems that the vast majority of people who like to do statistical studies and quote statistical results still don’t understand what statistics is really all about).

  • Correlation is not Causation!
    As the article notes, a chain of retail stores may analyze its operations for a set period and find that those times when it reduced its sales prices coincided with times that overall sales fell. The chain might conclude that low prices reduce sales volume when, in fact, it could be the case that stores run semi-annual sales during known down periods. In other words, low sales are causing price declines and not the other way around.
  • It’s much easier to isolate and exclude extraneous data when you have experimental or hard-sciences data.
    In post-activity analysis in a business setting, it’s much more difficult to isolate the effects of a variety of other influences — and any attempt to simplify will most likely lead to incorrect results.
  • Comparing your situation only to those that produced positive effects is selection bias — and it’s wrong! Samples must be random.
    The example the authors use is that the Clemens report tried to prove he didn’t do steroids by noting that there are other examples of professional baseball players, like Nolan Ryan, Randy Johnson, and Curt Schilling, who also enjoyed great success in their 40s. However, that’s atypical behavior. The vast majority of players, and pitchers in particular, steadily improve in their early careers, peak at about 30, and then slowly decline. Clemens started declining in his late 20s and then rebounded and improved in his 40s.
  • A single, short-term study on a small population is not conclusive! Especially if the population is not representative of the population at large!
    The example given here is a lawsuit filed against the Coca-Cola Company over its marketing for Enviga, its caffeinated green-tea drink, which claims the drink actually burns more calories than it provides, resulting in ‘negative calories’. The claim is based on a clinical study of a small group of individuals with an average BMI (Body Mass Index) of 22. However, the majority of the American population has a BMI of 25 or more. Thus, it’s not statistically reasonable to say that the study is representative of the population at large.
  • An accounting of the entire testing process is required for proper perspective in interpretation.
    So you found a statistically significant effect, correlation, or difference between some set of variables. If you don’t report the twenty-one insignificant tests you ran before you found that one significant result, how do you know it wasn’t a fluke, and that the test shouldn’t be repeated?
  • Data-driven studies can’t always tell you the right answer.
    All they can tell you is which answers to eliminate because the data does not support them. The true value of a statistical analysis is that it helps users to properly characterize uncertainty as opposed to a “best guess”, to realize what outcomes are statistically significant, and to answer specific hypotheses.
  • You have to understand what the drivers behind the variables are if you are to have any hope of making a correct interpretation!
    Consider the example of major league baseball outfielders. A hypothesis going into such a study might be that outfielders have a harder time catching balls hit behind them, which forces them to run backwards. You’ll likely find that the opposite is true – that outfielders tend to catch more balls running backwards, even though this seems counter-intuitive at first. However, when you consider the hang-time of the ball, and the fact that balls hit farther are in the air longer, which gives the outfielder more time to catch them, it starts to make sense.
  • The validity of a statistical analysis is only as good as its individual components.
    And if even one component is invalid, the whole work is invalid.
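The “twenty-one insignificant tests” point is easy to demonstrate by simulation: under a true null hypothesis a p-value is uniform on [0, 1], so running 22 independent tests at the 5% level yields at least one “significant” fluke about two-thirds of the time. A minimal sketch:

```python
import random

random.seed(1)  # fixed seed for reproducibility

ALPHA, TESTS, TRIALS = 0.05, 22, 10_000

# A uniform random draw stands in for each test's p-value under the null.
hits = sum(
    any(random.random() < ALPHA for _ in range(TESTS))
    for _ in range(TRIALS)
)
print(f"At least one 'significant' fluke in {TESTS} tests: {hits / TRIALS:.0%} of runs")
# Analytically: 1 - 0.95**22, approximately 0.68
```

This is why the unreported failed tests matter: without them, a ~68%-likely fluke is indistinguishable from a real finding.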

Junket Junking with James Jin

While I was visiting MFG headquarters in Atlanta yesterday, I had the pleasure of sitting down for a few minutes with James Jin, who runs MFG’s Shanghai office (which covers all of Asia at the moment). James, who hosted last year’s webinar on Surviving China’s Rapidly Changing Sourcing Tides, is a rare character who not only deeply understands both the North American business world and the Chinese business world, but who also sees what’s needed to bridge them into a more seamless global marketplace, which is something he does on a daily basis through MFG. It was a pleasure to discuss what he thought were the most critical areas of focus for MFG, and for other companies doing business in China, as well as what his biggest challenges were, because the conversation highlighted not only the differences in doing business with China, but also the similarities – which I don’t think enough people spend enough time on.

From James’ perspective, the three most critical aspects that MFG (and other businesses entering or doing business in the Chinese marketplace) needs to focus on are:

  • No international business can afford to miss the China market
    Low cost country sourcing might be going away, or changing in nature, but China is here to stay as not only a huge supply base, but an economic development opportunity.
  • There is a huge buy-side demand, growing larger by the day.
    Not only does China have one of the largest emerging middle classes in the developing world, it already has one of the largest middle classes in the world. (Think about it: they have over 1.3 Billion people!) China is not just a large supply-side opportunity; it is also a large sell-side opportunity for any company that can offer the right products at the right prices.
  • The key to success is to think globally, but act locally.
    Doing business in China requires a balance and an understanding of local culture and geography.

However, James’ answer to my inquiry about the three biggest challenges to doing business in China was even more revealing about what it takes to succeed there than his answer about the three most important areas of focus. In short, the three biggest challenges are:

  • People
  • People
  • People

Despite the cultural and language differences, the reality is that doing business in China is just like doing business with most developed countries. Chinese companies are very mature in their approach to business: they understand that business ebbs and flows in a repeated cycle, and that just because they’re getting a lot of business today doesn’t mean they’ll be getting a lot tomorrow, so they’ll have to work to win and keep new business. They have new plants with industry-leading technology; in fact, some suppliers have the most modern plants in the world for what they produce, thanks to recent investments to keep up with Western demand. And the leaders understand that it’s all about the people, as almost everything but the human equation can be automated these days. Yet, even with 1.3+ Billion people, it’s a constant struggle to find the right people with the right education and the right skills for the job, especially when they have to play in a global marketplace.

As for what’s going on with the rest of MFG, you’ll be hearing a lot more from them next quarter, and I’ll have more to discuss at a future time as well, but for now, I need to hit the road again and wonder just what Willie was thinking. (Obviously, he didn’t drive his own tour bus!!!)