A recent article over on the HBR blogs on how your brain connects the future to the past discussed some recent studies suggesting that the areas of your brain that remember the past and the areas that imagine the future are almost one and the same. More specifically, the brain’s memory circuits are not merely for reflecting on the past but are also vital mechanisms for imagining, anticipating, and preparing for the future — a skill that each of us needs daily in this fast-paced, knowledge-driven economy.
In the business world, a brain that anticipates future demands and negotiates them well is a distinct advantage, because accurate predictions typically translate into success. A proactive brain flexibly recombines details from past experiences that, by analogy with your current surroundings, help you make sense of where you are, anticipate what will come next, and successfully navigate the transition — all of which improves your performance. But how do you get a proactive brain?
The article provided some tips, which included:
- thinking about your (organization’s) goals for the future,
- giving your brain a rich bank of experiences, and
- interacting with others.
In short, increasing your CQ will increase your performance. So what’s CQ? That’s the subject of a new 10-part series, edited by Dick Locke, SI’s resident expert on international trade, that starts tomorrow!
… but the ability to clean it on the fly is better!
Chain Link Research, which has been publishing some of the best thought leadership on Supply Chain Management in recent months, recently ran a piece on contract and supplier management lessons that summarized eight key lessons from their recent research. Seven of these are dead on and emphasize lessons I’ve been trying to impart for years (including a couple that still haven’t been learned by most of the space).
The eighth lesson, which states that data cleanliness cannot be overemphasized, is correct, but it overlooks the fundamental problem with data — it will never be 100% clean. Even if you have one hundred bodies manually reviewing and cleansing the data (which is exactly what you get if you buy a certain vendor’s solution, since that’s their unwritten strategy for dealing with all the transactions their automated mapping algorithm is unable to classify), you’re not going to get it all right. First, data is always being added to the system, so you’ll never be 100% up to date. Second, classifications need to change over time. And, most importantly, humans make mistakes: while they’ll fix some errors correctly, they’ll botch others (and miss some entirely).
The real key to success is having a data analysis tool that lets you fix an error in real time, as soon as it’s spotted — not a traditional data warehouse where you have to wait weeks (or months) for the next refresh. Then you can get away with 80% to 90% accuracy* (which is all you need to figure out where the problems really lie) because, if a supplier or customer spots an error in the data, you can say “sorry, let me fix that”, click on the transaction, click on the link that shows the rule that produced the mapping, and either (a) change the rule if it is wrong or (b) create a new exception (overlay) mapping rule if the rule is normally right but this is a special case. The report is updated, very little changes in the big picture, and you move on. That’s the way you do it.
* You can achieve this level of mapping accuracy in a matter of days, creating rules by hand, no matter how much data you have. All you have to do is apply the secret sauce of:
- Map the GL codes
- Map the top Vendors
- Map the Vendor + GL codes (for top Vendors who sell more than one Commodity)
- Map the Exceptions (for example, GL codes that always map to a particular Commodity)
- Map the Exceptions to the Exceptions**
** If your data is really bad or you have a really sophisticated categorization scheme.
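The layered mapping above, and the on-the-fly fix described earlier, can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual implementation, and all the table names, GL codes, and vendor names here are hypothetical: at classification time the layers are consulted most-specific-first, so a narrower rule always wins, and an error is corrected immediately by adding an override rule rather than waiting for a refresh.

```python
# Hypothetical rule tables, one per layer of the "secret sauce":
gl_rules        = {"6100": "Office Supplies"}       # 1. GL codes
vendor_rules    = {"Dell": "IT Hardware"}           # 2. top Vendors
vendor_gl_rules = {("Dell", "6150"): "Software"}    # 3. Vendor + GL code
exceptions      = {"6900": "Travel"}                # 4. GL codes that always map
exception_overrides = {}                            # 5. exceptions to the exceptions

# Layers checked most-specific-first, so narrower rules win.
LAYERS = [
    (exception_overrides, lambda v, g: (g, v)),
    (exceptions,          lambda v, g: g),
    (vendor_gl_rules,     lambda v, g: (v, g)),
    (vendor_rules,        lambda v, g: v),
    (gl_rules,            lambda v, g: g),
]

def map_commodity(vendor, gl_code):
    """Classify a transaction using the first (most specific) matching rule."""
    for table, keyfn in LAYERS:
        key = keyfn(vendor, gl_code)
        if key in table:
            return table[key]
    return "UNMAPPED"

print(map_commodity("Dell", "6150"))        # Software (Vendor + GL beats Vendor)
print(map_commodity("RelocateCo", "6900"))  # Travel (GL exception)

# A supplier spots an error: RelocateCo's 6900 spend is Relocation, not
# Travel. Add an override rule and the very next lookup is correct --
# no warehouse refresh required.
exception_overrides[("6900", "RelocateCo")] = "Relocation"
print(map_commodity("RelocateCo", "6900"))  # Relocation
```

The point of the structure is that a fix is always either (a) editing one rule in one table or (b) adding one narrower override, which is why the correction can happen in real time while the big picture barely moves.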