Collaborate, Collaborate, Collaborate, Collaborate V

Recently, Computer Sciences Corporation (CSC), Supply Chain Management Review (SCMR), and Michigan State University (MSU) released the “Fifth Annual Global Survey of Supply Chain Progress”.

The report measured the performance of firms along eight dimensions of supply chain competence:

  • Alignment with Business Strategy
  • Strategic Customer Integration
  • Strategic Supplier Integration
  • Cross-Functional Internal Integration
  • Supply Chain Responsiveness
  • Planning & Execution Process & Technology
  • Supply Chain Rationalization / Segmentation
  • Risk Management

The report found that less mature companies need to focus on greater collaboration with business partners and pay more attention to areas of weakness. Another mark of the leaders was greater strategic alignment and significant, positive involvement of top managers.

But I think it’s pretty obvious that collaboration is the ultimate key. What better way to mutually identify and improve the areas of weakness? What better way to improve strategic alignment? What better way to maximize the positive involvement of management? Furthermore, without collaboration, you’ll never truly achieve strategic integration with customers, suppliers, or internal departments.

So you want to achieve collaboration, but aren’t sure how to sell it? A recent CAPS Research study by Stanley Fawcett, Gregory Magnan, and Jeffrey Ogden, summarized in “How to Manage Supply Chain Collaboration”, puts forward a three-step process to identify and compare the benefits, barriers, and bridges in order to assess and communicate the viability of pursuing a path toward collaborative advantage. The three stages are as follows:

  • Introspection
    A company’s orientation and philosophy consists of two building blocks: customer orientation and systems thinking orientation.
  • Supply Chain Design
    A five-step process: scan, map, cost, manage competency, and rationalize.
  • Supply Chain Collaboration
    Relationship alignment, information sharing, performance measurement, people empowerment, and collaborative learning.

By figuring out where your company is, and then working your way through a proper supply chain design planning exercise, you’ll be in a position to align your relationships, share information, measure your performance, and progress collaboratively.

the doctor Explains the Meaning of the Word Share

Thanks to the tireless efforts of Sesame Workshop (formerly known as the Children’s Television Workshop, or CTW, to us old-schoolers), I thought everybody knew the meaning of the word share. However, a recent visit to the good old ISM – or should that be the curmudgeonly old ISM? – proved me wrong.

I found out that they had an article titled “Spread the Word – Sharing the Value of Supply Management” and was intrigued. However, and I really shouldn’t be surprised by this, the article is for paying members only. That’s not “sharing”. That’s “hoarding”.

To explain the concept, I’ve asked Elmo and Zoe, from the Sesame Workshop, to help me out. Take it away, gang.

Zoe: Hi Elmo! What’s that?
Elmo: That’s Elmo’s shiny new red ball.
Zoe: Can I see it?
Elmo: I don’t know. Elmo really likes it.
Zoe: I’ll give it back.
Elmo: Okay. Here you go.
Elmo tosses the ball to Zoe.
Zoe: Thank you. It’s a very nice ball! Catch!
Zoe tosses the ball back to Elmo.
Elmo: Yes it is! Catch!
Elmo tosses the ball back to Zoe.
Zoe: We’re playing ball!
Zoe tosses the ball back to Elmo.
Elmo: And we’re sharing!

There you have it. Pretty easy concept, isn’t it? And it makes everyone happy.

As a bonus lesson, the doctor is also going to tell the ISM about the word irony. Dictionary.com Unabridged defines irony as “the use of words to convey a meaning that is the opposite of its literal meaning”. Alanis Morissette had a really good explanation of the concept, which can now be found on YouTube.

Roles of Performance Metrics in the (Out)Sourcing Process

I recently stumbled upon “Strategic Sourcing: Measuring and Managing Performance” again, a report prepared by RAND for Project Air Force back in 2000. Barely a page into the summary, it notes that, in managing their relationship, a customer and a provider jointly choose metrics that they believe will support the corporate goals of the customer organization. The goal is that such metrics align the provider’s priorities with those of its customer. Since procurement outsourcing is going to become the rage, as per my “Why You Should Consider Procurement Outsourcing”, it’s important to understand the role of metrics in the sourcing, and outsourcing, process – you never know when your CEO is going to get the outsourcing bug.

The paper stresses that customers (should) tend to focus on metrics that easily convey to providers the dimensions of performance most important to them. However, although the summary notes that customers and providers tend to refine the set of metrics used to measure performance throughout the relationship, it fails to mention that providers tend to favor metrics that make them look good. It implies relatively early on that providers may push their own choice of metrics, but it doesn’t make this particularly clear. In particular, there are providers that will try to steer the customer toward transaction-oriented and fixed cost-reduction metrics that are easy to measure, manage, and achieve.

Even though the key to a great outsourcing relationship is for a provider to ensure that metrics are aligned with those aspects of performance that matter most, not all outsourcing providers recognize this fact, and even those that do may not be able to identify which aspects of performance matter most to a customer. The fact of the matter is that there is no magic set of metrics that fits every sourcing need – and different managers at different levels within the customer firm may want different metrics, especially if the customer does not have its house in order before jumping on the outsourcing bandwagon.

It’s important that the metrics chosen focus on the strategic goals of outsourcing the process, and not the transactional nature of the process being outsourced. Specifically, are costs being reduced, are they being reduced with approved, quality suppliers, and are they tracking well against market averages? Has overall spend throughput in the outsourced categories increased by an acceptable percentage? What percentage of spend is being captured in the system? Have project timelines decreased on average?

After all, if costs are being reduced, but are still more than market averages, then outsourcing is not working very well. If overall spend throughput through strategic sourcing projects has not increased, then outsourcing has not succeeded. If the provider is not capable of capturing 100% of sourced spend in their systems, then their technology is not up to snuff. If project timelines have not decreased, then the customer is better off doing everything in house.
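The strategic checks above are simple arithmetic once the numbers are in hand. Here's a minimal Python sketch of that arithmetic; the function name, figures, and thresholds are illustrative assumptions on my part, not anything from the RAND report.

```python
# Hypothetical sketch: scoring an outsourced category against strategic goals.
# All names and figures below are illustrative, not from the report.

def evaluate_outsourcing(baseline_cost, current_cost, market_avg_cost,
                         sourced_spend, captured_spend):
    """Return simple strategic metrics for an outsourced category."""
    cost_reduction_pct = (baseline_cost - current_cost) / baseline_cost * 100
    vs_market_pct = (current_cost - market_avg_cost) / market_avg_cost * 100
    spend_capture_pct = captured_spend / sourced_spend * 100
    return {
        "cost_reduction_pct": round(cost_reduction_pct, 1),
        "vs_market_pct": round(vs_market_pct, 1),   # positive = above market
        "spend_capture_pct": round(spend_capture_pct, 1),
    }

metrics = evaluate_outsourcing(
    baseline_cost=1_000_000, current_cost=900_000,
    market_avg_cost=880_000, sourced_spend=5_000_000,
    captured_spend=4_750_000,
)
# Costs fell 10%, but are still above market, and 5% of spend is escaping
# the system -- a headline cost reduction alone doesn't mean success.
```

In this made-up scenario the category shows a 10% cost reduction yet still sits roughly 2% above market average with only 95% spend capture, which is exactly the pattern the paragraph above flags as a failing relationship.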

Furthermore, the set of metrics chosen should be helpful in making the in-house vs. outsource decision, should be useful in selecting the right provider, should be capable of measuring the progress of the relationship, and should promote continuous improvement. In addition, when evaluating a potential provider, the customer should have a set of metrics that address total cost, service quality, HR policies, technological capability, financial stability, and other special interests of the customer, such as carbon neutrality goals.

Procurement outsourcing, like any type of outsourcing, is not an easy decision and should not be made quickly or lightly. Significant research should be done up front by both parties, because the full value of a successful relationship will generally not be realized for three to five years, as there are a lot of up-front costs in making the transition, and outsourcing adds head count (you need people on your team overseeing and managing the relationship). Even though outsourcing the right categories to the right provider can be a great success, outsourcing the wrong categories to the wrong provider can significantly increase costs. So do your research, and take some time to find the set of metrics that will work for you.

the doctor Thinks 5 Billion for Business Intelligence is Too Much!

IBM is planning to buy Cognos for 5 Billion. That’s right — 5 Billion! Now I know that Cognos stock is trading at around $57 as I compose this post, that Cognos has a Market Cap somewhere around 4.76 Billion, and that Cognos’ annual revenue exceeds 1 Billion a year (1.02 Billion, actually) – but 5 Billion for a business intelligence (BI) solution?

According to Wikipedia, BI is a set of concepts and methods to improve business decision making using fact-based, data-driven support systems. Furthermore, BI systems provide historical, current, and predictive views of business operations, most often using data that has been gathered into a data warehouse or a data mart, and typically support reporting, interactive “slice-and-dice” pivot-table analyses, visualization, and statistical data mining.

Cognos, in particular, offers a business intelligence solution that includes reporting, analysis, scorecarding, dashboards, business event management, and data integration. Functionality that, I’d like to point out, is offered by just about every major spend analysis company in the space – most with market caps well under $1B, and many with market caps closer to the 100M range.

OK, so Cognos also offers a “planning” solution to create plans and budgets – but there are dozens of shareware products that do that; a “controller” product that does financial reporting – like most accounting systems; and “performance applications”, or ready-built reports, analysis, and metrics for key business functions – which you can get from any provider with a reporting application. They also support deep integration with major database and ERP platforms like Oracle, which I’ll admit takes quite a bit of work when you consider how extensive these types of solutions are and how they tend to change from version to version, and this has some value. But pretty much everything these days supports an XML interface – so as long as you can map the relevant parts of their XML schema to yours, any BI tool will do from an analysis perspective.

Now, Cognos was one of the first, and in the early days one of the best, of the original BI players, but when it comes to analysis, there are a number of other players that offer more-or-less equal capabilities. Furthermore, many modern players have built their tools for less than 1% of what IBM is proposing to pay for Cognos.

So why can’t IBM develop their own solution on a fraction of that budget? Are they afraid they’ll repeat the SAP fiasco, where SAP spent millions upon tens of millions upon probably hundreds of millions of dollars trying to build an enterprise-level data warehouse and BI system, failed, and ended up buying Business Objects for 6.8B? Is it because they’re afraid it’ll take too long – and that SAP will get a big lead on them in the BI marketplace? If so, then buying a solution is the right way to go – but 5 Billion? Especially when there are other solutions with the same amount of power that they could get for less than a tenth of that!

Maybe they are looking at it as a future revenue stream – at 1B a year, they’d get their money back in five years – provided that people keep paying for the solution at current prices. But what happens when their customer base realizes that there are alternative solutions that could meet their needs for a fraction of the cost?

Maybe they’re also looking at it as additional manpower in sales and marketing – Cognos is known for its strength there in the BI space. But I think IBM’s doing fine in that regard.

Now I know you’re saying that they’re just playing the game and doing what it takes to be competitive, but I always thought IBM’s goal was to be the leader, not a follower. When they realized in the 90’s that if they didn’t fix their supply chain they could soon be on the verge of collapse, they did it. When they realized that if they wanted to play in the chip market they had to go big or go home, they did it. And when they realized that the only way to survive in the SAN arena was to innovate, they did it. So why can’t they innovate here? It’s not hard – just build a small team of smart people, give them a decent budget and access to the best resources in the company, and, most importantly, give them the authority to make their own decisions (and not get crushed under the weight of the opinion of every director and his dog).

Now, I am a big believer in Business Intelligence, but I’m a bigger believer first and foremost in intelligence – and I just don’t see the intelligence in overpaying for a solution when that money could be better spent on true innovation. With the right team, they could spend a tenth of that, buy the up-and-coming player with the best technology, use the knowledge of the IBM Global Services AMS migration factory (who probably know more about data migration than all of the big 5 system integrators put together) to integrate with just about every major ERP and database platform under the sun, and spend the rest on innovating new capabilities that would knock your socks off.

That’s my opinion. Any shareholders of IBM want to offer a counter-opinion? (It’s essentially your money that IBM is spending.)

Algorhythm and the Optimization Rhythm in India

Recently, I had the pleasure of having a couple of conversations with Ajit Singh, the Founder and Director of Algorhythm, a company in Pune, India that has significant expertise in Optimization and Supply Chain Modeling. They have their own optimization engine, a set of front-ends for different types of supply chain models that can be used by anyone with modeling skills, and significant experience in helping large global multi-nationals with significant supply chain network design and optimization problems. Basically, they’re India’s CombineNet, but with a slight distinction – every model they build, including custom models, can be executed and modified completely by the client through an extension of their easy-to-use Windows-based front end – you are not tied to their services. In comparison, although CombineNet has done a great job over the past few years of building stand-alone products and interfaces, it’s still often the case that custom models are only available through their services model.

Algorhythm has the capabilities to attack both strategic and tactical supply chain problems from an optimization and simulation perspective. They have sophisticated models for strategic planning, including inventory optimization, distribution network design, and manufacturing network design, and for tactical execution, including production planning, logistics planning, and supply network execution.

They also have specialized solutions for oil, steel, and packaging, as well as a considerable amount of experience creating models for manufacturers and distributors. Major clients include Unilever (Hindustan Unilever, Unilever Plc. UK, and Unilever China), ThyssenKrupp, Hindustan Petroleum, and Parle Products, among dozens of others. Their manufacturing and distribution network design models often save their clients 3-5%. Remember that we’re talking production models here – not sourcing models – so this is actually quite good. In terms of efficiency, their production planning and scheduling models often halve throughput time and inventory carrying requirements – which is also very good. Furthermore, we’re not talking small models here – Parle, for example, ships 50K trucks per year per SKU from hundreds of factories to thousands of wholesalers.

It’s quite easy to build a model in their products, which they call Prorhythm (for production-planning models), Netrhythm (for network-planning models), and Logrhythm (for logistics-planning models), and which run on top of their Xtra Sensory optimization engine. They’ve thought through what the model is, what the core elements are that make it up, what the costs are, and what measures you might want to optimize. Building a model is simply a matter of defining all the relevant entities (factories, lines, outputs, inputs, etc. in production planning), the associated costs (material, labor, overhead, etc.), the measure(s) you want to optimize (cost, throughput, etc.) and their priority / weighting if there are multiple, and the constraints. It assumes all relationships between related entities are valid unless you specify them as invalid (and permits groupings for easy constraint definition). It also groups constraints in a “constraint file” so you can easily run the same model against different constraint sets. Basically, it’s built to build models the way the doctor would build them.
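To make the pattern concrete (entities, costs, weighted objectives, and a separate constraint file), here is a hypothetical sketch in Python. To be clear: this is not Algorhythm's actual API or file format; every name and number in it is invented purely to illustrate the structure described above.

```python
# Illustrative sketch of the modeling pattern only -- NOT Algorhythm's
# actual API or file format. All names and values are invented.

model = {
    "entities": {
        "factories": ["F1", "F2"],
        "lines": {"F1": ["L1", "L2"], "F2": ["L3"]},
        "outputs": ["SKU-A", "SKU-B"],
    },
    "costs": {
        "material": {"SKU-A": 2.10, "SKU-B": 3.40},   # per unit
        "labor":    {"L1": 0.50, "L2": 0.55, "L3": 0.45},  # per unit
        "overhead": {"F1": 1000.0, "F2": 1200.0},     # fixed per period
    },
    # Multiple measures to optimize, with priority weightings.
    "objectives": [("minimize_cost", 0.7), ("maximize_throughput", 0.3)],
}

# All entity relationships are assumed valid unless declared invalid,
# and constraints live in a separate "constraint file" so the same
# model can be run against many different constraint sets.
constraint_file = {
    "invalid": [("L3", "SKU-B")],   # line L3 cannot produce SKU-B
    "capacity": {"L1": 10_000, "L2": 8_000, "L3": 12_000},  # units/period
}
```

Swapping in a different `constraint_file` while keeping `model` fixed is what lets you re-run the same model against alternative constraint sets, as described above.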

Since there is no single optimal solution when you’re optimizing against multiple objectives – it’s almost always impossible to precisely normalize each measure to a uniformly distributed 0-1 interval that can then be weighted as desired – they also support simulation. You can tell the optimizer to construct a set number of models equally distributed around the desired optimization point, and it will automatically create and run all of the variants, which you can then compare to see how slight changes impact solutions and goals.
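The normalize-and-weight idea, and why you'd want to run variants around a chosen point, can be sketched generically in Python. This is a plain weighted-sum illustration under assumed names, bounds, and weights; it says nothing about how Algorhythm's engine actually works internally.

```python
# Generic weighted-sum sketch (assumed names and numbers), showing why
# multi-objective "optima" deserve a sensitivity check via variants.

def weighted_score(measures, bounds, weights):
    """Normalize each measure to [0, 1] via assumed bounds (lower raw
    value = better), then combine with weights. The normalization is
    exactly the imprecise step the text above warns about."""
    score = 0.0
    for k, w in weights.items():
        lo, hi = bounds[k]
        score += w * (measures[k] - lo) / (hi - lo)
    return score

def weight_variants(weights, delta=0.1):
    """Perturb each weight up and down, renormalizing to sum to 1,
    to produce scenarios distributed around the chosen weighting."""
    scenarios = []
    for k in weights:
        for d in (-delta, delta):
            w = dict(weights)
            w[k] = max(0.0, w[k] + d)
            total = sum(w.values())
            scenarios.append({kk: vv / total for kk, vv in w.items()})
    return scenarios

weights = {"cost": 0.6, "throughput_time": 0.4}
bounds = {"cost": (90, 120), "throughput_time": (5, 12)}
plan = {"cost": 100, "throughput_time": 7}

base = weighted_score(plan, bounds, weights)
# Re-score the same plan under each nearby weighting to see how
# sensitive its ranking is to the (somewhat arbitrary) weights.
scores = [weighted_score(plan, bounds, w) for w in weight_variants(weights)]
```

If `scores` spreads widely around `base`, the "optimal" plan is fragile with respect to the weighting, which is precisely what running variants around the desired optimization point is meant to reveal.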

It’s a great offering, and the people are quite knowledgeable. If you have a tough optimization problem, be sure to check them out. They might surprise you.