Daily Archives: November 14, 2007

the doctor Thinks 5 Billion for Business Intelligence is Too Much!

IBM is planning to buy Cognos for 5 Billion. That’s right – 5 Billion! Now I know that Cognos stock is trading at around $57 as I compose this post, that Cognos has a Market Cap that is somewhere around 4.76 Billion, and that Cognos’ annual revenue exceeds 1 Billion a year (1.02 Billion, actually) – but 5 Billion for a business intelligence (BI) solution?

According to Wikipedia, BI is a set of concepts and methods to improve business decision making by using fact-based, data-driven support systems. Furthermore, BI systems provide historical, current, and predictive views of business operations, most often using data that has been gathered into a data warehouse or a data mart, and typically support reporting, interactive "slice-and-dice" pivot-table analyses, visualization, and statistical data mining.

Cognos, in particular, offers a business intelligence solution that includes reporting, analysis, score-carding, dashboards, business event management, and data integration. Functionality that, I'd like to point out, is offered by just about every major spend analysis company in the space – most with market caps well under $1B, and many with market caps closer to the $100M range. Ok, so Cognos also offers a "planning" solution to create plans and budgets – but there are dozens of shareware products that do that; a "controller" product that does financial reporting – like most accounting systems; and "performance applications", or ready-built reports, analyses, and metrics for key business functions – which you can get from any provider with a reporting application. They also support deep integration with major database and ERP platforms like Oracle, which I'll admit takes quite a bit of work when you consider how extensive these platforms are and how much they change from version to version, and this has some value. But pretty much everything these days supports an XML interface – so as long as you can map the relevant parts of the XML schema they support to your schema, any BI tool will do from an analysis perspective.

Now, Cognos was one of the first, and in the early days, one of the best, of the original BI players, but there are a number of other players that offer more-or-less equal capabilities from an analysis perspective. Furthermore, many modern players have built their tools for less than 1% of what IBM is proposing to pay for Cognos.

So why can't IBM develop their own solution for a fraction of that budget? Are they afraid they'll repeat the SAP fiasco, where SAP spent millions upon tens of millions upon probably hundreds of millions of dollars trying to build an enterprise-level data warehouse and BI system, failed, and ended up buying Business Objects for 6.8B? Is it because they're afraid it'll take too long – and that SAP will get a big lead on them in the BI marketplace? If so, then buying a solution is the right way to go – but 5 Billion? Especially when there are other solutions with the same amount of power that they could get for less than a tenth of that!

Maybe they are looking at it as a future revenue stream – at 1B a year, they'd get their money back in five years – provided that people keep paying for the solution at current prices. But what happens when their customer base realizes that there are alternative solutions that could meet their needs for a fraction of the cost?

Maybe they're also looking at it as additional manpower in sales and marketing – Cognos is known for its sales and marketing strength in the BI space. But I think IBM's doing fine in that regard.

Now I know you're saying that they're just playing the game and doing what it takes to be competitive, but I always thought IBM's goal was to be the leader, not a follower. When they realized in the 90's that if they didn't fix their supply chain they could soon be on the verge of collapse, they did it. When they realized that if they wanted to play in the chip market they had to go big or go home, they did it. And when they realized that the only way to survive in the SAN arena was to innovate, they did it. So why can't they innovate here? It's not hard – just build a small team of smart people, give them a decent budget and access to the best resources in the company, and, most importantly, give them the authority to make their own decisions (and not get crushed under the weight of the opinion of every director and his dog).

Now, I am a big believer in Business Intelligence, but I'm a bigger believer first and foremost in intelligence – and I just don't see the intelligence in overpaying for a solution when that money could be better spent on true innovation. With the right team, they could spend a tenth of that, buy the up-and-coming player with the best technology, use the knowledge of the IBM Global Services AMS migration factory (who probably know more about data migration than all of the big 5 system integrators put together) to integrate with just about every major ERP and database platform under the sun, and spend the rest on innovating new capabilities that would knock your socks off.

That’s my opinion. Any shareholders of IBM want to offer a counter-opinion? (It’s essentially your money that IBM is spending.)

Algorhythm and the Optimization Rhythm in India

Recently, I had the pleasure of having a couple of conversations with Ajit Singh, the Founder and Director of Algorhythm, a company in Pune, India that has significant expertise in Optimization and Supply Chain Modeling. They have their own optimization engine, a set of front-ends for different types of supply chain models that can be used by anyone with modeling skills, and significant experience in helping large global multi-nationals with significant supply chain network design and optimization problems. Basically, they're India's CombineNet, but with a slight distinction – every model they build, including custom models, can be executed and modified completely by the client through an extension of their easy-to-use Windows-based front end – you are not tied to their services. In comparison, although CombineNet has done a great job over the past few years of actually building stand-alone products and interfaces, it's still often the case that custom models are only available through their services model.

Algorhythm has the capabilities to attack both strategic and tactical supply chain problems from an optimization and simulation perspective. They have sophisticated models for strategic planning – including inventory optimization, distribution network design, and manufacturing network design – and for tactical execution – including production planning, logistics planning, and supply network execution.

They also have specialized solutions for oil, steel, and packaging as well as having a considerable amount of experience in creating models for manufacturers and distributors. Major clients include Unilever (Hindustan Unilever, Unilever Plc. UK, and Unilever China), Thyssen Krupp, Hindustan Petroleum, and Parle Products among dozens of others. Their manufacturing and distribution network design models often save their clients 3-5%. Remember that we’re talking production models here – not sourcing models, so this is actually quite good. In terms of efficiency, their production planning and scheduling models often halve throughput time and inventory carrying requirements – which is also very good. Furthermore, we’re not talking small models here – Parle, for example, ships 50K trucks per year per SKU from hundreds of factories to thousands of wholesalers.

It's quite easy to build a model in their products, which they call Prorhythm (for production-planning based models), Netrhythm (for network-planning based models), and Logrhythm (for logistics planning models), and which run on top of their Xtra Sensory optimization engine. They've thought through what the model is, what the core elements are that make it up, what the costs are, and what measures you might want to optimize. Building a model is simply defining all the relevant entities (which are factories, lines, outputs, inputs, etc. in production planning), the associated costs (material, labor, overhead, etc.), the measure(s) you want to optimize (cost, throughput, etc.) and their priority / weighting if there are multiple, and the constraints. It assumes all relationships between related entities are valid unless you specify them as invalid (and permits groupings for easy constraint definition). It also groups constraints in a "constraint file" so you can easily run the same model against different constraint sets. Basically, it's built to build models the way the doctor would build them.
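The modeling approach described above – entities, costs, objectives, and a separate constraint set, with every pairing assumed valid unless explicitly excluded – can be sketched in a few lines. This is a toy illustration of the idea, not Algorhythm's actual product; all the factory names, products, and costs are invented.

```python
# Hypothetical sketch of the modeling style described above: define entities
# and costs, keep constraints in a separate set (like a "constraint file"),
# and treat every factory-product pairing as valid unless marked invalid.
# All names and numbers are invented for illustration.
from itertools import product

factories = ["F1", "F2"]
products = ["P1", "P2", "P3"]

# Constraints kept separate, so the same model can be re-run
# against a different constraint set.
invalid_pairs = {("F2", "P3")}  # F2 cannot produce P3

# Unit production cost for each valid (factory, product) pairing.
unit_cost = {("F1", "P1"): 4.0, ("F1", "P2"): 5.0, ("F1", "P3"): 6.0,
             ("F2", "P1"): 3.5, ("F2", "P2"): 5.5}

def valid_assignments():
    """Everything is allowed unless explicitly marked invalid."""
    return [(f, p) for f, p in product(factories, products)
            if (f, p) not in invalid_pairs]

def cheapest_source(prod):
    """Pick the lowest-cost factory for a product among valid pairings."""
    options = [(unit_cost[(f, p)], f)
               for f, p in valid_assignments() if p == prod]
    return min(options)[1]

print([(p, cheapest_source(p)) for p in products])
```

The point of the "valid unless excluded" convention is that you only enumerate the exceptions, which keeps large models terse – swapping in a different `invalid_pairs` set re-runs the same model against a different constraint file.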

Since there is no "one" optimal solution when you're optimizing against multiple objectives – as it's almost always impossible to precisely normalize each measure to a uniform 0-1 interval that can then be weighted according to the weights you want – they also support simulation. You can tell the optimizer to construct a set number of models equally distributed around the desired optimization point, and it will automatically create and run all of the variants, which you can then compare to see how slight changes impact solutions and goals.
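To see why this matters, consider the standard weighted-sum approach: normalize each objective to 0-1, weight, and pick the highest score. The "best" answer can flip as the weights shift, which is exactly what running variants around the desired weighting exposes. A minimal sketch, with invented candidate plans and (cost, throughput) numbers:

```python
# Hypothetical sketch: weighted-sum scoring of multiple objectives, then
# re-running with weights distributed around the desired point to see how
# sensitive the "best" plan is. Plans and their values are invented.

# (cost, throughput) for three candidate plans;
# lower cost and higher throughput are better.
plans = {"A": (100.0, 50.0), "B": (120.0, 80.0), "C": (90.0, 40.0)}

def normalize(values, higher_is_better):
    """Min-max normalize onto [0, 1], where 1 is best."""
    lo, hi = min(values), max(values)
    if higher_is_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def best_plan(w_cost, w_thru):
    """Return the plan with the highest weighted score."""
    names = list(plans)
    cost_scores = normalize([plans[n][0] for n in names], higher_is_better=False)
    thru_scores = normalize([plans[n][1] for n in names], higher_is_better=True)
    scored = {n: w_cost * c + w_thru * t
              for n, c, t in zip(names, cost_scores, thru_scores)}
    return max(scored, key=scored.get)

# Run variants around the desired weighting (0.5 / 0.5) to test sensitivity.
for w in [0.3, 0.4, 0.5, 0.6, 0.7]:
    print(f"cost weight {w:.1f}: best plan = {best_plan(w, 1 - w)}")
```

With these numbers, the winner flips from the high-throughput plan to the low-cost plan as the cost weight rises past the midpoint – the kind of sensitivity that a single optimization run around one weight vector would hide.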

It’s a great offering, and the people are quite knowledgeable. If you have a tough optimization problem, be sure to check them out. They might surprise you.