Daily Archives: January 20, 2010

Is The Future of ERP Harmonization?

In a recent article over on CIO.com, Thomas Wailgum suggests that the future of ERP might be harmonization, which he defines as the happy middle ground between new advances in middleware offerings, tools from the big vendors that allow easy integration across core databases and infrastructure, and SaaS apps where appropriate.

As Thomas astutely notes,

  • for most companies, the pursuit of the single-instance dream hasn’t led to success,
  • companies can no longer afford to wait the typical five to seven years for returns on major IT investments (especially when, on average, three new hardware platforms and three new major software versions will materialize in that timeframe), and
  • the worst recession in recent history is causing most companies to ask what their ERP truly costs and what value it actually delivers, especially when most ERP vendors are trying to raise annual maintenance fees to 22% of license cost or more while delivering little or no incremental value.

Given these harsh realities, and the fact that many companies are finding that time is running out on antiquated ERP systems such as PeopleSoft, R/3, e-Business Suite, and JDE, this might finally be the turning point where companies stop pumping millions of dollars into legacy ERP systems that, for many organizations, provide little return. This is especially likely now that enterprise versions of open-source, cloud-ready ERP systems like Compiere are available for a fraction of the cost of typical ERP systems. These systems, which can come bundled with support, often cost less than 1/5th of an SAP or Oracle solution and play nice with on-demand middleware and best-of-breed SaaS solutions that implement common XML standards, allowing you to quickly assemble considerably more functionality, and value, for an up-front cost that is a drop in the bucket compared to what many traditional ERP systems cost.

In enterprise software, it’s often hard to say for certain what will happen. But I think this is the recession that will finally force the inevitable move away from “sunk cost” IT to “pay for performance” and that the enterprise of tomorrow will be different from today.

I’d also recommend checking out part II of the article on Making Sense of All That Data. Regardless of your viewpoint, Thomas makes some interesting points, especially with regard to the “Super Vendors” and the forthcoming consolidation in the traditional ERP space (which always occurs at the end of a recession, when the rich buy the poor).


What Impact Will the BI Megatrends from 2009 Have on Next Generation Spend Analysis?

An article in Intelligent Enterprise last year outlined the Nine BI Megatrends for 2009 that the author expected to reshape business intelligence and information management in the year(s) ahead. Since spend analysis is a major component of business intelligence in the supply chain, one has to wonder what impact these megatrends will have. But first, let’s review the megatrends presented in the article.

  • Open Source
    Mature, low-TCO development stacks such as LAMP (Linux, Apache, MySQL, and PHP, Perl, or Python) [or MAMP if you prefer the Mac, which, being built on a Unix foundation, is largely compatible], together with new open-source offerings from players such as Pentaho, are making open-source platforms and foundations attractive and putting pressure on commercial vendors to bring down TCO.
  • BI is becoming less isolated
    Many users are now employing the reporting, access, and analysis tools that come with their functional applications, forcing BI suites to break down silos in order to offer value.
  • Users are demanding a richer experience
    The days of simple, canned reporting are finally slipping into the past. BI portals are starting to become richer, more flexible, and more powerful. They’re using Rich Internet Application (RIA) technology to improve the user experience and incorporating mash-ups to allow users to better visualize the data.
  • BI is starting to focus on relationships
    BI used to focus on reports that offered no flexibility for investigating data relationships. New tools give users the ability to define their own relationships, cubes, and reports and to dive into the data in new and innovative ways, uncovering relationships that would classically take weeks of specialist data mining or statistical analysis to find.
  • Business Modeling meets MDM
    Master Data Management (MDM) and emerging semantic models could serve business modeling in the same way that data models, schemas, and metadata served extract, transform, and load (ETL) tools. This is enabling some vendors to create tools that tighten the link between business modeling and data modeling through graphical interfaces, allowing analysts to create their own data models without having to learn specialized languages or methodologies.
  • MapReduce meets Large Scale Data Analysis
    Although the most famous implementation belongs to Google, MapReduce is also available in the open-source Apache Hadoop framework. It allows organizations to build parallel, virtualized architectures on server farms of commodity hardware that can analyze more data simultaneously than ever before, allowing for the discovery of new relationships that can prove very insightful to BPM (see the first sketch after this list).
  • Column-oriented databases are attacking performance woes
    Some of the leading column-oriented database technologies are employing advanced compression and large-memory algorithms that are changing the game for BI and data warehouse architectures, allowing complex queries to be answered in realistic amounts of time (the second sketch after this list illustrates the idea).
  • Event Processing is opening up new analytical possibilities
    Emerging applications in healthcare, telecommunications, intelligence, IT management, gaming, and web analytics are capturing events and correlating them with analytics from BI tools to give organizations actionable insight.
  • Too Big to Fail
    As more and more queries are run against multi-billion-row tables in data warehouses managing hundreds of terabytes of data (and growing daily), we’ll see more and more BI implemented to improve BI.
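
To make the MapReduce trend a little more concrete, here is a minimal sketch of the programming model in plain Python (not Hadoop itself, and the transaction records and field names are invented for illustration). The map phase emits key/value pairs, the shuffle groups values by key, and the reduce phase aggregates each group independently, which is what makes the model so easy to spread across a farm of commodity servers:

    from collections import defaultdict

    # Toy spend transactions; in a real Hadoop job these would be split
    # across many nodes, each running the map function on its own slice.
    transactions = [
        {"supplier": "Acme",   "amount": 1200.00},
        {"supplier": "Globex", "amount": 450.50},
        {"supplier": "Acme",   "amount": 310.25},
    ]

    def map_phase(record):
        """Emit a (key, value) pair -- here, supplier and spend amount."""
        yield record["supplier"], record["amount"]

    def reduce_phase(key, values):
        """Combine all values emitted for one key -- here, total spend."""
        return key, sum(values)

    # Shuffle step: group every emitted value under its key.
    grouped = defaultdict(list)
    for record in transactions:
        for key, value in map_phase(record):
            grouped[key].append(value)

    # Reduce step: one independent, parallelizable call per key.
    totals = dict(reduce_phase(k, v) for k, v in grouped.items())
    print(totals)  # {'Acme': 1510.25, 'Globex': 450.5}

Each reducer only sees the values for its own key, so a spend-by-supplier (or by-commodity, or by-month) roll-up over billions of rows can be spread across as many machines as you can afford.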

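And here is an equally minimal sketch (again plain Python with invented data, standing in for a real column store) of why column orientation helps analytics: a query that aggregates one field only has to scan that field's column, rather than reading every full row the way a row store must.

    # Row-oriented layout: each record is stored, and read, as a whole.
    rows = [
        ("PO-001", "Acme",   "Office Supplies", 1200.00),
        ("PO-002", "Globex", "IT Hardware",      450.50),
        ("PO-003", "Acme",   "IT Hardware",      310.25),
    ]

    # Column-oriented layout: one contiguous (and highly compressible)
    # array per attribute.
    columns = {
        "po":       ["PO-001", "PO-002", "PO-003"],
        "supplier": ["Acme", "Globex", "Acme"],
        "category": ["Office Supplies", "IT Hardware", "IT Hardware"],
        "amount":   [1200.00, 450.50, 310.25],
    }

    # Total spend from the row store: every field of every row is touched.
    total_from_rows = sum(amount for (_, _, _, amount) in rows)

    # Total spend from the column store: a single array is scanned.
    total_from_columns = sum(columns["amount"])

    assert total_from_rows == total_from_columns == 1960.75

When the amount column is the only one read (and it compresses well because similar values sit together), a complex aggregate over a multi-billion-row table becomes answerable in a realistic amount of time.
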
So what does this mean for spend analysis? With the exception of MapReduce and column-oriented databases, not much. The reality is that It’s the Analysis, Stupid, and anything that doesn’t simplify analysis while increasing the analytical power available to the user won’t stay on the radar very long. That’s why I’m pleased to inform you that Eric Strovink’s new series on Spend Analysis starts within a week. As I’m sure it will be as informative and forward-looking as his last two (linked in Spend Rappin’), I’m certainly looking forward to it!
