Category Archives: Spend Analysis

Why Bother Classifying Spend? 3 Ways Spend Analysis Will Improve Your Life … Part I

Today’s guest post is from Brian Seipel, Spend Analysis lead at Source One Management Services, where he helps corporations gain a clear view of their spend data and derive actionable budget optimization strategies.

Let’s face it, you and your team have your collective hands full keeping the Procurement trains running each day. Adding a spend analysis initiative on top of everything else being juggled? Well, that may be one ball too many to keep in the air. It seems like an unnecessary added step you simply don’t have time for and, really, what’s the point?

Through years of working with clients to develop and execute strategic sourcing initiatives, I have found there are two camps I can sort organizations into. Which side a client lands on is indicative of how much work lies ahead in terms of helping them truly control spend. Organizations are either pro spend analysis… or barely spend any time on the subject at all.

To be fair, many organizations run a tight ship in terms of managing spend – but there’s still room for improvement for a good number of others. There are some great reasons to make a proper spend analysis a priority. As such, I wanted to take a minute to extol the virtues of this process to show some of the benefits you may be missing out on. See below for my top three reasons a proper spend analysis should be the next initiative you spend some time on.

A Tale of Two VARs

(Value Added Resellers)

First, I’d like to set the stage a bit. Consider the relationship between an organization and its IT hardware/software value-added resellers. In this scenario, we have two such VARs; one servicing the organization’s New York branch, the other servicing Philadelphia.

These two VARs have a lot in common. Both serve the same North East region, both offer stellar customer service, and so far the relationship has been good on all sides. Each office comes away satisfied after reviewing their VAR’s track record. But is that all there is to the story?

Generate More Savings

One of the most apparent (if not THE most apparent) reasons to analyze your spend is the impact such an analysis has on strategic sourcing initiatives. At the most basic level, an organization needs to know several key facts before developing a strategy around cost savings:

  • “How much money are we spending, and who is spending it?”
  • “Who is that money going to?”
  • “When are these transactions happening?”

These seem like simple enough questions, but getting the answers can be tricky. To return to our VAR example, one great way to save money with such VARs is to leverage your spend volume to negotiate rebate structures and reduced unit pricing for all purchases. The more you spend, the bigger the rebate and the greater the incentive for VARs to offer unit price discounts, and these savings can add up quickly. Consolidating spend to as few VARs as possible maximizes this strategy, and both of our VARs service the same region. However, because New York and Philadelphia each use a separate VAR, neither can negotiate as strong a rebate, and we likely won’t make much progress in commanding discounted rates. Each location may have a great relationship with its respective VAR, and Procurement wouldn’t know it was missing out on a savings opportunity until a spend analysis revealed this missing piece.
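To make the consolidation math concrete, here is a minimal sketch of how a tiered rebate schedule rewards combined volume. The tier thresholds and percentages are invented for illustration and do not come from any real VAR agreement:

```python
# Hypothetical tiered rebate schedule: thresholds and rates below are
# illustrative only, not taken from any real VAR agreement.
REBATE_TIERS = [
    (1_000_000, 0.03),  # >= $1M annual spend -> 3% rebate
    (500_000, 0.02),    # >= $500K -> 2%
    (250_000, 0.01),    # >= $250K -> 1%
]

def rebate(annual_spend: float) -> float:
    """Return the rebate earned at the highest tier the spend qualifies for."""
    for threshold, rate in REBATE_TIERS:
        if annual_spend >= threshold:
            return annual_spend * rate
    return 0.0

# Split across two VARs, neither office clears the top tier...
ny, philly = 600_000, 550_000
split_rebate = rebate(ny) + rebate(philly)       # 12,000 + 11,000 = 23,000

# ...but consolidated to one VAR, the combined volume earns 3%.
consolidated_rebate = rebate(ny + philly)        # 1,150,000 * 0.03 = 34,500
```

In this toy example, consolidating $1.15M of spend earns $34,500 at the 3% tier, versus $23,000 when the same spend is split across two VARs that each only qualify for 2% — and no amount of relationship goodwill surfaces that gap without the analysis.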

But this is just one way spend analytics will change your life.

Thanks, Brian.

Introducing LevaData. Possibly the first Cognitive Sourcing Solution for Direct Procurement.

Who is LevaData? LevaData is a new player in the optimization-backed direct material prescriptive analytics space, and, to be honest, probably the only player in that space. While Jaggaer has ASO and Pool4Tool, and its direct material sourcing is optimization-backed, and while it has VMI, it does not have advanced prescriptive analytics for selecting the vendors who will ultimately manage that inventory.

LevaData was formed back in 2014 to close the gaps the founders saw in each of the other sourcing and supply management platforms they had been a part of over the previous two decades. They saw the need for a platform that provided visibility, analytics, insight, direction, optimization, and assistance, and that is what they set out to build.

So what is the LevaData platform? It is a sourcing platform for direct materials that integrates RFX, analytics, optimization, (should-)cost modelling, and prescriptive advice into a cohesive whole that helps buyers buy better, and which, to date, has reduced costs (considerably) for every single client.

For example, the first-year realized savings for a $5B server and network company that deployed the LevaData platform was $24M; for a $2.4B consumer electronics company, it was $18M; and for a $0.6B network customer, it was $8M. To date, they’ve delivered over $100M of savings across $50B of spend to their customer base, and they are just getting started. This is due to the combination of efficiency, responsiveness, and savings their platform generates. Specifically, about 60% of the value is direct material cost reduction and incremental savings, 30% is responsiveness and being able to take advantage of market conditions in real time, and 10% is improved operational efficiency.

The platform was built by supply chain pros for supply chain buyers. It comes with a suite of analytics reports, but unlike the majority of analytics platforms, the reports are fine-tuned to bill of materials, component, and commodity intelligence. The reports can provide deep insight into not only costs by product, but costs by component and/or raw material, and can roll up and down bills of materials and raw materials to create insights that go beyond simple product or supplier reports. Moreover, on top of these reports, the platform can create cost forecasts and amortization schedules, track rebates owed, and calculate KPIs.

In order to provide the buyer with market intelligence, the application imports data from multiple market feeds, creates benchmarks, compares those benchmarks to internal market data, automatically creates competitive reports, and calculates the foundation costs for should-cost models.

And it makes all the relevant data available within the RFX. When a user selects an RFX, it can identify suppliers, identify current market costs, use forecasts and anonymized community intelligence to calculate target costs, and then use optimization to determine what the award split would be, subject to business constraints, and identify the suppliers to negotiate with, the volumes to offer, and the target costs to strive for.

It’s a first-of-its-kind application, and while some components are still basic (there is no lane or logistics support in the optimization model), missing (there is no ad-hoc report builder), or incomplete (such as collaboration support between stakeholders or a strong supplier portal for collaboration), it appears to meet the minimal requirements we laid out yesterday and could just be the first real cognitive sourcing application on the market in the direct material space.

BIQ: Alive and Well in the Opera House! Part II

Yesterday we noted that BIQ, from the sleepy little town of Southborough, which was acquired by Opera Solutions in 2012, is not only alive and well in the Opera House, but has been continually improved since its acquisition, and the new version, 5(.05), even has a capability no other spend analytics product on the market has.

So what is this new capability? We’ll get to that. First, we want to note that a number of improvements have been made since we last covered BIQ, and we’ll cover those.

Secondly, we want to note that the core engine is as powerful as ever. Since the engine and the data it operates on reside entirely in memory, it can process 1M transactions per second. Need to add a dimension? Change a measure? Recalculate a report? It’s instantaneous on data sets of 1M transactions or less, and essentially real-time on data sets of 10M transactions. Try getting that performance from your database or OLAP engine. Just try it.

One of the first big changes they made was the complete separation of the engine from the viewer. This gave them a minimal engine footprint (for in-memory execution) with a fully exposed API, which in turn allowed them to create both a full web-based SaaS version and an improved desktop application, and to expose the full power of the BIQ engine to either instance.

They used QlikView for the web interface and through this interface have created a collection of CIQ (category intelligence) and PIQ (performance intelligence) dashboards for just about every indirect category and standard performance category (supplier, operations, finance, etc.), in addition to a standard spend dashboard with reports and insights that rivals any competitor dashboard. In addition, they have exposed all of the dimensions in the underlying data, along with the measures that have been programmed, and a user can not only create ad-hoc reports, but also ad-hoc cross-tabs and pivot tables on the fly.

And they re-did the desktop interface to look like a modern analytics front-end that was built this decade. As those who saw it know, the old BIQ looked like a Windows 98 application, even though Microsoft never built anything with that amount of power. The new interface is streamlined, slick, and quick. It has all of the functionality of the old interface, plus modern widgets that are easy to rearrange, expand, minimize, and deploy.

One of the best improvements is the new data loader. It’s still file based, but it supports a plethora of file formats and can be used to transform data from one format to another or merge files into a single file or cube, picking some or all of the data. It’s quick, easy, user friendly, and can process massive amounts of data quickly, letting users know almost immediately if there are errors or issues that need to be addressed.

Another great feature is the new anomaly detection engine that can be run in parallel with BIQ, built on the best of BIQ and Signal Hub technology. Right now, they only have an instance fine-tuned to T&E spend in the procurement space, but you can bet more instances will be coming soon. And this is a great start: T&E spend is plentiful, consists of a lot of small transactions, and it’s hard to find those needles that represent off-policy spend, off-contract spend, and, more importantly, fraudulent spend. Using the new anomaly detection feature you can quickly identify when an employee is flying business instead of coach, using an off-contract airline, or, and this is key, charging pet kennels as lodging or strip club bills as executive dinners.
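The post doesn’t describe how BIQ’s detection engine works internally, but a simple z-score screen illustrates the basic idea of surfacing off-policy needles in a haystack of small T&E transactions. The data and the 2-sigma cutoff below are illustrative, not BIQ’s actual method:

```python
from statistics import mean, stdev

# Toy T&E lodging transactions (amounts in dollars); illustrative data only.
lodging = [180, 195, 210, 175, 190, 205, 185, 950]  # 950 is the planted outlier

def flag_anomalies(amounts, z_cutoff=2.0):
    """Flag amounts more than z_cutoff standard deviations from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > z_cutoff]

suspicious = flag_anomalies(lodging)  # [950]
```

A real engine would obviously segment by category, merchant, and traveller before scoring, but the principle is the same: model what normal looks like, then flag what falls far outside it.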

But this isn’t the best new feature. The best new feature is the new Open Extract capability that provides true open access to Python-based analytics in BIQ. The new version of the BIQ engine, which runs 100% in memory, includes the Python runtime and a fully integrated IDE. Any analyst or data scientist who can script Python can access and manipulate the data in the BIQ engine in real time, using constructs built specifically for this purpose. And these custom-built scripts run just as fast as the built-in scripts, as they run native in the engine. For example, you can run a Benford’s Law analysis on 1M transactions in less than a second. And building it on Python, and the Anaconda distribution in particular, means that any of the open source analytics packages from Continuum Analytics can be used. There’s nothing else like it on the market. It takes spend analysis to a whole new level.
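As an illustration of the kind of script such an embedded Python runtime could run (this is generic Python, not BIQ’s actual API), here is a quick Benford’s Law screen that compares observed leading-digit frequencies against the expected logarithmic distribution:

```python
import math
from collections import Counter

def benford_check(amounts):
    """Return (observed, expected) leading-digit frequency maps for a quick
    Benford's Law screen. Illustrative only, not BIQ's implementation.
    Assumes ordinary decimal amounts (no scientific notation)."""
    # First significant digit of each non-zero amount.
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    n = len(digits)
    observed = {d: counts.get(d, 0) / n for d in range(1, 10)}
    # Benford's expected frequency for leading digit d is log10(1 + 1/d).
    expected = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
    return observed, expected

obs, exp = benford_check([120, 13, 1500, 9.5])
```

Large deviations between the observed and expected maps (for instance, far too few 1s and far too many 9s) are a classic red flag for fabricated invoice or expense amounts.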

BIQ: Alive and Well in the Opera House! Part I

Fourteen years ago, in the sleepy little town of Southborough, Massachusetts, a tiny start-up called BIQ was created. Its mission was to give business analysts the powerful transactional data analysis tool they needed to do their own analysis and get their own insight. Less than two years later, it released that tool, called BIQ, and it totally changed the spend analysis market. For the first time, power analysts could do everything themselves in a market where spend analysis was primarily offered as a service, and they could do it at a price point at least an order of magnitude less than what the big providers were charging. With licenses starting at $36K a year, an analyst could do the same analysis he was paying a suite provider $360K for, and a best-of-breed provider $1M for. Now, it required a lot of knowledge, aesthetic blindness, elbow grease, and overtime, but it could be done.

And when we say everything, we mean everything. You could load any flat files you wanted in a standard format (such as CSV) in the data loader. You could combine them into any cubes you wanted by defining the overlapping dimensions. You could define ranged and derived dimensions using simple formulas or built-in definitions. You could drill down in real time, filter on what you wanted, and export subsets of records. You could define any categorization you wanted against any schema, with any mapping rules you wanted; the rules were organized into priority groups, given a priority order, and run most specific to least specific, so you never got a collision or a random mapping like you might in a tool where non-prioritized rules went into a database and often got applied in arbitrary order. You could define supplier families that could be reused. You could build your own cross-tab reports. It was the Swiss Army knife of analytics, at a price every organization could afford.
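The priority-ordered matching described above can be sketched in a few lines of Python. The rules, patterns, and categories below are hypothetical; the point is that running rules most specific to least specific makes the outcome deterministic:

```python
# Sketch of priority-ordered categorization: rules run most specific to
# least specific, so the first match wins and no random collisions occur.
# Rule patterns and categories are invented for illustration.
RULES = [
    # (priority, substring pattern, category); lower priority runs first
    (1, "DELL SERVER", "IT Hardware - Servers"),
    (2, "DELL", "IT Hardware"),
    (3, "", "Unclassified"),  # empty pattern matches everything: catch-all
]

def categorize(description: str) -> str:
    """Return the category of the first (most specific) rule that matches."""
    for _, pattern, category in sorted(RULES):
        if pattern in description.upper():
            return category
    return "Unclassified"

categorize("Dell Server R740")  # "IT Hardware - Servers", not generic "IT Hardware"
```

With unordered rules, the same transaction could land in "IT Hardware - Servers" or "IT Hardware" depending on which rule the database happened to apply first; the explicit priority order removes that ambiguity.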

This quickly made BIQ a favourite not just among mid-market companies that couldn’t afford, and big companies that didn’t want to afford, high-priced services, but also among niche consultancies that could now do power-house analytics projects on their own, including firms like Lexington Analytics and Power Advocate. This, along with some really smart marketing, pushed BIQ into the mainstream of spend analytics providers, making it a de-facto shortlist candidate for any company wanting do-it-yourself spend analysis. This, of course, got the attention of many providers, who were afraid of the threat, in awe of the technology, or both.

One of these providers was Opera Solutions, who acquired BIQ in 2012, and shortly after, Lexington Analytics. Once the two providers were merged, Opera Solutions instantly had a complete spend analysis software and services solution for the indirect space. And they have steadily improved this offering since its acquisition. The new version comes packed with some big enhancements, including one capability that is not only market leading, but unique among all the spend analysis providers we have covered to date.

What is that? Come back tomorrow!

The UX One Should Expect from Best-in-Class Spend Analysis … Part V

In this post we wrap up our deep dive into spend analysis and what is required for a great user experience. We take our vertical torpedo as far as it can go and wrap the series up with insights beyond what you’re likely to find anywhere else. We’ve described necessary capabilities that go well beyond the capabilities of many of the vendors on the market, and more will fall by the wayside today. But that’s okay. The best will get up, brush off the dirt, and keep moving forward. (And the rest will be eaten by the vultures.)

And forward momentum is absolutely necessary. One of the keys to Procurement’s survival (unless it really wants to meet its end in the Procurement Wasteland we described in bitter detail last week) is an ability to continually identify value in excess of 10% year-over-year. Regardless of what eventually comes to pass, the individuals who are capable of always identifying value will survive in the organizations of the future.

But if this level of value is to be identified, buyers are going to need powerful, usable analytics — much more powerful and usable than what the average buyer has today. Much more.

As per our series to date, this requires over a dozen key usability features, many of which are not found in your average first, or even second, generation “reporting” and “business intelligence” analytics tool. In our brief overview series to date here on SI (The UX One Should Expect from Best-in-Class Spend Analysis … Part I, Part II, Part III, and Part IV) we’ve covered four key features:

  • real, true dynamic dashboards,
  • simultaneous support for multiple cubes,
  • real-time idiot-proof data categorization, and
  • descriptive, predictive, and prescriptive analytics

And deep details on each were provided in the linked posts. But even prescriptive analytics, which, for many vendors, is really pushing the envelope, is not enough. Great solutions push even further. For example, the most advanced solutions will also offer permissive analytics. As the doctor has recently explained in his two-part series (Are We About to Enter the Age of Permissive Analytics and When Selecting Your Prescriptive, and Future, Permissive, Analytics System), a great spend analysis system goes beyond prescriptive and uses AI and a rules engine to enable a permissive system that will not only prescribe opportunities to find value but also initiate action on those opportunities.

For example, if the opportunity is a tail-spend opportunity best captured by a spot auction, with approved products that fit the bill and approved suppliers that can automatically be invited to bid on them, the system will automatically set up the auction and invite the suppliers; and if the total spend is within an acceptable amount, it will automatically offer an award (subject to pre-defined standard terms and conditions).
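A minimal sketch of such a permissive decision rule, with invented thresholds, field names, and action labels, might look like this:

```python
from dataclasses import dataclass

# Hypothetical permissive-analytics rule: the threshold, the Opportunity
# fields, and the action strings are all invented for illustration.
AUTO_AWARD_LIMIT = 50_000  # spend at or below this may be auto-awarded

@dataclass
class Opportunity:
    category: str
    annual_spend: float
    approved_suppliers: list

def act_on(opp: Opportunity) -> str:
    """Decide what a permissive system may do without waiting for a human."""
    if not opp.approved_suppliers:
        return "escalate: no approved suppliers to invite"
    if opp.annual_spend <= AUTO_AWARD_LIMIT:
        # Small, pre-approved tail spend: run the auction and award it.
        return "auto-auction with auto-award"
    # Larger spend: still set up the auction, but a human approves the award.
    return "auto-auction, human approves award"
```

The point of the permissive layer is exactly this kind of codified judgment: the rules engine bounds what the system may do on its own, and everything outside those bounds is routed back to a person.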

And that’s just the tip of the iceberg. For more insight into just how much a permissive analytics platform can offer, check out the doctor and the prophet’s fifth and final instalment on What To Expect from Best-in-Class Spend Analysis Technology and User Design (Part V) over on Spend Matters Pro (membership required). It’s worth it. And maybe, just maybe, when you identify, and adopt, the right solution, you won’t end up wandering the Procurement Wasteland.