Category Archives: Spend Analysis

Ignite Wants to Spark Your Sourcing Success with Actionable Analytics!

the doctor has written about many Spend Analysis vendors over the last decade*, including Ignite Procurement, which is one of the few newer vendors that he expected would soon break out of their (Nordic) niche and start expanding, something they are now starting to do, having tripled their customer count in the last six months to over 200 customers.

The reason for this expectation? They earned their top-right status in the Spend Matters Spend Analysis Solution Map back when the (quadrant) maps still existed and the doctor, who was responsible for the grading, rated them a Top 5 Best-of-Breed Mid-Market focussed Spend Analysis player located in Western Europe / the UK.

Founded by ex-BCG (Boston Consulting Group) consultants in 2017, with the first version of the platform launching in 2018, the key players not only have a firm understanding of spend analysis, but also of what analysis capabilities a customer needs to find their spend waste and their opportunities. Plus, they understood from day one that spend analysis in a vacuum is not that meaningful, and that it is most meaningful in the context of negotiations, supplier management and development, contract obligation management, labour compliance, and ESG reporting, for example. Negotiations work best when they are fact based; supplier development efforts are best focussed on those suppliers where improvements would result in considerable cost reductions or value generation; contracts are meaningless if not adhered to (and the resulting overspend is completely unnecessary); labour violations in the supply chain can result in huge fines for your organization; and exceeding your carbon caps can be even more expensive (not to mention the fines if you don't properly report). (And yes, this is a bit of foreshadowing.)

The core of the Ignite “Spend Management” solution is the analytics offering which, like most spend analysis solutions, has two core components:

1. Data Management

It’s very easy to get data into the Ignite Platform. In fact, it can be as easy as dragging-and-dropping a file onto the browser pane, as Ignite allows you to define all of your taxonomies as well as your standard file format mappings to those taxonomies. It can then detect whether the data is new, incremental, or updated and allow you to add all records, add only new records, update existing records, or even load the file into an entirely new cube.
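To make those load modes concrete, here's a minimal pandas sketch of the add / add-only-new / update / new-cube decision; the transaction_id key and the function itself are our illustration, not Ignite's actual implementation:

```python
import pandas as pd

def load_transactions(cube: pd.DataFrame, incoming: pd.DataFrame,
                      key: str = "transaction_id",
                      mode: str = "merge") -> pd.DataFrame:
    """Combine an incoming file with an existing spend cube.

    mode = "append"   : add all incoming records
    mode = "new_only" : add only records whose key is not already in the cube
    mode = "merge"    : update existing records in place, append the rest
    mode = "new_cube" : load the file into an entirely new cube
    """
    if mode == "append":
        return pd.concat([cube, incoming], ignore_index=True)
    if mode == "new_only":
        new = incoming[~incoming[key].isin(cube[key])]
        return pd.concat([cube, new], ignore_index=True)
    if mode == "merge":
        kept = cube[~cube[key].isin(incoming[key])]  # drop rows being updated
        return pd.concat([kept, incoming], ignore_index=True)
    if mode == "new_cube":
        return incoming.copy()
    raise ValueError(f"unknown load mode: {mode}")
```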

When it comes to taxonomies, you can start with your own or a built-in one and then, during analysis, define and redefine taxonomies on the fly, with reclassification as simple as dragging-and-dropping. You can also define and update mapping rules quickly and easily, fixing errors or updating classifications as the need arises.

Data enrichment is easy-peasy compared to generic analytic platforms or suites, as they support a number of financial metric, industry classification, currency exchange, commodity intelligence, CO2 emission, and risk metric sources out of the box, and provide a full integration platform with APIs, pre-built connectors, and timed data pulls (via SFTP, for example) for those who need custom integrations.

The platform also supports multiple tables and spend cubes and allows you to work on global tables and cubes or local tables and cubes for what-if analysis.

But most importantly, it’s one of the few best-of-breed platforms with a fully integrated visual data flow manager where you can define the entire loading, mapping, enrichment, classification, and automated analytics, reporting, and notification process, including automated supplier normalization.
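Conceptually, such a flow is just an ordered chain of stages applied to the spend table. A minimal sketch of the idea, with stage and column names that are our own illustration rather than Ignite's:

```python
from typing import Callable
import pandas as pd

# Each stage takes and returns a DataFrame, mirroring the boxes you would
# wire together in a visual flow editor.
Stage = Callable[[pd.DataFrame], pd.DataFrame]

def run_flow(df: pd.DataFrame, stages: list[tuple[str, Stage]]) -> pd.DataFrame:
    for name, stage in stages:
        df = stage(df)
        print(f"stage '{name}' complete: {len(df)} rows")
    return df

flow: list[tuple[str, Stage]] = [
    # Supplier normalization: trim and case-fold raw supplier names.
    ("normalize_suppliers", lambda df: df.assign(
        supplier=df["supplier"].str.strip().str.upper())),
    # Enrichment: convert amounts using a pre-joined fx_rate column.
    ("enrich_fx", lambda df: df.assign(amount_eur=df["amount"] * df["fx_rate"])),
    # Classification: map GL codes to taxonomy categories.
    ("classify", lambda df: df.assign(category=df["gl_code"].map(
        {"6100": "IT", "6200": "Travel"}).fillna("Unclassified"))),
]
# cleansed = run_flow(raw_spend, flow)
```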

2. Analytics

The Ignite Platform has just about everything you would expect from a modern analytics platform including arbitrary dimension selection, formula-based dimension derivation, easy (powerful metric based) filters, multiple chart and widget types, easy drill down, easy view/report modification, and ad-hoc analytics.

It also comes with a full suite of out-of-the-box analytics to help you identify potential savings opportunities through contracting (off-contract spend), supplier (re)negotiation, supplier benchmark improvements, spend consolidation, invoice management, payment term rationalization, price improvement, etc. Contract coverage, PO coverage, key supplier coverage, and a suite of KPI reports are also available out-of-the-box (including a spend development dashboard that can go beyond just spend to impacting metrics such as OTD, quality incidents, etc., if you track the data in the supplier management module).

Its ability to identify supplier-based (re)negotiation and development opportunities is extremely good, and is based on its proprietary Ignite Matrix, which maps “share of wallet” vs. EBIT margin (which it calculates using mandatory government disclosures and integration with appropriate feeds, such as Enin in the Nordics, with appropriate adjustments) and scatter plots the results to help you quickly identify where your business is contributing to a supplier’s high profit margin (and where the supplier has room to negotiate without jeopardizing its stability).
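For illustration, here is a rough Python sketch of the matrix's two axes; the field names, thresholds, and sample figures below are ours, not Ignite's, and the proprietary matrix applies adjustments we are not privy to:

```python
import pandas as pd

# Hypothetical inputs: your annual spend per supplier, plus revenue and EBIT
# pulled from mandatory disclosures / registry feeds (e.g. Enin in the Nordics).
suppliers = pd.DataFrame({
    "supplier":   ["Alpha AS", "Beta AB", "Gamma Oy"],
    "your_spend": [2_400_000, 900_000, 150_000],
    "revenue":    [8_000_000, 45_000_000, 3_000_000],
    "ebit":       [1_600_000, 1_350_000, 60_000],
})

# The two axes of the scatter plot.
suppliers["share_of_wallet"] = suppliers["your_spend"] / suppliers["revenue"]
suppliers["ebit_margin"] = suppliers["ebit"] / suppliers["revenue"]

# High share of wallet + high EBIT margin: the supplier depends on your
# business and has room to concede price without jeopardizing its stability.
candidates = suppliers[(suppliers["share_of_wallet"] > 0.10) &
                       (suppliers["ebit_margin"] > 0.10)]
print(candidates[["supplier", "share_of_wallet", "ebit_margin"]])
```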

On top of this they have also built:

3. Supplier Management (Information / Performance / Risk / Compliance)

With its strong data management underpinnings, the Ignite Spend Management platform can store any and all supplier-related data you wish to track and analyze, which allows not only deep spend-related insights by supplier, but also performance and risk (metric) insights by supplier, with the ability to track and compare over time.

In addition to providing a full Supplier 360 view across all data captured in the platform, the Supplier Management capability includes standard campaign management, where a buyer or supplier manager can create questionnaires for supplier data augmentation and the collection of relevant data and documents for supplier performance / risk / compliance management.

4. Contract Management (Governance)

Due to its strong data management underpinnings, the platform can also store all relevant contract meta-data in addition to the contract documents, and it allows users to manage, report on, and automatically annotate spend that is covered by a contract (as well as determine whether it was billed, and paid, at the contracted rate using the appropriate payment terms). Also, as with most contract management plays, it supports tasks and alerts and the linking of contracts to tasks and alerts.
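As a rough illustration of that rate-compliance check, here is a minimal pandas sketch (hypothetical schemas and sample data, not Ignite's actual logic) that flags off-contract spend and quantifies overbilling against the contracted rate:

```python
import pandas as pd

# Contracted rates per supplier/item vs. invoiced lines (hypothetical data).
rates = pd.DataFrame({
    "supplier": ["Acme", "Acme"],
    "item": ["widget", "gadget"],
    "contract_rate": [10.00, 25.00],
})
invoices = pd.DataFrame({
    "supplier": ["Acme", "Acme", "Other Co"],
    "item": ["widget", "gadget", "widget"],
    "unit_price": [10.00, 27.50, 11.00],
    "quantity": [100, 40, 10],
})

lines = invoices.merge(rates, on=["supplier", "item"], how="left")
lines["off_contract"] = lines["contract_rate"].isna()  # no rate on file
# Overspend = (billed price above contracted rate) x quantity.
lines["overbilled"] = ((lines["unit_price"] - lines["contract_rate"])
                       .clip(lower=0).fillna(0) * lines["quantity"])
print(lines[["supplier", "item", "off_contract", "overbilled"]])
```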

5. Scope 3 Management / Carbon Accounting

The best foundation for a carbon calculator / carbon reporting application is a true analytics platform that can support the definition of all of the appropriate Scope 1, 2, and 3 Categories of relevance to the organization and/or required by the appropriate authority to which reports must be made; the integration of data feeds to allow for the appropriate carbon emission calculations; the collection of actual data from suppliers that can supply it; the generation of the appropriate reports with the appropriate calculations for mandated reporting; and the tracking of changes over time. This is precisely what the Ignite Procurement platform supports.
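For the spend-based portion of Scope 3, the underlying arithmetic is simple: estimated emissions = spend x a category emission factor, with supplier-reported actuals overriding the estimate where suppliers can supply them. A minimal sketch, with illustrative factors and hypothetical column names (real factors come from published emission factor databases):

```python
import pandas as pd

# kgCO2e per EUR of spend, by category (illustrative numbers only).
FACTORS = {"IT Hardware": 0.45, "Road Freight": 1.10, "Consulting": 0.12}

spend = pd.DataFrame({
    "supplier": ["Alpha", "Beta", "Gamma"],
    "category": ["IT Hardware", "Road Freight", "Consulting"],
    "spend_eur": [500_000, 120_000, 300_000],
    # Actuals collected from suppliers that can supply them (else None).
    "supplier_reported_kgco2e": [None, 98_000.0, None],
})

# Spend-based estimate, overridden by supplier-specific data where available.
spend["estimated_kgco2e"] = spend["category"].map(FACTORS) * spend["spend_eur"]
spend["kgco2e"] = spend["supplier_reported_kgco2e"].fillna(spend["estimated_kgco2e"])
print(spend.groupby("category")["kgco2e"].sum())
```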

The entire platform is easy to use and the UX is quite modern, but you don’t have to take our word for it — you can see a three minute demo on their webpage … just scroll down to the Meet Ignite Procurement section. So if you’re looking for an analytics platform that can provide actionable spend insights on your contracts, suppliers, and ESG that you can act on to reduce waste and increase value, you should make sure that Ignite Procurement is on your shortlist, especially if you are in its current target marketplace in the EU/UK.

* Not all on SI, many write-ups are on Spend Matters, behind the revised paywall.

AnyData: Your Mid-Market BoB Analytics Solution for Opportunity Identification, Compliance Tracking, and Associated Project Management

AnyData, which we first covered on Sourcing Innovation back in 2017 (in AnyData: Another Analytics Arriviste from Across the Atlantic), has matured significantly since then, especially with its addition of auto-rule generation using AI in 2020 (as chronicled in AnyData, making spend analytics accessible by anyone! [Or, ‘The mid-market analytics quandary’] over on the Spend Matters Content Hub, Pro/Insider subscription required).

Starting out as a Visual Development Framework / Data Hub, AnyData progressed into a good analytics offering, augmented first by a good contract management (governance) offering and then by a good supplier management (primarily supplier data hub) offering, and that’s about where it stood until 2020, when it significantly improved its analytics offering from a usability perspective and began the journey to take the rest of the platform from good to great.

Over the last two years, it has continued to refine its analytics offering from a quick-start/ease-of-use perspective, where it has you up and running and doing real analytics in a guided self-service model in as little as 15 minutes; added key field extraction and [better] analytics to its contract management module; included ESG/Sustainability support in its Supplier Management module; and built a brand new Project (Activity) (Savings) Tracking module with default templates for analytics-based savings projects, contract [re-]negotiation projects, and supplier data gathering/compliance projects. In addition, using their rapid development capability, they can build custom, smart project forms for any project type an organization wants to track very quickly (typically a few hours, a day at most) for a low fee [in addition to the standard templates included in the base subscription].

Since introducing their enhanced analytics offering three years ago with AI-based rule-derivation and easy re-classification, they have enhanced each part of the process to guide the average Procurement Practitioner and fledgeling analyst through each step in a best-practice way that teaches them the basics while allowing them to go wide and deep as soon as they are ready.

For most clients, the AnyData process is this:

  1. Upload the file
  2. Determine if you want to
    • load the data Fresh,
    • Merge the data in, or
    • merge new data in while Replacing existing data for a time-period
  3. Select a Taxonomy (Standard, Industry Best Practice, or Tailored)
  4. Configure the Automation Process as needed
    (fields to analyze, priority, confidence intervals, etc. … or use the defaults)
  5. Review the Business KPI Dashboard and, optionally, drill into the Categorization Data Quality Dashboard
    • Optionally: Define rules for any significant or relevant unclassified spend (note that 100% mapping is not necessary, and not achievable by ANY solution on the market; 90%+ of the spend in a category is enough)
    • Optionally: Override the auto-generated rules, especially if you have a few products or services you treat atypically (for the chosen taxonomy; e.g. printer cartridges in electronics vs. office supplies)
    • Optionally: Review, drag and drop the taxonometric (sub)categories as desired
    • Optionally: Enhance data/classifications as needed
  6. Review the Significant Opportunity Dashboards
  7. Dive into the Dashboards of Interest,
    which you can modify as needed to update / add analysis in a visual manner as needed
  8. Kick off one or more analytics/savings projects on an opportunity of interest
  9. Repeat from any step in the process as needed

A few points of note:

  • Through the SaaS front-end, file size is limited by browser limits, but since they support compressed file uploads (and real-time decompression), that can easily be a 20GB file (uncompressed); if the initial file is too big, they support SFTP
  • AnyData is fast: a file with under a million rows of data takes only minutes to fully process
  • AnyData has best-practice starting taxonomies for multiple industries and can provide you with one upon setup
  • They tie their industry best-practice taxonomies to the standard taxonomies (like UNSPSC) and can make use of multiple data points for classification and for the industries they know well, their default rule generation will correctly classify 80%+ out of the gate
  • Their visual focus and the ability to drag and drop categories makes taxonomy classification super easy (and the ability to drag and drop fields makes rule construction easy as well)
  • Rules can be on any combination of fields and use multiple types of matching (exact, partial, regex, etc.); see the sketch after this list
  • The Data Quality dashboard gives you a quick overview of the quality of data you uploaded (and the confidence you can have in the auto-generated rules without reviewing any manually, until you identify a need to do so in a category deep-dive)
  • The opportunity dashboards identify the best opportunities uncovered in the automated out-of-the-box analysis
  • It’s a click to start a savings/analytics project
  • You can jump back to any step at any time and continue on (down a different path) …
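As promised above, here is a minimal sketch of how such multi-field, multi-match-type rules can work; the rule format, field names, and first-match-wins ordering are our own illustration, not AnyData's actual engine:

```python
import re
import pandas as pd

# A rule fires only when every condition matches; conditions can be exact,
# partial (substring), or regex matches on any field.
RULES = [
    {"category": "Office Supplies > Toner",
     "conditions": [("supplier", "partial", "staples"),
                    ("description", "regex", r"\b(toner|cartridge)\b")]},
    {"category": "IT > Software",
     "conditions": [("gl_code", "exact", "6410")]},
]

def matches(row: pd.Series, field: str, kind: str, value: str) -> bool:
    cell = str(row.get(field, "")).lower()
    if kind == "exact":
        return cell == value.lower()
    if kind == "partial":
        return value.lower() in cell
    if kind == "regex":
        return re.search(value, cell) is not None
    raise ValueError(f"unknown match type: {kind}")

def classify(row: pd.Series) -> str:
    for rule in RULES:  # first matching rule wins; order encodes priority
        if all(matches(row, f, k, v) for f, k, v in rule["conditions"]):
            return rule["category"]
    return "Unclassified"

df = pd.DataFrame({"supplier": ["Staples Inc"],
                   "description": ["HP toner cartridge, black"],
                   "gl_code": ["6110"]})
df["category"] = df.apply(classify, axis=1)
```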

It’s tailored for quick start, quick execution, and quick time-to-value for a mid-market organization staffed by Procurement Professionals who are short on time and short on analytics training, as it trains those procurement professionals how to do proper analytics through (semi-)guided workflows as they go.

If you’re a Mid-Market organization looking for a Best-of-Breed (BoB) analytics solution, AnyData should definitely be on your short list, especially since it’s one of the few offerings that can be obtained at an incredibly affordable price point due to the high DiY nature of the tool (and the focus on self-selection and self-serve SaaS sales). There aren’t many tools where you can get enterprise subscriptions starting at less than 2K/month, and fewer still that equal AnyData.

It Doesn’t Matter Where You Start, You End with BoB in Analytics!

In a recent article, we asked, in the battle of Suite vs. BoB (Best-of-Breed), which do you choose, and ended up with the answer of neither, but potentially both, because, as indicated in our post on Where’s the Procurement Management Platform, you need a true platform (one that enables the creation of a true source-to-pay plus ecosystem for the various workflows and processes that need to be managed).

As a result, we indicated you could start where you wanted, provided:

  • you could conceivably manage it,
  • the vendor offers, and publicly publishes, a complete Open API, and
  • the vendor offers the necessary quick-start services.

(And for even more details on each of these requirements, stay tuned for our upcoming article on how it doesn’t matter where you start, you end with BoB in SXM).

But where do you end up? For some Procurement Practitioners, it depends on:

  • the module,
  • the organization’s biggest need for workflow/process management, and
  • the organization’s biggest savings/cost avoidance/value creation opportunities.

(And again, we’ll have even more details in our upcoming article on how you end with BoB in SXM.)

But for Analytics, like SXM, you will end at BoB, as no suite equals the best-in-class (BiC) (spend) analytics solutions (even if the suite is built on BiC technologies for generic analytics, like Qlik or Tableau), because the true BoB spend analysis solutions (which are fewer and further between than you would expect) are leagues beyond them.

Moreover, for Analytics, you should start with BiC, even if the suite has a pre-packaged solution that’s pretty good, enough to get going, more than your fledgeling analysts are likely to be able to handle in the first year, and appears to be offered cheap as an add-on to everything else they are selling you. Why?

Lots and lots of reasons. Here are five to get you started:

  • Top X Opportunities: Suites will only show you your top 10 categories, top 10 suppliers, top unmanaged tail categories, etc. There is no guarantee that these primitive, canned analyses will surface YOUR biggest opportunities. BoB will come with hundreds of built-in analytics, considerably more customization capability, and the power to find opportunities that pre-built suites and dashboards will never give you.
  • Better Classification: Suites will do a decent classification, usually through their black box AI (trained on billions and trillions), but even if they get to 95%, it won’t be great, it won’t be manageable, and it won’t be customizable to your organization’s needs. BoB, when it uses AI, will use it to create rules that can be corrected and overridden, and that you can customize to your specific taxonometric needs for optimized Procurement (and no standard industry classification is worth its weight in protactinium), usually starting with an out-of-the-box taxonomy customized to your industry using the vendor’s experience and community knowledge.
  • Better Analytics: many of these tools have a lot more capability in terms of report construction, dimension derivation, metric support, integrated data science, etc. etc. etc.
  • Better UX: while UX is completely subjective, and as per a (previous/upcoming) rant, is not something an analyst should be scoring and advising you on (as the best UX is the one that works best for you), in general, the probability is very high that you will find these BoB tools more customizable in workflow and configuration, more logical in workflow, and much easier to use (if this wasn’t the case, no one would buy these tools and the vendors would have closed their [virtual] doors a long time ago)
  • Beyond Analytics: most BoB solutions will have integrated opportunity selection and project/savings tracking, performance/throughput/project metric support, and/or risk-based analytics. The value of analytics is continually overlooked because the “Savings” is identified in the sourcing event, captured in the contract, and realized in Procurement, and no one wants to acknowledge the opportunity would not even have been identified without analytics.

And, finally, why not get used to using a best-in-class tool from the get-go so you don’t have to relearn a new tool when you max out the capabilities of the suite solution and are ready for the next level? Especially when, as you get better and better at analytics and dive deeper and deeper into categories, you can improve the taxonometric mappings, track all the opportunities you identify (and your progress), do what-if analysis when the mood strikes, and get productive in a tool that will do [much, much] more for you in the long run?

So, while you might select a suite SIM module as a foundation for your supplier data store when you need to start centralizing supplier data somewhere for your sourcing projects and procurement buys (which is where your organization has determined it needs to start its S2P journey), when you’re ready for analytics, just go straight to BoB. (And if the C-Suite wants to see reports in the fancy suite, buy the basic reporting package and let them use the basic dashboards. And if the suite supports custom dashboards, then pump the appropriate analytics back in as reporting data. Get good with best-in-class analytics from the get-go with the best solution you can.)

The 39 Steps … err … The 39 Clues … err … The 39 Part Series to Help You Figure Out Where to Start with Source-to-Pay

Figuring out where to start is not easy, and it is often not where the majority of vendors or consultants say you should start. They’ll have great reasons for their recommendations, which will typically be true, but they will be the subset of reasons that most benefits them (as it will sell their solution), and not necessarily the subset of reasons that most benefits you now. While you will likely need every module there is in the long run, you can often only start with one or two, and you need to focus on what’s the greatest ROI now to prove the investment and help you acquire funds for more capability later, when you are ready for it. But figuring out how much you can handle, what the greatest needs are, and the right starting point isn’t easy, and that’s why SI dove into this topic, with arguments, explanations, and module overviews, both broader and deeper than any analyst firm or blogger has done before. Enjoy!

Introductory Posts:
Part 1: Where Do You Start?
Part 2: Where Should You Start?
Part 3: You Start with …
Part 4: e-Procurement, and Here’s Why.

e-Procurement
Part 5: Defining an e-Procurement Baseline
Part 6: There are Barriers to Selecting an e-Procurement Solution (and they are not what you think)
Part 7: Over 70 e-Procurement Companies to Check Out

Interlude 1
Part 8: What Comes Next?

Spend Analysis
Part 9: Time for Spend Analysis
Part 10: What Do You Need for A Spend Analysis Baseline, I
Part 11: What Do You Need for A Spend Analysis Baseline, II
Part 12: Over 40 Spend Analysis Vendors to Check Out

Interlude 2
Part 13: But I Can’t Touch the Sacred Cows!
(including Over 20 SaaS, 10 Legal, and 5 Marketing Spend Management / Analysis Companies to Check Out)
Part 14: Do Not Stop At Spend Analysis!

Supplier Management
Part 15: Supplier Management is a CORNED QUIP Mash
Part 16: Supplier Management A-Side
Part 17: Supplier Management B-Side
Part 18: Supplier Management C-Side
Part 19: Supplier Management D-Side
Part 20: Over 90 Supplier Management Companies to Check Out

Contract Management
Part 21: Time for Contract Management
Part 22: Contract Management is a NAG: Let’s Start with Negotiation
Part 23: Contract Management is a NAG: Let’s Continue with [Contract] Analytics
Part 24: Contract Management is a NAG: Let’s End with [Contract] Governance
Part 25: Over 80 Contract Management Vendors to Check Out

e-Sourcing
Part 26: Time for e-Sourcing
Part 27: Breaking Down the ORA of Sourcing Starting With RFX
Part 28: Breaking Down the ORA of Sourcing Continuing with e-Auctions
Part 29: Breaking Down the ORA of Sourcing Ending with [Strategic Sourcing Decision] Optimization
Part 30: Over 75 e-Sourcing Vendors to Check Out!

Invoice-to-Pay (I2P):
Part 31: Time for Invoice-to-Pay
Part 32: Breaking Down the Invoice-to-Pay Core
Part 33: Over 75 Invoice-to-Pay Companies to Check Out

Orchestration:
Part 34: How Do I Orchestrate Everything?
Part 35: Do I Intake, Manage, or Orchestrate?
Part 36: Over 20 Intake, [Procurement] [Project] Management, and/or Orchestration Companies to Check Out
Part 37: Investigating Intake By Diving In to the Details
Part 38: Prettying Up the Project with Procurement Project Management
Part 39: Deobfuscating the Orchestration and Fitting it All Together

Reporting is Not Analysis — And Neither Are Spreadsheets, Databases, OLAP Solutions, or “Business Intelligence” Solutions

… and one of the best explanations the doctor has ever read on this topic (which he has been writing about for over two decades) was just published over on the Spendata blog on Closing the Analysis Gap. Written by the original old grey beard himself (who arguably built the first standalone spend analysis application back in 2000 and then redefined what spend analysis was not once, but twice, in two subsequent start-ups that built two entirely new analytics applications taking a completely different, more in-depth approach), it’s one of the first articles to explain why every current general-purpose solution you’re using to try and do analysis doesn’t actually do true analysis, and why you need a purpose-built analysis solution if you really want to find results and, in our world, do some Spend Rappin’.

We’re not going to repeat the linked article in its entirety, so we’ll pause for you to go read it …

 

… we said, go to the linked article and read it … we’ll wait …

 

READ IT! Then come back. Here’s the linked article again …

 

Thank you for reading it. Now we’ll continue.

As summarized by the article, we have the following issues:

| Tool | Issue | Resolution | Loss of Function |
| --- | --- | --- | --- |
| Spreadsheet | Data limit; lack of controls/auditability | Database | No dependency maintenance; no hope of building responsive models |
| Database | Performance on transactional data (even with expert optimization) | OLAP Database | Data changes are offline only & tedious; what-if analysis is non-viable |
| OLAP Database | Interfaces, like SQL, are inadequate | BI Application | Schema freezes to support existing dashboards; database read-only |
| BI Application | Read-only data and limited interface functionality | Spreadsheets | Loss of friendly user interfaces and data controls/auditability |

In other words, the cycle of development from stone-age spreadsheets to modern BI tools, which was supposed to take us from simple calculation capability to true mathematical analysis in the space age using the full breadth of mathematical techniques at our disposal (both built-in and through linkages to external libraries), has instead taken us back to the beginning to begin the cycle anew, while trying to devour itself like an Ouroboros.


[Image: an Ouroboros devouring its own tail. Source: Wikipedia]

 

Why did this happen? The usual reasons. Partly because some of the developers couldn’t see a resolution to the issues when they were first developing these solutions, or at least a resolution that could be implemented in a reasonable timeframe; partly (and sometimes mostly) because vendors were trying to rush a solution to market (to take your money); and partly (and sometimes largely) because the marketers keep hammering the message that what they have is the only solution you need until all the analysts, authors, and columnists repeat the same message to the point they believe it. (Even though the users keep pounding their heads against the keyboard when given a complex analysis assignment they just can’t do … without handing it off to the development team to write custom code, or cutting corners, or making assumptions, or whatever.) [This could be an entire rant on its own about how the rush to MVP and marketing mania sometimes causes more ruin than salvation, but considering volumes still have to be written on the dangers of dunce AI, we’ll have to let this one go.]

The good news is that we now have a solution you can use to do real analysis, and this is much more important than you think. The reality is that if you can’t get to the root cause of why a number is as it is, it’s not analysis. It’s just a report. And I don’t care if you can drill down to the raw transactions that the analysis was derived from, that’s not the root cause, that’s just supporting data.

For example, “profit went down because warranty costs increased 5%” is not helpful. Why did warranty costs go up? Just being able to trace down to the transactions where you see 60% of that increase is associated with products produced by Substitional Supplier is not enough (and in most modern analysis/BI tools, that’s all you can do). Why? Because that’s not analysis.

Warranty costs increasing 5% is the inevitable result of something that happened. But what happened? If all you have is payables data, you need to dive into the warranty claim records to see what happened. That means you need to pull in the claim records, and then pull out the products and original customer order numbers and look for any commonalities or trends in that data. Maybe after pulling all this data in you see that, of the 20 products you are offering (where each would account for 5% of the claims if all things were equal), there are 2 products that account for 50% of the claims. Now you have a root cause of the warranty spend increase, but not yet a root cause of what happened, or how to do anything about it.

To figure that out, you need to pull in the customer order records and the original purchase order records and link the product sent to the customer with a particular purchase order. When you do this, and find out that 80% of those claims relate to products purchased on the last six monthly purchase orders, you know the products that are the problem. You also know that something happened six months or so ago that caused those products to be more defective.
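To make those two drill-down steps concrete, here is a minimal pandas sketch (toy data, hypothetical schemas) of the product-share and PO-month concentration analysis just described:

```python
import pandas as pd

# Toy extracts of the three systems involved in the drill-down.
claims = pd.DataFrame({
    "claim_id": range(1, 7),
    "product": ["SWX-100", "SWX-100", "SWX-200", "SWX-200", "VLV-300", "SWX-100"],
    "customer_order": ["C1", "C2", "C3", "C4", "C5", "C6"],
})
orders = pd.DataFrame({"customer_order": ["C1", "C2", "C3", "C4", "C5", "C6"],
                       "purchase_order": ["P7", "P8", "P8", "P9", "P2", "P9"]})
pos = pd.DataFrame({"purchase_order": ["P2", "P7", "P8", "P9"],
                    "po_month": ["2023-01", "2023-06", "2023-07", "2023-08"]})

# Step 1: which products claim far more than their "fair share"?
# (With 20 products, the fair share would be 5%; flag anything way above it.)
share = claims["product"].value_counts(normalize=True)
suspects = share[share >= 0.25].index

# Step 2: tie the suspect claims back to the POs that sourced the units;
# a concentration in recent months pinpoints when the defect was introduced.
linked = (claims[claims["product"].isin(suspects)]
          .merge(orders, on="customer_order")
          .merge(pos, on="purchase_order"))
print(linked.groupby("po_month").size())
```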

Let’s say both of these products are web-enabled remote switch control boxes that your manufacturing clients use to remotely turn on-and-off various parts of their power and control systems (for lighting, security monitoring, etc.) and you also have access, in the PLM system, to the design, bill of materials (BOM), and tier 2 suppliers and know a change takes 30 to 60 days to take effect. So you query the tier 1 BOM from 6, 7, 8, and 9 months ago and discover that 8 months ago the tier 2 supplier for the logic board changed (and nothing else) for both of these units. Now you are close to the root cause and know it is associated with the switch in component and/or supplier.

At this point you’re not sure if the logic board is defective, the tier 1 supplier is not integrating it properly, or the specs aren’t up to snuff, but as you have figured out this was the only change, you know you are close to the root cause. Now you can dive in deep to figure out the exact issue, and work with the engineering team to see if it can be addressed.

You continue with your analysis of all available data across the systems and, after diving in, you see that, despite the contract requiring that any changes be signed off by the local engineering team only after they do their own independent analysis to verify the product meets the specs and all quality requirements, engineering signed off on the specs but never signed off on the quality tests, which were never submitted. You can then place a hold on all future orders for the product, get on the phone with the tier 1 supplier and insist they expedite 10 units of the logic board by air freight for quality testing, and get on the phone with engineering to make sure they independently test the logic boards as soon as they arrive.

Then, when the product, which is designed for 12V power inputs, arrives and the engineers do their stress tests and discover that the logic board, which was spec’ed to be able to handle voltage spikes to 15V (because some clients power backup systems off of battery backups that run off of chained automotive batteries), actually burns out at 14V, you have your root cause. You can then force the tier 1 supplier to go back to the original board from the original supplier, or find a new board from the current supplier that meets the spec … and solve the problem. [And while it’s true you can’t assume that all of the failure increases were due to the logic board without examining each and every unit of each and every claim, in this situation, statistically, most of the increase in failures will be due to this (as it was the only change).]

In other words, true analysis means being able to drill into raw data, bring in any and all associated data, do analysis and summaries of that data, drill in, bring in related data, and repeat until you find something you can tie to a real world event that led to something that had a material impact on the metrics that are relevant to your business. Anything less is NOT analysis.