Category Archives: Spend Analysis

You Don’t Need Gen-AI to Revolutionize Procurement and Supply Chain Management — Classic Analytics, Optimization, and Machine Learning that You Have Been Ignoring for Two Decades Will Do Just Fine!

This was originally posted on March 22, 2024. It is being reposted because we need solutions: Gartner (who co-created the hype cycle) published a study which found that Gen-AI/technology implementations fail 85% of the time, and that's because we have abandoned the foundations — which work wonders in the hands of properly applied Human Intelligence (HI!). Gen-AI, like all technologies, has its place, and it's not wherever the Vendor of the Week pushes it, but where it belongs. Please remember that.

Open Gen-AI technology may be about as reliable as a career politician managing your Nigerian bank account, but somehow it's won the PR war (since there is no longer any requirement to speak the truth or state actual facts in sales and marketing in most “first” world countries [where they believe Alternative Math is a real thing … and that's why they can't balance their budgets, FYI]) as every Big X, Mid-Sized Consultancy, and the majority of software vendors are pushing Open Gen-AI as the greatest revolution in technology since the abacus. the doctor shouldn't be surprised, given that most of the turkeys on their rafters can't even do basic math* (yet profess to deeply understand this technology) and thus believe the hype (and downplay the serious risks, which we summarized in this article, where we didn't even mention the quality of the results when you unexpectedly get a result that doesn't exhibit any of the six major issues).

The Power of Real Spend Analysis

If you have a real Spend Analysis tool, like Spendata (The Spend Analysis Power Tool), simple data exploration will find you a 10% or more savings opportunity in just a few days (well, maybe a few weeks, but that’s still just a matter of days). It’s one of only two technologies that has been demonstrated, when properly deployed and used, to identify returns of 10% or more, year after year after year, since the mid 2000s (when the technology wasn’t nearly as good as it is today), and it can be used by any Procurement or Finance Analyst that has a basic understanding of their data.

When you have a tool that will let you analyze data around any dimension of interest — supplier, category, product — restrict it to any subset of interest — timeframe, geographic location, off-contract spend — and roll-up, compare against, and drill down by variance — the opportunities you will find will be considerable. Even in the best sourced top spend categories, you’ll usually find 2% to 3%, in the mid-spend likely 5% or more, in the tail, likely 15% or more … and that’s before you identify unexpected opportunities by division (who aren’t adhering to the new contracts), geography (where a new local supplier can slash transportation costs), product line (where subtle shifts in pricing — and yes, real spend analysis can also handle sales and pricing data — lead to unexpected sales increases and greater savings when you bump your orders to the next discount level), and even in warranty costs (when you identify that a certain supplier location is continually delivering low quality goods compared to its peers).
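To make that roll-up, restrict, and drill-down concrete, here is a minimal sketch of the kind of interactive exploration a real spend analysis tool performs. The data and column names are hypothetical, and pandas is assumed simply as a familiar stand-in for a purpose-built cube engine:

```python
import pandas as pd

# Hypothetical transactions; a real spend cube has many more dimensions.
spend = pd.DataFrame({
    "supplier":    ["Acme", "Acme", "Bolt Co", "Bolt Co", "Acme"],
    "category":    ["MRO", "MRO", "MRO", "Logistics", "Logistics"],
    "region":      ["EU", "NA", "EU", "EU", "NA"],
    "on_contract": [True, False, True, True, False],
    "amount":      [100_000, 40_000, 60_000, 80_000, 20_000],
})

# Roll up by any dimension of interest ...
by_category = spend.groupby("category")["amount"].sum()

# ... restrict to any subset of interest (e.g. off-contract spend) ...
off_contract = spend[~spend["on_contract"]]

# ... and drill down by supplier within a slice (EU MRO spend).
eu_mro = spend[(spend["region"] == "EU") & (spend["category"] == "MRO")]
drill = eu_mro.groupby("supplier")["amount"].sum()
```

The point is not the three lines of code; it is that a proper tool lets an analyst run this loop in seconds, on any hunch, without waiting for IT or a cube refresh.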

And that’s just the Procurement spend … it can also handle the supply chain spend, logistics spend, warranty spend, utility and HR spend — and while you can’t control the HR spend, you can get a handle on your average cost by position by location and possibly restructure your hubs during expansion time to where resources are lower cost! Savings, savings, savings … you’ll find them ’round the clock … savings, savings, savings … analytics rocks!

The Power of Strategic Sourcing Decision Optimization

Decision optimization has been around in the Procurement space for almost 25 years, but it still has less than 10% penetration! This is utterly abysmal. It's not only the only other technology that has been generating returns of 10% or more, in good times and bad, for any leading organization that consistently uses it, but the only technology that the doctor has seen consistently generate 20% to 30% savings opportunities on large, complex, multi-national categories that just can't be solved with an RFQ and a spreadsheet, no matter how hard you try. (Though an expert consultant will still claim they can, given the old college try and their top analyst's salary for a few months … and at, say, 5K a day, there goes three times any savings they identify.)

Examples where the doctor has repeatedly seen stellar results include:

  • national service provider contract optimization across national, regional, and local providers where rates, expected utilization, and all-in costs for remote resources are considered. With just an RFX solution, the usual approach is to go to all the relevant Big X and Mid-Sized Bodyshops and get their rate cards by role by location, both base rate (with expenses picked up by the org) and all-in rate; calculate the expected local overhead rate by location; then, for each provider role-location, determine whether the all-in rate or the base rate plus overhead is cheaper and select that as the final bid for analysis; then mark the lowest bid for each role-location and determine the three top providers; then distribute the award between those three “top” providers in the lowest-cost fashion; and, in big companies using a lot of contract labour, leave millions on the table, because 1) sometimes the cheapest three will actually be the providers with middle-of-the-road bids across the board and 2) for some areas and roles, regional, and definitely local, providers will often be cheaper — but since the complexity is beyond manageable, this isn't done, even though the doctor has seen multiple real-world events generate 30% to 40% savings, since optimization can handle hundreds of suppliers and tens of thousands of bids and find the perfect mix (even while limiting the number of global providers and the number of providers who can service a location)
  • global mailer / catalog production —
    paper won’t go away, and when you have to balance inks, papers, printing, distribution, and mailing — it’s not always local or one country in a region that minimizes costs, it’s a very complex sourcing AND logistics distribution that optimizes costs … and the real-world model gets dizzying fast unless you use optimization, which will find 10% or more savings beyond your current best efforts
  • build-to-order assembly — don’t just leave that to the contract manufacturer, when you can simultaneously analyze the entire BoM and supply chain, which can easily dwarf the above two models if you have 50 or more items, as savings will just appear when you do so

… and yet, because it's “math”, it doesn't get used, even though you don't have to do the math — the platform does!
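To see why the “pick the three lowest bidders” heuristic leaves money on the table, here is a toy sketch with entirely hypothetical bids and demand. It exhaustively searches provider mixes under a cap on the number of providers, which is the brute-force cousin of what a real strategic sourcing decision optimization engine does at scale with mixed-integer programming:

```python
from itertools import combinations

# Hypothetical all-in day rates per (role, location) by provider.
bids = {
    ("dev", "NYC"): {"BigX": 1200, "Regional": 1000, "Local": 950},
    ("dev", "LON"): {"BigX": 1100, "Regional": 1150, "Local": 1300},
    ("qa",  "NYC"): {"BigX": 800,  "Regional": 700,  "Local": 650},
}
# Hypothetical demand in resource-days per role-location.
demand = {("dev", "NYC"): 10, ("dev", "LON"): 8, ("qa", "NYC"): 5}

def best_award(bids, demand, max_providers):
    """Find the provider set (size <= max_providers) minimizing total
    cost when each role-location goes to the cheapest provider in the
    set. Exhaustive search only works on toy data; real optimizers
    handle hundreds of suppliers and tens of thousands of bids."""
    providers = sorted({p for b in bids.values() for p in b})
    best = (float("inf"), None)
    for k in range(1, max_providers + 1):
        for subset in combinations(providers, k):
            cost = sum(min(bids[rl][p] for p in subset) * demand[rl]
                       for rl in bids)
            if cost < best[0]:
                best = (cost, subset)
    return best

cost, winners = best_award(bids, demand, max_providers=2)
```

Even on this tiny example, the optimal two-provider mix pairs the Big X (who wins London) with the local shop (who wins New York), beating any single provider and any naive “cheapest bidders” shortlist.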

Curve Fitting Trend Analysis

Dozens (and dozens) of “AI” models have been developed over the past few years to provide you with “predictive” forecasts, insights, and analytics, but guess what? Not a SINGLE model has outdone classical curve-fitting trend analysis — and NOT a single model ever will. (This is because all these fancy-shmancy black-box solutions do is attempt to identify the record/transaction “fingerprint” that contains the most relevant data and then attempt to identify the “curve” or “line” to fit to it all at once, which means the upper bound is a classical model that uses the right data and fits the right curve from the beginning, without wasting an entire power plant's worth of energy as the algorithm repeatedly guesses random fingerprints and models until one seems to work well.)

And the reality is that these standard techniques (which have been refined since the 60s and 70s), which now run blindingly fast on large data sets thanks to today's computing, can achieve 95% to 98% accuracy in some domains, with no misfires. A 95% accurate forecast on inventory, sales, etc. is pretty damn good and minimizes the buffer stock, and lead time, you need. Detailed, fine-tuned correlation analysis can accurately predict the impact of sales and industry events. And so on.
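As a reminder of just how little machinery classical curve fitting needs, here is a minimal sketch using numpy's least-squares polynomial fit. The series is hypothetical and noiseless for clarity; real demand series are noisy, but the fit-and-extrapolate mechanics are identical:

```python
import numpy as np

# Hypothetical 24 months of demand following a linear trend.
months = np.arange(24, dtype=float)
demand = 500.0 + 12.0 * months

# Classic least-squares fit of a degree-1 polynomial (a line).
slope, intercept = np.polyfit(months, demand, deg=1)

# Forecast month 25 by evaluating the fitted curve one step ahead.
forecast = np.polyval([slope, intercept], 24.0)
```

Swap in `deg=2` or a seasonal decomposition when the data calls for it; the point is that the entire "model" is transparent, auditable, and runs in microseconds.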

Going one step further, there exists a host of clustering techniques that can identify emergent trends in outlier behaviour as well as pockets of customers or demand. And so on. But chances are you aren’t using any of these techniques.
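Even the outlier-spotting basics require almost no machinery. A minimal sketch using plain z-scores on hypothetical invoice amounts (clustering libraries offer far richer techniques; this just shows the floor):

```python
from statistics import mean, stdev

# Hypothetical invoice amounts for one supplier; most cluster near 15,000.
amounts = [14800, 15200, 15100, 14950, 15050, 3000, 15150, 14900]

mu, sigma = mean(amounts), stdev(amounts)

# Flag anything more than two standard deviations from the mean.
outliers = [a for a in amounts if abs(a - mu) > 2 * sigma]
```

Whether the flagged 3,000 invoice is fraud, a data error, or a perfectly legitimate one-day engagement is exactly the judgment call that still needs a human analyst.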

So given that most of you haven't adopted any of this technology that has proven to be reliable, effective, and extremely valuable, why on earth would you want to adopt an unproven technology that hallucinates daily, might tell off your sensitive employees with hate speech, and might even leak your data? It makes ZERO sense!

While we admit that someday semi-private LLMs will be an appropriate solution for certain areas of your business where large amounts of textual analysis are required on a regular basis, even these are still iffy today and can't always be trusted. And the doctor doesn't care how slick that chatbot is, because if you have to spend days learning how to expertly craft a prompt just to get a single result, you might as well just learn to code and use a classic open-source Neural Net library — you'll get better, more reliable results, faster.

Keep an eye on the tech if you like, but nothing stops you from using the tech that works. Let your peers be the test pilots. You really don’t want to be in the cockpit when it crashes.

* And if you don't understand why a deep understanding of university-level mathematics, preferably at the graduate level, is important, then you shouldn't be touching the turkey who touches the Gen-AI solution with a 10-foot pole!

Ghosts in Data Can Indicate Fraud but …

… so can the telltale signs, and if you don't even know how to spot those signs, why even look for the ghosts (which are very hard to find)?

A recent article over on Dev Discourse, “Spotting the Ghosts: Using Big Data to Detect Fraud in Government Purchases”, described the results of a study by the University of Craiova, Romania, the Institute of Financial Studies, Bucharest, and three other universities that examined how big data and online systems can help make public procurement more transparent and fair.

They analyzed the data from Romania’s public procurement system in 2023, where the government made 2.25 million purchases that totalled about 3.22 Billion Euros. In this study, the researchers were particularly interested in “exclusive” relationships, where a vendor only works with one public entity. They found that over 14% of all public purchases fell into this category, which is concerning as these exclusive deals can indicate problems like favouritism or fraud because they don’t follow the usual rules of fair competition.

This is just one standard way to identify potential fraud. Other ways, as noted by the article, are to

  • look for unusual transaction values,
  • look at the geographical distribution of unusual transactions or sole-source relationships (and for clusters in particular), as many happen in specific regions (suggesting that certain areas have higher risks of fraud), and
  • look for deals that were completed too quickly (such as those completed within minutes of posting) or that were awarded well after hours or on weekends.

If you’re not even doing these basics to identify potential fraud, then you’re not ready to look for ghosts in the data.
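The "exclusive relationship" check from the study is a few lines of code over any transaction log. A minimal sketch with hypothetical purchase records:

```python
from collections import defaultdict

# Hypothetical purchase records: (vendor, public entity, amount).
purchases = [
    ("V1", "E1", 10_000), ("V1", "E1", 12_000),
    ("V2", "E1", 5_000),  ("V2", "E2", 7_000),
    ("V3", "E1", 3_000),  ("V3", "E2", 4_000), ("V3", "E3", 2_000),
]

# An "exclusive" vendor sells to exactly one public entity.
entities = defaultdict(set)
for vendor, entity, _ in purchases:
    entities[vendor].add(entity)
exclusive = {v for v, es in entities.items() if len(es) == 1}

# Share of all purchases involving an exclusive vendor
# (the Romanian study found this exceeded 14%).
share = sum(1 for v, _, _ in purchases if v in exclusive) / len(purchases)
```

An exclusive relationship is a flag to investigate, not proof of fraud; the value of the check is that it narrows thousands of transactions down to a reviewable shortlist.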

And when you are doing this, and you're struggling to weed out the likely fraud in sole-source awards, unusual transaction values, and transactions completed at weird times, the next step is to do a basic analysis on the supplier. As the report indicates, award levels and supplier financial performance should be positively correlated, not inverted. If a supplier with poor financial performance keeps getting sole-source awards, that's a BIG RED FLAG.

Then, run the standard contract / purchase order / invoice matching to make sure the amounts line up. And if you do all of the above, you'll find more fraud, policy non-compliance, and overpayments than you ever thought possible. No ghosts needed. (But if you ever get to the point that all of the above comes up blank, reach out and the doctor will tell you how to find ghosts in the data as well as ghosts in the machine.)
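The matching itself is equally basic. A minimal two-way slice of the match on hypothetical PO and invoice lines (a full three-way version adds the contracted prices as a third check):

```python
# Hypothetical PO amounts keyed by PO number, and invoice lines as
# (po_number, invoiced_amount) pairs.
po_amounts = {"P1": 1000.0, "P2": 500.0}
invoices = [("P1", 1000.0), ("P2", 650.0), ("P3", 200.0)]

overbilled, no_po = [], []
for po, inv_amount in invoices:
    if po not in po_amounts:
        no_po.append(po)          # invoice with no matching PO at all
    elif inv_amount > po_amounts[po]:
        overbilled.append(po)     # invoice exceeds the PO amount
```

Every overbilled or PO-less invoice is a candidate overpayment or compliance failure, and on real AP data the list is rarely empty.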

SpendKey: Your Solution-Oriented Key to Spend Insights

Preamble:

As the doctor wrote on Spend Matters back in November of 2021, shortly after SpendKey's initial release, SpendKey was formed in 2020 by a senior team of Procurement and Spend Analysis professionals with experience at big consultancies (Deloitte, E&Y, etc.), big companies (Thomas Cook, Marks and Spencer, etc.), big banks and Finance Institutions (Barclays, London Stock Exchange, etc.), and Managed Service Providers (Cloudaeon, Zensar Technologies, etc.) who identified a market need for faster, more accurate data processing and better analytics across the board, as well as better expert advice and guidance to accompany those analytics, to help companies make quick and optimal decisions and get on the right track the first time around.

After less than a year and a half of development, their initial service-based offering was already sufficient for turn-key consultant-led projects and their roadmap had them on track for a completely stand-alone SaaS offering by 2023, which they delivered to the market last year.

So where are they now and what do they do? That’s what we’ll dive into in this article.

Introduction:

SpendKey has evolved from a dashboard-driven spend analysis solution to a comprehensive spend, contract-tracking, and decision-intelligence platform with a mission to provide deep insight for sourcing and procurement.

SpendKey's unique selling proposition is its ability to index every part, product, service, and vendor with context. The product ontology and its interoperability create relationships with any attribute, providing end-to-end visibility and a data foundation for autonomous workflows (on the roadmap), which can currently be used to power a client's existing stack.

The SpendKey platform supports the creation of customized reports tailored to client-specific requirements. With a wide array of out-of-the-box dynamic dashboards, SpendKey offers standard insights into spend across categories and suppliers. These dashboards are augmented with advanced analysis tools like ABC analysis, trend analysis, Pareto analysis, Inside/Outside evaluations, order-to-actual correlations, and what-if scenarios, delivering a full-spectrum view of spending.

In addition to its customizable options, SpendKey provides a variety of standard reports to analyse spend, costs, goods, services, and information flows. The platform includes pre-defined reports that cover essential areas of spend analysis with customization for every client need.

SpendKey’s reporting suite has been expanded to include contract reports, budgeting reports, and dynamic MIS reports, offering a comprehensive toolkit for monitoring and optimising spend.
These tools were designed by procurement experts with decades of experience in spend analysis, ensuring that organizations can identify opportunities to not only reduce costs but also enhance overall efficiency and profitability.

SpendKey has an advanced spend-intake process that maps all of an organisation's spend to any taxonomy (which can be theirs, yours, or a hybrid) using a multi-stage hybrid mapping process that uses known mappings, AI, human corrections, and overrides that feed back into the next mapping cycle. Once the client has worked with SpendKey to do the initial spend upload and mapping, the client can subscribe to incremental updates (that will be handled fully by SpendKey) or self-serve via file-based incremental uploads.

So, if you read the initial analysis, what’s new?

  • improved data intake pipeline (which increases auto-mapping completeness and shortens the intake cycle)
  • project tracking
  • budget approvals
  • document analytics (and contract tracking)
  • commitments, budgets, and actuals comparison capability
  • ability to index parts, products, and services
  • line item auditability and more security controls
  • more spend sources
  • new dashboards

And what hasn’t changed (much)?

  • still no DIY (do-it-yourself) report builder
  • limited mapping audit access through the front end

And we’ll talk about each of these in turn.

Data Intake Pipeline

The data-intake pipeline is multi-step and works something like this:

1. Upload a raw data file in CSV or Excel or integrate via API

2. Validate the file against column descriptions, data formats, and language requirements (auto-translating to English if required) and apply any necessary transformations and cleansings to create records for classifications.

3. Run the current corpus of mapping rules.

3a. Push the mapped data into the live spend database.

3b. Package the unmapped transactions for web-processing.

4. Extract the supplier, product, and related information and use web-scraping (including Gen-AI models) to extract supplier and line of business information that can be used for classification.

5. Create suggested mappings where there is sufficient confidence for a human to review.

6. Push the verified mappings into the mapping rules and then retrain the machine learning on the new corpus of mapping rules to map the remaining unmapped spend and push through anything with sufficient confidence to the live system, having a human deal with the rest (or push it to an unclassified bucket).

By using multiple techniques, they are able to get to high accuracy very quickly and turn around the client's spend cube much faster than most consultancies using traditional methodologies. Even for their largest clients, they are typically live with high mapping accuracy within 10 days.
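Conceptually, the rule-then-review loop at the heart of steps 3 through 6 can be sketched as follows. The keyword rules and transactions here are purely hypothetical stand-ins for SpendKey's actual rule, ML, and Gen-AI stages:

```python
# Hypothetical keyword-to-category mapping rules.
rules = {"staples": "Office Supplies", "dell": "IT Hardware"}

def classify(transactions, rules):
    """Apply the current rule corpus; split into mapped and unmapped."""
    mapped, unmapped = [], []
    for t in transactions:
        hit = next((cat for kw, cat in rules.items()
                    if kw in t["description"].lower()), None)
        (mapped if hit else unmapped).append({**t, "category": hit})
    return mapped, unmapped

txns = [{"description": "DELL laptop"},
        {"description": "Staples paper"},
        {"description": "Unknown vendor"}]
mapped, unmapped = classify(txns, rules)

# A human reviews the leftovers; verified mappings become new rules
# that feed the next mapping cycle.
rules["unknown vendor"] = "Reviewed Category"
```

The feedback step is what makes the pipeline converge: every human correction permanently shrinks the unmapped pile for the next incremental upload.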

Project Tracking

When an analyst or buyer identifies a potential savings project, they can record their find/proposal in the tool, get approval, track status, and keep stakeholders informed. All they need to do to define a project (for tracking) is to define the item or category, supplier(s), aggregated spend amount, project period, project type, and expected savings. They can add custom organizational tags or note key stakeholders if required, and then send it off for approval. Once approved, they just have to update the status and savings-to-date on a regular basis until the project is complete.

It's not meant to be a project management tool, since most of the projects will be sourcing, procurement, contract, or other events or processes managed by other tools; it's just a tracking tool to monitor usage of the platform, as well as to capture approvals on projects before buyers or analysts go off on their own savings goose chases.

Budget and Forecasting Management

Budgeting and forecasting are pivotal components of financial management that empower businesses to plan, manage resources effectively, and navigate toward strategic goals. The SpendKey platform offers advanced budgeting and forecasting tools for the financial year ahead, with predefined templates for easy budget setup, bulk data upload and download capabilities, and the option to assign specific budgets to each supplier.

SpendKey’s budget management module has specific processes for classification and mapping of the budget and spend data, aligning budget allocations with actual spend patterns. It empowers users with advanced budgeting and forecasting functionalities. With comprehensive reports, a user-friendly interface, and the ability to create, manage, and analyse budgets, users can make well-informed financial decisions. SpendKey enables users to optimise their budget allocations, monitor variances, and gain valuable insights for successful investment strategies.

Document Analytics

Spend Under Management is one of the ultimate keys to Procurement success, and this often requires a lot of Spend Under Contract to ensure supply and mitigate risk. That, in turn, requires understanding the spend under contract, which requires that the contract metadata be stored in the system, as well as the contract prices (to track agreed-upon through invoiced to paid).

But no one wants to enter metadata, so they built a machine learning and document analytics application that can automatically parse documents, identify key metadata, extract price tables, and present it all to a human for final verification before the data is stored in the system.

The analytics can also be used on POs and invoices for verification purposes, and the user can decide whether or not to store that data in the system (or associate it with contracts).

More Spend Sources

Not only do they now support contract meta-data and contracted prices, but they also support the upload of asset-based data (for an organization to analyze the current and future value of organizational assets), payroll data (since that’s a significant amount of organizational spend), contingent workforce management data (to track services / contingent worker spend), and PO data in addition to AP data (which is the typical data source analyzed by simple “analytics” applications). In addition, if available, they will also load ESG Ranking data.

Their goal is to allow a complete understanding of organizational spend from budget to commitment to ordered to received to paid to projection using both standard cash views as well as amortization, accrual, and projected spend views.

New Dashboards

There are a slew of new dashboards, which include, but are not limited to:

  • Incliner/Decliner: highlights suppliers with increased or decreased spend compared to a user-defined period
  • Contract Overview: provides analytics on the different contract document types, their expiry dates, and contract lengths
  • Contract Details: navigate and review the summary of data for each contract, with the ability to view the respective contract
  • End-to-End Visibility: connects data from spend, contract, budget, and other systems to provide end-to-end visibility, e.g. spend vs budget vs contracted spend
  • ESG Summary: provides insights into ESG scores by supplier and their relevant spend, including the average ESG rating by industry and analytics on performance in each of the E, S, and G areas
  • ESG Supplier Ranking: provides insights into the ESG ranking for each individual supplier
  • Budget Overview: provides an overview of budget allocation and spending trends, highlighting key variances between actual spend and budget across different suppliers and categories
  • Budget by Category: shows the Budget by Category breakdown, displaying spend, budget, and variances across different levels of categories and suppliers
  • Budget by Suppliers: highlights spend, budget, and variance for key suppliers, along with an overall budget variance by category
  • Budget Distribution: shows the distribution of spend, budget, and variance across different transaction brackets, along with the corresponding transaction counts
  • Budget Detail: details supplier-specific budget, spend, and variance, including non-PO spend and transaction counts
  • Supplier Reclassification: allows you to reclassify supplier spend into a different taxonomy
  • Supplier Fragmentation: allows you to track the number of suppliers in any category or subcategory
  • Key Insights: presents key spend insights, highlighting potential savings, category spend, new suppliers, and contract renewal dates

Add these to the existing dashboards that include, but are not limited to:

  • Main Dashboard: provides an overview of the spend across all categories of spend
  • Category Breakdown: enables the user to drill deep into any category and sub-category of spend to get deeper insights
  • Contract Kanban View: summarizes contract expiry in a kanban view to help identify contracts and suppliers to prioritise for renegotiations
  • MIS Dashboard: provides the user the ability to create their own pivot-style reports by connecting different data sets to generate views that were not available before
  • PO vs Non-PO Analysis: provides an overview of spend compliant with purchase orders
  • Reseller Insights: provides insights to understand purchases of products from resellers
  • Savings Opportunity: provides the ability to get a quick high-level business case on potential savings based on certain user-defined parameters
  • Spend Summary: provides a narrative on the spend
  • Spend By Country: provides a summary of spend by different geographies and the ability to drill further by country
  • Spend Distribution: provides insights on spend by different transaction brackets to help identify low-value, low-risk spend and suppliers
  • Spend Detail: provides a view of the raw data, and the enrichment from SpendKey to this raw data, at the individual transaction level
  • Spend by Category: provides insights for each category and the relevant sub-categories based on the defined taxonomy tree
  • Supplier Hierarchy: provides insights at the supplier level to help understand the parent and all the relevant child entities under that parent
  • Supplier Performance: provides a summary on the reduction in supplier count post data cleansing and supplier normalization
  • Supplier Segmentation: provides the ability to segment or tag a supplier based on user preferences
  • Tail Spend: provides insights and summary into tail spend (the bottom 20% to 40% of the spend)
  • What-If: gives the user the ability to try different permutations and combinations of parts/products/services to understand potential savings opportunities
  • IT OPEX Budget: provides the user with the ability to view budget at the supplier level or by category, cost centre, material code, etc.
  • Set Budget: provides the ability for a user to set and define a budget for a user-defined period
  • Forex Rate: gives the user the option to set the FX rates for various currencies for a defined date range / period to enable the platform to convert all transactions into the base currency based on your company's defined FX rates
  • Key Management: provides the user with the ability to set distribution keys for spend allocation to business units, departments, functions, etc. to help calculate recharge
  • Project Tracker: provides the user the ability to create projects, such as savings initiatives, and track them in the tool; also provides a workflow for approval of project milestones, such as delivering on your savings targets
  • User Management: allows the administrator to add new users and define their access control

And it’s a fairly extensive offering for an organization looking for a services-oriented solution to give them insights out of the box.

No DIY Report Builder

Now, companies looking for a services-oriented spend analysis solution aren’t looking for DIY initially, but as they mature in spend analysis, they will likely want the ability to modify the dashboards and reports on their own, which is baseline DIY. As they continue to mature, a few organizations will eventually want to start building their own reports and views, so it’s important that DIY is on the roadmap for an organization looking to mature in their analytics capability over time.

Limited Mapping Audit Access through the Front End

In the backend, they keep a complete audit trail of how and why every transaction was mapped where it was mapped. In the front end, every single edit and amendment made by a user is logged, along with supporting commentary by the user. However, when a user goes to edit and amend a mapping in the front end, she doesn't know if a transaction was initially mapped by rule, by SpendKey's home-grown self-trained AI, or by Gen-AI, or whether or not there was ever a human in the loop.

It’s critical that this data be pushed through to the front end because, among other things,

  1. there will always be someone who questions a mapping,
  2. when that happens, you need to know how it was mapped, and
  3. you need to know the ratio of human vs AI mapping in a category for confidence.

As of now, users can reclassify transactions within the tool, so if there is an error, they can push that to the admin or a “parking lot” for review, where, if the admin agrees, it can be pushed straight to the back end.

Showing who, or what, (initially) mapped the data, and why, in the front end is on the roadmap, and hopefully it appears sooner rather than later.

Summary

All-in-all, SpendKey is definitely a solution you should be looking at if you are a mid-market (plus) organization in the UK/Western Europe looking for a services-oriented spend analysis solution to help you analyze your spending and come up with strategies to get it under control.

Dear SaaS Provider, Where’s Your Substance? Being SaaSy is No Longer Enough.

As per our January article, Half a Trillion Dollars will be Wasted on SaaS Spend This Year, and as per a recent article over on The CFO, CFOs are wising up to the hidden bill attached to SaaS and cloud, which might just be growing faster than the US National Debt (on a per-capita basis).

As the CFO article notes, per-employee SaaS subscriptions alone are now costing businesses $2,000 (or more) annually on average, and that’s including ALL employees from the Janitor (who shouldn’t be using any SaaS) to the CEO (who likely doesn’t use any SaaS either and just needs a locally installed PowerPoint license).

To put this in perspective, this says a small company of only 1,000 people is spending 2 MILLION on SaaS (and a mid-size company of 10,000 people is spending 20 MILLION), most of it consumer-grade, and likely a good portion of it bought through B2B Software Marketplaces because it's easier for AP. If the average salary is 100K with 30K base overhead, that spend is costing the organization 15 (or 150) people, or a 1.5% increase in workforce, which is substantial if it's an organization that needs people to grow.

And the worst part is that a very significant portion of this spend is overspend or unnecessary spend, with many SaaS auditors and SaaS management specialists finding 33% (or more) overspend as a result of duplicate tools, unused licenses, and sometimes outright zombie subscriptions that just need to be cancelled. Plus, poor management and provisioning lead to unnecessary surcharges that are almost as bad as unused licenses.

There's no excuse for it, and CFOs are not going to put up with it anymore. SaaS Audit and Management tools are going to become a lot more common, and once the zombie subscriptions, unused licenses, and cloud subscriptions are rightsized, when these companies realize they are still spending at least 1,500 per employee on SaaS and cloud, they are going to start grouping tools by function and analyzing value. If there are two tools that do lead management, workforce management, or catalog management, one is going to go. More specifically, the one providing the least value to the organization.

So, dear SaaS Provider, it’s important to ask:

  • what's your substance?
  • how do you provide more hard-dollar value for that substance than your peers?
  • how do you measure it and prove it to the customer?
  • … and how do you make sure you're not the vendor that is cancelled during the audit?

And, dear organization who hasn't done a SaaS audit recently, why haven't you? You're sitting on 30% overspend in a category which is likely, with most of the spend split between departments and hidden on P-Cards and expense reports, running $2,000 per employee and growing daily. You need to do the audit, rightsize your SaaS, and then centralize SaaS management and SaaS acquisition policy. It's not a minor expense; it's a major, business-altering outlay.

Analytics Is NOT Reporting!

We’ve covered analytics, and spend analysis, a lot on this blog, but seeing the surge in articles on analytics as of late, and the large number that are missing the point, it seems we have to remind you again that Analytics is NOT Reporting. (Which, of course, would be clear if anyone bothered to pick up a dictionary anymore.)

As defined by the Oxford dictionary, analytics is the systematic computational analysis of data or statistics and a report is a written account of something that has been observed, heard, done, or investigated. In simple terms, analysis is what is done to identify useful information and reporting is the process of displaying that information in a fancy-shmancy graph. One is useful, one is, quite frankly, useless.

A key requirement of analysis is the ability to do arbitrary systematic computational analysis of data as needed to find the information that you need when you need it. Not just a small set of canned analysis on discrete data subsets that become completely and utterly useless once they are run the first time and you get the initial result — which will NEVER change if the analysis can’t change.

Nor is analysis a random AI application that applies a random statistical algorithm to bubble up, filter out, or generate a random “insight” that may or may not be useful from a Procurement viewpoint. Sometimes an outlier is indicative of fraud or a data error, and sometimes an outlier is just an outlier. Maybe the average weekly bill from the services firm is 15,000, which makes a 3,000 transaction an outlier, but it's not fraud if the company only needed a cyber-security expert for one day to test a key system — in fact, the insight is useless.

As per our recent post on a true enterprise analytics solution, real analysis requires the ability to explore a hunch and find the answer to any question that pops up when it pops up. To build whatever cube is needed, on whatever dimensions are required, that rolls up data using whatever metrics are required to produce whatever insights are needed to determine if an opportunity is there and if it is worth being pursued. Quickly and cost-effectively in real-time. If you have to wait for a refresh, or spend days doing offline computation in Excel to answer a question that might only save you 20K, you’re not going to do it. (Three days and 6K of your time from a company perspective is not worth a 20K saving if that time spent preparing for a negotiation on a 10M category can save an extra 0.5%, which would equate to 50K. But if you can dynamically build a cube and get an answer in 30 minutes, that 30 minutes is definitely worth it if your hunch is right and you save 20K.)

Analysis is the ability to ask “what if” and pursue the answer. Now! Not tomorrow, next week, or next month on the cube refresh, or when the provider’s personnel can build that new report for you. Now! At any time you should be able to ask What if we reclassify the categories so that the primary classification is based on primary material (“steel”) and not usage (“electrical equipment”); What if the savings analysis is done by sourcing strategy (RFX, auction, re-negotiation, etc.) instead of contract value; and What if the risk analysis is done by trade lane instead of supplier or category. Analysis is the process of asking a question, any question, and working the data to get the answer using whatever computations are required. It’s not a canned report.
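That kind of on-demand cube building is a one-liner in any decent analytical environment. A minimal sketch (hypothetical data, pandas assumed as the stand-in engine) of re-cutting the same spend first by usage and then by primary material, with no refresh cycle in between:

```python
import pandas as pd

# Hypothetical transactions; a real cube carries many more dimensions.
df = pd.DataFrame({
    "material": ["steel", "steel", "copper", "steel"],
    "usage":    ["electrical", "structural", "electrical", "electrical"],
    "supplier": ["S1", "S2", "S1", "S3"],
    "amount":   [50_000, 30_000, 20_000, 10_000],
})

# First cut: classify by usage ("electrical equipment") ...
by_usage = pd.pivot_table(df, index="usage", values="amount", aggfunc="sum")

# ... then re-cut by primary material ("steel"), on the spot.
by_material = pd.pivot_table(df, index="material", values="amount",
                             aggfunc="sum")
```

The reclassification question from the paragraph above is answered in the time it takes to type the second pivot, which is exactly the difference between analytics and a canned report.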

Analytics is doing, not viewing. And the basics haven’t changed since SI started writing about it, or publishing guest posts by the Old Greybeard himself. (Analytics I, II, III, IV, V, and VI.)