
SpendKey: Your Solution-Oriented Key to Spend Insights

Preamble:

As the doctor wrote on Spend Matters back in November of 2021, shortly after SpendKey's initial release, SpendKey was founded in 2020 by a senior team of Procurement and Spend Analysis professionals with experience at big consultancies (Deloitte, E&Y, etc.), big companies (Thomas Cook, Marks and Spencer, etc.), big banks and financial institutions (Barclays, London Stock Exchange, etc.), and Managed Service Providers (Cloudaeon, Zensar Technologies, etc.). They identified a market need for faster, more accurate data processing and better analytics across the board, as well as for better expert advice and guidance to accompany those analytics, to help companies make quick and optimal decisions and get on the right track the first time around.

After less than a year and a half of development, their initial service-based offering was already sufficient for turn-key, consultant-led projects, and their roadmap had them on track for a completely stand-alone SaaS offering by 2023, which they delivered to market last year.

So where are they now and what do they do? That’s what we’ll dive into in this article.

Introduction:

SpendKey has evolved from a dashboard-driven spend analysis solution to a comprehensive spend, contract-tracking, and decision-intelligence platform with a mission to provide deep insight for sourcing and procurement.

SpendKey's unique selling proposition is its ability to index every part, product, service, and vendor with context. The product ontology and interoperability create relationships with any attribute, providing end-to-end visibility and a data foundation for autonomous workflows (on the roadmap), which can currently be used to power a client's existing stack.

The SpendKey platform supports the creation of customized reports tailored to client-specific requirements. With a wide array of out-of-the-box dynamic dashboards, SpendKey offers standard insights into spend across categories and suppliers. These dashboards are augmented with advanced analysis tools like ABC analysis, trend analysis, Pareto analysis, Inside/Outside evaluations, order-to-actual correlations, and what-if scenarios, delivering a full-spectrum view of spending.

In addition to its customizable options, SpendKey provides a variety of standard reports to analyse spend, costs, goods, services, and information flows. The platform includes pre-defined reports that cover essential areas of spend analysis with customization for every client need.

SpendKey's reporting suite has been expanded to include contract reports, budgeting reports, and dynamic MIS reports, offering a comprehensive toolkit for monitoring and optimising spend.

These tools were designed by procurement experts with decades of experience in spend analysis, ensuring that organizations can identify opportunities to not only reduce costs but also enhance overall efficiency and profitability.

SpendKey has an advanced spend-intake process that maps all of an organisation's spend to any taxonomy (which can be theirs, yours, or a hybrid) using a multi-stage hybrid mapping process that combines known mappings, AI, and human corrections and overrides that feed back into the next mapping cycle. Once the client has worked with SpendKey on the initial spend upload and mapping, the client can subscribe to incremental updates (handled fully by SpendKey) or self-serve via file-based incremental uploads.

So, if you read the initial analysis, what’s new?

  • improved data intake pipeline (which increases auto-mapping completeness and shortens the intake cycle)
  • project tracking
  • budget approvals
  • document analytics (and contract tracking)
  • commitments, budgets, and actuals comparison capability
  • ability to index parts, products, and services
  • line item auditability and more security controls
  • more spend sources
  • new dashboards

And what hasn’t changed (much)?

  • still no DIY (do-it-yourself) report builder
  • limited mapping audit access through the front end

And we’ll talk about each of these in turn.

Data Intake Pipeline

The data-intake pipeline is multi-step and works something like this:

1. Upload a raw data file in CSV or Excel or integrate via API

2. Validate the file against column descriptions, data formats, and language requirements (auto-translating to English if required) and apply any necessary transformations and cleansing to create records for classification.

3. Run the current corpus of mapping rules.

3a. Push the mapped data into the live spend database.

3b. Package the unmapped transactions for web-processing.

4. Extract the supplier, product, and related information and use web-scraping (including Gen-AI models) to extract supplier and line of business information that can be used for classification.

5. Create suggested mappings where there is sufficient confidence for a human to review.

6. Push the verified mappings into the mapping rules, then retrain the machine learning model on the new corpus of mapping rules to map the remaining unmapped spend, pushing anything with sufficient confidence through to the live system and having a human deal with the rest (or pushing it to an unclassified bucket).

By using multiple techniques, they are able to reach high accuracy very quickly and turn around the client's spend cube much faster than most consultancies using traditional methodologies. Even for their largest clients, they are typically live with high mapping accuracy within 10 days.
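The rule-first, AI-second, human-in-the-loop pass (steps 3 through 6 above) can be sketched in a few lines. This is purely an illustrative sketch: the function names, field names, and confidence threshold are invented for the example and are not SpendKey's actual API.

```python
# Hypothetical sketch of a multi-stage hybrid mapping pass.
# All names and thresholds here are illustrative assumptions.

def map_spend(transactions, rules, suggest, threshold=0.9):
    """Apply known mapping rules first; fall back to an AI suggestion,
    keeping only high-confidence results and queueing the rest for review."""
    live, review = [], []
    for txn in transactions:
        category = rules.get(txn["supplier"])          # known mapping wins
        if category is None:
            category, confidence = suggest(txn)        # AI / web-enriched guess
            if confidence < threshold:
                review.append(txn)                     # human-in-the-loop queue
                continue
        live.append({**txn, "category": category})
    return live, review

# Toy usage: one rule hit, one confident AI guess, one sent to review.
rules = {"Acme Ltd": "Hardware"}
suggest = lambda t: ("Software", 0.95) if t["supplier"] == "Initech" else ("Unknown", 0.2)
live, review = map_spend(
    [{"supplier": "Acme Ltd"}, {"supplier": "Initech"}, {"supplier": "Mystery Co"}],
    rules, suggest)
```

Verified corrections from the review queue would then be folded back into `rules` before the next cycle, which is the feedback loop described above.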

Project Tracking

When an analyst or buyer identifies a potential savings project, they can record their find/proposal in the tool, get approval, track status, and keep stakeholders informed. All they need to do to define a project (for tracking) is to define the item or category, supplier(s), aggregated spend amount, project period, project type, and expected savings. They can add custom organizational tags or note key stakeholders if required, and then send it off for approval. Once approved, they just have to update the status and savings-to-date on a regular basis until the project is complete.

It's not meant to be a project management tool, since most of the projects will be sourcing, procurement, contract, or other events or processes managed by other tools; it's just a tracking tool to track usage of the platform, as well as approvals on projects before buyers or analysts go off on their own savings goose chases.

Budget and Forecasting Management

Budgeting and forecasting are pivotal components of financial management that empower businesses to plan, manage resources effectively, and navigate toward strategic goals. The SpendKey platform offers advanced budgeting and forecasting tools for the financial year ahead, with predefined templates for easy budget setup, bulk data upload and download capabilities, and the option to assign specific budgets to each supplier.

SpendKey’s budget management module has specific processes for classification and mapping of the budget and spend data, aligning budget allocations with actual spend patterns. It empowers users with advanced budgeting and forecasting functionalities. With comprehensive reports, a user-friendly interface, and the ability to create, manage, and analyse budgets, users can make well-informed financial decisions. SpendKey enables users to optimise their budget allocations, monitor variances, and gain valuable insights for successful investment strategies.
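The core of the variance monitoring described above is simple arithmetic. Here is a minimal sketch, assuming the simplest possible data shapes (dicts of supplier to amount); the function and field names are invented for illustration and are not SpendKey's actual module.

```python
# Minimal budget-vs-actual variance sketch (illustrative, not a vendor API).

def budget_variances(budgets, actuals):
    """Absolute and percentage variance per supplier; positive = overspend."""
    report = {}
    for supplier, budget in budgets.items():
        spend = actuals.get(supplier, 0.0)
        variance = spend - budget
        report[supplier] = {
            "budget": budget,
            "actual": spend,
            "variance": variance,
            "variance_pct": (variance / budget * 100) if budget else None,
        }
    return report

# Toy usage: one supplier, 12.5% over budget.
report = budget_variances({"Acme Ltd": 100_000}, {"Acme Ltd": 112_500})
```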

Document Analytics

Spend Under Management is one of the ultimate keys to Procurement success, and it often requires a lot of Spend Under Contract to ensure supply and mitigate risk. This, in turn, requires understanding the spend under contract, which requires that the contract metadata be stored in the system, along with contract prices (to track agreed-upon vs. invoiced vs. paid amounts).

But no one wants to enter metadata, so they built a machine learning and document analytics application that can automatically parse documents, identify key metadata, extract price tables, and present it all to a human for final verification before the data is stored in the system.

The analytics can also be used on POs and invoices for verification purposes, and the user can decide whether or not to store that data in the system (or associate it with contracts).

More Spend Sources

Not only do they now support contract metadata and contracted prices, but they also support the upload of asset-based data (for an organization to analyze the current and future value of organizational assets), payroll data (since that's a significant amount of organizational spend), contingent workforce management data (to track services / contingent worker spend), and PO data in addition to AP data (which is the typical data source analyzed by simple "analytics" applications). In addition, if available, they will also load ESG Ranking data.

Their goal is to allow a complete understanding of organizational spend from budget to commitment to ordered to received to paid to projection using both standard cash views as well as amortization, accrual, and projected spend views.

New Dashboards

There are a slew of new dashboards, which include, but are not limited to:

  • Incliner/Decliner: highlights suppliers with increased or decreased spend compared to a user-defined period
  • Contract Overview: provides analytics on the different types of contract documents, their expiry dates, and contract lengths
  • Contract Details: lets the user navigate and review the summary data for each contract, with the ability to view the respective contract
  • End-to-End Visibility: connects data from spend, contract, budget, and other systems to provide end-to-end visibility, e.g. spend vs. budget vs. contracted spend
  • ESG Summary: provides insights into ESG scores by supplier and their relevant spend, including average ESG rating by industry and analytics on performance in each of the E, S, and G areas
  • ESG Supplier Ranking: provides insights into the ESG ranking for each individual supplier
  • Budget Overview: provides an overview of budget allocation and spending trends, highlighting key variances between actual spend and budget across different suppliers and categories
  • Budget by Category: shows the Budget by Category breakdown, displaying spend, budget, and variances across different levels of categories and suppliers
  • Budget by Suppliers: highlights spend, budget, and variance for key suppliers, along with an overall budget variance by category
  • Budget Distribution: shows the distribution of spend, budget, and variance across different transaction brackets, along with the corresponding transaction counts
  • Budget Detail: details supplier-specific budget, spend, and variance, including non-PO spend and transaction counts
  • Supplier Reclassification: allows you to reclassify supplier spend into a different taxonomy
  • Supplier Fragmentation: allows you to track the number of suppliers in any category or subcategory
  • Key Insights: presents key spend insights, highlighting potential savings, category spend, new suppliers, and contract renewal dates

Add these to the existing dashboards that include, but are not limited to:

  • Main Dashboard: provides an overview of the spend across all categories of spend
  • Category Breakdown: enables the user to drill deep into any category and sub-category of spend to get deeper insights
  • Contract Kanban View: summarizes contract expiry in a kanban view to help identify contracts and suppliers to prioritise for renegotiations
  • MIS Dashboard: provides the user the ability to create their own pivot-style report by connecting different data sets to generate views that were not available before
  • PO vs Non-PO Analysis: provides an overview of spend compliant with purchase orders
  • Reseller Insights: provides insights to understand the purchase of products from resellers
  • Savings Opportunity: provides the ability to get a quick, high-level business case on potential savings based on certain user-defined parameters
  • Spend Summary: provides a narrative on the spend
  • Spend By Country: provides a summary of spend by different geographies and the ability to drill further by country
  • Spend Distribution: provides insights on spend by different transaction brackets to help identify low-value, low-risk spend and suppliers
  • Spend Detail: provides a view of the raw data, and SpendKey's enrichment of that raw data, at the individual transaction level
  • Spend by Category: provides insights for each category and the relevant sub-categories based on the defined taxonomy tree
  • Supplier Hierarchy: provides insights at the supplier level to help understand the parent and all the relevant child entities under that parent
  • Supplier Performance: provides a summary of the reduction in supplier count post data cleansing and supplier normalization
  • Supplier Segmentation: provides the ability to segment or tag a supplier based on user preferences
  • Tail Spend: provides insights and a summary of tail spend (the bottom 20% to 40% of the spend)
  • What-If: gives the user the ability to try different permutations and combinations of parts/products/services to understand potential savings opportunities
  • IT OPEX Budget: provides the user with the ability to view budget at the supplier level or by category, cost centre, material code, etc.
  • Set Budget: provides the ability for a user to set and define a budget for a user-defined period
  • Forex Rate: gives the user the option to set the FX rates for various currencies for a defined date range / period, enabling the platform to convert all transactions into the base currency based on the company's defined FX rates
  • Key Management: provides the user with the ability to set distribution keys for spend allocation to business units, departments, functions, etc. to help calculate recharge
  • Project Tracker: provides the user the ability to create projects, such as savings initiatives, and track them in the tool; also provides a workflow for approval of project milestones, such as delivering on savings targets
  • User Management: allows the administrator to add new users and define their access control

And it’s a fairly extensive offering for an organization looking for a services-oriented solution to give them insights out of the box.

No DIY Report Builder

Now, companies looking for a services-oriented spend analysis solution aren’t looking for DIY initially, but as they mature in spend analysis, they will likely want the ability to modify the dashboards and reports on their own, which is baseline DIY. As they continue to mature, a few organizations will eventually want to start building their own reports and views, so it’s important that DIY is on the roadmap for an organization looking to mature in their analytics capability over time.

Limited Mapping Audit Access through the Front End

In the backend, they keep a complete audit trail of how and why every transaction was mapped where it was mapped. In the front end, every edit and amendment made by a user is logged, along with supporting commentary from the user. However, when a user goes to edit or amend a mapping in the front end, she doesn't know whether a transaction was initially mapped by rule, by SpendKey's home-grown self-trained AI, or by Gen-AI, or whether there was ever a human in the loop.

It’s critical that this data be pushed through to the front end because, among other things,

  1. there will always be someone who questions a mapping,
  2. when that happens, you need to know how it was mapped, and
  3. you need to know the ratio of human vs AI mapping in a category for confidence.

As of now, users can reclassify transactions within the tool, so if there is an error, they can push that to the admin or a “parking lot” for review, where, if the admin agrees, it can be pushed straight to the back end.

Showing who, or what, (initially) mapped the data, and why, in the front end is on the roadmap, and hopefully it appears sooner rather than later.

Summary

All-in-all, SpendKey is definitely a solution you should be looking at if you are a mid-market (plus) organization in the UK/Western Europe looking for a services-oriented spend analysis solution to help you analyze your spending and come up with strategies to get it under control.

Dear SaaS Provider, Where’s Your Substance? Being SaaSy is No Longer Enough.

As per our January article, Half a Trillion Dollars will be Wasted on SaaS Spend This Year, and, as per a recent article over on The CFO, CFOs are wising up to the hidden bill attached to SaaS and cloud, which might just be growing faster than the US National Debt (on a per capita basis).

As the CFO article notes, per-employee SaaS subscriptions alone are now costing businesses $2,000 (or more) annually on average, and that’s including ALL employees from the Janitor (who shouldn’t be using any SaaS) to the CEO (who likely doesn’t use any SaaS either and just needs a locally installed PowerPoint license).

To put this in perspective, this says a small company of only 1,000 people is spending $2 MILLION on SaaS (and a mid-size company of 10,000 people is spending $20 MILLION), most of it consumer-grade, and likely a good portion of it through B2B Software Marketplaces because it's easier for AP. If the average salary is 100K with 30K base overhead, that's costing the organization the equivalent of 15 (or 150) people, or a 1.5% increase in workforce, which is substantial for an organization that needs people to grow.
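The arithmetic behind those figures is worth making explicit, using the article's own illustrative numbers:

```python
# Back-of-the-envelope math from the figures above (all illustrative).
employees = 1_000
saas_per_employee = 2_000                       # USD per year, on average
total_saas = employees * saas_per_employee      # $2,000,000 per year

fully_loaded_cost = 100_000 + 30_000            # average salary + base overhead
headcount_equivalent = total_saas / fully_loaded_cost   # ~15 people
workforce_increase = headcount_equivalent / employees   # ~1.5%
```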

And the worst part is that a very significant portion of this spend is overspend or unnecessary spend, with many SaaS auditors and SaaS management specialists finding 33% (or more) overspend as a result of duplicate tools, unused licenses, and sometimes outright zombie subscriptions that just need to be cancelled. Plus, poor management and provisioning lead to unnecessary surcharges that are almost as bad as unused licenses.

There's no excuse for it, and CFOs are not going to put up with it anymore. SaaS Audit and Management tools are going to become a lot more common, and once the zombie subscriptions, unused licenses, and cloud subscriptions are rightsized, when these companies realize they are still spending at least $1,500 per employee on SaaS and cloud, they are going to start grouping tools by function and analyzing value. If there are two tools that do lead management, workforce management, or catalog management, one is going to go. More specifically, the one providing the least value to the organization.

So, dear SaaS Provider, it’s important to ask:

  • what's your substance?
  • how do you provide more hard-dollar value for that substance than your peers?
  • how do you measure it and prove it to the customer?
  • … and how do you make sure you're not the vendor that is cancelled during the audit?

And, dear organization who hasn't done a SaaS audit recently, why haven't you? You're sitting on 30% overspend in a category that, with most of the spend split between departments and hidden on P-Cards and expense reports, is likely running $2,000 per employee and growing daily. You need to do the audit, rightsize your SaaS, and then centralize SaaS management and SaaS acquisition policy. It's not a minor expense; it's a major, business-altering outlay.

Analytics Is NOT Reporting!

We’ve covered analytics, and spend analysis, a lot on this blog, but seeing the surge in articles on analytics as of late, and the large number that are missing the point, it seems we have to remind you again that Analytics is NOT Reporting. (Which, of course, would be clear if anyone bothered to pick up a dictionary anymore.)

As defined by the Oxford dictionary, analytics is the systematic computational analysis of data or statistics and a report is a written account of something that has been observed, heard, done, or investigated. In simple terms, analysis is what is done to identify useful information and reporting is the process of displaying that information in a fancy-shmancy graph. One is useful, one is, quite frankly, useless.

A key requirement of analysis is the ability to do arbitrary systematic computational analysis of data as needed to find the information that you need when you need it. Not just a small set of canned analysis on discrete data subsets that become completely and utterly useless once they are run the first time and you get the initial result — which will NEVER change if the analysis can’t change.

Nor is analysis a random AI application that applies a random statistical algorithm to bubble up, filter out, or generate a random "insight" that may or may not be useful from a Procurement viewpoint. Sometimes an outlier is indicative of fraud or a data error, and sometimes an outlier is just an outlier. Maybe the average transaction value with the services firm is 15,000 for the weekly bill, which makes the 3,000 an outlier, but it's not fraud if the company only needed a cyber-security expert for one day to test a key system; in fact, the insight is useless.

As per our recent post on a true enterprise analytics solution, real analysis requires the ability to explore a hunch and find the answer to any question that pops up when it pops up. To build whatever cube is needed, on whatever dimensions are required, that rolls up data using whatever metrics are required to produce whatever insights are needed to determine if an opportunity is there and if it is worth being pursued. Quickly and cost-effectively in real-time. If you have to wait for a refresh, or spend days doing offline computation in Excel to answer a question that might only save you 20K, you’re not going to do it. (Three days and 6K of your time from a company perspective is not worth a 20K saving if that time spent preparing for a negotiation on a 10M category can save an extra 0.5%, which would equate to 50K. But if you can dynamically build a cube and get an answer in 30 minutes, that 30 minutes is definitely worth it if your hunch is right and you save 20K.)
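The parenthetical cost-benefit argument above reduces to a few lines of arithmetic (the figures are the article's illustrative ones):

```python
# Is a three-day offline analysis worth chasing a 20K hunch? (Illustrative.)
hunch_saving = 20_000
offline_cost = 6_000                      # three days of your time
forgone_prep_gain = 0.005 * 10_000_000    # 0.5% extra on a 10M negotiation = 50K

# Offline: you spend 6K and forgo 50K of negotiation-prep value to chase 20K.
offline_net = hunch_saving - offline_cost - forgone_prep_gain

# Dynamic cube: 30 minutes costs roughly nothing, so the 20K is nearly pure gain.
dynamic_cube_net = hunch_saving
```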

Analysis is the ability to ask “what if” and pursue the answer. Now! Not tomorrow, next week, or next month on the cube refresh, or when the provider’s personnel can build that new report for you. Now! At any time you should be able to ask What if we reclassify the categories so that the primary classification is based on primary material (“steel”) and not usage (“electrical equipment”); What if the savings analysis is done by sourcing strategy (RFX, auction, re-negotiation, etc.) instead of contract value; and What if the risk analysis is done by trade lane instead of supplier or category. Analysis is the process of asking a question, any question, and working the data to get the answer using whatever computations are required. It’s not a canned report.

Analytics is doing, not viewing. And the basics haven’t changed since SI started writing about it, or publishing guest posts by the Old Greybeard himself. (Analytics I, II, III, IV, V, and VI.)

Spendata: A True Enterprise Analytics Solution

As we indicated in our last article, while Spendata is the absolute best at spend analysis, it’s not just a spend analysis platform. It’s a general-purpose data analytics platform that can be used for much more than spend analysis.

The current end-state vision for business data analytics is a “data lake” database with a BI front end. The Big X consultancies (aided and abetted by your IT department, which is only too eager to implement another big system) will try to convince you of the data paradise you’ll have if you dump all of your business data into a data lake. Unfortunately, reality doesn’t support the vision, because organizational data is created only to the extent necessary, never verified, riddled with errors from day one, and left to decay over time as it’s never updated. The data lake is ultimately a data cesspool.

Pointing a BI tool at the (dirty) lake will spice up the data with bars, pies, waves, scatters, multi-coloured geometric shapes, and so on, but you won’t find much insight other than the realization that your data is, in fact, dirty. Worse, a published BI dashboard is like a spreadsheet you can’t modify. Try mapping new dimensions, creating new measures, adding new data, or performing even the simplest modification of an existing dimension or hierarchy, and you’ll understand why this author likes to point out that BI should actually stand for Bullsh!t Images, not Business Intelligence.

So how does a spend analysis platform like Spendata end up being a general-purpose data analytics tool? The answer is that the mechanisms and procedures associated with spend analysis and spend analysis databases, specifically data mapping and dimension derivation, can be taken to the next level — extended, generalized, and moved into real time. Once those key architectural steps are taken, the system can be further extended with view-based measures, shared cubes where custom modifications are retained across refreshes, and spreadsheet-like dependencies and recalculation at database scale.

The result is an analysis system that can be adapted not only to any of the common spend analysis problems, such as AP/PO analysis or commodity-specific cubes with item-level price X quantity data, but also to savings tracking and sourcing and implementation plans. Extending the system to domains beyond spend analysis is simple: just load different data.

The bottom line is that to do real data analysis, no matter what the domain, you need:

  • the ability to extend the schema at any time
  • the ability to add new derived dimensions at any time
  • the ability to change mappings at any time
  • the ability to build derivations, data views, and mappings that are dependent on other derivations, mappings, views, inputs, linked datasets, and so on, with real-time “recalc”
  • the ability to create new views and reports relevant to the question you have … without dumping the data to Excel
  • … and preserve all of the above on cube data refreshes
  • … in your own copy of the cube so you don’t have to wait for anyone to agree
  • … and get an answer today, not on the next refresh next month when you’ve forgotten why you even had the question in the first place

You don’t get any of that from a spend analysis solution, or a BI solution, or a database pointing at a data lake. You only get that in a modern data analysis solution — which supports all of the above, and more, for any kind of data. A data analysis system works equally well across all types of numeric or set-valued data, including, but not limited to sales data, service data, warranty data, process data, and so on.
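To see why "change mappings at any time" matters, here is a toy illustration (emphatically not Spendata's actual mechanism) of a rollup derived from a mapping rather than baked into the cube, so editing the mapping instantly changes every downstream view:

```python
# Toy "derived dimension": the category rollup is recomputed from the
# mapping on demand, so a mapping edit reflows every dependent view
# without re-ingesting the data. All names are invented for illustration.
from collections import defaultdict

def rollup(transactions, mapping):
    totals = defaultdict(float)
    for supplier, amount in transactions:
        totals[mapping.get(supplier, "Unmapped")] += amount
    return dict(totals)

txns = [("Acme Steel", 500.0), ("Volta Inc", 300.0)]

# Classified by usage...
by_usage = rollup(txns, {"Acme Steel": "Electrical Equipment",
                         "Volta Inc": "Electrical Equipment"})

# ...then reclassified by primary material: no re-ingest, just remap.
by_material = rollup(txns, {"Acme Steel": "Steel", "Volta Inc": "Copper"})
```

In a spreadsheet-like dependency model, every view built on this rollup would recalc the same way the moment the mapping changes.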

As Spendata is a real data analysis solution, it supports all of these analyses with a solution that’s easier and friendlier to use than the spreadsheet you use every day. Let’s walk through some examples so you can understand what a data analysis solution really can do.

SALES ANALYSIS

Spending data consists of numerical amounts that represent the price, tax, duty, shipping, etc. paid for items purchased. Sales data consists of numerical amounts that represent the price, tax, duty, shipping, etc. paid for items sold.

They are basically the inverse of each other. For every purchase, there is a sale. For every sale, there is a purchase. So, there's absolutely no reason that you shouldn't be able to apply exactly the same analysis (possibly in reverse) to sales data as you apply to spend data. That is, IF you have a proper data analysis tool. The latter part is the big IF, because if you're using a custom tool that needs to map all data to a schema with fixed semantics, it won't understand the data and you're SOL.

However, since Spendata is a general-purpose data analysis tool that builds and maintains its schema on the fly, it doesn't care if the dataset is spend data or sales data; it's still transactional data and it's happy to analyze away. If you need the handholding of a workflow-oriented UI, that can also be configured out of the box using Spendata's new "app" capability.

Here are three types of sales analysis that Spendata supports better than CRM/Sales Forecasting systems, and that can’t be done at all with a data lake and a BI tool.

Sales Discount Variation Analysis Over Time By Salesperson … and Client Type

You run a sales team. Are your different salespeople giving the same mix of discounts by product type to the same types of customers by customer size and average sales size?

Sounds easy, right? Can't you simply plot the product/price ratio by month by salesperson in a bubble chart (where volume correlates to bubble size) against the average trend line and calculate which salespeople are off the most (in the wrong direction)? Sure, but how do you handle client type? You could add a "color" dimension, but when the bubbles overlap and blur, can you see it visually? Not likely. And how do you remember that a low-sales-volume customer is a strategic partner, and so has a special deal? Theoretically you could add another column to the table "Salesperson, Product/Price Ratio, Client Type, Over/Under Average", and that would work as long as you could pre-compute the average discount by Product/Price Ratio and Client Type.

And then you realize that unless you group by category, you have entirely different products in the same product/price ratio and your multi-stage analysis is worthless. So you have to go back and start again, only to find out that the bubble chart is only pseudo-useful, as you can't really figure it out visually (what is that shade of pink from the multiple red and white bubbles overlapping: Fuchsia, Bright, or Barbie? And what does it mean?), and you will have to focus on the fixed table to extract any value at all from the analysis.

But then you'll realize that you still need to see monthly variations in the chart, meaning you want the ability to drag a slider or change the month and have the bubble chart update. Uh-oh, you forgot to individually compute all the amounts by month or select the slider graph! Back to square one, doing it all over again by month. Then you notice some customers have long-term fixed prices on some products, which messes up the average discount on those products, as the prices for these customers are not changing over time. You redo the work for the third (or is it the fourth?) time, and then you realize that your definitions of client type "large, medium, and small" are slightly off, as a client that should be in large is in medium and two that should be in small were made medium. Aaarrrggghhh!!!

But with Spendata, you simply create or modify dimensions in the cube to segment the data (customer type, product groups, etc.). You leverage a dynamic view-based measure by customer type to set the average prices per time period (used to calculate the discount). You then use filters to define the time range of interest, another view with filters to click through the months over time, a derived view to see the performance by quarter, and another by year. If you change the definition of client type (which customers belong to which client type), which products are fixed-price for which customers, which SKUs are the same type, the time range of interest, etc., you simply remap them and the entire analysis auto-updates.

This flexibility and power (with no wasted effort) gives you a very deep analysis capability NOT available in any other data analysis platform. For example, you can find out with a few clicks that your "best" salesperson in terms of giving the lowest average discount is actually costing you the most. It turns out he's not serving any large customers (who get good discounts) and has several fixed-price contracts (which mess up the average discounts). So the discounts he's giving the small clients, while less than what large customers get, are significantly more than what other salespeople provide to other small customers. This is something you'd never know without the power of Spendata, as your data consultant would have given up on the variance analysis at the global level because the salesman's overall ratio looked good.
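The essence of the segmented comparison above can be shown in plain Python (all field names invented): the average discount a salesperson is judged against is computed per (client type, product group) segment, not globally.

```python
# Compare each sale's discount to its segment average, not the global
# average. Illustrative sketch only, not any vendor's implementation.
from collections import defaultdict

def discount_vs_segment(sales):
    sums = defaultdict(lambda: [0.0, 0])
    for s in sales:
        key = (s["client_type"], s["product_group"])
        sums[key][0] += s["discount"]
        sums[key][1] += 1
    avg = {k: total / n for k, (total, n) in sums.items()}
    return [dict(s, vs_segment=round(
                s["discount"] - avg[(s["client_type"], s["product_group"])], 4))
            for s in sales]

# Toy usage: rep B over-discounts small clients; rep C's large-client
# discount is normal for that segment even though it's the biggest number.
sales = [
    {"rep": "A", "client_type": "small", "product_group": "widgets", "discount": 0.10},
    {"rep": "B", "client_type": "small", "product_group": "widgets", "discount": 0.20},
    {"rep": "C", "client_type": "large", "product_group": "widgets", "discount": 0.30},
]
result = discount_vs_segment(sales)
```

A global average (0.20 here) would flag rep C and excuse rep B; the segmented view does the opposite, which is exactly the "best salesperson is costing you the most" effect described above.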

Post-Merger White-Space Analysis

White-space sales analysis looks for spaces in the market where you should be selling but are not. For example, if you sell to restaurants, you could look at your sales by geography, normalized by the number of establishments of each type, or by the sales of restaurants of each type, in that geography. In a merger, you could measure your penetration at each customer for each of the original companies. You can find white space by looking at each customer (or customer segment) and measuring revenue per customer employee across the two companies. Where is one more effective than the other?

You might think this is no big deal because this was theoretically done during the due diligence and the opportunity for overlap was deemed to be there, as well as the opportunity for whitespace, and whatever was done was good enough. The reality couldn’t be further from the truth.

If the whitespace analysis was done with a standard analytics tool, it has all the following problems:

  • matching vendors were missed due to different name entries and missing IDs
  • vendors were not familied by parent (within industry, geography, etc.)
  • the improperly merged vendors were only compared against a target file built by the consultants, which itself missed vendors
  • in short: it’s poor, but no worse than you’d do with any traditional analytics tool

But with Spendata, these problems would be at least minimized, if not eliminated because:

  • Spendata comes with auto-matching capability
  • … that can be used to enrich the suppliers with NAICS categorization (for example)
  • Spendata comes with auto-familying capability so parent-child relationships aren’t missed
  • Spendata can load all of the companies from a firmographic database with their NAICS codes in a separate cube …
  • … and then federation can be used to match the suppliers in use with the suppliers in the appropriate NAICS category for the white space analysis

It’s thus trivial to

  1. load up a cube with organization A’s sales by supplier (which can be the output from a view on a transaction database), and run it through a view that embeds a normalization routine so that all records that actually correspond to the same supplier (or parent-child where only the parent is relevant) are grouped into one line
  2. load up a cube with organization B’s sales by supplier and do the same … and now you know you have exact matches between supplier names
  3. load up the NAICS code database – which is a list of possible customers
  4. build a view that pulls in, for each supplier in the NAICS category of interest, Org A spend, Org B Spend, and Total Spend
  5. create a filter to only show zero spend suppliers — and there’s the whitespace … 100% complete. Now send your sales teams after these.
  6. create a filter to show where your sales are less than expected (e.g. compared to similar customers, or to Org A vs. Org B). This is additional whitespace where upselling or further customer penetration is appropriate.
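The mechanics of steps 1 through 5 can be sketched in a few lines of plain Python (the supplier names, amounts, and the crude normalization rule below are all invented for illustration; Spendata’s actual matching and familying are far more sophisticated):

```python
# Hedged sketch of the whitespace steps: normalize supplier names,
# combine Org A and Org B spend, then filter the NAICS prospect list
# for zero-spend entries -- the whitespace.

def normalize(name):
    """Crude stand-in for auto-matching/familying: fold case, strip suffixes."""
    return name.lower().replace(", inc.", "").replace(" ltd", "").strip()

org_a = {"Acme, Inc.": 120000.0, "Bolt Ltd": 40000.0}    # Org A sales by supplier
org_b = {"ACME, INC.": 65000.0, "Crate Ltd": 15000.0}    # Org B sales by supplier
prospects = ["Acme, Inc.", "Bolt Ltd", "Crate Ltd", "Dynamo Ltd"]  # NAICS category list

# Merge both books under normalized names (exact matches now line up).
spend = {}
for book in (org_a, org_b):
    for name, amount in book.items():
        spend[normalize(name)] = spend.get(normalize(name), 0.0) + amount

# Zero-spend prospects are the whitespace.
whitespace = [p for p in prospects if spend.get(normalize(p), 0.0) == 0.0]
print(whitespace)  # → ['Dynamo Ltd']
```

Step 6 is the same filter with a threshold (spend below what comparable customers yield) instead of a strict zero.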

Bill Rate Analysis

A smart company doesn’t just analyze their (total) spend by service provider; they analyze it by service role, and against the service-role average, when different divisions/locations are contracting for the same service that should be fulfilled by a professional with roughly the same skills and same experience level. Why? Because if you’re paying, on average, 150/hr for an intermediate DBA across 80% of locations and 250/hr across the remaining 20%, you’re paying as much as 67% too much at those remaining locations, the exception being cities like San Francisco or New York, where your service provider has to pay their locals a cost-of-living top-up just so they can afford to live there.
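The arithmetic behind that claim is worth making explicit. A minimal sketch (the roles, locations, and rates below are invented for illustration) that flags locations paying well over the benchmark rate for the same role:

```python
# Illustrative arithmetic only (rates invented): flag locations paying
# well above the benchmark rate contracted for the same role elsewhere.

rates = {  # (role, location) -> contracted hourly rate
    ("Intermediate DBA", "Austin"): 150.0,
    ("Intermediate DBA", "Denver"): 150.0,
    ("Intermediate DBA", "Boston"): 250.0,
}

benchmark = 150.0   # rate paid at 80% of locations
threshold = 0.10    # flag anything more than 10% over benchmark

flagged = []
for (role, loc), rate in rates.items():
    premium = rate / benchmark - 1.0   # 250/150 - 1 = 0.67 overpayment
    if premium > threshold:
        flagged.append((role, loc, round(premium, 2)))

print(flagged)  # → [('Intermediate DBA', 'Boston', 0.67)]
```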

By the same token, a smart service company is analyzing what they are getting by role, location, and customer and trying to identify the customers that are (the most) profitable and those that are the least (or unprofitable when you take contract size or support requirements into account), so they can focus on those customers that are profitable, and, hopefully, keep them happy with their better talent (and not just the newest turkey on the rafter).

However, just like sales discount variation analysis over time by client type, this is tough, as it’s essentially a variation of that analysis, except you are looking at services instead of products, roles instead of client types, and customers instead of sales reps … and then, for your problem clients, looking at which service reps are responsible … so after you do the base analysis (using dynamic view-based measures), you’re creating new views with new measures and filters to group by service rep and filter to those too far beyond a threshold. In any other tool, it would be nigh impossible for even an expert analyst. In Spendata, it’s a matter of minutes. Literally.

And this is just the tip of the iceberg in terms of what Spendata can do. In a future article, we’ll dive into a few more areas of analysis that require very specialized tools in different domains, but which can be done with ease in Spendata. Stay tuned!

CF Suite for your Consumer-Friendly Source-to-Contract Needs

Founded in 2004 to help public and private sector companies save money through reverse auctions, the Curtis Fitch solution has since expanded into a source-to-contract procurement platform, which includes extensive supplier onboarding and evaluation, performance management, contract lifecycle management, and spend analysis. Curtis Fitch offers the following capabilities in its solution.

Supplier Insight

CF Supplier Insights is their supplier registration, onboarding, information, and relationship management solution. It supports the creation and delivery of customized questionnaires, which can be associated with organizational categories anywhere in the supported 4-level hierarchy, so that suppliers are only asked to provide the information the organization needs for their qualification. You can track insurance and key certification requirements, with due dates for auto-reminders, to enable suppliers to self-serve. Supplier Insights offers task-oriented dashboards to help a buyer or evaluator focus on what needs to be done.

The supplier management module presents supplier profiles in a clear, easy-to-view way, showing company details, registration audit, location, contact information, etc. You can quickly view an audit trail of any activity the supplier is linked to in CF Suite, including access to onboarding questionnaires, insurance and certification documents, events they were involved in, quotes they provided, contracts that were awarded, categories they are associated with, and balanced scorecards.

When insurance and certifications are requested, so is the associated metadata like coverage, award date, expiry date, and insurer/granter. This information is monitored, and both the buyer and supplier are alerted when the expiration date is approaching. The system defines default metadata for all suppliers, but buyers can add their own fields as needed.

It’s easy to search for suppliers by name, status, workflow stage, and location, or simply scan through them by name. The buyer can choose to “hide” suppliers that have not completed the registration process and they will not be available for sourcing events or contracting.

e-Sourcing

CF eSourcing is their sourcing project management and RFx platform where a user can define event and RFx templates, create multi-round sourcing projects, evaluate the responses using weighted scoring and multi-party ratings, define awards, and track procurement spend against savings. Also, all of the metadata is available for scorecards, contracting, and event creation, so if a supplier doesn’t have the necessary coverage or certification, the supplier can be filtered out of the event, or the buyer can proactively ensure they are not invited.

Events can be created from scratch but are usually created from templates to support standardization across the business. An RFx template can define stakeholders, suppliers (or categories), and any sourcing information, including important documentation. In addition, a procurement workplan can be designed to reflect any sign-off gates necessary to support the public sector requirements some buying organizations must adhere to.

Building RFx templates is easy, and there’s a variety of question styles available, depending on the response required from the vendor (i.e. free text, multi-choice, file upload, financial, etc.). RFxs can be built by importing question sets, linking to supplier onboarding information, or via a template. The tool offers tender evaluation with auto-weighting and scoring functionality (based on values or pre-defined option selections). Their clients’ buyers can invite stakeholders to evaluate a tender, and what each evaluator scores can be pre-defined. In addition, when it comes to RFQs for gathering quotes, it supports total cost breakdowns and arbitrary formulas. Supplier submissions and quotes can be exported to Excel, including any supplier documents.
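As an illustration of how weighted tender scoring of this kind typically works (the weights, the lowest-bid price formula, and the supplier data below are assumptions for the sketch, not Curtis Fitch’s actual algorithm):

```python
# Hypothetical weighted tender scoring: evaluator-scored sections plus
# an auto-scored price question, combined by pre-defined weights.

weights = {"quality": 0.4, "delivery": 0.2, "price": 0.4}

def auto_price_score(bid, lowest):
    """A common auto-scoring rule (an assumption): lowest bid gets full marks."""
    return lowest / bid * 100.0

suppliers = {
    "Supplier A": {"quality": 80.0, "delivery": 90.0, "bid": 100000.0},
    "Supplier B": {"quality": 70.0, "delivery": 85.0, "bid": 90000.0},
}
lowest = min(s["bid"] for s in suppliers.values())

totals = {}
for name, s in suppliers.items():
    totals[name] = round(
        weights["quality"] * s["quality"]
        + weights["delivery"] * s["delivery"]
        + weights["price"] * auto_price_score(s["bid"], lowest),
        1,
    )

print(totals)  # → {'Supplier A': 86.0, 'Supplier B': 85.0}
```

Note how the weighting matters: Supplier B wins on price, but Supplier A’s quality and delivery scores carry it to the higher weighted total.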

The one potential limitation is that there is not a lot of built-in analysis or side-by-side price comparison in Sourcing, as most buyers prefer to either do their analysis in Excel or use custom dashboards in analytics.

In addition, e-Sourcing events can be organized into projects that can not only group related sourcing events, and provide an overarching workflow, but can also be used to track actuals against the historical baseline and forecasted actuals for a realized savings calculation.

e-Auctions

CF Suite also includes CF Auctions. Four auction styles are available for running both forward and reverse auctions: English, Sequential, Dutch, and Japanese, all of which can be executed and managed in real time. Auctions are easy to define and very easy for the buying organization to monitor, as buyers can see the current bid for each supplier alongside baseline and target information that is hidden from the suppliers, allowing them to track progress against not only starting bids but also goals, and to see a real-time evaluation of the benefit associated with a bid.

Suppliers get easy to use bidding views, and depending on the settings, suppliers will either see their current rank or distance from lowest bid and can easily update their submissions or ask questions. Buyers can respond to suppliers one-on-one or send messages to all suppliers during the auction.

In addition, if something goes wrong, buyers can manage the event in real time and pause it, extend it, change owners, change supplier reps, and so on to ensure a successful auction.

Contract Management

CF Contracts is their contract management module, which enables procurement to build high-churn contracts with limited or no clause changes, for example, NDAs or Terms of Service. CF Contracts has a clause library, workflow for internal sign-off, and integrated redline tracking. Procurement can negotiate with suppliers through the tool, and once a contract has been drafted in CF Suite, the platform can be used to track versions, see redlines, accept a version for signing, and manage the e-Signature process. If CF Suite was used for sourcing and a contract is awarded off the back of an event, the contract can be linked with the award information from the sourcing module.

Most of their clients focus on using contracts as a central contract repository database to improve visibility of key contract information, and to feed into reporting outputs to support the management of the contract pipeline, including contract spend and contract renewals.

The contract database includes a pool of common fields (e.g. contract title, start and end dates, contract values, etc.), and their clients can create custom fields to ensure the contract records align with their business data. Buyers can create automated contract renewal alerts that can be shared with the contract manager, business stakeholders, or the contract management team, as one would expect from a contract management module.

Supplier Scorecards

CF Scorecards is their compliance, risk, and performance management solution that collates ongoing supplier risk management information into a central location. CF Suite uses all of this data to create a 360-degree supplier scorecard for managing risk, performance, and development on an ongoing basis.

The great thing about scorecards is that you can select the questionnaires and third-party data you want to include, define the weightings, define the stakeholders who will be scoring the responses that can’t be auto-scored, and get a truly custom 360-degree scorecard on risk, compliance, and/or performance. You can attach associated documents, contracts, supplier onboarding questionnaires, third party assessments, and audits as desired to back up the scorecard, which provides a solid foundation for supplier performance, risk, and compliance management and development plan creation.

Data Analytics

Powered by Qlik, CF Analytics provides out-of-the-box dashboards and reports to help analyze spend, manage contract pipelines and lifecycles, track supplier onboarding workflow and status, and manage ongoing supplier risk. Client organizations can also create their own dashboards and reports as required, or Curtis Fitch can create additional dashboards and reports for the client on implementation. Curtis Fitch has API integrations available as standard for those clients that wish to analyse data in their preferred business intelligence tool, like Power BI or Tableau.

The out-of-the-box dashboards and reports are well designed and take full advantage of the Qlik tool. The process management, contract/supplier status dashboard, and performance management dashboards are especially well thought out and designed. For example, the project management dashboard will show you the status of each sourcing project by stage and task, how many tasks are coming due and overdue, the total value of projects in each stage, and so on. Other process-oriented dashboards for contracts and supplier management are equally well done. For example, the contract management dashboard allows you to filter in by supplier category, or contract grouping and see upcoming milestones in the next 30 days, 60 days, and 90 days as well as overdue milestones.

The spend dashboards include all the standard dashboards you’d expect in a suite, and they are very easy to use, with built-in filtering capability to quickly drill down to the precise spend you are interested in. The only downside is that they are OLAP-based, and updates are daily. However, Curtis Fitch is considering adding support for one or more best-of-breed spend analysis platforms for those that want more advanced analytics capability.

Overall

It’s clear that the Curtis Fitch platform is a mature, well-thought-out, fleshed-out source-to-contract platform for indirect and direct services in both the public and private sectors, and a great solution not only for the global FTSE 100 companies they support, but also for the mid-market and enterprise markets. It’s also very likely to be adopted, a key factor for success, because, as we pointed out in our headline, it’s very consumer friendly. While the UI design might look a bit dated (just like the design of Sourcing Innovation), it was designed that way because it’s extremely usable and, thus, very consumer friendly.

Curtis Fitch have an active roadmap, following development best practices, alongside scoping workshops, where they partner with their clients to ensure new features and benefits are based on user requirements. Many modern applications with flashy UIs, modern hieroglyphs, and text-based conversational interfaces might look cool, but at the end of the day sourcing professionals want to get the job done and don’t want to be blinded by vast swathes of functionality when looking for a specific feature. Procurement professionals want a well-designed, intuitive, guided workflow, a ‘3-clicks and I’m there’ style application that will get the job done efficiently and effectively. This is what CF Suite offers.

Conclusion

While there are some limitations in award analysis (as most users prefer to do that in Excel) and analytics (as it’s built on Qlik Sense), and not a lot of functionality that is truly unique compared to the market overall, it is one of the broadest and deepest mid-market+ suites out there and can provide a lot of value to a lot of organizations. In addition, Curtis Fitch also offers consulting and managed auction/RFx services, which can be very helpful to an understaffed organization, as it can get some staff augmentation / event support while also having full visibility into the process and the ability to take over fully when ready. If you’re looking for a tightly integrated, highly usable, easily adopted Source-to-Contract platform with more contract and supplier management ability than you might expect, include CF Suite in the RFP. It’s certainly worth an investigation.