
GlobalTrade Tackled Procurement 2024 Before McKinsey, But Their Suggestions Weren’t that Innovative, Part II

As per Part 1, the doctor ignored this article over on GlobalTrade Magazine on 10 Innovative Approaches to Enhance Procurement Efficiency in 2024 because the approaches weren’t all that innovative, and the article, while professionally written, clearly wasn’t written by a Procurement Professional, as most of the recommendations were so basic even ChatGPT could likely have produced something equally good (gasp!). He’s only covering it because one recommendation had the potential to be the most innovative recommendation of the year (because no one is recommending it), had the author gotten it right (and approached it the right way).

However, since we covered and analyzed the McKinsey recommendations in great detail in a four-part series over the past two weeks, we will be fair and give GlobalTrade their due. In this two-part article, we’ll quickly discuss each recommendation one-by-one to make it clear most of the suggestions really weren’t innovative. In fact, the one recommendation that is innovative wasn’t even described in the one way that makes it innovative. But since it did remind the doctor of one thing many of the recommendation articles were missing, this gives us another reason to cover it and use it as an example of why you need to seek out advice written by the experts, or at least people who live Procurement and/or Procurement Tech day-in-and-day-out.

6. Use AI to Review Process.

Uhm, NO! Use analytics and automation, not AI! And use traditional process analysis tools to identify where you are spending the most (and possibly too much) time.

7. Try New Inventory Software.

And if everything written to this point wasn’t a dead giveaway this article wasn’t written by a Procurement Pro, this is. First of all, inventory is operations / supply chain & logistics, not Procurement. Secondly, it’s not new inventory software, it’s e-Procurement software that can integrate with the inventory management system to determine if a request should be (re)allocated from inventory or ordered from a nearby supplier (using a pre-approved catalog item). (Heck, the author couldn’t even get the market size increase right — it’s 4.9 Billion according to the linked study, not 4.9 million! And if you’re interested in the Procurement market, Technavio, owned by Infiniti Research, is NOT one of the leading analyst firms in the Procurement Market.)

8. Formalize the Procurement Process.

How non-innovative can you get? Are there any organizations still in business at this point who have NOT formalized the process? It’s no longer formalize, it’s SaaS-back and automate as much as possible!

9. Strategize Market Analysis.

Would any Procurement department doing market analysis really be doing it off the cuff? Uhm, no! It’s not strategize, it’s automate — implement platforms that automatically collect, track, analyze, report on changes and provide predictions on costs, availability, risk, and other important pieces of information.

10. Reassess Cost Evaluation.

This is the ONE prediction that could have been the most innovative prediction this year if thought through and presented properly. The author noted that many companies are not looking at the total acquisition cost and indicated that buyers should look at this, as well as usage costs and even disposal costs, getting into total cost of ownership (TCO) territory — you know, the concept we’ve been talking about here on SI since we started in 2006!

However, in today’s economy, TCO is no longer enough, and you have to move onto the next generation of what we have been calling TVM: Total Value Management since 2007! The root of TVM was that total cost of ownership is not enough when the end goal of every product or service obtained is about value, and value goes beyond pure cost elements and includes bundled services, controlled and understood risk, and brand recognition.

So cost evaluation needs to factor that in as well, but often that’s not enough anymore either. It’s not just supply or stability risk, it’s regulatory compliance. It’s not just product cost, but carbon cost. It’s not just brand recognition, it’s brand risk if your suppliers are using slave labour, polluting the environment with carcinogens, or finding new and inventive ways to be truly evil. It’s also not just today’s price, it’s tomorrow’s price. If the product relies on a raw material currently getting scarcer by the day, can you find an alternative that doesn’t need that material, or needs less of it? And so on. Cost evaluation is not just cost alone anymore. And any organization that takes the next step here will be truly innovative.

Now, in all fairness, the doctor should point out that the article’s recommendations could be considered innovative if the organization didn’t have a Procurement department, but in today’s economic environment, unless it had a monopolistic stranglehold on a market, the doctor doesn’t see how a company of any size without a proper Procurement function could still be in operation.

Anyway, that’s all, folks!

GlobalTrade Tackled Procurement 2024 Before McKinsey, But Their Suggestions Weren’t that Innovative, Part I

Except for one suggestion, and only if you interpreted it the right way. But let’s back up.

the doctor ignored this article over on GlobalTrade Magazine on 10 Innovative Approaches to Enhance Procurement Efficiency in 2024 because the approaches weren’t all that innovative, and the article, while professionally written, clearly wasn’t written by a Procurement Professional, as most of the recommendations were so basic even ChatGPT could likely have produced something equally good (gasp!).

However, since we covered and analyzed the McKinsey recommendations in great detail in a four-part series over the past two weeks, we will be fair and give GlobalTrade their due. In this two-part article, we’ll quickly discuss each recommendation one-by-one to make it clear most of the suggestions really weren’t innovative. In fact, the one recommendation that is innovative wasn’t even described in the one way that makes it innovative. But since it did remind the doctor of one thing many of the recommendation articles were missing, this gives us another reason to cover it and use it as an example of why you need to seek out advice written by the experts, or at least people who live Procurement and/or Procurement Tech day-in-and-day-out.

1. Consolidate Various Supplier Lists.

Is this 1984? This was advice you’d expect to see when Jack Welch started revolutionizing Procurement at GE in the 80s, which gave rise to the first sourcing and procurement platforms in the 90s (like FreeMarkets Inc. that was started by Meakem in ’95 after leaving GE to productize what he learned). Today, the advice should be to upgrade to a modern supplier management 360 platform that consolidates all of your suppliers and their associated information including, but not limited to, complete corporate profile, insurance and compliance, risk, sustainability/ESG/Scope 3, and any other information you need to do business with the supplier.

2. Conduct Frequent Educational Courses.

This is best practices 101 for any critical discipline within your organization, not just Procurement, and it’s relevant both for the team and for the people who need to interact with / depend on the team and/or use Procurement’s systems. Plus, overworked and overstressed professionals will learn better with frequent short courses (that they can put into practice) vs. a once-a-year cram session. The best advice here is to conduct frequent, specialized courses on key systems and processes by role, and archive the materials online for easy access for refreshers as needed.

3. Work on Supplier Relationships.

Supplier Relationship Management is Procurement 101 for strategic suppliers and has been for two decades. Nothing to learn here. Except make sure your modern Supplier Management 360 platform can support your supplier relationship management activities by tracking performance, agreed upon development plans, synchronous and asynchronous activities between all parties, etc.

4. Review Expectations with Suppliers.

Isn’t this part of supplier relationship management? Which, as we just discussed, is something you should have been doing since day 1. The advice here should be to make sure your modern Supplier Management 360 portal contains all of the agreements, milestones, orders, delivery dates, real-time performance data, development plans, and other elements that define supplier expectations.

5. Remain Open to Solutions of All Sizes.

While not very innovative, especially as written, this was the only other suggestion that Procurement departments need to hear. Consumer spending is flat or falling. Investment money has slowed to a trickle. Inflation is back with a vengeance, and budgets are being slashed to the bone. So you should be open to solutions of all sizes, especially when it comes to:

  • supplier management
  • process management
  • software / SaaS platforms
  • consulting

And especially SaaS platforms and consulting. If you haven’t looked for a solution to process / problem X in the past decade because it was too expensive, look again. When spend analysis first hit the market, it was a Million Dollar solution for software and services. A few years later, when BIQ hit the scene, you got more power and more value identified for 1/10 of the cost, and low six figures bought you a full enterprise license and enough services to identify a year’s worth of opportunities. Then, a decade later, when Spendata hit the scene, a mid-market company could get a full enterprise license for a core analytics team of 5 for $14,000 a year, and for another $10,000, get enough training and guidance to use the software themselves to identify a year’s worth of opportunities from built-in templates and standard analyses. The same holds for any application you can think of — for any module you could want, someone has a SaaS mid-market solution for 2K to 3K a month, not the 20K to 30K you would have paid a decade ago.

And for consulting, you don’t need a Big X where you have to hire a team at rates starting at 4K a day for the recent grad. You can hire an expert from a mid-market niche who is powered by the right tech who can do the work of an entire team for 6K a day — which is less than the Big X charges for the project manager who adds no value to your project.

We’ll tackle the next 5 in Part II.

Spendata: A True Enterprise Analytics Solution

As we indicated in our last article, while Spendata is the absolute best at spend analysis, it’s not just a spend analysis platform. It’s a general-purpose data analytics platform that can be used for much more than spend analysis.

The current end-state vision for business data analytics is a “data lake” database with a BI front end. The Big X consultancies (aided and abetted by your IT department, which is only too eager to implement another big system) will try to convince you of the data paradise you’ll have if you dump all of your business data into a data lake. Unfortunately, reality doesn’t support the vision, because organizational data is created only to the extent necessary, never verified, riddled with errors from day one, and left to decay over time as it’s never updated. The data lake is ultimately a data cesspool.

Pointing a BI tool at the (dirty) lake will spice up the data with bars, pies, waves, scatters, multi-coloured geometric shapes, and so on, but you won’t find much insight other than the realization that your data is, in fact, dirty. Worse, a published BI dashboard is like a spreadsheet you can’t modify. Try mapping new dimensions, creating new measures, adding new data, or performing even the simplest modification of an existing dimension or hierarchy, and you’ll understand why this author likes to point out that BI should actually stand for Bullsh!t Images, not Business Intelligence.

So how does a spend analysis platform like Spendata end up being a general-purpose data analytics tool? The answer is that the mechanisms and procedures associated with spend analysis and spend analysis databases, specifically data mapping and dimension derivation, can be taken to the next level — extended, generalized, and moved into real time. Once those key architectural steps are taken, the system can be further extended with view-based measures, shared cubes where custom modifications are retained across refreshes, and spreadsheet-like dependencies and recalculation at database scale.

The result is an analysis system that can be adapted not only to any of the common spend analysis problems, such as AP/PO analysis or commodity-specific cubes with item-level price X quantity data, but also to savings tracking and sourcing and implementation plans. Extending the system to domains beyond spend analysis is simple: just load different data.

The bottom line is that to do real data analysis, no matter what the domain, you need:

  • the ability to extend the schema at any time
  • the ability to add new derived dimensions at any time
  • the ability to change mappings at any time
  • the ability to build derivations, data views, and mappings that are dependent on other derivations, mappings, views, inputs, linked datasets, and so on, with real-time “recalc”
  • the ability to create new views and reports relevant to the question you have … without dumping the data to Excel
  • … and preserve all of the above on cube data refreshes
  • … in your own copy of the cube so you don’t have to wait for anyone to agree
  • … and get an answer today, not on the next refresh next month when you’ve forgotten why you even had the question in the first place

You don’t get any of that from a spend analysis solution, or a BI solution, or a database pointing at a data lake. You only get that in a modern data analysis solution — which supports all of the above, and more, for any kind of data. A data analysis system works equally well across all types of numeric or set-valued data, including, but not limited to, sales data, service data, warranty data, process data, and so on.
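To make the refresh-and-recalc requirement concrete, here is a toy sketch in pandas (this is not Spendata’s actual API — the table, mapping, and names are all hypothetical): a derived dimension is just a mapping applied to the data, and any view built on that dimension should update the moment the mapping changes, without rebuilding anything.

```python
import pandas as pd

# Toy transaction cube: the schema can be extended at any time
tx = pd.DataFrame({
    "supplier": ["Acme", "Acme Corp", "Beta LLC"],
    "amount":   [100.0, 250.0, 75.0],
})

# A mapping (supplier -> family) is just data; changing it re-derives everything
family_map = {"Acme": "Acme", "Acme Corp": "Acme", "Beta LLC": "Beta"}

def recalc(tx, family_map):
    """Re-derive the 'family' dimension and the dependent per-family measure."""
    view = tx.assign(family=tx["supplier"].map(family_map))
    return view.groupby("family")["amount"].sum()

print(recalc(tx, family_map))   # Acme 350.0, Beta 75.0

# The analyst remaps Beta under Acme; the dependent view updates on recalc
family_map["Beta LLC"] = "Acme"
print(recalc(tx, family_map))   # Acme 425.0
```

In a spreadsheet this dependency tracking is automatic but doesn’t scale; in a BI dashboard the mapping is frozen at publish time. The point of the list above is getting both properties at once, at database scale.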

As Spendata is a real data analysis solution, it supports all of these analyses with a solution that’s easier and friendlier to use than the spreadsheet you use every day. Let’s walk through some examples so you can understand what a data analysis solution really can do.

SALES ANALYSIS

Spending data consists of numerical amounts that represent the price, tax, duty, shipping, etc. paid for items purchased. Sales data consists of numerical amounts that represent the price, tax, duty, shipping, etc. paid for items sold.

They are basically the inverse of each other. For every purchase, there is a sale. For every sale, there is a purchase. So, there’s absolutely no reason that you shouldn’t be able to apply exactly the same analysis (possibly in reverse) to sales data as you apply to spend data. That is, IF you have a proper data analysis tool. The latter part is the big IF, because if you’re using a custom tool that needs to map all data to a schema with fixed semantics, it won’t understand the data and you’re SOL.

However, since Spendata is a general-purpose data analysis tool that builds and maintains its schema on the fly, it doesn’t care if the dataset is spend data or sales data; it’s still transactional data and it’s happy to analyze away. If you need the handholding of a workflow-oriented UI, that can also be configured out of the box using Spendata’s new “app” capability.

Here are three types of sales analysis that Spendata supports better than CRM/Sales Forecasting systems, and that can’t be done at all with a data lake and a BI tool.

Sales Discount Variation Analysis Over Time By Salesperson … and Client Type

You run a sales team. Are your different salespeople giving the same mix of discounts by product type to the same types of customers by customer size and average sales size?

Sounds easy, right? Can’t you simply plot the product/price ratio by month by salesperson in a bubble chart (where volume correlates to bubble size) against the average trend line and calculate which salespeople are off the most (in the wrong direction)? Sure, but how do you handle client type? You could add a “color” dimension, but when the bubbles overlap and blur, can you see it visually? Not likely. And how do you account for a low-sales-volume customer that is a strategic partner and thus has a special deal? Theoretically you could add another column to the table “Salesperson, Product/Price Ratio, Client Type, Over/Under Average”, and that would work as long as you could pre-compute the average discount by Product/Price Ratio and Client Type.

And then you realize that unless you group by category, you have entirely different products in the same product/price ratio and your multi-stage analysis is worthless, so you have to go back and start again, only to find out that the bubble chart is only pseudo-useful (you can’t really figure it out visually, because what is that shade of pink from the multiple red and white bubbles overlapping — Fuchsia, Bright, or Barbie — and what does it mean?) and you will have to focus on the fixed table to extract any value at all from the analysis.

But then you’ll realize that you still need to see monthly variations in the chart, meaning you want the ability to drag a slider or change the month and have the bubble chart update. Uh-oh, you forgot to individually compute all the amounts by month or select the slider graph! Back to square one, doing it all over again by month. Then you notice some customers have long-term, fixed prices on some products, which messes up the average discount on these products as the prices for these customers are not changing over time. You redo the work for the third (or is it the fourth?) time, and then you realize that your definitions of client type “large, medium, and small” are slightly off, as a client that should be in large is in medium and two that should be in small were made medium. Aaarrrggghhh!!!

But with Spendata, you simply create or modify dimensions in the cube to segment the data (customer type, product groups, etc.). You leverage a dynamic view-based measure by customer type to set the average prices per time period (used to calculate the discount). You then use filters to define the time range of interest, another view with filters to click through the months over time, a derived view to see the performance by quarter, and another by year. If you change the definition of client type (which customers belong to which client type), which products for which customers are fixed-price, which SKUs are the same type, the time range of interest, etc., you simply remap them and the entire analysis auto-updates.

This flexibility and power (with no wasted effort) gives you a very deep analysis capability NOT available in any other data analysis platform. For example, you can find out with a few clicks that your “best” salesperson in terms of giving the lowest average discount is actually costing you the most. Turns out, he’s not serving any large customers (who get good discounts) and has several fixed price contracts (which mess up the average discounts). So, the discounts he’s giving the small clients, while less than what large customers get, are significantly more than what other salespeople provide to other small customers. This is something you’d never know if you didn’t have the power of Spendata as your data consultant would give up on the variance analysis at the global level because the salesman’s overall ratio looked good.
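The shape of this analysis can be sketched in a few lines of pandas (hypothetical data and column names, not Spendata’s mechanics): exclude fixed-price lines so they don’t distort the averages, compute each line’s discount against its own segment average (the “view-based measure”), then aggregate by rep to see who over-discounts within their segment.

```python
import pandas as pd

# Hypothetical sales lines; columns are illustrative only
sales = pd.DataFrame({
    "rep":         ["Ann", "Ann", "Bob", "Bob", "Bob"],
    "client_type": ["small", "small", "small", "large", "small"],
    "list_price":  [100, 100, 100, 100, 100],
    "sold_price":  [95, 96, 80, 70, 82],
    "fixed_price": [False, False, False, False, True],
})

# Exclude fixed-price contracts so they don't distort the segment averages
s = sales[~sales["fixed_price"]].copy()
s["discount"] = 1 - s["sold_price"] / s["list_price"]

# Segment average discount, broadcast back to each line
seg_avg = s.groupby("client_type")["discount"].transform("mean")
s["vs_segment"] = s["discount"] - seg_avg

# Who over-discounts relative to their own segment?
print(s.groupby("rep")["vs_segment"].mean().round(3))
```

Note how the segment-relative comparison can invert the naive ranking: a rep whose raw average discount looks low (because they serve no large customers) can still be the worst offender within the small-customer segment, exactly the trap described above.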

Post-Merger White-Space Analysis

White-space sales analysis looks for spaces in the market where you should be selling but are not. For example, if you sell to restaurants, you could look at your sales by geography, normalized by the number of establishments by type or the sales of the restaurants by type in that geography. In a merger, you could measure your penetration at each customer for each of the original companies. You can find white space by looking at each customer (or customer segment) and measuring revenue per customer employee across the two companies. Where is one more effective than the other?

You might think this is no big deal because this was theoretically done during the due diligence and the opportunity for overlap was deemed to be there, as well as the opportunity for whitespace, and whatever was done was good enough. The reality couldn’t be further from the truth.

If the whitespace analysis was done with a standard analytics tool, it has all the following problems:

  • matching vendors were missed due to different name entries and missing IDs
  • vendors were not familied by parent (within industry, geography, etc.)
  • the improperly merged vendors were only compared against a target file built by the consultants, which itself misses vendors
  • i.e. it’s poor, but no worse than you’d do with a traditional analytics tool

But with Spendata, these problems would be at least minimized, if not eliminated, because:

  • Spendata comes with auto-matching capability
  • … that can be used to enrich the suppliers with NAICS categorization (for example)
  • Spendata comes with auto-familying capability so parent-child relationships aren’t missed
  • Spendata can load all of the companies from a firmographic database with their NAICS codes in a separate cube …
  • … and then federation can be used to match the suppliers in use with the suppliers in the appropriate NAICS category for the white space analysis

It’s thus trivial to:

  1. load up a cube with organization A’s sales by supplier (which can be the output from a view on a transaction database), and run it through a view that embeds a normalization routine so that all records that actually correspond to the same supplier (or parent-child where only the parent is relevant) are grouped into one line
  2. load up a cube with organization B’s sales by supplier and do the same … and now you know you have exact matches between supplier names
  3. load up the NAICS code database – which is a list of possible customers
  4. build a view that pulls in, for each supplier in the NAICS category of interest, Org A spend, Org B Spend, and Total Spend
  5. create a filter to only show zero-spend suppliers — and there’s the whitespace … 100% complete. Now send your sales teams after these.
  6. create a filter to show where your sales are less than expected (e.g. from comparable other customers or Org A or Org B). This is additional whitespace where upselling or further customer penetration is appropriate.
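The federation in steps 3 through 5 can be sketched in pandas (made-up names and columns; the real thing runs off normalized cubes and a firmographic feed, with the auto-matching and auto-familying already applied): outer-join both orgs onto the full universe, then filter for zero total.

```python
import pandas as pd

# Hypothetical inputs: normalized sales for Org A and Org B, plus the
# firmographic universe for the NAICS category of interest (step 3)
org_a = pd.DataFrame({"supplier": ["Acme", "Beta"], "a_sales": [500.0, 120.0]})
org_b = pd.DataFrame({"supplier": ["Beta", "Gamma"], "b_sales": [80.0, 40.0]})
universe = pd.DataFrame({"supplier": ["Acme", "Beta", "Gamma", "Delta"]})

# Step 4: federate both orgs onto the full universe
ws = (universe
      .merge(org_a, on="supplier", how="left")
      .merge(org_b, on="supplier", how="left")
      .fillna(0.0))
ws["total"] = ws["a_sales"] + ws["b_sales"]

# Step 5: zero-spend entries are pure white space
whitespace = ws.loc[ws["total"] == 0, "supplier"].tolist()
print(whitespace)  # ['Delta']
```

The key design point is joining against the full universe rather than the union of the two orgs’ files: names that appear in neither dataset are precisely the white space, and they vanish if you only merge what you already have.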

Bill Rate Analysis

A smart company doesn’t just analyze their (total) spend by service provider, they analyze by service role and against the service role average when different divisions/locations are contracting for the same service that should be fulfilled by a professional with roughly the same skills and same experience level. Why? Because if you’re paying, on average, 150/hr for an intermediate DBA across 80% of locations and 250/hr across the remaining 20%, you’re paying as much as 67% too much at those remaining locations, with the exception being San Francisco or New York, where your service provider has to pay their locals a cost-of-living top-up just so they can afford to live there.

By the same token, a smart service company is analyzing what they are getting by role, location, and customer and trying to identify the customers that are (the most) profitable and those that are the least (or unprofitable when you take contract size or support requirements into account), so they can focus on those customers that are profitable, and, hopefully, keep them happy with their better talent (and not just the newest turkey on the rafter).

However, just like sales discount variation analysis over time by client type, this is tough, as it’s essentially a variation of that analysis, except you are looking at services instead of products, roles instead of client types, and customers instead of sales reps … and then, for your problem clients, looking at which service reps are responsible … so after you do the base analysis (using dynamic view-based measures), you’re creating new views with new measures and filters to group by service rep and filter to those too far beyond a threshold. In any other tool, it would be nigh impossible for even an expert analyst. In Spendata, it’s a matter of minutes. Literally.
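As a rough pandas sketch of the buy-side version of this analysis (illustrative data and an illustrative 15% threshold, not Spendata’s mechanics): benchmark each line against the role average and flag the locations paying well above it, remembering that high-cost-of-living cities may legitimately carry a premium.

```python
import pandas as pd

# Hypothetical bill-rate lines (role, location, hourly rate)
rates = pd.DataFrame({
    "role":     ["DBA", "DBA", "DBA", "DBA", "PM"],
    "location": ["Austin", "Denver", "NYC", "Toledo", "Austin"],
    "rate":     [150.0, 155.0, 250.0, 250.0, 180.0],
})

# Role average as the benchmark "view-based measure"
rates["role_avg"] = rates.groupby("role")["rate"].transform("mean")
rates["premium"] = rates["rate"] / rates["role_avg"] - 1

# Flag locations paying well above the role average (threshold is illustrative;
# NYC may justify a cost-of-living top-up, Toledo almost certainly does not)
flagged = rates[rates["premium"] > 0.15][["role", "location", "premium"]]
print(flagged)
```

The sell-side version is the same computation with the benchmark taken by role and customer instead of role and location, which is why a tool that lets you swap dimensions and re-derive the measure in place turns this from a rebuild into a few clicks.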

And this is just the tip of the iceberg in terms of what Spendata can do. In a future article, we’ll dive into a few more areas of analysis that require very specialized tools in different domains, but which can be done with ease in Spendata. Stay tuned!