Category Archives: Sourcing Innovation

The Sourcing Innovation Source-to-Pay+ Mega Map!

Now slightly less useless than every other logo map that clogs your feeds!

1. Every vendor verified to still be operating as of 4 days ago!
Compare that to the maps that often include vendors / solutions that haven’t been in business / operating as a standalone entity for months by the day of release! (Or “best-of” lists that sometimes include vendors that haven’t existed in 4 years! the doctor has seen both — this year!)

2. Every vendor logo is clickable!
the doctor doesn’t know about you, but he finds it incredibly useless when all you get is a strange symbol with no explanation or a font so small that you would need an electron microscope to read it. So, to fix that, every logo is clickable so you can go to the site and at least figure out who the vendor is.

3. Every vendor is mapped to the closest standard category/categories!
Furthermore, every category has the standard definitions used by Sourcing Innovation and Spend Matters!
the doctor can’t make sense of random categories like “specialists” or “collaborative” or “innovative”, despises when maps follow this new-age analyst/consultancy award trend and give you labels you just can’t use, and gets red in the face when two very distinct categories (like e-Sourcing and Marketplaces, or Expenses and AP) are merged into one. Now, the doctor will also readily admit that this means that not all vendors in a category are necessarily comparable on an apples-to-apples basis, but that was never the case anyway, as most solutions in a category break down into subcategories and, for example, in Supplier Management (SXM) alone, you have a CORNED QUIP mash of solutions that could be focused on just a small subset of the (at least) ten different (primary) capabilities. (See the link on the sidebar that takes you to a post that indexes 90+ Supplier Management vendors across 10 key capabilities.)

Securely download the PDF! (or use HTTP) [HTML]
(5.3M; Note that the Free Adobe Reader might choke on it; Preview on Mac or a Pro PDF application on Windows will work just fine)

Spendata: The Power Tool for the Power Spend Analyst — Now Usable By Apprentices as Well!

We haven’t covered Spendata much on Sourcing Innovation (SI), as it was only founded in 2015, and the doctor did a deep-dive review on Spend Matters in 2018 when it launched (Part I and Part II, ContentHub subscription required), as well as a brief update here on SI where we said Don’t Throw Away that Old Spend Cube, Spendata Will Recover It For You! the doctor did pen a 2020 follow-up on Spend Matters on how Spendata was Rewriting Spend Analysis from the Ground Up, and that was the last major coverage. Even though the media has been a bit quiet, Spendata has been working as diligently on platform improvement over the last four years as they were the first four, and just released Version 2.2 (with a few new enhancements in the queue that they will roll out later this year). (Unlike some players that like to tack on a whole new version number after each minor update, or mini-module inclusion, Spendata only does a major version update when they do considerable revamping and expansion, recognizing that most vendors only rewrite their solution from the ground up to be better, faster, and more powerful once a decade, and that every other release is just an iteration of, and incremental improvement on, the last one.)

So what’s new in Spendata V 2.2? A fair amount, but before we get to that, let’s quickly catch you up (and refer you to the linked articles above for a deep dive).

Spendata was built upon a post-modern view of spend analysis in which a practitioner should be able to take immediate action on any data she can get her hands on, whenever she can get her hands on it, and derive whatever insights she can for process (or spend) improvement. You never have perfect data. Waiting until Duey, Clutterbuck, and Howell1 get all your records in order just to run your first report will take many months, if not a year, when you have a dozen different systems to integrate data from, multiple data formats to map, millions of records to classify, cleanse, and enrich, and third-party data feeds to integrate. And during that year in which you quest for the mythical perfect cube, you will continue to lose 5% to process waste, abuse, and fraud, and 3% to 15% (or more) across the spend categories where you don’t have good management, but where you could stem the flow simply by identifying them and putting in place a few simple rules or processes. And you can identify some of these opportunities simply by analyzing one system, one category, and one set of suppliers. And then moving on to the next one. In the process, Spendata automatically creates and maintains the underlying schema as you slowly build up the dimensions; the mapping, cleansing, and categorization rules; and the basic reports and metrics you need to monitor spend and processes. And maybe you can only do 60% to 80% piecemeal, but during that “piecemeal year” you can identify over half of your process and cost-savings opportunities and start saving now, versus waiting a year to even start the effort.
When it comes to spend (related) data analysis, no adage is more true with Spendata than “don’t put off until tomorrow what you can do today”, because, especially when you start, you don’t need complete or perfect data … you’d be amazed how much insight you can get with 90% accuracy in a system or category, and then, if the data is inconclusive, you can keep drilling and mapping until you get into the 95% to 98% accuracy range.

Spendata was also designed from the ground up to run locally and entirely in the browser, because no one wants to wait for an overburdened server across a slow internet connection, and to do so in real time … and by that we mean do real analysis in real time. Spendata can process millions of records a minute in the browser, which allows for real-time data loads, cube definitions, category re-mappings, dynamically derived dimensions, roll-ups, and drill-downs on any well-defined data set of interest. (Since most analysis should be at the department, category, or regional level, etc., and over a relevant time span — which should not include every transaction for the last 10 years, because beyond a few years only the quarter-over-quarter or year-over-year totals remain relevant — most relevant data sets for meaningful analysis, even for large companies, are under a few million transactions.) The goal was to overcome the limitations of the first two generations of spend analysis solutions, where the user was limited to drilling around in, and deriving summaries of, fixed (R)OLAP cubes, and instead allow a user to define the segmentations they wanted, the way they wanted, on existing or newly loaded (or enriched federated) data in real time. Analysis is NOT a fixed report; it is the ability to look at data in various ways until you uncover an inefficiency or an opportunity. (Nor is it simply throwing a suite of AI tools against a data set — these tools can discover patterns and outliers, but still require a human to judge whether a process improvement can be made or a better contract secured.)

Spendata was built as a third generation spend analysis solution where

  • data can be loaded and processed at any point of the analysis
  • the schema is developed and modified on the fly
  • derived dimensions can be created instantly based on any combination of raw and previously defined derived dimensions
  • additional datasets from internal or external sources can be loaded as their own cubes, which can then be federated and (jointly) drilled for additional insight
  • new dimensions can be built and mapped across these federations that allow for meaningful linkages (such as commodities to cost drivers, savings results to contracts and purchasing projects, opportunities by size, complexity, or ABS analysis, etc.)
  • all existing objects — dimensions, dashboards, views (think dynamic reports that update with the data), and even workspaces — can be cloned for easy experimentation
  • filters, which can define views, are their own objects, can be managed as their own objects, and can be, through Spendata‘s novel filter coin implementation, dragged between objects (and even used for easy multi-dimensional mapping)
  • all derivations are defined by rules and formulas, and are automatically rederived when any of the underlying data changes
  • cubes can be defined as instances of other cubes, and automatically update when the source cube updates
  • infinite scrolling crosstabs with easy Excel workbook generation on any view and data subset for those who insist on looking at the data old school (as well as “walk downs” from a high-level “view” to a low-level drill-down that demonstrate precisely how an insight was found)
  • functional widgets which are not just static or semi-dynamic reporting views, but programmable containers that can dynamically inject data into pre-defined analysis and dimension derivations that a user can use to generate what-if scenarios and custom views with a few quick clicks of the mouse
  • offline spend analysis is also available, in the browser (cached) or on Electron.js (where the latter is preferred for Enterprise data analysis clients)

Furthermore, with reference to all of the above, analyst changes to the workspace, including new datasets, new dashboards and views, new dimensions, and so on are preserved across refresh, which is Spendata’s “inheritance” capability that allows individual analysts to create their own analyses and have them automatically updated with new data, without losing their work …

… and this was all in the initial release. (Which, FYI, no other vendor has yet caught up to. NONE of them have full inheritance or Spendata‘s security model. And this was the foundation for all of the advanced features Spendata has been building since its release six years ago.)

After that, as per our updates in 2018 and 2020, Spendata extended their platform with:

  • Unparalleled Security — as the Spendata server is designed to download ONLY the application to the browser, or Spendata‘s demo cubes and knowledge bases, it has no access to your enterprise data;
  • Cube subclassing & auto-rationalization — power users can securely set up derived cubes and sub-cubes off of the organizational master data cubes for the different types of organizational analysis that are required, and each of these sub-cubes can make changes to the default schema/taxonomy, mappings, and (derived) dimensions, and all auto-update when the master cube, or any parent cube in the hierarchy, is updated
  • AI-Based Mapping Rule Identification from Cube Reverse Engineering — Spendata can analyze your current cube (or even a report of vendor by commodity from your old consultant) and derive the rules that were used for mapping, which you can accept, edit, or reject. We all know black-box mapping doesn’t work (no matter how much retraining you do, as every “fix” suddenly causes an older transaction to be misclassified), but generating the right rules, which can be human-understood and human-maintained, guarantees 100% correct classification 100% of the time
  • API access to all functions, including creating and building workspaces, adding datasets, building dimensions, filtering, and data export. All Spendata functions are scriptable and automatable (as opposed to BI tools with limited or nonexistent API support for key functions around building, distributing, and maintaining cubes).
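
To make the rule reverse-engineering idea above concrete, here is a deliberately minimal Python sketch of the general technique (deriving reviewable vendor-to-category rules from already-classified transactions). It is our own illustration of the concept, not Spendata's actual algorithm, and the `min_support` threshold is an assumption we introduce for the example:

```python
from collections import Counter, defaultdict

def derive_mapping_rules(transactions, min_support=0.9):
    """Derive human-readable vendor -> category rules from classified
    (vendor, category) pairs. A rule is only emitted when the vendor's
    dominant category covers at least `min_support` of its lines, so
    every rule can be reviewed, edited, or rejected by a human."""
    by_vendor = defaultdict(Counter)
    for vendor, category in transactions:
        by_vendor[vendor][category] += 1
    rules = {}
    for vendor, counts in by_vendor.items():
        category, hits = counts.most_common(1)[0]
        if hits / sum(counts.values()) >= min_support:
            rules[vendor] = category  # a plain, maintainable rule
    return rules

txns = [("Acme Office", "Office Supplies")] * 9 + [("Acme Office", "IT")] \
     + [("Global Freight", "Logistics")] * 5
print(derive_mapping_rules(txns))
# {'Acme Office': 'Office Supplies', 'Global Freight': 'Logistics'}
```

The point of the sketch is the output format: instead of a black-box classifier, you get explicit rules a human can read and maintain.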

However, as we noted in our introduction, even though this put Spendata leagues beyond the competition (as we still haven’t seen another solution with this level of security; cube subclassing with full inheritance; dynamic workspace, cube, and view creation; etc.), they didn’t stop there. In the rest of this article, we’ll discuss what’s new from the viewpoint of Spendata Competitors:

Spendata Competitors: 7 Things I Hate About You

Cue the Miley Cyrus, because if competitors weren’t scared of Spendata before, if they understand ANY of this, they’ll be scared now (as Spendata is a literal wrecking ball in analytic power). Spendata is now incredibly close to negating entire product lines of not just its competitors, but of some of the biggest software enterprises on the planet, and 3.0 may trigger a seismic shift in how people define entire classes of applications. But that’s a post for a later day (though it should cue you up for the post that will follow this one on just precisely what Spendata 2.2 really is and can do for you). For now, we’re just going to discuss seven (7) of the most significant enhancements since our last coverage of Spendata.

Dynamic Mapping

Filters can now be used for mapping — and as these filters update, the mapping updates dynamically. You can reclassify on the fly, in real time, in a derived cube using any filter coin, including one dragged out of a drill-down in a view. Analysis is now a truly continuous process, as you never have to go back and change a rule, reload data, and rebuild a cube to make a correction or see what happens under a reclassification.
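
Conceptually, a filter used as a mapping is just a named predicate, and a dimension derived from filters is recomputed whenever the filters or the data change. A toy Python sketch of that idea (our own illustration, not Spendata's implementation; the `Filter` class and field names are invented for the example):

```python
# A filter is a named predicate; a derived dimension maps each record to
# the first matching filter and is simply re-derived when anything changes.
class Filter:
    def __init__(self, name, predicate):
        self.name, self.predicate = name, predicate

def derive_dimension(records, filters, default="Unmapped"):
    """Map each record to the name of the first filter it satisfies."""
    return [next((f.name for f in filters if f.predicate(r)), default)
            for r in records]

records = [{"vendor": "Acme", "amount": 1200},
           {"vendor": "Beta", "amount": 80}]
filters = [Filter("Large Spend", lambda r: r["amount"] >= 1000)]
print(derive_dimension(records, filters))   # ['Large Spend', 'Unmapped']

# Editing the filter remaps instantly -- no rule rewrite, reload, or rebuild
filters[0].predicate = lambda r: r["amount"] >= 50
print(derive_dimension(records, filters))   # ['Large Spend', 'Large Spend']
```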

View-Based Measures

Integrate any rolled up result back into the base cube on the base transactions as a derived dimension. While this could be done using scripts in earlier versions, it required sophisticated coding skills. Now, it’s almost as easy as a drag-and-drop of a filter coin.

Hierarchical Dashboard Menus

Not only can you organize your dashboards in menus, submenus, and sub-sub-menus as needed, but you can easily bookmark drill-downs and add them under a hierarchical menu, which makes it super easy to create point-based walkthroughs that tell a story — and then output them all into a workbook using Spendata‘s capability to output any view, dashboard, or entire workspace as desired.

Search via Excel

While Spendata eliminates the need for Excel for data analysis, the reality is that Excel is where most organizational data is (unfortunately) stored, how most data is submitted by vendors to Procurement, and where most Procurement professionals are most comfortable. Thus, in the latest version of Spendata, you can drag and drop groups of cells from Excel into Spendata, and if you drop them into the search field, it auto-creates a RegEx “OR” that maintains the inputs exactly and finds all matches in the cube you are searching against.
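
The general technique is straightforward: split the pasted cell block on tabs and newlines, escape each value so punctuation is treated literally, and join with regex alternation. A hedged Python sketch of the idea (our own illustration of how such a feature can work, not Spendata's code):

```python
import re

def cells_to_pattern(pasted):
    """Turn a tab/newline-separated block of pasted spreadsheet cells into
    a case-insensitive regex that matches any one of the cell values,
    escaped so the inputs are preserved exactly."""
    cells = [c.strip() for c in re.split(r"[\t\r\n]+", pasted) if c.strip()]
    return re.compile("|".join(re.escape(c) for c in cells), re.IGNORECASE)

# Simulates dragging a 2x2 block of vendor-name cells into a search field
pasted = "Acme Corp\tBeta LLC\nGamma (EU) Ltd"
pattern = cells_to_pattern(pasted)
print(bool(pattern.search("Invoice from GAMMA (EU) LTD")))  # True
print(bool(pattern.search("Delta GmbH")))                   # False
```

Note the `re.escape` call: without it, a vendor name like "Gamma (EU) Ltd" would be misread as a regex group rather than a literal string.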

Perfect Star Schema Output

Even though Spendata can do everything any BI tool on the market can do, the reality is that many executives are used to their pretty PowerBI graphs and charts and want to see their (mostly static) reports in PowerBI. So, in order to appease the consultancies that have to support these executives who are (at least) a generation behind on analytics, Spendata added the ability to output an entire workspace to a perfect star schema (where all keys are unique and numeric) that is so good that many users see PowerBI speed up by a factor of almost 10. (As any analyst forced to use PowerBI will tell you, when you give PowerBI data that is NOT in a perfect star schema, it may not even be able to load the data, and its ability to work with non-numeric keys at a speed faster than you remember on an 8088 is nonexistent.)
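
For readers unfamiliar with the term, a star schema splits a flat table into a fact table of numeric surrogate keys and measures, plus one small dimension table per attribute. A minimal Python sketch of the transformation (our own illustration of the standard technique, not Spendata's exporter; the field names are invented):

```python
def to_star_schema(rows, dim_cols):
    """Split flat records into a fact table that carries only measures and
    numeric surrogate keys, plus one dimension lookup table per column."""
    dims = {col: {} for col in dim_cols}   # dimension value -> numeric key
    fact = []
    for row in rows:
        fact_row = {k: v for k, v in row.items() if k not in dim_cols}
        for col in dim_cols:
            # assign the next integer key the first time a value is seen
            fact_row[f"{col}_key"] = dims[col].setdefault(row[col], len(dims[col]))
        fact.append(fact_row)
    return fact, dims

rows = [{"vendor": "Acme", "category": "MRO", "amount": 100.0},
        {"vendor": "Beta", "category": "IT",  "amount": 250.0},
        {"vendor": "Acme", "category": "MRO", "amount": 75.0}]
fact, dims = to_star_schema(rows, ["vendor", "category"])
print(fact[0])          # {'amount': 100.0, 'vendor_key': 0, 'category_key': 0}
print(dims["vendor"])   # {'Acme': 0, 'Beta': 1}
```

Joins on small integer keys are exactly what columnar BI engines are optimized for, which is why a clean export like this speeds PowerBI up so dramatically.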

Power Tags

You might be thinking “tags, so what?” And if you are equating tags with a hashtag or a dynamically defined user attribute, then we understand. However, Spendata has completely redefined what a tag is and what you can do with it. The best way to understand it is as a Microsoft Excel cell on steroids. It can be a label. It can be a replica of a value in any view (that dynamically updates if the field in the view updates). It can be a button that links to another dashboard (or a bookmark to any drill-down filtered view in that dashboard). Or all of the above. Or, in the next Spendata release, a value that forms the foundation for new derivations and measures in the workspace, just as you can reference an arbitrary cell in an Excel function. In fact, using tags together with the seventh new capability of Spendata (below), you can already build, on the fly and usually in hours (at most), the very sophisticated what-if analyses that many providers have to custom-build into their core solutions (taking weeks, if not months, to do so).

Embedded Applications

In the latest version of Spendata, you can embed custom applications into your workspace. These applications can contain custom scripts, functions, views, dashboards, and even entire datasets that can be used to instantly augment the workspace with new analytic capability, and if the appropriate core columns exist, even automatically federate data across the application datasets and the native workspace.

Need a custom set of preconfigured views and segments for that ABC Analysis? No sweat, just import the ABC Analysis application. Need to do a price variance analysis across products and geographies, along with category summaries? No problem. Just import the Price Variance and Category Analysis application. Need to identify opportunities for renegotiation post M&A, cost reduction through supply base consolidation, and new potential tail-spend suppliers? No problem, just import the M&A Analysis app into the workspace for the company under consideration and let it do a company A vs. B comparison by supplier, category, and product; generate the views where consolidation would more than double supplier spend, or where switching a product from a current supplier to a lower-cost supplier would save more than 100K; and identify opportunities for bringing on new tail-spend suppliers based upon potential cost reductions. All with one click. Not sure just what the applications can do? Start with the demo workspaces and apps, define your needs, and if the apps don’t exist in the Spendata library, a partner can quickly configure a custom app for you.

And this is just the beginning of what you can do with Spendata. Because Spendata is NOT a Spend Analysis tool. That’s just something it happens to do better than any other analysis tool on the market (in the hands of an analyst willing to truly understand what it does and how to use it — although with apps, drag-and-drop, and easy formula definition through wizard-driven pop-ups, it’s really not hard to learn how to do more with Spendata than any other analysis tool).

But more on this in our next article. For The Times They Are a-Changin’.

1 Duey, Clutterbuck, and Howell keeps Dewey, Cheatem, and Howe on retainer … it’s the only way they can make sure you pay the inflated invoices if you ever wake up and realize how much you’ve been fleeced for …

Is Your Strategic Operational Sourcing Not Succinct Enough? Maybe You Need A DeepStream To Tackle That SOS Problem.

DeepStream is a Source-to-Contract (S2C) platform, founded in 2016 in London, England, built to give midsized organizations affordable, modern, streamlined, but still sufficiently deep, source-to-contract capability that empowers its customers to be more efficient, get more spend under management (and savings, or at least cost avoidance, in today’s inflationary economy), and award with confidence.

DeepStream was founded by practitioners with experience; built with the guidance of expert consultants, industry leaders, and beta customers; and overseen by former implementation consultants with a lot of experience implementing the S2C mega-suites (who know all the issues customers have in implementing, integrating, and maintaining those systems, as well as all the reasons they aren’t always the best solution for the mid-market), and who are continually developing and improving the system over time.

While DeepStream is designed to be very customizable and very general purpose (and works great for indirect/finished goods and services in general), because it is impossible to be everything to everyone (even though the Big Suites will claim they are), especially from a consulting/guidance perspective, they are highly focussed on the industries their founders are experts in, and related industries. Specifically, they are focussed heavily on Energy & Renewable Energy (and O&G), Utilities, MRO, Site/Port Operations, and the Consultancies that support these sectors (be they public, private, or quasi — such as publicly funded, privately managed). (These are the sectors in which they have the process expertise to help you set up your category templates to streamline your sourcing efforts … more on this later.)

The platform started as a core sourcing platform (RFX and Auction), and evolved to support Supplier Information Management and Contract Management (primarily Governance in Sourcing Innovation terminology), and now offers a public Supplier Network of almost 10,000 suppliers that grows daily (and significantly with every new client). This may sound small compared to the suite supplier networks that claim millions of suppliers, but you need to remember three things: 1] many of these mega-suite networks reach their number by simply importing every government business registry globally, and nowhere near that many suppliers are active in their customer base; 2] DeepStream are focussed on a particular set of sectors that don’t have a super large supply base, and all of their suppliers have been verified as active and being interacted with; and 3] DeepStream expects your ERP/MRP/P2P/AP to be the supplier master and advocates that customers only import active suppliers.


Sourcing revolves around a templated event structure, which can be set up by a Full User, the DeepStream services team, or both. (On implementation, they will work with you to set up one template per category, as that is their recommended best practice. They have found that trying to cover multiple categories with one template misses the nuances of the individual categories and requires too much customization for every event, while having multiple templates per category with only slight differences by product/service makes management and upkeep too much work.) These event templates don’t just capture the RFI/P/Q requests, but all stages, including, but not limited to, NDA, Onboarding, Prequalification (which can be separate from the RFI to avoid repeated RFIs), RFI, Initial Bid Collection, e-Auction, etc.

When an event is instantiated from a template, which requires only some basic information (name and dates), it will have a pre-populated summary, stages, details, a default evaluation matrix, a team, a starting set of suppliers, and possibly an e-Auction. The buyer can quickly access each event section of each stage and customize as needed. The application supports all standard HTML form functionality for data collection; makes it super easy to build sections, subsections, and questions for data collection; makes it just as easy to build grids for bidding (that can collect all cost elements associated with a product or service, including complex rate cards); makes it even easier to upload bids from a spreadsheet; and, if desired, even supports cut-and-paste of spreadsheet/Excel-based bids (because Excel is not just the favourite tool of a Procurement organization that doesn’t have modern tech, but the favourite tool of Supplier Reps as well). In addition, once instantiated, the event structure is not locked; the request owner or super user can modify it as needed (if more time needs to be added to a stage due to a technical or communication issue, if another stage needs to be added because the responses are not differentiated or competitive enough or more requirements are added, and so forth).

Reverse Auctions have a very simple and clean UX and were designed to be easy to grasp, and use, by both buyers and supplier bidders. There is also integrated chat for real-time communication if needed. Buyers see the current total lot cost and suppliers see the current lowest bids, or their rank, in a public or blind manner, and can keep bidding until the time is up or they’ve given their best and final bid.

Evaluation is done using a grid structure on each relevant event section (where sections can be added or removed) by one or more evaluators, who can see all of the bids and responses side by side, in either full detail or just summary, filtering down to just what they need to make an evaluation (and an eventual award if the event is completely price-based). From a summary evaluation, they can click into the full response history or bid details (especially if the product was broken down into multiple cost components) and, if it’s a multi-evaluator event, drill in to see the individual evaluator scores. There’s no graphical representation for bids just yet, but they have added BAFO (Best And Final Offer) capability to clearly designate final bids, and they automatically compute the deltas between bid responses in both percentage (%) and dollar ($) value, which are highlighted in the comparison view. Additional enhanced evaluation functionality is planned for future releases.

One very unique feature of the platform is built-in support for collaborators. Most platforms make it easy to add other organizational users, but not so easy to add consultants who are helping on specific categories or projects. In the DeepStream platform, you can define collaborator organizations and users within these organizations and then, on an event, or stage [“page”], basis grant collaborators access at whatever level of access they need (read, comment, evaluate, write, etc.). This means that the platform is also great for niche consultancies as they can add their client as a collaborator and give key stakeholders visibility while managing everything on the customer’s behalf. (And, of course, it’s super easy to add organizational users to each page and grant them the precise level of access they need.)

A second very unique feature is their document management capability. Most RFX platforms just allow upload, with simple version tracking, and that’s it. The DeepStream platform understands there is a workflow around document management, especially where contracts and detailed specifications must be agreed to. It has a detailed set of process-centric statuses that can be associated with each uploaded document (for information only, upload requested, upload deviation, accept, etc. — modifiable by the client if desired) so both sides clearly understand where the document is in a request or negotiation cycle. It also supports tagging in-platform messages to a document, which not only allows audit trails to be queried at the document level, but allows in-platform discussions around documents to be captured; this centralizes document communications (which get lost in email) and simplifies acceptance and approvals (of contract-related documents).

Contract Governance

The system allows the storage and management of contracts, which are currently defined as a collection of documents and bids accepted by both sides that are included in an award. The user can define the start and end dates, milestones, review periods, and notifications, and the platform will notify the appropriate parties when a milestone is due (so the appropriate individual can log in and execute that milestone when it is completed, which may include notes or documentation), when a mandatory review has been completed (along with appropriate documentation, and possibly future milestone steps if a corrective action is needed), or when a renewal/termination date is coming up on a contract. They don’t have integrated e-signature yet, but it is coming. Nor can they output everything to one single amalgamated PDF, but they haven’t found that to be necessary when most of the documents in the system are stored as DOCX or PDF, and it’s much easier for a user to find and extract just the information they need (original contract, delivery schedule, pricing, spec sheet, etc.) when a contract is stored as a “package” of documents and related system artifacts.

Supplier Management

The foundation of Supplier Management in the platform is the Network, where all uploaded suppliers have a common, basic profile that consists of basic organizational identifiers (name, business IDs, primary location[s], primary contact[s], etc.), the UNSPSC codes for the goods and services the organization provides, and the locations they can provide those goods and services to. This makes supplier discovery within their primary industries practical for their rapidly growing customer base.

On top of this, a user can add their own qualification profiles to collect, and maintain, the information they need on each supplier, and these profiles are kept private. When they do this, or when they select network suppliers as their suppliers, those suppliers show up in the buyer’s “My Suppliers” view, where they can be selected for the starting (pre-approved) supplier lists of every sourcing event template the organization believes they are suitable for.

Finally, each organization has their own “Activity” tab in the supplier view that shows all associated Pre-Qualification questionnaires, Sourcing/RFX events and Contracts with their related status. One click will take the user into the associated document or event.

Dashboard and Reporting

When a user logs in, they see their activity dashboard that summarizes their requests, contracts, notifications, pre-qualification/onboardings, and a few report highlights (mainly negotiated savings and request completion status). It’s kept simple and streamlined so a user can get right to what they need to do when they log in, especially since they are integrating other communication channels besides email for notifications so users only have to log in to do something, not to get a status update.

Reporting right now is very basic, and very process/cycle time centric (which should not be surprising as they do not do spend analysis, preferring to instead integrate with the organization’s current platform, and if the organization does not have one, help the customer find and integrate with an appropriate partner organization for spend analysis). The reporting is really focussed around:

  • Team Productivity: how many requests made, completed, etc., by category, and average cycle time(s)
  • Supplier Engagement: requests received, responded to, awarded, etc. and associated rates and durations

With regards to savings, it’s focussed around:

  • Total Negotiated Savings: that summarizes the total negotiated savings based on the current PPU/RPH, the award rates, and the total number of units/hours requested
  • Total Negotiated Savings from Auction: that summarizes the savings from auctions, as well as savings statistics on an auction basis
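
DeepStream doesn't publish its exact formula, but a negotiated-savings metric of this kind is typically computed as (baseline unit price − awarded unit price) × units requested, summed over award lines. A hedged Python sketch of that standard calculation (our illustration only; the field names are invented):

```python
def negotiated_savings(lines):
    """Sum (baseline unit price - awarded unit price) * units over award
    lines -- a common way to compute total negotiated savings (our own
    formula for illustration, not necessarily DeepStream's calculation)."""
    return sum((l["baseline_ppu"] - l["award_ppu"]) * l["units"] for l in lines)

award = [{"baseline_ppu": 12.50, "award_ppu": 11.75, "units": 10_000},
         {"baseline_ppu": 90.00, "award_ppu": 84.00, "units": 500}]
print(negotiated_savings(award))   # 10500.0  (7,500 + 3,000)
```

For the auction variant, the baseline would simply be the pre-auction bid (or first bid) rather than the pre-negotiation price.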

Other Features

Standard Drive functionality where the organization can store all of the document templates it needs for its various supplier (pre)qualification and sourcing events.

Easy Query Audit Trails: When you bring up an Event in DeepStream, you can see a history of every action that was taken at every step by every participant (buyer, collaborator, supplier rep, etc.), filter, and export at any time.

Great Help Library:
DeepStream has a very extensive help library that is organized by role and process, to help an average user find the help they are looking for based on where they are in their sourcing process. It also has a built-in advanced search function (powered by a custom in-house AI-backed search algorithm trained ONLY on the help documentation they have available) that can quickly find the right section of the right document given a reasonably detailed search request. This AI also powers their integrated chat/online help function, which can handle full natural-language questions and guide the user to the right help quickly and easily (if the help exists). Since their help library covers every function on their platform, as well as best-practice sourcing processes, the help bot is able to direct a user to the guidance they need and complete a help request roughly 80% of the time.

The DeepStream platform, including all help documentation, is fully translated into English, French, Spanish, and Portuguese.

Easy Integrations:
Out-of-the-box ERP integrations include IFS and Dynamics, the two most common ERPs in the mid-market in their target industries, but they can (and have) integrated with other ERPs and P2P/AP systems. They've also integrated with supplier qualification and certification systems (like Avetta), and you can expect more integrations as time goes on. The platform was built so they can integrate with any system they need to quickly, easily, and in a standard fashion.

Easy Account Management:
In the DeepStream system, it's really easy to define collaborator organizations, user accounts, notifications, and system preferences (around currency, notifications, etc.). Remember, one of the main goals was efficiency, so the idea is that organizations and users can configure event templates precisely to their needs so it's super easy for buyers to kick off and complete sourcing events.

Terminology Customization:
DeepStream understands that one of the biggest hurdles to adoption is trying to force an organization to switch to terminology it is not used to. Thus, in their system, the super user can define the language used in all system elements at each step of the event template. For example, some jurisdictions might use bid envelope terminology, others might use bid package, the private sector just wants RFP, and so on. All of this terminology is customizable as needed.

Coming Soon!

As per our intro, they are constantly developing and a few features coming soon include:

  • Enhanced evaluation functionality, with more auto-computed differentials/savings potentials and advanced ranking/weighting capability based on calculations
  • Integrated e-Signature powered by Verify, which will be available at all stages of supplier interaction, as you may require an NDA to be signed before you can even invite a supplier to bid
  • Microsoft Teams and Slack integration for communications and reminders (in beta now)
  • More Language Support: the entire platform, including the entire help library, can be internationalized to a new language within three weeks; languages are being added based on customer prioritization

Coming Later

  • More Out-of-the-Box Supplier Certification/Qualification/Risk Integrations: to help buyers certify and qualify new suppliers for their operations without leaving the DeepStream platform
  • Category Template Library: they have a number they can set you up with if you don’t have any; right now they help you get your current (Excel) templates and processes templated
  • Supplier [onboarding] Questionnaires: there are best practice templates out there for IT/Cyber Security, Personal Data Protection, Health & Safety, regulations like the GSCA, etc. and no need for each organization to create their own from scratch; right now they will share what they have on request [enhanced onboarding is one of their newest capabilities and, as such, is still under active development]

In conclusion, DeepStream is a great sourcing platform for mid-markets who need to modernize and get efficient fast, especially in the Energy & Renewable Energy (and O&G), Utilities, MRO, and Site/Port Operation sectors (be they public, private, or quasi, such as publicly funded, privately managed). As the platform is true multi-tenant SaaS, it's more or less a flick of a software switch to instantiate a new instance: typically only a day or two to configure an out-of-the-box implementation, only a few days to a week for a non-out-of-the-box integration, only a few hours to pull in the active suppliers once the ERP/P2P/AP is integrated, and only a few weeks to get an organization's category processes templated. Most customers are fully up and running within a few weeks (a month at most), and some have even kicked off initial events (on a small set of suppliers pulled in through one of the out-of-the-box ERP integrations) within 24 hours, while the remainder of the active suppliers for near-term events were being onboarded and the remainder of the category templates built out for future events. If you're a mid-market looking for modern sourcing tech, and especially if you are in one of the target sectors, you should definitely consider putting DeepStream on your shortlist and checking them out.

Strategic Sourcing & Procurement for Technology Cost Optimization

Given that we recently published a piece noting that Roughly Half a Trillion Dollars Will Be Wasted on SaaS Spend This Year and up to One Trillion Dollars on IT Services, it’s obvious that one has to be very careful with technology acquisition as it is very easy to overspend on the license and the implementation for something that doesn’t even solve your problem.

As a result, you need to be very strategic about it. While you certainly can't put the majority of your technology acquisitions (which can be 6, 7, and even 8 figures) up for auction (as products are never truly apples-to-apples), you should be doing multi-round RFPs and then awarding to the vendor who brings you the best overall value for the term you want to commit to, once all things are considered.

But these have to be well thought out … you need to make sure that you are only inviting providers that are likely to meet 100% of your must haves, 80% of your should haves, and 60% of your nice to haves (and, moreover, that you have really separated out absolute vs highly desired vs wanted but not needed because the more you insist on, especially when it’s not necessary, the shallower the vendor pool, and the more you are going to end up paying*).

To do this, as the article notes, you have to know what processes you need to support, what improvements you are expecting, what measurements you need the platform to take, and what business objectives it needs to support. Then you need to align your go-to-market sourcing/procurement strategy with those objectives and make sure the RFP covers all the core requirements (without asking 100 unnecessary questions about features you’ll never actually use in practice).

You also need to know what quantifiable benefits the platform should deliver, both in terms of tactical work(force) reduction (as the tech you acquire should be good at thunking) and the value that will be obtained from the strategic enablement (analysis, intelligence gathering, guided events, etc.) the platform should deliver. If it is a P2P platform, how much invoice processing is it going to automate and, based on that, how much is it going to reduce your average invoice processing cost? If it's a sourcing platform, how much more spend will you be able to source (without increasing person-power) and what is a reasonable savings percentage to expect on that? Understand the value before you go to market.
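To make "understand the value" concrete, here is a hypothetical back-of-the-envelope sketch; every number in it is an illustrative assumption you would replace with your own baseline data, not a figure from any specific platform:

```python
# Hypothetical business case for a new platform; EVERY number below is an
# illustrative assumption to be replaced with your own baseline data.

# P2P platform: value from invoice-processing automation.
invoices_per_year = 120_000
cost_per_invoice = 12.00    # current average processing cost (assumed)
automation_rate = 0.60      # share of invoices the platform automates (assumed)
automated_cost = 2.00       # processing cost for an automated invoice (assumed)

p2p_value = invoices_per_year * automation_rate * (cost_per_invoice - automated_cost)

# Sourcing platform: value from newly sourceable spend.
newly_sourced_spend = 25_000_000   # extra spend you can now source (assumed)
expected_savings_rate = 0.04       # reasonable savings rate on it (assumed)

sourcing_value = newly_sourced_spend * expected_savings_rate

print(f"P2P automation value: ${p2p_value:,.0f}/year")      # $720,000/year
print(f"Sourcing value:       ${sourcing_value:,.0f}/year")  # $1,000,000/year
```

Even a rough model like this forces you to state your baseline and your assumed automation and savings rates before a vendor states them for you.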

Then you need to understand how much support and help you need from the vendor. If you just want a platform that does a function, then you just need to know the vendor can support the platform in supporting that function. But if you need help with process transformation or optimization, customized development, or third-party tool integration for advanced/custom processes, you need a vendor that can not only provide services, but also be a strategic partner for you.

And so on. For more insights, we suggest you check out a recent article by Alix Partners on Strategic Sourcing and Procurement for Technology Cost Optimisation. It has a lot of great advice for those starting their strategic procurement technology journey.

*Just remember, if you’re a mid-market, and you’re flexible (i.e. define what a module needs to accomplish for you vs. a highly specific process) you can get your absolute functionality and most of your desired functionality for 120K in annual SaaS license fees, excluding data feeds and services. If you’re not flexible, or not really strict in really separating out absolute vs strongly desired vs nice-to-have, you can easily be paying four times that.

Also remember, if you’re enterprise, your absolutes and strongly desired are much more extensive, typically require a lot more advanced tech (like optimization, predictive analytics, ML/AI, etc.), and licenses fees alone will cost you in the 500K to 1M range annually at a minimum, not counting the 100K to 1M you will need to spend on the implementation, data cleansing and enrichment, integration, training, and real-time data feed access, so it is absolutely vital you get it right!

Take the Tedious out of the Tactical Tail and Autonomously Avoid Overspend with mysupply

The taming of the tail is tedious and that’s why it’s overlooked in many organizations beyond whatever a catalog can address. There are only so many strategic sourcing professionals, there are only so many projects they can handle, and only so much spend they can get under strategic management. After that, beyond what’s in the catalog, IF there is a catalog, it’s typically the wild wild west for Procurement — especially if it fits on a credit card or P-card. There just isn’t enough bandwidth to manage more than a measly modicum of the tactical tail in an average organization.

Many organizations believe it's okay to ignore tail spend because it's only 20% to 30% of spend, and because they believe that overspend can't be that high on small purchases. They're wrong on both points. In most organizations, even when the strategic categories are defined to include 80% of spend, unmanaged tactical/tail spend is usually 30% to 40%: products and services change all the time, organizational buyers and/or overworked sourcerers won't always catch when new products or services should be included in a strategically managed category, and p-card/T&E spend is never included in the initial estimate. If it's 40% that ends up unmanaged when the expectation was 20%, that's a lot. Secondly, spend analysts and tail spend analysts have regularly found that the average overspend in the tail is in excess of 10%, with some categories routinely in the 15% to 30% window because no one ever looks at them. And if your organization is losing out on 10% of 40% of spend, that's 4% that could go straight to the bottom line with a good tactical tail spend solution.

To put into perspective just how good 4% straight to the bottom line is, consider the fact that, in direct organizations, strategic events on carefully managed direct categories that are regularly sourced typically only net 2% as the categories have already been squeezed. It’s only the mid-tier categories where you will see higher savings rates, which will typically average in the 5% to 7% range at best as these categories at least go to auction or multi-round RFP regularly. So if you save 2% on the top 30% and 5% on the next 30%, that’s only a savings of 2.1% that hits the bottom line. In other words, if your organization has been actively strategically sourcing top spend for five or six years, your organization has twice the cost avoidance / savings opportunity in the tail. It may seem counter-intuitive, but it’s the truth. Let that sink in for a moment before you read on.
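The arithmetic behind those two paragraphs is worth checking for yourself; this minimal sketch just restates the percentages cited above:

```python
# Spend profile and savings rates as cited in the two paragraphs above
# (all fractions of total organizational spend).
strategic_top = 0.30   # top, regularly sourced strategic categories
strategic_mid = 0.30   # mid-tier categories (regular auctions / multi-round RFPs)
tail = 0.40            # unmanaged tactical/tail spend

top_rate = 0.02        # squeezed top categories net ~2%
mid_rate = 0.05        # mid-tier nets ~5% (low end of the 5% to 7% range)
tail_rate = 0.10       # average tail overspend is in excess of 10%

strategic_savings = strategic_top * top_rate + strategic_mid * mid_rate
tail_savings = tail * tail_rate

print(f"strategic savings: {strategic_savings:.1%} of spend")  # 2.1% of spend
print(f"tail opportunity:  {tail_savings:.1%} of spend")       # 4.0% of spend
print(f"ratio: {tail_savings / strategic_savings:.1f}x")       # 1.9x
```

So the "twice the opportunity in the tail" claim is not hyperbole; it falls straight out of the rates.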

mysupply is the newest start-up aiming to tackle the Tactical Tail Spend space, which has been historically underserved since the first specialists popped up (and then disappeared) in the early 2010s. Even today you can count the true tactical tail spend specialist solutions on one hand without a thumb, compared to the seventy-five plus sourcing providers. But the new generation of providers, and mysupply in particular, understands that no one wants their spend in multiple systems (as you can't do integrated spend, PO, and invoice management otherwise, which is key for Procurement success), and is developing its system as an extension to current sourcing systems, not a replacement for them.

mysupply, which is even available on the SAP app store for those that use SAP (Ariba) and want a quick-start into tactical tail spend management, was designed to integrate with, and feed into your existing sourcing / procurement platforms — and in the case of Ariba, will fully use the Ariba Catalog and Ariba PO system to manage all spend. mysupply allows for:

  • quick event definition for sourcerers short on time (through the App or ProcurementBot)
  • roll-out to organizational users who can do their own quick-hit RFPs/Auctions/Catalog buys (also through the app, if needed, or ProcurementBot)
  • integration with your intake platform of choice for event push to the sourcing team

While it’s not designed as a full intake (or intake-powered) platform, as it was built for tactical tail spend and not all organizational spend, it was built from the ground up with integration in mind (as their goal is not to replace any platform you might already be using, as they are going after the enterprise market) and has a lot of orchestration capability built in and could even serve as an intake platform if desired (and route requests that should be strategically managed spend to an existing strategic sourcing application or to mysupply, which can also be used for strategic sourcing if desired).

Event creation in mysupply can be super easy. Options include:

  • in-house LLM-assisted Event Creation and Management via the API-powered ProcurementBot, which can be integrated through existing enterprise collaboration platforms (Microsoft Teams is in production; further integrations are planned)
  • Existing event templates that define all of the items being sourced, data required for bids, and (pre) approved vendors (which can easily be augmented or removed) (any event can be saved as a template to kick off future events)
  • events from scratch, where the platform is very adaptive and you only need to specify as much information as is necessary to source the product/service, which, if already defined in the system, can simply be an RFP request and a due date

and, most importantly, all of these strategies can include

  • demand bundling, even if different products or services should be sourced using different strategies, which can be across buyers for a given timeframe (i.e. collect all requests for a week or a month and then source)
  • pre-selected, custom, or hybrid supplier lists
  • customized lots, as the platform allows sourcing by item (price) or lot (price)
  • multiple tender/go-to-market approaches (i.e. each lot can be designated for a different type of RFX or auction), where the approach doesn't need to be selected until suppliers have confirmed interest AND initial bids are in (which is very relevant for tactical spend, where you don't know the market dynamics because you haven't researched the market and/or don't source the product or service regularly; it's not like strategic spend, where you know there are seven suppliers and five will show up to a reverse auction)
  • automated negotiation via (lot-based) QuickBot or multi-line item QuickBot
  • multiple scenarios for negotiation award analysis (where the items can be broken up for further negotiation/award after an initial bid event based on total spend, number of responses, etc.)

For the requester, integrated LLMs through ProcurementBot help the requester:

  • identify the product or service being requested
  • capture demand and critical requirements
  • select the category
  • be presented with the appropriate sourcing approach: catalog, self-service, or central sourcing (team)
    • for catalog, immediately make the buy by presenting the user with the available catalog options and allowing them to select one and complete the purchase (and then the bot completes the process in the source system)
    • for self service, flesh out tender specifics and select (pre-approved) suppliers and then ProcurementBot sends out the tenders and, when they are all returned, or a certain time has passed (as configured by the category manager in the mysupply platform) returns the quotes to the buyer through the initial chat channel (where they can select one)
    • for central sourcing, it collects the request and, if appropriate, bundles it with others that are then rolled up into a managed tender that is then put into a central buyer’s queue for management, which may happen before or after initial quote requests are sent to suppliers (if an event template has already been pre-configured)
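As a rough illustration of the three-way routing just described, here is a hypothetical sketch; the function, data shape, and threshold are invented for illustration and are not mysupply's actual API (in mysupply, this behavior is configured per category by the category manager):

```python
from dataclasses import dataclass

@dataclass
class Request:
    item: str
    in_catalog: bool          # is there a catalog option for this item?
    estimated_value: float

# Hypothetical self-service ceiling; in practice this would be a
# per-category configuration, not a hard-coded constant.
SELF_SERVICE_LIMIT = 10_000

def route(req: Request) -> str:
    """Mirror the three outcomes above: catalog buy, self-service
    tender, or the central sourcing team's queue."""
    if req.in_catalog:
        return "catalog"           # buy directly from available catalog options
    if req.estimated_value <= SELF_SERVICE_LIMIT:
        return "self-service"      # requester runs a quick tender via ProcurementBot
    return "central-sourcing"      # bundled into a managed tender for a buyer

print(route(Request("laptop stand", True, 80)))       # catalog
print(route(Request("site survey", False, 4_500)))    # self-service
print(route(Request("crane rental", False, 60_000)))  # central-sourcing
```

The value of this kind of routing is that the requester never has to know which channel is right; the bot decides and the buyer only sees what actually needs a buyer.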

Let’s dive into some key sections / capabilities for the sourcing professional.

Demand Management / Bundling

As above, the system can be pre-configured to bundle demand over a period of time for all requests for the same product or products in a pre-defined lot, but for the rest of the requests that come in, there is the demand management/bundling section. In this section, the buyer can see all of the requests, have mysupply suggest a bundling, and either pick a suggested bundle or create her own bundle. She can quickly search and filter to create custom sourcing project bundles and then immediately kick off a workflow to define a new sourcing project bundle.

When a new sourcing project is kicked off, the user is taken to a screen where they can select starting pre-defined supplier groupings that are relevant for each item requested in the demand bundle (and, of course, the system will not include duplicate invites if the supplier is in multiple supplier groups, so the sourcing organization doesn’t have to create intersection groups, just groups for each commonly requested item).

Standard Sourcing Process

Once the buyer defines a basic event through one of the workflows (kicked off from a single request or request bundle), the platform takes the user to the event summary. From there they can:

  • define the automation and starting strategy: the event can be set up to automatically select all approved suppliers, send the request out at a certain time, remind suppliers, automatically advance to evaluation when all starting bids are in or the deadline is reached, kick off automated negotiation rounds (where suppliers are given a chance to update bids based on rank information and built-in game-theory negotiation strategies), and basically free the buyer until it's time to evaluate the first round of bids and either award or kick off another round. At that point, the buyer can change the negotiation strategy and even split the event up into multiple parts; this is different from most platforms, where the entire event structure and strategy (single round, multi-round, Dutch auction, etc.) has to be defined up-front and cannot be changed, something which makes no sense in tactical tail spend sourcing where you don't know the supplier interest or current market dynamics. Note that the starting strategy can be multi-pronged based on event value (if the award can be done under 10,000, just award the lot to the current lowest bidder; if under 25,000, use autonomous QuickBot negotiation and award to the lowest bidder on an item basis; if over 25,000, do a 2nd-round RFP with the three best suppliers and more negotiation/bundling to motivate better pricing; etc.)
  • flesh out the request — quote breakdown (while it is tactical tail spend, you may still want shipping, handling, taxes, service fees, etc. broken out), basic information required, documents required, delivery and payment details that must be accepted, compliance requirements, etc.
  • invitation of the selected suppliers (where you can add or remove suppliers that were pre-populated from supplier groups appropriate to the items in the request)
  • the evaluation of the bids that come back, manually, autonomously, or a combination thereof;
    the platform supports best-price strategies; threshold strategies (which allow the strategy to depend on the amount of the bid, i.e. user-driven negotiation above a range, best-price negotiation within a range, and best-price auto-award below a range); QuickBot single-lot auto-negotiation; Multi-Item QuickBot; English auction; Dutch auction; ranking (based on weighted responses and costs); and buyer awards (no auction/negotiation). It supports lot strategies (best distribution by single-item award or all split); it supports multiple rounds, if desired, with pre-scheduled negotiation windows (for RFQs and auctions); and, finally, it supports automated awarding if strategies that permit it are selected (subject to conditions that can restrict auto-award based on LDO, Least Desirable Outcome, or MDO, Most Desirable Outcome, scenarios). Note, however, that this is just the starting strategy.
  • select one or more bids for negotiation and make an award (unassigned/unawarded items are summarized and the user can see, through color coding, the lowest cost among all offers, select one, and send it to the e-Procurement system; the user can even dynamically kick off new rounds of the RFP/auction, which may have a smaller supplier set or introduce new suppliers if the responses weren't acceptable)
  • manage Q&A with the suppliers
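The value-banded starting strategy mentioned above (auto-award under 10,000, QuickBot negotiation under 25,000, a second-round RFP above that) is just a dispatch on event value. A minimal hypothetical sketch, assuming the bands from the example:

```python
# The bands below come from the example above; in practice they are
# buyer-configurable per event, not fixed platform values.
def starting_strategy(event_value: float) -> str:
    """Pick the starting negotiation strategy by event value."""
    if event_value < 10_000:
        return "auto-award lot to current lowest bidder"
    if event_value < 25_000:
        return "autonomous QuickBot negotiation, item-level award"
    return "2nd-round RFP with three best suppliers, more negotiation/bundling"

for value in (8_000, 18_000, 90_000):
    print(f"{value:>7,}: {starting_strategy(value)}")
```

The point of deferring the choice until bids are in is that the bands can be applied to the actual quoted value, not the estimate, so a request that comes in cheaper than expected drops into a lighter-weight strategy automatically.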

A great feature of mysupply is that it is not built to replace your current strategic sourcing platform (which most organizations have), your existing catalogs and catalog management applications (it integrates with them through extensive API support), or the ERP/MRP/AP system that manages your purchase orders (it integrates with those too). It's meant to fill the tactical/tail spend sourcing hole in most organizations and, in particular, help organizations with tactical sourcing teams and help desks become considerably more efficient so overall savings can be increased through effective category management practices that capture and encode organizational knowledge. End users can then make the right buys on their own as often as possible, ensuring that the tactical team can focus on higher-spend tail categories and new categories (and develop the right strategies to manage those going forward).

If your organization does a lot of tactical / tail spend sourcing, mysupply is definitely a platform you might want to check out, especially since its ProcurementBot allows it to do intake through third party platforms organizational users are already familiar with (such as Microsoft Teams).