Category Archives: SaaS

Are Your IT Vendors Performing Up To Snuff? Probably Not. Maybe you should VendorScoreIT and find out!

Let’s face it, it’s a well-known fact that there is serious overspend in IT. Given that global IT spend is surpassing 5 Trillion, and that overspend averages at least 30%, this is a category with over one Trillion in annual, unnecessary losses. And, let’s be honest, your organization is contributing to this loss as much as your peers unless you are taking multiple initiatives to minimize it. (See: Roughly Half a Trillion Dollars Will Be Wasted on SaaS Spend This Year and up to One Trillion Dollars on IT Services. How Much Will You Waste?)

Moreover, just because you engaged / acquired one of the SaaS spend analyzers and optimizers (that we chronicled in our piece on the Sacred Cows) to optimize your SaaS spend, that doesn’t mean you’ve optimized it. Amalgamating your licenses onto fewer platforms, amalgamating the spend onto one invoice, and then ensuring all of the licenses are assigned to active employees is a good start, but that still doesn’t mean you are getting your money’s worth. First of all, this only works for commodity software — it doesn’t work for big back-office systems like your ERP or finance platform where you have to negotiate directly with the vendor. Secondly, it doesn’t work for cloud costs. Thirdly, it only addresses software costs, not service/support costs. Finally, if the platform isn’t doing what it is supposed to do and the vendor is not supporting you, every dollar you are spending could be wasted. That’s why vendor performance management in IT software (platform) and service categories is critically important.

That is precisely what VendorScoreIT was founded to do: allow an SME (or larger) organization to monitor and manage its IT vendors on a regular basis through scorecard-based performance management at a price any organization can afford (with pricing that starts at an amount that can be paid on a P-Card), while also providing that SME (or larger) organization the support to do it right.

Now, you might think that such a solution wouldn’t be needed as there are SXM solutions with scorecarding and good performance management modules, and that’s true, but most of these were designed for large organizations with a six-figure price tag to match — which puts these solutions out of reach of SMEs, and especially those that just need to get their IT and strategic suppliers under control. (In other words, while VendorScoreIT was initially designed to help an organization manage its IT vendors, which are typically the most significant unmanaged vendors in many organizations, it is generic enough that it can be used to score any supplier.)

The solution is scorecard driven. You can:

  • define scorecard models by vendor type, primary service, and/or platform offering
  • break scorecards up into core areas (operations, partnership, innovation, and procurement) and measures by area
  • associate multiple scorecards with a vendor (based on the services and/or platforms you want to score and monitor individually)
  • define scoring intervals (and the solution will automatically initiate scorecard rounds on the interval you define and invite all of the individuals you want to score it)
  • define who you want to score each individual scorecard (and the solution will average the scores across all respondents)

The platform, which allows an organization to define its areas and measures of interest, also allows it to define default service model and platform scorecard templates, and then associate these with the vendors it wants to score. Vendors can be grouped by type and tier for easier management, and the platform comes with predefined models for SaaS vendors and system integrators to jump-start an organization’s vendor management initiatives. (Supporting non-IT vendors is just a matter of defining any additional areas of relevance, and the measures for those areas, in the metadata for the products (which would fall under platforms) and services the vendor offers, defining default scorecards, defining a new vendor type, and associating the relevant product and service scorecards with that vendor.)
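
To make the scorecard mechanics concrete, here is a minimal sketch (in Python, with hypothetical names — VendorScoreIT’s actual data model is not public) of how areas, measures, and respondent averaging might fit together:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Measure:
    name: str           # e.g. "support responsiveness"
    area: str           # operations, partnership, innovation, or procurement

@dataclass
class ScorecardTemplate:
    vendor_type: str    # e.g. "SaaS vendor", "system integrator"
    measures: list[Measure]

@dataclass
class ScorecardRound:
    template: ScorecardTemplate
    # scores[respondent][measure_name] = rating (e.g. 1..5)
    scores: dict[str, dict[str, int]] = field(default_factory=dict)

    def measure_average(self, measure_name: str) -> float:
        """Average a single measure across all respondents who scored it."""
        ratings = [r[measure_name] for r in self.scores.values() if measure_name in r]
        return mean(ratings) if ratings else float("nan")

    def area_average(self, area: str) -> float:
        """Roll measure averages up into an area score."""
        names = [m.name for m in self.template.measures if m.area == area]
        return mean(self.measure_average(n) for n in names)

# usage: two respondents score a SaaS vendor on two operations measures
template = ScorecardTemplate("SaaS vendor", [
    Measure("uptime", "operations"), Measure("support responsiveness", "operations")])
round_ = ScorecardRound(template, {
    "cio":   {"uptime": 4, "support responsiveness": 2},
    "buyer": {"uptime": 5, "support responsiveness": 3}})
print(round_.area_average("operations"))  # 3.5
```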

The platform makes it super easy to see overall performance by vendor type, vendor, and vendor service or platform, for a time period of your choice (and to output those scorecards to Excel) — offering simple drill-down / drill-up functionality on scorecards to allow a user to quickly identify where a supplier is doing well, where it’s not doing well, and how it’s doing with respect to its peers. It’s also super easy to pull up the comments associated with a measure that you drill in on, which can give you insight into why a score is high or low.

Right now, it doesn’t have any integrations with third-party data sources (for risk, environmental, financial stability, up-time metrics, etc.), but an Open API is coming that will allow you to integrate some metrics with ease. Nor does it have initiative management (for when you decide you need to take action on a supplier that is underperforming in a critical area), but that is currently being designed and will materialize at some point (and most organizations already have a project management platform they prefer to use).

But the platform does come with a considerable amount of expert advice and support to jump-start your initiative, from experts with almost two decades of vendor management experience (especially in IT), depending on which service level you buy. With the:

  • Professional Plan: in addition to customized scorecards, you get guided performance reviews and quarterly strategic sessions
  • Premium Plan: in addition to everything in the Professional Plan, you get a customized vendor management framework, monthly strategy sessions, and targeted strategic vendor consulting

However, it has the collaborative scorecarding many SMEs are missing in their platforms and, more importantly, has it at a very affordable price-point, with premium service options available where one of their experts will help you with initial scorecard creation, vendor assignment, performance review training, and even regular strategy sessions and supplier performance management training. VendorScoreIT is all about making vendor performance management accessible and affordable for all organizations in a package that anyone can use, and get going with quickly. If this sounds like something you could use, you should check it out.

The Pundits Agree. Winter is Coming!

In a recent LinkedIn Post, THE PROPHET, a year late to the party (see the SI and Procurement Insights archives, and our Marketplace Madness post in particular), finally announced that Winter is Coming: The Great Procurement (and Broader) Legacy SaaS Rationalization and that it is going to be a very cold winter that will be Swift. Brutal. And very, very final.

There are too many companies that took too much funding at too high a valuation with nothing to show. PE firms will be dropping these companies faster than a hot potato into boiling lava pits to focus on the companies in their portfolio with current year-over-year growth in hopes of making some of their losses back. Too many companies started without doing any research and literally built the 10th AP clone, SXM clone, RFX clone, etc. of a dozen already existing solutions. (Just check the mega map if you don’t believe me.) In a race to the bottom (which helps no one), they’ll lose due to their lack of a bank account as they slash prices too deep in hopes of getting customers. Too many applications took a silo focus, weren’t built Open-API centric, and the hurdles of plugging them in will be too much or too costly, and no matter how good the tech is, they won’t get bought.

And then Agentic AI will thin what remains (but not lead the cull as THE PROPHET predicts). These AI solutions still cost money, and sometimes a lot of it. For what they cost, you can hire a real expert and not a fake one, license a few augmented intelligence tools for a quarter of what these overhyped Agentic AI platforms are charging (because they raised, and wasted, too much cash and have to recoup that before they become the next hot potato dropped into the lava pit by their PE investors), and have a super-human employee who does the work of an entire team, error free (with little to no risk of that skilled employee getting you sued as a result of a conversation, installing a back door for hackers on your systems, shutting down your systems for days, costing you 10X on a purchase due to a decimal-point transcription error, or increasing your internal fraud [link]).

And for some of these companies, it’s already too late to pivot. For others, there are actions they can take. THE PROPHET offered some:

  • risk over-cutting
    (if done smartly)
  • consumption-based models
    (which will be more attractive to some potential customers)
  • challenge the team to earn their existence
    (but that doesn’t necessarily mean prompting GPT like a pro: you don’t want junkies)
  • redefine the sales org
    (a better playbook is key, and it needs to be differentiated)
  • Skip the Fairy Dust and Buzzwords
    (Hear! Hear! I’ve been shouting that for years! Unfortunately, not sure most of the companies out there know how to do that though! I know for a fact only the squirrels have been listening to me, and they are getting very tired of that rant. They like variety. Basically, it’s been a long, long time since marketing focussed on education and value and startups priced based on that.)

And that’s just the tip of the iceberg. SI has posted entire series on this.

Failures from those who raised too much and offer too little will be coming fast and furious. This needs to be repeated. You need to be very careful which vendor you buy from and what protections you have in place if they don’t make it. (In particular, there must be a “we own our data” clause which gives you the right to all of your data and the right to export all of it into a standard file format at any time, and that specifies your data includes rules and workflow configurations and the right to export those too — for example, in spend analysis, it’s not just the data, it’s the rules that create the cubes; in invoice processing, it’s the workflow and approval rules. You won’t be able to migrate to another system quickly if all you have is the transaction data; the data that defines the processes and rules is just as important. And if you can’t export all of your data, rules, templates, etc., at any time, then don’t buy the system!)

However, while the app consolidation will be brutal, as will the renegotiations if you want to be one of the apps that make it (now that organizations are realizing they don’t need 17 apps for S&M and probably shouldn’t have 17 apps in any function, including Procurement), Agentic AI (especially at 20K a month when you can hire a REAL person for 1/4 of that) will not replace people en masse (but AI-enabled technology will). Teams will be cut and replaced by two individuals who can use next-generation augmented intelligence solutions that can truncate months of research and analysis to a few days and allow strategic decisions to be made in hours, not weeks, and shifts to be made seemingly overnight, while eventually allowing 99% of all tactical data processing to be automated through evolving rules and workflows under expert guidance.

Moreover, at the end of the day, relationships are not built on 1s and 0s, and they are needed now more than ever. So not only will we have skilled technologists, but skilled relationship managers. (While everyone else who does nothing but push e-paper 90% of the time or code spreadsheets will slowly be eliminated.) Of course, this means if you don’t understand optimization, analytics, statistics, game theory, economics, and logic, or you’re not an expert in relationship management, you’re screwed, but everyone had a chance to study STEM in University (and skip the woke liberal arts) and learn the technical skills for the first set of jobs.

So this also means if the platform is not enabling this next generation of employees to become more and more productive over time, its lifespan is probably short.

So focus your diligence efforts on solution acquisition if you don’t want your platforms disappearing out from under your virtual feet, and if you need help, call an expert!

Simplify Services Sourcing By NVELOPing Your Bid Packages!

Services sourcing in most organizations is a complex nightmare. It’s not simple like indirect sourcing where you identify a finished product need, send out an RFQ for a standard product spec, get some quotes, do a landed cost calc using your pre-negotiated or market spot-buy freight rates and current tariffs, and select the lowest cost bid. Easy-peasy. It’s not even straightforward like BoM (Bill of Material) or Program management in a direct sourcing application where you send out a quote package with a set of components, detailed drawings and specs on each component, detailed cost breakdown requests, anticipated production schedules, and compliance and regulatory requirement documents. (And yes, while this is a lot of work to put together even with the best platform, including platforms that can suck in the majority of requirements from the ERP and the PLM, it’s still relatively straightforward for an engineer.)

Services sourcing is complex. While services might have categories for the chart of accounts, and services professionals might have standard roles, and any subsequent request for the same service on the chart of accounts from someone in the same role with the same experience will be similar, they won’t be the same. Installing a cable line is not just installing a cable line. Is it a home line or a business line? Do you have to connect to the pole, a junction box, or a rack mount? Do you have to drill through walls? Are they wood or cement? Implementing an ERP is not implementing an ERP is not implementing an ERP, even if it is SAP or Oracle in all instances. What version? What cloud platform does it have to run on / integrate with? Which P2P and back-office systems have to be connected? And so on.

On top of the basic project requirements, services projects require a lot of terms and conditions, NDAs, professional certifications and insurance requirements, key performance requirements, confidential information on current state and desired state, evaluation criteria, etc. In a typical organization, a bid package will consist of a huge stack of e-documents, hastily assembled (and riddled with errors due to the haste), zipped up, and sent off to bidders who, hopefully, when struggling to fill out the overly convoluted RFPs on a tight deadline, don’t miss any key requirements before sending it back. (When the organization is in a crunch, which it always is, a lot of this work will often be done by third party consultants who will be less familiar with the requirements than the overworked staff who don’t have time to do it, leading to oversights as well as errors.)

Nvelop was founded to solve these woes, namely

  • the time requirement to put together the core of the RFP
  • the need to ensure that RFPs contain all the required bid / information fields
  • the effort to collect all the corresponding documents
  • the need to ensure the terms and conditions satisfy legal
  • the need to ensure suppliers see and respond to all mandatory terms and conditions
  • the need to communicate with vendors in a secure, trustable manner
  • etc.

We’re going to discuss their solution by addressing these points.

The RFP Core

Services RFPs are extensive and time consuming to put together. So Nvelop gets around this by jump-starting the process with LLMs that will create a starting RFP given:

  • project type (RFI, RFP Lite, RFP)
  • domain (Technology, IT Services, Facility Services, Legal Services, Marketing & Advertising, etc.)
  • expected issues (high competition, business criticality, environmental risk, etc.)
  • pricing model (fixed price, target price, time & materials, etc.)
  • engagement type (staff augmentation, system integration, software implementation, consulting & advisory, etc.)
  • business domain (Sales & Marketing, Finance, HR, Supply Chain, etc.)
  • technology [stack] (AWS, Google Cloud, Oracle, etc.)
  • enterprise software (SAP, Oracle, Salesforce, etc.)
  • business criticality (1 to 5 scale)
  • background material (any documents with relevant information that will not be given to bidders)
  • brief project description
  • meta-data for management and indexing (due dates, owner, team, categorization, etc.)
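
As a rough illustration of the kind of structured request that could drive this generation step (a sketch only; field names here are hypothetical, not Nvelop’s actual API), the inputs above amount to something like:

```python
from dataclasses import dataclass, field

@dataclass
class RfpGenerationRequest:
    """Hypothetical bundle of the inputs listed above that seeds the LLM draft."""
    project_type: str                # "RFI", "RFP Lite", "RFP"
    domain: str                      # "IT Services", "Facility Services", ...
    engagement_type: str             # "system integration", "staff augmentation", ...
    pricing_model: str               # "fixed price", "time & materials", ...
    business_domain: str             # "Finance", "Supply Chain", ...
    technology_stack: list[str]      # ["AWS", "SAP", ...]
    business_criticality: int        # 1 (low) to 5 (high)
    expected_issues: list[str]       # ["high competition", "environmental risk", ...]
    description: str                 # brief project description
    background_docs: list[str] = field(default_factory=list)  # internal only, never sent to bidders

def build_prompt_context(req: RfpGenerationRequest, similar_past_projects: list[str]) -> str:
    """Assemble the context an LLM could draft the starting RFP from."""
    return "\n".join([
        f"Project type: {req.project_type} ({req.engagement_type}, {req.pricing_model})",
        f"Domain: {req.domain} / {req.business_domain}; criticality {req.business_criticality}/5",
        f"Technology: {', '.join(req.technology_stack)}",
        f"Known issues: {', '.join(req.expected_issues)}",
        f"Description: {req.description}",
        "Relevant past projects:\n" + "\n".join(similar_past_projects),
    ])
```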

It will use an LLM, trained on generic project documents that match the request as well as historical projects from the organization, to generate a starting RFP that breaks the project requirements down into core processes and sub-processes, with supporting requirements for each sub-process. For example, for an ERP Application Maintenance project, it will define the core processes of:

  • application maintenance
  • ERP system management
  • integration services
  • reporting and analytics

and for application maintenance, for example, it will identify the core sub-processes of:

  • issue resolution
  • incident management
  • change management
  • performance monitoring
  • user support

with a detailed description of each sub-process and bidder requirements. For example, for issue resolution, it might break down into:

  • vendor must allow users to log issues through a portal and issue confirmations and ticket numbers
  • vendor must investigate all issues within one business day, regardless of criticality
  • vendor must respond to critical issues within one hour with a resolution team and estimated timeframe for resolution
  • vendor must maintain a knowledge base updated bi-weekly with common issues and resolutions for self-help
  • etc.

Once the initial RFP has been auto-generated, the buying team can

  • add internal comments during team discussions and/or collectively prioritize requirements
  • manually add, edit, or remove any process, sub-process, requirement, or description
    with status (generated, edited, etc.) tracked
  • approve when happy (and the platform can be configured to prevent issuance until all requirements are approved)
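
A plausible (and purely illustrative) way to model that generated hierarchy, including the status tracking and approval gate described above:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    GENERATED = "generated"   # drafted by the LLM
    EDITED = "edited"         # touched by the buying team
    APPROVED = "approved"     # signed off for issuance

@dataclass
class Requirement:
    text: str                           # e.g. "vendor must respond to critical issues within one hour"
    status: Status = Status.GENERATED
    comments: list[str] = field(default_factory=list)   # internal team discussion

@dataclass
class SubProcess:
    name: str                           # e.g. "issue resolution"
    requirements: list[Requirement] = field(default_factory=list)

@dataclass
class Process:
    name: str                           # e.g. "application maintenance"
    sub_processes: list[SubProcess] = field(default_factory=list)

def ready_to_issue(processes: list[Process]) -> bool:
    """The platform can be configured to block issuance until every requirement is approved."""
    return all(r.status is Status.APPROVED
               for p in processes for sp in p.sub_processes for r in sp.requirements)
```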

All Requirements Accounted For

Nvelop is not a new-age, rapidly developed ChatGPT wrapper parading as a Procurement solution when it really isn’t. It is a new startup founded by consultants who spent 20 years doing services projects while constantly thinking to themselves that there had to be a better way, who came together and defined what the requirements of such a sourcing platform should be, who built a real platform (that walks buyers through a 7-step process), and who only use LLMs for generating content where it makes sense.

The platform has a fairly extensive administration component where, for a project type, you can define:

  • the starting bid templates
  • core documentation requirements
  • mandatory terms & conditions

and the generation logic will ensure that all of these are included in the starting RFP (and associated package) that is generated, either through custom LLM instructions or forced overrides.

It also has a document library where you can store all of your standard documents on company profile, insurance requirements, compliance requirements, general service requirements for personnel on your site, etc. that can be pulled into all relevant projects.
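
A sketch of how that “always included” guarantee might work, with the admin configuration and document library modelled as simple lookups (names are illustrative only, not Nvelop’s actual configuration schema):

```python
# Hypothetical per-project-type configuration, maintained in the admin section.
PROJECT_TYPE_CONFIG = {
    "ERP Application Maintenance": {
        "bid_template": "erp_am_rfp_v3",
        "mandatory_documents": ["NDA", "insurance requirements", "on-site personnel rules"],
        "mandatory_clauses": ["data ownership", "liability cap", "termination for convenience"],
    },
}

# Hypothetical document library keyed by document name.
DOCUMENT_LIBRARY = {"NDA": "nda.pdf", "insurance requirements": "insurance.pdf",
                    "on-site personnel rules": "site_rules.pdf"}

def assemble_package(project_type: str, generated_sections: dict[str, str]) -> dict:
    """Force-merge mandatory items into whatever the LLM generated."""
    cfg = PROJECT_TYPE_CONFIG[project_type]
    missing = [d for d in cfg["mandatory_documents"] if d not in DOCUMENT_LIBRARY]
    if missing:
        raise ValueError(f"Library is missing mandatory documents: {missing}")
    return {
        "template": cfg["bid_template"],
        "sections": generated_sections,                        # LLM output, already reviewed
        "attachments": [DOCUMENT_LIBRARY[d] for d in cfg["mandatory_documents"]],
        "clauses": cfg["mandatory_clauses"],                    # included verbatim, never paraphrased
    }
```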

Effort to Collect Corresponding Documentation

Since the platform, as described above,

  1. has a document library
  2. can, privately, store all related documents relevant to RFP construction on a project basis

it’s very easy to automatically include all of the relevant documents in an RFP, as the majority will already be in the system, and the rest can simply be uploaded and the majority of relevant content auto-extracted by the LLM during the initial generation process.

Terms and Conditions Satisfy Legal

Not only can you include a standard clause library in the administration section, but you can also configure the application to use pre-approved legal clauses verbatim in requirements and draft agreement generation, and the LLM will only be used to generate the parts of the RFQ and draft agreement for which there is not a verbatim clause. In addition, if a clause needs to be adjusted for different geographies, categories, etc., you are able to configure the application to force the LLM to base its generated response on a library clause. In other words, if you already have something acceptable, you can be sure it makes it into the RFQ or draft agreement vs. just rolling the bones.
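
In pseudo-code terms, the clause handling described above boils down to a “library first, LLM only as a fallback” rule (a sketch under assumed names; `llm_generate` stands in for whatever generation call the platform actually uses):

```python
# Hypothetical pre-approved clause library: (topic, geography) -> exact legal text.
CLAUSE_LIBRARY = {
    ("limitation of liability", "EU"): "Liability is capped at ...",
    ("limitation of liability", "*"):  "Liability is capped at ...",
}

def llm_generate(instruction: str) -> str:
    """Placeholder for the platform's LLM call; assumed, not Nvelop's real API."""
    return f"[generated text for: {instruction}]"

def clause_for(topic: str, geography: str) -> str:
    # 1. Exact pre-approved clause: used verbatim, never rewritten by the LLM.
    if (topic, geography) in CLAUSE_LIBRARY:
        return CLAUSE_LIBRARY[(topic, geography)]
    # 2. A generic library clause exists: the LLM is forced to base its output on it.
    if (topic, "*") in CLAUSE_LIBRARY:
        base = CLAUSE_LIBRARY[(topic, "*")]
        return llm_generate(f"Adapt this approved clause for {geography}, changing as little as possible: {base}")
    # 3. No library clause at all: only now is free-form generation allowed.
    return llm_generate(f"Draft a clause covering {topic} for {geography}")
```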

Forced Supplier Response on Mandatory Ts and Cs

RFI and RFP packages can be designed to force a supplier to respond to each mandatory and critical requirement, where they can simply select a Yes/No or Yes/Partial/No response with additional commentary, if required. This way a supplier can never say they “didn’t know” something was a requirement in the final stages of a negotiation, as they will have seen and responded to it during the initial bid, and requirements traceability is a core part of the platform.

Secure Communication

Since the RFP process is now going through a platform, where all documents can be securely downloaded and uploaded, all communications securely maintained in their own, auditable, stream, and all required confidential documentation can be accessed at any time once the NDAs have been accepted, the platform solves the security issue that buyers and vendors have with sensitive documents and bids being sent back and forth through email or common FTP portals.

Solution Summary

As we hinted above, the solution will walk a buying team through a seven-step process that consists of:

  1. Planning – enter all the data you have and instruct it to generate a starting RFQ
  2. Requirements – edit and finalize the requirements listing
  3. RFQ – finalize the RFQ (by approving or editing), which will consist of
    • overview information (introduction, your client info, submission and evaluation process, technical landscape, services overview requirements, services timeline, and other relevant information sections)
    • Questions – the requirements you worked on last phase, which can be extended with other questions about the vendor not core to the services requirements
    • Pricing – where the vendor will submit the bid sheets in the appropriate format that is automatically identified based on procurement type and category (including fixed price bids, rate cards, etc.)
    • Evaluation Criteria – where you define the criteria (and weighting)
    • Attachments – automatically pulled in
  4. Tendering – where you can see the RFP Preview (as the vendors will see it), select the vendors, and handle the Q&A; and where you can resend if you have to make an update or do a subsequent round
  5. Responses – which captures the vendor responses
  6. Evaluation – where you evaluate the responses once the RFQ is closed, and can compare them side-by-side at a high level, drill down into the details, and have the system generate overall scores based on the evaluation criteria (and weighting) you define
  7. Deal Room – where you kick off a negotiation process with one or more vendors, assisted by an automatically generated assessment of deviations from specifications or requirements that you will need to address (which will be based not just on clicked boxes but comments, likely intention, degree of deviation, summary of the deviation, assessed negotiation complexity, and likely relative importance)

Moreover, a supplier can easily access, and respond to, the RFQ through the vendor portal, which allows them to quickly access the relevant sections, check the boxes, provide their responses, and upload documents. They can also engage in Q&A and see the status of each project they have been invited to. The Q&A capability includes an LLM-powered chatbot that will search all of the available documentation and provide answers to questions already answered, including pointers to where to find that information in the RFQ package, and that will, when an answer is not found, direct the user to message the buyer directly for the answer.
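
The Q&A behaviour described above is essentially “answer from what’s already in the package, otherwise route to the buyer.” A minimal sketch (naive keyword matching stands in for whatever retrieval Nvelop actually uses, so treat this purely as an illustration of the fallback logic):

```python
def answer_question(question: str, package_sections: dict[str, str],
                    answered_qa: dict[str, str]) -> str:
    """Hypothetical Q&A helper: reuse existing answers, point to the RFQ section,
    otherwise tell the supplier to message the buyer."""
    q_words = set(question.lower().split())

    # 1. Has a near-identical question already been answered in this event?
    for prior_q, prior_a in answered_qa.items():
        overlap = q_words & set(prior_q.lower().split())
        if q_words and len(overlap) >= 0.6 * len(q_words):
            return f"{prior_a} (previously answered: '{prior_q}')"

    # 2. Does an RFQ section appear to cover it? Point the supplier at it.
    for section_name, text in package_sections.items():
        if q_words and len(q_words & set(text.lower().split())) >= 0.6 * len(q_words):
            return f"See the '{section_name}' section of the RFQ package."

    # 3. Nothing found: direct the supplier to the buyer.
    return "No answer found in the package; please message the buyer directly."
```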

Why SI is Covering and Recommending Nvelop for Shortlist Consideration

Those who follow SI will know that the doctor despises the rampant proliferation of untested Gen-AI and the random application of Gen-AI to every problem, even problems so obviously far beyond what Gen-AI was suited for that even a complete idiot would abandon the tech once they saw how badly it worked, so why does the doctor recommend this Gen-AI powered platform?

  1. it’s not Gen-AI / LLM driven
    the core is a solid workflow app that follows a process that the founders, each of whom has over two decades of services sourcing support, know works
  2. LLMs are being used for what they are good for
    massive document store summarization and document generation off of standard requirements and similar projects
  3. the LLM can be fine-tuned
    and you can direct it to (re)generate an entire package, single document, single section, single process, or single line-level requirement description with additional instructions
  4. everything can be overwritten or manually generated in the first place
    Gen-AI was never meant to be the solution, but a starting point that can get you 90% to 95% of the way there when you don’t have an out-of-the-box solution, so it is designed to generate content where it can get “close enough” and where it’s easier to manually edit the generated output than to even generate a starting document from scratch through cut-and-paste
  5. every single line MUST be manually approved
    and while you can click that “approve” button without reading the associated content, if something is issued wrong, then everyone knows who didn’t do their job and who is ultimately responsible

Moreover, it will get you through a complex sourcing project mostly correct and mostly complete in a matter of weeks, with little to no external (consulting) support required, even if you’ve never done that particular complex sourcing project before! And while no solution is perfect, we’d hazard a guess that even a neophyte would do a better job with this platform than a grizzled veteran who had to do everything manually under a severe time crunch. (While the grizzled veteran likely wouldn’t make any mistakes on anything they touched, they would be likely to miss something important in the virtual stacks of paperwork with more pages than a copy of “War and Peace” [Simon & Schuster edition] given the time crunch they are always under.)

Nvelop might be new, but it’s solid, which is what you get from a solution built by veteran complex services sourcing professionals specifically to support the processes that complex services sourcing professionals use. So if you are in an industry with a lot of services sourcing requirements, and your current sourcing solution (designed for indirect and direct) is letting you down (and it is), then we recommend you at least give this solution a look. The responsible use of AI impressed the doctor, so we would hazard a guess that it should impress you too.

Is your Procurement Practice Too Tactical? Maybe you need to Quote STRATegic with QSTRAT!

QSTRAT was founded twenty years ago to help companies get a better understanding of real product and part costs in order to help those companies deal with relentless margin pressure, constantly tightening timelines, and always-on global competition. They do this through a very customizable, and customized, Sourcing and Supplier Management Platform along with a rather unique Distributor Quoting Solution (which we are not covering in detail) to support value-added distributors (VADs) [who go beyond just logistics and fulfillment and might also provide training, technical support, sales demonstrations, and/or bundling with other complementary products for more complete customer solutions, in addition to other value-add services]. Let it just be said that the Distributor Solution is an extension of the manufacturer solution where the value-added distributor can get detailed quotes from the supply base, add its own markups and outbound costs, and provide very detailed quotes to a buyer.

The core of their platform is a highly configurable single tenant cloud sourcing and supplier management solution for manufacturers and distributors who need to extend their ERP backbone with solid strategic sourcing capability in a manner that is compatible with their favourite tools — Microsoft Excel, e-Mail, and PDF forms.

Before we go any further, we are going to call out one key difference between QSTRAT and most other cloud-based platforms — there is no supplier portal. Certain traditional manufacturing industries are populated by old-school manufacturers who are not very modern technology / SaaS savvy. They don’t want yet another supplier portal to try and figure out, to try and remember the passwords and security configurations for, and to have to log into every day to try and find their orders and communications when for years they ran off of email and spreadsheets and could get all their communications through one source. Also, in defense, it can be very hard to get yet another platform/portal approved, and even if it’s okay for your company, if one of your suppliers is also a defense subcontractor, they may not be able to use your new platform, putting you in a pickle — do you insist on the platform and find a new supplier, or keep the supplier and forgo the benefit of the platform you just bought?

QSTRAT was developed with those customers in mind and ensures all bids, quotes, surveys, and information exchange go through secure PDF forms (or, should the customer choose, Excel spreadsheets can also be used, especially for simple information requests where there is no need for provably secure audit trails because no award is being made). This is because, in their target market, e-mail, Excel, and PDF forms are already accepted by the majority of suppliers. Whenever a quote, survey, request, etc. is issued, a custom secure PDF form is generated by the platform and sent to the supplier through an email (link) for completion (and button-based submission when they are happy with the form, where the state can be saved while it is in process).

The QSTRAT Sourcing and Supplier Management platform has the following key parts:

  • part database
  • supplier management
  • events (RFPs/Qs and Surveys)
  • reports
  • administration

Part Database

The platform can maintain the organization’s complete part database natively to facilitate rapid (re)sourcing of parts and programs, which can even consist of multiple quote packages that collectively satisfy a bill of materials. Part profiles are extremely extensive and can be configured to track all of the fields you want to track as well as have their own cost breakdown models if desired. They can be associated with risk factors, compliance requirements, insurance requirements, and detailed CAD/CAM drawings / STP files and all of this information will be sucked in by an event that includes the part.
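
As a rough sketch of what such a configurable part profile might look like under the hood (field names are hypothetical; QSTRAT lets each customer define their own fields on implementation):

```python
from dataclasses import dataclass, field

@dataclass
class CostBreakdownModel:
    """Per-part cost model: the components a quote must be broken down into."""
    components: list[str]            # e.g. ["material", "tooling", "certification", "freight"]

@dataclass
class Part:
    part_number: str
    description: str
    custom_fields: dict[str, str] = field(default_factory=dict)    # whatever the customer configured
    cost_model: CostBreakdownModel | None = None
    risk_factors: list[str] = field(default_factory=list)
    compliance_requirements: list[str] = field(default_factory=list)
    drawings: list[str] = field(default_factory=list)               # CAD/CAM drawing / STP file references

def event_attachments(parts: list[Part]) -> list[str]:
    """Everything attached to a part gets pulled into any event that includes the part."""
    return [doc for p in parts for doc in p.drawings + p.compliance_requirements]
```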

Supplier Management

Supplier management is kicked off during initial implementation, where the customer’s (active) supplier list is loaded from the ERP. Once the suppliers are loaded, additional information can be collected through surveys that can be sent to the suppliers as part of onboarding, data collection, performance reviews, etc. Suppliers can be categorized into organizational hierarchies of choice, which can be product based, region based, raw-material based, etc., to allow for easy administration and selection of relevant suppliers for events.

Supplier profiles can be as in-depth as you like and upon system installation QSTRAT will define as many data fields in as many categories (basic profile, financials, contract management, risk factors, Scope 3, scorecard, etc. etc. etc.) as the organization desires. All available data will be imported from the existing supplier master in the ERP, the rest can be collected from surveys or manual entry, and updated data can be pushed back to the ERP on a schedule.

Suppliers can be created and onboarded natively in the QSTRAT application (and then pushed to the ERP when approved), and onboarding can begin with only a supplier name, contact name, and contact e-mail. Onboarding can be multistage and start with a registration questionnaire, continue with specific questionnaires based on the categories you will assign the supplier to (and focussed on its capabilities), and end with a formal evaluation exercise that follows a pre-defined workflow that will end in a supplier approval (or denial). The specific requests can include tooling capabilities, associated capacities, and, if the supplier is an MRO supplier or provides services, it can also include maintenance services and default rate cards.

Events

RFQs, which are the primary events in the system, consist of header information, line information, attachments, suppliers, communications, and returned quotes. The header information defines the event and goes beyond the simple metadata (id, name, dates, contact) to also define the RFQ type (and associated workflow), program relationship (is it supporting a program defined by a parent RFQ that has been split into sub-programs to simplify analysis based on similar part types or supply base?), prior event history (with the last quotes received, if they exist), and any financial guidance you want to provide (like expected margin % in detailed cost breakdowns, etc.).

The attachments consist of global event attachments as well as individual attachments by part. Default attachments will be pulled in according to the RFQ type and the parts included in the RFQ, but additional documents can be attached at the global or part level before the RFQ is issued. These attachments will typically include NDAs, CAD/CAM drawings, compliance and insurance requirements, etc.

The lines represent the individual parts being quoted, each of which can have its own custom-defined detailed cost breakdown model, notes, and attachments. The quotes, which are filled out through custom PDF forms generated for the suppliers, can be exported in bulk to CSV for users that like to do spreadsheet-based analyses, but can also be compared in the QSTRAT platform in the QuoteMatrix, which will show the quotes summarized (and sortable) on two key fields (landed cost, fully burdened cost, country of origin, or any other pre-defined calculation or identifier). The buyer can use this quote matrix to select bids for award, or use the pre-defined auto-select functionality (which, depending on the workflow, will select the lowest cost, the lowest cost that meets a certain requirement, etc.). The quote matrix is not limited to current bids by line, but can include key description fields (unit count, target price) as well as historical quotes for comparison. The matrix has extensive filtering capability, so it is really easy to see parts with all, some, or no supplier responses, parts that are or are not selected for award, and parts with and without returned attachments. The buyer can also drill into the cost breakdown matrix by supplier to see the relative costs for each component (material, tooling, certification, etc.).
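
A simplified sketch of the QuoteMatrix idea: summarize quotes per line on a couple of key fields and auto-select per a configured rule (everything here is illustrative; the real selection rules are defined per workflow):

```python
from dataclasses import dataclass

@dataclass
class Quote:
    supplier: str
    part_number: str
    landed_cost: float
    fully_burdened_cost: float
    country_of_origin: str
    meets_requirements: bool = True

def quote_matrix(quotes: list[Quote], key_fields=("landed_cost", "country_of_origin")):
    """Summarize quotes per part on two key fields, sorted on the first field."""
    matrix: dict[str, list[tuple]] = {}
    for q in quotes:
        row = (q.supplier,) + tuple(getattr(q, f) for f in key_fields)
        matrix.setdefault(q.part_number, []).append(row)
    for part in matrix:
        matrix[part].sort(key=lambda row: row[1])   # sort on the first key field
    return matrix

def auto_select(quotes: list[Quote]) -> dict[str, Quote]:
    """One possible rule: lowest landed cost that meets all requirements, per part."""
    winners: dict[str, Quote] = {}
    for q in quotes:
        if not q.meets_requirements:
            continue
        best = winners.get(q.part_number)
        if best is None or q.landed_cost < best.landed_cost:
            winners[q.part_number] = q
    return winners
```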

Suppliers can be auto-selected based upon a pre-configured auto-assignment rule (that will select all suppliers that can supply all of the parts or at least one of the parts), assigned en masse based on associated categories, assigned on a part basis based on manual selection, or selected en masse based on category and then deselected on a part-by-part basis. Once the supplier is selected, the buyer can select the contacts at the supplier who should receive the RFQ, which, depending on the event flow, will be pre-selected to the first/default contact or all contacts.

Events are single round by default, but can be turned into multi-round events with the re-quote functionality, which can also be used to kick back a quote (with a comment) to the supplier for re-quoting during the event if the buyer believes the supplier made a mistake.

Reporting

Accessible from its own tab or as drill-downs off the dashboard, the platform contains a number of built-in reports around event activity, customers, suppliers, and parts that break down by status, spend, activity, etc. The specific dashboard widgets and reports will depend on the platform configuration at implementation, and if the buyer wants DIY reporting, they can export all of the data to CSV or pull it into an external business intelligence (BI) or spend analysis tool using the Open API.

Administration

The user can do standard organization, user, workflow, and form field administration through the platform, but most of the configuration is done on implementation where the different event workflows are defined for the different event types that the buyer wants the system to support.

The platform supports a large number of event types, which include, but are not limited to, supplier registration, supplier information update, supplier evaluation, tooling request, RFI, estimate (only), RFQ, order request, maintenance request, market test, etc., and additional types can be configured on demand. Each event type can have an associated response/quote-flow definition that defines not only the desired header information, attachments, and communication requirements, but also the associated workflow for event creation and issuance as well as review and approval. Note that every field/document submission requirement defined at the header level will cascade down to each individual line, which will also pull in the cost model, fields, and attachment requirements associated with the part/SKU. These templates can also be configured to pull in the cost of an item currently in inventory or the last quote from a supplier (if still valid), to minimize the effort on the part of the supplier to update a quote and respond to an estimate request or RFQ.
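
To illustrate the cascade just described (purely a sketch; the real templates are configured by QSTRAT at implementation): header-level submission requirements flow down to every line, each line adds its part-specific cost model and attachment requirements, and a still-valid prior quote can be pre-filled for the supplier.

```python
def build_line_requirements(header_reqs: list[str], part: dict,
                            last_quotes: dict[str, dict]) -> dict:
    """Hypothetical cascade: header requirements + part-specific requirements + pre-filled prior quote."""
    line = {
        "part_number": part["part_number"],
        "required_fields": list(header_reqs) + part.get("cost_model_fields", []),
        "required_attachments": part.get("attachment_requirements", []),
    }
    prior = last_quotes.get(part["part_number"])
    if prior and prior.get("still_valid"):
        line["prefilled_quote"] = prior      # supplier only has to confirm or adjust
    return line

# usage: one header requirement set, two parts, one prior quote still valid
header = ["unit price", "lead time", "country of origin"]
parts = [{"part_number": "A-100", "cost_model_fields": ["material", "tooling"]},
         {"part_number": "B-200"}]
prior_quotes = {"A-100": {"still_valid": True, "unit price": 4.20, "lead time": "6 weeks"}}
lines = [build_line_requirements(header, p, prior_quotes) for p in parts]
```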

Implementation and Integration

Now, with everything customized for each customer, it sounds like it would take a long time to get this system up and running, but the reality is that customers are usually up and running on supplier onboarding and core categories within four to six weeks, and fully up and running in three to four months. QSTRAT has been delivering their solution in this manner for over 15 years (as the company turns 20 this year) and has a huge library of templates for each event type, each industry, etc. that they can start with for rapid customization to customer needs.

QSTRAT integrates with the major ERPs and has an open API for loading parts/lines/suppliers from your existing systems (ERP, CRM, EDI feeds, etc.), automatically creating events, and pulling information back.

Summary

QSTRAT is very interesting in both its approach to manufacturing and distributor sourcing and the way it implements that approach. Unlike many sourcing platforms,

  • it believes in 100% customization to the client and the way they work, maximizing the value-add on top of the ERP/MRP/WIMS the manufacturer/distributor uses to run their business.
  • it is single tenant cloud, both to maximize customization capability and to meet the needs of defense contractors that are subject to rigid security requirements
  • it uses secure PDFs for supplier interaction and data capture, forgoing the “yet another portal” approach the majority of vendors take to supplier interaction
  • it was built around Open APIs as most buyers want to work with their existing systems and tools they are comfortable with to the extent possible, and just use a sourcing tool for sourcing

So if you happen to be looking for a direct sourcing solution that meets one or more of these requirements, QSTRAT is definitely a solution you shouldn’t overlook when making your shortlist, and one that is currently serving customers across automotive, aerospace, medical devices, industrial, and high tech manufacturing.

Who Needs The Beef?

For those of you who have been following my rants, especially on intake-to-orchestrate (which really is clueless for the popular kids as it doesn’t do anything unless you already have all the systems you need and don’t know how to connect them), you’ll know that one of my big qualms, to this day, is Where’s the Beef?, because while the intake and orchestrate buns are nice and fluffy and likely very tasty, they aren’t filling. If you want a full stomach, you need the beef (or at least a decent helping of Tofu, which, unless you are vegetarian, won’t taste as good or be quite as filling, but will give you the sustenance you need).

And you need filling. Specifically, you need the part of the application that does something — that takes the input data (possibly properly transformed), applies the complex algorithms, and produces the output you need for a transaction or to make a strategic decision. That’s not intake-to-orchestrate, that’s not a fancy UI/UX, that’s not an agent that can perform transactional tasks that fall within scope, and that’s NOT a fancy bun. It’s the beef.

But, apparently, at least as far as THE PROPHET is concerned, (bio) re-engineering is going to eliminate the need for the beef. Apparently, the buns are going to have all the nutrients (or data processing abilities) you need to function and do your job.

In THE PROPHET‘s latest analogy, today’s enterprise technology burger consists of:

  • the patty: (not to be mistaken for the paddy) which combines enterprise technology and labour (which means it really should be the patty [labour] and the trimmings [technology] in this analogy)
  • the upper bun: and
  • the lower bun: which collectively provide you a way to cleanly get a grip on the patty

But tomorrow’s enterprise technology burger will consist of:

  • the upper bun: which will be replaced by a new type of technology that fuses co-pilots and agentic systems to power autonomous agents and replaces the patty [labour] and part of trimmings
  • the lower bun: which will represent the next generation data store and information supply chain and build in “self-healing” technology for data maintenance and replace the other part of the trimmings

… and that’s it. NO BEEF! Just two co-dependent buns that are destined to fuse into a roll … and not a very tasty one at that. Because this roll will, apparently, operate fully autonomously and never get anywhere near you, leaving you perpetually hungry.

Now, apparently, not all parts of the patty (with its complex amino acid chains and protein structures) will be capable of being (bio) re-engineered into the buns right away and the patty won’t disappear all at once, just shrink bit by bit over the next decade until there’s nothing left and the last protein structure is absorbed (or replaced by a good enough AI-generated facsimile — they can do that now too). In THE PROPHET‘s view, legacy systems of record (ERP/MRP, payment platforms, etc.) will be the last to be replaced, and those will survive along with the legacy labour to maintain them until they can finally be split up into components and absorbed into the bun.

In other words, in THE PROPHET‘s view, you don’t need the patty, and, more specifically, you don’t need (or even want) the beef. I have to argue this is NOT the case.

1. You Need the Beef

Thinking that the patty can be completely absorbed into the buns is what results from a lack of understanding of enterprise software architecture best practices and software development in general.

The best architecture we have, which took years to get to, is MVC, which stands for:

  • Model: specifically, data model, which should be at the bottom (and could be absorbed into a data bun)
  • View: specifically, the UI/UX we interact with (and could be absorbed into a soft, warm, sweet smelling sourdough bun)
  • Controller: the core algorithms and data processing, which needs to be its own layer that supports the UX (and allows the UX to reconfigure the processing steps and outputs as needed) and can be cross-adapted to the best available data sources (which need to remain independent)
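
For the non-architects, here is a toy example of why the controller (the “beef”) is its own layer rather than something you can fold into the data store or the UI. The names and numbers are generic, invented for illustration, and not tied to any particular product:

```python
# Model: raw data access only; it knows nothing about business logic or presentation.
class SpendModel:
    def invoices(self) -> list[dict]:
        return [{"supplier": "Acme", "amount": 120_000},
                {"supplier": "Acme", "amount": 80_000},
                {"supplier": "Zenith", "amount": 50_000}]

# Controller: the algorithms and data processing ("the beef"); swappable over any model.
class SpendController:
    def __init__(self, model: SpendModel):
        self.model = model
    def spend_by_supplier(self) -> dict[str, int]:
        totals: dict[str, int] = {}
        for inv in self.model.invoices():
            totals[inv["supplier"]] = totals.get(inv["supplier"], 0) + inv["amount"]
        return totals

# View: presentation only; it can be replaced by a dashboard or chat agent without
# touching the data layer or the algorithms.
def render(totals: dict[str, int]) -> str:
    return "\n".join(f"{s}: {t:,}" for s, t in sorted(totals.items(), key=lambda kv: -kv[1]))

print(render(SpendController(SpendModel()).spend_by_supplier()))
```

Collapse the controller into either bun and you either can no longer validate the raw data (it only ever arrives pre-processed) or you end up re-implementing the same aggregation logic in every agent that needs it.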

Moreover, even Bill Gates, who predicts AI will have devastating effects across all industries, realizes that you can’t replace coders, energy experts, and biologists, and, by extension, jobs that require constantly evolving code, organic structure, and energy requirements to complete. So you will still need labour that creates, and relies on, highly specialized algorithms and expert interpretations of outputs to do their jobs. That also means that, in our field, strategic sourcing and procurement professionals cannot be replaced but tactical AP clerks are on their way out as AP software automatically processes 99% to 99.9% of invoices with no human involvement, even those with missing data and errors, handling the return, correction, negotiation, etc. until all of the data matches and costs are within tolerance.

2. You Want the Beef!

The whole point of modern architectures and engineering is to minimize legacy code / technical debt and maximize tactical data processing and system throughput (and have the system do as much thunking as possible, which is what it’s good at). If you try to push too much into the lower bun, you don’t have separation of data and processing, which means it’s almost impossible to validate the data, as it’s not raw data you’re getting but processed data, which means the system might be continually pushing wrong data to the outer bun, even with good data fed in, due to a bug deep in the transformation and normalization code. But your automatic checks and fail-safes would never catch it because you’ve turned what should be a crystal (clear) box into a black box! If you try to push too much processing into the upper bun, you have to replicate common functionality across every agent and application, leading to a lot of duplication and bloat that consumes too much space, uses too much energy, and makes the systems even harder to maintain than the legacy applications of today.

So while the burger of tomorrow might be different, with a much leaner, more protein-rich patty (with less sauce and unhealthy trimmings), and the bread might be a super healthy, natural, yeast-free multi-grain flat bread, making for a smaller (and, from a surface view, possibly less appetizing) burger, it still needs to be a burger, and anyone who thinks otherwise has joined the pretty fly Gen-AI in hallucination land!