One of these things is not like the other — it’s the right choice!

This was originally published on March 6 (2024).  It is being reposted due to the criticality of the subject matter (and the fact that One Trillion was wasted on services last year).

Note the Sourcing Innovation Editorial Disclaimers and note this is a very opinionated rant!  Your mileage will vary!  (And not about any firm in particular.)

Three bids for that spend analytics project from the three leading Big X firms come in at 1 Million. One bid for that spend analytics project from a specialized niche consultancy you pulled out of the hat for bid diversity comes in at 250 Thousand. Which one is right?

Those of you who only partially paid attention to the education Sesame Street was trying to impart upon you when you were growing up will simply remember the “one of these things is not like the other” song and think that any of the bids from the Big X firms is right and the niche consultancy is wrong because it’s different, and therefore must be thrown out because it’s too low when, in fact, it’s just as likely that the three bids from the Big X firms are wrong and the bid from the niche consultancy is right.

Those of us who paid attention knew that Sesame Street was trying to show us how to detect underlying similarities so we could properly cluster objects for further analysis. What we should have learned is that the Big X bids were all the same, built on the same assumptions, and can be compared against each other. And that the outlier bid needed further investigation — an investigation that can only be undertaken against an appropriately sized sample set of bids from other specialized niche consultancies to compare against. And without that sample set of bids, you can’t properly evaluate the lower bid, which, the doctor can tell you, is just as likely to be closer to correct than the potentially wildly overpriced Big X bids.  (Newer firms often have newer tech and methods — and if these are the right methods and tech for your problem … )
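
To make the clustering point concrete, here is a minimal sketch (with hypothetical bid numbers) using Tukey’s IQR outlier test: judged only against the Big X cluster, the niche bid looks like the outlier, but against a proper sample of niche bids it is perfectly ordinary.

```python
# Hypothetical bids (USD): three Big X firms and one niche consultancy.
big_x_bids = [1_000_000, 1_050_000, 980_000]
niche_bid = 250_000

def iqr_outliers(bids):
    """Flag bids outside 1.5 * IQR of the middle 50% (Tukey's rule)."""
    s = sorted(bids)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [b for b in bids if b < lo or b > hi]

# Against only the Big X sample, the niche bid is "the outlier" ...
print(iqr_outliers(big_x_bids + [niche_bid]))   # [250000]

# ... but against a sample of other niche bids it is ordinary, and it's
# the Big X cluster that would stand apart.
niche_bids = [240_000, 260_000, 255_000, 270_000]
print(iqr_outliers(niche_bids + [niche_bid]))   # []
```

The lesson is the same one Sesame Street taught: you can only call something an outlier relative to the right comparison set.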

As per our recent post, if you want to get analytics and AI right, most of these guys don’t have the breadth and depth of expertise they claim to have (as most don’t have the educational background to know just how broad, deep, and advanced AI and analytics can get, especially when you dig deep into the math and computer science and all of the various models and their strengths and weaknesses, and instead are trained on what is essentially marketing content from AI and analytics providers). In the group that sells to you, there will be a leader who is a true expert (and worth his or her weight in platinum), a few handpicked lieutenants who are above average and run the projects, and a rafter of juniors straight out of private college with more training in how to dress, talk, and follow orders than training in actual analytics … and no guarantee they even have any real university-level mathematics beyond basic analysis in operational research (and thus a knowledge of what analytics is and isn’t and can and can’t do).  And unless you know what you need, and why, you can’t judge the response.  (Furthermore, you can’t expect them to figure out your problem and goals with only partial information!)

While there was a time when big analytics projects were (multi-)million dollar projects, that was twenty years ago when Spend Analysis 1.0 was still hitting the market; when there were limited tools for data integration, mapping, cleansing, and enrichment; and when there weren’t a lot of statistics on average savings opportunities across internal and external spend categories. Now we have mature Spend Analysis 3.0 technologies (some taking steps towards Spend Analysis 4.0 technologies); advanced technologies for automatic data integration, mapping, cleansing, and even enrichment; deep databases on projects and results by vertical and industry size; extensive libraries for out-of-the-box analytics across categories and potential opportunities; and a whole toolkit for spend analysis that didn’t exist two decades ago. This new toolkit, built by best-of-breed vendors, and used, and sometimes [co-]owned, by best-of-breed niche consultancies (that don’t try to do everything, and definitely don’t pretend they can), allows modern spend analysis projects to be done ten times as efficiently and effectively in the hands of a master — a master who isn’t necessarily on your project if you hire a Big X or Mid-Sized Consultancy without doing your homework, vetting the proposal, and vetting the people. [See when you should be using Big X.]

In contrast, a dedicated niche consultancy should have all of these tools, and only masters on the project, people who do these projects day in and day out, compared to the bigger consultancies that don’t specialize in these projects and will field a team of juniors using the manual playbook from the early 2000s, with one lieutenant to guide them. That’s often why their project bids are five times as much — and why you should be inviting multiple niche best-of-breed consultancies to bid on your project as well as multiple Big X consultancies (including those that are truly focusing on analytics and AI, some of which you can identify by their recent acquisitions in the area), and focusing just as much on the six-figure bids for the one that provides the best value, not just the seven-figure Big X bids.  (And, FYI, if you invite enough Big X, you might find some come in at six figures and not seven because they have acquired the newer tech, took the time to understand your request, and figured out how they could get you the same value for less cost, leaving you funds for the follow-on project where you should consider the Big X!)

(This is also the case for implementations. The Big X always have a rafter on the bench to assign to any project you give them, but there’s no guarantee any of them have ever implemented the system you chose before, or, if they did, no guarantee they’ve ever connected it to the systems you need to connect to. You need specialists if you want a new system implemented as cost-effectively as possible, especially if it’s a narrowly focused specialist application and not a big enterprise application the Big X implement all the time. At the end of the day, it’s worth paying those specialists 500 or more an hour, because getting a system up in 2 months at 40K is considerably better than a small team of juniors taking 4 months at 200 an hour for a total cost of 80K.  But again, mileage will vary — if the solution you select is a Big X partner, then the Big X will be best.  If it’s a solution they’ve never heard of, you will need to evaluate multiple bids from multiple parties.)

Remember, where any group of vendors on the same page are concerned, All of us is as dumb as One of us!

Don’t fall for the Collectivism MindF6ck! that if multiple parties agree on something, that’s the right answer!  the doctor does NOT want to say it again, but since a month still does not go by where he’s not hearing about niche consultancies being thrown out for “being too cheap” or “obviously not understanding the problem” (which means the enterprise throwing them out is too uninformed to recognize that the Big X bids could just as likely be the outliers, because it isn’t inviting enough expert consultancies to the table), apparently he has to keep writing (and screaming) this truth. (the doctor isn’t saying that you can’t get a million dollars of value from some of these consultancies, just that you won’t by giving them a project they are not suited for;  again, see when you should use Big X to identify when that million dollar project will generate a five million ROI — it’s people doing these projects at the end of the day, and where are those people?)

Remember, most of these firms got big in management, or accounting and tax, or marketing and sales consulting, not technology consulting. The only reason these big consultancies started offering these services is the amount of money flowing into technology, money which they want; but while the best of the best of the best in the more traditional accounting, management, and marketing fields flocked to them, the best of the best in technology flocked to startups and c00l big tech firms.  Now, some of these firms doubled down, went and recruited those people, built small teams, learned, bought tech companies to expand the team, and now have great offerings in a number of areas.  But we have tens of thousands of tech companies for a reason: not everyone can build every type of technology, and not everyone can be an expert in every type of technology.  So while they will have expertise in some areas, they just can’t have expertise in all areas.  No one can.  Find the best provider for you.  Sometimes it will be Big X.  Sometimes Mid-Market.  Sometimes Niche.  It all depends on your problem at hand.

And yes, sometimes the niche vendor will be wrong and woefully undersize the project or your needs.  But, as per the above, if you don’t give them a chance, and deep dive into their bid, how will you know?

 

Did you ever try eating a mitten? the doctor bets some of those clients did! (He feels you’re not all there if you think glorified reporting projects should still cost One Million Dollars by default and might actually try to eat your mittens! [Joking, but you get the point.]  Deep analytics projects that require the most advanced tech, especially AI tech, will cost a lot, but standard spend analysis, sales analysis, etc. where we have been iterating and improving on the technology for two decades should not.)

DEAR ENTERPRISE PROCUREMENT SOFTWARE BUYER: THERE ARE NO FREE RFPs!

This was originally published on June 29 (2024) and is being reprinted due to how important it is to remember as you enter a new budgetary year and seek out new technology.

This shouldn’t have to be said (again), but apparently it does since Zip has relaunched the FREE RFP madness in Source-to-Pay (that began in 2006 when Procuri first aggressively launched the Sourcing, Supplier Management, Contract Management, and Spend Analysis RFPs) with an RFP that is intake heavy, orchestrate light, process deficient, and, like many RFPs before, completely misses some of the key points when going to market for a technology solution. (Especially since there isn’t a single FREE RFP template from a vendor that isn’t intrinsically weighted towards the vendor’s solution, as it’s always written from the viewpoint of what the vendor believes is important.)

the doctor has extensively written about RFPs and the RFP process here on SI in the past, but, at a high level, a good RFP specifies:

  • your current state,
    do NOT leave this out, leaving the vendor to guess your technical and process maturity
  • what you need the solution to do
    NOT just a list of feature/functions
  • what ecosystem you need the solution to work in
    NOT just a list of protocols or APIs that must be supported
  • where the data will live
    and, if in the solution, how you will access it (for free) for exports and off-(vendor-)site backups, do NOT leave this out
  • what support you need from the vendor
    NOT just whether the vendor offers integration / implementation services and their hourly / project rate
  • any specific services you would like from the vendor
    NOT a list of all services you might want to buy someday
  • what the precise scope of the RFP is if it is part of a larger project
    NOT a blanket request for the vendor to “address what they can”
  • what regulations and laws you are subject to that the vendor must support
    NOT just an extensive list of every standard and protocol you can think of
  • what languages and geographies and time zones you need supported
  • any additional requirements the vendor will need to adhere to based on the regulations you or the vendor would be subject to and additional requirements your organization puts in place
    NOT endless forms of every question you can think of that might never be relevant
  • your goal state,
    it does NOT leave the vendor to guess what you are looking for (note that “goal” defines what you want to achieve, it is up to the vendor to define how they will help you achieve it)
  • what (management) processes you use to work with vendors — and —
  • what collaboration tools you make available to vendors and what your expectations are of them

And it is only created after a current state assessment, goal state specification, and key use-case identification, so that it is relatively clear on organizational needs and vendors have no excuse to provide a poor response.

Furthermore, a good RFP does NOT contain:

  • requests for features/functions you don’t currently need (but you can ask for a roadmap)
  • specific requests for a certain type of AI/ML/Analytics/Optimization/etc. when you don’t even know what that tech actually does — let the vendor tell you, and then show you, how their tech solves your problem
    (after all, there are almost NO valid uses for Gen-AI in S2P)
  • specific requests on the technology stack, when it doesn’t matter if they use Java or Ruby, host on AWS or Azure, etc.
  • requests for audits (tech, environmental, social welfare, etc.) when you haven’t selected the vendor for an award, pending a successful negotiation
  • requests for service professional resumes when you haven’t selected the vendor for an award that includes professional service, pending a successful negotiation
  • requests for financials, when you haven’t selected the vendor for an award pending a successful negotiation
    (because these last three [3] will scare some vendors off and possibly prevent the best vendor for you from even acknowledging your RFP exists)

And, a good RFP goes to the right providers! This means that you need to determine the type of solution you need before you issue the RFP, and then only issue it to providers that you know offer that type of solution. (You can use analyst reports here if you like to identify potential vendors, but remember these maps cannot be used for solution selection! You will then need to do some basic research to make sure the vendor appears to fit the criteria.)

And if there are a lot of potential providers, you may need to do an RFI — Request for Interest / Intent (to Bid) — where you specify at a high level what the RFP you intend to issue is for, and, if you get a lot of positive responses, do an initial call with the providers to confirm not only interest but that the solution offered is relevant to your organization. (After all, at the end of the day, as The Revelator is quick to point out, it’s as much about the people behind the technology as the technology itself if you expect to be served by the provider.)

And even if you don’t need to do an RFI before the RFP, you should still reach out to the vendors you want to respond, let them know the RFP is coming, and let them know you’ve done your research, believe they are one of the top 5 vendors, and are looking forward to their response. (Otherwise, you might find you don’t get as many responses as you’d hoped for, as vendors prioritize RFPs that they believe they have a good shot at winning over random unexpected RFP requests from unknown companies.)

At the end of the day, if you don’t know:

  • what the main categories of S2P+ solutions are
  • what the typical capabilities of a solution type are, and what’s below average, average, or above average
  • who the vendors are
  • how to determine your current state of process maturity (and how that compares to the industry, market, and best-in-class) and what a solution could do for you
  • how to evaluate a vendor’s solution
  • how to evaluate a vendor overall
  • how to write a good RFP that balances core business, tech, and solution requirements to maximize your chances of finding a good vendor for you

and the reality is that you most likely don’t (as less than 10% of Procurement departments are world class, as per Hackett research going back to the 2000s where they also determined the typical journey for an organization to become best-in-class in Procurement was 8 years, and that’s the minimum requirement to write a world-class technology RFP), then you should engage help from an expert to help you craft that RFP, be it an independent consultant or firm that specializes in Procurement transformation.

It is also critically important that the firm you select to help you is neutral (not aligned with one solution provider who refers implementations to them in return for potential customer referrals) and that the firm does not rely on analyst maps either!

If you want help, the doctor has relationships with leading, neutral, firms on both sides of the pond who can help you, and who he will work with to make sure the technology / solution component is precisely what you need to get the right responses from vendors. Simply contact the doctor (at) sourcinginnovation [dot] com if you would like help getting it right.

Simply put, getting help with your technology RFP is the best insurance money you can spend. When you consider that, all in, these solutions will cost seven (7) or eight (8) figures over just a few years, you should be willing to spend 5% to 10% of the initial contract value to make sure you get it right. (Especially when there isn’t a single Private Equity firm that would invest in a technology player without doing a six [6], if not seven [7], figure due diligence first … and sometimes the firm will do this and then walk away! At least in your case, when you work with someone who can identify multiple potential vendors, you’re certain to find one at the end of the day.)

One Supply Chain Misconception That Should Be Cleared Up Now

This was originally posted on May 14 (2024).  It’s being reposted because this definitely needs to be cleared up before the new year (due to the constant proliferation of AI, which is, when all is said and done, just another technology).

Not that long ago, Inbound Logistics ran a similarly titled article that quoted a large number of CXOs who made some really good observations on common misconceptions, which included, but are not necessarily limited to, the following (and you should check out the article in full, as a number of the respondents made some very good points on the observations):

The misconceptions included statements that supply chains should:

  • reduce cost and/or track the most important metric of cost savings
  • accept negotiations as a zero-sum game
  • model supply chains as linear (progression from raw materials to finished goods)
  • … and made up of planning, buying, transportation, and warehousing silos
  • … and that each step is independent of the one that precedes and follows it
  • accept they will continue to be male dominated
  • become more resilient by shifting production out of unfriendly countries to friendly countries
  • expect major delays in transportation
  • … even though traditional networks are the best, even for last-mile delivery
  • accept truck driver shortage as a systemic issue
  • accept the blame when anything in them goes wrong
  • only involve supply chain experts
  • run on complex / resource intensive processes
  • … and only be optimal in big companies
  • … which can be optimized one aspect at a time
  • press pause on innovation or redesign or growth in a down market
  • be unique to a company and pose unique challenges only to that company
  • not be sustainable as that is still cost-prohibitive
  • see disruption as an aberration
  • return to (the new) normal
  • use technology to fix everything
  • digitalize as people will become less important with increasing automation and AI in the supply chain

And these are all very good points, as these are all common misconceptions that the doctor hears far too often (and if you go through enough of the Sourcing Innovation archives, it should become clear as to why), but not the biggest, although the last one gets pretty close.

 

THE BIGGEST SUPPLY CHAIN MISCONCEPTION

We Can Use Technology to Do That!

the doctor DOES NOT care what “THAT” is, you cannot use technology to do “THAT” 100% of the time in a completely automated way. Never, ever, ever. This is regardless of what the technology is. No technology is perfect and every technology invented to date is governed by a set of parameters that define a state it can operate effectively in. When that state is invalidated, because one or more assumptions or requirements cannot be met, it fails. And a HUMAN has to take over.

Even though really advanced EDI/XML/e-Doc/PDF invoice processing can automate processing of the more-or-less 85% of invoices that come in complete and error free, and automate the completion and correction of the next 10% to 13%, the last 2% to 5% will have to be human corrected (and sometimes even human negotiated) with the supplier. And this is technology we’ve been working on for over three decades! So you can just imagine the typical automation rates you can expect from newer technology that hasn’t had as much development. Especially when you consider the next biggest misconception.
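
Back-of-the-envelope, those automation rates still leave a meaningful human workload. A quick sketch, assuming a hypothetical volume of one million invoices a year and the middle of the ranges above:

```python
# Hypothetical invoice volume and the (rough) automation rates from above.
invoices_per_year = 1_000_000
auto_clean = 0.85   # processed straight through, complete and error free
auto_fixed = 0.12   # auto-completed / auto-corrected

# Whatever is left needs a human (correction, or even negotiation).
human_share = 1 - auto_clean - auto_fixed
print(round(invoices_per_year * human_share))   # 30000 invoices a year still need a person
```

Even at 97% automation, that is a full-time workload, not a rounding error, and that is the mature technology.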

Enterprises have a Data Problem. And they will until they accept they need to do E-MDM, and it will cost them!

This was originally published on April 29 (2024).  It is being reposted because MDM is becoming more essential by the day, especially since AI doesn’t work without good, clean data.

insideBIGDATA recently published an article on The Impact of Data Analytics Integration Mismatch on Business Technology Advancements which did a rather good job of highlighting all of the problems with bad integrations (which happen every day [and just result in you contributing to the half a TRILLION dollars that will be wasted on SaaS Spend this year and the one TRILLION that will be wasted on IT Services]), and an okay job of advising you how to prevent them. But the problem is much larger than the article lets on, and we need to discuss that.

But first, let’s summarize the major impacts outlined in the article (which you should click to and read before continuing on in this article):

  • Higher Operational Expenses
  • Poor Business Outcomes
  • Delayed Decision Making
  • Competitive Disadvantages
  • Missed Business Opportunities

And then add the following critical impacts (which are not a complete list by any stretch of the imagination) when your supplier, product, and supply chain data isn’t up to snuff:

  • Fines for failing to comply with filings and appropriate trade restrictions
  • Product seizures when products violate certain regulations (like ROHS, WEEE, etc.)
  • Lost Funds and Liabilities when incomplete/compromised data results in payments to the wrong/fraudulent entities
  • Massive disruption risks when you don’t get notifications of major supply chain incidents when the right locations and suppliers are not being monitored (multiple tiers down in your supply chain)
  • Massive lawsuits when data isn’t properly encrypted and secured and personal data gets compromised in a cyberattack

You need good data. You need secure data. You need actionable data. And you won’t have any of that without the right integration.

The article says to ensure good integration you should:

  • mitigate low-quality data before integration (since cleansing and enrichment might not even be possible)
  • adopt uniformity and standardized data formats and structures across systems
  • phase out outdated technology

which is all fine and dandy, but misses the core of the problem:

Data is bad (often very, very bad) because organizations don’t have an enterprise data management strategy. That’s the first step. Furthermore, this E-MDM strategy needs to define:

  1. the master schema with all of the core data objects (records) that need to be shared organization-wide
  2. the common data format (for ids, names, keys, etc.) (that every system will need to map to)
  3. the master data encoding standard

With a properly defined schema, there is less of a need to adopt uniformity across data formats and structures across the enterprise systems (which will not always be possible if an organization needs to maintain outdated technology either because a former manager entered into a 10 year agreement just to be rid of the problem or it would be too expensive to migrate to another system at the present time) or to phase out outdated technology (which, if it’s the ERP or AP, will likely not be possible) since the organization just needs to ensure that all data exchanges are in the common data format and use the master data encoding standard.
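
As a sketch of the “every system maps to the common format” idea, here are two hypothetical adapters (the system names, field names, and ID scheme are all invented for illustration) that normalize differently-encoded supplier records to one master shape:

```python
# Hypothetical master schema: every system's adapter maps its records to this shape.
# Fields: supplier_id (common key format), name (common casing), country (ISO upper-case).

def normalize_erp(rec):
    """Adapter for a hypothetical ERP export: raw numeric IDs, shouty names."""
    return {
        "supplier_id": f"SUP-{int(rec['VendorNo']):06d}",
        "name": rec["VendorName"].strip().title(),
        "country": rec["Ctry"].upper(),
    }

def normalize_ap(rec):
    """Adapter for a hypothetical AP system: 'S'-prefixed refs, lower-case names."""
    num = int(rec["supplier_ref"].lstrip("S"))
    return {
        "supplier_id": f"SUP-{num:06d}",
        "name": rec["supplier_name"].strip().title(),
        "country": rec["iso_country"].upper(),
    }

erp_row = {"VendorNo": "4521", "VendorName": "ACME GLOBAL  ", "Ctry": "de"}
ap_row = {"supplier_ref": "S4521", "supplier_name": "acme global", "iso_country": "DE"}

# Both systems resolve to the same master record without either being rebuilt.
print(normalize_erp(erp_row) == normalize_ap(ap_row))   # True
```

Neither system changed its internals; only the data exchanges conform to the common format and encoding standard, which is exactly the point.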

Moreover, once you have the E-MDM strategy, it’s easy to flesh out the HR-MDM, Supplier/SupplyChain-MDM, and Finance-MDM strategies and get them right.

As THE PROPHET has said, data will be your best friend in procurement and supply chain in 2024 if you give it a chance.

Or, you can cover your eyes and ears and sing the same old tune that you’ve been singing since your organization acquired its first computer and built its first “database”:

Well …
I have a little data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

Oh, data, data, data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

It has nonstandard fields
The records short and lank
When I try to read it
The blocks all come back blank

I have a little data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

My data is so ancient
Drive sectors start to rot
I try to read my data
The effort comes to naught

Oh, data, data, data
I store it on my drive
And when it’s old and flawed
The data I’ll archive

You Don’t Need Gen-AI to Revolutionize Procurement and Supply Chain Management — Classic Analytics, Optimization, and Machine Learning that You Have Been Ignoring for Two Decades Will Do Just Fine!

This was originally posted on March 22 (2024).  It is being reposted because we need solutions; Gartner (who co-created the hype cycle) published a study which found that Gen-AI/technology implementations fail 85% of the time, and it’s because we have abandoned the foundations — which work wonders in the hands of properly applied Human Intelligence (HI!).  Gen-AI, like all technologies, has its place, and it’s not wherever the Vendor of the Week pushes it, but where it belongs.  Please remember that.

Open Gen-AI technology may be about as reliable as a career politician managing your Nigerian bank account, but somehow it’s won the PR war (since there is no longer any requirement to speak the truth or state actual facts in sales and marketing in most “first” world countries [where they believe Alternative Math is a real thing … and that’s why they can’t balance their budgets, FYI]) as every Big X, Mid-Sized Consultancy, and the majority of software vendors are pushing Open Gen-AI as the greatest revolution in technology since the abacus. the doctor shouldn’t be surprised, given that most of the turkeys on their rafters can’t even do basic math* (but yet profess to deeply understand this technology) and thus believe the hype (and downplay the serious risks, which we summarized in this article, where we didn’t even mention the quality of the results when you unexpectedly get a result that doesn’t exhibit any of the six major issues).

The Power of Real Spend Analysis

If you have a real Spend Analysis tool, like Spendata (The Spend Analysis Power Tool), simple data exploration will find you a 10% or more savings opportunity in just a few days (well, maybe a few weeks, but that’s still just a matter of days). It’s one of only two technologies that has been demonstrated, when properly deployed and used, to identify returns of 10% or more, year after year after year, since the mid 2000s (when the technology wasn’t nearly as good as it is today), and it can be used by any Procurement or Finance Analyst that has a basic understanding of their data.

When you have a tool that will let you analyze data around any dimension of interest — supplier, category, product — restrict it to any subset of interest — timeframe, geographic location, off-contract spend — and roll-up, compare against, and drill down by variance — the opportunities you will find will be considerable. Even in the best sourced top spend categories, you’ll usually find 2% to 3%, in the mid-spend likely 5% or more, in the tail, likely 15% or more … and that’s before you identify unexpected opportunities by division (who aren’t adhering to the new contracts), geography (where a new local supplier can slash transportation costs), product line (where subtle shifts in pricing — and yes, real spend analysis can also handle sales and pricing data — lead to unexpected sales increases and greater savings when you bump your orders to the next discount level), and even in warranty costs (when you identify that a certain supplier location is continually delivering low quality goods compared to its peers).
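
The roll-up / drill-down mechanics can be sketched in a few lines (with hypothetical transactions; a real spend analysis tool does this interactively across millions of rows and dozens of dimensions):

```python
from collections import defaultdict

# Hypothetical transactions: (category, supplier, amount in USD).
spend = [
    ("Packaging", "Acme", 120_000), ("Packaging", "BoxCo", 40_000),
    ("Packaging", "Acme", 80_000),  ("IT", "SoftCorp", 300_000),
    ("IT", "CloudInc", 150_000),
]

def rollup(rows, dim):
    """Roll spend up along one dimension (0 = category, 1 = supplier)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dim]] += row[2]
    return dict(totals)

# Roll-up: total spend by category.
print(rollup(spend, 0))   # {'Packaging': 240000, 'IT': 450000}

# Drill-down: restrict to one category, then roll up by supplier.
packaging = [r for r in spend if r[0] == "Packaging"]
print(rollup(packaging, 1))   # {'Acme': 200000, 'BoxCo': 40000}
```

Every “restrict, roll up, drill down” step described above is just this pattern applied along a different dimension of interest; the opportunities come from comparing the resulting totals against contracts, peers, and prior periods.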

And that’s just the Procurement spend … it can also handle the supply chain spend, logistics spend, warranty spend, utility and HR spend — and while you can’t control the HR spend, you can get a handle on your average cost by position by location and possibly restructure your hubs during expansion time to where resources are lower cost! Savings, savings, savings … you’ll find them ’round the clock … savings, savings, savings … analytics rocks!

The Power of Strategic Sourcing Decision Optimization

Decision optimization has been around in the Procurement space for almost 25 years, but it still has less than 10% penetration! This is utterly abysmal. It’s not only the only other technology that has been generating returns of 10% or more, in good times and bad, for any leading organization that consistently uses it, but the only technology that the doctor has seen consistently generate 20% to 30% savings opportunities on large multi-national complex categories that just can’t be solved with an RFQ and a spreadsheet, no matter how hard you try. (But an expert consultant will still claim they can, with the old college try, if you pay their top analyst’s salary for a few months … and at, say, 5K a day, there goes three times any savings they identify.)

Examples where the doctor has repeatedly seen stellar results include:

  • national service provider contract optimization across national, regional, and local providers where rates, expected utilization, and all-in costs for remote resources are considered; with just an RFX solution, the usual approach is to go to all the relevant Big X and Mid-Sized bodyshops and get their rate cards by role by location, by base rate (with expenses picked up by the org) and all-in rate; calculate the expected local overhead rate by location; then, for each Big X / Mid-Sized role-location, determine if the all-in rate or the base rate plus overhead is cheaper and select that as the final bid for analysis; then mark the lowest bid for each role-location and determine the three top providers; then distribute the award between the three “top” providers in the lowest cost fashion; and, in big companies using a lot of contract labour, leave millions on the table because 1) sometimes the cheapest 3 will actually be the providers with middle-of-the-road bids across the board and 2) for some areas/roles, regional, and definitely local, providers will often be cheaper — but since the complexity is beyond manageable, this isn’t done, even though the doctor has seen multiple real-world events generate 30% to 40% savings, since optimization can handle hundreds of suppliers and tens of thousands of bids and find the perfect mix (even while limiting the number of global providers and the number of providers who can service a location)
  • global mailer / catalog production —
    paper won’t go away, and when you have to balance inks, papers, printing, distribution, and mailing — it’s not always local or one country in a region that minimizes costs, it’s a very complex sourcing AND logistics distribution that optimizes costs … and the real-world model gets dizzying fast unless you use optimization, which will find 10% or more savings beyond your current best efforts
  • build-to-order assembly — don’t just leave that to the contract manufacturer, when you can simultaneously analyze the entire BoM and supply chain, which can easily dwarf the above two models if you have 50 or more items, as savings will just appear when you do so

… but yet, because it’s “math”, it doesn’t get used, even though you don’t have to do the math — the platform does!
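
And “the platform does the math” really is just exhaustive (or much smarter) search over feasible award mixes. A toy sketch, with invented rates, of why the providers who win the most individual lines are not necessarily the cheapest feasible mix:

```python
from itertools import combinations

# Hypothetical fully-loaded rates per role-location "line", by provider.
rates = {
    "GlobalA": [10, 10, 10],   # middle-of-the-road everywhere
    "LocalB":  [6, 20, 20],    # unbeatable on line 0 only
    "LocalC":  [20, 6, 20],    # unbeatable on line 1 only
    "LocalD":  [20, 20, 6],    # unbeatable on line 2 only
}
lines = range(3)
MAX_PROVIDERS = 2   # e.g. a cap on how many firms you are willing to manage

def mix_cost(providers):
    """Cost when each line is awarded to the cheapest provider in the mix."""
    return sum(min(rates[p][i] for p in providers) for i in lines)

# Naive pick: the "line winners" are the three locals, so, forced down to
# two providers, you keep two of them ... and pay for it.
print(mix_cost(["LocalB", "LocalC"]))   # 32

# Optimized: brute-force every feasible mix (an optimizer does this at scale
# with hundreds of suppliers and tens of thousands of bids).
best = min(combinations(rates, MAX_PROVIDERS), key=mix_cost)
print(best, mix_cost(best))             # ('GlobalA', 'LocalB') 26
```

The middle-of-the-road generalist plus one specialist beats any pair of individual line winners, which is exactly the effect described in the first bullet above, just at toy scale.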

Curve Fitting Trend Analysis

Dozens (and dozens) of “AI” models have been developed over the past few years to provide you with “predictive” forecasts, insights, and analytics, but guess what? Not a SINGLE model has outdone classical curve-fitting trend analysis — and NOT a single model ever will. (This is because all these fancy-schmancy black box solutions do is attempt to identify the record/transaction “fingerprint” that contains the most relevant data and then attempt to identify the “curve” or “line” to fit it to, all at once, which means the upper bound is a classical model that uses the right data and fits to the right curve from the beginning, without wasting an entire plant’s worth of energy powering entire data centers as the algorithm repeatedly guesses random fingerprints and models until one seems to work well.)

And the reality is that these standard techniques (which have been refined since the 60s and 70s), which now run blindingly fast on large data sets thanks to today’s computing, can achieve 95% to 98% accuracy in some domains, with no misfires. A 95% accurate forecast on inventory, sales, etc. is pretty damn good and minimizes the buffer stock, and lead time, you need. Detailed, fine tuned, correlation analysis can accurately predict the impact of sales and industry events. And so on.
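
For reference, the classic least-squares trend fit behind those forecasts is a few lines of arithmetic, no data center required (the demand series here is hypothetical):

```python
# Hypothetical monthly demand with a clear linear trend plus small noise.
demand = [102, 108, 111, 119, 121, 130, 133, 141]
xs = list(range(len(demand)))

def fit_line(xs, ys):
    """Classic ordinary least-squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_line(xs, demand)
forecast_next = slope * len(demand) + intercept   # project one period ahead
print(round(slope, 2), round(forecast_next, 1))   # 5.44 145.1
```

That is the whole technique: estimate the trend from the data you have, project it forward, and refine the curve family (linear, seasonal, exponential) as the residuals demand.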

Going one step further, there exists a host of clustering techniques that can identify emergent trends in outlier behaviour as well as pockets of customers or demand. And so on. But chances are you aren’t using any of these techniques.
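
Even the clustering needs nothing exotic: a toy one-dimensional k-means (with hypothetical per-customer order values, and assuming the two starting centers are distinct) separates the two pockets of demand in a handful of iterations:

```python
# Hypothetical per-customer monthly order values: two emergent pockets of demand.
values = [12, 14, 11, 13, 95, 101, 99, 15, 104]

def kmeans_1d(data, c1, c2, iters=10):
    """Tiny two-cluster k-means on one dimension (initial centers c1, c2).
    Assumes both clusters stay non-empty, which holds for distinct min/max starts."""
    for _ in range(iters):
        a = [v for v in data if abs(v - c1) <= abs(v - c2)]
        b = [v for v in data if abs(v - c1) > abs(v - c2)]
        c1, c2 = sum(a) / len(a), sum(b) / len(b)   # recenter on cluster means
    return sorted(a), sorted(b)

low, high = kmeans_1d(values, min(values), max(values))
print(low)    # [11, 12, 13, 14, 15]
print(high)   # [95, 99, 101, 104]
```

Real implementations handle more dimensions, more clusters, and smarter initialization, but the mechanism that surfaces those pockets of customers or demand is exactly this: assign, recenter, repeat.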

So given that most of you haven’t adopted any of this technology that has proven to be reliable, effective, and extremely valuable, why on earth would you want to adopt an unproven technology that hallucinates daily, might tell off your sensitive employees with hate speech, and even leak your data? It makes ZERO sense!

While we admit that someday semi-private LLMs will be an appropriate solution for certain areas of your business where large amounts of textual analysis are required on a regular basis, even these are still iffy today and can’t always be trusted. And the doctor doesn’t care how slick that chatbot is, because if you have to spend days learning how to expertly craft a prompt just to get a single result, you might as well just learn to code and use a classic open source neural net library — you’ll get better, more reliable results faster.

Keep an eye on the tech if you like, but nothing stops you from using the tech that works. Let your peers be the test pilots. You really don’t want to be in the cockpit when it crashes.

* And if you don’t understand why a deep understanding of university-level mathematics, preferably at the graduate level, is important, then you shouldn’t be touching the turkey who touches the Gen-AI solution with a 10-foot pole!