Category Archives: Miscellaneous

Don’t Underestimate the Strength of Straw Bridges!

Joël Collin-Demers says that running your Procurement Department on Excel, which was not designed to support business processes, is like asking a straw bridge to support an elephant. (Original LinkedIn Post)

His point is that Excel should only be used for its intended purpose: ad-hoc spreadsheet-based analysis.

While I applaud his goal, as we need to stop running the business world on Excel (especially since over 90% of spreadsheets have significant errors, and these errors will cost you billions), Joël underestimates just how strong well-engineered straw bridges can be.

Engineering students in the classroom have built 13 g straw bridges from 0.4 g straws, using trusses and lots of triangular bracing, that can support over 4 kg! (Video)

Now, of course, that is for a stationary load, and Procurement, like supply chains, has moving parts, but engineering students have also demonstrated that Howe bridge designs, built from a roughly 2 kg PASCO model bridge set, can minimize the compression force to 5.7 N! (1 N = the force required to accelerate 1 kg at 1 m/s².)

And it’s very likely that even stronger bridges will be built out of straw in the future.

This means that if we consider that a plastic straw has a compressive strength exceeding 20 MPa (20 × 10^6 N/m²) and roughly 1 mm² (10^-6 m²) of load-bearing plastic in its wall cross-section, a single straw could theoretically bear about 20 N in pure compression, which tells us that we likely haven’t reached the limit yet.
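For those who want to check the arithmetic, here is a minimal sketch; both the 20 MPa strength and the ~1 mm² wall cross-section are ballpark assumptions, not measured values:

```python
# Back-of-the-envelope check of the single-straw load estimate above.
# Both inputs are ballpark assumptions, not measured values.
compressive_strength_pa = 20e6  # 20 MPa = 20 * 10^6 N/m^2
wall_cross_section_m2 = 1e-6    # ~1 mm^2 of load-bearing plastic
max_force_n = compressive_strength_pa * wall_cross_section_m2
print(f"Theoretical compressive load per straw: ~{max_force_n:.0f} N")  # ~20 N
```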

In other words, his comparison, meant to move Procurement off Excel, actually illustrates why Procurement won’t abandon Excel. While Excel is seriously flawed as a Procurement tool, continual bursts of innovative creativity allow it to keep supporting the ridiculous weight being thrust upon it. It might be built from straws in the application world, but, as we just demonstrated, properly arranged, straws can be unbelievably strong.

Remember that the next time you are arguing against Excel.

A Very Brief History of “Safe” American Inventions and Products

More specifically, a brief history of inventions and products developed, or (primarily) adopted, in the USA as perfectly “Safe” for public use when they were anything but! From the late 1800s to the present day.

Asbestos: large-scale mining began in the late 1800s when manufacturers and builders decided it was a great thermal and electrical insulator; its adverse effects on human health were not widely recognized and acknowledged until the (late) 1970s, and even today exposure is still the #1 cause of work-related deaths in the world (with up to 15K dying annually in the US due to asbestos-related disease)

Aspirin: as per our previous post, invented in 1897 and available over the counter in 1915, it was heavily promoted as the cure-all from the 1920s through the 1940s and might have cost us over a hundred thousand lives due to overprescription during the Spanish Flu pandemic alone

Cocaine: from the late 1880s through the early 1910s, physicians were big fans of the Victorian wonder drug (as per this Lloyd Manufacturing Ad archived on the NIH site), as it was the first effective local anesthetic the western world knew about (and was endorsed by the Surgeon-General of the US Army in 1886), although the real popularity was with the public, with an estimated 200,000 cocaine addicts in the US by 1902; still, it was 1914 before it was restricted to prescription use, 1922 before tight regulations were put in place, and likely the late 1940s before prescription and dispensation finally came to an end; moreover, it was generally viewed as harmless and non-addictive until crack emerged in 1985 (even though the number of cocaine-related deaths in the US had climbed to 2 per 1,000 by 1981)

DDT: (this is particularly relevant to the Gen-Zers who are fully on board the Gen-AI hype train) developed in the 1940s as the first modern synthetic insecticide, it was so trusted that Gen Z’s grandparents and great-grandparents used to run through the DDT clouds sprayed in the streets of American cities and towns from the 1940s through the 1960s; the health risks were not widely reported until roughly 1962, when Rachel Carson published Silent Spring, and it wasn’t until 1972 that the US banned it for adverse effects on human health (as well as the environment); to this day, we’re still not sure how many deaths it has contributed to, although the UN estimates 200K people globally still die each year from toxic exposure to pesticides, of which DDT was the first and the precursor to many newer derivatives (Source)

PFAS, incl. PTFE (Teflon): developed by DuPont in 1938 (with the business later spun off into Chemours), it found use as a lubricant and non-stick coating for pans and was produced using PFOA (C8), which we now know is carcinogenic, and which we should have known much sooner, but there was a massive PFAS cover-up (PFOA was only classified as carcinogenic in 2013, even though we should have known by the late 1990s); PFAS still aren’t banned (even though legislation was proposed last year to phase them out over the next decade), and because of the cover-ups and the lack of studies until recent times, we still don’t know how deadly this was, and is, but estimates are that PFAS likely killed 600K annually in the USA between 1999 and 2015, and 120K annually after that (Source) … WOW!

Tobacco: in the 1950s, cigarettes were advertised as good for you, with Doctor (Camel Advertisement) and Dentist (Viceroy Advertisement) recommendations on the ads! Despite the fact that the health risks had been known since 1950 (when Wynder and Graham published the first major epidemiological study showing an association between smoking and lung cancer), minors in the USA could still buy cigarettes until 2009 … even though Tobacco likely killed over 100 Million people globally in the 1900s (Source)

etc.

We could go on, but the point is this: like most cultures, the USA is not good at picking winning technologies that are safe for everyday use, or at least safe enough under appropriately designated usage conditions.

There’s a reason that most countries have the harsh regulations on the introduction of new consumer products and technologies that US lobbyists and CEOs scream about, and that’s because more mature countries (which have been around longer than a mere 249 years) understand that no matter how safe something seems, every advancement comes at a cost, every invention comes with a risk, and every convenience comes at a price; and until we know what we are paying, when we need to pay it, and how much we are going to pay, we shouldn’t rush in head first with blinders on.

And while we might still get it wrong, the reality is that we’re more likely to get it right if we take our time and properly evaluate a new technology or advancement first, and even if we get it partially wrong, as in the case of Aspirin, at least the gain should outweigh the cost. For example, even though it can be argued that Aspirin was rushed to market, when used in proper doses, the side effects for the vast majority of the population are typically far outweighed by the anti-inflammatory benefits, especially as, for decades, there was no substitute. Even if it gave a person stomach irritation or minor ulcers, if it was life-saving, then that was a reasonable cost at the time.

However, in the cases of DDT, PFAS, and Tobacco, there was no excuse for the lack of research, or, in some cases, the prolonged cover-up of research indicating that the products were not merely unsafe but, in fact, very deadly; and since they brought no significant life-saving benefits (Malaria wasn’t a big concern in the USA; people had been cooking with butter, lard, and oils for centuries; and both alcohol and cannabis were known to be not only safer, but even medicinal in the right quantities), there was no need to rush them to market.

The simple fact of the matter is that no tech, be it chemical/medicinal, (electro-)mechanical, or computational, can be presumed safe without adequate testing over time, and that’s why we need regulations and proper application of the scientific method. A lack of apparent side effects doesn’t mean that there are none; that’s why we have the scientific method and mathematical proofs (for confidence and statistical certainty). This is something today’s generation doesn’t appear to know a thing about (especially if they just did a couple of years of college programming), as they’ve probably never been in a real lab [or played with uranium like their grandparents, since it was legal in the USA in the 1950s to sell children home chemistry kits with uranium samples, including the Gilbert U-238 Atomic Energy Lab]. More than likely, they don’t know the rule of thumb that you should generally add the acid to the base (and not vice versa because, otherwise, this could happen), and that you should definitely add the acid to whatever liquid [typically water] you are diluting it with.

Regulations exist for a reason, and that reason is to keep us safe. The Hippocratic Oath should not be restricted to doctors, and the Obligation of the Order should not be restricted to engineers. Every individual in every organization bringing a product to market should be bound by the same, and regulations should exist to make sure that all organizations take reasonable care in the development and testing of every product brought to market, real or virtual. (This doesn’t mean that every product needs to be inspected, but that regulations and standards should exist for organizations to follow, and those caught ignoring them should be subject to fines large enough to bankrupt not just the company, but the C-Suite personally.)

While Gen Z might like the Wild Wild West (which the USA never grew out of) as much as the Gen X that created the dot com boom, we need to remember that the dot com boom ended in the dot com bust in 2000. If this new generation continues to latch on to AI like Boomers latched on to blankies and teddies, it just means they are doomed to repeat the mistakes of their grandparents (and will bring about a tech market crash that makes the dot com bust look like a blip). You’re supposed to learn from history, NOT repeat it!

Got a Headache? Don’t Take an Aspirin or Query an LLM!

Yesterday we provided you with a brief history of Aspirin, the first turn-of-the-century miracle drug that was both society’s salvation and sorrow, though the latter wouldn’t be known for more than half a century. As we discussed, it was hailed as a miracle, life-saving drug that could be used for everything from the common cold to global pandemics. And it worked, for a price. That price, when it needed to be paid, was usually one of many, many side effects, which were often minor and insignificant compared to the perceived benefit the drug was bringing. Except when they weren’t, and they inflamed ulcers and/or increased gastrointestinal bleeding and created a life-threatening situation, caused hyperventilation in a pneumonia patient, or induced a pulmonary edema and killed the patient. While the death rate even at the height of over-prescription was likely only 3%, and less than a tenth of that today, it’s still not good.

The reason for this, as we elaborated in our last post, is that, like many of the breakthrough technologies that came before, it was rolled out not only before the side effects, and more importantly, the long term effects, were well understood, but before even the proper use for the desired primary effects was well understood (as evidenced by the fact that the best physicians were routinely prescribing two to eight times the maximum safe dosage during the Spanish Flu Pandemic, almost 20 years after first availability). While there were benefits, there were consequences, some of them severe, and others deadly.

Medicine is as much a technology as a new mode of transportation (boat, automobile, airplane, etc.), a new piece of manufacturing equipment, a new computing device, or a new piece of software.

Now you see the point. Every breakthrough tech cycle is the same. Whether it is medicine, farm machinery, the airplane, or modern software technology — and this includes AI and definitely includes LLMs like ChatGPT.

As Aspirin proves, even if the first test seems to be successful, there’s always more beneath the surface. Especially when the population numbers in the billions and every individual could react differently. Or, in the case of an LLM, when billions of people can each pose thousands of queries, the large majority of which have never been tested, and all of which could generate unknown results.

Moreover, there have not been significant large-scale, independently funded academic studies that we can use to understand the true strengths and weaknesses, truths and hallucinations, and appropriate utilization of the technology. As Mr. Klein has pointed out in a recent LinkedIn post that asked who funded that study, over 80% of AI industry “studies” are funded by undisclosed sources, and most of them, like most industry studies these days (see Mr. Hembitski’s latest post), don’t contain good data on demographics, sample size, test material, or potential bias.

That would be the first step to getting a grip on this technology. The next step would be to create reasonable measures we could use to define technology categories and domains, and, for each, identify tests and measures that would give us a level of confidence for a given population of inputs or usage. If you consider a traditional (X)NN (Neural Network), which has a fixed set of outputs and is designed to process inputs from a known population, we have well-developed methodologies to determine the accuracy of such models with high confidence: testing by random sampling from sufficiently sized data sets, evaluated with appropriate statistical models. Furthermore, mathematicians have proven the soundness of those statistical methods, so if appropriate tests demonstrate 90% accuracy for a population with 98% confidence, we know the model is 90% accurate with 98% confidence when used properly on that population.
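To make that concrete, here is a minimal sketch (in Python, with made-up test numbers) of how such a confidence statement is computed for a fixed-output model, using a standard Wilson score interval on accuracy estimated from a random test sample:

```python
# Wilson score confidence interval for a classifier's true accuracy,
# estimated from `correct` successes out of `n` random test samples.
# The test numbers below are illustrative, not from a real model.
import math

def accuracy_interval(correct: int, n: int, confidence: float = 0.98):
    # two-sided z-values for common confidence levels (dependency-free)
    z = {0.90: 1.645, 0.95: 1.960, 0.98: 2.326, 0.99: 2.576}[confidence]
    p = correct / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# e.g. 9,000 correct answers on 10,000 randomly sampled inputs:
lo, hi = accuracy_interval(9000, 10000)
print(f"estimated accuracy 90%, 98% CI: [{lo:.3f}, {hi:.3f}]")
```

No such fixed, finite test population exists for an LLM’s open-ended input space, which is exactly the problem.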

We have no such guarantees for LLMs, nor any proof that they are reliable. “It worked fine for me” is NOT proof. Vendors quoting nebulous client success stories (without client names or real data) is not proof. Moreover, the fact that they raised millions of dollars to bring this technology to market is definitely not proof. (All a raise proves is that the C-Suite sales team is very charismatic, convincing, and great at selling a story. Nothing more. In fact, fundraising would be more honest if securities law allowed fundraising via poker and takeover protection via gunfighting, as imagined in the season two episode of Sliders, “The Good, the Bad, and the Wealthy”. At least then the shenanigans would be out in the open.)

The closest thing out there to a good industry study on LLMs and LRMs is likely Apple’s newest study, as summarized in The Guardian, which found that “standard AI models outperformed LRMs in low-complexity tasks while both types of model suffered ‘complete collapse’ with high-complexity tasks”.

The study also found that, as LRMs neared performance collapse, they began “reducing their reasoning effort”, and that if the problem was complex enough, the models failed even when provided with an algorithm that would solve it.

Still, we have to question this study, or more precisely, the release of this study (especially given the timing). Did Apple do it out of genuine academic interest to get to the bottom of the technology claims, or to cast doubt on rivals who claim Apple is behind in the AI race (focusing only on the negatives of the technology to show that its competitors don’t have what they claim to have, and that Apple is thus not behind)?

The point is, we don’t understand this technology, and that fact should scream louder in your head every day. Look at all the bad stuff we’ve discovered so far, and it’s likely we’re not even close to being done yet.

Yes, there is potential in the new technology, as there is with all discovery, but until we fully understand not only what that potential is, but how to use it safely and, most importantly, how to prevent harm, we should approach it with extreme caution, and we should most definitely not let it tell us how to run our businesses or our lives; or else, like an Aspirin overdose, it might just kill us. (And remember, Aspirin was studied for 18 years before it was made available without a prescription, and deadly side effects and prescribed overdoses still happened. In comparison, today’s LLMs and LRMs haven’t been formally studied at all, and the providers of this technology want you to run your business, and your life, off of them in next-generation agentic systems. Think about that! And when the migraine comes, remember, don’t take Aspirin!)

A Brief History of Aspirin

The history of Aspirin, a genericized trademark for acetylsalicylic acid (ASA), and more precisely, of aspirin precursors, is a long and winding one, which goes all the way back to ancient Sumer and Egypt, with the famous Hippocrates referring to the use of willow bark (salicylate) tea to reduce fever circa 400 BC. Now, since I’m sure you haven’t come here for a complete history of Aspirin from ancient times to the present day, especially since you want to understand the relevance of this discussion sooner rather than later, we’re going to skip ahead to 1897.

In 1897, Felix Hoffmann and/or Arthur Eichengrün of Bayer were the first to produce acetylsalicylic acid in a pure, stable form. Only two short years later, Bayer began to sell the drug globally under the brand name Aspirin, with the first tablet appearing in 1900. It wasn’t long before Aspirin’s popularity took off, as it was touted as a “turn of the century miracle drug”, especially since early trials (published in 1899 in the journals Die Heilkunde and Therapeutische Monatshefte) demonstrated that Aspirin was indeed superior to other known salicylates. Moreover, since the drug was deemed considerably safer and less toxic than the drugs it was replacing, it was fast-tracked through review and approval processes and first became available to the public without a prescription in 1915, only 15 years after the first tablet appeared. If you consider the rate of progress and introduction of new technologies at the turn of the century, this was blazingly fast for the time.

Its quick, and early, introduction arguably made it the first modern over-the-counter mass market pharmaceutical product as well as a household name across the world. As the first generally available pharmaceutical anti-inflammatory and painkiller, it changed societies. It allowed anyone to deal with mild to moderate pain and continue to function. It allowed doctors to quickly get inflammation and fever under control and spend more time diagnosing the cause, or simply move on to the next patient if it was a flu or infection they couldn’t do anything about (and the patient just had to survive long enough to fight it off on their own). Since there was no technology to quickly develop a vaccine for a heretofore unknown virus back in 1918, it was hailed as a literal lifesaver during the Spanish Flu pandemic of 1918. Even though that pandemic [which infected over 20% of the global population] killed an estimated 50 MILLION people, or almost 3% of the global population at the time (which means COVID really wasn’t that bad, with a global death toll of 7 Million, or a mere 0.1% of the global population), it is believed that many more people would have succumbed to the Spanish Flu without the Aspirin that helped them control the fever (and the pain) long enough for their bodies to fight off the infection on their own. (And many articles to this day claim this, including this 2019 article from the Saturday Evening Post.)

But guess what? Aspirin didn’t just save. Aspirin killed!

In 1918, Aspirin was still relatively new and physicians didn’t understand the long term effects or the proper dosage levels. Moreover, the sicker you were, the more they’d give you. Regimens were 8 g to 31 g a day, which, by the way, is two to nearly eight times the maximum safe dosage for an average adult (of 4 g). Two to eight times! What’s even worse is that at those levels, 33% and 3% of patients will experience hyperventilation and pulmonary edema, respectively. The last thing you want when experiencing a high fever and pneumonia is hyperventilation. The stress on an older adult, or one with already compromised lungs (due to smoking, coal mining, asbestos production, or genetic conditions), could literally be lethal. Moreover, pulmonary edema generally is lethal, unless you have immediate access to an expert physician who can drain the fluids without collapsing your lung. As per recent research, it’s likely that at least 3% of those administered Aspirin for the Spanish Flu died from the Aspirin overdoses they were being given.

Of course, the damage done by Aspirin was not limited to the Spanish Flu Epidemic. It wasn’t long before Aspirin was prescribed for everything. Common cold? Check. Sore throat? Check. Arthritic pain? Check. Heart problems? Check. See the 1933 Advertisement in the linked Saturday Evening Post article above. (Note that a tablet at that time would have been about 325 milligrams [Source], like today, and the advertisement was recommending 1.3 grams in 4 hours plus gargling with 975 milligrams, of which you should expect some additional absorption (of 5% to 10%; we’ll assume the worst case), bringing the total to roughly 1.4 grams. While not nearly as bad as the Spanish Flu level prescriptions, that’s still twice the amount that should be taken in a 4-hour window, and it was being taken in 2 hours.)
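If you want to check that arithmetic, here is a quick sketch using the figures above (the 10% absorption rate is the worst case of the 5% to 10% assumption stated in the note):

```python
# Quick check of the 1933 ad arithmetic (325 mg tablets, per the source
# above; the 10% gargle absorption is the post's worst-case assumption).
swallowed_mg = 1300                # 1.3 g recommended over 4 hours
gargled_mg = 975                   # 975 mg (3 x 325 mg tablets) gargled
absorbed_mg = 0.10 * gargled_mg    # worst case: 10% of the gargle absorbed
total_g = (swallowed_mg + absorbed_mg) / 1000
print(f"Effective dose in a 4-hour window: ~{total_g:.1f} g")  # ~1.4 g,
# roughly twice the ~0.7 g that should be taken in a 4-hour window
```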

When we say damage, we mean damage. Moreover, the damage goes beyond the almost 60 side effects you can find on the Mayo Clinic page.

This is because regular use and/or overdoses of aspirin:

  • increase the risk of developing stomach ulcers,
  • agitate and exacerbate stomach ulcers and can cause bleeding, and
  • can escalate non-life-threatening ulcer or gastrointestinal bleeding to the point of being life-threatening

Moreover, in some people it can irritate the lining of the stomach and begin the formation of an ulcer after just a few doses!

But the general population didn’t know this in the 1930s. Heck, it was the 1950s or 1960s before it started to become common knowledge that aspirin wasn’t good for you if you had an upset stomach or an ulcer. (As far as I can tell, while the first study of aspirin’s effect on the stomach was a 1938 publication by A. H. Douthwaite and G. A. M. Lintott, the subject matter and research were not taken seriously until the 1950s and 1960s, when you had publications like this one by R. A. Douglas and E. D. Johnston on Aspirin and the Chronic Gastric Ulcer, which also references the 1938 publication.)

Which means millions of people around the world were using a medicine on a daily basis that was, due to misuse, often harming them as much as it was helping them. And this is only ONE of the almost 60 potential side effects. (And how many were known, or communicated, in the 1920s through 1950s?)

This is because, like many of the breakthrough technologies that came before, it was rolled out not only before the side effects, and more importantly, the long term effects, were well understood, but before even the proper use for the desired primary effects was well understood (as evidenced by the fact that the best physicians were routinely prescribing two to eight times the maximum safe dosage during the Spanish Flu Pandemic, almost 20 years after first availability). And while there were benefits, there were consequences, some of them severe, and others deadly.

So what’s the relevance? Stay tuned.