
Have You Brought Your Supply Chain Planning Out of the Middle Ages?

Back in the 1930s, the dark ages of computing began, starting with the Telex messaging network in 1933. Beginning as an R&D project in 1926, it became an operational teleprinter service operated by the German Reichspost (under the Third Reich — remember we said “dark ages”). With a speed of 50 baud, or about 66 words per minute, it was initially used for the distribution of military messages, but it eventually became a worldwide network of both official and commercial text messaging that survived into the 2000s in some countries. A few years later, in 1937, Bell Labs’ George Stibitz built the “Model K” adder, the first proof of concept for the application of Boolean logic to computer design. Two years later, the Bell Labs CNC (Complex Number Calculator) was completed. In 1941, the Z3 was constructed; using 2,300 relays, it could perform floating-point binary arithmetic with a 22-bit word length and execute aerodynamic calculations. Then, in 1942, the ABC (Atanasoff-Berry Computer) was completed; it was seen by John Mauchly, who went on to co-invent the ENIAC, the first general-purpose electronic computer, completed in 1945.

Three years later, in 1948, Frederic Williams, Tom Kilburn, and Geoff Toothill developed the Small-Scale Experimental Machine (SSEM), the first digital, electronic, stored-program computer to run a computer program, which consisted of a mere 17 instructions! A year later, in 1949, we saw the modem, which allowed computers to communicate over ordinary phone lines, as well as the EDSAC, the first practical stored-program computer to provide a regular computing service. (Originally developed for transmitting radar signals, the modem was adapted for computer use four years later, in 1953.)

A year later, in 1950, we saw the introduction of magnetic drum storage, which could store one million bits, a previously unimagined amount of data (and far more than most thought anyone would ever need), though nothing by today’s standards. Then, in 1951, the US Census Bureau got the UNIVAC I, and the end of the dark ages was in sight. In 1952, only two years after the magnetic drum, IBM introduced a high-speed magnetic tape that could store 2 million digits per tape! In 1953, Grimsdale and Webb built a 48-bit prototype transistorized computer that used 92 transistors and 550 diodes, and later that same year, MIT created magnetic core memory. Almost everything was in place for the invention of a computer that didn’t take up a whole room. In 1956, MIT researchers began experimenting with direct keyboard input to computers (which, up to then, could only be programmed using punch cards or paper tape), and a prototype minicomputer, the LGP-30, was created at Caltech that same year. A year later, in 1957, FORTRAN, one of the first third-generation programming languages, was developed. Early magnetic disk drives were invented in the mid-to-late 1950s, and 1960 saw the introduction of the DEC PDP-1, one of the first general-purpose minicomputers. A decade later, in 1970, came the first IBM computer to use semiconductor memory and one of the first memory chips, the Intel 1103. And one year later, in 1971, we saw the first commercial microprocessor, the Intel 4004.

Two years later, NPL and CYCLADES started experimenting with internetworking via the European Informatics Network (EIN), Xerox PARC began linking Ethernets with other networks using its PUP protocol, and the Micral, based on the Intel 8008 microprocessor and one of the earliest non-kit personal computers, was released. The next year, in 1974, the Xerox PARC Alto was released, and the end of the dark ages was in sight. In 1976, we saw the Apple I, and in 1981, we saw the first IBM PC, and the middle ages began as computing was now within reach of the masses.

By 1981, when the middle ages began, we already had GAINS (1971), SAP (1972), Oracle (1977), and Dassault Systemes (1981): four (4) of the top fourteen (14) supply chain planning companies according to Gartner’s 2024 Supply Chain Planning Magic Quadrant (counting the Challengers, the Leaders, and Dassault Systemes). In the 1980s, we saw the formation of Kinaxis (1984), Blue Yonder (1985), and OMP (1985). Then, in the 1990s, we saw Arkieva (1993), Logility (1996), and John Galt Solutions (1996). That means ten (10) of the top fourteen (14) supply chain planning solution companies were founded before the middle ages ended in 1999 (and the age of enlightenment began).

Tim Berners-Lee invented the World Wide Web in 1989, the first browser appeared in 1990, the first cable internet service appeared in 1995, Google appeared in 1998, and Salesforce, considered to be one of the first SaaS solutions built from scratch, launched in 1999. At the same time, we reached an early majority of internet users in North America, ending the middle ages and starting the age of enlightenment, as global connectivity was now available to the average person (at least in a first-world country).

Only e2open (2000), RELEX Solutions (2005), Anaplan (2006), and o9 Solutions (2009) were founded in the age of enlightenment (but not the modern age). In the age of enlightenment, we left behind on-premise and early single-client server applications and began to build SaaS applications using a modern SaaS MVC architecture, where requests came in, were directed to the machine running the software, which computed the answers and sent them back. This allowed for rather fault-tolerant software: if the hardware failed, the instance could be moved, and if an instance failed, it could just be redeployed with backup data. It was true enlightenment. However, not all companies adopted multi-tenant SaaS from day one; only a few providers did in the early days. (So even if your SCP company began in the age of enlightenment, it may not be built on a modern multi-tenant cloud-native true SaaS architecture.) This was largely because there were no real frameworks to build and deploy such solutions on (and Salesforce literally had to build its own).
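To make the fault tolerance of that stateless design concrete, here’s a minimal Python sketch (all class, instance, and tenant names are hypothetical, not any vendor’s actual implementation): because the application tier holds no local state, any healthy instance can serve any request, and a failed instance can simply be replaced.

```python
import random

class Datastore:
    """Stands in for the shared, backed-up database; all state lives here."""
    def __init__(self):
        self.data = {}
    def get(self, key): return self.data.get(key)
    def put(self, key, value): self.data[key] = value

class AppInstance:
    """A stateless MVC app server: controller logic only, no local state."""
    def __init__(self, name, datastore):
        self.name, self.db, self.healthy = name, datastore, True
    def handle(self, request):
        if not self.healthy:
            raise RuntimeError(f"{self.name} is down")
        # 'model' read, 'controller' compute, 'view' response
        count = (self.db.get(request["tenant"]) or 0) + 1
        self.db.put(request["tenant"], count)
        return f"{self.name}: request #{count} for {request['tenant']}"

class Router:
    """Routes each request to a healthy instance; redeploys failed ones."""
    def __init__(self, instances, datastore):
        self.instances, self.db = instances, datastore
    def dispatch(self, request):
        # try instances in random order; replace any that have failed
        for inst in random.sample(self.instances, len(self.instances)):
            try:
                return inst.handle(request)
            except RuntimeError:
                self.instances.remove(inst)
                self.instances.append(AppInstance(inst.name + "-v2", self.db))
        raise RuntimeError("no healthy instances")

db = Datastore()
router = Router([AppInstance("app-1", db), AppInstance("app-2", db)], db)
print(router.dispatch({"tenant": "acme"}))
router.instances[0].healthy = False          # simulate a hardware failure
print(router.dispatch({"tenant": "acme"}))   # still served; no data lost
```

The key design point is that losing an app instance loses nothing but capacity, since all state lives in the shared (and backed-up) datastore.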

However, in 2008, Google launched its cloud, and in 2010, one year after the last of the top 14 supply chain planning providers was founded, Microsoft launched Azure. With that, the age of enlightenment came to an end and the modern age began, as there were now multiple cloud-based infrastructures available to support cloud-native, true multi-tenant SaaS applications (no data center operational knowledge required), making it easy for any true SaaS provider to develop these solutions from the ground up.

In other words, not one provider of a top supply chain planning solution recognized by Gartner was founded in the modern age. (Moreover, if you look at the Niche Players, only one of the six was founded in the age of enlightenment; the rest are also from the middle ages.)

So why is this important?

  • If the SCP platform core was architected back in the days of client-server, and the provider did not rearchitect it for true multi-tenancy, then even if the vendor wrapped this core in a VM (Virtual Machine), put it in a Docker container, and put it in the cloud, it’s still a client-server application at the core. This means it has all the limits of client-server applications: one client per server, and no scalability beyond how many cores and how much memory a single server can support.
  • If the platform core was architected such that each module, which runs in its own VM, requires a complete copy of the data to function, that’s a lot of data replication required just to run the platform, especially if it has 12 separate modules. This can greatly exacerbate the storage requirements, and thus the cost (see the first sketch after this list).
  • But that’s not the big problem. The big problem is that models constructed on a traditional client-server architecture were designed to run only one scenario at a time, and only if a complete copy of the data is available. So if you want to run multiple models, multiple scenarios for a model, or both, you need multiple copies of the module, each with its own data set, for each model scenario you want to run. This not only exacerbates the data requirements, but the compute requirements as well. (This is why many providers limit how many models you can have and how many scenarios you can run: their cloud compute costs skyrocket due to the inefficiency in the design and the data storage requirements.)

    And while there is no such thing as a truly optimal supply chain plan, since you never know all the variables in advance, there are near-optimal, fault-tolerant plans that, with enough scenarios, can be identified (by building up a picture of what happens at different demand levels, supply levels, transportation times, etc., as in the second sketch below), allowing you to select the one that balances cost savings, quality, expected delivery time, and risk at levels you are comfortable with.
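To see how fast the replication in the second and third bullets multiplies, here’s a back-of-the-envelope calculation; all of the numbers are made up purely for illustration.

```python
# Back-of-the-envelope illustration; every number here is invented.
dataset_gb = 500   # one complete planning dataset
modules    = 12    # separate modules, each running in its own VM
scenarios  = 20    # model scenarios you'd like to run concurrently

# Legacy client-server core: a full copy of the data per module,
# plus a full copy per concurrently running model scenario.
legacy_gb = dataset_gb * modules + dataset_gb * scenarios

# Multi-tenant, cloud-native core: one shared copy, with each scenario
# storing only a small delta (assume ~2% of the data changes per scenario).
modern_gb = dataset_gb + dataset_gb * 0.02 * scenarios

print(f"legacy: {legacy_gb:,} GB vs modern: {modern_gb:,.0f} GB "
      f"({legacy_gb / modern_gb:.0f}x more storage)")
# legacy: 16,000 GB vs modern: 700 GB (23x more storage)
```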
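And here’s a second minimal sketch of the scenario-sweep idea: simulate each candidate plan across many randomly perturbed demand/supply/lead-time scenarios, then pick the plan whose cost/risk balance you’re comfortable with. The plans, the cost model, and the weights are all hypothetical stand-ins.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical candidate plans: (name, unit_cost, safety_stock_units)
plans = [("lean", 10.0, 100), ("balanced", 11.0, 300), ("buffered", 12.5, 600)]

def simulate(plan, n_scenarios=1000):
    """Sweep one plan over random demand/supply/lead-time scenarios,
    returning the (made-up) total cost of each scenario."""
    name, unit_cost, safety_stock = plan
    costs = []
    for _ in range(n_scenarios):
        demand    = random.gauss(1000, 200)          # units demanded
        supply    = random.gauss(1000, 150)          # units actually delivered
        lead_time = max(1.0, random.gauss(14, 4))    # days
        shortfall = max(0.0, demand - (supply + safety_stock))
        costs.append(unit_cost * demand              # procurement
                     + 0.2 * unit_cost * safety_stock   # holding cost
                     + 50.0 * shortfall                 # expedite / lost sales
                     + 5.0 * max(0.0, lead_time - 14))  # late-delivery penalty
    return costs

# Score each plan on expected cost AND risk (95th-percentile cost),
# then pick the one with the best weighted trade-off.
for plan in plans:
    costs = sorted(simulate(plan))
    mean, p95 = statistics.mean(costs), costs[int(0.95 * len(costs))]
    score = 0.5 * mean + 0.5 * p95   # your risk appetite sets these weights
    print(f"{plan[0]:>9}: mean={mean:,.0f}  p95={p95:,.0f}  score={score:,.0f}")
```

With enough scenarios, the buffered plans stop looking “expensive” and start looking cheap once the tail risk is priced in, which is exactly the picture a one-scenario-at-a-time architecture can never afford to build.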

That’s the crux of it. If you can’t run enough scenarios across enough models to build up a picture of what happens across different possibilities, you can’t come up with a plan that can withstand typical perturbations, and you definitely can’t come up with a plan that can be rapidly put into place to deal with a major demand fluctuation, supply fluctuation, or unexpected supply chain event.

So if you want to create a supply chain plan that can enable supply chain success, make sure you’ve brought your supply chain planning out of the middle ages (through the age of enlightenment) and into the modern age. And we mean you. If you chose a vendor a decade ago and are resisting migration to a newer solution, including one offered by the same vendor, because you spent years, and millions, tightly integrating it with your ERP solution, then you’re likely running on a single-tenant SaaS architecture at best, and a nicely packaged client-server architecture otherwise. You need to upgrade … and you should do it now! (We won’t advise you here, as we don’t know all of the vendors in the SCP quadrant well enough, but we know some, including those that have recently acquired newer age-of-enlightenment and even modern-age solutions, and we know that some still have old tech on old stacks that they are maintaining because of their install base. Don’t be the company stalling progress, to your own detriment!)