Category Archives: Supply Chain

One Supply Chain Misconception That Should Be Cleared Up Now

This was originally posted on May 14, 2024. It's being reposted because this definitely needs to be cleared up before the new year, given the constant proliferation of AI (which is, when all is said and done, just another technology).

Not that long ago, Inbound Logistics ran a similarly titled article quoting a large number of CXOs who made some really good observations on common misconceptions (and you should check out the article in full, as a number of the respondents made some very good points on the observations).

The misconceptions included statements that supply chains should:

  • reduce cost and/or track the most important metric of cost savings
  • accept negotiations as a zero-sum game
  • model supply chains as linear (progression from raw materials to finished goods)
  • … and made up of planning, buying, transportation, and warehousing silos
  • … and each step is independent of the one that precedes and the one that follows
  • accept they will continue to be male dominated
  • become more resilient by shifting production to friendly countries
  • expect major delays in transportation
  • … even though traditional networks are the best, even for last-mile delivery
  • accept truck driver shortage as a systemic issue
  • accept the blame when anything in them goes wrong
  • only involve supply chain experts
  • run on complex / resource intensive processes
  • … and only be optimal in big companies
  • … which can be optimized one aspect at a time
  • press pause on innovation or redesign or growth in a down market
  • be unique to a company and pose unique challenges only to that company
  • not be sustainable as that is still cost-prohibitive
  • see disruption as an aberration
  • return to (the new) normal
  • use technology to fix everything
  • digitalize as people will become less important with increasing automation and AI in the supply chain

And these are all very good points, as these are all common misconceptions that the doctor hears far too often (and if you go through enough of the Sourcing Innovation archives, it should become clear why), but none of them is the biggest misconception, although the last one gets pretty close.

THE BIGGEST SUPPLY CHAIN MISCONCEPTION

We Can Use Technology to Do That!

the doctor DOES NOT care what “THAT” is, you cannot use technology to do “THAT” 100% of the time in a completely automated way. Never, ever, ever. This is regardless of what the technology is. No technology is perfect and every technology invented to date is governed by a set of parameters that define a state it can operate effectively in. When that state is invalidated, because one or more assumptions or requirements cannot be met, it fails. And a HUMAN has to take over.

Even though really advanced EDI/XML/e-Doc/PDF invoice processing can automate the processing of the roughly 85% of invoices that come in complete and error-free, and automate the completion and correction of the next 10% to 13%, the last 2% to 5% will have to be human-corrected (and sometimes even human-negotiated) with the supplier. And this is technology we've been working on for over three decades! So you can just imagine the typical automation rates you can expect from newer technology that hasn't had as much development, especially when you consider the next biggest misconception.
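To put numbers on that, here is a minimal sketch of what those automation tiers mean in practice; the monthly volume and handling time are invented for illustration, not vendor benchmarks:

```python
# Hypothetical illustration of the automation rates discussed above.
# All volumes and handling times are assumptions, not benchmarks.

MONTHLY_INVOICES = 100_000

rates = {
    "straight-through (complete & error-free)": 0.85,
    "auto-completed / auto-corrected": 0.12,
    "human-corrected (sometimes negotiated)": 0.03,
}

minutes_per_human_touch = 20  # assumed average handling time per exception

for tier, rate in rates.items():
    print(f"{tier}: {MONTHLY_INVOICES * rate:,.0f} invoices/month")

human_touched = MONTHLY_INVOICES * rates["human-corrected (sometimes negotiated)"]
hours = human_touched * minutes_per_human_touch / 60
print(f"human workload: ~{hours:,.0f} hours/month even at 97% automation")
```

Even at a 97% combined automation rate, a high-volume operation is still left with roughly a thousand hours of human work a month.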

Don’t Abuse Lean and Mean — The Four Horsemen of the Shipocalypse Don’t Need Any Help!

If you are in Procurement or Logistics, you know that the time of cheap, fast, and reliable — which we had for almost two decades — is now long gone and likely never to return. That is because the four horsemen have turned their attention to global trade … specifically, global logistics … and have brought:

  • war: the conflict in the Red Sea, one of the two most important waterways in the world, has made most transport through it almost impossible
  • famine: the droughts in Panama, the other of the two most important waterways in the world, have reduced its capacity by at least 1/3 for at least 1/3 of the year
  • pestilence: plague has returned, taking down the necessary workers (and closing the necessary ports) with it
  • death: corporate greed and union response have stepped in here to bring certain death to global supply chains if things don’t change:
    • oil prices: the more they go up, the more unaffordable our dirty ocean freight becomes
  • limited capacity: greedy corporations scrapped ships during the pandemic for insurance claims, sometimes ships that hadn’t even made a single voyage … and now that they’ve learned they can charge up to 10X pre-pandemic prices for a single container during peak season, and the richer (luxury good) companies will still pay the rates, they have no incentive to bring capacity back
  • union demands: inflation has been rampant, workers have been impacted, and they want their pre-pandemic buying power … and, as I’ve noted before, labour unrest and strikes are now among the biggest risks in your global supply chain

As a result, the last thing you want to do is help the horsemen bring your supply chain to a halt, but that’s exactly what you keep doing day in and day out as you keep pursuing, and applying, lean, mean, and JIT (just-in-time) where it doesn’t belong.

As noted by the author of this recent LinkedIn article on how you have (less than) two weeks to stave off supply chain chaos, we’re at the point where a one-day stop in any part of the supply chain takes one week to recover from, a one-week stop takes one month to recover from, and a one-month stop totally f*cks us for a year (since the effects are not linear but exponential)! And it’s all your fault.
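That rule of thumb is worth tabulating, because the recovery multiplier itself is what grows; a quick sketch using only the three anecdotal points quoted above:

```python
# The recovery rule of thumb quoted above, tabulated. These are the
# article's anecdotal points, not a fitted model; the takeaway is that
# the recovery multiplier is never close to 1:1 and is worst at the tail.

stoppages = {
    "one day":   (1, 7),    # ~one week to recover
    "one week":  (7, 30),   # ~one month to recover
    "one month": (30, 365), # ~one year to recover
}

for label, (stop_days, recover_days) in stoppages.items():
    print(f"{label:>9} stop -> ~{recover_days:>3} days to recover "
          f"({recover_days / stop_days:.1f}x the stoppage)")
# 7.0x, 4.3x, 12.2x -- anything but linear, and brutal for long stoppages.
```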

Lean and mean was supposed to be about efficiency in manufacturing and lack of waste, not slashing inventory to dangerous levels, not slashing capacity to dangerous levels, and was certainly NOT meant to be used by idiot MBAs (which stands for Master of Business Annihilation) with no concept of what the corporation does running global corporations off of spreadsheets alone!

So stop applying it to inventory and capacity! Thank you.

ketteQ: An Adaptive Supply Chain Planning Solution Founded in the Modern Age

As per yesterday’s post, any supply chain planning solution developed before 2010 isn’t necessarily built on a modern multi-tenant cloud-ready SaaS stack (as such a stack didn’t exist at the time, so any older solution would have had to be partially or fully re-platformed to become modern multi-tenant cloud-ready SaaS). Any solution built after that was much more likely to be built on a modern multi-tenant cloud-ready SaaS stack. Not guaranteed, but more likely.

ketteQ‘s Adaptive Supply Chain Planning Solution is one of those solutions, built in the modern age on a fully modern multi-tenant cloud-native SaaS stack, and one with advantages you won’t find in most of the competition. I was able to get an early view of the latest product, which was released last week.

Founded in 2018, ketteQ was built from the ground up to embody all of the lessons learned from the founders’ 100+ successful supply chain planning solution implementations across industries and systems, and the wisdom gained from building two prior supply chain companies, with the goal of addressing all of the issues they encountered with previous-generation solutions. The modern architecture was purpose-built to fully utilize the transformational power of optimization and machine learning. It was a tall feat, and while the platform is still a work in progress (they admit they currently have only three mature core modules on par with their peers in depth and breadth, although all modules inherit the advantages of their modern stack and solver architecture), it is one they have pulled off: between their newer modules and their integrations to third-party systems (particularly for order management, production scheduling, and transportation management), they can address a number of other areas and deliver End-to-End (E2E) supply chain planning, with native Integrated Business Planning (IBP) across demand, inventory, and supply (their core modules), along with modules for Service Parts Planning and S&OP Planning.

In addition to this solid IBP core, they also have capabilities across cost & price management, asset management, fulfillment & allocation, work order management, and service parts delivery. And all of this can be accessed and controlled through a central control tower.

And most importantly, the entire solution is cloud-native, designed to leverage horizontal scalability and connectivity, and built for scale. It is enabled by a single data model, stored in an easily accessible open SQL database, in a contemporary architecture that supports all of its solutions. The platform supports multiple models, multiple scenarios per model, and a new, highly scalable solver that can run thousands of heuristic tests and apply a genetic algorithm with machine learning, testing all demand ranges against all supply options to find a solution that minimizes cost / maximizes margin against potential demand changes and fill rates.
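As a rough illustration of the approach described (and only that; this is not ketteQ's actual solver, whose internals aren't public), here is a toy genetic-algorithm search over supplier allocations, evaluated against sampled demand scenarios and penalized when the fill rate misses a target:

```python
import random

# Toy genetic-algorithm sketch of scenario-based supply planning, in the
# spirit described above. Everything here (costs, demand model, GA settings)
# is illustrative; it is NOT ketteQ's solver.

random.seed(42)

SUPPLIERS = [("A", 8.0, 400), ("B", 10.0, 600), ("C", 13.0, 900)]  # (name, unit cost, capacity)
DEMAND_SCENARIOS = [random.gauss(1000, 150) for _ in range(200)]   # sampled demand futures

def fitness(alloc):
    """Expected cost of an allocation, heavily penalized if fill rate < 95%."""
    supply = sum(min(q, cap) for q, (_, _, cap) in zip(alloc, SUPPLIERS))
    cost = sum(min(q, cap) * c for q, (_, c, cap) in zip(alloc, SUPPLIERS))
    fill = sum(min(supply, d) / d for d in DEMAND_SCENARIOS) / len(DEMAND_SCENARIOS)
    return cost + (0.0 if fill >= 0.95 else (0.95 - fill) * 1e6)

def mutate(alloc):
    child = list(alloc)
    i = random.randrange(len(child))
    child[i] = max(0, child[i] + random.randint(-100, 100))
    return child

def crossover(a, b):
    return [random.choice(genes) for genes in zip(a, b)]

population = [[random.randint(0, cap) for (_, _, cap) in SUPPLIERS] for _ in range(30)]
for _ in range(200):
    population.sort(key=fitness)
    elite = population[:10]                                   # selection
    population = (elite
                  + [crossover(*random.sample(elite, 2)) for _ in range(10)]
                  + [mutate(random.choice(elite)) for _ in range(10)])

best = min(population, key=fitness)
print("units per supplier:", best, "| plan cost:", round(fitness(best)))
```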

Of course, the ketteQ platform comes with a whole repertoire of applied Optimization/ML/Genetic/Heuristic models for demand planning, inventory planning, and supply planning, as well as S&OP. In addition, because of its extensible architecture, instead of manually running one scenario at a time, it can run up to thousands of scenarios for multiple models simultaneously, and present the results that best meet the goal, or the best trade-off between multiple goals.

ketteQ does all of this in a platform that, compared to older-generation solutions, is:

  • fast(er) to deploy — the engine was built for configuration, their scalable data model and data architecture make it easy to transform and integrate data, and they can customize the UX quickly as well
  • easy to use — every screen is configured precisely to efficiently support the task at hand, and the UX can be deployed standalone or as a Salesforce front end
  • cost-effective — since the platform was built from the ground up to be a true multi-tenant solution using a centralized, extensible, data architecture, each instance can spin off multiple models, which can spin off multiple scenarios, each of which only requires the additional processing requirement for that scenario instance and only the data required by that scenario; and as more computing power is required, it supports automatic horizontal scaling in the cloud.
  • better performing — since it can run more scenarios in more models using modern multi-pass algorithms that combine traditional machine learning with genetic algorithms and multi-pass heuristics that go broad and deep at the same time to find solutions that can withstand perturbations while maximizing the defined goals using whatever weighting the customer desires (cost, delivery time, carbon footprint, etc.)
  • more insightful — the package includes a full suite of analytics built on Python that are easily configured, extended, and integrated with AI engines (including Gen-AI if you so desire), which allows data scientists to add their own favorite forecasting, optimization, analytics, and AI algorithms (a generic sketch of that plug-in pattern follows this list); in addition, it can easily be configured to run and display best-fit forecasts at any level of hierarchy and automatically pull in and correlate external indicators as well
  • more automated — the platform can be configured to automatically run through thousands of scenarios up and down the demand, supply, and inventory forecasts on demand as data changes, so the platform always has the best recommendation on the most recent data; these scenarios can include multiple sourcing, logistics, and even bill-of-material options; and they can be consolidated into meta-scenarios for end-to-end integrated S&OP across demand, supply, and inventory
  • seamless Salesforce integration — takes you from customer demand all the way down to supply chain availability; seamless collaboration workflow with Salesforce forecast, pipeline, and order objects in the Salesforce front end
  • AWS native — full leverage of horizontal scalability and serverless computing, multi-tenant optimization and analytics, and single-tenant customer data; the solution is also available on the AWS Marketplace
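On the "add their own favorite algorithms" point above, the mechanics usually amount to a model registry; the sketch below shows the generic pattern with a hypothetical decorator API, not ketteQ's actual extension interface:

```python
# Generic plug-in pattern for adding custom forecasting models to an
# analytics layer. The registry/decorator API here is hypothetical.

from statistics import mean
from typing import Callable

FORECASTERS: dict[str, Callable] = {}

def forecaster(name: str):
    """Register a forecasting function under a name the platform can invoke."""
    def register(fn):
        FORECASTERS[name] = fn
        return fn
    return register

@forecaster("naive")
def naive(history, horizon):
    return [history[-1]] * horizon

@forecaster("moving_average_3")
def moving_average(history, horizon):
    return [mean(history[-3:])] * horizon

# A data scientist drops in their own favourite model the same way:
@forecaster("damped_trend")
def damped_trend(history, horizon, phi=0.8):
    trend = history[-1] - history[-2]
    return [history[-1] + trend * sum(phi ** k for k in range(1, h + 1))
            for h in range(1, horizon + 1)]

history = [100, 104, 103, 108, 112]
for name, fn in FORECASTERS.items():
    print(name, [round(x, 1) for x in fn(history, 3)])
```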

In this coverage, we are going to primarily focus on demand and supply (planning) as that is the most relevant from a sourcing perspective. Both of these heavily depend on the platform’s forecasting ability. So we’ll start there.

Forecasting

In the ketteQ platform, forecasts, which power demand and supply planning,

  • can be by day, week, month, or other time period of interest
  • can be global, regional, local, at any level of the (geo) hierarchy you want
  • can be category, product line, and individual product
  • can be business unit, customer, channel
  • can be computed using sales data/forecasts, finance data, marketing data/forecasts, baselines, and consensus
  • can use a plethora of models (including, but not limited to Arima[Multivariate], Average, Croston, DES, ExtraTrees, Lasso[variants], etc.), as well as user defined models in Python
  • can be configured to automatically select the best-fit algorithm based on the available history: just POS data, POS data augmented with economic indicators, external data (where there is insufficient POS data), etc. (a simplified sketch of best-fit selection follows this list)
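A minimal sketch of what automatic best-fit selection typically means under the hood: backtest each candidate model on a holdout window and keep the lowest-error one. The stand-in models and data below are ours, not the platform's library:

```python
# Simplified best-fit selection: backtest candidate models on a holdout
# window and pick the lowest MAPE. Stand-in models, invented history.

from statistics import mean

def naive(train, horizon):
    return [train[-1]] * horizon

def moving_average(train, horizon, window=4):
    return [mean(train[-window:])] * horizon

def ses(train, horizon, alpha=0.3):
    """Simple exponential smoothing: carry the smoothed level forward."""
    level = train[0]
    for y in train[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

def mape(actual, forecast):
    return mean(abs(a - f) / a for a, f in zip(actual, forecast))

history = [120, 132, 128, 141, 150, 147, 158, 163, 160, 172, 180, 176]
train, holdout = history[:-4], history[-4:]

candidates = {"naive": naive, "moving_average": moving_average, "ses": ses}
scores = {name: mape(holdout, fn(train, len(holdout)))
          for name, fn in candidates.items()}
best = min(scores, key=scores.get)
print({k: round(v, 3) for k, v in scores.items()}, "-> best fit:", best)
```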

These models, like all models in the platform, can be set up using a very flexible and responsive hierarchy approach, with each model automatically pulling in the model above it and then altering it as necessary (simply by modifying constraints, goals, data [sources], etc.). In the creation of models, restore points can be defined at any level before new data or new scenarios are run so the analyst can backtrack at any time.
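Conceptually, that hierarchy behaves like chained overrides plus snapshots; a rough sketch of the idea (the class and method names are ours, not ketteQ's):

```python
# Sketch of hierarchical model configuration: each child model inherits its
# parent's settings and overrides only what differs; restore points are
# snapshots the analyst can roll back to. Names are illustrative.

import copy

class PlanningModel:
    def __init__(self, name, parent=None, **overrides):
        self.name, self.parent, self.overrides = name, parent, dict(overrides)
        self._restore_points = []

    def setting(self, key):
        """Walk up the parent chain until a model defines the setting."""
        if key in self.overrides:
            return self.overrides[key]
        if self.parent:
            return self.parent.setting(key)
        raise KeyError(key)

    def save_restore_point(self):
        self._restore_points.append(copy.deepcopy(self.overrides))

    def rollback(self):
        self.overrides = self._restore_points.pop()

global_model = PlanningModel("global", service_target=0.95, horizon_weeks=26)
emea = PlanningModel("EMEA", parent=global_model, service_target=0.98)

emea.save_restore_point()
emea.overrides["horizon_weeks"] = 13   # try a shorter-horizon scenario
print(emea.setting("horizon_weeks"))   # 13
emea.rollback()
print(emea.setting("horizon_weeks"))   # 26, inherited from global again
print(emea.setting("service_target"))  # 0.98, EMEA override retained
```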

Demand Planning

The demand planning module in ketteQ can compute demand plans that take into account:

  • market intelligence input to refine the forecast (which can include thousands of indicators across 196 countries from Trading Economics as well as your own data feeds) (and which can include, or not, correlation factors for correlation analysis)
  • demand sensing across business units, channels, customers, and any other data sources that are available to be integrated into the platform
  • priorities across channels, customers, divisions, and departments
  • multiple “what if” scenarios (simultaneously), as defined by the user
  • consensus demand forecasts across multiple forecasts and accepted what-ifs

The module can then display demand (plans) in units or value across actuals, sales forecasts, finance forecasts, marketing forecasts, baseline(s), and consensus.
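For the consensus step, the arithmetic is typically a weighted blend of the forecast streams named above; a hedged sketch with invented weights and figures:

```python
# Illustrative consensus demand calculation: weighted blend of the sales,
# finance, and marketing streams. Weights and figures are invented.

forecasts = {  # units per month, by source
    "sales":     [1000, 1100, 1250],
    "finance":   [950, 1000, 1100],
    "marketing": [1100, 1250, 1400],
}
weights = {"sales": 0.5, "finance": 0.3, "marketing": 0.2}
assert abs(sum(weights.values()) - 1.0) < 1e-9

months = len(next(iter(forecasts.values())))
consensus = [
    round(sum(weights[src] * forecasts[src][m] for src in forecasts))
    for m in range(months)
]
print("consensus plan:", consensus)  # [1005, 1100, 1235]
```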

In addition to this demand planning capability and all of the standard capabilities you would expect from a demand planning solution, the platform also allows you to:

  • Prioritize demand for planning and fulfillment
  • Track demand plan metrics
  • Consolidate market demand plans
  • Handle NPI & transition planning
  • Define user-specific workflows

Supply Planning

The reciprocal of the demand planning module, the supply planning module in ketteQ leverages what they call the PolymatiQ solver (see their latest whitepaper for details).

Their capabilities for product and material planning include the ability to:

  • compute plans by the day, week, month, or any other time frame of interest
  • do so globally, regionally, locally, or at any level of the hierarchy you want
  • and do so for all regional, local, or any other subset of suppliers of interest, as well as view by customer-focused dimensions such as channel, business unit and customer
  • use the current demand forecast (with modifications), taking into account current and projected supply availability, safety stock, inventory levels, forecasted consumption rates, expected defect rates, rotatable pools, and current supplier commitments, among other variables (the netting sketch below illustrates the core arithmetic)
  • run scenarios that optimize for cost and service
  • coordinate raw and pack material requirements for each facility
  • support collaboration with suppliers and manufacturing
  • manage sourcing options and alternates (source/routes) for make, buy, repair and transfers

Moreover, supply plans, like demand plans, can be plotted over time based on any factor or factor pair of interest, such as supply by time frame, sourcing cost vs. fill rate, etc.
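Underneath a material plan like this sits time-phased netting; a simplified single-item, single-site sketch with invented numbers:

```python
# Time-phased netting for one item at one site: project the available
# balance from the demand forecast and scheduled receipts, and raise a
# planned order whenever the balance would dip below safety stock.
# All numbers are invented for illustration.

on_hand = 500
safety_stock = 200
forecast_demand    = [180, 220, 250, 300, 260, 240]  # units per period
scheduled_receipts = [0, 300, 0, 0, 400, 0]          # committed supply

balance = on_hand
for period, (demand, receipt) in enumerate(
        zip(forecast_demand, scheduled_receipts), start=1):
    balance += receipt - demand
    net_requirement = max(0, safety_stock - balance)
    if net_requirement:
        print(f"period {period}: plan an order for {net_requirement} units")
        balance += net_requirement  # assume the planned order lands in time
    print(f"period {period}: projected balance {balance}")
```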

In addition, the supply planning module for distribution requirements can:

  • develop daily deployment plans
  • develop time-phased fulfillment and allocation plans
  • manage exceptions and risks
  • conduct what-if scenario analysis
  • execute short-term plans
  • track obsolescence and perform aging analysis/tracking

Inventory Planning

We did not see or review the inventory planning module in depth, even though it is one of their three core modules, so all we can tell you is that it has most of the standard functionality one would expect, and, given the founders’ heritage in the service parts planning world, you know it can handle complex multi-echelon / multi-item planning. Capabilities include the ability to:

  • manage raw, pack, and finished goods inventory
  • set and manage dynamic safety stock, EOQ, and ROP levels and policies (see the formula sketch after this list)
  • ensure inventory balance and execution, with support for ASL (authorized stocking list), time-phased, and trigger planning by segment
  • support parametric optimization for cost and service balancing
  • minimize supply chain losses through better inventory management
  • optimize service levels relative to goals
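The dynamic safety stock / EOQ / ROP bullet maps onto textbook formulas; a quick sketch with invented parameters (real dynamic policies recompute these continuously as forecasts and lead times change):

```python
# Textbook inventory policy formulas behind the EOQ/ROP/safety-stock bullet.
# Parameters are invented; a dynamic policy would recompute them as the
# demand forecast and lead times move.

from math import sqrt

annual_demand = 12_000      # units/year
order_cost = 150.0          # $ per order placed
holding_cost = 4.0          # $ per unit per year
daily_demand = annual_demand / 365
demand_std_daily = 12.0     # std dev of daily demand
lead_time_days = 14
z = 1.65                    # ~95% cycle service level

eoq = sqrt(2 * annual_demand * order_cost / holding_cost)
safety_stock = z * demand_std_daily * sqrt(lead_time_days)
reorder_point = daily_demand * lead_time_days + safety_stock

print(f"EOQ: {eoq:.0f} units, safety stock: {safety_stock:.0f}, "
      f"ROP: {reorder_point:.0f}")
```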

Salesforce: IBP

As we noted, the ketteQ platform supports native Salesforce integration, and you can do full IBP through the custom front-end built in Salesforce CRM, which allows you to seamlessly jump back and forth between your CRM and SCM, following the funnel from customer order to factory supply and back again.

The Salesforce front-end, which is very extensive, supports the typical seven-step IBP process:

  1. Demand Plan
  2. Demand Review
  3. Supply Plan
  4. Pre IBP Review
  5. Executive IBP Review
  6. Operational Plan
  7. Finalization

… and allows it to be done easily in Salesforce design style, with walk-through tab-based processes and sub-tabs to go from summary to detail to related information. Moreover, the UI can be configured to only include relevant widgets, etc.

In addition, users can easily select an IBP Cycle; drill into orders and track order status; define custom alerts; subscribe to plans, updates, and related reports; follow sales processes including the identification and tracking of opportunities; jump into their purchase orders (on the supply side); track assets; manage programs; and access control tower functionality.

As a result of the integration with Salesforce objects, including Pipeline and Orders, the solution helps bridge the gap between sales and supply chain organizations, enabling executive-driven process change. As an advanced supply chain solution on the Salesforce AppExchange, it offers the broad base of Salesforce customers on the Manufacturing Cloud a slew of unique integration possibilities.
And, of course, if you don’t have Salesforce, you still have all this functionality (and more) in the ketteQ front-end.

Finally, the platform can do much more as it also has modules, as we noted, for service parts planning, service parts delivery, sales and operations planning, cost and price management, fulfillment & allocation, asset management, clinical demand management, and a control tower. It is a fundamentally modern approach to planning that is worth exploring for companies that are challenged to adapt in today’s disruptive supply chain environment. For a deeper dive into these modules and capabilities, check out their website or reach out to them for a demo. This is a recommendation for ANY mid-sized or larger manufacturing (related) organization looking for a truly modern supply chain planning solution.

Have You Brought Your Supply Chain Planning Out of the Middle Ages?

Back in the 1930s, the dark ages of computing began, starting with the Telex messaging network in 1933. Beginning as an R&D project in 1926, it became an operational teleprinter service, operated by the German Reichspost (under the Third Reich — remember we said “dark ages”). With a speed of 50 baud, or about 66 words per minute, it was initially used for the distribution of military messages, but eventually became a world-wide network of both official and commercial text messaging that survived into the 2000s in some countries. A few years later, in 1937, Bell Labs’ George Stibitz built the “Model K” adder, the first proof of concept for the application of Boolean logic to computer design. Two years later, the Bell Labs CNC (Complex Number Calculator) was completed. In 1941, the Z3, using 2,300 relays, was constructed; it could perform floating point binary arithmetic with a 22-bit word length and execute aerodynamic calculations. Then, in 1942, the ABC (Atanasoff-Berry Computer) was completed, and seen by John Mauchly, who went on to co-invent the ENIAC, the first general-purpose computer, completed in 1945.

Three years later, in 1948, Frederic Williams, Tom Kilburn, and Geoff Toothill developed the Small-Scale Experimental Machine (SSEM), the first digital, electronic, stored-program computer to run a computer program, one consisting of a mere 17 instructions! A year later came the modem, which allowed computers to communicate through ordinary phone lines; originally developed for transmitting radar signals, it was adapted for computer use four years later, in 1953. That same year, 1949, also saw the EDSAC, the first practical stored-program computer to provide a regular computing service.

A year later, in 1950, we saw the introduction of magnetic drum storage, which could store 1 million bits, a previously unimagined amount of data (and twice what Gates once said anyone would ever need), though nothing by today’s standards. Then, in 1951, the US Census Bureau got the UNIVAC 1, and the end of the dark ages was in sight. Then, in 1952, only two years after the magnetic drum, IBM introduced a high-speed magnetic tape that could store 2 million digits per tape! In 1953, Grimsdale and Webb built a 48-bit prototype transistorized computer that used 92 transistors and 550 diodes. Later that same year, MIT created magnetic core memory. Almost everything was in place for the invention of a computer that didn’t take a whole room. In 1956, MIT researchers began experimenting with direct keyboard input to computers (which up to then could only be programmed using punch cards or paper tape). A prototype minicomputer, the LGP-30, was created at Caltech that same year. A year later, in 1957, FORTRAN, one of the first third-generation computing languages, was developed. Early magnetic disk drives were invented in 1959. And 1960 saw the introduction of the DEC PDP-1, one of the first general-purpose minicomputers. A decade later came the first IBM computer to use semiconductor memory. And one year later, in 1971, we saw one of the first memory chips, the Intel 1103, and the first microprocessor, the Intel 4004.

Two years later, NPL and Cyclades started experimenting with internetworking with the European Informatics Network (EIN), and Xerox PARC began linking Ethernets with other networks using its PUP protocol. The Micral, based on the Intel 8008 microprocessor and one of the earliest non-kit personal computers, was released the same year. The next year, in 1974, the Xerox PARC Alto was released, and the end of the dark ages was in sight. In 1976, we saw the Apple I, and in 1981 we saw the first IBM PC, and the middle ages began as computing was now within reach of the masses.

By 1981, as the middle ages began, we already had GAIN Systems (1971), SAP (1972), Oracle (1977), and Dassault Systemes (1981): four (4) of the top fourteen (14) supply chain planning companies according to Gartner in their 2024 Supply Chain Planning Magic Quadrant (Challengers, Leaders, and Dassault Systemes). In the 1980s we saw the formation of Kinaxis (1984), Blue Yonder (1985), and OMP (1985). Then, in the 1990s, we saw Arkieva (1993), Logility (1996), and John Galt Solutions (1996). That means ten (10) of the top fourteen (14) supply chain planning solution companies were founded before the middle ages ended in 1999 (and the age of enlightenment began).

Tim Berners-Lee invented the World Wide Web in 1989, the first browser appeared in 1990, the first cable internet service appeared in 1995, Google appeared in 1998, and Salesforce, considered to be one of the first SaaS solutions built from scratch, launched in 1999. At the same time, we reached an early majority of internet users in North America, ending the middle ages and starting the age of enlightenment, as global connectivity was now available to the average person (at least in a first-world country).

Only e2open (2000), RELEX Solutions (2005), Anaplan (2006), and o9 Solutions (2009) were founded in the age of enlightenment (but not the modern age). In the age of enlightenment, we left behind on-premise and early single client-server applications and began to build SaaS applications using a modern SaaS MVC architecture: requests came in, were directed to the machine running the software, which computed the answers and sent them back. This allowed for rather fault-tolerant software, since if hardware failed, the instance could be moved, and if an instance failed, it could just be redeployed with backup data. It was true enlightenment. However, not all companies adopted multi-tenant SaaS from day one; only a few providers did in the early days. (So even if your SCP company began in the age of enlightenment, it may not be built on a modern multi-tenant cloud-native true SaaS architecture.) This was largely because there were no real frameworks on which to build and deploy such solutions (and Salesforce literally had to build their own).

However, in 2008, Google launched its cloud and, in 2010, one year after the last of the top-14 supply chain planning companies was founded, Microsoft launched Azure. With multiple cloud-based infrastructures now available to support cloud-native true multi-tenant SaaS applications (no data-centre operational knowledge required), it became easy for any true SaaS provider to develop these solutions from the ground up; the age of enlightenment came to an end and the modern age began.

In other words, not one Supply Chain Planning Solution recognized as a top supply chain planning solution by Gartner was founded in the modern age. (Moreover, if you look at the niche players, only one of the six was founded in the age of enlightenment, the rest are also from the middle ages.)

So why is this important?

  • If the SCP platform core was architected back in the day of client server, and the provider did not rearchitect it for true multi-tenant, even if the vendor wrapped this core in a VM (Virtual Machine), put it in a Docker container, and put it in the cloud, it’s still a client-server application at the core. This means it has all the limits of client server applications. One client per server. No scalability (beyond how many cores and how much memory the server can support).
  • If the platform core was architected such that each module, which runs in its own VM, requires a complete copy of the data to function, that’s a lot of data replication required to run the platform, especially if it has 12 separate modules. This can greatly exacerbate the storage requirements, and thus the cost.
  • But that’s not the big problem. The big problem is that models constructed on a traditional client-server architecture were designed to run only one scenario at a time, and only do so if a complete copy of the data is available. So if you want to run multiple models, multiple scenarios for a model, or both, you need multiple copies of the module, each with their own data set for each model scenario you want to run. This not only exacerbates data requirements, but compute requirements as well. (This is why many providers limit how many models you can have and scenarios you can run as their cloud compute costs skyrocket due to the inefficiency in design and data storage requirements.)

    And while there is no such thing as a truly optimal supply chain plan, since you never know all the variables in advance, there are near optimal fault-tolerant plans that, with enough scenarios, can be identified (by building up a picture of what happens at different demand levels, supply levels, transportation times, etc.) and you can select the one that balances cost savings, quality, expected delivery time, and risk at levels you are comfortable with.
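To see why this matters in storage terms, contrast a full copy per scenario with a shared base plus per-scenario deltas; a schematic sketch of the argument (not any particular vendor's storage layer):

```python
# Schematic contrast between the two architectures discussed above:
#   legacy: every scenario holds a (shallow) full copy of the planning data;
#   modern: scenarios store only their deltas over a shared base model.
# An illustration of the storage argument, not a real storage layer.

from collections import ChainMap

base_model = {f"sku_{i}": {"demand": 100 + i, "lead_time": 14}
              for i in range(100_000)}

# Legacy style: a full copy of the records per scenario.
legacy_scenarios = [dict(base_model) for _ in range(10)]  # 10x the records

# Modern style: a thin overlay per scenario, resolved through ChainMap.
deltas = [{"sku_42": {"demand": 500, "lead_time": 21}} for _ in range(10)]
modern_scenarios = [ChainMap(delta, base_model) for delta in deltas]

print("legacy records stored:", sum(len(s) for s in legacy_scenarios))  # 1,000,000
print("modern records stored:",
      len(base_model) + sum(len(d) for d in deltas))                    # 100,010
print("scenario 0 sees sku_42 as:", modern_scenarios[0]["sku_42"])
```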

That’s the crux of it. If you can’t run enough scenarios across enough models to build up a picture of what happens across different possibilities, you can’t come up with a plan that can withstand typical perturbations, and definitely can’t come up with a plan that can be rapidly put into place to deal with a major demand fluctuation, supply fluctuation, or an unexpected supply chain event.

So if you want to create a supply chain plan that can enable supply chain success, make sure you’ve brought your supply chain planning out of the middle ages (through the age of enlightenment) and into the modern age. And we mean you. If you chose a vendor a decade ago and are resisting migration to a newer solution, including one offered by that vendor, because you spent years, and millions, tightly integrating it with your ERP solution, then you’re likely running on a single-tenant SaaS architecture at best, and a nicely packaged client-server architecture otherwise. You need to upgrade, and you should do it now! (We won’t advise you here, as we don’t know all of the vendors in the SCP quadrant well enough, but we know some, including those that have recently acquired newer age-of-enlightenment and even modern-age solutions, and we know that some still have old tech on old stacks that they keep maintaining because of their install base. Don’t be the company stalling progress; upgrade for your own good!)

Part Analytics: Get PARTicular About Your Electronics-Enabled Supply Chain and Source Smarter with Deep ANALYTICS-based Insight

Over the past few years, a few vendors have come out of the factories to support your direct-specific supply chain, but there are still only a few that specialize in the Electronics Supply Chain (especially when you include deep sourcing [automation] support), and Part Analytics is one that you may not have heard of, but definitely should know of, given its ability to drastically reduce direct sourcing times for electronics components while reducing costs, lead times, uncertainty, and compliance risk.

Part Analytics was founded in 2019 to increase open collaboration between Original Equipment Manufacturers (OEMs), Electronics Manufacturing Services (EMS), and suppliers, starting with a comprehensive, standardized source of information for direct components and materials used in electronics-based manufacturing. This would allow demand and supply information to be shared, costs and lead times to be better managed, and supply chain risks minimized. Founded by global sourcing professionals with expertise in electronic and electro-mechanical supply chains, they applied their deep knowledge of products and buying processes to build a solution that would not only simplify the sourcing process for components and bills of material, but also allow much of it to be automated.

The Part Analytics solution is split into four primary modules (which can be accompanied by a fifth that serves as a cross-platform executive dashboard):

  • Part IQ: Contains detailed part data for the electronic / electro-mechanical parts the organization sources
  • BOM IQ: Contains detailed information on BOMs used by the organization (for sourcing purposes)
  • Category IQ: Aggregates part information across BOMs to provide insights into demand, benchmarks, commodity, and supplier risk information and provide analytical insights to reduce spend, lead-time, and risk
  • RFQ IQ: Deep RFQ functionality for sourcing (automation) on a Total Cost Basis (TCB) (with up to 97% manual time savings once an event has been set up)

In this article, we will look at each module individually, after noting that since Part Analytics is focused on the electronic and electro-mechanical supply chains, most of the sourcing projects (60% to 70%) revolve around PCBs (Printed Circuit Boards, not Polychlorinated Biphenyls) and related components. As a result, the focus of most of their customers is on part and material cost and lead-time optimization in those categories specifically, which is why their central focus in Part IQ is on those components.

Part IQ

Part IQ is the global supplier parts library that

  • represents the integrated catalog across all suppliers
  • maintains the organization AVLs (Approved Vendor Lists)
  • maintains Part Analytics‘ and the customers’ internal PPE (Prescribed Parts Equivalent) lists
  • maintains part cost and risk details by part
  • makes global search by part and equivalent quick and easy

Part IQ contains a database of hundreds of thousands of components from thousands of suppliers globally and provides real-time inventory availability monitoring from distributors and suppliers. It’s also capable of monitoring part availability and notifying the buyer as soon as a specific part becomes available anywhere in the network.

Deep detail is maintained on every single part, including information such as manufacturer and part number, usage, prices, savings opportunities, and alternates. It also cross-references the BoMs containing the part and the known risks associated with it.

This allows a design engineer to quickly gauge availability during R&D or build-to-order quoting, and allows sourcing professionals to quickly gauge immediate availability, lead times, and expected pricing if they also have Category IQ (discussed later), as it can pull in trends and insights from the Category IQ module. If they integrate with their PLM, they can also see current inventory within the tool, as well as pull in product forecasts to see if the available supply is likely to meet their demand.

It also helps R&D ensure compliance with industry and market requirements: since Part Analytics harmonizes all of the data, users can quickly tell not only if a product is compatible, but also if it is compliant with certain regulations, as detailed specifications with the required material composition will always be available in the drill-down.

BOM IQ

Arguably the core of the suite, BOM IQ (or Bill of Materials IQ) stores all of the electronic / electro-mechanical bill of materials being sourced by the organization as well as associated forecasts/demands from the PLM solution (and can push updates on material availability, inventory, AVL, and approved PPEs into the ERP if desired).

At the BOM level, it provides an organization with insight into:

  • the overall health rating (based on compliance, material/product risk, and product/part/line-item health; a toy roll-up is sketched after this list)
  • the number of unique line items with lead time, lifecycle, single source, RoHS, or other supported compliance risk (if data feeds/subscriptions are available)
  • the current annual spend summary and projected spend summary
  • BOM cost trend over time
  • the total estimated savings available based on alternates and negotiation
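To make the health-rating idea concrete, here is a toy roll-up from line items to a BOM score; the dimensions, weights, and scale are ours, not Part Analytics' actual scoring model:

```python
# Toy BOM health roll-up: score each line item on a few risk dimensions and
# aggregate to a BOM-level rating. Weights and scale are invented.

line_items = [
    {"mpn": "RES-0402-10K", "compliance_ok": True,  "single_source": False, "lead_time_weeks": 6},
    {"mpn": "MCU-STM32F4",  "compliance_ok": True,  "single_source": True,  "lead_time_weeks": 30},
    {"mpn": "CAP-X7R-1uF",  "compliance_ok": False, "single_source": False, "lead_time_weeks": 8},
]

def line_health(item):
    """Start at 100 and deduct for each risk flag."""
    score = 100
    if not item["compliance_ok"]:
        score -= 40
    if item["single_source"]:
        score -= 25
    if item["lead_time_weeks"] > 20:
        score -= 20
    return max(score, 0)

scores = {item["mpn"]: line_health(item) for item in line_items}
bom_health = sum(scores.values()) / len(scores)
print(scores)                                # per-line-item health
print(f"BOM health: {bom_health:.0f}/100")   # 72/100 for this toy BOM
```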

For every line item, it also stores all of the relevant associated information including, but not limited to, manufacturer, distributor, current costs, usages, risk/health rating, and information from past events. It’s very easy for a user to navigate around the BOM IQ product and see not only current prices and usage, but also drill into associated risks and compliance.

In a nutshell, the solution provides actionable data by leveraging technology to contextualize data from hundreds of sources including distributors and manufacturers.

Category IQ

Category IQ rolls up the line-item/part/component/material intelligence by category and allows an organization to get an overview of their spend, opportunity, and risk from various points of view such as commodity, supplier, business divisions and products. This allows the organization to get a comprehensive view of spend and savings potential from different viewpoints and make the best overall sourcing decisions for the organization.

More importantly, the organization can also see a roll-up of risks and non-compliance by category, which can be filtered to certain risk types so that the organization can address the most critical risks first. This is especially useful since it can roll up the number of units in a given life-cycle state, single-sourced, etc., and drill into parts coming from a specific region, allowing an organization to quickly assess the potential impact of geographic/geopolitical events and disasters on the supply chain.

Once the buyer has a firm handle on her categories, she can proceed to sourcing in RFQ IQ.

RFQ IQ

With a strong understanding of her categories, a buyer can initiate sourcing projects using RFQ IQ. This module simplifies the sourcing process by allowing buyers to set up events with ease. Key elements include defining line-items or BOMs, approved vendors, questionnaires, and bid sheets with detailed cost breakdowns.

Upon receiving bids, buyers benefit from a comprehensive summary that highlights the total parts up for bid, the number of bids received, and potential new spend based on the lowest bids. This summary also offers insights into category spend by business unit.

The platform enables buyers to delve deeper into individual supplier bids, comparing spend differentials and assessing the impact of choosing specific bids. Buyers can utilize automated features to award the lowest bid by supplier on a part or BOM basis, or make manual adjustments to finalize awards. Notifications are then sent to suppliers, and the award details can be integrated into the ERP system to initiate the contracting and P2P process.
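At its core, the automated award step is a selection over the bid matrix; a sketch with invented bids (the real award pass works on total cost and respects AVLs and split-award rules):

```python
# Minimal auto-award sketch: for each part, pick the supplier with the
# lowest unit bid. Bids are invented; a real award pass would evaluate
# total landed cost and honour AVL and split-award constraints.

bids = {  # part -> {supplier: unit price}
    "PCB-MAIN-REV3": {"SupplierA": 12.40, "SupplierB": 11.95, "SupplierC": 12.10},
    "CONN-USB-C":    {"SupplierA": 0.84,  "SupplierC": 0.79},
    "MCU-STM32F4":   {"SupplierB": 6.20,  "SupplierC": 6.45},
}

awards = {
    part: min(supplier_bids.items(), key=lambda kv: kv[1])
    for part, supplier_bids in bids.items()
}

for part, (supplier, price) in awards.items():
    print(f"{part}: award {supplier} at ${price:.2f}")
```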

One of the standout benefits of RFQ IQ is the ability to automate much of the sourcing process. Once the master file is established, buyers can launch events by simply defining timelines. Automation can handle the process from initiation to award recommendation, significantly reducing manual effort. For example, one client saw a reduction in manual effort from 710 hours to just 10 hours, thanks to the module’s robust automation capabilities. While results may vary, most organizations experience efficiency gains of 30% to 60%, depending on their automation preferences. Additionally, the overall sourcing process time can be cut from months to mere weeks, providing substantial time savings.

Furthermore, since the demand can be defined by business unit, it allows their customers to maintain their decentralized structure (as the platform can support bids by business unit when each is in a different location and would dictate a different landed cost) while still supporting volume consolidation through a Centralized Center of Excellence (COE) for cost reduction and best practice sourcing. This also allows an organization to get a fully centralized view into their global supply base by category, BoM, and part; identify key areas of material/part/product-based risk that needs to be assessed; and harmonize costs and lead-times at the same time.

By giving buyers a global view, it lets them identify all FFF (form/fit/function) and F component alternatives, including those that are more readily available, higher in quality, and/or earlier in their life-cycle, allowing the organization to identify potential strategic OEMs and suppliers early. And for off-the-shelf components, always having centralized insight into global supply across hundreds of distributors is extremely valuable when a disruption hits your current supply chain.

Furthermore, the fact that Part Analytics is PLM (and not ERP) first means that buyers have a firm handle on not only what Manufacturing needs, but what R&D is working on and can ensure R&D is not designing for materials/parts that could be expensive or hard to get and/or maintain a stable supply of when there are more affordable, more available, or more reliable alternatives available.

Plus, if desired, part and BoM population can be done entirely from spreadsheets, allowing an organization to get up and running quickly, as most organizations without a system custom-designed for electronics / electro-mechanical direct sourcing maintain all of their part and BOM info in spreadsheets, even if they have a modern PLM (and/or ERP). Not only does this allow Part Analytics to get an organization up and running quickly, but it also allows them to instill best practice: Part Analytics serves as the Parts Master and, once the PLM integration is completed, always keeps the BOM in sync, so the organization never has to worry about whether they are sending the sheet with the right version of the BOM (was that v.21 or v.23 we finally decided on?) to a supplier for bidding.

Looking Ahead

Right now, category intelligence is limited to cost trends over time, but by Q4 Part Analytics intends to release advanced commodity/sub-commodity insights around pricing trends, availability, and lead time in their Category IQ module, using advanced analytic and forecasting algorithms and supply and demand signals.

Also, as indicated above, future releases will support more data integrations for supplier (and not just material/component/part-level) risk analysis.

Summary

Part Analytics is a great solution for harmonizing sourcing, inventory, and supply chain visibility in your electronics and electro-mechanical spend categories. Furthermore, its integrations with hundreds of OEMs and distributors provide invaluable real-time insight into supply, demand, lead-time, and cost trends and benchmarks that can help organizations get a better handle on their overall sourcing efforts, especially if they primarily run as a decentralized operation across product lines / business units or geographies, since it can unite engineering and commodity sourcing teams around one coherent picture. It’s a great solution for part-based supply chain visibility, and for deeper insight into how to achieve this, you can download and check out their handbook on supply chain visibility.