January 30, 2017
A Turbulent Industry: The Evolution of Component Distribution and the Future of Obsolescence Management
Since humans first harnessed static electricity, learning to produce and ultimately to store it usefully, inventors have been refining the ways in which we advance technology. Taiwan Semiconductor Manufacturing Company (TSMC) currently believes its semiconductor processing plant will be mass-producing transistors on a 5nm process by 2020. (Smaller transistors are preferred worldwide because they allow their “switch” to be reset more quickly, enabling faster data exchange. With a human hair around 75,000nm in diameter and a red blood cell around 6,000nm, 5nm is phenomenally small.) The pace of change is incredible, and as technology advances, the human element becomes the limiting factor – from iterative design through to simple things like shipping and delivery.
When Arrow Electronics was founded in 1935 in New York City, the company was focused on repairing radios, with components that were often as large as your fist. At this stage, obsolescence didn’t really exist as a concept outside of the horse/automobile dynamic, and certainly not in electronics. Television didn’t yet exist as a massive commercial venture, and transistor development only started to make large advances in the mid-20th century.
Today, in stark contrast, the electronics supply chain is one of the largest and most automated industries on the planet, worth hundreds of billions of dollars; the top 25 distributors alone account for over $85bn in annual sales. Many hundreds of companies are franchised to distribute components for original component manufacturers (OCMs), while many thousands more operate in the non-franchised space (concepts explained in my previous blog). With this huge spread of technology and such a massive choice of supply partners comes a similarly massive opportunity for nefarious organisations to exploit even the most sensitive supply chains. So who can you trust? When we layer onto this enormous market the knowledge that several well-respected experts place the prevalence of counterfeit electronics at up to 8% of the global market, we start to see larger issues with trust and supply-chain security.
Expertise in avoiding counterfeits exists across the market. Standards bodies such as SAE International and IDEA, and government agencies like the UK MOD and US DOD, can advise on best practices and offer certification, but in reality only some distributors adhere to them.
Of course, not every independent distributor is the same, and as with any industry, some are better able to absorb the cost, logistics, and time commitments needed to adhere to stringent certification requirements. For example, test and inspection commitments are a significant factor when considering partners in your supply chain. What level is appropriate for your customers, though?
Selecting suppliers with deep in-house investments in x-ray fluorescence (XRF – the measurement and determination of the composition of materials), x-ray imaging (the non-destructive testing of components to confirm internal characteristics), and decapsulation (the destructive testing of components to confirm internal characteristics) is advisable, but it restricts your breadth of choice. Our own quality and compliance expert Dwight Gerardi penned an excellent series of blogs detailing our capabilities; this is an excellent place to start when evaluating future suppliers and partners.
Ultimately, deciding what level of testing, certification, and security is appropriate for your supply chain and your customer base is something that will be unique to every company. However, in an increasingly risk-averse industry, how much of your brand capital could be lost by choosing the wrong partner?
To understand more about industry best practices, or anything covered in our Converge blog series, please reach out directly. I’m delighted to hear from you at email@example.com
January 10, 2017
It’s interesting that in an industry over 100 years old we would be revisiting past or seemingly settled topics to bring new perspectives. Even more interesting is that these revisits are driven not by the traditional march of technological development, but by market forces to which we need to respond. In our case, those forces are the increasing demands driven by obsolescence. Whatever the cause, the effects are stark and can be devastating.
In my last Converge blog I introduced the ideas of content, context and control and their place in the decision-making process. Today I’ll dig into why the actual decision-making process is often confused and hobbled by the volume of data companies have to process. Many companies don’t understand the difference between information and communication, choosing to make decisions based on information (content) instead of on the communication that arises from it (context). As a result, they lose control of their ability to make good decisions.
If we evaluate the process for problem solving, it generally follows the same outline, or the three D’s: drive, direction and decision.
Drive is what starts your process. This could be a reaction to a market event, a response to a client request, or an instruction from colleagues or leadership. Regardless of what you’re reacting to, this is internally motivated.
Direction is what you want to achieve in response to the drive: specifying obstacles and options, identifying partners and contributors, and planning and managing an approach. That approach can be a long-term strategic plan or a short-term fix.
Decision is the execution of the direction within your company structure and its constraints. As with drive, this is an internal process because it’s internally executed.
The mistake a lot of companies make is treating the direction piece, too, as an exclusively internal analysis. This segment of the process should be massively influenced by what you’re experiencing in the marketplace. Technology development, availability of product, volatility of a chosen component manufacturer, export controls, life cycles of components … anything at all that is beyond the internal control of your organisation should be considered when formulating a response.
This is a key reason why users of high-reliability, high-criticality components – like aerospace and defence manufacturers – are so often hit with obsolescence problems. Two of the external factors they must contend with are the qualification of designs against the original specification and then the certification of those designs for their intended use. Both are expensive and time-consuming … so these industries tend to have a higher-than-expected reliance on common, reused, tested and certified components. This means that current known issues are often simply replaced with future unknown issues, purely on the basis of cost and simplicity rather than the external factors that should influence the decision. Studies by Cranfield University show the vastly lower cost of cannibalising existing components compared with the cost of redesigning the assembly (top of page 21).
Open-market knowledge is a key area and will become increasingly important as obsolescence affects more of our industry in the coming months and years. OEMs have a difficult time acquiring this data simply because they tend to be removed from the “front lines” where components are bought and sold. We’re trying to educate companies to tilt this balance of control back in their favour. We’d be delighted to talk to you about your experiences, and for you to join our community of like-minded experts. Visit our new website and join our community to stay up to date.