Microchip technology, from creation to supply chain shortage

Timeline: Microchip technology
As the chip shortage continues to put pressure on the manufacturing supply chain, we outline the transformative history of this 20th-century invention

The ongoing semiconductor (chip) shortage continues to cause problems for the global manufacturing industry. Aggravated by the COVID-19 pandemic, which put many global industries on hold, the disrupted chip supply chain drew attention to a previously overlooked fact: our increasingly globalised and digitalised world relies on microchips to function.

With this growing concern came increased attention to the origins and function of the microchip. When was it invented? What is it used for? And why did it take a pandemic for us to realise how important it is to our world?

We explore the answers to these questions in our timeline below.

1959: The microchip is invented

In the 1950s, computers were expensive and complicated machines built from thousands of individual components that had to be wired together by hand. This growing tangle of parts and connections, the so-called ‘tyranny of numbers’, meant engineers could not keep improving the performance of their machines.

Determined to break this bottleneck, Jack Kilby, an engineer at Texas Instruments, set out to build the components of a circuit on a single piece of semiconductor material. He demonstrated his working chip to the company’s management, and in February 1959 a US patent for ‘Miniaturised Electronic Circuits’, the first integrated circuit, was filed. The era of modern computing had begun.

1961: Expensive progress

As the microchip entered wider use, it was adopted by the United States Air Force in its missile programmes and by NASA for the Apollo programme. At this stage, a single microchip cost US$31.

1965: Moore’s Law

Gordon E. Moore, who would go on to co-found Intel, observed that the number of transistors on a microchip doubled roughly every two years, while the cost of computing halved. This observation, which became known as Moore’s Law, suggested that computers would become cheaper even as their capabilities increased.
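
To make the arithmetic behind that claim concrete, here is a minimal Python sketch that projects transistor counts under a strict two-year doubling assumption. The 1971 baseline of roughly 2,300 transistors (the Intel 4004) is used purely as an illustrative starting point and is not part of the timeline itself.

```python
# Illustration of Moore's Law as stated above: transistor counts double
# roughly every two years. The 1971 baseline of ~2,300 transistors
# (the Intel 4004) is an assumption chosen only to anchor the projection.

def projected_transistors(year: int, base_year: int = 1971, base_count: int = 2_300) -> int:
    """Project the transistor count for a given year under a strict
    two-year doubling assumption."""
    doublings = (year - base_year) / 2
    return round(base_count * 2 ** doublings)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,} transistors per chip")
```

Run as written, the projection reaches tens of billions of transistors by 2021, broadly the scale of today's largest chips, though real devices have never tracked the law exactly.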

1971: Lower costs with mass supply chain production

Six years later, Moore’s Law was holding true. Thanks to the American government’s investment, the mass production of microchips had reduced their cost to US$1.25.

“It was the government that created the large demand that facilitated mass production of the microchip,” explained Fred Kaplan, author of 1959: The Year Everything Changed.

1986: Managing costs with the Semiconductor Agreement

Moore had not, however, accounted for how competing international interests and trade disputes would affect microchip manufacturing. The Semiconductor Agreement between the USA and Japan fixed minimum prices for chips so that competition between the two countries’ manufacturers did not get out of hand.

1998: First microchip is implanted into a human

The first microchip-to-human experiment took place at the end of the 20th century. Professor Kevin Warwick, Director of Cybernetics at the University of Reading, became the first person to have a microchip implanted in his body.

After one week, the microchip was removed. Warwick reported that smart-card-activated doors had opened for him and lights had blinked on around him.

2021: Mass production in China amidst the pandemic

Shortly after Texas Instruments’ invention in 1959, Chinese engineers built their own transistor. However, the Cultural Revolution meant their efforts went largely unnoticed, and even when China’s economy opened up to the world in the 1980s, its manufacturing firms lagged behind the rest of the industry.

In 2021, Chinese companies managed to produce 29.9bn chips during various COVID-19 lockdowns.

2022: The chip shortage causes a supply chain crisis

Microchip production, like many other industries, was halted by pandemic lockdowns. Since then, chip manufacturers have struggled to meet demand amid restrictions and shortages along the supply chain.

For example, the lockdown of Shanghai, China’s chip-manufacturing hub, disrupted the country’s entire tech supply chain and affected companies across the world, including Semiconductor Manufacturing International Corp. (SMIC).

It was also recently reported that the chip shortage is forcing Russian soldiers to raid abandoned kitchens in Ukraine, looking for microchips in dishwashers and fridges to use in their weapons.

In the face of this enduring global crisis, engineers are now looking for ways to enhance microchip technology, not only to boost global digitalisation but also to manage risk in case of another supply chain disruption.
