AI boom to drive demand for chip materials

  • Metals
  • 23/06/30

The growing hype around artificial intelligence (AI) has highlighted the pivotal role of specialised silicon-based semiconductors in driving the deployment of new technologies at scale, and signals rising demand for metals used to make the chips themselves and the associated data centre servers.

AI, the ability of machines to perform tasks associated with human intelligence, requires specialised semiconductor chips that are optimised for advanced computation and are more powerful and more efficient than the chips used in consumer electronics. Demand from the sector will have a lasting impact on chip design and production, owing to the massive volumes of data that AI applications process and store. Although some general-purpose semiconductors can deliver basic AI functionality, they are becoming less useful as AI applications advance. And as demand for AI chips surges, so too does the need for silicon wafers.

The amount of silicon used in a single AI chip varies with its design and functionality. Silicon wafers with diameters of around 5-8 inches, and increasingly up to 12 inches, are doped with chemicals such as boron, phosphorus, arsenic and gallium to prepare the silicon for imprinting circuitry patterns. A metal layer is then laid over the imprinted wafer, and electrical circuits are etched into it, forming the chip's front-end. The back-end sits on top of this front-end and consists of insulating layers through which conductive metal wires, called interconnects, connect the electrical devices of the front-end. Interconnects were typically made from aluminium in the past but are now more commonly copper or cobalt based.

AI chips are forecast to account for up to 20pc of the $450bn total semiconductor market by 2025, according to consulting firm McKinsey. US-based Insight Partners projects that sales of AI chips will climb to $83.3bn in 2027 from $5.7bn in 2018, a compound annual growth rate (CAGR) of 35pc. That is close to 10 times the forecast growth rate for non-AI chips.
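The quoted growth rate can be checked directly from the endpoint figures above; a minimal sketch of the arithmetic, using only the $5.7bn (2018) and $83.3bn (2027) sales forecasts from the text:

```python
# Verify the implied compound annual growth rate (CAGR) from the
# Insight Partners forecast endpoints quoted above.
start, end = 5.7, 83.3   # AI chip sales, $bn
years = 2027 - 2018      # 9-year span

# CAGR formula: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~34.7%, consistent with the quoted 35pc figure
```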

Data centre servers are central to AI computation, particularly as algorithms are typically trained in the cloud. The rising adoption of cloud computing and the emergence of 5G telecom technology, which provides fast data transmission with low latency, are driving demand for servers that rely on AI chips for efficient processing in healthcare, automotive and financial services applications.

US-based chipmaker Nvidia made waves in May when it announced that demand for AI chips from data centres had driven its second-quarter revenue guidance to $11bn, well above analysts' estimates of $7.15bn.

Around two-thirds of the rise in demand for AI hardware will come from data centre servers, based on McKinsey's forecast.

Although silicon is the foundation of AI semiconductors, minor metals such as indium contribute to data centre server performance. Optical fibres and cables used to transmit data between servers and networking equipment are coated with indium tin oxide (ITO) to improve signal transmission and reduce losses. Indium phosphide (InP) is used in the production of high-speed photodetectors and laser diodes for optical communications. And indium-based solder alloys can be used in the production and assembly of electronic components in servers, enabling precise and reliable soldering connections.

"The computer industry is going through two simultaneous transitions — accelerated computing and generative AI," according to Nvidia's founder and chief executive Jensen Huang. "A trillion dollars of installed global data centre infrastructure will transition from general purpose to accelerated computing as companies race to apply generative AI into every product, service and business process." Nvidia is "significantly increasing" output of its data centre products to meet growing demand, Huang said.

US-based Inflection AI said yesterday that it had raised $1.3bn in financing from Microsoft, Reid Hoffman, Bill Gates, Eric Schmidt and Nvidia. It will use the funding to help build an AI cluster of 22,000 Nvidia GPUs, around three times the computing power used to train the ChatGPT-4 generative AI tool, indicating the scale of potential chip volumes at play.

Nvidia's rival AMD plans to ramp up production of its new AI data centre chip in the fourth quarter and has introduced upgraded versions of other chip models to address AI demand.

With AI chips expected to be used in smartphones, laptops, vehicles, manufacturing robots, surveillance systems, and military hardware, minor metals consumption will be central to the technology transition.
