Semiconductors are inseparable from technology. They dramatically changed its course in the 1940s with the successful demonstration of the first transistor, and semiconductor lasers later became the workhorse of optical-fibre communications. The role of semiconductors in electronic circuits and lasers is proof of their undeniable importance in our modern world.
As the world evolves into the next phase of digitization and the era of Web 3.0, semiconductors are once again at a crucial inflection point. This is not merely driven by geopolitical factors, such as the tensions around TSMC, Taiwan and China, or by the supply-chain disruptions of the Covid-19 pandemic, which caused delays in various industries, including automotive.
Rather, it is fueled by transformative shifts toward electric vehicles, artificial intelligence (AI) and the transition to 5G/6G wireless networks, as reflected in the recent surge in the share prices of key players such as Nvidia. The SMH index, comprising 25 industry leaders, has already risen by approximately 25% this year, while carrying a lower equity beta than most technology and AI stocks. These trends signal a promising future for semiconductors, with forecasts suggesting the industry could reach a valuation of US$1 trillion within the next five years.
Bubble, or beyond Moore?
However, amidst this optimism, the question arises:
Is this growth sustainable or merely a bubble?
For some players like Nvidia, share-price performance closely tracks strong Return on Assets (ROA) and Return on Equity (ROE), both of which have expanded significantly in recent months. Yet, amid the rapid evolution of the semiconductor industry, it is important to identify and shape the new dynamics emerging around at least five trends:
Trend 1: Hello (generative) AI: How will demand evolve as generative AI models become a major driver of semiconductor consumption?
Trend 2: Product Evolution: Is silicon still the reigning champion, or will compounds like gallium nitride (GaN) dominate the landscape with their superior electrical properties and energy efficiency?
Trend 3: Twin Transformation: Can sustainability and digitalization coexist harmoniously, or does the energy-intensive nature of digital technologies present a stumbling block?
Trend 4: Hypercompetition: How will the competitive landscape change as tech giants increasingly design their own chips?
Trend 5: Platform Battle: With the rise of Arm architecture challenging the dominance of x86, how will this shift reshape the semiconductor ecosystem, particularly in terms of chip architectures and supplier dynamics?
Hello Generative AI
A major driver of semiconductor demand in recent months has been the development of powerful generative AI models on top of already booming AI applications such as deep learning for computer vision, robotics and the Internet of Things (IoT).
For these models to work, a special type of chip is needed: AI accelerators (or deep-learning processors), which speed up AI computations, making them significantly faster and more power-efficient than general-purpose processors. These accelerators typically combine manycore designs with low-precision arithmetic, optimised to handle data at reduced precision, and novel dataflow architectures that process data efficiently through specialised pipelines.
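To make the low-precision idea concrete, here is a minimal Python sketch of symmetric int8 weight quantization, the kind of reduced-precision representation these accelerators exploit. It is illustrative only: the matrix shape, the scaling scheme and the numpy implementation are assumptions for the sketch, not any vendor's actual kernel.

```python
import numpy as np

# Illustrative stand-in for a layer's trained fp32 weights.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)

# Symmetric int8 quantization: map [-|w|_max, |w|_max] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize to measure how much accuracy the low-precision format costs.
deq = q.astype(np.float32) * scale
print(f"memory: {weights.nbytes} B (fp32) -> {q.nbytes} B (int8)")
print(f"mean absolute error: {np.abs(weights - deq).mean():.5f}")
```

On accelerator hardware, the matrix multiplications themselves then run on the int8 values, trading a small, measurable accuracy loss for large gains in throughput and energy efficiency.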
Leading vendors include Nvidia with its Tensor Core GPUs, Google with its Tensor Processing Units (TPUs) and AMD with its Radeon Instinct accelerators. These specialized chips are optimized for deep learning tasks and are used extensively in data centers for AI training and inference. Nvidia’s Tesla GPUs, for instance, power AI applications in industries ranging from healthcare to autonomous vehicles, demonstrating the significant role of semiconductor companies in meeting evolving AI demands.
Uncertainty does not come from the AI demand push – it comes from the fact that there is no dominant design (yet?) and that the evolution of AI in terms of horizontal versus vertical LLMs is still to be worked out.
Is silicon here to stay? The rise of gallium nitride (GaN)
GaN is a compound semiconductor with superior electrical properties that will usher in a new era of energy-efficient electronics. GaN has a very hard crystal structure and a wide bandgap, making it more suitable for optoelectronic, high power and high frequency applications such as blue LEDs, microwave power amplifiers and space applications (e.g. satellite solar panels).
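As a rough illustration of why the wide bandgap matters for optoelectronics, the short Python sketch below converts a bandgap energy into the corresponding photon wavelength via lambda = hc / E_g. This is textbook physics rather than anything from the article, and the bandgap values are widely cited room-temperature approximations.

```python
# Planck constant times speed of light, expressed in eV*nm.
HC_EV_NM = 1239.84

# Approximate room-temperature bandgaps (eV) for the two materials.
bandgaps_ev = {"Si": 1.12, "GaN": 3.4}

for material, eg in bandgaps_ev.items():
    wavelength_nm = HC_EV_NM / eg
    print(f"{material}: E_g = {eg} eV -> lambda ~ {wavelength_nm:.0f} nm")

# Si  -> ~1107 nm, deep in the infrared (and an indirect gap besides)
# GaN -> ~365 nm, at the violet/blue end of the visible spectrum
```

This is why blue LEDs had to wait for a practical wide-bandgap material: silicon physically cannot emit at those wavelengths.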
However, it is increasingly being used in power supplies for electronic devices, converting alternating current from the grid into low-voltage direct current. GaN technology can handle larger electric fields in a much smaller form factor than silicon, while switching much faster. GaN is becoming essential, for example, in power-conversion platforms where silicon has reached its limits, or for the edge-computing/mobile transition to Web 3.0. GaN chips are also easier and faster to manufacture than silicon chips – manufacturing capacity having been a major bottleneck for semiconductors in the recent past. In other words, companies are turning to GaN for more efficient and smaller electronic devices.
The promise of twin transformation
Twin transformation is the hope that sustainability and digitalisation are highly complementary. Digital technologies can, for example, enable people to work efficiently from home, reducing the environmental impact of commuting. At present, however, the twin-transformation vision is not yet borne out, because digitalisation is an energy-intensive process: a single semiconductor factory can consume up to 1 TWh of energy per year and two to four million gallons of ultra-pure water per day. Semiconductor companies understand the challenge and are unveiling their sustainability roadmaps, alongside fully digital-native players. This includes moving cloud workloads to data centres with access to renewable energy, or improving semiconductor design. The shift to GaN, however, is a game changer, as it radically reduces energy consumption.
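To put the 1 TWh-per-year figure in perspective, a quick back-of-the-envelope calculation helps. The fab figure is the one quoted above; the household consumption value is an outside assumption (roughly the US average), not from the article.

```python
# Average continuous power draw implied by 1 TWh of energy per year.
fab_energy_twh_per_year = 1.0
hours_per_year = 8760

# 1 TWh = 1e6 MWh; dividing by hours gives the mean draw in MW.
avg_power_mw = fab_energy_twh_per_year * 1e6 / hours_per_year

# Assumption: ~10,700 kWh/year for an average US household.
household_kwh_per_year = 10_700
equivalent_households = fab_energy_twh_per_year * 1e9 / household_kwh_per_year

print(f"average draw: ~{avg_power_mw:.0f} MW continuous")
print(f"roughly {equivalent_households:,.0f} average US households")
```

Under these assumptions, a single leading-edge fab draws on the order of 115 MW around the clock, comparable to the consumption of a mid-sized city, which is why siting and powering fabs with renewables has become a board-level question.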
Value chain integration or specialization?
So far, the value chain has worked by mega AI users (e.g. Google) outsourcing, and thus buying chips from, third parties. However, this is changing. Many tech giants, such as Apple, Tesla, Google and Amazon, are now making their own chips designed specifically for their products. Google unveiled its Pixel 6 and Pixel 6 Pro phones, which use Tensor, the first chip designed by Google to bring AI capabilities to its mobile phone range, and Apple’s 2021 MacBook Pros are based on the company’s own M1 chips.
This shift towards in-house chip development poses a potential challenge to the existing model of outsourcing semiconductor production to third-party manufacturers. Notably, this trend could disrupt the dominance of companies like Nvidia in the horizontal market for AI accelerators, as tech giants seek greater control over their semiconductor supply chains.
The platform battle: chip architectures
The x86 architecture has dominated the microprocessor industry for more than 50 years. However, this is now changing with the growing popularity of Arm. While the Arm architecture was born out of the need for low-power chips for vertical applications, it is starting to emerge as not only a low-power solution, but also a high-performance competitor to the established x86 players.
Google and AWS have decided to build their own chips and have chosen the Arm architecture to do so because of its performance and low power consumption, which have become so important for power-hungry data centers, consumer products and sustainability efforts. This growing shift to Arm is changing the dynamics of the semiconductor ecosystem. Unlike the x86 platform, where companies can buy from one or two suppliers, Arm acts as a broker, licensing its IP to many companies.