Since the breakout of ChatGPT, the AI market has been transformed, with Nvidia (NVDA.US) the name most closely associated with the booming GPU segment. The company's market capitalization surged, briefly making it the most valuable U.S.-listed stock. As market dynamics shift, however, a notable transition is underway: investment enthusiasm is pivoting from GPUs to ASICs, or application-specific integrated circuits.
One of the clearest signs of this shift is Nvidia's stock price trajectory. After an impressive run, the shares have recently lost momentum. In sharp contrast, Broadcom (AVGO.US), a leading player in the ASIC domain, has posted substantial stock price gains, drawing significant investor attention and capital inflows.
Typically, when AI chips come up, Nvidia's GPUs come to mind first. The current landscape, however, spans not only general-purpose GPUs but also fully custom ASICs, semi-custom field-programmable gate arrays (FPGAs), and occasionally CPUs. ASICs have emerged as a crucial category: chips designed for a specific application rather than for general-purpose computing.
ASICs represent a different approach to chip design: because their logic is fixed for a single target workload, they can deliver significant gains in performance, energy efficiency, die size, and per-unit cost over general-purpose chips such as CPUs and GPUs. Their applications are diverse and increasingly relevant, spanning cryptocurrency mining, data processing, image processing, and traditional networking.
Before the AI boom, ASICs were held back by high research and development costs, long development timelines, and market demand that was either too small or too fast-changing to justify a fixed-function design, which left the ASIC market lackluster. As the AI race gathers momentum, however, soaring demand for inference capacity, most of it currently served by Nvidia's AI chips, has set the stage for ASICs to shine. Today, ASICs are used primarily in inference scenarios and are beginning to penetrate some training workloads as well, showcasing their versatility.
Analysts at Southwest Securities note that massive demand for accelerated computing chips, particularly in inference clusters, is a core driver of ASICs' rapid growth. In 2023, ASICs accounted for roughly 16% of accelerated computing chips in data centers, a market of approximately $6.6 billion. As AI computing requirements expand, ASICs' share is expected to rise to around 25%, pushing the data center ASIC market to an estimated $42.9 billion by 2028, a compound annual growth rate (CAGR) of 45.4%.
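As a quick sanity check on those projections, the implied CAGR can be recomputed directly from the two dollar figures cited above. The short Python sketch below uses only the numbers from this paragraph (a roughly $6.6 billion market in 2023 growing to about $42.9 billion in 2028) and reproduces the quoted growth rate.

```python
# Recomputing the CAGR implied by the Southwest Securities estimates cited above:
# a ~$6.6B data-center ASIC market in 2023 growing to ~$42.9B by 2028.
start_value = 6.6    # 2023 market size, in billions of USD
end_value = 42.9     # 2028 projected market size, in billions of USD
years = 2028 - 2023  # five-year compounding window

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints ~45.4%, matching the figure quoted above
```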
A Barclays report likewise forecasts a sharp rise in AI inference computing demand, which could account for more than 70% of the total compute required for general-purpose AI, far exceeding training demand at an inference-to-training ratio of about 4.5:1. Nvidia's GPUs currently dominate the inference market with roughly an 80% share, but as major tech firms roll out customized ASIC chips, that figure is projected to fall to approximately 50% by 2028.
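Those two figures are consistent with each other: an inference-to-training ratio of 4.5:1 implies that inference takes 4.5 / (4.5 + 1), or roughly 82%, of total compute, which satisfies the "more than 70%" framing. The minimal sketch below simply restates that arithmetic.

```python
# Relating the two Barclays figures cited above: if inference demand is 4.5x
# training demand, inference's share of total compute follows directly.
inference_to_training = 4.5
inference_share = inference_to_training / (inference_to_training + 1)
print(f"Inference share of total compute: {inference_share:.0%}")  # ~82%, i.e. more than 70%
```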
The swift rise of ASICs is also reflected in the results of the companies involved. Nvidia continues to lead the GPU segment, but in the ASIC market Broadcom and Marvell (MRVL.US) have established themselves as the trailblazers. In its latest fiscal year, for instance, Broadcom's AI-related revenue surged 220% to $12.2 billion, driven largely by its customized AI accelerators and Ethernet networking products.
Broadcom's CEO recently indicated that demand for customized AI chips could reach between $60 billion and $90 billion by 2027, a forecast based on three hyperscale customers that are each expected to deploy AI clusters on the order of one million custom chips. Analysts believe that if these projections hold, Broadcom's ASIC-related AI business could keep expanding significantly over the next three years.
Similarly, Marvell reported robust fiscal third-quarter 2025 results, with revenue of $1.516 billion, up 7% year over year and 19% sequentially; both the quarter and the fourth-quarter outlook exceeded analyst expectations. According to the company's CEO, Marvell's strong growth is fueled primarily by sales of new customized AI chips to Amazon (AMZN.US) and other data center customers.
In conclusion, the AI landscape appears to be entering another major transition, with ASICs moving toward center stage. As key components in the architecture of future AI applications, ASICs offer solutions tailored to demanding computational workloads while promising substantial growth for the companies that design them. As AI evolves, both businesses and investors will need to stay agile and responsive to the innovation reshaping the industry.