Nvidia’s AI Boom Might Be Front-Loaded
Companies have devoted immense resources to building AI models over the last two years or so. Training these massive models is largely a one-time effort that demands considerable computing power, and Nvidia has been the biggest beneficiary, as its GPUs are regarded as the fastest and most efficient for the task. This is evident in Nvidia's recent revenue growth: sales are on track to expand from a mere $27 billion in FY'23 to almost $130 billion in FY'25. However, the AI landscape may be evolving. As models grow larger in parameter count, incremental performance gains are expected to diminish. Separately, the availability of high-quality training data is likely to become a bottleneck. With much of the Internet's high-quality data already consumed by large language models, there could be a shift from large-scale, general-purpose AI models to smaller, specialized ones, reducing demand for Nvidia's high-powered GPUs. The explosive demand Nvidia has witnessed over the last few years may well have been front-loaded, with future growth likely to slow.
AI-related chip demand could also shift from training to inference, the phase in which trained models generate outputs. Inference is less computationally intensive and could open the door for alternative AI processors. To be sure, Nvidia will likely remain the leader by far in inference as well (it says inference accounts for about 40% of its data center chip demand), but there is certainly an opening for rivals such as AMD, and potentially even Intel, to gain a bit of market share.
During the initial wave of generative AI, enterprises and big tech companies scrambled to invest in GPUs out of a "fear of missing out," without worrying about costs or returns on investment. This gave Nvidia a surge in pricing power, with net margins coming in at over 50% in recent quarters. However, companies and their investors will eventually look for returns on those investments, meaning they could become more judicious about AI spending going forward, which is likely to hurt margins. Moreover, beyond rivals such as AMD and Intel, Nvidia's biggest customers, including Google and Amazon, are doubling down on building their own AI chips. On Tuesday, Amazon announced plans to build an AI ultracluster, essentially a massive AI supercomputer built using its proprietary Trainium chipsets. This could also pose a risk to Nvidia's business.
Intel May Offer A Better Risk-Adjusted Return
Nvidia, meanwhile, trades at a lofty 48x projected FY'25 earnings. While the company has seen impressive growth recently, it remains to be seen whether the good times will last, and at the current valuation we see little room for error. The risks highlighted above could put Nvidia's future growth and margins under pressure, weighing on its earnings. As the AI market shows signs of evolving, investors could see better risk-adjusted returns by moving from Nvidia to more undervalued semiconductor players like Intel. Considering the above factors, Intel may have only one way to go, and that's probably up. For Nvidia, on the other hand, things could get a bit more tricky.