Nvidia CEO Jensen Huang highlighted the business advantage of high-speed AI chips, stating that “speed is a cost reduction.” At GTC 2025 in California, Huang emphasized that high-performance systems like Nvidia’s Blackwell Ultra, which delivers AI services up to 50 times faster than prior models, can significantly boost data center profitability by allowing cloud companies to serve more users simultaneously and operate more efficiently.
Major cloud providers, including Microsoft, Google, Amazon, and Oracle, are investing heavily in AI infrastructure. However, there is growing concern about the economic feasibility of such high-performance chips. Addressing these concerns, Huang presented data on the cost-effectiveness of Nvidia’s Blackwell GPUs: despite a $40,000 price tag, they can generate 50 times the revenue of previous models in cloud environments, underscoring their value.
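The economics behind Huang’s argument can be sketched with a back-of-the-envelope calculation. The $40,000 price and the 50x revenue multiplier come from the article; the baseline prior-generation price and revenue figures below are hypothetical placeholders chosen only to illustrate the revenue-per-dollar comparison.

```python
# Back-of-the-envelope sketch of the "speed is a cost reduction" claim.
# Only the $40,000 Blackwell price and the 50x revenue multiplier are
# from the article; the prior-generation figures are hypothetical.

prior_gen_price = 25_000      # hypothetical prior-gen GPU price (USD)
prior_gen_revenue = 10_000    # hypothetical annual cloud revenue per prior-gen GPU (USD)

blackwell_price = 40_000                      # price cited in the article (USD)
blackwell_revenue = prior_gen_revenue * 50    # 50x revenue claim from the article

# Revenue generated per dollar of hardware spend
prior_gen_rpd = prior_gen_revenue / prior_gen_price    # 0.40
blackwell_rpd = blackwell_revenue / blackwell_price    # 12.50

print(f"Prior gen: ${prior_gen_rpd:.2f} revenue per $1 of GPU")
print(f"Blackwell: ${blackwell_rpd:.2f} revenue per $1 of GPU")
```

Under these assumed numbers, the faster chip returns roughly 30 times more revenue per hardware dollar, which is the sense in which higher speed functions as a cost reduction even at a higher unit price.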
Nvidia’s sales of Blackwell GPUs reached 3.6 million this year, surpassing the 1.3 million sales of the Hopper GPU. Looking ahead, Nvidia revealed plans for future technologies with the Rubin Next in 2027 and Feynman AI chips in 2028. Huang also countered the notion that ASIC chips developed by cloud companies might replace Nvidia’s GPUs, arguing that the adaptability of GPUs to evolving AI algorithms provides a distinct competitive advantage.