
China is preparing to approve the import of Nvidia Corp.’s (NASDAQ:NVDA) H200 artificial intelligence chips for select commercial customers as soon as this quarter, according to people familiar with the matter.
The move represents a potential multibillion-dollar windfall for the Santa Clara-based chipmaker, whose roughly 95% share of China's AI chip market cratered to nearly zero following a series of U.S. export bans.
Chinese officials are reportedly drafting guidelines that would allow private-sector giants like Alibaba Group Holding and ByteDance to procure the H200 for AI model training.
However, the approval comes with strict caveats: the chips will be barred from use by the military, state-owned enterprises, and critical government infrastructure, mirroring earlier restrictions China placed on Micron and Apple products.
The shift follows a pivotal policy reversal by U.S. President Donald Trump in December 2025, which granted Nvidia permission to export the Hopper-generation H200 to "approved customers" in exchange for a 25% surcharge remitted to the U.S. Treasury.
While the H200 is roughly 18 months behind Nvidia’s cutting-edge Blackwell and Rubin architectures, it remains nearly six times more powerful than the "downsized" H20 chips previously available in the Chinese market.
In anticipation of the green light, demand from Chinese tech firms has been explosive.
People familiar with the situation say Alibaba and ByteDance have expressed interest in ordering more than 200,000 units each.
To manage the regulatory risk, Nvidia has reportedly implemented a strict "full upfront payment" policy for Chinese orders, with no options for refunds or cancellations.
The stakes for domestic self-sufficiency remain high.
While Nvidia prepares its return, local rivals such as Huawei Technologies and Cambricon Technologies are ramping up production.
Huawei plans to produce roughly 600,000 Ascend 910C chips in 2026, while Cambricon aims to triple its output to 500,000 units.
Despite these gains, industry experts note that Nvidia's H200 remains the "gold standard" for training large language models, a critical need for Chinese developers like DeepSeek as they race to keep pace with U.S. rivals.