Baidu Inc. is accelerating its push for AI self-sufficiency, revealing it has successfully trained a key version of its ERNIE 5.1 large language model on a fully domestic chip cluster that reached a 97% effective training rate. The announcement signals a significant step in China's efforts to develop a homegrown alternative to Nvidia Corp.'s dominant AI processors.
"On the fully domestic Kunlunxin cluster, the company has successfully completed training of a key version of ERNIE 5.1," Shen Dou, Baidu's Executive Vice President and President of its AI Cloud Group, said at the Create 2026 AI developer conference. He noted that, following large-scale validation, Baidu has delivered multiple ten-thousand-card clusters built on its Kunlunxin P800 chip since last year.
By the numbers, the domestic ten-thousand-card cluster achieved linear scaling efficiency exceeding 85%, according to the company. Baidu also announced that its next-generation "Tianchi 256-card supernode" will launch in June. The new hardware promises a 25% increase in throughput over the prior generation and has been adapted not only for Baidu's own ERNIE but also for mainstream models from competitors such as DeepSeek, GLM, and MiniMax.
This development is critical to Baidu's (9888.HK) competitive standing and to China's broader tech ambitions. Success with in-house AI hardware could reduce reliance on foreign suppliers amid ongoing trade tensions and bolster investor confidence in Baidu's AI-centric strategy. Controlling its own AI infrastructure stack could give the company an edge in China's vast cloud services market against rivals such as Alibaba and Tencent.
This article is for informational purposes only and does not constitute investment advice.