DeepSeek Highlights Open-Source AI Value, Says Alibaba Chairman


Alibaba Group Holding Chairman Joe Tsai emphasized the growing importance of open-source artificial intelligence initiatives, particularly DeepSeek, in making advanced AI more accessible and cost-effective.

The Impact of Open-Source AI

Speaking at the World Government Summit in Dubai, Tsai highlighted how DeepSeek has demonstrated the value of open-source AI. “The thing about the open-source community is that people share everything that you contribute to,” Tsai noted. He argued that purely closed-source AI development is losing its value compared to collaborative models.

DeepSeek has challenged the notion that only a few tech giants can develop advanced AI models. Tsai pointed out that engineering innovation, as demonstrated by DeepSeek, can significantly reduce the cost of training and inference for large language models.


Competitive AI Pricing and Industry Disruption

DeepSeek-R1, released on Jan. 20, 2025, has shown competitive performance compared to leading AI models at a fraction of the cost. Silicon Valley investor Marc Andreessen even called it an “AI Sputnik moment.”

DeepSeek’s API services cost CNY1 (USD0.14) per million input tokens and CNY16 (USD2.20) per million output tokens—substantially lower than OpenAI's pricing, which stands at USD15 and USD60, respectively. Tsai suggested that rather than investing hundreds of billions into computing infrastructure, companies should focus on AI applications to drive economic impact.
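To make the gap concrete, the sketch below works through an illustrative cost comparison using the per-million-token rates quoted above. The workload size is a hypothetical example chosen for illustration, not a figure from the article.

```python
# Illustrative cost comparison using the per-million-token rates quoted above.
# The sample workload (10M input tokens, 2M output tokens) is hypothetical.

def api_cost(input_tokens, output_tokens, input_rate_usd, output_rate_usd):
    """Return the cost in USD given token counts and per-million-token rates."""
    return (input_tokens / 1e6) * input_rate_usd + (output_tokens / 1e6) * output_rate_usd

workload = {"input_tokens": 10_000_000, "output_tokens": 2_000_000}

deepseek = api_cost(**workload, input_rate_usd=0.14, output_rate_usd=2.20)
openai = api_cost(**workload, input_rate_usd=15.00, output_rate_usd=60.00)

print(f"DeepSeek: ${deepseek:,.2f}")  # ~$5.80
print(f"OpenAI:   ${openai:,.2f}")    # ~$270.00
```

At the quoted rates, the same hypothetical workload costs roughly $5.80 on DeepSeek versus about $270 on OpenAI, which is the order-of-magnitude difference Tsai was pointing to.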


Alibaba’s AI Partnerships and Open-Source Contributions

At the conference, Tsai confirmed that Apple has chosen Alibaba as its AI partner for iPhone features in China.

Alibaba has actively contributed to open-source AI, launching ModelScope in 2022 and its first open-source AI model, Qwen-7B, in 2023. ModelScope now supports various AI applications, and over 90,000 Qwen-based derivative models have been developed on Hugging Face.
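For readers who want to try one of these open models, a minimal sketch follows, assuming the Hugging Face transformers library and the publicly released Qwen/Qwen-7B checkpoint; the prompt is purely illustrative.

```python
# Minimal sketch: loading the open-source Qwen-7B checkpoint from Hugging Face.
# Assumes the transformers library is installed; Qwen-7B ships custom model code,
# so trust_remote_code=True is required to load it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Open-source AI matters because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```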

Tsai likened the AI race to education, suggesting that developing effective AI applications should take precedence over relying solely on massive computing resources.
