News

BEIJING/SHANGHAI (Reuters) - Huawei's artificial intelligence research division has rejected claims that a version of its ...
Huawei has denied claims that its Pangu Pro MoE model copied Alibaba’s Qwen 2.5 14B, after an anonymous paper alleged high ...
Huawei released the source code for its Pangu 7B dense-parameter model and the larger Pangu Pro mixture-of-experts (MoE) 72B ...
AI-native cloud solutions will help drive South Africa’s intelligent transformation and unlock economic growth.
Huawei argues that its model was independently trained and developed on its own Ascend hardware platform, refuting claims made online that it used code from another source.
Huawei's Noah's Ark Lab has denied claims that its Pangu Pro MoE model copied elements from Alibaba's Qwen 2.5 14B, asserting independent development and training.
Pangu 5.5 made its global debut at Huawei's developer conference last month, so it has taken only a few weeks for the solutions to be rolled out to local customers.
The release also includes inference technology optimized for Huawei’s Ascend chips. Commercial Times highlights that this is the first time Huawei has open-sourced the core capabilities of its Pangu ...
Huawei released the large-scale language model 'Pangu Pro MoE 72B', with 72 billion parameters, on Monday, June 30, 2025.
Making Pangu available as open source allows developers and businesses to test the models and customize them for their needs, said Lian Jye Su, chief analyst at Omdia.
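As a rough illustration of the kind of experimentation the open-source release enables, the sketch below loads an open-weights checkpoint with the Hugging Face transformers library and runs a short generation. The repository ID, runtime settings, and the use of transformers at all are assumptions made for illustration; they are not Huawei's documented workflow, which centers on Ascend-optimized inference tooling.

```python
# Minimal sketch, assuming a Hugging Face-hosted checkpoint exists at the
# placeholder ID below; Huawei's official release may use different tooling.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "example-org/pangu-pro-moe-72b"  # placeholder, not an official repository ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,
    device_map="auto",  # spread the 72B mixture-of-experts weights across available devices
)

prompt = "Explain mixture-of-experts language models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```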
Huawei last month revealed a new 5.5 version of its Pangu foundation model and on Monday released an open-source version for developers.