<img src="/Uploads/Blogs/E7/19/IB-FQOFEC3U_9D0F6B09.jpg" alt="Alibaba introduced a new AI model that outperforms DeepSeek R1">
<p>Chinese Alibaba Group announced the AI model QwQ-32B, which is open source and capable of reasoning. According to the developers, it outperforms DeepSeek R1 on a number of tasks while using far less computing resources.</p>
< P Data-Start = "336" Data-end = "466" >After this announcement of the Alibaba share increased by 7.5% on the Hong Kong Exchange, which became the largest day jump in two weeks.
< H3 Data-Start = "468" Data-End = "500" > What is known about QWQ-32b ?0 ~/H3 > < Ul Data-Start = "501" Data-end = "865" > < Li Data-Start = "501" Data-end = "642" > has 32 billion parameters but exceeds Deepeseek R1 (671 billion parameters) in mathematics, programming and general questions. < Li Data-Start = "643" Data-End = "723" > Due to the smaller number of parameters the model works faster and more efficient. < Li Data-Start = "724" Data-end = "808" > uses training with reinforcement to improve the ability to think. ~ ~ < Li Data-Start = "809" Data-End = "865" > Ahead of Openai O1-Mini (100 billion parameters).
< P Data-Start = "867" Data-end = "1052" > qwq-32b is available for testing for Hugging face & ndash; The largest platform for open WI models. It can also be tested through chat bot & nbsp; qwen & nbsp; under the name QWQ-32b-PREView.
~ < H3 Data-Start = "1054" Data -nd = "1091" > Alibaba makes a bid on a si< P Data-Start = "1092" Data-end = "1283" > The company announced an investment of $ 52 billion in cloud computing and AI infrastructure for three years. It is the largest Shi project funded by a private company in China.
< P Data-Start = "1285" Data-end = "1437" Data-is-Last-Node = "" Data-is-Oly-Node = "" > Alibaba Head of Alibaba Eddie Wu stated that the company seeking to create general-purpose; a system capable of performing 80% of human tasks.