Zhipu Up 40%, MiniMax Up 25%, and the Chip Sector Soaring Too: Hong Kong’s AI Concept Stocks Explode


Hong Kong’s AI concept stocks surged in unison on February 12, as a dense wave of domestic large-model releases around the Spring Festival lifted related sectors across the board.

Zhipu’s stock rose more than 40% after the company announced a price increase for its AI programming subscription plans and officially launched its new flagship model, GLM-5, the day before. Fellow large-model stock MINIMAX-WP soared more than 21%; the same day, MiniMax released its latest programming model, M2.5, billing it as the world’s first production-grade model natively designed for Agent scenarios. Both companies are centering their upgrades on programming and agent capabilities.

The chip sector moved higher in tandem. Among the companies known as the “Four Little Dragons” of domestic GPUs, Tiandao Zhixin’s gains widened to 25% in the afternoon and Bairen Technology rose nearly 10%, while GigaDevice gained more than 8%. Sustained demand for AI computing power continues to brighten the outlook for related hardware makers.

Behind this rally is a concentrated burst of new releases from domestic large-model companies during the Spring Festival window. Following DeepSeek’s earlier release of a new model, products such as Alibaba’s Qwen 3.5 and ByteDance’s Seedance 2.0 have been unveiled in quick succession, intensifying industry competition.

Zhipu GLM-5 Parameter Scale Doubled

During the Spring Festival period, the domestic large-model race entered a phase of intensive releases. After DeepSeek’s new model, Alibaba’s Qwen 3.5, ByteDance’s Seedance 2.0, and others were launched one after another. That so many vendors chose this window to ship new products at once shows industry competition continuing to heat up.

According to Wall Street Journal reports, Zhipu’s GLM-5, launched on February 11, expanded its parameter scale from the previous generation’s 355 billion to 744 billion, with activated parameters increasing from 32 billion to 40 billion. Pre-training data volume increased from 23 terabytes to 28.5 terabytes. Zhipu confirmed that the mysterious model “Pony Alpha,” which topped the global model service platform OpenRouter’s popularity list, is actually GLM-5.

This model introduces DeepSeek’s sparse attention mechanism for the first time, reducing deployment costs and improving token-utilization efficiency while maintaining long-text performance. Architecturally, GLM-5 has 78 hidden layers and integrates 256 expert modules, of which 8 are activated per token, yielding approximately 44 billion active parameters, a sparsity of 5.9%, and a maximum context window of 202,000 tokens. Internal evaluations show GLM-5 improving performance by more than 20% on average in front-end, back-end, and long-horizon programming scenarios, with a real-world coding experience approaching Claude Opus 4.5.
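For readers who want to check the arithmetic, the sparsity figure quoted above follows directly from the reported parameter counts; the calculation below is simply a consistency check on the article’s numbers, not an additional disclosure from Zhipu:

$$\text{sparsity} \approx \frac{44 \times 10^{9}\ \text{active parameters}}{744 \times 10^{9}\ \text{total parameters}} \approx 0.059 = 5.9\%$$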

According to Shanghai Securities News, an AI industry insider in Shanghai observed that domestic large models previously competed mainly on lower prices; Zhipu’s sizeable price increase this time points to a clear step up in the technical capability and market competitiveness of domestic models.

MiniMax Competes at the International Top Level

MiniMax M2.5 is positioned as the world’s first production-grade model natively designed for Agent scenarios, with programming and agent performance pitched as directly comparable to Claude Opus 4.6. The model supports full-stack development across PC, app, and cross-platform applications, and is claimed to lead in advanced Excel processing, deep research, PPT generation, and other core productivity scenarios.

M2.5 activates only 10 billion parameters, giving it significant advantages in VRAM usage and inference efficiency; it supports ultra-high throughput of 100 TPS, with inference speeds far ahead of top international models. The release marks another rapid iteration for MiniMax, coming just over a month after the previous version, 2.2.

The Crowded “Spring Festival Release Season”

The 2026 Spring Festival is no longer just a consumption holiday; it has turned into a high-speed race among China’s AI giants for mobile entry points.

A JPMorgan report dated February 11 noted that China’s internet and AI industries are experiencing the most intensive wave of flagship model releases in their history. This is no longer a single-model showcase but a contest over who can most quickly turn “technological spillover” into a “consumer-grade blockbuster.”

ByteDance took the lead, releasing a “bundle” of three models: Seedance 2.0 (video), Seedream 5.0 (image), and Doubao 2.0. Among them, Seedance 2.0 has already shown “blockbuster” signals.

Alibaba is also not to be outdone, reportedly preparing to launch Qwen 3.5 in mid-February, supported by a 3 billion yuan reward plan to attract users.

DeepSeek is reportedly targeting a mid-February release for V4, with key improvements in coding and ultra-long prompt handling. On February 11, reports indicated that DeepSeek had updated its latest model to support a context length of up to 1 million tokens.

Risk Warning and Disclaimer

Markets carry risk; invest with caution. This article does not constitute personal investment advice and does not take into account individual users’ specific investment objectives, financial situation, or needs. Users should consider whether any opinions, views, or conclusions herein suit their particular circumstances. Invest at your own risk.
