Zhipu AI
Beijing-based LLM lab spun out of Tsinghua University — China's most government-aligned frontier lab, with the GLM model family and a strong vertical / front-end agent specialization.
1. Core Product / Service
Zhipu AI (智谱AI / 北京智谱华章科技) was founded 2019 as a Tsinghua University spin-out. Core products:
- GLM model family — the General Language Model series: GLM-4 (2024), GLM-4.5 / 4-Plus, the GLM-5 generation in 2025 (with GLM-5.1 positioned as the front-end / GUI-agent leader among Chinese frontier-tier models), and GLM-6 previewed / discussed in 2026.
- GLM-4V / GLM-4.5V — multimodal vision-language variants.
- CogVideoX — open-weight video generation model line.
- CodeGeeX — coding-specialized model (also distributed as a JetBrains / VSCode plugin).
- Open-weight releases: GLM-4-9B, GLM-4-32B, ChatGLM lineage, CogVideoX, all on Hugging Face under permissive (model-license) terms. Frontier GLM-5 / 6 variants are partially closed.
- BigModel.cn / open.bigmodel.cn — Zhipu's commercial API platform; OpenAI-compatible chat + embeddings + image + video + agent endpoints.
- Qingyan (清言) — consumer chat assistant.
- AutoGLM — Zhipu's agent product, focused on autonomous web / phone GUI control.
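Because BigModel exposes an OpenAI-compatible surface, any OpenAI-style client can talk to it. A minimal sketch using only the Python standard library — the endpoint path (`/api/paas/v4/chat/completions`), the model id `glm-4-plus`, and the payload shape are assumptions to verify against the current BigModel.cn API reference:

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint and model id -- check both against
# the live BigModel.cn API reference before relying on them.
BASE_URL = "https://open.bigmodel.cn/api/paas/v4/chat/completions"
API_KEY = "your-bigmodel-api-key"  # placeholder, not a real key

def build_chat_request(prompt: str, model: str = "glm-4-plus") -> urllib.request.Request:
    """Construct (but do not send) an OpenAI-style chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("Summarize the GLM model family in one sentence.")
# With a real key, sending is one call: urllib.request.urlopen(req)
print(req.full_url, json.loads(req.data)["model"])
```

With a valid key, the response should come back in the usual OpenAI chat-completions JSON shape, which is what makes drop-in client reuse possible.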
2. Target Users & Pain Points
- Chinese government + state-owned-enterprise (SOE) buyers — Zhipu's Tsinghua provenance and explicit alignment with Chinese AI policy make it the default choice for ministerial / SOE / regulated deployments. This is the distinguishing channel vs DeepSeek (independent), Kimi (private VC), Qwen (Alibaba commercial).
- Domestic Chinese developers — BigModel API is the 1P commercial path for production deployments inside China.
- Agent / automation builders — AutoGLM positions Zhipu in the GUI-agent niche where front-end / browser / phone control is the use case.
Pain points solved: a politically clean Chinese frontier model for the state sector, and a front-end / GUI-agent specialization genuinely differentiated from the text-first positioning of DeepSeek / Kimi / Qwen.
3. Competitive Landscape
| Lab | Origin | Open weights? | Niche |
|---|---|---|---|
| Zhipu | Tsinghua / China | Hybrid (older GLM open, frontier closed) | Front-end / GUI agent + government channel |
| DeepSeek | China (High-Flyer) | Yes (MIT / model license) | Cost / frontier-tier |
| Kimi | Moonshot / China | Partial | Long-horizon agent |
| Qwen | Alibaba / China | Yes (Apache 2.0) | Broadest SKU zoo, multimodal |
| Baidu Ernie / MiniMax | China | Mixed | Domestic chat / multimodal |
Zhipu's positioning inside the "AI four dragons" (DeepSeek / Kimi / Qwen / Zhipu): the most institutionally embedded in the Chinese state — which is both a moat (locked-in buyers, government endorsement) and a constraint (less able to compete on open-weight permissiveness or pure cost the way DeepSeek does).
4. Unique Observations
Frontier training cost (GLM-5 / 6): not disclosed. Zhipu's compute access has been constrained by US export controls on H100 / H200 shipments to mainland China, forcing reliance on H800 / H20 (bandwidth-limited China-export NVIDIA SKUs) and increasingly on domestic Chinese accelerators (Huawei Ascend, etc.). This compute disadvantage relative to US labs — and even relative to DeepSeek, which secured large H800 fleets early via High-Flyer hedge-fund money — means GLM-5-class training likely runs in the mid-tens-of-millions to low-hundreds-of-millions-of-dollars envelope on suboptimal hardware. The architecture-efficiency-over-brute-force playbook DeepSeek pioneered is even more relevant for Zhipu.
API pricing — top SKU (GLM-4-Plus / GLM-5): BigModel.cn lists GLM-4-Plus / GLM-5 in the sub-RMB-per-M-token tier domestically, translating to roughly $0.5–$2/M input · $1.5–$5/M output when converted. This places Zhipu in the DeepSeek price band, materially below US closed-frontier labs (OpenAI / Anthropic / Google). Pricing in China is discounted further for domestic enterprise buyers and government tenders.
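The dollar figures above are straight currency conversions of RMB list prices. A sketch of the arithmetic, where the RMB prices and the exchange rate are assumed illustrative inputs rather than quotes from BigModel.cn's actual price sheet:

```python
# Illustrative RMB-to-USD conversion for per-million-token list prices.
# Both the FX rate and the sample RMB prices are assumptions.
RMB_PER_USD = 7.2  # assumed exchange rate

def rmb_per_m_to_usd(rmb_price: float) -> float:
    """Convert an RMB-per-million-token price to USD per million tokens."""
    return rmb_price / RMB_PER_USD

# e.g. assumed 5 RMB/M input and 15 RMB/M output prices:
print(round(rmb_per_m_to_usd(5.0), 2))   # 0.69 USD/M input
print(round(rmb_per_m_to_usd(15.0), 2))  # 2.08 USD/M output
```

At this assumed rate, single-digit-RMB per-million-token pricing lands squarely in the sub-$2/M band the text describes.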
Pricing vs estimated unit cost — gross margin signal: with smaller MoE active-param counts and constrained-but-dedicated H800 / Ascend fleets, per-token marginal inference cost is in the same band as DeepSeek's. At ~$2/M output list, gross margin on cache-warm API is plausibly in the 60–80% range — lower than US-lab fat margins because the price floor is lower, but still profitable.
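The margin claim above is simple arithmetic. A sketch using the ~$2/M output list price from the text; the per-token serving cost is an assumed input chosen to bracket the stated 60–80% band:

```python
def gross_margin(price_per_m: float, cost_per_m: float) -> float:
    """Gross margin fraction on API tokens sold at a given list price."""
    return (price_per_m - cost_per_m) / price_per_m

# $2/M output list price (from the text); a $0.40-$0.80/M marginal
# serving cost is an assumption, not a disclosed figure.
print(f"{gross_margin(2.0, 0.80):.0%}")  # low end of the band
print(f"{gross_margin(2.0, 0.40):.0%}")  # high end of the band
```

The point of the sketch: margin is highly sensitive to list price at a low price floor, which is why Chinese-band pricing caps margins well below US closed-lab levels even at similar unit costs.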
Open vs closed strategy: hybrid, similar to Mistral. ChatGLM-6B / GLM-4-9B / GLM-4-32B / CogVideoX are open weights under permissive terms (custom model licenses, mostly commercially usable). The frontier GLM-5 / GLM-6 variants are reserved for BigModel API monetization. The strategic logic matches Mistral and Qwen-Max: open the ecosystem, monetize the closed top tier.
Chinese government + commercial mix: Zhipu has been explicitly placed by Beijing among the "AI Six Tigers" (AI六小虎), the Chinese AI startups the state has signaled support for. State-backed funds (Beijing AI Industry Investment Fund, Hangzhou City, Shanghai AICityX, Zhuhai Hi-Tech Zone, Chengdu Hi-Tech Zone) have all participated in Zhipu rounds. This is structurally different from DeepSeek (privately funded by High-Flyer), Kimi (Alibaba / Tencent VC), and Qwen (an Alibaba subsidiary). For Western analysts, Zhipu is the clearest single vantage point on Chinese AI policy execution.
US sanctions exposure: 2025-01, the US Department of Commerce added Zhipu to the Entity List, citing concerns about military-civil fusion ties. This restricts US-origin technology export to Zhipu (most directly: NVIDIA datacenter GPUs, advanced lithography). Operationally, Zhipu's compute roadmap depends on (a) gray-market H800 / H20 supply, (b) accelerated migration to Huawei Ascend / domestic silicon. This is the single biggest external risk to the model trajectory and a reason Zhipu has emphasized architecture / efficiency over scale.
IPO rumor: 2024–2025 reporting documented multiple rounds of Hong Kong / Shanghai STAR Market IPO speculation for Zhipu, with a reported pre-IPO raise target of $1B+ and post-IPO valuation discussions in the $10B+ range. As of 2026-05, no IPO has closed publicly, but Zhipu remains the most likely Chinese AI lab to attempt a public listing (DeepSeek is privately held by Liang / High-Flyer with no exit pressure; Kimi's parent Moonshot is also still private).
Vertical integration: none on infrastructure. Zhipu does not own DCs or silicon. It rents capacity from Chinese hyperscalers (Alibaba Cloud, Tencent Cloud, Baidu Cloud, Huawei Cloud) and increasingly deploys on Ascend-based clusters as a state-aligned move.
5. Financials / Funding
| Date | Round / Event | Amount | Valuation |
|---|---|---|---|
| 2019 | Founded as Tsinghua spin-out | — | — |
| 2022 | Pre-A | — | — |
| 2023 | Multiple rounds (Tencent, Sequoia China, Hillhouse, Meituan) | reported >$340M cumulative | growing |
| 2024 | Multiple rounds (Alibaba, Saudi Prosperity7, Beijing AI fund) | reported $400M+ | $2B+ |
| 2024–2025 | State-fund rounds (Beijing / Hangzhou / Shanghai / Zhuhai / Chengdu) | reported multi-hundred-millions | toward $3B+ |
| 2025-01 | US Entity List designation | — | — |
| 2025–2026 | IPO preparation rumored | — | reported targets $10B+ |
- Total raised: $1.5B+ disclosed cumulatively as of 2025.
- Revenue: not publicly disclosed; estimated low-hundreds-of-millions RMB scale, with government / SOE contracts as the largest single revenue line.
6. People & Relationships
- CEO: Zhang Peng (张鹏) — Tsinghua KEG (Knowledge Engineering Group) lineage.
- Chairman: Liu Debing (刘德兵).
- Chief scientist / academic anchor: Tang Jie (唐杰) — Tsinghua professor, GLM architecture lead.
- Investors: Tencent, Alibaba, Sequoia China, Hillhouse, Meituan, Saudi Prosperity7, plus a long list of Chinese state and municipal AI funds (Beijing AI Industry Investment Fund, Shanghai AICityX, Hangzhou City, Zhuhai HTI, Chengdu HTI, etc.).
- Cloud / infra partners: Alibaba Cloud, Tencent Cloud, Huawei Cloud, China Telecom — domestic cloud distribution.
- Competitors: DeepSeek, Kimi (Moonshot), Qwen, plus Baidu Ernie, MiniMax, StepFun, 01.AI.