Zhipu AI Releases GLM-5-Turbo Model with 2-3x Speed Increase, API Pricing Raised 20%

Gate News: On March 16, Zhipu AI released GLM-5-Turbo, a speed-optimized variant of its flagship GLM-5 model tailored for OpenClaw agent scenarios. The model uses a 744B-parameter MoE architecture, runs 2 to 3 times faster than GLM-5, and supports a context length of around 200K tokens with a maximum output of 128K tokens. Zhipu AI describes GLM-5-Turbo as an “OpenClaw native model,” with targeted optimizations for tool-invocation stability, complex instruction decomposition, temporal understanding, and high-throughput long-chain execution efficiency. In ZClawBench evaluations, GLM-5-Turbo significantly outperforms GLM-5 in OpenClaw scenarios and leads several mainstream models on multiple key tasks.

On the commercial side, Zhipu AI has raised the API price for GLM-5-Turbo by 20%. It also launched subscription plans for individual users: a trial monthly plan at 39 RMB for 35 million tokens, and an advanced monthly plan at 99 RMB for 100 million tokens. Within the GLM Coding Plan, the usage limit for GLM-5-Turbo has been tripled, and off-peak hours now carry the same high-capacity quota as GLM-4.7; this promotion runs through April 30. GLM-5-Turbo is also available on OpenRouter, priced at $0.96 per million input tokens and $3.20 per million output tokens.
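For readers who want to try the model or estimate spend, here is a minimal sketch that calls GLM-5-Turbo through OpenRouter's OpenAI-compatible API and tallies the cost of a single request from the listed prices. The model slug z-ai/glm-5-turbo is an assumption, not something confirmed in the announcement; check OpenRouter's model catalog for the actual identifier.

# Minimal sketch: call GLM-5-Turbo via OpenRouter's OpenAI-compatible
# endpoint and estimate the request cost from the listed prices.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

PRICE_IN = 0.96 / 1_000_000   # USD per input token (listed: $0.96/M)
PRICE_OUT = 3.20 / 1_000_000  # USD per output token (listed: $3.20/M)

response = client.chat.completions.create(
    model="z-ai/glm-5-turbo",  # assumed slug; verify on OpenRouter
    messages=[{"role": "user",
               "content": "Summarize MoE architectures in two sentences."}],
)

usage = response.usage
cost = usage.prompt_tokens * PRICE_IN + usage.completion_tokens * PRICE_OUT
print(response.choices[0].message.content)
print(f"~${cost:.6f} for {usage.prompt_tokens} input / "
      f"{usage.completion_tokens} output tokens")

At the listed rates, filling the full ~200K-token context would cost roughly $0.19 in input tokens alone, before any output is generated.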
