
Z.ai GLM-5.1

Z.ai (formerly Zhipu AI) has released GLM-5.1, an open-weight 744-billion-parameter Mixture-of-Experts model with 40 billion active parameters. The model immediately took the top spot on SWE-Bench Pro with a score of 58.4, edging out GPT-5.4 (57.7) and Claude Opus 4.6 (57.3). Built on DeepSeek Sparse Attention with a 200K-token context window and a 131K-token maximum output, GLM-5.1 is engineered for sustained autonomous coding agents capable of running plan-execute-test-fix-optimize loops for up to eight hours. The weights are available on Hugging Face under the permissive MIT license, allowing full commercial use.
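To put the Mixture-of-Experts figures in perspective, here is a quick sketch of the sparsity arithmetic implied by the numbers quoted above (744B total parameters, 40B active per token). The variable names are illustrative, not part of any official API.

```python
# Sparsity arithmetic for GLM-5.1's MoE design, using the figures
# quoted in the article: 744B total parameters, 40B active per token.
total_params = 744e9   # all expert + shared parameters stored in the model
active_params = 40e9   # parameters actually activated for each token

# Fraction of the model engaged per forward pass
active_fraction = active_params / total_params
print(f"Active per token: {active_fraction:.1%}")
```

Only about 5.4% of the model's weights are engaged for any given token, which is how MoE models keep per-token compute closer to that of a ~40B dense model while retaining 744B parameters of capacity.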