All You Need To Know About GLM-5 API

Overview

Model Provider: Zai-org
Model Type: Code/LLM
State: Ready

Key Specs

Quantization: FP8
Parameters: 744B
Context: 200K
Pricing: $0.90 input / $3.10 output / $0.20 cache
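To get a feel for what these rates mean in practice, here is a rough cost sketch. It assumes the listed prices are USD per million tokens and that cached input tokens are billed at the cache rate instead of the input rate; the page does not state either of these, so treat both as assumptions.

```python
# Rough cost estimator for a GLM-5 call on Canopy Wave.
# ASSUMPTION: prices are USD per 1M tokens (the page does not state the unit),
# and cached input tokens are billed at the cache rate rather than the input rate.

PRICE_INPUT = 0.90   # $ per 1M input tokens (assumed unit)
PRICE_OUTPUT = 3.10  # $ per 1M output tokens (assumed unit)
PRICE_CACHE = 0.20   # $ per 1M cached input tokens (assumed unit)

def estimate_cost(input_tokens, output_tokens, cached_tokens=0):
    """Return the estimated USD cost of one request."""
    billable_input = input_tokens - cached_tokens
    return (billable_input * PRICE_INPUT
            + output_tokens * PRICE_OUTPUT
            + cached_tokens * PRICE_CACHE) / 1_000_000

# Example: a 10K-token prompt (8K of it cached) and a 1K-token completion.
print(f"${estimate_cost(10_000, 1_000, cached_tokens=8_000):.6f}")  # → $0.006500
```

Under these assumptions, cache hits dominate the savings on long, repeated prompts: the 8K cached tokens above cost $0.0016 instead of $0.0072.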

Introduction

GLM-5 is built for complex systems engineering and long-horizon agentic tasks. Compared to GLM-4.5, GLM-5 scales from 355B parameters (32B active) to 744B parameters (40B active), and increases pre-training data from 23T to 28.5T tokens. GLM-5 also integrates DeepSeek Sparse Attention (DSA), substantially reducing deployment cost while preserving long-context capability.

Reinforcement learning aims to bridge the gap between competence and excellence in pre-trained models. However, deploying it at scale for LLMs is challenging due to RL training inefficiency. To this end, we developed slime, a novel asynchronous RL infrastructure that substantially improves training throughput and efficiency, enabling more fine-grained post-training iterations. With advances in both pre-training and post-training, GLM-5 delivers significant improvements over GLM-4.7 across a wide range of academic benchmarks and achieves best-in-class performance among open-source models on reasoning, coding, and agentic tasks, closing the gap with frontier models.

GLM-5 API Usage

Model endpoint: zai/glm-5


curl -X POST https://inference.canopywave.io/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $CANOPYWAVE_API_KEY" \
  -d '{
    "model": "zai/glm-5",
    "messages": [
      {"role": "user", "content": "tell me a story"}
    ],
    "max_tokens": 1000,
    "temperature": 0.7
  }'
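The same request can be made from Python. The sketch below uses only the standard library and mirrors the URL, headers, and JSON body of the curl example; the response shape (`choices[0]["message"]["content"]`) assumes the endpoint is OpenAI-compatible, which the `/v1/chat/completions` path suggests but the page does not state.

```python
import json
import os
import urllib.request

API_URL = "https://inference.canopywave.io/v1/chat/completions"

def build_payload(prompt, model="zai/glm-5", max_tokens=1000, temperature=0.7):
    """Build the JSON body shown in the curl example."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def chat(prompt):
    """Send one chat completion request. Requires CANOPYWAVE_API_KEY to be set."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['CANOPYWAVE_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # ASSUMPTION: OpenAI-compatible response shape.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("tell me a story"))
```

Keeping the payload builder separate from the network call makes it easy to swap in an SDK or add parameters such as `stream` later without touching the transport code.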