GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)

What is GLM-130B?

GLM-130B is an open bilingual (English and Chinese) pre-trained model with 130 billion parameters, designed for fast inference. It outperforms comparable models on both English and Chinese language tasks, and all reported results can be reproduced with the open-sourced code and model checkpoints. The codebase is cross-platform and supports training and inference on a variety of hardware configurations.

Key Features:

  1. 🌐 Bilingual Support: GLM-130B supports both English and Chinese languages.

  2. ⚡ Fast Inference: The model supports fast inference on a single server, running up to 2.5X faster through the FasterTransformer library (see the memory sketch after this list).

  3. 🔄 Reproducibility: All results can be easily reproduced with open-sourced code and model checkpoints.
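As a rough illustration of why single-server inference is feasible at this scale, here is a back-of-the-envelope sketch of the weight memory a 130-billion-parameter model needs at different precisions. The bytes-per-parameter figures are standard; the lower-precision rows approximate the quantized inference modes described in the GLM-130B repository, while the server configurations are illustrative assumptions rather than official hardware requirements.

```python
# Back-of-the-envelope weight-memory estimate for a 130B-parameter model.
# Activations, KV cache, and framework overhead are ignored, so real
# requirements are somewhat higher than these numbers suggest.

N_PARAMS = 130e9  # GLM-130B parameter count

BYTES_PER_PARAM = {
    "FP16": 2.0,  # half precision
    "INT8": 1.0,  # 8-bit quantized weights
    "INT4": 0.5,  # 4-bit quantized weights
}

# Illustrative single-server GPU configurations (assumed, not official specs).
SERVERS_GB = {
    "8 x A100 40G": 8 * 40,
    "8 x V100 32G": 8 * 32,
    "4 x RTX 3090 24G": 4 * 24,
}

for precision, nbytes in BYTES_PER_PARAM.items():
    total_gb = N_PARAMS * nbytes / 1e9
    fits = [name for name, gb in SERVERS_GB.items() if gb >= total_gb]
    print(f"{precision}: ~{total_gb:.0f} GB of weights; "
          f"fits: {', '.join(fits) if fits else 'none of the above'}")
```

The arithmetic shows why quantization matters here: FP16 weights alone take roughly 260 GB, while 4-bit quantization brings them down to about 65 GB, within reach of much smaller servers.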

Use Cases:

  1. Language Tasks: GLM-130B outperforms comparable models on benchmarks such as LAMBADA and MMLU, as well as on zero-shot Chinese datasets such as CLUE, making it well suited for language-related applications.

  2. Web-Enhanced Question Answering: The software enables efficient and accurate web-enhanced question answering, making it valuable for information retrieval tasks.

  3. Dialogue Language Modeling: GLM-130B can be used for bilingual dialogue language modeling, helping generate conversational responses in both English and Chinese (see the sketch after this list).
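As an illustration of the bilingual dialogue use case, here is a minimal sketch. GLM-130B itself is served through the scripts in its own repository, so this example instead uses the smaller, related ChatGLM-6B checkpoint and the Hugging Face interface documented in its model card; treat the model name, the `chat` method, and the hardware assumptions as specific to that checkpoint rather than to GLM-130B.

```python
# Minimal bilingual dialogue sketch using the related ChatGLM-6B checkpoint.
# Assumes a CUDA GPU with roughly 13 GB free for the FP16 weights; the API
# below follows the ChatGLM-6B model card, not GLM-130B itself.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# Multi-turn chat: the model keeps context through the returned history.
history = []
for prompt in ["Hello, who are you?", "用一句话介绍一下GLM-130B"]:
    response, history = model.chat(tokenizer, prompt, history=history)
    print(f"> {prompt}\n{response}\n")
```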


GLM-130B is a powerful open bilingual pre-trained model with strong performance and versatility. Its fast inference, reproducibility, and English-Chinese support make it a valuable tool for a wide range of applications, including language tasks, web-enhanced question answering, and dialogue language modeling. With open code, open checkpoints, and straightforward integration, users can achieve efficient and accurate results in their AI projects.


More information on GLM-130B

GLM-130B was manually vetted by our editorial team and was first featured on September 4th, 2024.

GLM-130B Alternatives

  1. An Open Bilingual Pre-Trained Model

  2. ChatGLM-6B is an open Chinese-English model with 6.2 billion parameters, currently optimized for Chinese QA and dialogue.

  3. A large-scale 7B pretrained language model developed by BaiChuan-Inc.

  4. MiniCPM is an End-Side LLM developed by ModelBest Inc. and TsinghuaNLP, with only 2.4B parameters excluding embeddings (2.7B in total).

  5. A new development paradigm based on MaaS (Model-as-a-Service), unleashing AI with a universal model service.