| | Yuan2.0-M32 | MiniCPM-2B |
| --- | --- | --- |
| Launched | | |
| Pricing Model | Free | Free |
| Starting Price | | |
| Tech used | | |
| Tag | Code Generation, Answer Generators, Question Answering | Language Learning |
| Global Rank | | |
| Country | | |
| Month Visit | | |
Estimated traffic data from Similarweb
XVERSE-MoE-A36B - A multilingual large language model developed by XVERSE Technology Inc.
JetMoE-8B - JetMoE-8B was trained for less than $0.1 million yet outperforms LLaMA2-7B from Meta AI, which has multi-billion-dollar training resources. LLM training can be much cheaper than generally thought.
Qwen2.5-LLM - Qwen2.5 series language models offer enhanced capabilities with larger datasets, more knowledge, better coding and math skills, and closer alignment to human preferences. Open-source and available via API.
DeepSeek Chat - DeepSeek-V2: a 236-billion-parameter MoE model. Leading performance. Ultra-affordable. Chat and API upgraded to the latest model.
Hunyuan-MT-7B - Hunyuan-MT-7B: Open-source AI machine translation. Master 33+ languages with unrivaled contextual & cultural accuracy. WMT2025 winner, lightweight & efficient.
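Both models compared above, and several of the alternatives listed, are open-weights releases that can be run locally with the Hugging Face `transformers` library. The sketch below is a minimal example, not an official quickstart; the repository ID `openbmb/MiniCPM-2B-sft-bf16` and the prompt are assumptions, so substitute the ID and chat template given on the model card you actually use.

```python
# Minimal sketch: loading an open-weights model from the comparison with transformers.
# The repo ID below is an assumption for illustration; check the vendor's model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-2B-sft-bf16"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # bf16 weights keep a 2B model small in memory
    trust_remote_code=True,       # the repo ships its own modeling code
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```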