XVERSE-MoE-A36B vs JetMoE-8B

| | XVERSE-MoE-A36B | JetMoE-8B |
| --- | --- | --- |
| Launched | | |
| Pricing Model | Free | Free |
| Starting Price | | |
| Tech used | | |
| Tag | Content Creation, Story Writing, Text Generators | Text Generators, Answer Generators, Chatbot Builder |
| Global Rank | | |
| Country | | |
| Month Visit | | |
Estimated traffic data from Similarweb
Yuan2.0-M32 - Yuan2.0-M32 is a Mixture-of-Experts (MoE) language model with 32 experts, of which 2 are active per token (see the routing sketch after this list).
DeepSeek Chat - DeepSeek-V2: a 236B-parameter MoE model. Leading performance. Ultra-affordable. Unparalleled experience. Chat and API upgraded to the latest model.
EXAONE 3.5 - Discover EXAONE 3.5 by LG AI Research: a suite of bilingual (English & Korean) instruction-tuned generative models ranging from 2.4B to 32B parameters. They support long contexts of up to 32K tokens, with top-notch performance in real-world scenarios.
Yi-VL-34B - Yi Visual Language (Yi-VL) model is the open-source, multimodal version of the Yi Large Language Model (LLM) series, enabling content comprehension, recognition, and multi-round conversations about images.
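The "32 experts, 2 active" design mentioned for Yuan2.0-M32 is the common top-k MoE routing pattern: a small router scores every expert for each token, only the two highest-scoring experts are executed, and their outputs are combined with normalised gate weights. The sketch below illustrates that pattern with NumPy; the softmax router, expert shapes, and gating details are illustrative assumptions for this example, not the actual routing implementation of Yuan2.0-M32 or any other model listed here.

```python
# Minimal sketch of top-2 Mixture-of-Experts routing (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 32   # total experts in the layer
TOP_K = 2          # experts activated per token
D_MODEL = 64       # hidden size (arbitrary for the sketch)

# Router: a single linear layer mapping a token to one logit per expert.
router_w = rng.normal(scale=0.02, size=(D_MODEL, NUM_EXPERTS))

# Each "expert" here is just a random linear map standing in for an expert FFN.
experts = [rng.normal(scale=0.02, size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-2 experts and mix their outputs.

    x: (num_tokens, D_MODEL) token activations.
    """
    logits = x @ router_w                               # (tokens, experts)
    top_idx = np.argsort(logits, axis=-1)[:, -TOP_K:]   # indices of the top-2 experts
    top_logits = np.take_along_axis(logits, top_idx, axis=-1)

    # Softmax over only the selected logits, so the two gate weights sum to 1.
    gates = np.exp(top_logits - top_logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):                         # per-token dispatch
        for k in range(TOP_K):
            e = top_idx[t, k]
            out[t] += gates[t, k] * (x[t] @ experts[e])  # only 2 of 32 experts run
    return out

tokens = rng.normal(size=(4, D_MODEL))
print(moe_layer(tokens).shape)  # (4, 64)
```

Because only 2 of the 32 experts run for any given token, the compute per token stays close to that of a much smaller dense model even though the total parameter count is large; this is the trade-off such sparse MoE models rely on.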