JetMoE-8B vs Molmo AI

| | JetMoE-8B | Molmo AI |
| --- | --- | --- |
| Launched | | 2024-09 |
| Pricing Model | Free | Free Trial |
| Starting Price | | |
| Tech used | | Cloudflare CDN, Next.js, Gzip, OpenGraph, Webpack, YouTube |
| Tag | Text Generators, Answer Generators, Chatbot Builder | Data Analysis, Data Science |
| Global Rank | | 1,382,983 |
| Country | | United States |
| Monthly Visits | | 22,012 |
Estimated traffic data from Similarweb
XVERSE-MoE-A36B - A multilingual large language model developed by XVERSE Technology Inc.
Yuan2.0-M32 - A Mixture-of-Experts (MoE) language model with 32 experts, of which 2 are active.
OpenBMB - An open-source community building a center for large-scale pre-trained language models, along with tools to accelerate training, tuning, and inference of models with over 10 billion parameters.
Gemma 3 270M - A compact, highly efficient model for specialized tasks; fine-tune it for precise instruction following and low-cost, on-device deployment.