MOSS

Introducing MOSS: an open-source language model supporting Chinese & English with 16B parameters. Run it on a single GPU for seamless conversations & plugin support.

What is MOSS?

MOSS is an open-source conversational language model that supports both Chinese and English as well as a range of plugins. The moss-moon series has 16 billion parameters and can run on a single A100/A800 GPU or two RTX 3090 GPUs at FP16 precision, or on a single RTX 3090 at INT4/INT8 precision. The MOSS base model is pretrained on approximately 700 billion tokens of Chinese, English, and code data. It is further fine-tuned with dialogue instructions, plugin-augmented learning, and human preference training, giving it the ability to hold multi-turn conversations and to use various plugins.
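The hardware claims above follow from simple weight-memory arithmetic. As a minimal sketch (counting weights only and ignoring activations and KV cache, which add real overhead in practice), a 16B-parameter model needs roughly 2 bytes per parameter at FP16, 1 at INT8, and 0.5 at INT4:

```python
# Back-of-envelope weight-memory estimate for a 16B-parameter model.
# Assumption: weights only; activations and KV cache are ignored.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_memory_gib(n_params: float, precision: str) -> float:
    """Approximate GiB needed just to hold the model weights."""
    return n_params * BYTES_PER_PARAM[precision] / 2**30

params = 16e9
for precision, gpu, vram_gib in [("fp16", "A100 80GB", 80),
                                 ("int8", "RTX 3090 24GB", 24),
                                 ("int4", "RTX 3090 24GB", 24)]:
    need = weight_memory_gib(params, precision)
    fits = need < vram_gib
    print(f"{precision}: ~{need:.1f} GiB of weights -> fits on one {gpu}: {fits}")
```

At FP16 the weights alone come to about 30 GiB, which exceeds a single 3090's 24 GB but fits an A100 80GB (or two 3090s combined); at INT8 (~15 GiB) and INT4 (~7.5 GiB) a single 3090 suffices, matching the requirements stated above.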

More information on MOSS

Launched: 2023
Pricing Model: Free
Month Visit: <5k
MOSS was manually vetted by our editorial team and was first featured on September 4th 2024.

MOSS Alternatives

  1. ChatGLM-6B is an open bilingual (Chinese & English) model with 6.2 billion parameters, currently optimized for Chinese QA and dialogue.

  2. Enhance your NLP capabilities with Baichuan-7B, a model that excels at language processing and text generation. It offers bilingual capabilities, versatile applications, and strong performance in human-computer communication.

  3. JetMoE-8B was trained for less than $0.1 million yet outperforms LLaMA2-7B from Meta AI, which has multi-billion-dollar training resources. LLM training can be much cheaper than generally thought.

  4. GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)

  5. MiniCPM is an End-Side LLM developed by ModelBest Inc. and TsinghuaNLP, with only 2.4B parameters excluding embeddings (2.7B in total).