What is MOSS?
MOSS is an open-source conversational language model that supports both Chinese and English and can use various plugins. The moss-moon series has 16 billion parameters and can run on a single A100/A800 GPU or two RTX 3090 GPUs at FP16 precision, or on a single 3090 at INT4/INT8 precision. The MOSS base model is pretrained on approximately 700 billion tokens of Chinese, English, and code. It is then fine-tuned on dialogue instructions and plugin-augmented data, and further aligned with human-preference training, giving it the ability to hold multi-turn conversations and to invoke various plugins.
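The hardware claims above follow from the model's weight footprint at each precision. A minimal back-of-the-envelope sketch in Python (an illustration, not part of MOSS itself; it counts weights only and ignores activations, KV cache, and framework overhead):

```python
def weights_vram_gib(params_billion: float, bits_per_param: int) -> float:
    """Approximate VRAM needed for the model weights alone
    (ignores activations, KV cache, and framework overhead)."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30  # bytes -> GiB

# moss-moon has roughly 16 billion parameters
for label, bits in (("FP16", 16), ("INT8", 8), ("INT4", 4)):
    print(f"{label}: ~{weights_vram_gib(16, bits):.1f} GiB")
```

At FP16 the roughly 30 GiB of weights exceeds a single 24 GB RTX 3090 but fits an A100/A800 or two 3090s, while INT8 (~15 GiB) and INT4 (~7.5 GiB) fit comfortably on one 3090, consistent with the description above.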
More information on MOSS
Launched
2023
Pricing Model
Free
Month Visit
<5k
MOSS Alternatives
- ChatGLM-6B is an open bilingual Chinese-English model with 6.2 billion parameters, currently optimized for Chinese question answering and dialogue.
- Baichuan-7B is a bilingual Chinese-English model that delivers strong performance in language understanding and text generation, with versatile applications in human-computer communication.
- JetMoE-8B was trained for under $0.1 million yet outperforms LLaMA2-7B from Meta AI, which had multi-billion-dollar training resources, showing that LLM training can be far cheaper than commonly assumed.
- MiniCPM is an end-side (on-device) LLM developed by ModelBest Inc. and TsinghuaNLP, with only 2.4B parameters excluding embeddings (2.7B in total).