| Field | Tool 1 | Tool 2 |
| --- | --- | --- |
| Launched | 2023 | 1999-07 |
| Pricing Model | Free | |
| Starting Price | | |
| Tech used | | |
| Tag | | |
| Global Rank | 0 | 73 |
| Country | | Russian Federation |
| Month Visit | 0 | 506,498,876 |
GPT-NeoX-20B - GPT-NeoX-20B is a 20 billion parameter autoregressive language model trained on the Pile using the GPT-NeoX library.
Alfred-40B-0723 - Alfred-40B-0723 is a fine-tuned version of Falcon-40B, obtained with Reinforcement Learning from Human Feedback (RLHF).
Megatron-LM - Ongoing research training transformer models at scale
TinyLlama - The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.