RWKV v5 World 3B

Updated Dec 10 · 10,000 context
$0/M input tokens · $0/M output tokens

RWKV is an RNN (recurrent neural network) with transformer-level performance. It aims to combine the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" context length, and free sentence embedding.
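The "fast inference, low VRAM" claim follows from the recurrent formulation: each new token updates a fixed-size state rather than attending over an ever-growing key/value cache. The toy recurrence below is a minimal illustrative sketch of this linear-attention-style idea, not the actual RWKV-5 update (which adds learned decay and bonus terms); it only shows why per-token cost and memory stay constant regardless of context length.

```python
import numpy as np

def recurrent_step(state, q, k, v):
    # Toy linear-attention recurrence, NOT the real RWKV-5 formula:
    # fold the new key/value pair into a fixed-size state matrix,
    # then read out with the query. Per-token cost is O(d^2),
    # independent of how many tokens came before.
    state = state + np.outer(k, v)
    out = q @ state
    return state, out

d = 4  # toy head dimension
state = np.zeros((d, d))
rng = np.random.default_rng(0)

# Process 1000 tokens; the state never grows past (d, d).
for _ in range(1000):
    q, k, v = rng.standard_normal((3, d))
    state, out = recurrent_step(state, q, k, v)

print(state.shape)  # (4, 4) -- constant, unlike a transformer KV cache
print(out.shape)    # (4,)
```

A transformer, by contrast, must keep keys and values for all prior tokens, so its memory and per-token attention cost grow with sequence length.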

RWKV-5 is trained on 100+ world languages (70% English, 15% multilang, 15% code).

RWKV 3B models are provided for free by Recursal.AI for the beta period. More details here.