RWKV v5: Eagle 7B

recursal/eagle-7b

Updated Jan 29 · 10,000 context
$0 / 1M input tokens · $0 / 1M output tokens

Eagle 7B is trained on 1.1 trillion tokens across 100+ world languages (70% English, 15% multilingual, 15% code).

  • Built on the RWKV-v5 architecture, a linear transformer with 10-100x+ lower inference cost (see the sketch after this list)
  • Ranks as the world's greenest 7B model (per token)
  • Outperforms all 7B-class models on multilingual benchmarks
  • Approaches the English-eval performance of Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?)
  • Trades blows with MPT-7B (1T) on English evals
  • All while being an "Attention-Free Transformer"
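Below is a minimal sketch of the kind of linear, attention-free recurrence that RWKV-style models rely on. The single-head formulation, decay factor, and variable names are illustrative assumptions rather than the exact RWKV-v5 parameterization; the point is the cost profile: each token updates a fixed-size state instead of attending over the whole history, so per-token inference work stays constant as context grows.

```python
import numpy as np

def linear_attention_recurrence(q, k, v, decay=0.99):
    """Toy single-head, linear-attention-style recurrence.

    q, k, v: arrays of shape (seq_len, d). Instead of building a full
    seq_len x seq_len attention matrix, we keep a fixed-size (d, d)
    state and update it once per token, so per-token cost is O(d^2)
    regardless of sequence length. The decay and mixing here are
    simplified stand-ins for RWKV-v5's own gating/decay scheme.
    """
    d = q.shape[1]
    state = np.zeros((d, d))              # fixed-size recurrent state
    outputs = []
    for t in range(q.shape[0]):
        # Fold the current key/value pair into the state, with decay.
        state = decay * state + np.outer(k[t], v[t])
        # Read out with the query (RWKV calls this "receptance").
        outputs.append(q[t] @ state)
    return np.stack(outputs)

# Example: 16 tokens, 8-dimensional head.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((16, 8)) for _ in range(3))
print(linear_attention_recurrence(q, k, v).shape)  # (16, 8)
```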

Eagle 7B models are provided for free by Recursal.AI for the beta period, until the end of March 2024.



OpenRouter first attempts the primary provider, and falls back to others if it encounters an error. Prices displayed per million tokens.
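As a quick usage sketch, the model can be called through OpenRouter's OpenAI-compatible chat completions endpoint with the slug from this listing. The prompt below is illustrative, and the example assumes an `OPENROUTER_API_KEY` environment variable is set.

```python
import os
import requests

# Sketch of a request against OpenRouter's OpenAI-compatible endpoint,
# using the model slug from this listing (recursal/eagle-7b).
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "recursal/eagle-7b",
        "messages": [
            {"role": "user", "content": "Summarize RWKV in one sentence."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```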