
RWKV v5: Eagle 7B

recursal/eagle-7b

Created Jan 29, 2024 · 10,000 context

Eagle 7B is trained on 1.1 trillion tokens across 100+ world languages (70% English, 15% multilingual, 15% code).

  • Built on the RWKV-v5 architecture (a linear transformer with 10-100x+ lower inference cost)
  • Ranks as the world's greenest 7B model (per token)
  • Outperforms all 7B-class models on multilingual benchmarks
  • Approaches the performance of Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?) on English evals
  • Trades blows with MPT-7B (1T) on English evals
  • All while being an "Attention-Free Transformer"

Eagle 7B models are provided for free by Recursal.AI during the beta period, which runs until the end of March 2024.

Find out more here
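Since the model is served through OpenRouter, it can be queried via the OpenAI-compatible chat completions endpoint. Below is a minimal sketch, assuming an `OPENROUTER_API_KEY` environment variable; the model ID `recursal/eagle-7b` is taken from this page, and everything else is standard-library Python.

```python
# Hedged sketch: querying Eagle 7B through OpenRouter's chat completions API.
# OPENROUTER_API_KEY is a placeholder environment variable, not something
# this page defines; the model ID comes from the page above.
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an HTTP request for a single-turn chat completion."""
    payload = {
        "model": "recursal/eagle-7b",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    key = os.environ.get("OPENROUTER_API_KEY", "")
    req = build_request("Summarize RWKV-v5 in one sentence.", key)
    # Uncomment to actually send the request (requires a valid API key):
    # with urllib.request.urlopen(req) as resp:
    #     body = json.loads(resp.read())
    #     print(body["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI request shape, any OpenAI-compatible client can be pointed at the same URL instead of hand-building requests.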

Tag: rnn

Recent activity on Eagle 7B

Total usage per day on OpenRouter: not enough data to display yet.