    Liquid: LFM 40B MoE

    liquid/lfm-40b

    Created Sep 30, 2024 · 32,768 context

    Liquid's 40.3B Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in the theory of dynamical systems.

    LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

    See the launch announcement for benchmarks and more info.
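    As a sketch of how this model can be called: OpenRouter exposes an OpenAI-compatible chat completions endpoint, and the model is addressed by the slug shown above. The endpoint URL and payload shape below follow OpenRouter's public API; the prompt and the `OPENROUTER_API_KEY` environment variable name are illustrative assumptions.

    ```python
    import json
    import os
    import urllib.request

    # Endpoint per OpenRouter's OpenAI-compatible API; model slug from this page.
    API_URL = "https://openrouter.ai/api/v1/chat/completions"
    MODEL = "liquid/lfm-40b"

    def build_request(prompt: str) -> dict:
        """Assemble a chat-completions payload for the LFM 40B MoE model."""
        return {
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
        }

    payload = build_request("Summarize what a Mixture of Experts model is.")

    # Only send the request when an API key is actually configured.
    api_key = os.environ.get("OPENROUTER_API_KEY")
    if api_key:
        req = urllib.request.Request(
            API_URL,
            data=json.dumps(payload).encode(),
            headers={
                "Authorization": f"Bearer {api_key}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp)["choices"][0]["message"]["content"])
    ```

    Note that the 32,768-token context window bounds the combined length of the messages and the completion in a single request.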

    Uptime stats for LFM 40B MoE across all providers