Liquid: LFM 40B MoE

liquid/lfm-40b

Created Sep 30, 2024 · 32,768 context

Liquid's 40.3B-parameter Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in the theory of dynamical systems.

LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

See the launch announcement for benchmarks and more info.
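The model is served through OpenRouter's OpenAI-compatible chat completions endpoint under the slug liquid/lfm-40b shown above. Below is a minimal sketch of a request in Python, assuming an OpenRouter API key is available in the OPENROUTER_API_KEY environment variable; the prompt text and max_tokens value are illustrative, not from this page.

# Minimal sketch: querying liquid/lfm-40b via OpenRouter's
# OpenAI-compatible chat completions API (assumption: standard
# OpenRouter setup with OPENROUTER_API_KEY set in the environment).
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "liquid/lfm-40b",
        "messages": [
            {"role": "user", "content": "Summarize what a Mixture of Experts model is."}
        ],
        # Illustrative cap; the model supports a 32,768-token context.
        "max_tokens": 256,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])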

Recent activity on LFM 40B MoE

Tokens processed per day (Feb 9 – May 10; roughly 0–1B tokens per day)