Liquid: LFM 40B MoE

liquid/lfm-40b

Created Sep 30, 2024 · 32,768 context

Liquid's 40.3B Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems.

LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

See the launch announcement for benchmarks and more info.
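
The model is served under the liquid/lfm-40b slug through OpenRouter's OpenAI-compatible chat completions API. Below is a minimal sketch of a request in Python; the prompt text and the OPENROUTER_API_KEY environment variable are assumptions for illustration, not part of this page.

```python
import os
import requests

# Minimal sketch: call liquid/lfm-40b via OpenRouter's
# OpenAI-compatible chat completions endpoint.
# Assumes the API key is stored in the OPENROUTER_API_KEY
# environment variable (illustrative setup).
API_URL = "https://openrouter.ai/api/v1/chat/completions"

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "liquid/lfm-40b",
        "messages": [
            {"role": "user", "content": "Summarize what a Mixture of Experts model is."},
        ],
    },
    timeout=60,
)
response.raise_for_status()

# The response follows the OpenAI chat completions schema.
print(response.json()["choices"][0]["message"]["content"])
```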

Recent activity on LFM 40B MoE

Tokens processed per day

[Chart: daily token volume, Feb 4 – May 5, scale 0 to 1B tokens per day]