Liquid's 40.3B Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems theory.
LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.
OpenRouter normalizes requests and responses across providers for you.
To get started, you can call LFM 40B MoE via the API like this:
fetch("https://openrouter.ai/api/v1/chat/completions",{ method:"POST", headers:{"Authorization":`Bearer ${OPENROUTER_API_KEY}`,"HTTP-Referer":`${YOUR_SITE_URL}`,// Optional, for including your app on openrouter.ai rankings."X-Title":`${YOUR_SITE_NAME}`,// Optional. Shows in rankings on openrouter.ai."Content-Type":"application/json"}, body:JSON.stringify({"model":"liquid/lfm-40b","messages":[{"role":"user","content":"What is the meaning of life?"}]})});
You can also use OpenRouter with OpenAI's client API:
import OpenAI from"openai"const openai =newOpenAI({ baseURL:"https://openrouter.ai/api/v1", apiKey: $OPENROUTER_API_KEY, defaultHeaders:{"HTTP-Referer": $YOUR_SITE_URL,// Optional, for including your app on openrouter.ai rankings."X-Title": $YOUR_SITE_NAME,// Optional. Shows in rankings on openrouter.ai.}})asyncfunctionmain(){const completion =await openai.chat.completions.create({ model:"liquid/lfm-40b", messages:[{"role":"user","content":"What is the meaning of life?"}]})console.log(completion.choices[0].message)}main()
See the Request docs for all possible parameters, and Parameters for recommended values.
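For example, sampling parameters such as `temperature`, `top_p`, and `max_tokens` go in the same request body alongside `model` and `messages`; the values below are only illustrative, not the recommended values from the Parameters page:

```typescript
// Illustrative sampling parameters; consult the Parameters docs for recommended values.
const body = JSON.stringify({
  model: "liquid/lfm-40b",
  messages: [{ role: "user", content: "What is the meaning of life?" }],
  temperature: 0.7, // example value, not an official recommendation
  top_p: 0.9,       // example value
  max_tokens: 512   // cap on generated tokens
});
```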