Liquid's LFM 3B delivers strong performance for its size: it ranks first among 3B-parameter transformers, hybrids, and RNN models, and is on par with Phi-3.5-mini on multiple benchmarks while being 18.4% smaller. This makes LFM-3B an ideal choice for mobile and other edge text-based applications.
OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
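Routing can also be steered per request. The sketch below assumes OpenRouter's optional provider-preferences request field (with "order" and "allow_fallbacks") as described in its provider routing documentation; the provider slug is a placeholder, and the full client setup is shown again in the sample-code section further down.

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="liquid/lfm-3b",
    messages=[{"role": "user", "content": "Say hello."}],
    extra_body={
        # Assumed provider-preferences shape; see OpenRouter's provider routing docs.
        "provider": {
            "order": ["<preferred-provider-slug>"],  # placeholder: providers to try first, in order
            "allow_fallbacks": True,                 # let OpenRouter fall back to other providers
        }
    },
)
print(completion.choices[0].message.content)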
Apps using LFM 3B: top public apps this week using this model.
Recent activity on LFM 3B: tokens processed per day.
Uptime stats for LFM 3B across all providers.
Sample code and API for LFM 3B
OpenRouter normalizes requests and responses across providers for you.
OpenRouter provides an OpenAI-compatible completion API across its supported models and providers, which you can call directly or through the OpenAI SDK. Additionally, some third-party SDKs are available.
In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    extra_headers={
        "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>",      # Optional. Site title for rankings on openrouter.ai.
    },
    extra_body={},
    model="liquid/lfm-3b",
    messages=[
        {"role": "user", "content": "What is the meaning of life?"}
    ],
)

print(completion.choices[0].message.content)
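The same request can also be made directly over HTTP without the SDK. A minimal sketch using the Python requests library; the endpoint path and optional ranking headers mirror the SDK example above.

import requests

response = requests.post(
    url="https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": "Bearer <OPENROUTER_API_KEY>",
        "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>",      # Optional. Site title for rankings on openrouter.ai.
    },
    json={
        "model": "liquid/lfm-3b",
        "messages": [
            {"role": "user", "content": "What is the meaning of life?"}
        ],
    },
)
print(response.json()["choices"][0]["message"]["content"])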
Using third-party SDKs
For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.
See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.
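Sampling parameters go in the same request body. A short sketch, assuming the standard OpenAI-style fields (temperature, top_p, max_tokens) covered in the Parameters reference:

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="liquid/lfm-3b",
    messages=[{"role": "user", "content": "Write a one-line haiku about edge devices."}],
    temperature=0.7,  # higher values give more varied completions
    top_p=0.9,        # nucleus sampling cutoff
    max_tokens=256,   # cap on generated tokens
)
print(completion.choices[0].message.content)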