Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model, trained on top of the Mixtral 8x7B MoE LLM.
The model was trained on over 1,000,000 entries of primarily GPT-4-generated data, as well as other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks.
Providers for Hermes 2 Mixtral 8x7B DPO
OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
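If you want to influence this routing rather than rely on the defaults, OpenRouter accepts a provider preferences object in the request body. The sketch below assumes the provider.order and provider.allow_fallbacks fields described in OpenRouter's provider routing docs; the provider names are illustrative, so check the providers listed for this model.

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

# Prefer specific providers, but let OpenRouter fall back to others on failure.
# Provider names here are placeholders, not a guaranteed list for this model.
completion = client.chat.completions.create(
    model="nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_body={
        "provider": {
            "order": ["Together", "DeepInfra"],  # try these providers first, in order
            "allow_fallbacks": True,             # route to other providers if they fail
        }
    },
)
print(completion.choices[0].message.content)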
Apps using Hermes 2 Mixtral 8x7B DPO
Top public apps this week using this model
Recent activity on Hermes 2 Mixtral 8x7B DPO
Tokens processed per day
Uptime stats for Hermes 2 Mixtral 8x7B DPO
Uptime across all providers
Sample code and API for Hermes 2 Mixtral 8x7B DPO
OpenRouter normalizes requests and responses across providers for you.
OpenRouter provides an OpenAI-compatible completion API for this and its other models & providers, which you can call directly or through the OpenAI SDK. Additionally, some third-party SDKs are available.
In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    extra_headers={
        "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>",      # Optional. Site title for rankings on openrouter.ai.
    },
    extra_body={},
    model="nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
    messages=[
        {"role": "user", "content": "What is the meaning of life?"},
    ],
)
print(completion.choices[0].message.content)
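Because the API is OpenAI-compatible, you can also call it directly over HTTP without any SDK. A minimal sketch using Python's requests library, mirroring the optional headers above:

import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": "Bearer <OPENROUTER_API_KEY>",
        "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>",      # Optional. Site title for rankings on openrouter.ai.
    },
    json={
        "model": "nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
        "messages": [{"role": "user", "content": "What is the meaning of life?"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])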
Using third-party SDKs
For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.
See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.
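As one example of the sampling parameters covered there, standard fields such as temperature, top_p, and max_tokens pass straight through the same OpenAI-compatible request. A brief sketch; the specific values are illustrative only:

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="nousresearch/nous-hermes-2-mixtral-8x7b-dpo",
    messages=[{"role": "user", "content": "Write a haiku about mixtures of experts."}],
    temperature=0.7,  # lower values make output more deterministic
    top_p=0.9,        # nucleus sampling cutoff
    max_tokens=256,   # cap on tokens generated in the response
)
print(completion.choices[0].message.content)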