
    Nous: Hermes 2 Mixtral 8x7B SFT

    nousresearch/nous-hermes-2-mixtral-8x7b-sft

    Created Jan 16, 2024 · 32,768 context

    Nous Hermes 2 Mixtral 8x7B SFT is the supervised fine-tuning (SFT) only version of the Nous Research model trained on the Mixtral 8x7B mixture-of-experts (MoE) LLM.

    The model was trained on over 1,000,000 entries of primarily GPT-4 generated data, along with other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks.

    #moe

    Sample code and API for Hermes 2 Mixtral 8x7B SFT

    OpenRouter normalizes requests and responses across providers for you.

    OpenRouter provides an OpenAI-compatible completion API for 400+ models and providers, which you can call directly or via the OpenAI SDK. Some third-party SDKs are also available.

    In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
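    As a minimal sketch, a direct HTTP call with Python's requests library might look like the following. It assumes an OPENROUTER_API_KEY environment variable; the HTTP-Referer and X-Title values shown for the optional attribution headers are placeholders, not real URLs or app names.

```python
import os

import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
        # Optional OpenRouter-specific headers for leaderboard attribution.
        "HTTP-Referer": "https://example.com",  # placeholder site URL
        "X-Title": "My App",                    # placeholder app name
    },
    json={
        "model": "nousresearch/nous-hermes-2-mixtral-8x7b-sft",
        "messages": [
            {"role": "user", "content": "What is a mixture-of-experts model?"}
        ],
    },
)
print(response.json()["choices"][0]["message"]["content"])
```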

    Using third-party SDKs

    For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.

    See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.
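    For example, the same request through the OpenAI SDK only needs the base URL pointed at OpenRouter. This is a sketch under the same API-key assumption as above; the temperature and top_p values are illustrative sampling parameters of the kind the Parameters docs describe, not recommended settings.

```python
import os

from openai import OpenAI

# Point the OpenAI SDK at OpenRouter's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

completion = client.chat.completions.create(
    model="nousresearch/nous-hermes-2-mixtral-8x7b-sft",
    messages=[
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."}
    ],
    temperature=0.7,  # illustrative sampling parameters; see the Parameters docs
    top_p=0.9,
)
print(completion.choices[0].message.content)
```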