
    Nous: Hermes 2 Mixtral 8x7B SFT

    nousresearch/nous-hermes-2-mixtral-8x7b-sft

    Created Jan 16, 2024 · 32,768 context

    Nous Hermes 2 Mixtral 8x7B SFT is the supervised fine-tuning (SFT)-only version of the Nous Research model trained on top of the Mixtral 8x7B MoE LLM.

    The model was trained on over 1,000,000 entries of primarily GPT-4 generated data, along with other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks.

    #moe
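    A minimal sketch of querying this model through OpenRouter's OpenAI-compatible chat completions endpoint, using the model slug shown above. The prompt and `max_tokens` value are illustrative, and an `OPENROUTER_API_KEY` environment variable is assumed for authentication.

    ```python
    # Hedged sketch: calling nousresearch/nous-hermes-2-mixtral-8x7b-sft via
    # OpenRouter's chat completions API. Prompt and max_tokens are illustrative.
    import json
    import os
    import urllib.request

    MODEL = "nousresearch/nous-hermes-2-mixtral-8x7b-sft"

    def build_request(prompt: str, max_tokens: int = 256) -> dict:
        """Assemble an OpenAI-style chat completion payload for this model."""
        return {
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
        }

    def query(prompt: str) -> str:
        """Send the payload to OpenRouter and return the model's reply text."""
        req = urllib.request.Request(
            "https://openrouter.ai/api/v1/chat/completions",
            data=json.dumps(build_request(prompt)).encode("utf-8"),
            headers={
                # Assumes OPENROUTER_API_KEY is set in the environment.
                "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        # Inspect the payload without making a network call.
        print(build_request("What is a mixture-of-experts model?"))
    ```

    Since the endpoint follows the OpenAI request shape, the same payload works with OpenAI-compatible client libraries pointed at the OpenRouter base URL.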
