
    Nous: Hermes 2 Mixtral 8x7B DPO

    nousresearch/nous-hermes-2-mixtral-8x7b-dpo

    Created Jan 16, 2024 · 32,768 context

    Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model trained over the Mixtral 8x7B MoE LLM.

    The model was trained on over 1,000,000 entries of primarily GPT-4-generated data, as well as other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks.
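
    The model is served through OpenRouter's OpenAI-compatible chat completions API under the slug above. The following is a minimal sketch, assuming the standard `https://openrouter.ai/api/v1/chat/completions` endpoint; the API key is a placeholder you would replace with your own.

    ```python
    import json
    import urllib.request

    # Assumed OpenRouter chat completions endpoint (OpenAI-compatible schema).
    OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
    MODEL_ID = "nousresearch/nous-hermes-2-mixtral-8x7b-dpo"

    def build_request(prompt: str, api_key: str, max_tokens: int = 256) -> urllib.request.Request:
        """Build a chat completion request for Hermes 2 Mixtral 8x7B DPO."""
        payload = {
            "model": MODEL_ID,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
        }
        headers = {
            "Authorization": f"Bearer {api_key}",  # placeholder key goes here
            "Content-Type": "application/json",
        }
        return urllib.request.Request(
            OPENROUTER_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers=headers,
        )

    # Usage (uncomment with a real key):
    # req = build_request("Explain mixture-of-experts in one sentence.", api_key="sk-or-...")
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
    ```

    Because the schema is OpenAI-compatible, the same payload also works with OpenAI-style client libraries pointed at the OpenRouter base URL.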

    #moe

    Recent activity on Hermes 2 Mixtral 8x7B DPO (total usage per day on OpenRouter): not enough data to display yet.