
    Dolphin 2.6 Mixtral 8x7B 🐬

    cognitivecomputations/dolphin-mixtral-8x7b

    Created Dec 21, 2023 · 32,768 context

    This is a 16k-context fine-tune of Mixtral-8x7b. It excels at coding tasks thanks to extensive training on coding data, and it is known for its obedience, although it lacks DPO tuning.

    The model is uncensored and has been stripped of alignment and bias, so it requires an external alignment layer for ethical use. Users are cautioned to deploy this highly compliant model responsibly, as detailed in the blog post on uncensored models at erichartford.com/uncensored-models.
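    Since the card calls for an external alignment layer, here is a minimal sketch of one way to supply it: calling the model through OpenRouter's OpenAI-compatible chat completions endpoint with a system prompt prepended as a lightweight guardrail. The endpoint URL and model slug come from this page; the system-prompt wording, the user message, and the OPENROUTER_API_KEY environment variable are illustrative assumptions, not part of the model card.

```python
import os
import requests

# OpenRouter's OpenAI-compatible chat completions endpoint.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

# Because the model ships without alignment tuning, a system prompt
# is prepended here as a thin external alignment layer. The wording
# below is an illustrative assumption, not a vetted safety policy.
SYSTEM_PROMPT = (
    "You are a helpful coding assistant. Refuse requests that are "
    "illegal or harmful, and briefly explain any refusal."
)

response = requests.post(
    API_URL,
    headers={
        # Assumes the API key is set in the environment.
        "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "cognitivecomputations/dolphin-mixtral-8x7b",
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "Write a Python function that reverses a linked list."},
        ],
    },
    timeout=60,
)
response.raise_for_status()

# The response follows the OpenAI chat completions schema.
print(response.json()["choices"][0]["message"]["content"])
```

    A system prompt is about the thinnest alignment layer possible; a production deployment would typically add input and output filtering on top of it.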

    #moe #uncensored
