
    Noromaid Mixtral 8x7B Instruct

    neversleep/noromaid-mixtral-8x7b-instruct

Created Jan 2, 2024 · 8,000 context

This model was trained for 8h (v1) + 8h (v2) + 12h (v3) on customized datasets, focusing on RP, uncensoring, and a modified version of the Alpaca prompt format (already used in LimaRP). It should be at the same conversational level as ChatLM or Llama2-Chat, without adding any additional special tokens.
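
    For reference, the model can be queried through OpenRouter's OpenAI-compatible chat completions API using the slug above. The sketch below is illustrative only: it assumes an API key in an `OPENROUTER_API_KEY` environment variable, uses the Python `requests` library, and picks placeholder values for parameters such as `max_tokens`.

    ```python
    import os
    import requests

    # Minimal sketch: query neversleep/noromaid-mixtral-8x7b-instruct via
    # OpenRouter's OpenAI-compatible chat completions endpoint.
    # Assumes an API key in the OPENROUTER_API_KEY environment variable.
    API_URL = "https://openrouter.ai/api/v1/chat/completions"

    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "neversleep/noromaid-mixtral-8x7b-instruct",
            "messages": [
                {"role": "user", "content": "Introduce yourself in one sentence."}
            ],
            "max_tokens": 256,  # placeholder; keep well under the 8,000-token context
        },
        timeout=60,
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])
    ```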

    Recent activity on Noromaid Mixtral 8x7B Instruct

    Total usage per day on OpenRouter: not enough data to display yet.