Noromaid Mixtral 8x7B Instruct

neversleep/noromaid-mixtral-8x7b-instruct

Updated Jan 2 · 8,000 context

$3 / 1M input tokens · $3 / 1M output tokens

This model was trained for 8h (v1) + 8h (v2) + 12h (v3) on customized, modified datasets, focusing on RP, uncensoring, and a modified version of Alpaca prompting (already used in LimaRP). It should be at the same conversational level as ChatML or Llama2-Chat without adding any additional special tokens.

OpenRouter first attempts the primary provider and falls back to others if it encounters an error. Prices are displayed per million tokens.