A large language model created by combining two fine-tuned Llama 70B models, Xwin and Euryale, into a single 120B model.

Credits:
- @chargoddard for developing mergekit, the framework used to merge the models.
- @Undi95 for helping with the merge ratios.

#merge
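Merges like this are typically produced with a mergekit passthrough config that stacks layer slices from both source models. A minimal sketch of what such a config could look like; the repo names and layer ranges below are illustrative assumptions, not the actual recipe used for this model:

```yaml
# Hypothetical mergekit config: stack layer slices from two 70B models
# into one larger model. Layer ranges and repos are placeholders.
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1   # assumed repo id
        layer_range: [0, 40]
  - sources:
      - model: Sao10K/Euryale-70B         # assumed repo id
        layer_range: [20, 60]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [40, 80]
merge_method: passthrough   # concatenate slices rather than average weights
dtype: float16
```

With mergekit installed, a config like this is run via `mergekit-yaml config.yml ./output-dir`. The passthrough method grows the layer count, which is how two 70B models can yield a ~120B model.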