Hunyuan-A13B is a Mixture-of-Experts (MoE) language model developed by Tencent, with 80B total parameters of which 13B are active per token, and support for Chain-of-Thought reasoning. It delivers competitive benchmark results across mathematics, science, coding, and multi-turn reasoning tasks while maintaining efficient inference through Grouped Query Attention (GQA) and quantization support (FP8, GPTQ, etc.).
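As a minimal sketch, the model can be queried through OpenRouter's OpenAI-compatible chat completions endpoint. The model slug `tencent/hunyuan-a13b-instruct` is an assumption; check the model page for the exact identifier and supported parameters.

```python
# Sketch: calling Hunyuan A13B Instruct via OpenRouter's chat completions API.
# The model slug below is assumed; verify it against the model page.
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "tencent/hunyuan-a13b-instruct",  # assumed slug
        "messages": [
            {"role": "user", "content": "Prove that the sum of two even numbers is even."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```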
Recent activity on Hunyuan A13B Instruct
Total usage per day on OpenRouter: 870K prompt tokens, 73K completion tokens, 0 reasoning tokens.
Prompt tokens measure input size. Reasoning tokens show internal thinking before a response. Completion tokens reflect total output length.
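Continuing the request sketch above, per-request counts for these token types can be read from the `usage` object of an OpenAI-compatible response. The `prompt_tokens` and `completion_tokens` fields follow that schema; the nested reasoning-token field is an assumption and may be absent depending on the provider and model.

```python
# Sketch: reading token usage from the response of the earlier request.
# The completion_tokens_details.reasoning_tokens field is an assumption.
usage = response.json().get("usage", {})
details = usage.get("completion_tokens_details") or {}
print(
    f"prompt={usage.get('prompt_tokens', 0)}, "      # input size
    f"completion={usage.get('completion_tokens', 0)}, "  # total output length
    f"reasoning={details.get('reasoning_tokens', 0)}"    # internal thinking, if reported
)
```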