Moonshot AI: Moonlight 16B A3B Instruct (free)
moonshotai/moonlight-16b-a3b-instruct:free
Created Feb 28, 2025 · 8,192 context
$0/M input tokens · $0/M output tokens
Moonlight-16B-A3B-Instruct is a 16B-parameter Mixture-of-Experts (MoE) language model developed by Moonshot AI, optimized for instruction-following tasks with 3B parameters activated per inference. The model advances the Pareto frontier of performance per FLOP across English, coding, math, and Chinese benchmarks, outperforming comparable models such as Llama3.2-3B and Deepseek-v2-Lite, while remaining easy to deploy through Hugging Face integration and compatibility with popular inference engines like vLLM.
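As a rough illustration of the Hugging Face integration mentioned above, the sketch below loads the model with Transformers and runs a single chat turn. The repo ID "moonshotai/Moonlight-16B-A3B-Instruct" and the need for trust_remote_code are assumptions, not details confirmed by this listing.

```python
# Minimal sketch: load Moonlight via Hugging Face Transformers and generate one reply.
# Assumes the repo ID below and that the custom MoE code requires trust_remote_code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "moonshotai/Moonlight-16B-A3B-Instruct"  # assumed Hugging Face repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # place layers on available GPUs automatically
    trust_remote_code=True,
)

# Format a chat prompt with the model's chat template, then decode only the new tokens.
messages = [{"role": "user", "content": "Summarize what a Mixture-of-Experts model is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For higher-throughput serving, the same checkpoint can in principle be pointed at vLLM's OpenAI-compatible server, subject to the engine's support for this MoE architecture.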