DeepSeek-Coder-V2

deepseek/deepseek-coder

Created May 14, 2024 · 128,000 context
$0.04/M input tokens · $0.12/M output tokens

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model. It is further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens.

The original V1 model was trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language in both English and Chinese. It was pre-trained on a project-level code corpus, employing an extra fill-in-the-blank task.

Providers for DeepSeek-Coder-V2

OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
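
As a sketch of what this looks like per request (the "provider" object and its "allow_fallbacks" field come from OpenRouter's provider-routing options; fallbacks are enabled by default and are shown explicitly here only for illustration):

```python
# Minimal sketch: a chat completion request that explicitly keeps
# OpenRouter's fallback routing enabled. Requires the `requests` package.
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": "Bearer <OPENROUTER_API_KEY>"},
    json={
        "model": "deepseek/deepseek-coder",
        "messages": [{"role": "user", "content": "Say hello."}],
        # Let OpenRouter fall back to other providers if the preferred
        # one is down or cannot handle the prompt size.
        "provider": {"allow_fallbacks": True},
    },
)
print(response.json()["choices"][0]["message"]["content"])
```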

Apps using DeepSeek-Coder-V2

Top public apps using this model this week

Recent activity on DeepSeek-Coder-V2

Tokens processed per day

Uptime stats for DeepSeek-Coder-V2 across all providers

Sample code and API for DeepSeek-Coder-V2

OpenRouter normalizes requests and responses across providers for you.

OpenRouter provides an OpenAI-compatible completion API to 300+ models & providers, which you can call directly or through the OpenAI SDK. Additionally, some third-party SDKs are available.

In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
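
A minimal sketch using the OpenAI Python SDK pointed at OpenRouter (the API key, site URL, and app name placeholders are yours to fill in):

```python
from openai import OpenAI

# OpenRouter exposes an OpenAI-compatible API, so the official SDK works
# once base_url points at OpenRouter.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
    model="deepseek/deepseek-coder",
    messages=[
        {"role": "user", "content": "Write a function that reverses a linked list."},
    ],
    # Optional OpenRouter-specific headers; setting them lets your app
    # appear on the OpenRouter leaderboards.
    extra_headers={
        "HTTP-Referer": "<YOUR_SITE_URL>",
        "X-Title": "<YOUR_APP_NAME>",
    },
)
print(completion.choices[0].message.content)
```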

Using third-party SDKs

For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.

See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.
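
For example, a direct HTTP request sketch that sets a few common sampling parameters (the values are illustrative, not recommendations):

```python
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": "Bearer <OPENROUTER_API_KEY>",
        "Content-Type": "application/json",
    },
    json={
        "model": "deepseek/deepseek-coder",
        "messages": [
            {"role": "user", "content": "Explain memoization in one paragraph."},
        ],
        # Sampling parameters; see the Parameters docs for full definitions.
        "temperature": 0.2,
        "top_p": 0.9,
        "max_tokens": 512,
    },
)
print(response.json()["choices"][0]["message"]["content"])
```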
