DeepSeek: DeepSeek V3 0324

deepseek/deepseek-chat-v3-0324

Created Mar 24, 2025 · 64,000 context
$0.27/M input tokens · $1.1/M output tokens

DeepSeek V3, a 685B-parameter, mixture-of-experts model, is the latest iteration of the flagship chat model family from the DeepSeek team.

It succeeds the original DeepSeek V3 model and performs well on a wide variety of tasks.

Providers for DeepSeek V3 0324

OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.

Context | Max Output | Input ($/M tokens) | Output ($/M tokens)
64K     | 8K         | $0.27              | $1.1
164K    | 164K       | $0.4               | $0.89
128K    | 16K        | $0.4               | $1.3
128K    | 33K        | $0.75              | $1.5
8K      | 8K         | $1                 | $1.5
164K    | 164K       | $1.15              | $1.2
164K    | 164K       | $1.25              | $1.25
64K     | 64K        | $1.25              | $1.25
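
As a rough illustration of what these per-million-token prices mean per request, the sketch below estimates cost from prompt and completion token counts using the base listing prices above ($0.27/M input, $1.1/M output); actual billing depends on the provider that serves the request.

# Rough per-request cost estimate at the base listing prices above.
# Prices are illustrative; the provider actually serving the request may differ.
INPUT_PRICE_PER_M = 0.27   # USD per 1M input (prompt) tokens
OUTPUT_PRICE_PER_M = 1.10  # USD per 1M output (completion) tokens

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (
        prompt_tokens / 1_000_000 * INPUT_PRICE_PER_M
        + completion_tokens / 1_000_000 * OUTPUT_PRICE_PER_M
    )

# Example: a 2,000-token prompt with a 500-token completion.
print(f"${estimate_cost(2_000, 500):.6f}")  # ≈ $0.001090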

Apps using DeepSeek V3 0324

Top public apps this week using this model

1. Cline – Autonomous coding agent right in your IDE – 5.45B tokens
2. Roo Code – A whole dev team of AI agents in your editor – 4.5B tokens
3. SillyTavern – LLM frontend for power users – 1.88B tokens
4. Chub AI – GenAI for everyone – 528M tokens
5. OpenRouter: Chatroom – Chat with multiple LLMs at once – 411M tokens
6. Aider – AI pair programming in your terminal – 198M tokens
7. Open WebUI – Extensible, self-hosted AI interface – 170M tokens
8. 100M tokens
9. CHIM – AI framework for Skyrim – 80.9M tokens
10. 71.2M tokens
11. Agnaistic – A "bring your own AI" chat service – 68.5M tokens
12. liteLLM – Open-source library to simplify LLM calls – 68.5M tokens
13. 66.9M tokens
14. 63.2M tokens

Recent activity on DeepSeek V3 0324

Tokens processed per day

[Chart: tokens processed per day, Mar 24 – Apr 1, scale 0 – 6B]

Versions by Token Share

[Chart: versions by token share, Mar 25 – Apr 1]
Currently viewing: DeepSeek: DeepSeek V3 0324 – created March 24, 2025, 131,072 context, 24.8B tokens

Uptime stats for DeepSeek V3 0324

[Chart: uptime across all providers]

When an error occurs in an upstream provider, we can recover by routing to another healthy provider, if your request filters allow it.

Learn more about our load balancing and customization options.
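
As a sketch of one such customization (assuming the provider preferences object described in OpenRouter's provider routing docs, with a placeholder provider slug), a request body that constrains routing might look like this:

# A minimal sketch of a request body with provider routing preferences.
# Field names ("provider", "order", "allow_fallbacks") follow OpenRouter's
# provider routing docs; "<provider-slug>" is a placeholder, not a real provider.
payload = {
    "model": "deepseek/deepseek-chat-v3-0324",
    "messages": [{"role": "user", "content": "Hello"}],
    "provider": {
        "order": ["<provider-slug>"],  # preferred provider(s), tried in order
        "allow_fallbacks": True,       # allow routing to other healthy providers
    },
}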

Sample code and API for DeepSeek V3 0324

OpenRouter normalizes requests and responses across providers for you.

OpenRouter provides an OpenAI-compatible completion API to 300+ models & providers that you can call directly, or using the OpenAI SDK. Additionally, some third-party SDKs are available.

In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.

from openai import OpenAI

client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key="<OPENROUTER_API_KEY>",
)

completion = client.chat.completions.create(
  extra_headers={
    "HTTP-Referer": "<YOUR_SITE_URL>", # Optional. Site URL for rankings on openrouter.ai.
    "X-Title": "<YOUR_SITE_NAME>", # Optional. Site title for rankings on openrouter.ai.
  },
  extra_body={},  # Optional OpenRouter-specific fields (e.g., provider preferences) go here
  model="deepseek/deepseek-chat-v3-0324",
  messages=[
    {
      "role": "user",
      "content": "What is the meaning of life?"
    }
  ]
)
print(completion.choices[0].message.content)
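
The same request can also be made by calling the chat completions endpoint directly; a minimal sketch using the requests library, with the optional ranking headers passed as plain HTTP headers:

import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": "Bearer <OPENROUTER_API_KEY>",
        "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional. Site URL for rankings on openrouter.ai.
        "X-Title": "<YOUR_SITE_NAME>",      # Optional. Site title for rankings on openrouter.ai.
    },
    json={
        "model": "deepseek/deepseek-chat-v3-0324",
        "messages": [
            {"role": "user", "content": "What is the meaning of life?"}
        ],
    },
)
print(response.json()["choices"][0]["message"]["content"])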

Using third-party SDKs

For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.

See the Request docs for all possible fields, and Parameters for explanations of specific sampling parameters.
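
As an illustrative sketch (reusing the client from the example above), common sampling parameters such as temperature, top_p, and max_tokens are passed directly on the completion call; see the Parameters docs for their exact semantics.

# Sketch: passing common sampling parameters on the same chat completion call.
completion = client.chat.completions.create(
    model="deepseek/deepseek-chat-v3-0324",
    messages=[
        {"role": "user", "content": "Summarize the plot of Hamlet in two sentences."}
    ],
    temperature=0.7,  # sampling temperature: lower is more deterministic
    top_p=0.9,        # nucleus sampling cutoff
    max_tokens=256,   # cap on generated tokens
)
print(completion.choices[0].message.content)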

More models from DeepSeek
