DeepSeek: R1

deepseek/deepseek-r1

Created Jan 20, 2025 · 163,840 context
$0.50/M input tokens · $2.18/M output tokens
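At these default rates, per-request cost scales linearly with token counts. A minimal sketch of the arithmetic, using made-up token counts (and assuming reasoning tokens are billed as output tokens):

```python
# Cost estimate for deepseek/deepseek-r1 at the listed default rates.
INPUT_PRICE_PER_M = 0.50   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 2.18  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: 4,000 prompt tokens and 1,500 completion tokens
# (assumption: R1's reasoning tokens count toward the output total).
print(f"${request_cost(4_000, 1_500):.5f}")  # -> $0.00527
```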

DeepSeek R1 is here: performance on par with OpenAI o1, but open-source and with fully open reasoning tokens. It has 671B parameters, with 37B active per inference pass.

Fully open-source model & technical report.

MIT licensed: Distill & commercialize freely!
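Because the reasoning tokens are exposed, they can be read back from the API response. Below is a minimal sketch against OpenRouter's OpenAI-compatible chat completions endpoint; the reasoning-related names (`include_reasoning`, `message.reasoning`) are my assumption of how the reasoning output is surfaced and may differ from the current API, so verify against the docs.

```python
import os
import requests

# Sketch: call deepseek/deepseek-r1 through OpenRouter's chat completions endpoint.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "deepseek/deepseek-r1",
        "messages": [{"role": "user", "content": "How many primes are below 100?"}],
        # Assumption: this flag asks OpenRouter to return the reasoning tokens.
        "include_reasoning": True,
    },
    timeout=120,
)
resp.raise_for_status()
message = resp.json()["choices"][0]["message"]
print(message.get("reasoning"))  # assumed field carrying the reasoning tokens
print(message["content"])        # final answer
```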

Providers for R1

OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
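Routing can also be steered per request. The sketch below assumes OpenRouter's provider preference object (a `provider` field with `order`, `allow_fallbacks`, and `quantizations` keys); treat the exact option names as assumptions, and the provider names here are hypothetical placeholders, not providers from the list below.

```python
import os
import requests

# Sketch: nudge OpenRouter's routing for a single R1 request. The "provider"
# preference object and its keys are assumptions about the routing API.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "deepseek/deepseek-r1",
        "messages": [{"role": "user", "content": "Summarize the R1 technical report."}],
        "provider": {
            "order": ["example-provider-a", "example-provider-b"],  # hypothetical names
            "allow_fallbacks": True,     # fall through to other providers if these fail
            "quantizations": ["fp8"],    # e.g. prefer an fp8 deployment like the one listed below
        },
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```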

| Context | Max Output | Input ($/M tokens) | Output ($/M tokens) | Notes   |
|---------|------------|--------------------|---------------------|---------|
| 16K     | 16K        | $5                 | $7                  |         |
| 33K     | 33K        | $1                 | $3                  |         |
| 164K    | 164K       | $3                 | $7                  |         |
| 131K    | 131K       | $2.99              | $2.99               | CA, fp8 |
| 131K    | 131K       | $1.95              | $5                  |         |
| 164K    | 164K       | $3                 | $8                  |         |

[Throughput and latency charts]