Perplexity: PPLX 70B Chat

perplexity/pplx-70b-chat

Updated Dec 1 | 4,096 context
$0.7 / 1M input tokens | $2.8 / 1M output tokens

The larger chat model by Perplexity Labs, with 70 billion parameters. Based on Llama 2 70B.

OpenRouter first attempts the primary provider and falls back to others if it encounters an error. Prices are displayed per million tokens.
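
Below is a minimal sketch of calling this model through OpenRouter's OpenAI-compatible chat completions endpoint and estimating the cost of a request from the listed per-million-token rates. The `OPENROUTER_API_KEY` environment variable, the helper names, and the sample prompt are assumptions for illustration; only the endpoint URL, the model slug, and the prices come from this page.

```python
import os
import requests

# Assumes an OpenRouter API key is available in OPENROUTER_API_KEY.
API_URL = "https://openrouter.ai/api/v1/chat/completions"


def ask_pplx_70b_chat(prompt: str) -> dict:
    """Send a single-turn chat request to perplexity/pplx-70b-chat via OpenRouter."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "perplexity/pplx-70b-chat",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()


def estimate_cost_usd(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate request cost at the listed rates: $0.7 / 1M input, $2.8 / 1M output."""
    return prompt_tokens * 0.7 / 1_000_000 + completion_tokens * 2.8 / 1_000_000


if __name__ == "__main__":
    result = ask_pplx_70b_chat("Summarize the difference between Llama 2 7B and 70B.")
    usage = result.get("usage", {})
    print(result["choices"][0]["message"]["content"])
    print("approx. cost (USD):",
          estimate_cost_usd(usage.get("prompt_tokens", 0),
                            usage.get("completion_tokens", 0)))
```

Because routing and fallback happen on OpenRouter's side, the request body is the same regardless of which underlying provider ultimately serves it.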