Last verified 2025-09-22 for both models.

Gemini 2.5 Pro vs GPT-5 Chat — Pricing & Capability Comparison

Gemini 2.5 Pro charges $1.25 per million input tokens and $10.00 per million output tokens, and GPT-5 Chat matches both rates at $1.25 / $10.00. Both models list a 200K-token context window, so neither has an edge on price or context size; the main difference on this page is that GPT-5 Chat publishes a cached-input rate.

Input price (per 1M)

Gemini 2.5 Pro

$1.25

GPT-5 Chat

$1.25

Tied: both models charge the same input rate.

Output price (per 1M)

Gemini 2.5 Pro

$10.00

GPT-5 Chat

$10.00

Tied: both models charge the same output rate.

Context window

Gemini 2.5 Pro

200,000 tokens

GPT-5 Chat

200,000 tokens

Tied: both models list the same context window.

Cached input (per 1M)

Gemini 2.5 Pro

Not published

GPT-5 Chat

$0.125

GPT-5 Chat leads here: it publishes a cached-input rate, while Gemini 2.5 Pro does not.
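For scale, $0.125 per million cached input tokens is one tenth of the standard $1.25 input rate, i.e. a 90% discount on cached input.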

Cost comparison for 10K-token workloads

Side-by-side pricing for identical workloads (10,000 total tokens per request) across different input/output token distributions.

| Scenario | Gemini 2.5 Pro | GPT-5 Chat | GPT-5 Chat cached |
| --- | --- | --- | --- |
| Balanced conversation (50% input · 50% output) | $0.0563 | $0.0563 | $0.0506 |
| Input-heavy workflow (80% input · 20% output) | $0.0300 | $0.0300 | $0.0210 |
| Generation heavy (30% input · 70% output) | $0.0738 | $0.0738 | $0.0704 |
| Cached system prompt (90% cached input · 10% fresh output) | $0.0212 | $0.0212 | $0.0111 |
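Each cell follows from the same per-token arithmetic: tokens × rate ÷ 1,000,000, summed across input and output. Below is a minimal sketch of that calculation using the rates and token splits listed on this page; the variable and function names are illustrative, not taken from any provider API.

```python
# Per-request cost for 10,000-token workloads, split between input and output tokens.
# Rates are USD per 1M tokens, copied from the comparison above.

RATES = {
    "Gemini 2.5 Pro":    {"input": 1.25,  "output": 10.00},
    "GPT-5 Chat":        {"input": 1.25,  "output": 10.00},
    "GPT-5 Chat cached": {"input": 0.125, "output": 10.00},  # cached rate applied to input tokens
}

# (input share, output share) of the 10,000 tokens in each scenario.
SCENARIOS = {
    "Balanced conversation": (0.5, 0.5),
    "Input-heavy workflow":  (0.8, 0.2),
    "Generation heavy":      (0.3, 0.7),
    "Cached system prompt":  (0.9, 0.1),
}

TOTAL_TOKENS = 10_000


def request_cost(rate: dict, input_share: float, output_share: float) -> float:
    """Cost in USD for one request with the given input/output token split."""
    input_tokens = TOTAL_TOKENS * input_share
    output_tokens = TOTAL_TOKENS * output_share
    return (input_tokens * rate["input"] + output_tokens * rate["output"]) / 1_000_000


# The table above shows the same figures rounded to four decimal places.
for scenario, (inp, out) in SCENARIOS.items():
    costs = "  ".join(
        f"{model}: ${request_cost(r, inp, out):.6f}" for model, r in RATES.items()
    )
    print(f"{scenario}: {costs}")
```

Note that in the "Cached system prompt" row, only the GPT-5 Chat cached column bills the 90% input share at the $0.125 cached rate; the other two columns bill it at the standard $1.25 rate, which is why they come out to $0.0212.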

Frequently asked questions

Which model is cheaper per million input tokens?

Neither is cheaper: both Gemini 2.5 Pro and GPT-5 Chat cost $1.25 per million input tokens.

How do output prices compare?

Output pricing is identical: both Gemini 2.5 Pro and GPT-5 Chat charge $10.00 per million output tokens.

Which model supports a larger context window?

Both models list the same 200,000-token (200K) context window, so neither has an advantage here.

Related resources