Cohere Rerank 4 Pro vs Contextual AI Rerank v2 Instruct
Detailed comparison between Cohere Rerank 4 Pro and Contextual AI Rerank v2 Instruct. See which reranker best meets your accuracy and performance needs.
Model Comparison
Cohere Rerank 4 Pro takes the lead.
Both Cohere Rerank 4 Pro and Contextual AI Rerank v2 Instruct are powerful reranking models designed to improve retrieval quality in RAG applications. However, their performance characteristics differ in important ways.
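Conceptually, both models do the same job: given a query and a set of candidate passages from first-stage retrieval, they score each query-passage pair and return the candidates in relevance order. The sketch below illustrates that interface with a toy lexical-overlap scorer standing in for a real cross-encoder model; the function names are illustrative and are not either vendor's API.

```python
# Minimal sketch of the rerank step in a RAG pipeline. A real reranker
# (e.g. a cross-encoder) would replace toy_score with a learned model.

def toy_score(query: str, passage: str) -> float:
    """Toy relevance score: fraction of query tokens found in the passage."""
    q = set(query.lower().split())
    p = set(passage.lower().split())
    return len(q & p) / len(q) if q else 0.0

def rerank(query: str, passages: list[str], top_n: int = 3) -> list[str]:
    """Score every (query, passage) pair and return the top_n passages."""
    scored = sorted(passages, key=lambda p: toy_score(query, p), reverse=True)
    return scored[:top_n]
```

In production the scoring call is a network request to the reranker endpoint, which is why the latency figures below matter as much as the accuracy figures.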
Why Cohere Rerank 4 Pro:
- Cohere Rerank 4 Pro has a 166-point higher ELO rating (1627 vs 1461)
- Cohere Rerank 4 Pro is about 2720ms faster on average (614ms vs 3333ms)
- Cohere Rerank 4 Pro has a 15.9-point higher win rate (58.2% vs 42.3%)
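The ELO ratings above come from pairwise comparisons between rerankers. As a rough illustration (the leaderboard's exact K-factor and update scheme are not specified here), a standard ELO update from one head-to-head result looks like:

```python
# Standard ELO update for a pairwise comparison.
# score_a is 1.0 for a win, 0.5 for a tie, 0.0 for a loss.
# k (the K-factor) controls how fast ratings move; 32 is a common default.

def elo_update(r_a: float, r_b: float, score_a: float, k: float = 32.0):
    expected_a = 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))
    new_a = r_a + k * (score_a - expected_a)
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b
```

A 166-point gap implies the higher-rated model is expected to win roughly 72% of head-to-head comparisons (1 / (1 + 10^(-166/400)) ≈ 0.72), which is broadly consistent with the win rates reported below.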
Overview
Key metrics
- ELO Rating (overall ranking quality): Cohere Rerank 4 Pro 1627 vs Contextual AI Rerank v2 Instruct 1461
- Win Rate (head-to-head performance): 58.2% vs 42.3%
- Accuracy (nDCG@10, ranking quality metric): N/A vs 0.114
- Average Latency (response time): 614ms vs 3333ms
Visual Performance Analysis
[Charts: ELO Rating Comparison; Win/Loss/Tie Breakdown; Latency Distribution (ms)]
Breakdown
How the models stack up
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Overall Performance | | | |
| ELO Rating | 1627 | 1461 | Overall ranking quality based on pairwise comparisons |
| Win Rate | 58.2% | 42.3% | Percentage of comparisons won against other models |
| Pricing & Availability | | | |
| Price per 1M tokens | $0.050 | $0.050 | Cost per million tokens processed |
| Release Date | 2025-12-11 | 2025-09-12 | Model release date |
| Accuracy Metrics | | | |
| Avg nDCG@10 | N/A | 0.114 | Normalized discounted cumulative gain at position 10 |
| Performance Metrics | | | |
| Avg Latency | 614ms | 3333ms | Average response time across all datasets |
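nDCG@10, the accuracy metric in the table above, rewards rankings that place highly relevant documents near the top of the first ten results. A minimal sketch of the standard computation, using the common exponential-gain form (the benchmark's exact variant is not specified here):

```python
import math

def dcg_at_k(rels: list[int], k: int = 10) -> float:
    """Discounted cumulative gain: gains discounted by log2 of the rank."""
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(rels[:k]))

def ndcg_at_k(rels: list[int], k: int = 10) -> float:
    """DCG normalized by the DCG of the ideal (descending-relevance) order."""
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0
```

A perfect ranking scores 1.0; pushing relevant documents down the list lowers the score, since later positions receive larger logarithmic discounts.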
Dataset Performance
By field
Per-dataset latency statistics (mean, P50, and P90) for each benchmark dataset.
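The mean, P50, and P90 figures below are standard latency summaries: P50 is the median request time, and P90 is the time under which 90% of requests complete. A sketch of how such a summary could be computed from raw per-request timings (illustrative, not the benchmark's actual harness):

```python
import statistics

def latency_summary(samples_ms: list[float]) -> dict[str, float]:
    """Summarize raw per-request latencies into mean, P50, and P90."""
    # quantiles(n=100) returns the 99 percentile cut points P1..P99.
    qs = statistics.quantiles(samples_ms, n=100, method="inclusive")
    return {
        "mean": statistics.fmean(samples_ms),
        "p50": qs[49],   # 50th percentile (median)
        "p90": qs[89],   # 90th percentile
    }
```

Tail percentiles matter because a RAG request is usually blocked on the reranker: a P90 around 4 seconds means one in ten user queries waits at least that long on reranking alone.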
MS MARCO
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 458ms | 3283ms | Average response time |
| P50 | 408ms | 3260ms | 50th percentile (median) |
| P90 | 615ms | 3885ms | 90th percentile |
ArguAna
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 785ms | 3627ms | Average response time |
| P50 | 768ms | 3601ms | 50th percentile (median) |
| P90 | 933ms | 4037ms | 90th percentile |
FiQA
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 610ms | 3283ms | Average response time |
| P50 | 585ms | 3209ms | 50th percentile (median) |
| P90 | 817ms | 3891ms | 90th percentile |
Business Reports
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 529ms | 3231ms | Average response time |
| P50 | 498ms | 3129ms | 50th percentile (median) |
| P90 | 675ms | 3651ms | 90th percentile |
PG
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 760ms | 3566ms | Average response time |
| P50 | 720ms | 3475ms | 50th percentile (median) |
| P90 | 896ms | 4148ms | 90th percentile |
DBpedia
| Metric | Cohere Rerank 4 Pro | Contextual AI Rerank v2 Instruct | Description |
|---|---|---|---|
| Latency Metrics | | | |
| Mean | 541ms | 3010ms | Average response time |
| P50 | 489ms | 3042ms | 50th percentile (median) |
| P90 | 729ms | 3283ms | 90th percentile |
Explore More
Compare more rerankers
See how all reranking models stack up. Compare Cohere, Jina AI, Voyage, ZeRank, and more. View comprehensive benchmarks, compare performance metrics, and find the perfect reranker for your RAG application.