Cohere Rerank 3.5 vs BAAI/BGE Reranker v2 M3

Detailed comparison between Cohere Rerank 3.5 and BAAI/BGE Reranker v2 M3. See which reranker best meets your accuracy and performance needs.

Model Comparison

Cohere Rerank 3.5 takes the lead.

Both Cohere Rerank 3.5 and BAAI/BGE Reranker v2 M3 are powerful reranking models designed to improve retrieval quality in RAG applications. However, their performance characteristics differ in important ways.
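
A reranker sits between first-stage retrieval and generation: it re-scores the retrieved candidates against the query and keeps only the most relevant. A minimal sketch of that step, where the `score` function stands in for a cross-encoder or hosted rerank API call (all names here are illustrative, not either vendor's actual API):

```python
def rerank(query, documents, score, top_n=3):
    """Re-score candidate documents against the query and keep the best.

    `score(query, doc) -> float` stands in for a reranker model call;
    higher means more relevant.
    """
    ranked = sorted(documents, key=lambda doc: score(query, doc), reverse=True)
    return ranked[:top_n]

# Toy stand-in scorer: count query terms that appear in the document.
def overlap_score(query, doc):
    return sum(term in doc.lower() for term in query.lower().split())

docs = [
    "BGE reranker model card",
    "How to bake bread",
    "Cohere rerank API reference",
]
print(rerank("cohere rerank", docs, overlap_score, top_n=2))
```

In production, `overlap_score` would be replaced by a call to the reranker's scoring endpoint; the surrounding logic stays the same.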

Why Cohere Rerank 3.5:

  • Cohere Rerank 3.5 has a 123-point higher ELO rating
  • Cohere Rerank 3.5 is 1991ms faster on average
  • Cohere Rerank 3.5 has a 13.1-percentage-point higher win rate
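
The ELO ratings behind these numbers come from pairwise comparisons: each head-to-head result nudges both models' ratings via the standard ELO update. A minimal sketch of that update (the K-factor and example ratings are illustrative, not the benchmark's actual parameters):

```python
def expected_score(rating_a, rating_b):
    # Probability that A beats B under the ELO model.
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def elo_update(rating_a, rating_b, a_won, k=32):
    # Shift both ratings toward the observed outcome; gains and losses cancel.
    exp_a = expected_score(rating_a, rating_b)
    actual_a = 1.0 if a_won else 0.0
    delta = k * (actual_a - exp_a)
    return rating_a + delta, rating_b - delta

# A 123-point gap implies roughly a 67% expected head-to-head win probability.
print(round(expected_score(1457, 1334), 2))
```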

Overview

Key metrics

| Metric | Description | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 |
| --- | --- | --- | --- |
| ELO Rating | Overall ranking quality | 1457 | 1334 |
| Win Rate | Head-to-head performance | 43.7% | 30.6% |
| Accuracy (nDCG@10) | Ranking quality metric | 0.080 | 0.084 |
| Average Latency | Response time | 392ms | 2383ms |

Visual Performance Analysis

[Charts: ELO Rating Comparison · Win/Loss/Tie Breakdown · Accuracy Across Datasets (nDCG@10) · Latency Distribution (ms)]

Breakdown

How the models stack up

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Overall Performance | | | |
| ELO Rating | 1457 | 1334 | Overall ranking quality based on pairwise comparisons |
| Win Rate | 43.7% | 30.6% | Percentage of comparisons won against other models |
| Pricing & Availability | | | |
| Price per 1M tokens | $0.050 | $0.020 | Cost per million tokens processed |
| Release Date | 2024-12-02 | 2023-09-15 | Model release date |
| Accuracy Metrics | | | |
| Avg nDCG@10 | 0.080 | 0.084 | Normalized discounted cumulative gain at position 10 |
| Performance Metrics | | | |
| Avg Latency | 392ms | 2383ms | Average response time across all datasets |
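
The nDCG@10 figures above can be reproduced from relevance labels of the top-10 ranked results: DCG discounts each result's gain by log2(rank + 1), and dividing by the DCG of the ideal ordering normalizes to [0, 1]. A small self-contained example (the relevance labels are invented for illustration):

```python
import math

def dcg(relevances):
    # Discounted cumulative gain: position i (0-based) gets gain / log2(i + 2).
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg_at_k(relevances, k=10):
    # Normalize by the DCG of the ideal (descending) ordering.
    ideal = sorted(relevances, reverse=True)
    ideal_dcg = dcg(ideal[:k])
    return dcg(relevances[:k]) / ideal_dcg if ideal_dcg > 0 else 0.0

# Hypothetical binary relevance labels for a reranker's top-10 (1 = relevant).
labels = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]
print(round(ndcg_at_k(labels, k=10), 3))
```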

Dataset Performance

By field

Comprehensive comparison of accuracy metrics (nDCG, Recall) and latency percentiles for each benchmark dataset.
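
The Mean, P50, and P90 rows below are computed from per-request latency samples: P50 is the median, and P90 is the latency that 90% of requests fall under. A quick sketch using a nearest-rank percentile (the benchmark may use an interpolating method instead, and the sample data here is invented):

```python
import math

def percentile(samples, p):
    # Nearest-rank percentile: smallest sample covering p% of all samples.
    ordered = sorted(samples)
    rank = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[rank]

# Invented per-request latency samples in milliseconds.
latencies = [250, 280, 285, 290, 300, 310, 330, 400, 520, 900]
mean = sum(latencies) / len(latencies)
print(mean, percentile(latencies, 50), percentile(latencies, 90))
```

Note how the single 900ms outlier pulls the mean well above the median, which is why P50 and P90 are reported alongside it.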

MSMARCO

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Latency Metrics | | | |
| Mean | 339ms | 2207ms | Average response time |
| P50 | 285ms | 825ms | 50th percentile (median) |
| P90 | 304ms | 1247ms | 90th percentile |

arguana

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Accuracy Metrics | | | |
| nDCG@5 | 0.267 | 0.312 | Ranking quality at top 5 results |
| nDCG@10 | 0.355 | 0.386 | Ranking quality at top 10 results |
| Recall@5 | 0.520 | 0.560 | % of relevant docs in top 5 |
| Recall@10 | 0.800 | 0.780 | % of relevant docs in top 10 |
| Latency Metrics | | | |
| Mean | 570ms | 2989ms | Average response time |
| P50 | 373ms | 1658ms | 50th percentile (median) |
| P90 | 617ms | 2279ms | 90th percentile |

FiQa

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Accuracy Metrics | | | |
| nDCG@5 | 0.124 | 0.112 | Ranking quality at top 5 results |
| nDCG@10 | 0.128 | 0.120 | Ranking quality at top 10 results |
| Recall@5 | 0.123 | 0.105 | % of relevant docs in top 5 |
| Recall@10 | 0.130 | 0.130 | % of relevant docs in top 10 |
| Latency Metrics | | | |
| Mean | 364ms | 2529ms | Average response time |
| P50 | 315ms | 1019ms | 50th percentile (median) |
| P90 | 401ms | 1649ms | 90th percentile |

business reports

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Latency Metrics | | | |
| Mean | 334ms | 2451ms | Average response time |
| P50 | 293ms | 895ms | 50th percentile (median) |
| P90 | 503ms | 1679ms | 90th percentile |

PG

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Latency Metrics | | | |
| Mean | 458ms | 2034ms | Average response time |
| P50 | 360ms | 1225ms | 50th percentile (median) |
| P90 | 615ms | 2091ms | 90th percentile |

DBPedia

| Metric | Cohere Rerank 3.5 | BAAI/BGE Reranker v2 M3 | Description |
| --- | --- | --- | --- |
| Latency Metrics | | | |
| Mean | 286ms | 2087ms | Average response time |
| P50 | 279ms | 806ms | 50th percentile (median) |
| P90 | 290ms | 1068ms | 90th percentile |

Explore More

Compare more rerankers

See how all reranking models stack up. Compare Cohere, Jina AI, Voyage, ZeRank, and more. View comprehensive benchmarks, compare performance metrics, and find the perfect reranker for your RAG application.