# Compare models

The Compare models feature is Stratix Public's flagship: pick any two models and see them head-to-head across every benchmark either one has been evaluated on.

URL: [`stratix.layerlens.ai/compare-models`](https://stratix.layerlens.ai/compare-models)

## What you can do

* Pick any two models from the catalog
* See a side-by-side score table across all shared benchmarks
* Drill into a single benchmark for a deeper comparison
* See where each model wins, ties, and loses
* Bookmark the comparison URL — it's stable

## How it's structured

The compare-models page has three regions:

1. **Selector** — search and pick model A and model B
2. **Score table** — one row per benchmark, model-A score, model-B score, winner
3. **Detail view** — click any benchmark to expand: raw scores, sample inputs, sample outputs

## How comparisons render

For each benchmark:

* The **stronger** model gets a green check next to its score
* The **weaker** model gets its score shown plain
* A **tie** is shown in gray
* The **gap** is shown both as a raw number and as a percentage

When the gap is small, the page shows a confidence band indicating whether the difference is statistically meaningful given the benchmark's sample size.
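The rendering rules above can be sketched in code. This is a hypothetical illustration, not Stratix's actual implementation: the function name `compare_scores`, the assumption that scores are accuracy-style values in [0, 1], and the use of a two-proportion standard error for the confidence band are all my own choices, since the page's real statistics are not published.

```python
import math

def compare_scores(score_a: float, score_b: float, n: int, z: float = 1.96) -> dict:
    """Sketch of a per-benchmark comparison: winner, gap, and significance.

    Assumes score_a and score_b are accuracies in [0, 1], each measured
    over the same n independent benchmark samples (an assumption).
    """
    gap = score_a - score_b
    if gap > 0:
        winner = "A"
    elif gap < 0:
        winner = "B"
    else:
        winner = "tie"

    # Gap as a percentage, relative to the weaker model's score.
    weaker = min(score_a, score_b)
    pct_gap = abs(gap) / weaker * 100 if weaker > 0 else float("inf")

    # Confidence band: standard error of a difference of two proportions.
    # The difference counts as meaningful if it exceeds z standard errors.
    se = math.sqrt(score_a * (1 - score_a) / n + score_b * (1 - score_b) / n)
    significant = abs(gap) > z * se

    return {
        "winner": winner,
        "gap": round(abs(gap), 4),
        "pct_gap": round(pct_gap, 2),
        "significant": significant,
    }
```

With a large sample, a 10-point gap clears the band easily; with only 100 samples, a 1-point gap does not, which is exactly the situation the confidence band is there to flag.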

## Three or more models

The Premium **Evaluations** surface supports comparing many models on your own dataset; the public Compare-models page is intentionally pairwise, which matches how most users actually decide between models.

## Common patterns

* **Frontier vs cost-optimized** — the same provider's flagship vs a smaller, cheaper sibling
* **Provider against provider** — OpenAI's flagship vs Anthropic's vs Google's
* **Old vs new generation** — when a new model drops, immediately compare against its predecessor

## Where to next

* [Premium — Evaluations](/8.-evaluate-score-the-outputs/evaluations.md) — compare models on your data
* [Models catalog](/5.-select-pick-the-model/models-catalog.md)
* [Use case: Model evaluation](/4.1-general-use-cases/model-evaluation.md)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.layerlens.ai/5.-select-pick-the-model/compare-models.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
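As a minimal sketch, the GET request above can be issued with Python's standard library. The page URL comes straight from this document; the helper names (`build_ask_url`, `ask_docs`) are my own, and the response is treated as plain text since its exact format is not specified here beyond "a direct answer plus excerpts and sources."

```python
import urllib.parse
import urllib.request

DOC_URL = "https://docs.layerlens.ai/5.-select-pick-the-model/compare-models.md"

def build_ask_url(question: str) -> str:
    """Build the query URL: the question goes in the `ask` parameter,
    percent-encoded so spaces and punctuation survive the trip."""
    return f"{DOC_URL}?ask={urllib.parse.quote(question)}"

def ask_docs(question: str, timeout: float = 10.0) -> str:
    """Perform the GET request and return the response body as text."""
    with urllib.request.urlopen(build_ask_url(question), timeout=timeout) as resp:
        return resp.read().decode("utf-8")
```

Per the guidance above, keep the question specific and self-contained, e.g. `ask_docs("Which benchmarks appear in the compare-models score table?")` rather than a bare keyword.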
