# BYOK models

Stratix supports **OpenAI-compatible endpoints** as custom models. Register your endpoint, store the API key encrypted in the BYOK vault, and Stratix treats the model as first-class: usable in evaluations, comparisons, scorers, judges, and trace evaluations.

## Why BYOK

* You're running a model on your own infrastructure or with a provider of your choice (vLLM, Together AI, Fireworks AI, Nebius, SambaNova, or a custom endpoint)
* You want token consumption to bill against your own provider account
* You want the same dashboard for public and private models

## What "OpenAI-compatible" means

The endpoint exposes the OpenAI Chat Completions API and, where supported, the Embeddings API at a base URL. Stratix's evaluation engine sends requests in the OpenAI shape and parses responses in that same shape.
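As a concrete illustration, a minimal sketch of what "OpenAI shape" means on the wire, using only the Python standard library. The base URL, key, and model name below are placeholders, not real Stratix or provider values:

```python
import json
import urllib.request

def build_request(base_url: str, api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-shaped chat completion request against a compatible endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",   # standard OpenAI auth header
            "Content-Type": "application/json",
        },
    )

def parse_response(body: dict) -> str:
    """OpenAI-shaped responses carry the text under choices[0].message.content."""
    return body["choices"][0]["message"]["content"]

# Example request (placeholder endpoint and model):
req = build_request(
    "https://my-vllm.example.com/v1",
    "sk-example",
    "my-model",
    [{"role": "user", "content": "Hello"}],
)
```

Any endpoint that accepts this request and returns a `choices[0].message.content` response works as a BYOK custom model.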

## Two billing surfaces

When evaluating with a BYOK custom model:

* **Your provider** bills the **token consumption** (what your endpoint actually does)
* **Stratix ECU** bills the **platform work** (judge calls, trace evaluation engine, etc.)

## Setup

In Premium, open **Models → Register custom model** and provide:

1. Endpoint URL
2. API key (encrypted in the BYOK vault)
3. Model identifier
4. Capability tags

Stratix probes the endpoint and confirms it responds.
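The four fields above can be pictured as a registration payload. This is an illustrative sketch only; the field names and values below are assumptions, not the actual Stratix API schema:

```python
# Hypothetical registration payload mirroring the four setup fields.
# Field names are illustrative; the real Stratix API may differ.
registration = {
    "endpoint_url": "https://my-vllm.example.com/v1",  # 1. endpoint URL
    "api_key": "sk-example",                           # 2. stored encrypted in the BYOK vault
    "model_id": "my-org/my-model",                     # 3. model identifier
    "capabilities": ["chat", "embeddings"],            # 4. capability tags
}
```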

## Security

* Keys are encrypted at rest and never returned via the API; only a masked prefix is shown
* Org-scoped: only members with `models:write` can register or rotate keys
* Key rotation is supported
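The "masked prefix" behavior can be sketched as follows. This is an illustrative masking function, assuming a fixed-width mask so the full key length is not leaked; it is not Stratix's actual implementation:

```python
def mask_key(key: str, visible: int = 6) -> str:
    """Show only a short prefix of the key, hiding the rest behind a fixed-width mask."""
    return key[:visible] + "****"

# A registered key is only ever displayed in this masked form.
```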

## Tested providers

* vLLM (self-hosted)
* Together AI
* Fireworks AI
* Nebius
* SambaNova Cloud
* MLflow gateways
* Custom endpoints behind a reverse proxy

## Where to next

* [BYOK custom models (Premium)](/5.-select-pick-the-model/byok-custom-models.md)
* [Models](/5.-select-pick-the-model/models.md)
* [Integrations](https://github.com/LayerLens/gitbook-full/blob/main/05-select/catalog/README.md)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.layerlens.ai/5.-select-pick-the-model/byok-models.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
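The GET request above can be built with the Python standard library; the only detail worth noting is that the question must be percent-encoded. A minimal sketch:

```python
from urllib.parse import urlencode

def ask_url(page_url: str, question: str) -> str:
    """Append the `ask` query parameter, percent-encoding the question text."""
    return f"{page_url}?{urlencode({'ask': question})}"

# Example: turn a natural-language question into the documented GET URL.
url = ask_url(
    "https://docs.layerlens.ai/5.-select-pick-the-model/byok-models.md",
    "How are API keys rotated?",
)
```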
