# Integrations landing

Stratix integrates with your AI stack at three layers:

1. **Platform integrations** — configurable from **Premium → Settings → Integrations**
2. **SDK auto-instrumentation** — drop-in instrumentation for LLM providers via `layerlens.instrument`
3. **CI/CD and BYOK** — operational integrations

## Platform integrations

Configurable from **Premium → Settings → Integrations**:

* **Langfuse** — connect an existing Langfuse instance to ingest trace data into your Stratix workspace
* **BYOK custom models** — register OpenAI-compatible endpoints (also surfaced under the Models page) — see [BYOK custom models](/5.-select-pick-the-model/byok-custom-models.md)

## SDK auto-instrumentation

The `stratix-python` SDK's `layerlens.instrument` package wraps LLM provider clients to emit traces automatically.

**Production adapters with live integration tests:**

* [OpenAI](/10.-integrate-connect-your-stack/provider-openai.md) — chat completions, embeddings, tool-use, streaming
* [Anthropic](/10.-integrate-connect-your-stack/provider-anthropic.md) — Messages API, multi-turn, cache metadata, streaming

**Available via SDK extras (drop-in instrumentation):**

| Provider         | Install                                            |
| ---------------- | -------------------------------------------------- |
| OpenAI           | `pip install 'layerlens[providers-openai]'`        |
| Anthropic        | `pip install 'layerlens[providers-anthropic]'`     |
| Azure OpenAI     | `pip install 'layerlens[providers-azure-openai]'`  |
| AWS Bedrock      | `pip install 'layerlens[providers-bedrock]'`       |
| Google Vertex AI | `pip install 'layerlens[providers-google-vertex]'` |
| Cohere           | `pip install 'layerlens[providers-cohere]'`        |
| Mistral          | `pip install 'layerlens[providers-mistral]'`       |
| Ollama           | `pip install 'layerlens[providers-ollama]'`        |
| LiteLLM          | `pip install 'layerlens[providers-litellm]'`       |
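Each of these adapters works the same way conceptually: it wraps the provider client's call method so that every request records a trace. The snippet below is a self-contained sketch of that monkey-patching pattern, not the SDK's actual API — `FakeClient`, `instrument`, and `TRACES` are all illustrative stand-ins; see `samples/integrations/` for the canonical calls.

```python
import functools
import time

class FakeClient:
    """Stand-in for a provider client (illustrative only)."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

TRACES: list[dict] = []

def instrument(client_cls):
    """Wrap client_cls.complete so every call records a trace."""
    original = client_cls.complete

    @functools.wraps(original)
    def traced(self, prompt: str) -> str:
        start = time.perf_counter()
        result = original(self, prompt)
        TRACES.append({
            "method": "complete",
            "prompt": prompt,
            "latency_s": time.perf_counter() - start,
        })
        return result

    client_cls.complete = traced
    return client_cls

instrument(FakeClient)
out = FakeClient().complete("hello")
```

Because the patch happens at the class level, application code keeps calling the client exactly as before — which is what makes the real adapters "drop-in."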

For canonical sample code, see [`samples/integrations/`](https://github.com/layerlens/stratix-python/tree/main/samples/integrations) — manual trace upload (`openai_traced.py`, `anthropic_traced.py`) and zero-code auto-instrumentation (`openai_instrumented.py`, `langchain_instrumented.py`).

For frameworks beyond what the canonical SDK samples cover, the [`samples/instrument/`](https://github.com/layerlens/stratix-python/tree/main/samples/instrument) directory and SDK `pyproject.toml` extras list the adapter surface that ships with the SDK.

## Auto-instrumentation primitives

For pipelines without a dedicated provider adapter, use the manual primitives:

```python
from layerlens.instrument import trace, span

@trace
def run_pipeline(query: str):
    with span("retrieval"):
        chunks = retrieve(query)
    with span("llm-call"):
        return llm_call(query, chunks)
```

## CI/CD

* [GitHub Actions](/6.-build-wire-your-code/cicd-github-actions.md) — drop-in workflow file ships in the SDK at `samples/cicd/github_actions_gate.yml`
* [GitLab CI](/6.-build-wire-your-code/cicd-gitlab.md)
* [Buildkite](/6.-build-wire-your-code/cicd-buildkite.md)
* [Containerized CI (Dockerfile)](/6.-build-wire-your-code/containerized-ci.md)
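Whatever the CI platform, a quality gate reduces to one step: compare an evaluation score against a threshold and fail the job otherwise. A minimal sketch of that step — the threshold, score source, and function name are illustrative, not part of the shipped workflow files:

```python
import sys

def gate(score: float, threshold: float = 0.8) -> bool:
    """Return True when the evaluation score clears the threshold."""
    return score >= threshold

if __name__ == "__main__":
    # In CI, the score would come from an evaluation run; 0.92 is a stand-in.
    score = 0.92
    if not gate(score):
        print(f"quality gate failed: {score} below threshold")
        sys.exit(1)  # a nonzero exit fails the CI job
    print(f"quality gate passed: {score}")
```

The nonzero exit code is the only contract the CI platform needs, which is why the same gate script works unchanged across GitHub Actions, GitLab CI, and Buildkite.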

## Notifications

Notifications are configured in **Premium → Settings → Notifications**. In-app and email subscriptions support:

* Evaluation completion
* Trace evaluation threshold crossing
* GEPA optimization completion
* ECU low-balance warnings
* Org events (invitations, role changes)
* Billing events

To route notifications to Slack or PagerDuty, wire your own webhooks from notification events into your chat or alerting platform. See [Cookbook: Slack notification routing](/6.-build-wire-your-code/integration-slack.md) and [PagerDuty escalation](/6.-build-wire-your-code/integration-pagerduty.md) for worked patterns.

## Partner clouds (BYOK / OpenAI-compatible)

Any OpenAI-compatible endpoint works via `client.models.create_custom(...)`:

* [Fireworks AI](https://github.com/LayerLens/gitbook-full/blob/main/10-integrate/partner-fireworks.md)
* [Nebius](https://github.com/LayerLens/gitbook-full/blob/main/10-integrate/partner-nebius.md)
* [SambaNova](https://github.com/LayerLens/gitbook-full/blob/main/10-integrate/partner-sambanova.md)
* [MLflow gateway](https://github.com/LayerLens/gitbook-full/blob/main/10-integrate/partner-mlflow.md)
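Registering any of these endpoints via `client.models.create_custom(...)` amounts to supplying the endpoint's base URL, a model identifier, and credentials. The field names below are assumptions for illustration — check the BYOK custom models page for the real parameters — and the helper only validates the payload shape:

```python
def build_custom_model_payload(name: str, base_url: str, model_id: str,
                               api_key_env: str) -> dict:
    """Assemble a registration payload for an OpenAI-compatible endpoint.

    Field names here are illustrative, not the documented
    `create_custom` signature.
    """
    if not base_url.startswith("https://"):
        raise ValueError("base_url must be HTTPS")
    return {
        "name": name,
        "base_url": base_url.rstrip("/"),
        "model_id": model_id,
        "api_key_env": api_key_env,  # reference a secret, never inline a key
    }

# Example values are placeholders, not a verified Fireworks configuration.
payload = build_custom_model_payload(
    name="fireworks-llama",
    base_url="https://api.fireworks.ai/inference/v1/",
    model_id="accounts/fireworks/models/llama-example",
    api_key_env="FIREWORKS_API_KEY",
)
```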

## Where to next

* [SDK auto-instrumentation samples](https://github.com/layerlens/stratix-python/tree/main/samples/integrations)
* [BYOK custom models](/5.-select-pick-the-model/byok-custom-models.md)
* [Workflow: Instrument](/9.-improve-tune-the-system/workflow.md)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.layerlens.ai/10.-integrate-connect-your-stack/10-integrate.md?ask=<question>
```
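The same query can be issued programmatically. A minimal sketch using only the standard library to build the request URL — the actual GET is omitted since it needs network access:

```python
from urllib.parse import urlencode

BASE = "https://docs.layerlens.ai/10.-integrate-connect-your-stack/10-integrate.md"

def ask_url(question: str) -> str:
    """Build the documentation-query URL for a natural-language question."""
    return f"{BASE}?{urlencode({'ask': question})}"

url = ask_url("How do I instrument an Anthropic client?")
```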

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
