# Instrument LlamaIndex

{% hint style="warning" %}
**Private preview.** This surface is not yet generally available. Adapters live in the Python SDK today and the rendering / configuration UX is under iteration. Contact <support@layerlens.ai> to request access to the private preview.
{% endhint %}

**When to use:** you run a LlamaIndex pipeline (RAG, query engines) and want its traces in Stratix.

## Approach 1 — SDK adapter (when available)

The Stratix SDK ships LlamaIndex adapter coverage as an optional extras group:

```bash
pip install 'layerlens[frameworks-llamaindex]'
```

Wire the LayerLens callback into your `CallbackManager`. For canonical patterns (auto-instrumentation primitives), see the [`samples/instrument/`](https://github.com/layerlens/sdk-python/tree/main/samples/instrument) directory; the LangChain instrumented sample follows the same pattern.
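As a rough sketch of what that wiring could look like, assuming a handler class exposed by the extras group (the import path and `LayerLensCallbackHandler` name are assumptions — check `samples/instrument/` for the actual names):

```python
# Assumed adapter import — the real module path and class name may differ;
# consult the samples/instrument/ directory once you have preview access.
from layerlens.frameworks.llamaindex import LayerLensCallbackHandler

# LlamaIndex's global settings accept a CallbackManager that fans events
# out to every registered handler.
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager

Settings.callback_manager = CallbackManager([LayerLensCallbackHandler()])
```

With the handler registered globally, every subsequent `query_engine.query(...)` call should emit events through it without further changes to your pipeline code.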

## Approach 2 — manual trace upload from your pipeline

Capture each query's input and output, then upload them as a trace file:

```python
import json
from pathlib import Path

from layerlens import Stratix

client = Stratix()

# Run your LlamaIndex query
question = "What did the report say about Q3?"
response = query_engine.query(question)

# Build a trace record (`data` accepts any flexible Dict[str, Any])
trace_record = {
    "input": question,
    "output": str(response),
    "spans": [
        {"name": "retrieval", "chunks": [n.node.text for n in response.source_nodes]},
        {"name": "synthesis", "text": str(response)},
    ],
}

# Write to JSONL and upload (≤50 MB per file)
Path("trace.jsonl").write_text(json.dumps(trace_record) + "\n")
result = client.traces.upload("trace.jsonl")

# Score with a faithfulness judge
client.trace_evaluations.create(trace_id=result.trace_ids[0], judge_id=FAITHFULNESS_JUDGE_ID)
```
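If you capture several queries per session, JSONL lets you batch them as one record per line in a single file and upload once. A stdlib-only sketch of building such a file (the record contents here are illustrative — in practice each record comes from one `query_engine.query()` call):

```python
import json
from pathlib import Path

# Illustrative records; in a real pipeline each dict is built from
# one query's input, output, and spans as shown above.
records = [
    {"input": "What did the report say about Q3?", "output": "Revenue grew 12%."},
    {"input": "Were any risks flagged?", "output": "Supply-chain delays."},
]

# JSONL: one JSON object per line, newline-terminated.
path = Path("traces.jsonl")
path.write_text("\n".join(json.dumps(r) for r in records) + "\n")
```

Keeping the whole session in one file stays well under the 50 MB per-file limit for typical workloads and gives you one upload call instead of one per query.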

## See also

* [Workflow: Instrument](/6.-build-wire-your-code/workflow.md)
* [Integrations](/6.-build-wire-your-code/migration.md)
* [SDK reference: traces](https://github.com/LayerLens/gitbook-full/blob/main/13-reference/sdk-python/traces.md)
* [SDK adapter samples](https://github.com/layerlens/stratix-python/tree/main/samples/instrument)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.layerlens.ai/6.-build-wire-your-code/instrument-llamaindex.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
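For example, the GET request above can be issued from Python with the standard library; the question must be URL-encoded (the question text here is only illustrative):

```python
from urllib.parse import quote
from urllib.request import urlopen  # only needed for the actual request

# Illustrative question; write your own specific, self-contained query.
question = "Does the LlamaIndex adapter capture retrieval spans automatically?"

# URL-encode the question and append it as the `ask` query parameter.
url = (
    "https://docs.layerlens.ai/6.-build-wire-your-code/instrument-llamaindex.md"
    "?ask=" + quote(question)
)

# answer = urlopen(url).read().decode("utf-8")  # network call; uncomment to run
```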
