Getting Started
StackLens is an observability and governance platform for AI systems. It gives your team visibility into every LLM call, version control over prompts, and policy enforcement for compliance.
What you get
- StackTrace — trace every LLM call, agent run, and RAG pipeline. See model, tokens, cost, and latency in real time.
- FlowOps — version control, environments, and A/B testing for prompts. Manage prompts as engineering artifacts, not hardcoded strings.
- GovernAI — PII detection, policy-as-code enforcement, and EU AI Act compliance reporting.
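To illustrate the kind of per-call accounting StackTrace surfaces from token counts, here is a minimal cost-estimation sketch. The per-token prices are placeholder assumptions for illustration, not actual provider rates:

```python
# Placeholder per-1M-token prices (USD) -- illustrative assumptions, not real rates.
PRICES = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a single LLM call from its token counts."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# A call with 150 input tokens and 200 output tokens:
cost = estimate_cost("gpt-4o", 150, 200)
```

StackTrace performs this kind of aggregation automatically across every traced call, so you see cost alongside tokens and latency in the dashboard.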
Choose your setup
Cloud (managed)
Sign up at app.getstacklens.ai to get started immediately. No infrastructure to manage.
- Create an account and organization.
- Generate an API key under Settings → API Keys.
- Install the SDK and send your first trace.
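For step 2, a common pattern is to keep the API key out of source control by reading it from an environment variable. A minimal sketch (the `STACKLENS_API_KEY` variable name is a hypothetical convention, not something the platform mandates):

```python
import os

def load_api_key(env_var: str = "STACKLENS_API_KEY") -> str:
    """Read the StackLens API key from the environment so it never lands in source control."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before configuring the SDK")
    return key
```

You would then pass the result to `stacklens.configure(api_key=load_api_key())` instead of hardcoding the key.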
Self-hosted
Run StackLens on your own infrastructure. See the self-hosting guide.
Install the Python SDK
```shell
pip install stacklens
```
Send your first trace
```python
import stacklens

stacklens.configure(api_key="sl-xxxx")  # from Settings → API Keys

stacklens.trace(
    "my-first-trace",
    model="gpt-4o",
    provider="openai",
    input_tokens=150,
    output_tokens=200,
)
```
Open the StackLens dashboard — your trace will appear in StackTrace within seconds.
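The trace above reports model and token counts; latency can be captured by timing the provider call yourself. A minimal sketch, where `call_model` is a stand-in for your real provider call and only `stacklens.trace` from the snippet above is assumed (the import is guarded so the sketch runs even without the SDK installed):

```python
import time

def call_model(prompt: str) -> dict:
    # Stand-in for a real provider call (e.g. an OpenAI chat completion).
    return {"text": "ok", "input_tokens": len(prompt.split()), "output_tokens": 1}

def traced_call(prompt: str):
    """Call the model, measure wall-clock latency, and report the trace."""
    start = time.perf_counter()
    result = call_model(prompt)
    latency_ms = (time.perf_counter() - start) * 1000

    try:
        import stacklens
        # Mirrors the snippet above; whether the SDK accepts a latency
        # field is not shown here -- check the Python SDK reference.
        stacklens.trace(
            "timed-call",
            model="gpt-4o",
            provider="openai",
            input_tokens=result["input_tokens"],
            output_tokens=result["output_tokens"],
        )
    except ImportError:
        pass  # sketch still runs without the SDK
    return result, latency_ms
```

In practice you would replace `call_model` with your actual provider client and reuse the measured `latency_ms` wherever you record metrics.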
Next steps
- StackTrace — full tracing guide
- FlowOps — prompt version control
- GovernAI — governance and compliance
- Python SDK reference
- Self-hosting