v0.1 Private Beta — Now Open

The Lens Your
AI Agents Deserve

Real-time observability for autonomous AI agents — trace every LLM call, tool use, and decision branch before production breaks you.

Get Early Access See How It Works
agentixlens.com/dashboard — run:ax_9f2e84b1
Workspace Agents
Total Runs
8,241
↑ 14.2% today
Avg Latency
1.84s
↓ 0.2s vs avg
Success Rate
98.7%
↑ 0.3% this hr
Token Cost
$0.048
per run avg
Agent Execution Trace #ax_9f2e84b1
agent:init
42ms
llm:plan
820ms
tool:web_search
380ms
llm:summarize
510ms
agent:output
92ms
research-agent run #8241 ✓ 1.84s · outreach-agent run #8240 ✓ 2.01s · data-agent run #8239 ✗ timeout · research-agent token cost $0.041 · outreach-agent tool calls 4 tools · data-agent latency avg 1.6s
12ms Avg instrumentation overhead
100% Open-source SDK
Any LLM Model-agnostic tracing
2 Lines Integration to start
Features

Everything you need to
understand your agents

AgentixLens gives you a transparent window into every decision your AI agents make — in real time.

Visual Trace Explorer
See every LLM call, tool invocation, and decision branch visualized in a waterfall timeline. Drill into individual spans with full input/output inspection.
real-time
Cost & Token Tracking
Monitor token consumption and compute cost per agent run, per model, per tool call. Set budgets and receive alerts before your bill surprises you.
per-run granularity
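The accounting behind a per-run cost figure like this is simple to sketch in plain Python: multiply token counts by a per-model price table. The table and helper below are illustrative placeholders, not AgentixLens internals or real prices.

```python
# Illustrative per-run cost accounting. Prices are made-up placeholders
# expressed in dollars per 1,000 tokens.
PRICE_PER_1K = {"claude": {"in": 0.003, "out": 0.015}}

def run_cost(model: str, tokens_in: int, tokens_out: int) -> float:
    """Estimate the dollar cost of a single agent run."""
    p = PRICE_PER_1K[model]
    return round(tokens_in / 1000 * p["in"] + tokens_out / 1000 * p["out"], 6)

# e.g. 9,000 prompt tokens + 2,000 completion tokens:
cost = run_cost("claude", 9000, 2000)
```

Tracking this per model and per tool call is what lets a budget alert fire before the monthly invoice does.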
Latency Heatmaps
Identify performance bottlenecks across your agent fleet with per-step latency breakdowns and p50/p95/p99 percentile heatmaps.
percentile analysis
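Percentile breakdowns like these can be computed from raw per-step latency samples with the standard library alone. The sample values below are illustrative; in practice AgentixLens would aggregate them server-side.

```python
import statistics

# Per-step latency samples in seconds (illustrative values).
samples = [0.42, 0.81, 0.83, 0.90, 1.10, 1.35, 1.84, 2.01, 2.60, 9.70]

# statistics.quantiles with n=100 yields the 1st..99th percentiles.
qs = statistics.quantiles(samples, n=100, method="inclusive")
p50, p95, p99 = qs[49], qs[94], qs[98]

# A long tail (p99 far above p50) usually points at one slow tool
# or model call rather than uniformly slow steps.
```

Averages hide exactly the outliers a heatmap exists to surface, which is why p95/p99 matter more than the mean.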
Failure Detection & Replay
Automatically capture failed runs with full context. Replay any past run to reproduce bugs — no more "it only fails in prod" mysteries.
full-context capture
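The capture-then-replay idea can be sketched in a few lines: snapshot a failed run's exact inputs and error, then re-invoke the same function with that snapshot. This is a minimal illustration of the concept, not the AgentixLens capture format.

```python
import traceback

def capture_run(agent_fn, inputs: dict, store: list):
    """Run one agent step; on failure, record inputs + error, then re-raise."""
    try:
        return agent_fn(**inputs)
    except Exception as exc:
        store.append({
            "inputs": inputs,            # exact arguments for later replay
            "error": repr(exc),
            "stack": traceback.format_exc(),
        })
        raise

def replay(agent_fn, record: dict):
    """Re-run a captured failure with the identical inputs."""
    return agent_fn(**record["inputs"])
```

Because the replay uses the same inputs that triggered the failure, a bug that "only fails in prod" becomes reproducible at your desk.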
Multi-Agent Graph View
For orchestrated agent pipelines, see the full call graph — which agent spawned which, where messages flowed, and where orchestration broke.
multi-agent
Model-Agnostic SDK
Works with any LLM — Claude, GPT, Gemini, Llama, Mistral, or local models. One SDK, zero vendor lock-in, open telemetry compatible.
any model
How It Works

From zero to full visibility
in under 5 minutes

Instrument once, observe forever. No agent re-architecting required.

01 // INSTALL
Add the SDK
One pip install, one import. AgentixLens wraps your existing agent code with zero-config telemetry capture.
02 // INSTRUMENT
Wrap Your Agent
Decorate your agent entrypoint. The SDK auto-captures every LLM call, tool use, input, output, and metadata.
03 // SHIP
Deploy as Normal
Your agent runs normally. AgentixLens streams traces to the dashboard with under 12ms overhead per call.
04 // OBSERVE
See Everything
Open the dashboard. Watch live traces, costs, latency, failures — and finally understand what your agents actually do.
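What a @trace-style decorator records per call can be sketched in plain Python. This is a simplified, synchronous illustration of the idea behind step 02, not the AgentixLens implementation (which also handles async code and streams spans to the dashboard).

```python
import functools
import time

SPANS: list[dict] = []  # stand-in for a backend; the real SDK streams these

def trace(name: str):
    """Record a span (name, duration, output) around each decorated call."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            SPANS.append({
                "name": name,
                "duration_ms": (time.perf_counter() - start) * 1000,
                "output": result,
            })
            return result
        return inner
    return wrap

@trace("research-agent")
def run_agent(query: str) -> str:
    return query.upper()  # stand-in for the real agent body

run_agent("hello")
```

The wrapped function's behavior is unchanged; the decorator only observes, which is why the instrumentation overhead stays in the low milliseconds.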
Integration

Two lines to full
observability

AgentixLens is built for developers — minimal config, maximum insight. Works with LangChain, AutoGen, CrewAI, custom agents, and anything that calls an LLM.

  • Works with LangChain, AutoGen, CrewAI, LlamaIndex
  • Open-source SDK, self-hostable backend
  • OpenTelemetry compatible export
  • Zero PII — traces stay in your infra
  • Async-safe, thread-safe, production-ready
Python · agent.py
from agentixlens import lens, trace

# Initialize — points to your dashboard
lens.init(project="my-agent")

# Wrap your agent — that's it!
@trace("research-agent")
async def run_agent(query: str):
    # Your existing agent code — unchanged
    plan = await llm.call(query)
    results = await tool_search(plan)
    answer = await llm.summarize(results)
    return answer

# AgentixLens auto-captures:
# → every llm.call() input + output
# → tool latency & return values
# → token count + estimated cost
# → errors with full stack context
Limited beta spots remaining

Stop flying blind.
Start using the Lens.

Join 800+ developers on the waitlist. Free during beta.

// no credit card · no spam · open-source SDK included