Available Integrations
All integrations are included in the sepurux package. Install once, capture traces automatically, then use CI to gate on reliability.
# Install the recorder for your stack
pip install sepurux
npm install @sepurux/recorder
go get github.com/sepurux/go-recorder@latest

OpenAI
Available
Wrap any openai.OpenAI client with instrument_openai. Every chat.completions.create call is recorded as an llm_call event with model, messages, output, and latency. Tool calls in the response are captured automatically.
- llm_call with model + latency
- tool_call for function calls in response
- failure on API errors
pip install sepurux openai

LangChain
Available
Pass SepuruxCallbackHandler to any LangChain chain, agent, or LLM. LLM calls, tool call/result pairs, and the final agent output are all captured without changes to your chain code.
- llm_call with model name + latency
- tool_call and tool_result pairs
- agent_output on finish
pip install sepurux langchain-core

OpenAI Agents SDK
Coming soon
Native tracing for the OpenAI Agents SDK — captures handoffs, tool use, and guardrail events from multi-agent pipelines.
CrewAI
Coming soon
Automatic trace recording for CrewAI agent crews — captures task assignments, tool executions, and crew-level outputs.
OpenAI
instrument_openai wraps any openai.OpenAI (or AsyncOpenAI) client. Pass it a recorder from client.trace() and every chat.completions.create call is captured automatically.
import os

import openai

from sepurux import SepuruxClient
from sepurux.integrations.openai import instrument_openai

client = SepuruxClient.from_env()
openai_client = openai.OpenAI()

with client.trace(
    "customer_refund_flow",
    {"ticket_id": "t-101"},
    campaign_id=os.environ["SEPURUX_CAMPAIGN_ID"],
) as trace:
    ai = instrument_openai(openai_client, recorder=trace)
    response = ai.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Classify this support ticket"}],
    )

print(trace.trace_id)
print(trace.run_id)

No code changes
Your existing chat.completions.create calls work as-is — just swap the client.
Async support
Call await ai.chat.completions.acreate(...) for AsyncOpenAI — same recording behavior.
Tool call capture
Function calls returned by the model are automatically recorded as tool_call events.
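Combining the async note above with the trace setup from the example, an AsyncOpenAI version could look like the following sketch. The trace name "customer_refund_flow_async" is illustrative; everything else uses the API shown on this page, with the acreate call behaving as described above.

```python
import asyncio
import os

import openai

from sepurux import SepuruxClient
from sepurux.integrations.openai import instrument_openai


async def classify_ticket() -> None:
    client = SepuruxClient.from_env()

    with client.trace(
        "customer_refund_flow_async",  # illustrative trace name
        {"ticket_id": "t-101"},
        campaign_id=os.environ["SEPURUX_CAMPAIGN_ID"],
    ) as trace:
        # Wrap the async client the same way as the sync one.
        ai = instrument_openai(openai.AsyncOpenAI(), recorder=trace)
        # acreate mirrors create, per the async note above.
        response = await ai.chat.completions.acreate(
            model="gpt-4o",
            messages=[{"role": "user", "content": "Classify this support ticket"}],
        )


asyncio.run(classify_ticket())
```

The recording behavior is identical to the synchronous path: each awaited call still produces one llm_call event with model and latency.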
LangChain
SepuruxCallbackHandler hooks into LangChain's callback system. Pass it via config to any chain, agent, or LLM and it records everything without changes to your chain logic.
import os

from sepurux.integrations.langchain import SepuruxCallbackHandler

handler = SepuruxCallbackHandler(
    "customer_refund_flow",
    {"ticket_id": "t-101"},
    campaign_id=os.environ["SEPURUX_CAMPAIGN_ID"],
)

# Pass to any LangChain chain, agent, or LLM
result = agent_executor.invoke(
    {"input": "Process refund for ticket t-101"},
    config={"callbacks": [handler]},
)

# Upload trace and start reliability run
handler.finish()
print(handler.trace_id)
print(handler.run_id)

# Or use as a context manager — finish() is called automatically
with SepuruxCallbackHandler("refund_flow", campaign_id="...") as handler:
    result = chain.invoke(inputs, config={"callbacks": [handler]})

Any runnable
Works with chains, agents, AgentExecutor, LLMs — anything that accepts callbacks.
Context manager
Use with SepuruxCallbackHandler(...) as handler: and finish() is called on exit.
Full event capture
LLM calls with latency, tool call/result pairs, and final agent output — all recorded.
Coming Soon
These integrations are in development. Star the repo or sign up for updates to be notified when they ship.
OpenAI Agents SDK
Capture handoffs, tool use, and guardrail events from multi-agent pipelines built on the OpenAI Agents SDK.
CrewAI
Record task assignments, tool executions, and crew-level outputs from CrewAI agent crews.
AutoGen
Trace multi-agent conversations, function calls, and code execution steps in AutoGen workflows.
Custom integration
Any framework can be instrumented manually using client.trace() and the TraceRecorder API.
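As a rough sketch of what manual instrumentation could look like: client.trace() comes from the examples above, but record_llm_call and record_tool_call are hypothetical placeholders — this page does not document the TraceRecorder method names, so consult the TraceRecorder API reference for the real signatures.

```python
import os

from sepurux import SepuruxClient

client = SepuruxClient.from_env()

with client.trace(
    "my_custom_agent",  # illustrative trace name
    {"ticket_id": "t-101"},
    campaign_id=os.environ["SEPURUX_CAMPAIGN_ID"],
) as trace:
    # Hypothetical recorder calls — placeholders for the TraceRecorder
    # API, which defines the real method names and signatures.
    trace.record_llm_call(model="gpt-4o", messages=[...], output=..., latency_ms=412)
    trace.record_tool_call(name="lookup_ticket", arguments={"ticket_id": "t-101"})
```

Whatever the exact recorder surface is, the event types it emits are the ones listed above: llm_call, tool_call/tool_result, agent_output, and failure.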
