Installation Guide
Configure your project to pull from Backbase Artifactory and install the SDK.
Quick Setup
1. Configure Environment Variables
Create a `.env` file with your credentials:
.env
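The original contents of this block did not survive extraction. The sketch below shows the general shape of such a file; the `LANGFUSE_*` variables follow LangFuse's standard naming, while the `BACKBASE_*` names are placeholders, not confirmed SDK keys — check the SDK reference for the exact variable names.

```shell
# Placeholder keys -- confirm the exact names in the SDK reference.
BACKBASE_AI_GATEWAY_URL=https://your-gateway.example.com
BACKBASE_AGENT_ID=your-agent-id

# Standard LangFuse credentials for the observability integration.
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_HOST=https://cloud.langfuse.com
```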
2. Create Your First Agent
Create `agent.py` with AI Gateway and Observability:
agent.py
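The original code block for this step was also lost. The sketch below captures only the pattern the surrounding text describes — initialize observability first, then create the gateway, then make a call. Every import, class, and function name here (`backbase_ai_sdk`, `observability.init`, `AIGateway`, `invoke`) is an assumption, not the SDK's confirmed API; consult the SDK reference for the real names.

```python
# Sketch only: module, class, and method names below are assumptions,
# not the SDK's documented API.
from backbase_ai_sdk import observability
from backbase_ai_sdk.gateway import AIGateway

# 1. Initialize observability FIRST so gateway calls are auto-traced.
observability.init()

# 2. Create the gateway; credentials are picked up from the .env file.
gateway = AIGateway()

# 3. Make a test LLM call; the trace should appear in LangFuse automatically.
response = gateway.invoke("Hello from my first agent!")
print(response)
```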
3. Run Your Agent
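Assuming the file from the previous step is saved as `agent.py`, run it from the project root:

```shell
python agent.py
```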
If you see a response, you’ve successfully connected to the AI Gateway! Check your LangFuse dashboard to see the trace.
What Happens Next?
When you initialize observability before creating the AI Gateway, all LLM calls are automatically traced, with no additional code required. This integration gives you:
- AI Gateway: Routes your LLM calls through the Backbase AI Platform with automatic authentication, agent ID validation, and policy enforcement
- Automatic Tracing: Every gateway call appears in LangFuse with full context: tokens, latency, cost, and request/response data
- Zero Configuration: The SDK handles instrumentation automatically; just initialize observability first
The observability module auto-instruments AI Gateway calls. Initialize `init()` before creating gateway instances to ensure all calls are captured from the start.

Customize Your Setup
- AI Gateway: Configure streaming, tools, framework adapters, error handling, and advanced client options
- Observability: Add custom tracing, configure backends (Datadog, Grafana), set up framework callbacks, and tune export settings