# What is BB AI SDK?
BB AI SDK is a Python package published on Backbase Artifactory (repo.backbase.com) that connects your agentic applications to platform services (AI Gateway, Observability) with minimal code. It enables you to build production-ready AI agents with enterprise-grade features while maintaining complete framework flexibility.
The SDK acts as a bridge between any agentic framework (Agno, LangChain, LangGraph, or custom) and Backbase platform services, requiring just a few lines of code to get started.
```python
from bb_ai_sdk.ai_gateway import AIGateway
from bb_ai_sdk.observability import init

# Initialize observability
init(agent_name="customer-support")

# Create AI gateway client
gateway = AIGateway.create(
    model_id="gpt-4o",
    agent_id="550e8400-e29b-41d4-a716-446655440000",
)

# Use standard OpenAI interface
response = gateway.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
## Key features

- OpenAI-Compatible: Works with any framework that supports the OpenAI SDK
- Automatic Authentication: Handles API keys and agent ID validation
- Built-in Observability: Automatic tracing with Langfuse via OpenTelemetry
- Framework-Independent: Native support for LangChain, LangGraph, and Agno
- Zero Vendor Lock-in: Uses standard APIs - your code is fully portable
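Because the gateway mirrors the OpenAI SDK's `chat.completions.create` interface, application code can be written once against that shape and run unchanged with any compatible client. A minimal sketch of the idea - the stub below is an illustrative stand-in for any OpenAI-compatible client, not part of the SDK:

```python
from types import SimpleNamespace

class StubCompletions:
    def create(self, model, messages):
        # A real client would call the model; this stub echoes the last message.
        message = SimpleNamespace(content=f"echo: {messages[-1]['content']}")
        return SimpleNamespace(choices=[SimpleNamespace(message=message)])

class StubClient:
    # Same attribute shape as the OpenAI SDK client: client.chat.completions.create(...)
    chat = SimpleNamespace(completions=StubCompletions())

def ask(client, prompt):
    """Runs against any client exposing the OpenAI chat-completions shape."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(ask(StubClient(), "Hello!"))  # -> echo: Hello!
```

The same `ask` function would work with an `AIGateway` client or the plain OpenAI SDK client, since all of them expose the same call path.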
## Why BB AI SDK?

### Framework flexibility

Use your preferred agentic AI framework:
- ✅ LangChain: Full support with adapters
- ✅ LangGraph: Native async integration
- ✅ Agno: Direct client compatibility
- ✅ Custom: OpenAI SDK interface works anywhere
### Enterprise features

Production-ready capabilities, built in:

- 🔐 Multi-model AI Gateway with content safety filters and policies
- 📊 Observability with Langfuse via OpenTelemetry
- 🏢 Multi-tenant context tracking
- 💰 Cost tracking per organization
- 🔍 Distributed tracing across services
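Cost tracking per organization boils down to aggregating token usage by tenant and pricing it. The sketch below is a conceptual illustration of that idea only - the prices and function names are hypothetical, not the SDK's implementation:

```python
from collections import defaultdict

# Hypothetical per-1K-token prices, purely for illustration.
PRICE_PER_1K = {"prompt": 0.0025, "completion": 0.01}

def record_usage(ledger, organization_id, prompt_tokens, completion_tokens):
    """Accumulate estimated spend for one call under its organization."""
    cost = (prompt_tokens / 1000) * PRICE_PER_1K["prompt"] \
         + (completion_tokens / 1000) * PRICE_PER_1K["completion"]
    ledger[organization_id] += cost
    return ledger

ledger = defaultdict(float)
record_usage(ledger, "org_123", prompt_tokens=2000, completion_tokens=1000)
record_usage(ledger, "org_456", prompt_tokens=500, completion_tokens=500)
print(dict(ledger))  # -> {'org_123': 0.015, 'org_456': 0.00625}
```

In the platform this bookkeeping happens for you: the `organization_id` passed to observability `init` lets usage be attributed per tenant automatically.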
### Zero vendor lock-in
Your code remains portable and future-proof:
- Uses standard OpenAI SDK interface
- OpenTelemetry for observability (switch backends anytime)
- Framework-native objects - no custom abstractions
- Works with or without the platform
## Quick examples

### Basic usage

```python
from bb_ai_sdk.ai_gateway import AIGateway

gateway = AIGateway.create(
    model_id="gpt-4o",
    agent_id="550e8400-e29b-41d4-a716-446655440000",
)

response = gateway.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "What is AI?"},
    ],
)

print(response.choices[0].message.content)
```
### With LangChain

```python
from bb_ai_sdk.ai_gateway import AIGateway
from bb_ai_sdk.ai_gateway.adapters.langchain import to_langchain
from langchain_core.output_parsers import StrOutputParser

gateway = AIGateway.create(
    model_id="gpt-4o",
    agent_id="550e8400-e29b-41d4-a716-446655440000",
)

# Convert to a LangChain model
model = to_langchain(gateway)

# Use with LangChain components
chain = model | StrOutputParser()
response = chain.invoke("Tell me a joke")
print(response)
```
### With LangGraph

```python
from typing import TypedDict

from langgraph.graph import StateGraph, END
from bb_ai_sdk.ai_gateway import AsyncAIGateway
from bb_ai_sdk.ai_gateway.adapters.langchain import to_langchain_async

# State schema for the graph (StateGraph requires one)
class AgentState(TypedDict):
    messages: list

gateway = AsyncAIGateway.create(
    model_id="gpt-4o",
    agent_id="550e8400-e29b-41d4-a716-446655440000",
)

model = to_langchain_async(gateway)

async def generate(state: AgentState):
    response = await model.ainvoke(state["messages"])
    return {"messages": [response]}

graph = StateGraph(AgentState)
graph.add_node("generate", generate)
graph.add_edge("generate", END)
graph.set_entry_point("generate")
app = graph.compile()
```
### With observability

```python
from bb_ai_sdk.observability import init
from bb_ai_sdk.ai_gateway import AIGateway

# Initialize observability
init(
    agent_name="my-agent",
    organization_id="org_123",
)

# Gateway calls are automatically traced
gateway = AIGateway.create(
    model_id="gpt-4o",
    agent_id="550e8400-e29b-41d4-a716-446655440000",
)

response = gateway.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

# View traces in the Langfuse dashboard
```