erabot.ai

One line. Full visibility.

Drop erabot into any LLM app via proxy or SDK — no code restructuring required.

Zero Infrastructure Changes
[Diagram: your app → erabot proxy (cost tracking, erabot.ai) → LLM providers: OpenAI · Anthropic · Gemini]

One line change. Full visibility.

Add a single base_url parameter to any existing OpenAI-compatible client.

main.py
# Before
import openai
client = openai.OpenAI(api_key="sk-...")
 
# After — add base_url
import openai
client = openai.OpenAI(
    api_key="sk-...",
    base_url="https://api.erabot.ai/v1",  # ← add this
)
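Conceptually, the proxy sits between your client and the provider, reading the token usage on each response and pricing it. A minimal sketch of that accounting step (the per-million-token prices below are illustrative assumptions, not erabot's actual rate card):

```python
# Illustrative per-million-token USD prices (assumed for this sketch).
PRICES = {
    "gpt-4o-mini": {"input": 0.15, "output": 0.60},
}

def cost_of_call(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Price one completion from its usage counts, in USD."""
    p = PRICES[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

# 1,000 prompt tokens + 500 completion tokens on gpt-4o-mini:
# (1000 * 0.15 + 500 * 0.60) / 1,000,000 = 0.00045 USD
```

Because the math runs on the usage block the provider already returns, the proxy can attribute cost without inspecting your prompts.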
Python SDK

Or use the SDK decorator

Wrap any function that calls an LLM. erabot tracks every call, attributes costs to your project, and reports savings automatically.

handlers.py
from erabot import track
 
@track(project="my-chatbot")
def generate_response(prompt: str) -> str:
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
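For intuition, a call-tracking decorator like `@track` can be sketched in a few lines of plain Python. This is an illustrative stand-in, not erabot's actual implementation; a real version would report each call to the erabot service rather than store it locally:

```python
import functools
import time

def track(project: str):
    """Illustrative sketch of a call-tracking decorator (not erabot's real code)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed_ms = (time.perf_counter() - start) * 1000
            # Record one entry per call, attributed to the project.
            wrapper.calls.append({"project": project, "ms": elapsed_ms})
            return result
        wrapper.calls = []  # local stand-in for reporting to a backend
        return wrapper
    return decorator

@track(project="my-chatbot")
def echo(prompt: str) -> str:
    return prompt.upper()
```

Each decorated function keeps behaving exactly as before; the wrapper only observes timing and attribution metadata around the call.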
  • Zero config: no YAML, no env setup. Just import and decorate.
  • Per-call attribution: see exact costs broken down by function, model, and project.
  • Works with any LLM: OpenAI, Anthropic, Gemini, Mistral, Cohere, and 10+ more.
  • 15 providers supported: OpenAI, Anthropic, Google, Mistral, Cohere & more
  • <1ms proxy overhead: negligible impact; your users won't notice a thing
  • 100% OpenAI-compatible: drop-in replacement, no SDK changes needed

Ready to see it in action?

Scan your codebase in under 60 seconds. No credit card required.