Drop erabot into any LLM app via proxy or SDK — no code restructuring required.
Add a single base_url parameter to any existing OpenAI-compatible client.
```python
# Before
import openai
client = openai.OpenAI(api_key="sk-...")

# After — add base_url
import openai
client = openai.OpenAI(
    api_key="sk-...",
    base_url="https://api.erabot.ai/v1",  # ← add this
)
```
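Conceptually, the one-line change works because the proxy is OpenAI-compatible: only the host and path prefix change, while endpoint paths and payloads stay the same. A minimal sketch of that URL rewrite (the `reroute` helper and its logic are illustrative, not part of erabot):

```python
from urllib.parse import urlsplit, urlunsplit

def reroute(url: str, new_base: str = "https://api.erabot.ai/v1") -> str:
    """Illustrative only: swapping base_url changes the host/prefix
    while the endpoint path (e.g. /chat/completions) is preserved."""
    base = urlsplit(new_base)
    old = urlsplit(url)
    # Keep everything after the original /v1 prefix unchanged.
    suffix = old.path.split("/v1", 1)[-1]
    return urlunsplit((base.scheme, base.netloc, base.path + suffix, old.query, ""))

print(reroute("https://api.openai.com/v1/chat/completions"))
# → https://api.erabot.ai/v1/chat/completions
```

In practice the SDK does this routing for you; the sketch only shows why no other code has to change.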
Wrap any function that calls an LLM. erabot tracks every call, attributes costs to your project, and reports savings automatically.
```python
from erabot import track

@track(project="my-chatbot")
def generate_response(prompt: str) -> str:
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
```
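To make the decorator pattern concrete, here is a simplified local sketch of how a `@track`-style decorator can attribute calls to a project. This is a hypothetical stand-in, not erabot's implementation, which additionally attributes costs and reports savings on the server side:

```python
import functools
import time

def track(project: str):
    """Hypothetical sketch: record project, function name, and latency
    for every call to the wrapped function."""
    calls = []

    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            calls.append({
                "project": project,
                "function": fn.__name__,
                "latency_s": time.perf_counter() - start,
            })
            return result
        wrapper.calls = calls  # expose recorded calls for inspection
        return wrapper
    return decorator

@track(project="my-chatbot")
def echo(prompt: str) -> str:
    return prompt.upper()

echo("hello")
print(echo.calls[0]["project"])  # → my-chatbot
```

The wrapped function behaves exactly as before; the decorator only observes each call, which is why no restructuring of the calling code is needed.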