1. Proxy Calls (/v1/...)

For POST /v1/chat/completions and POST /v1/completions:
  • X-API-Key: lt_... (TraceLM project key)
  • Authorization: Bearer <provider_key>
  • Optional X-Provider: openai | anthropic | google (defaults to openai)
Example:
curl -X POST "https://api.tracelm.ai/v1/chat/completions" \
  -H "X-API-Key: $TRACELM_API_KEY" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "X-Provider: openai" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]}'
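The same call can be sketched in Python. The header names and the endpoint URL come from the curl example above; the helper function name (`build_proxy_headers`) is illustrative, not part of any TraceLM SDK.

```python
import os

def build_proxy_headers(tracelm_key, provider_key, provider="openai"):
    """Assemble the auth headers expected by proxy calls (/v1/...)."""
    return {
        "X-API-Key": tracelm_key,                   # TraceLM project key (lt_...)
        "Authorization": f"Bearer {provider_key}",  # upstream provider key
        "X-Provider": provider,                     # openai | anthropic | google
        "Content-Type": "application/json",
    }

# Usage with the requests library (uncomment to send a real request):
# import requests
# resp = requests.post(
#     "https://api.tracelm.ai/v1/chat/completions",
#     headers=build_proxy_headers(os.environ["TRACELM_API_KEY"],
#                                 os.environ["OPENAI_API_KEY"]),
#     json={"model": "gpt-4o-mini",
#           "messages": [{"role": "user", "content": "Hello"}]},
# )
```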

2. Product APIs (/api/v1/...)

Most management/reporting endpoints use JWT auth:
  • Authorization: Bearer <access_token>
Some task/conversation endpoints accept either JWT or X-API-Key.
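Because some endpoints take a JWT and others accept either credential, a client can pick the auth header per call. A minimal sketch (the helper name and the either/or validation are assumptions, not a TraceLM API):

```python
def product_api_headers(access_token=None, api_key=None):
    """Build auth headers for /api/v1/... endpoints.

    Pass access_token for JWT-only management/reporting endpoints, or
    api_key for task/conversation endpoints that accept X-API-Key.
    Exactly one credential must be supplied.
    """
    if (access_token is None) == (api_key is None):
        raise ValueError("pass exactly one of access_token or api_key")
    if access_token is not None:
        return {"Authorization": f"Bearer {access_token}"}
    return {"X-API-Key": api_key}
```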

Agent Context Headers (Optional)

Send these on proxy requests for execution grouping:
  • X-Task-ID
  • X-Task-Name
  • X-Conversation-ID
  • X-User-ID
These headers drive timeline assembly, loop/failure/context detection, and conversation grouping.
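Since all four context headers are optional, a client would typically include only the ones it has values for and merge them into the proxy-call headers. A sketch, assuming the merge-into-request pattern (the helper name is illustrative):

```python
def agent_context_headers(task_id=None, task_name=None,
                          conversation_id=None, user_id=None):
    """Optional execution-grouping headers for proxy requests.

    Unset values are omitted so only meaningful headers are sent.
    """
    candidates = {
        "X-Task-ID": task_id,
        "X-Task-Name": task_name,
        "X-Conversation-ID": conversation_id,
        "X-User-ID": user_id,
    }
    return {k: v for k, v in candidates.items() if v is not None}

# Usage: merge into the auth headers of a proxy call, e.g.
#   headers = {**auth_headers, **agent_context_headers(
#       task_id="task-42", conversation_id="conv-7")}
```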