
Prerequisites

Before you begin, you’ll need:
  • A TraceLM account (sign up here)
  • An OpenAI API key
  • Python 3.8+ or Node.js 16+

Step 1: Get Your API Key

1. Create a Project

After logging in, create a new project from the dashboard. Each project has its own API key and isolated traces.

2. Generate API Key

Navigate to your project settings and generate a TraceLM API key. Your key will start with lt_.

Keep your API key secure. Never commit it to version control or expose it in client-side code.

Step 2: Install the SDK

pip install tracelm
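
If you're on Node.js (also listed in the prerequisites), the SDK would be installed from npm instead; the package name here is an assumption, shown matching the Python package:

npm install tracelm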

Step 3: Initialize the Client

from tracelm import TraceLM

tracelm = TraceLM(
    api_key="lt_your-tracelm-key",        # Your TraceLM API key
    openai_api_key="sk-your-openai-key",  # Your OpenAI API key
)

For production, use environment variables instead of hardcoding API keys (see the sketch after this list):
  • TRACELM_API_KEY
  • OPENAI_API_KEY
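
A minimal sketch of reading these keys from the environment; it assumes the constructor keyword arguments shown above, and the no-argument TraceLM() call in Step 4 suggests the SDK can also pick them up automatically:

import os

from tracelm import TraceLM

# Read the keys from the environment rather than hardcoding them
tracelm = TraceLM(
    api_key=os.environ["TRACELM_API_KEY"],
    openai_api_key=os.environ["OPENAI_API_KEY"],
)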

Step 4: Send Your First Trace

from tracelm import TraceLM

# Assumes TRACELM_API_KEY and OPENAI_API_KEY are set (see Step 3)
tracelm = TraceLM()

# Make a traced LLM call
response = tracelm.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}]
)

print(response.choices[0].message.content)
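
Because the client mirrors the OpenAI SDK's response objects (note the choices[0].message.content access above), you can also inspect token usage locally. A minimal sketch, assuming the standard usage field is present on the response:

# Assumed: the response exposes the OpenAI-style `usage` field
if response.usage:
    print(f"Prompt tokens:     {response.usage.prompt_tokens}")
    print(f"Completion tokens: {response.usage.completion_tokens}")
    print(f"Total tokens:      {response.usage.total_tokens}")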

Step 5: View Your Traces

Head to the TraceLM Dashboard to see your traces in real time. You’ll see:
  • All LLM calls with request/response content
  • Latency and token usage metrics
  • Quality signals and detection results
  • Task and conversation groupings

Using Tasks for Agent Observability

If you’re building an AI agent, use tasks to group related LLM calls:

from tracelm import TraceLM

tracelm = TraceLM()

# Create a task to group related LLM calls
with tracelm.task(name="booking_flow", user_id="user_123") as task:
    response1 = tracelm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Find flights to NYC"}]
    )

    response2 = tracelm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "user", "content": "Find flights to NYC"},
            {"role": "assistant", "content": response1.choices[0].message.content},
            {"role": "user", "content": "Book the cheapest one"}
        ]
    )

    # Complete and run detection
    result = task.complete()

    if result.loops.detected:
        print(f"Warning: Loop detected!")
    if result.failures.total > 0:
        print(f"Warning: {result.failures.total} tool failures")

Next Steps