This guide will help you start logging your LLM requests with Muxx in just a few minutes.

Prerequisites

  • A Muxx account (sign up here)
  • An API key from your LLM provider (OpenAI, Anthropic, or Google)

Step 1: Create a Project

  1. Log in to the Muxx dashboard
  2. Create or select an organization
  3. Click New Project and give it a name
  4. Add your LLM provider API key in the project settings

Step 2: Create Your Muxx API Key

  1. In your project, go to Settings → API Keys
  2. Click Create Key
  3. Choose Live for production or Test for development
  4. Click Create and copy the key immediately
Your key will look something like this:
muxx_sk_live_xxxxxxxxxxxxxxxxxxxx
The full key is only shown once. Store it securely—you’ll need it to authenticate requests.
See API Keys for more details on key management and security best practices.
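
A common way to keep the key out of source code is to load it from an environment variable at runtime. The sketch below assumes you have exported it as MUXX_API_KEY; that variable name is just an example for this guide, not something Muxx requires:

import os

# Read the Muxx key from the environment so it never ends up in version control.
# MUXX_API_KEY is an example name chosen for this sketch.
muxx_api_key = os.environ["MUXX_API_KEY"]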

Step 3: Choose Your Integration

The simplest option is to route requests through the Muxx gateway with the OpenAI Python SDK: keep your existing code, change the base URL, and pass your Muxx API key as a header.
from openai import OpenAI

client = OpenAI(
    api_key="your-openai-key",  # Or use OPENAI_API_KEY env var
    base_url="https://gateway.muxx.dev/v1",
    default_headers={
        "X-Muxx-Api-Key": "muxx_sk_live_xxxxxxxxxxxx"
    }
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
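
Because the request goes through the standard OpenAI SDK, the response object has the usual chat-completion shape, and you can read the model's reply as you normally would:

print(response.choices[0].message.content)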

Step 4: View Your Logs

After making a request, head to the Muxx dashboard. You’ll see:
  • Request logs with full payloads
  • Token usage and cost breakdown
  • Latency measurements
  • Model and provider information

Next Steps