This guide shows you how to route your LLM requests through the Muxx Gateway.
## Prerequisites

- A Muxx account with a project created
- Your Muxx API key (`muxx_sk_live_...`)
- Provider API keys configured in your project settings
## Configuration

### OpenAI
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.muxx.dev/v1",
    default_headers={
        "X-Muxx-Api-Key": "muxx_sk_live_xxxxxxxxxxxx"
    }
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Explain quantum computing in simple terms"}
    ]
)

print(response.choices[0].message.content)
```
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://gateway.muxx.dev/v1',
  defaultHeaders: {
    'X-Muxx-Api-Key': 'muxx_sk_live_xxxxxxxxxxxx',
  },
});

const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'user', content: 'Explain quantum computing in simple terms' },
  ],
});

console.log(response.choices[0].message.content);
```
```bash
curl https://gateway.muxx.dev/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "X-Muxx-Api-Key: muxx_sk_live_xxxxxxxxxxxx" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Explain quantum computing in simple terms"}
    ]
  }'
```
### Anthropic
```python
from anthropic import Anthropic

client = Anthropic(
    base_url="https://gateway.muxx.dev/v1",
    default_headers={
        "X-Muxx-Api-Key": "muxx_sk_live_xxxxxxxxxxxx"
    }
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Explain quantum computing in simple terms"}
    ]
)

print(response.content[0].text)
```
```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  baseURL: 'https://gateway.muxx.dev/v1',
  defaultHeaders: {
    'X-Muxx-Api-Key': 'muxx_sk_live_xxxxxxxxxxxx',
  },
});

const response = await client.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'Explain quantum computing in simple terms' },
  ],
});

console.log(response.content[0].text);
```
## Environment Variables

We recommend storing your Muxx API key in environment variables:

```bash
export MUXX_API_KEY="muxx_sk_live_xxxxxxxxxxxx"
```
Then reference it in your code:
```python
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.muxx.dev/v1",
    default_headers={
        "X-Muxx-Api-Key": os.environ["MUXX_API_KEY"]
    }
)
```
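Indexing `os.environ` directly raises a bare `KeyError` when the variable is unset. If you want a clearer failure mode, a small helper can wrap the lookup (a sketch; `require_env` is not part of any SDK):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, raising a
    descriptive error instead of a bare KeyError when it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running.")
    return value
```

You would then call `require_env("MUXX_API_KEY")` wherever the examples above read `os.environ["MUXX_API_KEY"]`.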
## Custom Metadata

You can attach custom metadata to requests for filtering in the dashboard:
```python
client = OpenAI(
    base_url="https://gateway.muxx.dev/v1",
    default_headers={
        "X-Muxx-Api-Key": "muxx_sk_live_xxxxxxxxxxxx",
        "X-Muxx-Metadata": '{"user_id": "user_123", "feature": "chat"}'
    }
)
```
This metadata appears in the dashboard and can be used for filtering and analytics.
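Because the header value is a JSON string, hand-writing it invites quoting mistakes. A small helper (hypothetical, not part of any SDK; the field names are only examples) can serialize it for you:

```python
import json

def muxx_metadata(**fields) -> dict:
    """Build the X-Muxx-Metadata header from keyword arguments,
    serializing the fields to the JSON string the header expects."""
    return {"X-Muxx-Metadata": json.dumps(fields)}

headers = muxx_metadata(user_id="user_123", feature="chat")
```

The resulting dict can be merged into `default_headers`, or, with the OpenAI Python SDK, passed per request via the `extra_headers` argument if you only want to tag some calls.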
## Streaming

The gateway fully supports streaming responses:
```python
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a haiku"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
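If you also need the complete text after streaming (for example, to log it), the usual pattern is to accumulate the deltas as they arrive. A minimal sketch of just the accumulation step, with placeholder strings standing in for the `chunk.choices[0].delta.content` values:

```python
# Placeholder delta values; in real code these come from the stream.
# None models chunks that carry no text (e.g. role-only or final chunks).
deltas = ["Silent", " servers", " hum", None]

# Skip empty deltas, then join the rest into the full completion text.
parts = [d for d in deltas if d]
full_text = "".join(parts)
print(full_text)  # Silent servers hum
```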
## Next Steps

- **Caching**: Enable response caching to reduce costs
- **Rate Limiting**: Configure request limits