This guide shows you how to add observability to your Node.js/TypeScript LLM application.
Prerequisites
- Muxx TypeScript SDK installed (`npm install muxx`)
- Your Muxx API key
- An LLM provider SDK (OpenAI, Anthropic, etc.)
Basic Usage
Wrap Your Client
The simplest way to use Muxx is to wrap your LLM client:
```typescript
import { Muxx } from 'muxx';
import OpenAI from 'openai';

// Initialize Muxx
const muxx = new Muxx({ apiKey: 'muxx_sk_live_xxxxxxxxxxxx' });

// Wrap the OpenAI client
const client = muxx.wrap(new OpenAI());

// Use as normal - all calls are automatically traced
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'What is the capital of France?' }],
});

console.log(response.choices[0].message.content);
```
That’s it! All requests are now logged to Muxx.
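Conceptually, this style of client wrapping can be done with a JavaScript `Proxy` that intercepts method calls and records what happened. The sketch below illustrates the pattern with a stand-in client and a hypothetical `wrapWithLogging` helper; it is only an illustration of the technique, not Muxx's actual implementation.

```typescript
// Illustration of proxy-based client wrapping (not Muxx's real internals).
// Assumes the wrapped client's methods are async, as LLM SDK methods are.
type LogEntry = { path: string; durationMs: number };

function wrapWithLogging<T extends object>(target: T, log: LogEntry[], path = ''): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      const fullPath = path ? `${path}.${String(prop)}` : String(prop);
      if (typeof value === 'function') {
        // Intercept method calls: time them and record the call path
        return async (...args: unknown[]) => {
          const start = Date.now();
          const result = await value.apply(obj, args);
          log.push({ path: fullPath, durationMs: Date.now() - start });
          return result;
        };
      }
      if (value !== null && typeof value === 'object') {
        // Recurse into nested namespaces like client.chat.completions
        return wrapWithLogging(value as object, log, fullPath);
      }
      return value;
    },
  }) as T;
}

// Usage with a stand-in client shaped like the OpenAI SDK
const log: LogEntry[] = [];
const fakeClient = {
  chat: {
    completions: {
      create: async (_req: object) => ({ choices: [{ message: { content: 'Paris' } }] }),
    },
  },
};
const wrapped = wrapWithLogging(fakeClient, log);
```

Because the proxy forwards every property access, the wrapped client keeps the exact same call surface as the original, which is why `muxx.wrap` can be a drop-in change.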
Adding Context
Traces
Group related operations into traces:
```typescript
import { Muxx } from 'muxx';
import OpenAI from 'openai';

const muxx = new Muxx();
const client = muxx.wrap(new OpenAI());

async function summarizeDocument(document: string): Promise<string> {
  return muxx.trace('document-summary', async () => {
    // Extract key points
    const points = await client.chat.completions.create({
      model: 'gpt-4o',
      messages: [{ role: 'user', content: `Extract key points from: ${document}` }],
    });

    // Generate summary
    const summary = await client.chat.completions.create({
      model: 'gpt-4o',
      messages: [
        { role: 'user', content: `Summarize these points: ${points.choices[0].message.content}` },
      ],
    });

    return summary.choices[0].message.content ?? '';
  });
}

// Both LLM calls are grouped under one trace
const result = await summarizeDocument('Your long document here...');
```
Spans
Add more granular tracking with spans:
```typescript
import { Muxx } from 'muxx';
import OpenAI from 'openai';

const muxx = new Muxx();
const client = muxx.wrap(new OpenAI());

async function summarizeDocument(document: string): Promise<string> {
  return muxx.trace('document-summary', async () => {
    const points = await muxx.span('extract-points', async () => {
      return client.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: `Extract key points: ${document}` }],
      });
    });

    const summary = await muxx.span('generate-summary', async () => {
      return client.chat.completions.create({
        model: 'gpt-4o',
        messages: [{ role: 'user', content: `Summarize: ${points.choices[0].message.content}` }],
      });
    });

    return summary.choices[0].message.content ?? '';
  });
}
```
You can also attach custom metadata to a trace for filtering:
```typescript
await muxx.trace(
  'user-chat',
  async () => {
    // Your code here
  },
  { metadata: { userId: 'user_123', feature: 'support' } }
);
```
Error Handling
Errors are automatically captured in traces:
```typescript
await muxx.trace('risky-operation', async () => {
  try {
    await riskyOperation();
  } catch (error) {
    // Error is captured with full stack trace
    throw error;
  }
});
```
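The capture-and-rethrow pattern matters: the trace records the failure, but the error still propagates so your own error handling runs. As an illustration of that behavior, here is a hypothetical `withTrace` helper that records outcomes to an array; this is a sketch of the pattern only, not Muxx's implementation.

```typescript
// Sketch of error capture in a trace helper: record the outcome, then
// rethrow so callers still see the failure. Names here are hypothetical.
type TraceRecord = { name: string; ok: boolean; error?: string };

async function withTrace<T>(
  name: string,
  records: TraceRecord[],
  fn: () => Promise<T>
): Promise<T> {
  try {
    const result = await fn();
    records.push({ name, ok: true });
    return result;
  } catch (err) {
    // Record the failure, then rethrow so calling code can handle it
    records.push({ name, ok: false, error: err instanceof Error ? err.message : String(err) });
    throw err;
  }
}

// Successful operations are recorded too
const records: TraceRecord[] = [];
await withTrace('ok-op', records, async () => 'done');
```

Swallowing the error inside the helper would make failed operations look identical to successful ones from the caller's perspective, which is why rethrowing is the right default.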
Next Steps
- Tracing: learn more about traces and spans
- OpenAI: OpenAI-specific features