OpenAI Integration
The TypeScript SDK provides automatic instrumentation for all OpenAI API calls using OpenTelemetry.
How It Works
When you initialize with `instrumentModules: { OpenAI }`:
- The SDK instruments the OpenAI client using OpenTelemetry
- Every API call is automatically captured as an Event
- Token usage and costs are calculated automatically
- Multimodal inputs (text + images) are handled seamlessly
Setup
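A minimal setup sketch is shown below. The package name (`lucidicai`) and the API key handling are assumptions for illustration; `lai.init()` and the `instrumentModules` option are taken from this page, so check your SDK's README for the exact import path and configuration fields.

```typescript
import * as lai from 'lucidicai'; // placeholder package name
import OpenAI from 'openai';

// Initialize the SDK first so the OpenAI module gets instrumented.
await lai.init({
  instrumentModules: { OpenAI },
});

// Clients created after init() are traced automatically.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
```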
What Gets Captured
For every OpenAI API call:
- Input: Messages, prompts, and system instructions
- Model: The model used (gpt-4, gpt-3.5-turbo, etc.)
- Output: Complete response including function calls
- Token Usage: Input and output token counts
- Cost: Calculated based on current pricing
- Timing: Request duration
- Images: Automatically uploaded when using vision models
Examples
Basic Chat Completion
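A sketch of a basic chat completion using the official `openai` Node SDK. Once instrumentation is set up as above, no extra tracking code is needed; the call itself is plain OpenAI API usage.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

// A standard chat completion; the instrumented client records the messages,
// model, response, token usage, and latency automatically.
const completion = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Summarize OpenTelemetry in one sentence.' },
  ],
});

console.log(completion.choices[0].message.content);
```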
Streaming Responses
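Streaming uses the standard `stream: true` option of the OpenAI SDK; the sketch below assembles the streamed deltas into a single string. That the instrumentation captures the full streamed output follows from this page's claim that every API call is captured.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

// Streamed responses arrive as incremental deltas.
const stream = await openai.chat.completions.create({
  model: 'gpt-4-turbo',
  messages: [{ role: 'user', content: 'Write a haiku about tracing.' }],
  stream: true,
});

let text = '';
for await (const chunk of stream) {
  text += chunk.choices[0]?.delta?.content ?? '';
}
console.log(text);
```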
Vision Models
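A vision call sketch with mixed text and image input; the image URL is a placeholder. Per this page, images are uploaded automatically by the instrumentation when vision models are used.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

// Multimodal input: text plus an image URL.
const response = await openai.chat.completions.create({
  model: 'gpt-4-turbo',
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What is in this image?' },
        { type: 'image_url', image_url: { url: 'https://example.com/photo.jpg' } },
      ],
    },
  ],
});

console.log(response.choices[0].message.content);
```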
Function Calling
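A function-calling sketch using the standard `tools` parameter; the `get_weather` tool is hypothetical. The captured Event includes any tool calls the model returns, as noted under What Gets Captured.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather', // hypothetical tool for illustration
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
});

console.log(response.choices[0].message.tool_calls);
```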
Embeddings
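Embedding calls go through the same instrumented client; the model name below is just an example.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

// Embedding requests are captured like any other OpenAI API call.
const embedding = await openai.embeddings.create({
  model: 'text-embedding-3-small',
  input: 'OpenTelemetry-based instrumentation for OpenAI',
});

console.log(embedding.data[0].embedding.length);
```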
Minimal Example
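An end-to-end sketch combining initialization and a single call. As above, the package name is a placeholder; `lai.init()` and `instrumentModules` come from this page.

```typescript
import * as lai from 'lucidicai'; // placeholder package name
import OpenAI from 'openai';

await lai.init({ instrumentModules: { OpenAI } });

const openai = new OpenAI();
const res = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(res.choices[0].message.content);
```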
Cost Tracking
The SDK includes built-in pricing for all OpenAI models:

| Model | Input Cost | Output Cost |
|---|---|---|
| gpt-4 | $0.03/1K tokens | $0.06/1K tokens |
| gpt-4-turbo | $0.01/1K tokens | $0.03/1K tokens |
| gpt-3.5-turbo | $0.0005/1K tokens | $0.0015/1K tokens |
| o1-preview | $0.015/1K tokens | $0.06/1K tokens |
| o1-mini | $0.003/1K tokens | $0.012/1K tokens |
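For example, a gpt-4 call that uses 1,000 input tokens and 500 output tokens is priced at 1.0 × $0.03 + 0.5 × $0.06 = $0.06.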
Advanced Configuration
Custom OpenAI Client Options
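The standard client options of the official `openai` SDK (`baseURL`, `timeout`, `maxRetries`) can be used as usual; that instrumentation and cost calculation behave identically behind a custom endpoint is an assumption, since pricing depends on the model names the endpoint reports.

```typescript
import OpenAI from 'openai';

// Standard OpenAI client options; the proxy URL below is hypothetical.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: 'https://my-proxy.example.com/v1',
  timeout: 30_000,
  maxRetries: 2,
});
```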
Error Handling
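Failed calls surface as normal OpenAI SDK errors and can be caught with `OpenAI.APIError`; the sketch below shows the usual pattern.

```typescript
import OpenAI from 'openai';

const openai = new OpenAI();

try {
  await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Hello' }],
  });
} catch (err) {
  if (err instanceof OpenAI.APIError) {
    // Rate limits, auth failures, etc. arrive as APIError with a status code.
    console.error(err.status, err.message);
  } else {
    throw err;
  }
}
```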
Troubleshooting
Events Not Appearing
- Check import order: OpenAI must be imported after `lai.init()`
- Verify initialization: ensure `instrumentModules` included the OpenAI module
Incorrect Costs
The SDK uses built-in pricing data. For custom models or pricing:
See Also
- Anthropic Integration - Claude tracking
- Advanced Features - Multimodal, masking
- API Reference - Complete documentation